WorldWideScience

Sample records for advanced analytical simulation

  1. Advanced business analytics

    CERN Document Server

    Lev, Benjamin

    2015-01-01

    The book describes advanced business analytics and shows how to apply them to many different professional areas of engineering and management. Each chapter of the book is contributed by a different author and covers a different area of business analytics. The book connects the analytic principles with business practice and provides an interface between the main disciplines of engineering/technology and the organizational, administrative and planning abilities of management. It also refers to other disciplines such as economy, finance, marketing, behavioral economics and risk analysis. This book is of special interest to engineers, economists and researchers who are developing new advances in engineering management, as well as to practitioners working on this subject.

  2. Monte Carlo and analytic simulations in nanoparticle-enhanced radiation therapy

    Directory of Open Access Journals (Sweden)

    Paro AD

    2016-09-01

    Autumn D Paro,1 Mainul Hossain,2 Thomas J Webster,1,3,4 Ming Su1,4 1Department of Chemical Engineering, Northeastern University, Boston, MA, USA; 2NanoScience Technology Center and School of Electrical Engineering and Computer Science, University of Central Florida, Orlando, Florida, USA; 3Excellence for Advanced Materials Research, King Abdulaziz University, Jeddah, Saudi Arabia; 4Wenzhou Institute of Biomaterials and Engineering, Chinese Academy of Science, Wenzhou Medical University, Zhejiang, People’s Republic of China Abstract: Analytical and Monte Carlo simulations have been used to predict dose enhancement factors in nanoparticle-enhanced X-ray radiation therapy. Both simulations predict an increase in dose enhancement in the presence of nanoparticles, but the two methods predict different levels of enhancement over the studied energy, nanoparticle materials, and concentration regime for several reasons. The Monte Carlo simulation calculates energy deposited by electrons and photons, while the analytical one only calculates energy deposited by source photons and photoelectrons; the Monte Carlo simulation accounts for electron–hole recombination, while the analytical one does not; and the Monte Carlo simulation randomly samples photon or electron path and accounts for particle interactions, while the analytical simulation assumes a linear trajectory. This study demonstrates that the Monte Carlo simulation will be a better choice to evaluate dose enhancement with nanoparticles in radiation therapy. Keywords: nanoparticle, dose enhancement, Monte Carlo simulation, analytical simulation, radiation therapy, tumor cell, X-ray
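
    As a first-order illustration of the analytical side of such a comparison (a hedged sketch, not the authors' model), the snippet below estimates a macroscopic dose enhancement factor from the mass energy-absorption coefficients of a gold/water mixture; the gold fraction and coefficient values are assumed placeholders.

```python
# Hypothetical sketch: first-order analytical dose enhancement factor (DEF)
# for water loaded with gold nanoparticles, using the mixture rule for the
# mass energy-absorption coefficient. Coefficient values below are rough
# placeholders for a single photon energy, not data from the paper.

def dose_enhancement_factor(w_au, mu_en_rho_au, mu_en_rho_water):
    """DEF = (mu_en/rho) of the mixture divided by (mu_en/rho) of pure water.

    w_au             -- gold mass fraction in the mixture (e.g. 0.01 for 1%)
    mu_en_rho_au     -- mass energy-absorption coefficient of gold  [cm^2/g]
    mu_en_rho_water  -- mass energy-absorption coefficient of water [cm^2/g]
    """
    mu_mix = w_au * mu_en_rho_au + (1.0 - w_au) * mu_en_rho_water
    return mu_mix / mu_en_rho_water

# Example with assumed coefficients near ~50 keV (illustrative only):
print(dose_enhancement_factor(w_au=0.01, mu_en_rho_au=7.0, mu_en_rho_water=0.04))
```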

  3. Behavioural effects of advanced cruise control use : a meta-analytic approach.

    NARCIS (Netherlands)

    Dragutinovic, N.; Brookhuis, K.A.; Hagenzieker, M.P. & Marchau, V.A.W.J.

    2006-01-01

    In this study, a meta-analytic approach was used to analyse effects of Advanced Cruise Control (ACC) on driving behaviour reported in seven driving simulator studies. The effects of ACC on three consistent outcome measures, namely, driving speed, headway and driver workload have been analysed. The

  4. Automated Deployment of Advanced Controls and Analytics in Buildings

    Science.gov (United States)

    Pritoni, Marco

    Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems" I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase: "Application Deployment and Commissioning", models automatically learn system parameters, used for advanced controls and analytics. In the third phase: "Continuous Monitoring and Verification" I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.

  5. Analytical Aerodynamic Simulation Tools for Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Deglaire, Paul

    2010-01-01

    Wind power is a renewable energy source that is today the fastest growing solution to reduce CO 2 emissions in the electric energy mix. Upwind horizontal axis wind turbine with three blades has been the preferred technical choice for more than two decades. This horizontal axis concept is today widely leading the market. The current PhD thesis will cover an alternative type of wind turbine with straight blades and rotating along the vertical axis. A brief overview of the main differences between the horizontal and vertical axis concept has been made. However the main focus of this thesis is the aerodynamics of the wind turbine blades. Making aerodynamically efficient turbines starts with efficient blades. Making efficient blades requires a good understanding of the physical phenomena and effective simulations tools to model them. The specific aerodynamics for straight bladed vertical axis turbine flow are reviewed together with the standard aerodynamic simulations tools that have been used in the past by blade and rotor designer. A reasonably fast (regarding computer power) and accurate (regarding comparison with experimental results) simulation method was still lacking in the field prior to the current work. This thesis aims at designing such a method. Analytical methods can be used to model complex flow if the geometry is simple. Therefore, a conformal mapping method is derived to transform any set of section into a set of standard circles. Then analytical procedures are generalized to simulate moving multibody sections in the complex vertical flows and forces experienced by the blades. Finally the fast semi analytical aerodynamic algorithm boosted by fast multipole methods to handle high number of vortices is coupled with a simple structural model of the rotor to investigate potential aeroelastic instabilities. Together with these advanced simulation tools, a standard double multiple streamtube model has been developed and used to design several straight bladed

  6. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    Science.gov (United States)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs and analyses, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD), etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
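
    As a rough illustration of how one such figure of merit can be derived (a sketch under assumed values, not the ATTIRE implementation), the snippet below converts an assumed noise-equivalent radiance into an NETD using the derivative of Planck spectral radiance with respect to temperature.

```python
# Hypothetical sketch: convert a noise-equivalent radiance (NER) into a
# noise-equivalent temperature difference (NETD) via the derivative of
# Planck spectral radiance with respect to temperature. The wavelength,
# scene temperature and NER below are assumed values, not ATTIRE parameters.
import math

H = 6.626e-34   # Planck constant [J s]
C = 2.998e8     # speed of light [m/s]
KB = 1.381e-23  # Boltzmann constant [J/K]

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance [W / (m^2 sr m)] of a blackbody."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / (math.exp(b) - 1.0)

def netd(ner, wavelength_m, temp_k, dT=0.01):
    """NETD ~= NER / (dL/dT), with dL/dT taken by central difference."""
    dL_dT = (planck_radiance(wavelength_m, temp_k + dT)
             - planck_radiance(wavelength_m, temp_k - dT)) / (2.0 * dT)
    return ner / dL_dT

# Example: assumed NER (same spectral units as planck_radiance, i.e.
# W m^-2 sr^-1 m^-1) at a 10 um band center and a 300 K scene.
print(netd(ner=1.0e3, wavelength_m=10e-6, temp_k=300.0))
```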

  7. Advanced Dynamics Analytical and Numerical Calculations with MATLAB

    CERN Document Server

    Marghitu, Dan B

    2012-01-01

    Advanced Dynamics: Analytical and Numerical Calculations with MATLAB provides a thorough, rigorous presentation of kinematics and dynamics while using MATLAB as an integrated tool to solve problems. Topics presented are explained thoroughly and directly, allowing fundamental principles to emerge through applications from areas such as multibody systems, robotics, spacecraft and design of complex mechanical devices. This book differs from others in that it uses symbolic MATLAB for both theory and applications. Special attention is given to solutions that are solved analytically and numerically using MATLAB. The illustrations and figures generated with MATLAB reinforce visual learning while an abundance of examples offer additional support. This book also: Provides solutions analytically and numerically using MATLAB Illustrations and graphs generated with MATLAB reinforce visual learning for students as they study Covers modern technical advancements in areas like multibody systems, robotics, spacecraft and des...

  8. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines also known as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress...

  9. Advanced Simulation Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Advanced Simulation Center consists of 10 individual facilities which provide missile and submunition hardware-in-the-loop simulation capabilities. The following...

  10. INFIL1D: a quasi-analytical model for simulating one-dimensional, constant flux infiltration

    International Nuclear Information System (INIS)

    Simmons, C.S.; McKeon, T.J.

    1984-04-01

    The program INFIL1D is designed to calculate approximate wetting-front advance into an unsaturated, uniformly moist, homogeneous soil profile, under constant surface-flux conditions. The code is based on a quasi-analytical method, which utilizes an assumed invariant functional relationship between reduced (normalized) flux and water content. The code uses general hydraulic property data in tabular form to simulate constant surface-flux infiltration. 10 references, 4 figures
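
    For flavor only, a far cruder piston-flow mass balance (not INFIL1D's quasi-analytical scheme) is sketched below; the flux and soil water contents are assumed placeholders.

```python
# Hypothetical sketch: piston-flow (sharp-front) approximation of wetting
# front advance under a constant surface flux q. This is a simplification
# for illustration, not the quasi-analytical method used by INFIL1D.

def wetting_front_depth(q, t, theta_s, theta_i):
    """Front depth [cm] after time t [h] for a constant surface flux q [cm/h].

    Mass balance: the cumulative infiltration q*t fills the pore space
    between the initial water content theta_i and saturation theta_s.
    """
    return q * t / (theta_s - theta_i)

# Assumed soil parameters (placeholders): theta_s = 0.40, theta_i = 0.10
for t in (1.0, 5.0, 10.0):  # hours
    print(t, wetting_front_depth(q=0.5, t=t, theta_s=0.40, theta_i=0.10))
```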

  11. An analytical simulation technique for cone-beam CT and pinhole SPECT

    International Nuclear Information System (INIS)

    Zhang Xuezhu; Qi Yujin

    2011-01-01

    This study was aimed at developing an efficient simulation technique with an ordinary PC. The work involved derivation of mathematical operators, analytic phantom generation, and development of effective analytical projectors for cone-beam CT and pinhole SPECT imaging. Computer simulations based on the analytical projectors were developed using a ray-tracing method for cone-beam CT and a voxel-driven method, including degrading blur, for pinhole SPECT. The 3D Shepp-Logan, Jaszczak and Defrise phantoms were used for simulation evaluations and image reconstructions. The reconstructed images showed good accuracy relative to the phantoms. The results showed that the analytical simulation technique is an efficient tool for studying cone-beam CT and pinhole SPECT imaging. (authors)
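
    To illustrate what an analytical projector computes (a generic sketch with assumed geometry and attenuation, not the authors' ray-tracing or voxel-driven code), the example below evaluates cone-beam line integrals through a uniform sphere in closed form from the ray-sphere chord length.

```python
# Hypothetical sketch: closed-form cone-beam projection of a uniform sphere.
# The line integral along a ray equals (attenuation coefficient) x (chord
# length through the sphere). Geometry and mu are assumed values, not taken
# from the cited simulation technique.
import numpy as np

def sphere_projection(src, det_point, center, radius, mu):
    """Line integral of mu along the ray from src through det_point."""
    d = det_point - src
    d = d / np.linalg.norm(d)            # unit ray direction
    oc = src - center
    b = np.dot(oc, d)
    c = np.dot(oc, oc) - radius**2
    disc = b * b - c                      # quadratic discriminant
    if disc <= 0.0:
        return 0.0                        # ray misses the sphere
    chord = 2.0 * np.sqrt(disc)           # length of the intersection segment
    return mu * chord

# Assumed geometry: point source on the -y axis, flat detector plane at y = +50
src = np.array([0.0, -50.0, 0.0])
center = np.array([0.0, 0.0, 0.0])
proj = [[sphere_projection(src, np.array([u, 50.0, v]), center, 5.0, 0.2)
         for u in np.linspace(-10, 10, 5)] for v in np.linspace(-10, 10, 5)]
print(np.round(proj, 3))
```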

  12. Advanced computers and simulation

    International Nuclear Information System (INIS)

    Ryne, R.D.

    1993-01-01

    Accelerator physicists today have access to computers that are far more powerful than those available just 10 years ago. In the early 1980's, desktop workstations performed less than one million floating point operations per second (Mflops), and the realized performance of vector supercomputers was at best a few hundred Mflops. Today vector processing is available on the desktop, providing researchers with performance approaching 100 Mflops at a price that is measured in thousands of dollars. Furthermore, advances in Massively Parallel Processors (MPP) have made performance of over 10 gigaflops a reality, and around mid-decade MPPs are expected to be capable of teraflops performance. Along with advances in MPP hardware, researchers have also made significant progress in developing algorithms and software for MPPs. These changes have had, and will continue to have, a significant impact on the work of computational accelerator physicists. Now, instead of running particle simulations with just a few thousand particles, we can perform desktop simulations with tens of thousands of simulation particles, and calculations with well over 1 million particles are being performed on MPPs. In the area of computational electromagnetics, simulations that used to be performed only on vector supercomputers now run in several hours on desktop workstations, and researchers are hoping to perform simulations with over one billion mesh points on future MPPs. In this paper we will discuss the latest advances, and what can be expected in the near future, in hardware, software and applications codes for advanced simulation of particle accelerators

  13. Modeling and analytical simulation of a smouldering carbonaceous ...

    African Journals Online (AJOL)

    Modeling and analytical simulation of a smouldering carbonaceous rod. A.A. Mohammed, R.O. Olayiwola, M Eseyin, A.A. Wachin. Abstract. Modeling of pyrolysis and combustion in a smouldering fuel bed requires the solution of flow, heat and mass transfer through porous media. This paper presents an analytical method ...

  14. Upgraded operator training by using advanced simulators

    International Nuclear Information System (INIS)

    Iwashita, Akira; Toeda, Susumu; Fujita, Eimitsu; Moriguchi, Iwao; Wada, Kouji

    1991-01-01

    BWR Operator Training Center Corporation (BTC) has been conducting the operator training for all BWR utilities in Japan using fullscope simulators. Corresponding to increasing quantitative demands and higher qualitative needs of operator training, BTC put advanced simulators in operation (BTC-2 simulator in 1983 and BTC-3 simulator in 1989). This paper describes the methods and the effects of upgraded training contents by using these advanced simulators. These training methods are applied to the 'Advanced Operator Training course,' the 'Operator Retraining Course' and also the 'Family (crew) Training Course.' (author)

  15. Advanced circuit simulation using Multisim workbench

    CERN Document Server

    Báez-López, David; Cervantes-Villagómez, Ofelia Delfina

    2012-01-01

    Multisim is now the de facto standard for circuit simulation. It is a SPICE-based circuit simulator which combines analog, discrete-time, and mixed-mode circuits. In addition, it is the only simulator which incorporates microcontroller simulation in the same environment. It also includes a tool for printed circuit board design.Advanced Circuit Simulation Using Multisim Workbench is a companion book to Circuit Analysis Using Multisim, published by Morgan & Claypool in 2011. This new book covers advanced analyses and the creation of models and subcircuits. It also includes coverage of transmissi

  16. Technical Basis for Physical Fidelity of NRC Control Room Training Simulators for Advanced Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Minsk, Brian S.; Branch, Kristi M.; Bates, Edward K.; Mitchell, Mark R.; Gore, Bryan F.; Faris, Drury K.

    2009-10-09

    The objective of this study is to determine how simulator physical fidelity influences the effectiveness of training the regulatory personnel responsible for examination and oversight of operating personnel and inspection of technical systems at nuclear power reactors. It seeks to contribute to the U.S. Nuclear Regulatory Commission’s (NRC’s) understanding of the physical fidelity requirements of training simulators. The goal of the study is to provide an analytic framework, data, and analyses that inform NRC decisions about the physical fidelity requirements of the simulators it will need to train its staff for assignment at advanced reactors. These staff are expected to come from increasingly diverse educational and experiential backgrounds.

  17. Analytical vs. Simulation Solution Techniques for Pulse Problems in Non-linear Stochastic Dynamics

    DEFF Research Database (Denmark)

    Iwankiewicz, R.; Nielsen, Søren R. K.

    Advantages and disadvantages of available analytical and simulation techniques for pulse problems in non-linear stochastic dynamics are discussed. First, random pulse problems, both those which do and do not lead to Markov theory, are presented. Next, the analytical and analytically-numerical tec...

  18. Making advanced analytics work for you.

    Science.gov (United States)

    Barton, Dominic; Court, David

    2012-10-01

    Senior leaders who write off the move toward big data as a lot of big talk are making, well, a big mistake. So argue McKinsey's Barton and Court, who worked with dozens of companies to figure out how to translate advanced analytics into nuts-and-bolts practices that affect daily operations on the front lines. The authors offer a useful guide for leaders and managers who want to take a deliberative approach to big data, but who also want to get started now. First, companies must identify the right data for their business, seek to acquire the information creatively from diverse sources, and secure the necessary IT support. Second, they need to build analytics models that are tightly focused on improving performance, making the models only as complex as business goals demand. Third, and most important, companies must transform their capabilities and culture so that the analytical results can be implemented from the C-suite to the front lines. That means developing simple tools that everyone in the organization can understand and teaching people why the data really matter. Embracing big data is as much about changing mind-sets as it is about crunching numbers. Executed with the right care and flexibility, this cultural shift could have payoffs that are, well, bigger than you expect.

  19. Molecular dynamics simulations of matrix assisted laser desorption ionization: Matrix-analyte interactions

    International Nuclear Information System (INIS)

    Nangia, Shivangi; Garrison, Barbara J.

    2011-01-01

    There is synergy between matrix assisted laser desorption ionization (MALDI) experiments and molecular dynamics (MD) simulations. To understand analyte ejection from the matrix, MD simulations have been employed. Prior calculations show that the ejected analyte molecules remain solvated by the matrix molecules in the ablated plume. In contrast, the experimental data show free analyte ions. The main idea of this work is that analyte molecule ejection may depend on the microscopic details of analyte interaction with the matrix. Intermolecular matrix-analyte interactions have been studied by focusing on 2,5-dihydroxybenzoic acid (DHB; matrix) and amino acids (AA; analyte) using Chemistry at HARvard Molecular Mechanics (CHARMM) force field. A series of AA molecules have been studied to analyze the DHB-AA interaction. A relative scale of AA molecule affinity towards DHB has been developed.

  20. Advances in the Analytical Methods for Determining the Antioxidant ...

    African Journals Online (AJOL)

    Advances in the Analytical Methods for Determining the Antioxidant Properties of Honey: A Review. M Moniruzzaman, MI Khalil, SA Sulaiman, SH Gan. Abstract. Free radicals and reactive oxygen species (ROS) have been implicated in contributing to the processes of aging and disease. In an effort to combat free radical ...

  1. Advanced, Analytic, Automated (AAA) Measurement of Engagement during Learning

    Science.gov (United States)

    D'Mello, Sidney; Dieterle, Ed; Duckworth, Angela

    2017-01-01

    It is generally acknowledged that engagement plays a critical role in learning. Unfortunately, the study of engagement has been stymied by a lack of valid and efficient measures. We introduce the advanced, analytic, and automated (AAA) approach to measure engagement at fine-grained temporal resolutions. The AAA measurement approach is grounded in…

  2. Analytical simulation of two dimensional advection dispersion ...

    African Journals Online (AJOL)

    The study was designed to investigate the analytical simulation of two dimensional advection dispersion equation of contaminant transport. The steady state flow condition of the contaminant transport where inorganic contaminants in aqueous waste solutions are disposed of at the land surface where it would migrate ...

  3. Analytical Simulation of Two Dimensional Advection Dispersion ...

    African Journals Online (AJOL)

    ADOWIE PERE

    ABSTRACT: The study was designed to investigate the analytical simulation of two dimensional advection dispersion equation of contaminant transport. The steady state flow condition of the contaminant transport where inorganic contaminants in aqueous waste solutions are disposed of at the land surface where it would ...

  4. Advanced Vadose Zone Simulations Using TOUGH

    Energy Technology Data Exchange (ETDEWEB)

    Finsterle, S.; Doughty, C.; Kowalsky, M.B.; Moridis, G.J.; Pan,L.; Xu, T.; Zhang, Y.; Pruess, K.

    2007-02-01

    The vadose zone can be characterized as a complex subsurface system in which intricate physical and biogeochemical processes occur in response to a variety of natural forcings and human activities. This makes it difficult to describe, understand, and predict the behavior of this specific subsurface system. The TOUGH nonisothermal multiphase flow simulators are well-suited to perform advanced vadose zone studies. The conceptual models underlying the TOUGH simulators are capable of representing features specific to the vadose zone, and of addressing a variety of coupled phenomena. Moreover, the simulators are integrated into software tools that enable advanced data analysis, optimization, and system-level modeling. We discuss fundamental and computational challenges in simulating vadose zone processes, review recent advances in modeling such systems, and demonstrate some capabilities of the TOUGH suite of codes using illustrative examples.

  5. Artist - analytical RT inspection simulation tool

    International Nuclear Information System (INIS)

    Bellon, C.; Jaenisch, G.R.

    2007-01-01

    The computer simulation of radiography is applicable for different purposes in NDT such as for the qualification of NDT systems, the prediction of its reliability, the optimization of system parameters, feasibility analysis, model-based data interpretation, education and training of NDT/NDE personnel, and others. Within the framework of the integrated project FilmFree the radiographic testing (RT) simulation software developed by BAM is being further developed to meet practical requirements for inspection planning in digital industrial radiology. It combines analytical modelling of the RT inspection process with the CAD-orientated object description applicable to various industrial sectors such as power generation, railways and others. (authors)

  6. Dynamic Simulations of Advanced Fuel Cycles

    International Nuclear Information System (INIS)

    Piet, Steven J.; Dixon, Brent W.; Jacobson, Jacob J.; Matthern, Gretchen E.; Shropshire, David E.

    2011-01-01

    Years of performing dynamic simulations of advanced nuclear fuel cycle options provide insights into how they could work and how one might transition from the current once-through fuel cycle. This paper summarizes those insights from the context of the 2005 objectives and goals of the U.S. Advanced Fuel Cycle Initiative (AFCI). Our intent is not to compare options, assess options versus those objectives and goals, nor recommend changes to those objectives and goals. Rather, we organize what we have learned from dynamic simulations in the context of the AFCI objectives for waste management, proliferation resistance, uranium utilization, and economics. Thus, we do not merely describe 'lessons learned' from dynamic simulations but attempt to answer the 'so what' question by using this context. The analyses have been performed using the Verifiable Fuel Cycle Simulation of Nuclear Fuel Cycle Dynamics (VISION). We observe that the 2005 objectives and goals do not address many of the inherently dynamic discriminators among advanced fuel cycle options and transitions thereof.

  7. New hybrid voxelized/analytical primitive in Monte Carlo simulations for medical applications

    International Nuclear Information System (INIS)

    Bert, Julien; Lemaréchal, Yannick; Visvikis, Dimitris

    2016-01-01

    Monte Carlo simulations (MCS) applied in particle physics play a key role in medical imaging and particle therapy. In such simulations, particles are transported through voxelized phantoms derived from predominantly patient CT images. However, such voxelized object representation limits the incorporation of fine elements, such as artificial implants from CAD modeling or anatomical and functional details extracted from other imaging modalities. In this work we propose a new hYbrid Voxelized/ANalytical primitive (YVAN) that combines both voxelized and analytical object descriptions within the same MCS, without the need to simultaneously run two parallel simulations, which is the current gold standard methodology. Given that YVAN is simply a new primitive object, it does not require any modifications on the underlying MC navigation code. The new proposed primitive was assessed through a first simple MCS. Results from the YVAN primitive were compared against an MCS using a pure analytical geometry and the layer mass geometry concept. A perfect agreement was found between these simulations, leading to the conclusion that the new hybrid primitive is able to accurately and efficiently handle phantoms defined by a mixture of voxelized and analytical objects. In addition, two application-based evaluation studies in coronary angiography and intra-operative radiotherapy showed that the use of YVAN was 6.5% and 12.2% faster than the layered mass geometry method, respectively, without any associated loss of accuracy. However, the simplification advantages and differences in computational time improvements obtained with YVAN depend on the relative proportion of the analytical and voxelized structures used in the simulation as well as the size and number of triangles used in the description of the analytical object meshes. (paper)

  8. New hybrid voxelized/analytical primitive in Monte Carlo simulations for medical applications.

    Science.gov (United States)

    Bert, Julien; Lemaréchal, Yannick; Visvikis, Dimitris

    2016-05-07

    Monte Carlo simulations (MCS) applied in particle physics play a key role in medical imaging and particle therapy. In such simulations, particles are transported through voxelized phantoms derived from predominantly patient CT images. However, such voxelized object representation limits the incorporation of fine elements, such as artificial implants from CAD modeling or anatomical and functional details extracted from other imaging modalities. In this work we propose a new hYbrid Voxelized/ANalytical primitive (YVAN) that combines both voxelized and analytical object descriptions within the same MCS, without the need to simultaneously run two parallel simulations, which is the current gold standard methodology. Given that YVAN is simply a new primitive object, it does not require any modifications on the underlying MC navigation code. The new proposed primitive was assessed through a first simple MCS. Results from the YVAN primitive were compared against an MCS using a pure analytical geometry and the layer mass geometry concept. A perfect agreement was found between these simulations, leading to the conclusion that the new hybrid primitive is able to accurately and efficiently handle phantoms defined by a mixture of voxelized and analytical objects. In addition, two application-based evaluation studies in coronary angiography and intra-operative radiotherapy showed that the use of YVAN was 6.5% and 12.2% faster than the layered mass geometry method, respectively, without any associated loss of accuracy. However, the simplification advantages and differences in computational time improvements obtained with YVAN depend on the relative proportion of the analytical and voxelized structures used in the simulation as well as the size and number of triangles used in the description of the analytical object meshes.

  9. Higher geometry an introduction to advanced methods in analytic geometry

    CERN Document Server

    Woods, Frederick S

    2005-01-01

    For students of mathematics with a sound background in analytic geometry and some knowledge of determinants, this volume has long been among the best available expositions of advanced work on projective and algebraic geometry. Developed from Professor Woods' lectures at the Massachusetts Institute of Technology, it bridges the gap between intermediate studies in the field and highly specialized works.With exceptional thoroughness, it presents the most important general concepts and methods of advanced algebraic geometry (as distinguished from differential geometry). It offers a thorough study

  10. Advanced analytical techniques for boiling water reactor chemistry control

    Energy Technology Data Exchange (ETDEWEB)

    Alder, H P; Schenker, E [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-02-01

    The analytical techniques applied can be divided into 5 classes: OFF-LINE (discontinuous, central lab), AT-LINE (discontinuous, analysis near loop), ON-LINE (continuous, analysis in bypass). In all cases pressure and temperature of the water sample are reduced. In a strict sense only IN-LINE (continuous, flow disturbance) and NON-INVASIVE (continuous, no flow disturbance) techniques are suitable for direct process control, which is the ultimate goal. An overview of the analytical techniques tested in the pilot loop is given. Apart from process and overall water quality control, standard for BWR operation, the main emphasis is on water impurity characterization (crud particles, hot filtration, organic carbon); on stress corrosion cracking control for materials (corrosion potential, oxygen concentration); and on the characterization of the oxide layer on austenites (impedance spectroscopy, IR-reflection). The above mentioned examples of advanced analytical techniques have the potential for in-line or non-invasive application. They are at different stages of development and are described in more detail. 28 refs, 1 fig., 5 tabs.

  11. Analytical simulations in the field of two-phase flow

    International Nuclear Information System (INIS)

    Karwat, H.

    1978-01-01

    Power reactors are designed with engineered safeguards to cope with the consequences of possible failures or malfunctions. Experiments are carried out to verify the analytical simulations used in the design of these engineered safeguards. The paper discusses the basis for the verification of the analytical simulations, the requirements of corresponding experiments used to validate the analysis, and the necessary boundary conditions of the experiment as well as of the reactor systems. A detailed description of a typical boundary condition for real reactor systems is shown to be important, if experimental observations are to be interpreted correctly. Finally, the question will be addressed whether experiments on a larger scale than 1/1000 or 1/100 are necessary to extrapolate experimental observations to a full scale reactor situation. (author)

  12. Model-based Engineering for the Integration of Manufacturing Systems with Advanced Analytics

    OpenAIRE

    Lechevalier, David; Narayanan, Anantha; Rachuri, Sudarsan; Foufou, Sebti; Lee, Y Tina

    2016-01-01

    Part 3: Interoperability and Systems Integration. To employ data analytics effectively and efficiently on manufacturing systems, engineers and data scientists need to collaborate closely to bring their domain knowledge together. In this paper, we introduce a domain-specific modeling approach to integrate a manufacturing system model with advanced analytics, in particular neural networks, to model predictions. Our approach combines a set of meta-models and transformatio...

  13. Proceedings of third national symposium on recent advances in analytical sciences

    International Nuclear Information System (INIS)

    2010-04-01

    The contributions made by analytical scientists have played critical roles in areas ranging from the development of concepts and theories to a variety of practical applications such as mining, refining, fuel processing, fertilisers, food products, nano materials etc. The theme of the symposium, 'Recent Advances in Analytical Sciences and Applications', is highly significant in view of its importance in the design and development of new products as well as in environmental monitoring and quality control in industrial manufacturing. Papers relevant to INIS are indexed separately

  14. Simulation of fission products behavior in severe accidents for advanced passive PWR

    International Nuclear Information System (INIS)

    Tong, L.L.; Huang, G.F.; Cao, X.W.

    2015-01-01

    Highlights:
    • A fission product analysis model based on a thermal hydraulic module is developed.
    • An assessment method for fission product release and transport is constructed.
    • Fission product behavior during three modes of containment response is investigated.
    • Source term results for the three modes of containment response are obtained.
    Abstract: Fission product behavior for common Pressurized Water Reactor (PWR) designs has been studied for many years, and some analytical tools have been developed. However, studies specifically on the behavior of fission products in advanced passive PWRs are scarce. In the current study, design characteristics of the advanced passive PWR influencing fission product behavior are investigated. An integrated fission product analysis model based on a thermal hydraulic module is developed, and an assessment method for fission product release and transport for the advanced passive PWR is constructed. Three modes of containment response are simulated: intact containment, containment bypass and containment overpressure failure. The simulations cover fission product release from the core and corium, fission product transport and deposition in the Reactor Coolant System (RCS), and fission product transport and deposition in the containment, considering fission product retention in the in-containment refueling water storage tank (IRWST) and in the secondary side of the steam generators (SGs). Source term results for intact containment, containment bypass and containment overpressure failure are obtained, which can be utilized to evaluate the radiological consequences

  15. Manufacturing data analytics using a virtual factory representation.

    Science.gov (United States)

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach within the frameworks of Design Science Research Methodology and prototyping to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing via reduction of the development effort. The use of manufacturing simulation models is presented as data analytics applications themselves and for supporting other data analytics applications by serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. Virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities and thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  16. ANALYTICAL AND SIMULATION PLANNING MODEL OF URBAN PASSENGER TRANSPORT

    Directory of Open Access Journals (Sweden)

    Andrey Borisovich Nikolaev

    2017-09-01

    The article describes the structure of the analytical and simulation models used to make informed decisions in the planning of urban passenger transport. A UML diagram was designed that describes the relationships among the classes of the proposed model. The main agents of the model, developed in the AnyLogic simulation environment, are described. A user interface with GIS map integration was designed. Simulation results are also provided that support conclusions about the model's operability and the possibility of its use in solving planning problems of urban passenger transport.

  17. Web-based Visual Analytics for Extreme Scale Climate Science

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Evans, Katherine J [ORNL; Harney, John F [ORNL; Jewell, Brian C [ORNL; Shipman, Galen M [ORNL; Smith, Brian E [ORNL; Thornton, Peter E [ORNL; Williams, Dean N. [Lawrence Livermore National Laboratory (LLNL)

    2014-01-01

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  18. Validation of an advanced analytical procedure applied to the measurement of environmental radioactivity.

    Science.gov (United States)

    Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van

    2018-04-01

    In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in close-geometry gamma spectroscopy. It included the MCNP-CP code to calculate the coincidence summing correction factor (CSF). The CSF results were validated by a deterministic method using the ETNA code for both p-type HPGe detectors, showing good agreement between the two codes. Finally, the validity of the developed procedure was confirmed by a proficiency test to calculate the activities of various radionuclides. The radioactivity measurements with both detectors using the advanced analytical procedure received the 'Accepted' status in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Analytical simulation platform describing projections in computed tomography systems

    International Nuclear Information System (INIS)

    Youn, Hanbean; Kim, Ho Kyung

    2013-01-01

    To reduce the patient dose, several approaches, such as spectral imaging using photon counting detectors and statistical image reconstruction, are being considered. Although image-reconstruction algorithms may significantly enhance image quality in reconstructed images at low dose, true signal-to-noise properties are mainly determined by image quality in the projections. We are developing an analytical simulation platform describing projections to investigate how the quantum-interaction physics in each component of a CT system affects image quality in the projections. This simulator will be very useful for economical design and optimization of CT systems as well as for the development of novel image-reconstruction algorithms. In this study, we present the progress of development of the simulation platform with an emphasis on the theoretical framework describing the generation of projection data. We have prepared the analytical simulation platform describing projections in computed tomography systems. The remaining work before the meeting includes the following: each stage in the cascaded signal-transfer model for obtaining projections will be validated by Monte Carlo simulations; we will build up energy-dependent scatter and pixel-crosstalk kernels and show their effects on image quality in projections and reconstructed images; and we will investigate the effects of projections obtained under various imaging conditions and system (or detector) operation parameters on reconstructed images. It is challenging to include the interaction physics of photon-counting detectors in the simulation platform. Detailed descriptions of the simulator will be presented with discussions on its performance and limitations as well as Monte Carlo validations. Computational cost will also be addressed in detail. The proposed method in this study is simple and can be used conveniently in a lab environment
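
    The cascaded signal-transfer idea can be illustrated with a generic gain-stage propagation of mean and variance (a textbook-style sketch with assumed stage parameters, not the platform's actual model):

```python
# Hypothetical sketch: propagate mean quanta and variance through a chain of
# stochastic gain stages (cascaded linear-systems analysis). Stage gains and
# gain variances are assumed placeholders, not parameters of the platform
# described in the abstract.

def propagate(q_mean, q_var, stages):
    """Each stage is (gain_mean, gain_var). Burgess variance theorem:
    out_var = gain_mean^2 * in_var + in_mean * gain_var."""
    for g, g_var in stages:
        q_var = g * g * q_var + q_mean * g_var
        q_mean = g * q_mean
    return q_mean, q_var

# Example: Poisson input (variance = mean), then three assumed stages:
# detection (binomial selection), conversion gain, coupling loss.
q0 = 1.0e4
stages = [(0.7, 0.7 * (1 - 0.7)),   # binomial: gain variance = p*(1-p)
          (25.0, 100.0),            # conversion gain with spread
          (0.5, 0.5 * (1 - 0.5))]   # binomial coupling
mean, var = propagate(q0, q0, stages)
print(mean, var, mean / var**0.5)   # output mean, variance, and SNR
```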

  20. Big data analytics : predicting traffic flow regimes from simulated connected vehicle messages using data analytics and machine learning.

    Science.gov (United States)

    2016-12-25

    The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...
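
    A generic illustration of that idea (assumed synthetic features, labels and model choice; not the study's message protocol or analytics) is a simple regime classifier trained on simulated speed/density observations:

```python
# Hypothetical sketch: classify traffic flow regimes from synthetic
# "connected vehicle" features (mean speed, density proxy). The features,
# thresholds and model choice are assumptions for illustration; they are not
# the message protocol or analytics developed in the study.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 2000
speed = rng.uniform(5, 70, n)       # mph, from simulated messages
density = rng.uniform(5, 120, n)    # vehicles per mile (proxy)

# Assumed labeling rule: 0 = free flow, 1 = transitional, 2 = congested
labels = np.where(density < 30, 0, np.where(density < 70, 1, 2))

X = np.column_stack([speed, density])
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)
print(clf.predict([[60.0, 20.0], [25.0, 90.0]]))   # expect [0, 2]
```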

  1. Recent advancements in medical simulation: patient-specific virtual reality simulation.

    Science.gov (United States)

    Willaert, Willem I M; Aggarwal, Rajesh; Van Herzeele, Isabelle; Cheshire, Nicholas J; Vermassen, Frank E

    2012-07-01

    Patient-specific virtual reality simulation (PSVR) is a new technological advancement that allows practice of upcoming real operations and complements the established role of VR simulation as a generic training tool. This review describes current developments in PSVR and draws parallels with other high-stake industries, such as aviation, military, and sports. A review of the literature was performed using PubMed and Internet search engines to retrieve data relevant to PSVR in medicine. All reports pertaining to PSVR were included. Reports on simulators that did not incorporate a haptic interface device were excluded from the review. Fifteen reports described 12 simulators that enabled PSVR. Medical procedures in the field of laparoscopy, vascular surgery, orthopedics, neurosurgery, and plastic surgery were included. In all cases, source data was two-dimensional CT or MRI data. Face validity was most commonly reported. Only one (vascular) simulator had undergone face, content, and construct validity. Of the 12 simulators, 1 is commercialized and 11 are prototypes. Five simulators have been used in conjunction with real patient procedures. PSVR is a promising technological advance within medicine. The majority of simulators are still in the prototype phase. As further developments unfold, the validity of PSVR will have to be examined much like generic VR simulation for training purposes. Nonetheless, similar to the aviation, military, and sport industries, operative performance and patient safety may be enhanced by the application of this novel technology.

  2. Improving the trust in results of numerical simulations and scientific data analytics

    Energy Technology Data Exchange (ETDEWEB)

    Cappello, Franck [Argonne National Lab. (ANL), Argonne, IL (United States); Constantinescu, Emil [Argonne National Lab. (ANL), Argonne, IL (United States); Hovland, Paul [Argonne National Lab. (ANL), Argonne, IL (United States); Peterka, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Phillips, Carolyn [Argonne National Lab. (ANL), Argonne, IL (United States); Snir, Marc [Argonne National Lab. (ANL), Argonne, IL (United States); Wild, Stefan [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-04-30

    This white paper investigates several key aspects of the trust that a user can give to the results of numerical simulations and scientific data analytics. In this document, the notion of trust is related to the integrity of numerical simulations and data analytics applications. This white paper complements the DOE ASCR report on Cybersecurity for Scientific Computing Integrity by (1) exploring the sources of trust loss; (2) reviewing the definitions of trust in several areas; (3) providing numerous cases of result alteration, some of them leading to catastrophic failures; (4) examining the current notion of trust in numerical simulation and scientific data analytics; (5) providing a gap analysis; and (6) suggesting two important research directions and their respective research topics. To simplify the presentation without loss of generality, we consider that trust in results can be lost (or the results’ integrity impaired) because of any form of corruption happening during the execution of the numerical simulation or the data analytics application. In general, the sources of such corruption are threefold: errors, bugs, and attacks. Current applications are already using techniques to deal with different types of corruption. However, not all potential corruptions are covered by these techniques. We firmly believe that the current level of trust that a user has in the results is at least partially founded on ignorance of this issue or the hope that no undetected corruptions will occur during the execution. This white paper explores the notion of trust and suggests recommendations for developing a more scientifically grounded notion of trust in numerical simulation and scientific data analytics. We first formulate the problem and show that it goes beyond previous questions regarding the quality of results such as V&V, uncertainty quantification, and data assimilation. We then explore the complexity of this difficult problem, and we sketch complementary general

  3. Optoelectronic Devices Advanced Simulation and Analysis

    CERN Document Server

    Piprek, Joachim

    2005-01-01

    Optoelectronic devices transform electrical signals into optical signals and vice versa by utilizing the sophisticated interaction of electrons and light within micro- and nano-scale semiconductor structures. Advanced software tools for design and analysis of such devices have been developed in recent years. However, the large variety of materials, devices, physical mechanisms, and modeling approaches often makes it difficult to select appropriate theoretical models or software packages. This book presents a review of devices and advanced simulation approaches written by leading researchers and software developers. It is intended for scientists and device engineers in optoelectronics, who are interested in using advanced software tools. Each chapter includes the theoretical background as well as practical simulation results that help to better understand internal device physics. The software packages used in the book are available to the public, on a commercial or noncommercial basis, so that the interested r...

  4. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides

    International Nuclear Information System (INIS)

    Gonzalez, Jorge A. Carrazana; Ferrera, Eduardo A. Capote; Gomez, Isis M. Fernandez; Castro, Gloria V. Rodriguez; Ricardo, Niury Martinez

    2013-01-01

    This work shows how the traceability of analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for applying corrections for differences in chemical composition, density and height of the samples analyzed. Likewise, the results obtained by the LVRA in two exercises organized by the International Atomic Energy Agency (IAEA) are presented. In these exercises (an intercomparison and a proficiency test), all reported analytical results were obtained based on efficiency calibrations by Monte Carlo simulation using the DETEFF program
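
    A minimal illustration of calibration by simulation (not the DETEFF program; it ignores attenuation and intrinsic efficiency, and the geometry is assumed) is a Monte Carlo estimate of the geometric efficiency of a detector face viewed by an on-axis point source:

```python
# Hypothetical sketch: Monte Carlo estimate of the geometric efficiency of a
# circular detector face for an isotropic point source on its axis. This
# illustrates the calibration-by-simulation idea only; it is not DETEFF.
import math
import random

def geometric_efficiency(source_height, det_radius, n_samples=200_000):
    hits = 0
    for _ in range(n_samples):
        # sample an isotropic direction (cos(theta) uniform in [-1, 1])
        cos_t = random.uniform(-1.0, 1.0)
        if cos_t <= 0.0:
            continue                      # emitted away from the detector
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        # radial offset where the ray crosses the detector plane, which lies
        # a distance source_height from the source along the symmetry axis
        r = source_height * sin_t / cos_t
        if r <= det_radius:
            hits += 1
    return hits / n_samples

# Assumed geometry: source 5 cm above a detector face of radius 3 cm.
# Analytic value 0.5 * (1 - h / sqrt(h^2 + R^2)) ~= 0.071 for these numbers.
print(geometric_efficiency(source_height=5.0, det_radius=3.0))
```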

  5. Discrete event simulation of the Defense Waste Processing Facility (DWPF) analytical laboratory

    International Nuclear Information System (INIS)

    Shanahan, K.L.

    1992-02-01

    A discrete event simulation of the Savannah River Site (SRS) Defense Waste Processing Facility (DWPF) analytical laboratory has been constructed in the GPSS language. It was used to estimate laboratory analysis times at process analytical hold points and to study the effect of sample number on those times. Typical results are presented for three different simulations representing increasing levels of complexity, and for different sampling schemes. Example equipment utilization time plots are also included. SRS DWPF laboratory management and chemists found the simulations very useful for resource and schedule planning
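
    A generic single-server queueing sketch of sample turnaround in an analytical laboratory is shown below (written in Python with assumed arrival and analysis times; it is not the GPSS model of the study):

```python
# Hypothetical sketch: a minimal single-server queueing simulation of samples
# waiting for one analyzer. Arrival rate and analysis time are assumed
# placeholder values, not parameters of the DWPF laboratory model.
import random

def simulate_lab(n_samples=1000, mean_interarrival=2.0, mean_analysis=1.5):
    random.seed(1)
    t = 0.0
    arrivals = []
    for _ in range(n_samples):
        t += random.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    analyzer_free_at = 0.0
    turnaround = []
    for arr in arrivals:
        start = max(arr, analyzer_free_at)          # wait if analyzer is busy
        service = random.expovariate(1.0 / mean_analysis)
        analyzer_free_at = start + service
        turnaround.append(analyzer_free_at - arr)   # wait + analysis time
    return sum(turnaround) / len(turnaround)

print("mean turnaround time:", simulate_lab())
```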

  6. Using Big Data Analytics to Advance Precision Radiation Oncology.

    Science.gov (United States)

    McNutt, Todd R; Benedict, Stanley H; Low, Daniel A; Moore, Kevin; Shpitser, Ilya; Jiang, Wei; Lakshminarayanan, Pranav; Cheng, Zhi; Han, Peijin; Hui, Xuan; Nakatsugawa, Minoru; Lee, Junghoon; Moore, Joseph A; Robertson, Scott P; Shah, Veeraj; Taylor, Russ; Quon, Harry; Wong, John; DeWeese, Theodore

    2018-06-01

    Big clinical data analytics as a primary component of precision medicine is discussed, identifying where these emerging tools fit in the spectrum of genomics and radiomics research. A learning health system (LHS) is conceptualized that uses clinically acquired data with machine learning to advance the initiatives of precision medicine. The LHS is comprehensive and can be used for clinical decision support, discovery, and hypothesis derivation. These developing uses can positively impact the ultimate management and therapeutic course for patients. The conceptual model for each use of clinical data, however, is different, and an overview of the implications is discussed. With advancements in technologies and culture to improve the efficiency, accuracy, and breadth of measurements of the patient condition, the concept of an LHS may be realized in precision radiation therapy. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Analytical solutions and particle simulations of cross-field plasma sheaths

    International Nuclear Information System (INIS)

    Gerver, M.J.; Parker, S.E.; Theilhaber, K.

    1989-01-01

    Particle simulations have been made of an infinite plasma slab, bounded by absorbing conducting walls, with a magnetic field parallel to the walls. The simulations have been either 1-D, or 2-D with the magnetic field normal to the simulation plane. Initially, the plasma has a uniform density between the walls, and there is a uniform source of ions and electrons to replace particles lost to the walls. In the 1-D case, there is no diffusion of the particle guiding centers, and the plasma remains uniform in density and potential over most of the slab, with sheaths about a Debye length wide where the potential rises to the wall potential. In the 2-D case, the density profile becomes parabolic, going almost to zero at the walls, and there is a quasineutral presheath in the bulk of the plasma, in addition to sheaths near the walls. Analytic expressions are found for the density and potential profiles in both cases, including, in the 2-D case, the magnetic presheath due to finite ion Larmor radius, and the effects of the guiding center diffusion rate being either much less than or much greater than the energy diffusion rate. These analytic expressions are shown to agree with the simulations. A 1-D simulation with Monte Carlo guiding center diffusion included gives results that are in good agreement with the much more expensive 2-D simulation. 17 refs., 10 figs
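
    A small companion estimate (a textbook flux-balance result under assumed parameters, not taken from these simulations) gives the floating-wall potential drop across such a sheath:

```python
# Hypothetical sketch: floating sheath potential drop for a plasma bounded by
# an absorbing wall, from flux balance of Bohm-speed ions and thermal
# electrons: delta_V ~= (Te/2) * ln(m_i / (2*pi*m_e)). Te and ion mass are
# assumed values, not taken from the simulations in the abstract.
import math

M_E = 9.109e-31     # electron mass [kg]
M_P = 1.673e-27     # proton mass [kg]

def floating_potential_drop(te_ev, ion_mass_amu):
    """Potential drop [V] between plasma and a floating absorbing wall."""
    m_i = ion_mass_amu * M_P
    return 0.5 * te_ev * math.log(m_i / (2.0 * math.pi * M_E))

# Assumed: Te = 10 eV, singly charged hydrogen ions
print(floating_potential_drop(te_ev=10.0, ion_mass_amu=1.0))
```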

  8. Advances in social simulation 2015

    CERN Document Server

    Verbrugge, Rineke; Flache, Andreas; Roo, Gert; Hoogduin, Lex; Hemelrijk, Charlotte

    2017-01-01

    This book highlights recent developments in the field, presented at the Social Simulation 2015 conference in Groningen, The Netherlands. It covers advances both in applications and methods of social simulation. Societal issues addressed range across complexities in economic systems, opinion dynamics and civil violence, changing mobility patterns, different land-use, transition in the energy system, food production and consumption, ecosystem management and historical processes. Methodological developments cover how to use empirical data in validating models in general, formalization of behavioral theory in agent behavior, construction of artificial populations for experimentation, replication of models, and agent-based models that can be run in a web browser. Social simulation is a rapidly evolving field. Social scientists are increasingly interested in social simulation as a tool to tackle the complex non-linear dynamics of society. Furthermore, the software and hardware tools available for social simulation ...

  9. Universality and Realistic Extensions to the Semi-Analytic Simulation Principle in GNSS Signal Processing

    Directory of Open Access Journals (Sweden)

    O. Jakubov

    2012-06-01

    The semi-analytic simulation principle in GNSS signal processing bypasses the bit-true operations at high sampling frequency. Instead, signals at the output branches of the integrate&dump blocks are successfully modeled, thus making extensive Monte Carlo simulations feasible. Methods for simulations of code and carrier tracking loops with BPSK and BOC signals have been introduced in the literature, and Matlab toolboxes were designed and published. In this paper, we further extend the applicability of the approach. Firstly, we describe any GNSS signal as a special instance of linear multi-dimensional modulation. Thereby, we state a universal framework for classification of differently modulated signals. Using such a description, we derive the semi-analytic models generally. Secondly, we extend the model to realistic scenarios including delay in the feedback, slowly fading multipath effects, finite bandwidth, phase noise, and a combination of these. Finally, a discussion on the connection of this semi-analytic model and the position-velocity-time estimator is delivered, as well as a comparison of theoretical and simulated characteristics produced by a prototype simulator developed at CTU in Prague.
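
    A minimal sketch of the semi-analytic principle (assuming textbook correlator statistics, not the toolbox described here): integrate-and-dump outputs are drawn from their analytical mean, set by the BPSK code autocorrelation, plus unit-variance Gaussian noise.

```python
# Hypothetical sketch of the semi-analytic simulation principle: instead of
# generating chip-rate samples, the integrate-and-dump correlator outputs are
# drawn from their statistical model. For a BPSK code the normalized
# autocorrelation is a triangle; noise terms are unit-variance Gaussian.
# C/N0, integration time and correlator spacing are assumed values.
import math
import random

def triangle_corr(tau_chips):
    """Normalized BPSK code autocorrelation R(tau), tau in chips."""
    return max(0.0, 1.0 - abs(tau_chips))

def correlator_output(tau_chips, cn0_dbhz, t_int):
    """One in-phase correlator sample: analytic mean + Gaussian noise."""
    cn0 = 10.0 ** (cn0_dbhz / 10.0)
    amp = math.sqrt(2.0 * cn0 * t_int)          # post-integration amplitude
    return amp * triangle_corr(tau_chips) + random.gauss(0.0, 1.0)

# Coherent early-minus-late discriminator with 0.5-chip spacing
tau = 0.1                                        # true code error [chips]
early = correlator_output(tau - 0.25, cn0_dbhz=45.0, t_int=1e-3)
late = correlator_output(tau + 0.25, cn0_dbhz=45.0, t_int=1e-3)
print("E-L discriminator:", early - late)
```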

  10. Coupling impedance of an in-vacuum undulator: Measurement, simulation, and analytical estimation

    Science.gov (United States)

    Smaluk, Victor; Fielder, Richard; Blednykh, Alexei; Rehm, Guenther; Bartolini, Riccardo

    2014-07-01

    One of the important issues of the in-vacuum undulator design is the coupling impedance of the vacuum chamber, which includes tapered transitions with variable gap size. To get complete and reliable information on the impedance, analytical estimates, numerical simulations and beam-based measurements have been performed at Diamond Light Source, a forthcoming upgrade of which includes introducing additional insertion device (ID) straights. The impedance of an already existing ID vessel geometrically similar to the new one has been measured using the orbit bump method. The measurement results in comparison with analytical estimations and numerical simulations are discussed in this paper.

  11. Fast 2D Fluid-Analytical Simulation of IEDs and Plasma Uniformity in Multi-frequency CCPs

    Science.gov (United States)

    Kawamura, E.; Lieberman, M. A.; Graves, D. B.

    2014-10-01

    A fast 2D axisymmetric fluid-analytical model using the finite elements tool COMSOL is interfaced with a 1D particle-in-cell (PIC) code to study ion energy distributions (IEDs) in multi-frequency argon capacitively coupled plasmas (CCPs). A bulk fluid plasma model which solves the time-dependent plasma fluid equations is coupled with an analytical sheath model which solves for the sheath parameters. The fluid-analytical results are used as input to a PIC simulation of the sheath region of the discharge to obtain the IEDs at the wafer electrode. Each fluid-analytical-PIC simulation on a moderate 2.2 GHz CPU workstation with 8 GB of memory took about 15-20 minutes. The 2D multi-frequency fluid-analytical model was compared to 1D PIC simulations of a symmetric parallel plate discharge, showing good agreement. Fluid-analytical simulations of a 2/60/162 MHz argon CCP with a typical asymmetric reactor geometry were also conducted. The low 2 MHz frequency controlled the sheath width and voltage while the higher frequencies controlled the plasma production. A standing wave was observable at the highest frequency of 162 MHz. Adding 2 MHz power to a 60 MHz discharge or 162 MHz to a dual frequency 2 MHz/60 MHz discharge enhanced the plasma uniformity. This work was supported by the Department of Energy Office of Fusion Energy Science Contract DE-SC000193, and in part by gifts from Lam Research Corporation and Micron Corporation.

  12. Coupling impedance of an in-vacuum undulator: Measurement, simulation, and analytical estimation

    Directory of Open Access Journals (Sweden)

    Victor Smaluk

    2014-07-01

    Full Text Available One of the important issues of the in-vacuum undulator design is the coupling impedance of the vacuum chamber, which includes tapered transitions with variable gap size. To get complete and reliable information on the impedance, analytical estimates, numerical simulations and beam-based measurements have been performed at Diamond Light Source, a forthcoming upgrade of which includes introducing additional insertion device (ID) straights. The impedance of an already existing ID vessel geometrically similar to the new one has been measured using the orbit bump method. The measurement results in comparison with analytical estimations and numerical simulations are discussed in this paper.

  13. Advanced training simulator models. Implementation and validation

    International Nuclear Information System (INIS)

    Borkowsky, Jeffrey; Judd, Jerry; Belblidia, Lotfi; O'farrell, David; Andersen, Peter

    2008-01-01

    Modern training simulators are required to replicate plant data for both thermal-hydraulic and neutronic response. Replication is required such that reactivity manipulation on the simulator properly trains the operator for reactivity manipulation at the plant. This paper discusses advanced models which perform this function in real-time using the coupled code system THOR/S3R. This code system models all fluid systems in detail using an advanced, two-phase thermal-hydraulic model. The nuclear core is modeled using an advanced, three-dimensional nodal method and also by using cycle-specific nuclear data. These models are configured to run interactively from a graphical instructor station or hardware operation panels. The simulator models are theoretically rigorous and are expected to replicate the physics of the plant. However, to verify replication, the models must be independently assessed. Plant data is the preferred validation method, but plant data is often not available for many important training scenarios. In the absence of data, validation may be obtained by slower-than-real-time transient analysis. This analysis can be performed by coupling a safety analysis code and a core design code. Such a coupling exists between the codes RELAP5 and SIMULATE-3K (S3K). RELAP5/S3K is used to validate the real-time model for several postulated plant events. (author)

  14. Just-in-Time Data Analytics and Visualization of Climate Simulations using the Bellerophon Framework

    Science.gov (United States)

    Anantharaj, V. G.; Venzke, J.; Lingerfelt, E.; Messer, B.

    2015-12-01

    Climate model simulations are used to understand the evolution and variability of earth's climate. Unfortunately, high-resolution multi-decadal climate simulations can take days to weeks to complete. Typically, the simulation results are not analyzed until the model runs have ended. During the course of the simulation, the output may be processed periodically to ensure that the model is performing as expected. However, most of the data analytics and visualization are not performed until the simulation is finished. The lengthy time period needed for the completion of the simulation constrains the productivity of climate scientists. Our implementation of near-real-time data visualization and analytics capabilities allows scientists to monitor the progress of their simulations while the model is running. Our analytics software executes concurrently in a co-scheduling mode, monitoring data production. When new data are generated by the simulation, a co-scheduled data analytics job is submitted to render visualization artifacts of the latest results. These visualization outputs are automatically transferred to Bellerophon's data server located at ORNL's Compute and Data Environment for Science (CADES), where they are processed and archived into Bellerophon's database. During the course of the experiment, climate scientists can then use Bellerophon's graphical user interface to view animated plots and their associated metadata. The quick turnaround from the start of the simulation until the data are analyzed permits research decisions and projections to be made days or sometimes even weeks sooner than otherwise possible! The supercomputer resources used to run the simulation are unaffected by co-scheduling the data visualization jobs, so the model runs continuously while the data are visualized. Our just-in-time data visualization software looks to increase climate scientists' productivity as climate modeling moves into the exascale era of computing.
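    The co-scheduling pattern described above can be pictured with a small, purely illustrative watcher loop; the output path, file pattern, batch submit command and polling interval are assumptions made for this sketch and are not Bellerophon's actual interface.

```python
# Illustrative co-scheduling sketch: poll the simulation output directory and
# submit a separate visualization job whenever new files appear, leaving the
# running simulation untouched. All names below are hypothetical.
import subprocess
import time
from pathlib import Path

OUTPUT_DIR = Path("/lustre/climate_run/output")   # assumed output location
seen = set()

while True:
    new_files = sorted(set(OUTPUT_DIR.glob("*.nc")) - seen)
    for f in new_files:
        # Co-scheduled analytics/rendering job on its own allocation.
        subprocess.Popen(["sbatch", "render_plots.sh", str(f)])
        seen.add(f)
    time.sleep(300)   # check for new output every five minutes
```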

  15. Free-boundary simulations of ITER advanced scenarios

    International Nuclear Information System (INIS)

    Besseghir, K.

    2013-06-01

    The successful operation of ITER advanced scenarios is likely to be a major step forward in the development of controlled fusion as a power production source. ITER advanced scenarios raise specific challenges that are not encountered in presently-operated tokamaks. In this thesis, it is argued that ITER advanced operation may benefit from optimal control techniques. Optimal control ensures high performance operation while guaranteeing tokamak integrity. The application of optimal control techniques for ITER operation is assessed and it is concluded that robust optimisation is appropriate for ITER operation of advanced scenarios. Real-time optimisation schemes are discussed and it is concluded that the necessary conditions of optimality tracking approach may potentially be appropriate for ITER operation, thus offering a viable closed-loop optimal control approach. Simulations of ITER advanced operation are necessary in order to assess the present ITER design and uncover the main difficulties that may be encountered during advanced operation. The DINA-CH and CRONOS full tokamak simulator is used to simulate the operation of the ITER hybrid and steady-state scenarios. It is concluded that the present ITER design is appropriate for performing a hybrid scenario pulse lasting more than 1000 sec, with a flat-top plasma current of 12 MA, and a fusion gain of Q ≅ 8. Similarly, a steady-state scenario without internal transport barrier, with a flat-top plasma current of 10 MA, and with a fusion gain of Q ≅ 5 can be realised using the present ITER design. The sensitivity of the advanced scenarios with respect to transport models and physical assumptions is assessed using CRONOS. It is concluded that the hybrid scenario and the steady-state scenario are highly sensitive to the L-H transition timing, to the value of the confinement enhancement factor, to the heating and current drive scenario during ramp-up, and, to a lesser extent, to the density peaking and pedestal

  16. Free-boundary simulations of ITER advanced scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Besseghir, K.

    2013-06-15

    The successful operation of ITER advanced scenarios is likely to be a major step forward in the development of controlled fusion as a power production source. ITER advanced scenarios raise specific challenges that are not encountered in presently-operated tokamaks. In this thesis, it is argued that ITER advanced operation may benefit from optimal control techniques. Optimal control ensures high performance operation while guaranteeing tokamak integrity. The application of optimal control techniques for ITER operation is assessed and it is concluded that robust optimisation is appropriate for ITER operation of advanced scenarios. Real-time optimisation schemes are discussed and it is concluded that the necessary conditions of optimality tracking approach may potentially be appropriate for ITER operation, thus offering a viable closed-loop optimal control approach. Simulations of ITER advanced operation are necessary in order to assess the present ITER design and uncover the main difficulties that may be encountered during advanced operation. The DINA-CH and CRONOS full tokamak simulator is used to simulate the operation of the ITER hybrid and steady-state scenarios. It is concluded that the present ITER design is appropriate for performing a hybrid scenario pulse lasting more than 1000 sec, with a flat-top plasma current of 12 MA, and a fusion gain of Q ≅ 8. Similarly, a steady-state scenario without internal transport barrier, with a flat-top plasma current of 10 MA, and with a fusion gain of Q ≅ 5 can be realised using the present ITER design. The sensitivity of the advanced scenarios with respect to transport models and physical assumptions is assessed using CRONOS. It is concluded that the hybrid scenario and the steady-state scenario are highly sensitive to the L-H transition timing, to the value of the confinement enhancement factor, to the heating and current drive scenario during ramp-up, and, to a lesser extent, to the density peaking and pedestal

  17. Simulation of advanced ultrasound systems using Field II

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2004-01-01

    impulse responses is explained. A simulation example for a synthetic aperture spread spectrum flow systems is described. It is shown how the advanced coded excitation can be set up, and how the simulation can be parallelized to reduce the simulation time from 17 months to 391 hours using a 32 CPU Linux...
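    For orientation, the quoted reduction from about 17 months to 391 hours corresponds to a speed-up of roughly 12,400 h / 391 h ≈ 32 (taking a month as roughly 730 hours), i.e. close to linear scaling across the 32 CPUs of the Linux cluster mentioned in this truncated record.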

  18. Analytical tools for thermal infrared engineering: a thermal sensor simulation package

    Science.gov (United States)

    Jaggi, Sandeep

    1992-09-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration. To perform system design trade-offs and analyses, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system level information such as SNR, NER, NETD, etc. This paper describes the uses of the package and the physics used to derive the performance parameters. In addition, ATTIRE can be used as a tutorial for understanding the distribution of thermal flux or solar irradiance over selected bandwidths of the spectrum. This spectrally distributed incident flux can then be analyzed as it propagates through the subsystems that constitute the entire sensor. ATTIRE provides a variety of functions ranging from plotting black-body curves for varying bandwidths and computing the integral flux, to performing transfer function analysis of the sensor system. The package runs from a menu-driven interface in a PC-DOS environment. Each sub-system of the sensor is represented by windows and icons. A user-friendly mouse-controlled point-and-click interface allows the user to simulate various aspects of a sensor. The package can simulate a theoretical sensor system. Trade-off studies can be easily done by changing the appropriate parameters and monitoring the effect on system performance. The package can provide plots of system performance versus any system parameter. A parameter (such as the entrance aperture of the optics) could be varied and its effect on another parameter (e.g., NETD) can be plotted. A third parameter (e.g., the
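    As a flavour of the kind of calculation such a package performs, the sketch below integrates the Planck spectral radiance over a sensor bandwidth; the 8-12 micrometre band and the 300 K scene temperature are assumed example values, not figures taken from ATTIRE.

```python
# Hedged illustration of an in-band radiance calculation: integrate the Planck
# spectral radiance over a sensor bandwidth (band limits and scene temperature
# are assumed example values).
import numpy as np

h  = 6.626e-34   # Planck constant, J s
c  = 2.998e8     # speed of light, m/s
kB = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(wavelength_m, T):
    """Spectral radiance B_lambda in W m^-2 sr^-1 m^-1."""
    a = 2.0 * h * c**2 / wavelength_m**5
    b = np.expm1(h * c / (wavelength_m * kB * T))
    return a / b

lam = np.linspace(8e-6, 12e-6, 2001)                       # 8-12 um band
B = planck_radiance(lam, 300.0)
L_band = np.sum(0.5 * (B[:-1] + B[1:]) * np.diff(lam))     # trapezoidal integral
print(f"In-band radiance at 300 K: {L_band:.1f} W m^-2 sr^-1")
```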

  19. Lessons Learned From Dynamic Simulations of Advanced Fuel Cycles

    International Nuclear Information System (INIS)

    Piet, Steven J.; Dixon, Brent W.; Jacobson, Jacob J.; Matthern, Gretchen E.; Shropshire, David E.

    2009-01-01

    Years of performing dynamic simulations of advanced nuclear fuel cycle options provide insights into how they could work and how one might transition from the current once-through fuel cycle. This paper summarizes those insights from the context of the 2005 objectives and goals of the Advanced Fuel Cycle Initiative (AFCI). Our intent is not to compare options, assess options versus those objectives and goals, nor recommend changes to those objectives and goals. Rather, we organize what we have learned from dynamic simulations in the context of the AFCI objectives for waste management, proliferation resistance, uranium utilization, and economics. Thus, we do not merely describe 'lessons learned' from dynamic simulations but attempt to answer the 'so what' question by using this context. The analyses have been performed using the Verifiable Fuel Cycle Simulation of Nuclear Fuel Cycle Dynamics (VISION). We observe that the 2005 objectives and goals do not address many of the inherently dynamic discriminators among advanced fuel cycle options and transitions thereof

  20. Quantitative Comparison of Ternary Eutectic Phase-Field Simulations with Analytical 3D Jackson-Hunt Approaches

    Science.gov (United States)

    Steinmetz, Philipp; Kellner, Michael; Hötzer, Johannes; Nestler, Britta

    2018-02-01

    For the analytical description of the relationship between undercoolings, lamellar spacings and growth velocities during the directional solidification of ternary eutectics in 2D and 3D, different extensions based on the theory of Jackson and Hunt are reported in the literature. Besides analytical approaches, the phase-field method has been established to study the spatially complex microstructure evolution during the solidification of eutectic alloys. The understanding of the fundamental mechanisms controlling the morphology development in multiphase, multicomponent systems is of high interest. For this purpose, a comparison is made between the analytical extensions and three-dimensional phase-field simulations of directional solidification in an ideal ternary eutectic system. Based on the agreement observed in two-dimensional validation cases, the experimentally reported, inherently three-dimensional chain-like pattern is investigated in extensive simulation studies. The results are quantitatively compared with the analytical results reported in the literature, and with a newly derived approach which uses equal undercoolings. Good agreement of the undercooling-spacing characteristics between the simulations and the analytical Jackson-Hunt approaches is found. The results show that the applied phase-field model, which is based on the Grand potential approach, is able to describe the analytically predicted relationship between the undercooling and the lamellar arrangements during the directional solidification of a ternary eutectic system in 3D.
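    For reference, the classical binary Jackson-Hunt relation that these ternary extensions generalize is quoted below in its standard textbook form (K_1 and K_2 collect material and phase-diagram constants); the specific ternary expressions used in the paper differ in detail.

```latex
% Classical (binary) Jackson-Hunt relation between undercooling \Delta T,
% growth velocity v and lamellar spacing \lambda; minimizing over \lambda
% gives the operating point usually compared with simulations.
\Delta T(\lambda, v) = K_1\, v\, \lambda + \frac{K_2}{\lambda},
\qquad
\lambda_{\min} = \sqrt{\frac{K_2}{K_1 v}},
\qquad
\Delta T_{\min} = 2\sqrt{K_1 K_2\, v}.
```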

  1. 35. Conference of the DVM Working Group on Fracture Processes: Advances in fracture and damage mechanics - simulation methods of fracture mechanics

    International Nuclear Information System (INIS)

    2003-01-01

    Subjects of the meeting were: Simulation of fatigue crack growth in real structures using FEA (M. Fulland, Paderborn); Modelling of ductile crack growth (W. Brocks, Geesthacht); Advances in non-local modelling of ductile damage (F. Reusch et al., Berlin, Dortmund); Fracture mechanics of ceramics (D. Munz, Karlsruhe); From materials testing to vehicle crash testing (J.G. Blauel, Freiburg); Analytical simulation of crack growth in thin-walled structures (U. Zerbst, Geesthacht); The influence of intrinsic stresses on fatigue crack growth (C. Dalle Donne etc., Cologne, Dortmund, Pisa, and M. Sander, Paderborn); Fracture mechanical strength calculation in case of mixed mode loads on cracks (H.A. Richard, Paderborn); Numeric simulation of intrinsic stresses during welding (C. Veneziano, Freiburg); New research fields of the Fraunhofer-Institut fuer Werkstoffmechanik (P. Gumbsch, Head of the Institute, Freiburg); Modern developments and advances in fracture and damage mechanics; Numeric and experimental simulation of crack propagation and damage processes; Exemplary damage cases; Fracture mechanics in product development; Failure characteristics of lightweight constructional materials and joints [de]

  2. Advanced ST plasma scenario simulations for NSTX

    International Nuclear Information System (INIS)

    Kessel, C.E.; Synakowski, E.J.; Gates, D.A.; Kaye, S.M.; Menard, J.; Phillips, C.K.; Taylor, G.; Wilson, R.; Harvey, R.W.; Mau, T.K.

    2005-01-01

    Integrated scenario simulations are done for NSTX that address four primary milestones for developing advanced ST configurations: high β and high β_N inductive discharges to study all aspects of ST physics in the high beta regime; non-inductively sustained discharges for flattop times greater than the skin time to study the various current drive techniques; non-inductively sustained discharges at high β for flattop times much greater than a skin time which provides the integrated advanced ST target for NSTX; and non-solenoidal startup and plasma current rampup. The simulations done here use the Tokamak Simulation Code (TSC) and are based on a discharge 109070. TRANSP analysis of the discharge provided the thermal diffusivities for electrons and ions, the neutral beam (NB) deposition profile and other characteristics. CURRAY is used to calculate the High Harmonic Fast Wave (HHFW) heating depositions and current drive. GENRAY/CQL3D is used to establish the heating and CD deposition profiles for electron Bernstein waves (EBW). Analysis of the ideal MHD stability is done with JSOLVER, BALMSC, and PEST2. The simulations indicate that the integrated advanced ST plasma is reachable, obtaining stable plasmas with β ∼ 40% at β_N's of 7.7-9, I_P = 1.0 MA and B_T = 0.35 T. The plasma is 100% non-inductive and has a flattop of 4 skin times. The resulting global energy confinement corresponds to a multiplier of H_98(y,2) = 1.5. The simulations have demonstrated the importance of HHFW heating and CD, EBW off-axis CD, strong plasma shaping, density control, and early heating/H-mode transition for producing and optimizing these plasma configurations (author)

  3. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  4. Advanced Analytics service to enhance workflow control at the ATLAS Production System

    CERN Document Server

    Titov, Mikhail; The ATLAS collaboration

    2018-01-01

    Modern workload management systems that are responsible for central data production and processing in High Energy and Nuclear Physics experiments have highly complicated architectures and require a specialized control service for resource and processing components balancing. Such a service represents a comprehensive set of analytical tools, management utilities and monitoring views aimed at providing a deep understanding of internal processes, and is considered as an extension for situational awareness analytic service. Its key points are analysis of task processing, e.g., selection and regulation of key task features that affect its processing the most; modeling of processed data lifecycles for further analysis, e.g., generate guidelines for particular stage of data processing; and forecasting processes with focus on data and tasks states as well as on the management system itself, e.g., to detect the source of any potential malfunction. The prototype of the advanced analytics service will be an essential pa...

  5. Application of system reliability analytical method, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    1999-01-01

    The Ship Research Institute has been carrying out a developmental study of the GO-FLOW method, a system reliability analysis method that occupies a central part of PSA (Probabilistic Safety Assessment), adding various advanced functionalities. The aims were to upgrade the functionality of the GO-FLOW method, to develop an analysis capability that integrates dynamic behavior analysis, physical behavior and probabilistic transitions, and to prepare a function for picking out the main accident sequences. In fiscal year 1997, an analytical function was developed for the dynamic event-tree analysis system by adding dependencies between headings. In the accident-sequence simulation function, the main accident sequences of the MRX improved marine propulsion reactor can now be covered completely. In addition, input data for the analysis can be prepared with a function that an analysis operator can set up easily. (G.K.)

  6. AN ADVANCED PLACEMENT COURSE IN ANALYTIC GEOMETRY AND CALCULUS (MATHEMATICS XV X AP).

    Science.gov (United States)

    DEROLF, JOHN J.; MIENTKA, WALTER E.

    THIS TEXT ON ANALYTIC GEOMETRY AND CALCULUS IS A CORRESPONDENCE COURSE DESIGNED FOR ADVANCED PLACEMENT OF HIGH SCHOOL STUDENTS IN COLLEGE. EACH OF THE 21 LESSONS INCLUDES READING ASSIGNMENTS AND LISTS OF PROBLEMS TO BE WORKED. IN ADDITION, SUPPLEMENTARY EXPLANATIONS AND COMMENTS ARE INCLUDED THAT (1) PROVIDE ILLUSTRATIVE EXAMPLES OF CONCEPTS AND…

  7. Long-time analytic approximation of large stochastic oscillators: Simulation, analysis and inference.

    Directory of Open Access Journals (Sweden)

    Giorgos Minas

    2017-07-01

    Full Text Available In order to analyse large complex stochastic dynamical models such as those studied in systems biology, there is currently a great need for both analytical tools and also algorithms for accurate and fast simulation and estimation. We present a new stochastic approximation of biological oscillators that addresses these needs. Our method, called phase-corrected LNA (pcLNA), overcomes the main limitations of the standard Linear Noise Approximation (LNA) to remain uniformly accurate for long times, while still maintaining the speed and analytical tractability of the LNA. As part of this, we develop analytical expressions for key probability distributions and associated quantities, such as the Fisher Information Matrix and Kullback-Leibler divergence, and we introduce a new approach to system-global sensitivity analysis. We also present algorithms for statistical inference and for long-term simulation of oscillating systems that are shown to be as accurate but much faster than leaping algorithms and algorithms for integration of diffusion equations. Stochastic versions of published models of the circadian clock and NF-κB system are used to illustrate our results.
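    Since the (pc)LNA approximates the state distribution by a multivariate Gaussian, quantities such as the Kullback-Leibler divergence have closed forms; the standard expression for two k-dimensional Gaussians is quoted below for orientation (it is not reproduced from the paper).

```latex
% KL divergence between multivariate normals N_0(\mu_0, \Sigma_0) and N_1(\mu_1, \Sigma_1)
D_{\mathrm{KL}}\big(\mathcal{N}_0 \,\|\, \mathcal{N}_1\big)
= \tfrac{1}{2}\Big[\operatorname{tr}\big(\Sigma_1^{-1}\Sigma_0\big)
+ (\mu_1-\mu_0)^{\mathsf T}\Sigma_1^{-1}(\mu_1-\mu_0)
- k
+ \ln\frac{\det\Sigma_1}{\det\Sigma_0}\Big].
```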

  8. NC CATCH: Advancing Public Health Analytics.

    Science.gov (United States)

    Studnicki, James; Fisher, John W; Eichelberger, Christopher; Bridger, Colleen; Angelon-Gaetz, Kim; Nelson, Debi

    2010-01-01

    The North Carolina Comprehensive Assessment for Tracking Community Health (NC CATCH) is a Web-based analytical system deployed to local public health units and their community partners. The system has the following characteristics: flexible, powerful online analytic processing (OLAP) interface; multiple sources of multidimensional, event-level data fully conformed to common definitions in a data warehouse structure; enabled utilization of available decision support software tools; analytic capabilities distributed and optimized locally with centralized technical infrastructure; two levels of access differentiated by the user (anonymous versus registered) and by the analytical flexibility (Community Profile versus Design Phase); and, an emphasis on user training and feedback. The ability of local public health units to engage in outcomes-based performance measurement will be influenced by continuing access to event-level data, developments in evidence-based practice for improving population health, and the application of information technology-based analytic tools and methods.

  9. ARIANNE. Analytical uncertainties. Simulation of influential factors in the final isotopic inventory of spent fuel

    International Nuclear Information System (INIS)

    Morales Prieto, M.; Ortega Saiz, P.

    2011-01-01

    Analysis of the analytical uncertainties of the methodology used to simulate the processes for obtaining the final isotopic inventory of spent fuel; the ARIANE experiment explores the burnup simulation part.

  10. Kinetics of transformations nucleated on random parallel planes: analytical modelling and computer simulation

    International Nuclear Information System (INIS)

    Rios, Paulo R; Assis, Weslley L S; Ribeiro, Tatiana C S; Villa, Elena

    2012-01-01

    In a classical paper, Cahn derived expressions for the kinetics of transformations nucleated on random planes and lines. He used those as a model for nucleation on the boundaries, edges and vertices of a polycrystal consisting of equiaxed grains. In this paper it is demonstrated that Cahn's expression for random planes may be used in situations beyond the scope envisaged in Cahn's original paper. For instance, we derived an expression for the kinetics of transformations nucleated on random parallel planes that is identical to that formerly obtained by Cahn considering random planes. Computer simulation of transformations nucleated on random parallel planes is carried out. It is shown that there is excellent agreement between simulated results and analytical solutions. Such an agreement is to be expected if both the simulation and the analytical solution are correct. (paper)
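    The impingement correction underlying both Cahn's derivation and the simulation check above is the standard KJMA (extended-volume) relation, quoted here in its generic form; the specific extended-volume integrals for nuclei confined to random parallel planes are derived in the paper itself.

```latex
% Generic KJMA relation: V_V is the real transformed volume fraction and
% V_E the "extended" fraction computed as if growing regions never impinged.
V_V(t) = 1 - \exp\!\big(-V_E(t)\big).
```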

  11. Advanced ST Plasma Scenario Simulations for NSTX

    International Nuclear Information System (INIS)

    Kessel, C.E.; Synakowski, E.J.; Gates, D.A.; Harvey, R.W.; Kaye, S.M.; Mau, T.K.; Menard, J.; Phillips, C.K.; Taylor, G.; Wilson, R.

    2004-01-01

    Integrated scenario simulations are done for NSTX [National Spherical Torus Experiment] that address four primary milestones for developing advanced ST configurations: high β and high β_N inductive discharges to study all aspects of ST physics in the high-beta regime; non-inductively sustained discharges for flattop times greater than the skin time to study the various current-drive techniques; non-inductively sustained discharges at high β for flattop times much greater than a skin time which provides the integrated advanced ST target for NSTX; and non-solenoidal start-up and plasma current ramp-up. The simulations done here use the Tokamak Simulation Code (TSC) and are based on a discharge 109070. TRANSP analysis of the discharge provided the thermal diffusivities for electrons and ions, the neutral-beam (NB) deposition profile, and other characteristics. CURRAY is used to calculate the High Harmonic Fast Wave (HHFW) heating depositions and current drive. GENRAY/CQL3D is used to establish the heating and CD [current drive] deposition profiles for electron Bernstein waves (EBW). Analysis of the ideal-MHD stability is done with JSOLVER, BALMSC, and PEST2. The simulations indicate that the integrated advanced ST plasma is reachable, obtaining stable plasmas with β ∼ 40% at β_N's of 7.7-9, I_P = 1.0 MA, and B_T = 0.35 T. The plasma is 100% non-inductive and has a flattop of 4 skin times. The resulting global energy confinement corresponds to a multiplier of H_98(y,2) = 1.5. The simulations have demonstrated the importance of HHFW heating and CD, EBW off-axis CD, strong plasma shaping, density control, and early heating/H-mode transition for producing and optimizing these plasma configurations

  12. Insights from advanced analytics at the Veterans Health Administration.

    Science.gov (United States)

    Fihn, Stephan D; Francis, Joseph; Clancy, Carolyn; Nielson, Christopher; Nelson, Karin; Rumsfeld, John; Cullen, Theresa; Bates, Jack; Graham, Gail L

    2014-07-01

    Health care has lagged behind other industries in its use of advanced analytics. The Veterans Health Administration (VHA) has three decades of experience collecting data about the veterans it serves nationwide through locally developed information systems that use a common electronic health record. In 2006 the VHA began to build its Corporate Data Warehouse, a repository for patient-level data aggregated from across the VHA's national health system. This article provides a high-level overview of the VHA's evolution toward "big data," defined as the rapid evolution of applying advanced tools and approaches to large, complex, and rapidly changing data sets. It illustrates how advanced analysis is already supporting the VHA's activities, which range from routine clinical care of individual patients--for example, monitoring medication administration and predicting risk of adverse outcomes--to evaluating a systemwide initiative to bring the principles of the patient-centered medical home to all veterans. The article also shares some of the challenges, concerns, insights, and responses that have emerged along the way, such as the need to smoothly integrate new functions into clinical workflow. While the VHA is unique in many ways, its experience may offer important insights for other health care systems nationwide as they venture into the realm of big data.

  13. Simulation and Data Analytics for Mobile Road Weather Sensors

    Science.gov (United States)

    Chettri, S. R.; Evans, J. D.; Tislin, D.

    2016-12-01

    Numerous algorithmic and theoretical considerations arise in simulating a vehicle-based weather observation network known as the Mobile Platform Environmental Data (MoPED). MoPED integrates sensor data from a fleet of commercial vehicles (about 600 at last count, with thousands more to come) as they travel interstate, state and local routes and metropolitan areas throughout the conterminous United States. The MoPED simulator models a fleet of anywhere between 1000-10,000 vehicles that travel a highway network encoded in a geospatial database, starting and finishing at random times and moving at randomly-varying speeds. Virtual instruments aboard these vehicles interpolate surface weather parameters (such as temperature and pressure) from the High-Resolution Rapid Refresh (HRRR) data series, an hourly, coast-to-coast 3km grid of weather parameters modeled by the National Centers for Environmental Prediction. Whereas real MoPED sensors have noise characteristics that lead to drop-outs, drift, or physically unrealizable values, our simulation introduces a variety of noise distributions into the parameter values inferred from HRRR (Fig. 1). Finally, the simulator collects weather readings from the National Weather Service's Automated Surface Observation System (ASOS, comprised of over 800 airports around the country) for comparison, validation, and analytical experiments. The simulator's MoPED-like weather data stream enables studies like the following: Experimenting with data analysis and calibration methods - e.g., by comparing noisy vehicle data with ASOS "ground truth" in close spatial and temporal proximity (e.g., 10km, 10 min) (Fig. 2). Inter-calibrating different vehicles' sensors when they pass near each other. Detecting spatial structure in the surface weather - such as dry lines, sudden changes in humidity that accompany severe weather - and estimating how many vehicles are needed to reliably map these structures and their motion. Detecting bottlenecks in the
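    Two of the ingredients described above — corrupting interpolated "truth" with sensor noise and drop-outs, and pairing vehicle readings with ASOS stations within 10 km and 10 minutes — can be sketched as follows; the noise parameters and coordinates are assumptions made for illustration, not MoPED's actual settings.

```python
# Illustrative sketch (assumed parameters, not the MoPED simulator itself):
# corrupt interpolated "truth" values and match vehicle observations to nearby
# ASOS ground truth in space and time.
import numpy as np

rng = np.random.default_rng(1)

def corrupt(true_temps_c, sigma=0.8, dropout_p=0.02, drift_per_obs=0.001):
    """Add Gaussian noise, a slow linear drift, and random drop-outs (NaN)."""
    n = true_temps_c.size
    noisy = true_temps_c + rng.normal(0.0, sigma, n) + drift_per_obs * np.arange(n)
    noisy[rng.random(n) < dropout_p] = np.nan
    return noisy

def match_to_asos(obs_xy_km, obs_t_min, asos_xy_km, asos_t_min,
                  max_dist_km=10.0, max_dt_min=10.0):
    """Return index pairs (vehicle_obs, asos_obs) in close space-time proximity."""
    pairs = []
    for i, (xy, t) in enumerate(zip(obs_xy_km, obs_t_min)):
        d = np.hypot(*(asos_xy_km - xy).T)
        dt = np.abs(asos_t_min - t)
        for j in np.nonzero((d <= max_dist_km) & (dt <= max_dt_min))[0]:
            pairs.append((i, int(j)))
    return pairs

print(corrupt(np.full(5, 20.0)))
print(match_to_asos(np.array([[0.0, 0.0], [5.0, 5.0]]), np.array([0.0, 12.0]),
                    np.array([[3.0, 4.0]]), np.array([5.0])))
```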

  14. Simulation training in neurosurgery: advances in education and practice

    Science.gov (United States)

    Konakondla, Sanjay; Fong, Reginald; Schirmer, Clemens M

    2017-01-01

    The current simulation technology used for neurosurgical training leaves much to be desired. Significant efforts are thoroughly exhausted in hopes of developing simulations that translate to give learners the “real-life” feel. Though a respectable goal, this may not be necessary as the application for simulation in neurosurgical training may be most useful in early learners. The ultimate uniformly agreeable endpoint of improved outcome and patient safety drives these investments. We explore the development, availability, educational taskforces, cost burdens and the simulation advancements in neurosurgical training. The technologies can be directed at achieving early resident milestones placed by the Accreditation Council for Graduate Medical Education. We discuss various aspects of neurosurgery disciplines with specific technologic advances of simulation software. An overview of the scholarly landscape of the recent publications in the realm of medical simulation and virtual reality pertaining to neurologic surgery is provided. We analyze concurrent concept overlap between PubMed headings and provide a graphical overview of the associations between these terms. PMID:28765716

  15. Advanced technology for BWR operator training simulator

    International Nuclear Information System (INIS)

    Shibuya, Akira; Fujita, Eimitsu; Nakao, Toshihiko; Nakabaru, Mitsugu; Asaoka, Kouchi.

    1991-01-01

    This paper describes an operator training simulator for BWR nuclear power plants which went into service recently. The simulator is a full scope replica type simulator which faithfully replicates the control room environment of the reference plant with six main control panels and twelve auxiliary ones. In comparison with earlier simulators, the scope of the simulation is significantly extended in both width and depth. The simulation model is also refined in order to include operator training according to symptom-based emergency procedure guidelines to mitigate the consequences in accident cases. In particular, the core model and the calculational model of the radiation intensity distribution, if radioactive materials were released, are improved. As for simulator control capabilities by which efficient and effective training can be achieved, various advanced designs are adopted, allowing easy use of the simulator. (author)

  16. Discrete event simulation methods applied to advanced importance measures of repairable components in multistate network flow systems

    International Nuclear Information System (INIS)

    Huseby, Arne B.; Natvig, Bent

    2013-01-01

    Discrete event models are frequently used in simulation studies to model and analyze pure jump processes. A discrete event model can be viewed as a system consisting of a collection of stochastic processes, where the states of the individual processes change as results of various kinds of events occurring at random points of time. We always assume that each event only affects one of the processes. Between these events the states of the processes are considered to be constant. In the present paper we use discrete event simulation in order to analyze a multistate network flow system of repairable components. In order to study how the different components contribute to the system, it is necessary to describe the often complicated interaction between component processes and processes at the system level. While analytical considerations may throw some light on this, a simulation study often allows the analyst to explore more details. By producing stable curve estimates for the development of the various processes, one gets a much better insight into how such systems develop over time. These methods are particularly useful in the study of advanced importance measures of repairable components. Such measures can be very complicated, and thus impossible to calculate analytically. By using discrete event simulations, however, this can be done in a very natural and intuitive way. In particular, significant differences between the Barlow–Proschan measure and the Natvig measure in multistate network flow systems can be explored

  17. Semi-analytical treatment of fracture/matrix flow in a dual-porosity simulator for unsaturated fractured rock masses

    International Nuclear Information System (INIS)

    Zimmerman, R.W.; Bodvarsson, G.S.

    1992-04-01

    A semi-analytical dual-porosity simulator for unsaturated flow in fractured rock masses has been developed. Fluid flow between the fracture network and the matrix blocks is described by analytical expressions that have been derived from approximate solutions to the imbibition equation. These expressions have been programmed into the unsaturated flow simulator, TOUGH, as a source/sink term. Flow processes are then simulated using only fracture elements in the computational grid. The modified code is used to simulate flow along single fractures, and infiltration into pervasively fractured formations

  18. Efficient Online Processing for Advanced Analytics

    OpenAIRE

    El Seidy, Mohamed Elsayed Mohamed Ahmed

    2017-01-01

    With the advent of emerging technologies and the Internet of Things, the importance of online data analytics has become more pronounced. Businesses and companies are adopting approaches that provide responsive analytics to stay competitive in the global marketplace. Online analytics allow data analysts to promptly react to patterns or to gain preliminary insights from early results that aid in research, decision making, and effective strategy planning. The growth of data-velocity in a variety...

  19. Advancements in simulations of lattice quantum chromodynamics

    International Nuclear Information System (INIS)

    Lippert, T.

    2008-01-01

    An introduction to lattice QCD with emphasis on advanced fermion formulations and their simulation is given. In particular, overlap fermions will be presented, a quite novel fermionic discretization scheme that is able to exactly preserve chiral symmetry on the lattice. I will discuss efficiencies of state-of-the-art algorithms on highly scalable supercomputers and I will show that, due to many algorithmic improvements, overlap simulations will soon become feasible for realistic physical lattice sizes. Finally I am going to sketch the status of some current large scale lattice QCD simulations. (author)
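    For readers unfamiliar with the formulation, "exact" lattice chiral symmetry here refers to the Ginsparg-Wilson relation, which the Neuberger overlap operator satisfies; the standard forms are quoted below (normalization conventions vary between references).

```latex
% Ginsparg-Wilson relation and the Neuberger overlap operator built from the
% Wilson Dirac operator D_W (\rho is a free parameter, 0 < \rho < 2).
\gamma_5 D + D \gamma_5 = a\, D \gamma_5 D,
\qquad
D_{\mathrm{ov}} = \frac{\rho}{a}\left(1 + \frac{A}{\sqrt{A^{\dagger}A}}\right),
\quad A = D_W - \frac{\rho}{a}.
```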

  20. Fast 2D hybrid fluid-analytical simulation of inductive/capacitive discharges

    International Nuclear Information System (INIS)

    Kawamura, E; Lieberman, M A; Graves, D B

    2011-01-01

    A fast two-dimensional (2D) hybrid fluid-analytical transformer coupled plasma reactor model was developed using the finite elements simulation tool COMSOL. Both inductive and capacitive coupling of the source coils to the plasma are included in the model, as well as a capacitive bias option for the wafer electrode. A bulk fluid plasma model, which solves the time-dependent plasma fluid equations for the ion continuity and electron energy balance, is coupled with an analytical sheath model. The vacuum sheath of variable thickness is modeled with a fixed-width sheath of variable dielectric constant. The sheath heating is treated as an incoming heat flux at the plasma-sheath boundary, and a dissipative term is added to the sheath dielectric constant. A gas flow model solves for the steady-state pressure, temperature and velocity of the neutrals. The simulation results, over a range of input powers, are in good agreement with a chlorine reactor experimental study.

  1. Exploring the Managerial Dilemmas Encountered by Advanced Analytical Equipment Providers in Developing Service-led Growth Strategies

    DEFF Research Database (Denmark)

    Raja, Jawwad; Frandsen, Thomas; Mouritsen, Jan

    2017-01-01

    This paper examines the dilemmas encountered by manufacturers of advanced analytical equipment in developing service-led growth strategies to expand their business in pursuit of more attractive revenue models. It does so by adopting a case-based research approach. The findings detail the challenges faced in providing advanced services to customers’ R & D functions, while simultaneously attempting to scale up these services for a production context. The emergent complexities of operating in multiple arenas in order to explore and exploit technologies in different contexts—along the three trajectories of serviceability, scalability and solutions—with a view to expanding markets and developing solution-based business models, are discussed. It is argued that manufacturers of analytical equipment encounter certain dilemmas, as managing the different trajectories involves different needs...

  2. Analytical Modelling and Simulation of Photovoltaic Panels and Arrays

    Directory of Open Access Journals (Sweden)

    H. Bourdoucen

    2007-12-01

    Full Text Available In this paper, an analytical model for PV panels and arrays based on extracted physical parameters of solar cells is developed. The proposed model has the advantage of simplifying mathematical modelling for different configurations of cells and panels without losing efficiency of PV system operation. The effects of external parameters, mainly temperature and solar irradiance have been considered in the modelling. Due to their critical effects on the operation of the panel, effects of series and shunt resistances were also studied. The developed analytical model has been easily implemented, simulated and validated using both Spice and Matlab packages for different series and parallel configurations of cells and panels. The results obtained with these two programs are in total agreement, which make the proposed model very useful for researchers and designers for quick and accurate sizing of PV panels and arrays.
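    The kind of analytical cell model referred to above is commonly based on the single-diode equation; the sketch below solves that implicit equation numerically, with parameter values that are illustrative placeholders rather than the ones extracted in the paper.

```python
# Hedged sketch of the widely used single-diode PV cell model (illustrative
# parameters, not the paper's extracted values):
#   I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
import numpy as np
from scipy.optimize import brentq

k, q = 1.381e-23, 1.602e-19   # Boltzmann constant, elementary charge

def cell_current(V, Iph=5.0, I0=1e-9, n=1.3, Rs=0.02, Rsh=200.0, T=298.15):
    """Solve the implicit single-diode equation for the cell current at voltage V."""
    Vt = k * T / q
    def residual(I):
        return Iph - I0 * (np.exp((V + I * Rs) / (n * Vt)) - 1.0) - (V + I * Rs) / Rsh - I
    return brentq(residual, -Iph, 2.0 * Iph)   # the root is bracketed by these bounds

# Idealized series/parallel scaling for an Ns x Np array of identical cells:
# array voltage = Ns * cell voltage, array current = Np * cell current.
volts = np.linspace(0.0, 0.65, 14)
amps = [cell_current(v) for v in volts]
print([round(i, 3) for i in amps])
```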

  3. PROBING THE ROLE OF DYNAMICAL FRICTION IN SHAPING THE BSS RADIAL DISTRIBUTION. I. SEMI-ANALYTICAL MODELS AND PRELIMINARY N-BODY SIMULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Miocchi, P.; Lanzoni, B.; Ferraro, F. R.; Dalessandro, E.; Alessandrini, E. [Dipartimento di Fisica e Astronomia, Università di Bologna, Viale Berti Pichat 6/2, I-40127 Bologna (Italy); Pasquato, M.; Lee, Y.-W. [Department of Astronomy and Center for Galaxy Evolution Research, Yonsei University, Seoul 120-749 (Korea, Republic of); Vesperini, E. [Department of Astronomy, Indiana University, Bloomington, IN 47405 (United States)

    2015-01-20

    We present semi-analytical models and simplified N-body simulations with 10^4 particles aimed at probing the role of dynamical friction (DF) in determining the radial distribution of blue straggler stars (BSSs) in globular clusters. The semi-analytical models show that DF (which is the only evolutionary mechanism at work) is responsible for the formation of a bimodal distribution with a dip progressively moving toward the external regions of the cluster. However, these models fail to reproduce the formation of the long-lived central peak observed in all dynamically evolved clusters. The results of N-body simulations confirm the formation of a sharp central peak, which remains as a stable feature over time regardless of the initial concentration of the system. In spite of noisy behavior, a bimodal distribution forms in many cases, with the size of the dip increasing as a function of time. In the most advanced stages, the distribution becomes monotonic. These results are in agreement with the observations. Also, the shape of the peak and the location of the minimum (which, in most cases, is within 10 core radii) turn out to be consistent with observational results. For a more detailed and close comparison with observations, including a proper calibration of the timescales of the dynamical processes driving the evolution of the BSS spatial distribution, more realistic simulations will be necessary.

  4. TNO-ADVANCE: a modular power train simulation and design tool

    NARCIS (Netherlands)

    Venne, J.W.C. van de; Hendriksen, P.; Smokers, R.T.M.; Verkiel, M.

    1998-01-01

    To support its activities in the field of conventional and hybrid vehicles, TNO has developed ADVANCE, a modular simulation tool for the design and evaluation of advanced power trains. In this paper the various features and the potential of ADVANCE are described and illustrated by means of two case

  5. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    In nine sections, 48 chapters cover (1) analytical chemistry and the environment, (2) environmental radiochemistry, (3) automated instrumentation, (4) advances in analytical mass spectrometry, (5) Fourier transform spectroscopy, (6) analytical chemistry of plutonium, (7) nuclear analytical chemistry, (8) chemometrics, and (9) nuclear fuel technology.

  6. INTEGRATING DATA ANALYTICS AND SIMULATION METHODS TO SUPPORT MANUFACTURING DECISION MAKING

    Science.gov (United States)

    Kibira, Deogratias; Hatim, Qais; Kumara, Soundar; Shao, Guodong

    2017-01-01

    Modern manufacturing systems are installed with smart devices such as sensors that monitor system performance and collect data to manage uncertainties in their operations. However, multiple parameters and variables affect system performance, making it impossible for a human to make informed decisions without systematic methodologies and tools. Further, the large volume and variety of streaming data collected is beyond simulation analysis alone. Simulation models are run with well-prepared data. Novel approaches, combining different methods, are needed to use this data for making guided decisions. This paper proposes a methodology whereby parameters that most affect system performance are extracted from the data using data analytics methods. These parameters are used to develop scenarios for simulation inputs; system optimizations are performed on simulation data outputs. A case study of a machine shop demonstrates the proposed methodology. This paper also reviews candidate standards for data collection, simulation, and systems interfaces. PMID:28690363
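    One plausible (assumed) way to realize the proposed flow is to rank candidate parameters by a data-driven importance score and feed the top-ranked ones into the simulation scenarios; the column names and synthetic data below are hypothetical stand-ins for the machine-shop sensor logs.

```python
# Hypothetical sketch: rank candidate parameters from (synthetic) sensor data
# with a random forest, then pass the most important ones to the simulation
# scenario design. Names, data and thresholds are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "spindle_speed":  rng.uniform(800, 1200, n),
    "feed_rate":      rng.uniform(0.1, 0.4, n),
    "queue_length":   rng.integers(0, 20, n),
    "operator_count": rng.integers(1, 5, n),
})
# Synthetic "observed" throughput so the example runs end to end.
df["throughput"] = (0.02 * df["spindle_speed"] - 30 * df["feed_rate"]
                    - 0.5 * df["queue_length"] + rng.normal(0, 1, n))

X, y = df.drop(columns="throughput"), df["throughput"]
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

ranked = sorted(zip(X.columns, model.feature_importances_),
                key=lambda p: p[1], reverse=True)
top_params = [name for name, _ in ranked[:2]]
print("Parameters to vary in the simulation scenarios:", top_params)
```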

  7. ATC-lab(Advanced): an air traffic control simulator with realism and control.

    Science.gov (United States)

    Fothergill, Selina; Loft, Shayne; Neal, Andrew

    2009-02-01

    ATC-lab(Advanced) is a new, publicly available air traffic control (ATC) simulation package that provides both realism and experimental control. ATC-lab(Advanced) simulations are realistic to the extent that the display features (including aircraft performance) and the manner in which participants interact with the system are similar to those used in an operational environment. Experimental control allows researchers to standardize air traffic scenarios, control levels of realism, and isolate specific ATC tasks. Importantly, ATC-lab(Advanced) also provides the programming control required to cost effectively adapt simulations to serve different research purposes without the need for technical support. In addition, ATC-lab(Advanced) includes a package for training participants and mathematical spreadsheets for designing air traffic events. Preliminary studies have demonstrated that ATC-lab(Advanced) is a flexible tool for applied and basic research.

  8. An on-line advanced plant simulator (OLAPS)

    International Nuclear Information System (INIS)

    Samuels, J.W.

    1989-01-01

    A PC-based on-line advanced plant simulator (OLAPS) for high quality simulations of Portland General Electric's Trojan Nuclear Facility is presented. OLAPS is designed to simulate the thermal-hydraulics of the primary system including core, steam generators, pumps, piping and pressurizer. The simulations are based on a five equation model that has two mass equations, two energy equations, and one momentum equation, with a drift flux model to provide closure. A regionwise point reactor kinetics model is used to model the neutron kinetics in the core. The conservation equations, constitutive models and the numerical methods used to solve them are described. OLAPS results are compared with data from chapter 15 of the Trojan Nuclear Facility's final safety analysis report

  9. Simulation training in neurosurgery: advances in education and practice

    Directory of Open Access Journals (Sweden)

    Konakondla S

    2017-07-01

    Full Text Available Sanjay Konakondla, Reginald Fong, Clemens M Schirmer Department of Neurosurgery and Neuroscience Institute, Geisinger Medical Center, Geisinger Health System, Danville, PA, USA Abstract: The current simulation technology used for neurosurgical training leaves much to be desired. Significant efforts are thoroughly exhausted in hopes of developing simulations that translate to give learners the “real-life” feel. Though a respectable goal, this may not be necessary as the application for simulation in neurosurgical training may be most useful in early learners. The ultimate uniformly agreeable endpoint of improved outcome and patient safety drives these investments. We explore the development, availability, educational taskforces, cost burdens and the simulation advancements in neurosurgical training. The technologies can be directed at achieving early resident milestones placed by the Accreditation Council for Graduate Medical Education. We discuss various aspects of neurosurgery disciplines with specific technologic advances of simulation software. An overview of the scholarly landscape of the recent publications in the realm of medical simulation and virtual reality pertaining to neurologic surgery is provided. We analyze concurrent concept overlap between PubMed headings and provide a graphical overview of the associations between these terms. Keywords: residency education, simulation, neurosurgery training, virtual reality, haptic feedback, task analysis, ACGME 

  10. Comparison between laboratory measurements, simulations, and analytical predictions of the transverse wall impedance at low frequencies

    CERN Document Server

    Roncarolo, F; Kroyer, T; Metral, E; Mounet, N; Salvant, B; Zotter, B

    2009-01-01

    The prediction of the transverse wall beam impedance at the first unstable betatron line (8 kHz) of the CERN Large Hadron Collider (LHC) is of paramount importance for understanding and controlling the related coupled-bunch instabilities. Until now only novel analytical formulas were available at this frequency. Recently, laboratory measurements and numerical simulations were performed to cross-check the analytical predictions. The experimental results based on the measurement of the variation of a probe coil inductance in the presence of (i) sample graphite plates, (ii) stand-alone LHC collimator jaws, and (iii) a full LHC collimator assembly are presented in detail. The measurement results are compared to both analytical theories and simulations. In addition, the consequences for the understanding of the LHC impedance are discussed.

  11. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    International Nuclear Information System (INIS)

    Lee, C; Lin, H; Chao, T; Hsiao, I; Chuang, K

    2015-01-01

    Purpose: The prediction of PET images on the basis of an analytical filtering approach for proton range verification has been successfully developed and validated using FLUKA Monte Carlo (MC) codes and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification against GATE/GEANT4 Monte Carlo simulation codes. Methods: In this study, we performed two experiments to validate the β+-isotope yields predicted by the analytical model against GATE/GEANT4 simulations. The first experiment evaluates the accuracy of predicting β+-yields as a function of irradiated proton energies. In the second experiment, we simulate homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam. The filtered β+-yield distributions from the analytical model are compared with the MC-simulated β+-yields in the proximal and distal fall-off ranges. Results: The results compare the filtered and MC-simulated β+-yield distributions under different conditions. First, we found that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range differences between the filtered and MC-simulated β+-yields in the distal fall-off region are within 1.5 mm for all materials used. The findings validate the usefulness of the analytical filtering model for range verification of proton therapy with GATE Monte Carlo simulations. In addition, there is a larger discrepancy between the filtered prediction and the MC-simulated β+-yields using the GATE code, especially in the proximal region. This discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the fact that large discrepancies between the MC-simulated and predicted β+-yield distributions were observed, the study proves the effectiveness of the analytical filtering model for proton range verification using

  12. Non-dissipative kinetic simulation and analytical solution of three-mode equations of ion temperature gradient instability

    International Nuclear Information System (INIS)

    Watanabe, T.-H.; Sugama, H.; Sato, T.

    1999-12-01

    A non-dissipative drift kinetic simulation scheme, which rigorously satisfies the time-reversibility, is applied to the three-mode coupling problem of the ion temperature gradient (ITG) instability. It is found from the simulation that the three-mode ITG system repeats growth and decay with a period which shows a logarithmic divergence for infinitesimal initial perturbations. Accordingly, time average of the mode amplitude vanishes, as the initial amplitude approaches to zero. An exact solution is analytically given for a class of initial conditions. An excellent agreement is confirmed between the analytical solution and numerical results. The results obtained here provide a useful reference for basic benchmarking of theories and simulation of the ITG modes. (author)

  13. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Sidky, Hythem [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Colón, Yamil J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Helfferich, Julian [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Steinbuch Center for Computing, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen, Germany; Sikora, Benjamin J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Bezik, Cody [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Chu, Weiwei [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Giberti, Federico [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Guo, Ashley Z. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Jiang, Xikai [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Lequieu, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Li, Jiyuan [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Moller, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Quevillon, Michael J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Rahimi, Mohammad [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Ramezani-Dakhel, Hadi [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Department of Biochemistry and Molecular Biology, University of Chicago, Chicago, Illinois 60637, USA; Rathee, Vikramjit S. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Reid, Daniel R. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Sevgen, Emre [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Thapar, Vikram [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Webb, Michael A. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Whitmer, Jonathan K. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; de Pablo, Juan J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA

    2018-01-28

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods, and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite.
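    For readers unfamiliar with enhanced sampling, the general idea behind one family of methods exposed by such suites (history-dependent biasing of a collective variable, as in metadynamics) can be conveyed with a toy one-dimensional example. The sketch below is not the SSAGES API; the potential, hill parameters, and deposition schedule are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(4)

def physical_force(x):
    # Double-well potential U(x) = (x^2 - 1)^2, so F(x) = -dU/dx
    return -4.0 * x * (x ** 2 - 1.0)

def hill_force(x, centers, height=0.05, width=0.2):
    # Force contributed by the accumulated Gaussian bias hills, i.e. -dV_bias/dx
    if not centers:
        return 0.0
    c = np.asarray(centers)
    return float(np.sum(height * (x - c) / width ** 2
                        * np.exp(-((x - c) ** 2) / (2.0 * width ** 2))))

x, dt, kT = -1.0, 1e-3, 0.1
centers = []
for step in range(50_000):
    total_force = physical_force(x) + hill_force(x, centers)
    # Overdamped Langevin step (unit friction)
    x += total_force * dt + np.sqrt(2.0 * kT * dt) * rng.normal()
    if step % 250 == 0:
        centers.append(x)      # deposit a new bias hill at the current CV value

print("range of collective variable visited:",
      round(min(centers), 2), "to", round(max(centers), 2))
```

Without the accumulated bias, a barrier of height 1 at kT = 0.1 would essentially never be crossed in this many steps; with it, both wells are visited, which is the point of enhanced sampling.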

  14. Fast 2D fluid-analytical simulation of ion energy distributions and electromagnetic effects in multi-frequency capacitive discharges

    Science.gov (United States)

    Kawamura, E.; Lieberman, M. A.; Graves, D. B.

    2014-12-01

    A fast 2D axisymmetric fluid-analytical plasma reactor model using the finite elements simulation tool COMSOL is interfaced with a 1D particle-in-cell (PIC) code to study ion energy distributions (IEDs) in multi-frequency capacitive argon discharges. A bulk fluid plasma model, which solves the time-dependent plasma fluid equations for the ion continuity and electron energy balance, is coupled with an analytical sheath model, which solves for the sheath parameters. The time-independent Helmholtz equation is used to solve for the fields and a gas flow model solves for the steady-state pressure, temperature and velocity of the neutrals. The results of the fluid-analytical model are used as inputs to a PIC simulation of the sheath region of the discharge to obtain the IEDs at the target electrode. Each 2D fluid-analytical-PIC simulation on a moderate 2.2 GHz CPU workstation with 8 GB of memory took about 15-20 min. The multi-frequency 2D fluid-analytical model was compared to 1D PIC simulations of a symmetric parallel-plate discharge, showing good agreement. We also conducted fluid-analytical simulations of a multi-frequency argon capacitively coupled plasma (CCP) with a typical asymmetric reactor geometry at 2/60/162 MHz. The low frequency 2 MHz power controlled the sheath width and sheath voltage while the high frequencies controlled the plasma production. A standing wave was observable at the highest frequency of 162 MHz. We noticed that adding 2 MHz power to a 60 MHz discharge or 162 MHz to a dual frequency 2 MHz/60 MHz discharge can enhance the plasma uniformity. We found that multiple frequencies were not only useful for controlling IEDs but also plasma uniformity in CCP reactors.

  15. Backward wave oscillators with rippled wall resonators: Analytic theory and numerical simulation

    International Nuclear Information System (INIS)

    Swegle, J.A.; Poukey, J.W.

    1985-01-01

    The 3-D analytic theory is based on the approximation that the device is infinitely long. In the absence of an electron beam, the theory is exact and allows us to compute the dispersion characteristics of the cold structure. With the inclusion of a thin electron beam, we can compute the growth rates resulting from the interaction between a waveguide mode of the structure and the slower space charge wave on the beam. In the limit of low beam currents, the full dispersion relation based on an electromagnetic analysis can be placed in correspondence with the circuit theory of Pierce. Numerical simulations permit us to explore the saturated, large-amplitude operating regime for axisymmetric TM modes. The scaling of operating frequency, peak power, and operating efficiency with beam and resonator parameters is examined. The analytic theory indicates that growth rates are largest for the TM01 modes and decrease with both the radial and azimuthal mode numbers. Another interesting trend is that for a fixed cathode voltage and slow wave structure, growth rates peak for a beam current below the space charge limiting value and decrease for both larger and smaller currents. The simulations show waves that grow from noise without any input signal, so that the system functions as an oscillator. The TM01 mode predominates in all simulations. While a minimum device length is required for the start of oscillations, it appears that if the slow wave structure is too long, output power is decreased by a transfer of wave energy back to the electrons. Comparisons have been made between the analytical and numerical results, as well as with experimental data obtained at Sandia National Laboratories

  16. Analytical methods for heat transfer and fluid flow problems

    CERN Document Server

    Weigand, Bernhard

    2015-01-01

    This book describes useful analytical methods by applying them to real-world problems rather than to the usual over-simplified classroom problems. The book demonstrates the applicability of analytical methods even for complex problems and guides the reader to a more intuitive understanding of approaches and solutions. Although the solution of partial differential equations by numerical methods is standard practice in industry, analytical methods are still important for the critical assessment of results derived from advanced computer simulations and for the improvement of the underlying numerical techniques. Literature devoted to analytical methods, however, often focuses on theoretical and mathematical aspects and is therefore useless to most engineers. Analytical Methods for Heat Transfer and Fluid Flow Problems addresses engineers and engineering students. The second edition has been updated; the chapters on non-linear problems and on axial heat conduction problems were extended, and worked out exam...

  17. Numerical and analytical simulation of the production process of ZrO2 hollow particles

    Science.gov (United States)

    Safaei, Hadi; Emami, Mohsen Davazdah

    2017-12-01

    In this paper, the production process of hollow particles from agglomerated particles is addressed analytically and numerically. The important parameters affecting this process, in particular the initial porosity level of the particles and the plasma gun type, are investigated. The analytical model adopts a combination of quasi-steady thermal equilibrium and mechanical balance, and examines the possibility of a solid core persisting in the agglomerated particles. In this model, a range of particle diameters (50 μm ≤ D_{p0} ≤ 160 μm) and various initial porosities (0.2 ≤ p ≤ 0.7) are considered. The numerical model employs the VOF technique for two-phase compressible flows. The production of hollow particles from agglomerated particles is simulated for an initial diameter of D_{p0} = 60 μm and initial porosities of p = 0.3, p = 0.5, and p = 0.7. Results of the analytical model indicate that the solid core diameter is independent of the initial porosity, whereas the thickness of the particle shell strongly depends on it. In both models, a hollow particle hardly develops at small initial porosity values, whereas the particle disintegrates at high initial porosity values (p > 0.6).

  18. Molecular simulations and visualization: introduction and overview.

    Science.gov (United States)

    Hirst, Jonathan D; Glowacki, David R; Baaden, Marc

    2014-01-01

    Here we provide an introduction and overview of current progress in the field of molecular simulation and visualization, touching on the following topics: (1) virtual and augmented reality for immersive molecular simulations; (2) advanced visualization and visual analytic techniques; (3) new developments in high performance computing; and (4) applications and model building.

  19. Applying advanced analytics to guide emergency department operational decisions: A proof-of-concept study examining the effects of boarding.

    Science.gov (United States)

    Andrew Taylor, R; Venkatesh, Arjun; Parwani, Vivek; Chekijian, Sharon; Shapiro, Marc; Oh, Andrew; Harriman, David; Tarabar, Asim; Ulrich, Andrew

    2018-01-04

    Emergency Department (ED) leaders are increasingly confronted with large amounts of data with the potential to inform and guide operational decisions. Routine use of advanced analytic methods may provide additional insights. The objective was to examine the practical application of available advanced analytic methods to guide operational decision making around patient boarding. We performed a retrospective analysis of the effect of boarding on ED operational metrics from a single site between 1/2015 and 1/2017. Time series were visualized through decompositional techniques accounting for seasonal trends, to determine the effect of boarding on ED performance metrics and to determine the impact of boarding "shocks" to the system on operational metrics over several days. There were 226,461 visits, and the mean (IQR) number of visits per day was 273 (258-291). Decomposition of the boarding count time series illustrated an upward trend in the last 2-3 quarters as well as clear seasonal components. All performance metrics were significantly impacted by boarding. In this proof-of-concept study regarding the use of advanced analytics in daily ED operations, time series analysis provided multiple useful insights into boarding and its impact on performance metrics. Copyright © 2018. Published by Elsevier Inc.
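    The decomposition step described in the abstract can be sketched with standard time-series tooling. The example below uses synthetic daily boarding counts and a weekly period, both of which are assumptions for illustration rather than the study's data:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(0)
dates = pd.date_range("2015-01-01", "2017-01-01", freq="D")
# Synthetic boarding counts: slow upward trend + weekly seasonality + noise
boarding = pd.Series(
    40 + 0.01 * np.arange(len(dates))
    + 5 * np.sin(2 * np.pi * np.arange(len(dates)) / 7)
    + rng.normal(0, 3, len(dates)),
    index=dates,
)

# Additive decomposition with a weekly period, separating trend and seasonality
result = seasonal_decompose(boarding, model="additive", period=7)
print(result.trend.dropna().tail())
print(result.seasonal.head(7))
```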

  20. Advanced analytical techniques

    International Nuclear Information System (INIS)

    Mrochek, J.E.; Shumate, S.E.; Genung, R.K.; Bahner, C.T.; Lee, N.E.; Dinsmore, S.R.

    1976-01-01

    The development of several new analytical techniques for use in clinical diagnosis and biomedical research is reported. These include: high-resolution liquid chromatographic systems for the early detection of pathological molecular constituents in physiologic body fluids; gradient elution chromatography for the analysis of protein-bound carbohydrates in blood serum samples, with emphasis on changes in sera from breast cancer patients; electrophoretic separation techniques coupled with staining of specific proteins in cellular isoenzymes for the monitoring of genetic mutations and abnormal molecular constituents in blood samples; and the development of a centrifugal elution chromatographic technique for the assay of specific proteins and immunoglobulins in human blood serum samples

  1. Advance in research on aerosol deposition simulation methods

    International Nuclear Information System (INIS)

    Liu Keyang; Li Jingsong

    2011-01-01

    A comprehensive analysis of the health effects of inhaled toxic aerosols requires exact data on airway deposition. Knowledge of the effect of inhaled drugs is essential to the optimization of aerosol drug delivery. Sophisticated analytical deposition models can be used for the computation of total, regional and generation-specific deposition efficiencies. Continuously increasing computing power allows us to study particle transport and deposition in more and more realistic airway geometries with the help of computational fluid dynamics (CFD) simulation methods. In this article, trends in aerosol deposition models and lung models, as well as the methods used to carry out deposition simulations, are reviewed. (authors)

  2. Distinct neural substrates of visuospatial and verbal-analytic reasoning as assessed by Raven's Advanced Progressive Matrices

    NARCIS (Netherlands)

    Chen, Zhencai; De Beuckelaer, A.; Wang, Xu; Liu, Jia

    2017-01-01

    Recent studies revealed spontaneous neural activity to be associated with fluid intelligence (gF) which is commonly assessed by Raven’s Advanced Progressive Matrices, and embeds two types of reasoning: visuospatial and verbal-analytic reasoning. With resting-state fMRI data, using global brain

  3. Recent Advances in Analytical Pyrolysis to Investigate Organic Materials in Heritage Science.

    Science.gov (United States)

    Degano, Ilaria; Modugno, Francesca; Bonaduce, Ilaria; Ribechini, Erika; Colombini, Maria Perla

    2018-06-18

    The molecular characterization of organic materials in samples from artworks and historical objects traditionally entailed qualitative and quantitative analyses by HPLC and GC. Today innovative approaches based on analytical pyrolysis enable samples to be analysed without any chemical pre-treatment. Pyrolysis, which is often considered as a screening technique, shows previously unexplored potential thanks to recent instrumental developments. Organic materials that are macromolecular in nature, or undergo polymerization upon curing and ageing can now be better investigated. Most constituents of paint layers and archaeological organic substances contain major insoluble and chemically non-hydrolysable fractions that are inaccessible to GC or HPLC. To date, molecular scientific investigations of the organic constituents of artworks and historical objects have mostly focused on the minor constituents of the sample. This review presents recent advances in the qualitative and semi-quantitative analyses of organic materials in heritage objects based on analytical pyrolysis coupled with mass spectrometry. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. The Consortium for Advanced Simulation of Light Water Reactors

    International Nuclear Information System (INIS)

    Szilard, Ronaldo; Zhang, Hongbin; Kothe, Douglas; Turinsky, Paul

    2011-01-01

    The Consortium for Advanced Simulation of Light Water Reactors (CASL) is a DOE Energy Innovation Hub for modeling and simulation of nuclear reactors. It brings together an exceptionally capable team from national labs, industry and academia that will apply existing modeling and simulation capabilities and develop advanced capabilities to create a usable environment for predictive simulation of light water reactors (LWRs). This environment, designated as the Virtual Environment for Reactor Applications (VERA), will incorporate science-based models, state-of-the-art numerical methods, modern computational science and engineering practices, and uncertainty quantification (UQ) and validation against data from operating pressurized water reactors (PWRs). It will couple state-of-the-art fuel performance, neutronics, thermal-hydraulics (T-H), and structural models with existing tools for systems and safety analysis and will be designed for implementation on both today's leadership-class computers and the advanced architecture platforms now under development by the DOE. CASL focuses on a set of challenge problems such as CRUD induced power shift and localized corrosion, grid-to-rod fretting fuel failures, pellet clad interaction, fuel assembly distortion, etc. that encompass the key phenomena limiting the performance of PWRs. It is expected that much of the capability developed will be applicable to other types of reactors. CASL's mission is to develop and apply modeling and simulation capabilities to address three critical areas of performance for nuclear power plants: (1) reduce capital and operating costs per unit energy by enabling power uprates and plant lifetime extension, (2) reduce nuclear waste volume generated by enabling higher fuel burnup, and (3) enhance nuclear safety by enabling high-fidelity predictive capability for component performance.

  5. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Science.gov (United States)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.

    2018-01-01

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.

  6. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  7. Comparison of NMR simulations of porous media derived from analytical and voxelized representations.

    Science.gov (United States)

    Jin, Guodong; Torres-Verdín, Carlos; Toumelin, Emmanuel

    2009-10-01

    We develop and compare two formulations of the random-walk method, grain-based and voxel-based, to simulate the nuclear-magnetic-resonance (NMR) response of fluids contained in various models of porous media. The grain-based approach uses a spherical grain pack as input, where the solid surface is analytically defined without approximation. In the voxel-based approach, the input is a computer-tomography or computer-generated image of reconstructed porous media. Implementation of the two approaches is largely the same, except for the representation of the porous media. For comparison, both approaches are applied to various analytical and digitized models of porous media: an isolated spherical pore, simple cubic packing of spheres, and random packings of monodisperse and polydisperse spheres. We find that spin magnetization decays much faster in the digitized models than in their analytical counterparts. The difference in decay rate relates to the overestimation of surface area due to the discretization of the sample; it cannot be eliminated even if the voxel size decreases. However, once the effect of the surface-area increase is accounted for in the simulation of surface relaxation, good quantitative agreement is found between the two approaches. Different grain or pore shapes entail different rates of increase of surface area, whereupon we emphasize that the value of the "surface-area-corrected" coefficient may not be universal. Using an example of an X-ray-CT image of a Fontainebleau rock sample, we show that voxel size has a significant effect on the calculated surface area and, therefore, on the numerically simulated magnetization response.
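    A minimal sketch of the surface-relaxation random walk underlying both formulations is given below for a single, analytically defined spherical pore; the killing probability and step length are hypothetical stand-ins for the physical relaxivity and diffusion parameters, not the authors' values:

```python
import numpy as np

rng = np.random.default_rng(1)
n_walkers, n_steps = 20_000, 1_000
radius, step_len = 1.0, 0.02      # pore radius and walk step (arbitrary units)
delta = 0.05                      # probability of relaxing at a wall collision

pos = np.zeros((n_walkers, 3))
alive = np.ones(n_walkers, dtype=bool)
magnetization = []

for _ in range(n_steps):
    # Isotropic random step of fixed length for every walker
    step_vec = rng.normal(size=(n_walkers, 3))
    step_vec *= step_len / np.linalg.norm(step_vec, axis=1, keepdims=True)
    trial = pos + step_vec
    hit = np.linalg.norm(trial, axis=1) > radius
    # Walkers that hit the wall relax with probability delta, otherwise stay put
    relaxed = hit & alive & (rng.random(n_walkers) < delta)
    alive &= ~relaxed
    pos = np.where(hit[:, None], pos, trial)
    magnetization.append(alive.mean())

print("normalized magnetization after the last step:", round(magnetization[-1], 3))
```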

  8. Advancing Material Models for Automotive Forming Simulations

    International Nuclear Information System (INIS)

    Vegter, H.; An, Y.; Horn, C.H.L.J. ten; Atzema, E.H.; Roelofsen, M.E.

    2005-01-01

    Simulations in the automotive industry need more advanced material models to achieve highly reliable forming and springback predictions. Conventional material models implemented in FEM simulation codes are not capable of describing the plastic material behaviour during monotonic strain paths with sufficient accuracy. Recently, ESI and Corus have co-operated on the implementation of an advanced material model in the FEM code PAMSTAMP 2G. This applies to the strain hardening model, the influence of strain rate, and the description of the yield locus in these models. A subsequent challenge is the description of the material after a change of strain path. The use of advanced high strength steels in the automotive industry requires a description of the plastic material behaviour of multiphase steels. The simplest variant is dual phase steel, consisting of a ferritic and a martensitic phase. Multiphase materials also contain a bainitic phase in addition to the ferritic and martensitic phases. More physical descriptions of strain hardening than simple fitted Ludwik/Nadai curves are necessary. Existing methods to predict the plastic behaviour of single-phase materials use a simple dislocation interaction model based only on the formed cell structures. The new method proposed at Corus to predict the plastic behaviour of multiphase materials has to take into account hard phases, which deform less easily. The resulting deformation gradients create geometrically necessary dislocations. Additional micro-structural information, such as the morphology and size of hard-phase particles or grains, is necessary to derive the strain hardening models for this type of material. Measurements available from the Numisheet benchmarks allow these models to be validated. At Corus, additional measured values are available from cross-die tests. This laboratory test can attain critical deformations by large variations in blank size and processing conditions. The tests are a powerful tool in optimising forming simulations prior

  9. An analytical model for radioactive pollutant release simulation in the atmospheric boundary layer

    International Nuclear Information System (INIS)

    Weymar, Guilherme J.; Vilhena, Marco T.; Bodmann, Bardo E.J.; Buske, Daniela; Quadros, Regis

    2013-01-01

    Simulations of emission of radioactive substances into the atmosphere from the Brazilian nuclear power plant Angra 1 are a necessary tool for the control and elaboration of emergency plans as a preventive action for possible accidents. In the present work we present an analytical solution for radioactive pollutant dispersion in the atmosphere, solving the time-dependent three-dimensional advection-diffusion equation. The experiment used here as a reference in the simulations consisted of controlled releases of radioactive tritiated water vapor from the meteorological tower close to the power plant at Itaorna Beach. The wind profile was determined using experimental meteorological data and the micrometeorological parameters were calculated from empirical equations obtained from the literature. We report on a novel analytical formulation for the concentration of the products of a radioactive chain released in the atmospheric boundary layer and solve the set of coupled equations for each chain radionuclide by the GILTT solution, taking the decay of all progenitor radionuclides as the source term for each equation. Further, we report on numerical simulations as an explicit but fictitious example, considering three radionuclides in the radioactive chain of Uranium-235. (author)

  10. An analytical model for radioactive pollutant release simulation in the atmospheric boundary layer

    Energy Technology Data Exchange (ETDEWEB)

    Weymar, Guilherme J.; Vilhena, Marco T.; Bodmann, Bardo E.J., E-mail: guicefetrs@gmail.com, E-mail: mtmbvilhena@gmail.com, E-mail: bejbodmann@gmail.com [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica; Buske, Daniela; Quadros, Regis, E-mail: danielabuske@gmail.com, E-mail: quadros99@gmail.com [Universidade Federal de Pelotas (UFPel), Capao do Leao, RS (Brazil). Programa de Pos-Graduacao em Modelagem Matematica

    2013-07-01

    Simulations of emission of radioactive substances into the atmosphere from the Brazilian nuclear power plant Angra 1 are a necessary tool for the control and elaboration of emergency plans as a preventive action for possible accidents. In the present work we present an analytical solution for radioactive pollutant dispersion in the atmosphere, solving the time-dependent three-dimensional advection-diffusion equation. The experiment used here as a reference in the simulations consisted of controlled releases of radioactive tritiated water vapor from the meteorological tower close to the power plant at Itaorna Beach. The wind profile was determined using experimental meteorological data and the micrometeorological parameters were calculated from empirical equations obtained from the literature. We report on a novel analytical formulation for the concentration of the products of a radioactive chain released in the atmospheric boundary layer and solve the set of coupled equations for each chain radionuclide by the GILTT solution, taking the decay of all progenitor radionuclides as the source term for each equation. Further, we report on numerical simulations as an explicit but fictitious example, considering three radionuclides in the radioactive chain of Uranium-235. (author)
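    For reference, the kind of governing equation solved here for each member of the decay chain can be written in the standard form below (generic notation, not necessarily the authors' exact symbols), where the progenitor's decay supplies the source term:

```latex
% Time-dependent advection-diffusion equation for the i-th radionuclide of the
% chain, with the progenitor's decay acting as the source term:
\frac{\partial c_i}{\partial t} + u \frac{\partial c_i}{\partial x}
  = \frac{\partial}{\partial y}\!\left( K_y \frac{\partial c_i}{\partial y} \right)
  + \frac{\partial}{\partial z}\!\left( K_z \frac{\partial c_i}{\partial z} \right)
  - \lambda_i c_i + \lambda_{i-1} c_{i-1}.
```

Here c_i(x, y, z, t) is the concentration of the i-th chain member, u the mean wind speed along x, K_y and K_z the eddy diffusivities, and λ_i the decay constant of nuclide i.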

  11. Advances in functional brain imaging technology and developmental neuro-psychology: their applications in the Jungian analytic domain.

    Science.gov (United States)

    Petchkovsky, Leon

    2017-06-01

    Analytical psychology shares with many other psychotherapies the important task of repairing the consequences of developmental trauma. The majority of analytic patients come from compromised early developmental backgrounds: they may have experienced neglect, abuse, or failures of empathic resonance from their carers. Functional brain imagery techniques including Quantitative Electroencephalogram (QEEG), and functional Magnetic Resonance Imagery (fMRI), allow us to track mental processes in ways beyond verbal reportage and introspection. This independent perspective is useful for developing new psychodynamic hypotheses, testing current ones, providing diagnostic markers, and monitoring treatment progress. Jung, with the Word Association Test, grasped these principles 100 years ago. Brain imaging techniques have contributed to powerful recent advances in our understanding of neurodevelopmental processes in the first three years of life. If adequate nurturance is compromised, a range of difficulties may emerge. This has important implications for how we understand and treat our psychotherapy clients. The paper provides an overview of functional brain imaging and advances in developmental neuropsychology, and looks at applications of some of these findings (including neurofeedback) in the Jungian psychotherapy domain. © 2017, The Society of Analytical Psychology.

  12. Advanced on-site conceptual simulator for Forsmark 3

    International Nuclear Information System (INIS)

    Johansson, G.; Sjoestrand, K.

    1984-01-01

    On-site conceptual simulators have been extensively used at Swedish nuclear power plants. Despite having access to identical replica simulators, both the Swedish State Power Board and the Swedish private power industry have ordered conceptual simulators during 1982. The motivation has been that a complete training programme requires access to both a replica and a conceptual simulator. The replica simulator is perfect for training in control room behaviour but less appropriate for ensuring deeper process understanding. On the other hand, the conceptual simulator is not well suited for getting the personnel acquainted with the control room but is perfect for extending their knowledge of the plant processes. In order to give a realistic description of these processes, the conceptual simulator model must be fairly advanced. The Forsmark 3 conceptual simulator simulates the entire primary system, including the details of the steam and feedwater systems. Considerable attention has also been devoted to the presentation of calculated variables. For example, all the variables in the data base (approx. 6600) can be presented on colour-graphic CRTs as functions of time. (author)

  13. Advanced, Analytic, Automated (AAA) Measurement of Engagement During Learning.

    Science.gov (United States)

    D'Mello, Sidney; Dieterle, Ed; Duckworth, Angela

    2017-01-01

    It is generally acknowledged that engagement plays a critical role in learning. Unfortunately, the study of engagement has been stymied by a lack of valid and efficient measures. We introduce the advanced, analytic, and automated (AAA) approach to measure engagement at fine-grained temporal resolutions. The AAA measurement approach is grounded in embodied theories of cognition and affect, which advocate a close coupling between thought and action. It uses machine-learned computational models to automatically infer mental states associated with engagement (e.g., interest, flow) from machine-readable behavioral and physiological signals (e.g., facial expressions, eye tracking, click-stream data) and from aspects of the environmental context. We present 15 case studies that illustrate the potential of the AAA approach for measuring engagement in digital learning environments. We discuss strengths and weaknesses of the AAA approach, concluding that it has significant promise to catalyze engagement research.

  14. Time parallelization of advanced operation scenario simulations of ITER plasma

    International Nuclear Information System (INIS)

    Samaddar, D; Casper, T A; Kim, S H; Houlberg, W A; Berry, L A; Elwasif, W R; Batchelor, D

    2013-01-01

    This work demonstrates that simulations of advanced burning plasma operation scenarios can be successfully parallelized in time using the parareal algorithm. CORSICA, an advanced operation scenario code for tokamak plasmas, is used as a test case. This is a unique application, since the parareal algorithm has so far been applied to much simpler systems, with the exception of turbulence. In the present application, a computational gain of an order of magnitude has been achieved, which is extremely promising. A successful implementation of the parareal algorithm in codes like CORSICA ushers in the possibility of time-efficient simulations of ITER plasmas.
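    The structure of the parareal iteration itself is compact and can be sketched independently of CORSICA. The toy problem below (dy/dt = -y with Euler propagators) illustrates the algorithm only, not the tokamak application:

```python
import numpy as np

def propagate(y0, t0, t1, n_steps):
    """Explicit-Euler propagator for dy/dt = -y using n_steps substeps."""
    y, dt = y0, (t1 - t0) / n_steps
    for _ in range(n_steps):
        y = y + dt * (-y)
    return y

coarse = lambda y0, t0, t1: propagate(y0, t0, t1, 1)     # cheap, inaccurate
fine   = lambda y0, t0, t1: propagate(y0, t0, t1, 100)   # expensive, accurate

T, n_slices, y_init = 2.0, 10, 1.0
t = np.linspace(0.0, T, n_slices + 1)

# Initial guess from a serial coarse sweep
U = [y_init]
for n in range(n_slices):
    U.append(coarse(U[n], t[n], t[n + 1]))

for k in range(5):  # parareal iterations
    # The fine propagations over each slice are independent and could run in parallel
    F = [fine(U[n], t[n], t[n + 1]) for n in range(n_slices)]
    U_new = [y_init]
    for n in range(n_slices):
        # Parareal correction: new coarse + (fine - old coarse)
        U_new.append(coarse(U_new[n], t[n], t[n + 1])
                     + F[n] - coarse(U[n], t[n], t[n + 1]))
    U = U_new

print("parareal endpoint:", U[-1], " exact:", np.exp(-T))
```

In a production setting the fine propagations inside each iteration are distributed across processors, which is where the reported order-of-magnitude gain comes from.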

  15. Modelling of ballistic low energy ion solid interaction - conventional analytic theories versus computer simulations

    International Nuclear Information System (INIS)

    Littmark, U.

    1994-01-01

    The ''philosophy'' behind, and the ''psychology'' of the development from analytic theory to computer simulations in the field of atomic collisions in solids is discussed and a few examples of achievements and perspectives are given. (orig.)

  16. Advanced Helmet Mounted Display (AHMD) for simulator applications

    Science.gov (United States)

    Sisodia, Ashok; Riser, Andrew; Bayer, Michael; McGuire, James P.

    2006-05-01

    The Advanced Helmet Mounted Display (AHMD), augmented reality visual system first presented at last year's Cockpit and Future Displays for Defense and Security conference, has now been evaluated in a number of military simulator applications and by L-3 Link Simulation and Training. This paper presents the preliminary results of these evaluations and describes current and future simulator and training applications for HMD technology. The AHMD blends computer-generated data (symbology, synthetic imagery, enhanced imagery) with the actual and simulated visible environment. The AHMD is designed specifically for highly mobile deployable, minimum resource demanding reconfigurable virtual training systems to satisfy the military's in-theater warrior readiness objective. A description of the innovative AHMD system and future enhancements will be discussed.

  17. Plastic deformation of crystals: analytical and computer simulation studies of dislocation glide

    International Nuclear Information System (INIS)

    Altintas, S.

    1978-05-01

    The plastic deformation of crystals is usually accomplished through the motion of dislocations. The glide of a dislocation is impelled by the applied stress and opposed by microstructural defects such as point defects, voids, precipitates and other dislocations. The planar glide of a dislocation through randomly distributed obstacles is considered. The objective of the present research work is to calculate the critical resolved shear stress (CRSS) for athermal glide and the velocity of the dislocation at finite temperature as a function of the applied stress and the nature and strength of the obstacles. Dislocation glide through mixtures of obstacles has been studied analytically and by computer simulation. Arrays containing two kinds of obstacles as well as a square distribution of obstacle strengths are considered. The critical resolved shear stress for an array containing obstacles with a given distribution of strengths is calculated using the sum of the quadratic mean of the stresses for the individual obstacles and is found to be in good agreement with the computer simulation data. Computer simulations of dislocation glide through randomly distributed obstacles containing up to 10^6 obstacles show that the CRSS decreases as the size of the array increases and approaches a limiting value. Histograms of forces and of segment lengths are obtained and compared with theoretical predictions. Effects of array shape and boundary conditions on the dislocation glide are also studied. Analytical and computer simulation results are compared with experimental results obtained on precipitation-, irradiation-, forest-, and impurity cluster-hardening systems and are found to be in good agreement

  18. Plastic deformation of crystals: analytical and computer simulation studies of dislocation glide

    Energy Technology Data Exchange (ETDEWEB)

    Altintas, S.

    1978-05-01

    The plastic deformation of crystals is usually accomplished through the motion of dislocations. The glide of a dislocation is impelled by the applied stress and opposed by microstructural defects such as point defects, voids, precipitates and other dislocations. The planar glide of a dislocation through randomly distributed obstacles is considered. The objective of the present research work is to calculate the critical resolved shear stress (CRSS) for athermal glide and the velocity of the dislocation at finite temperature as a function of the applied stress and the nature and strength of the obstacles. Dislocation glide through mixtures of obstacles has been studied analytically and by computer simulation. Arrays containing two kinds of obstacles as well as a square distribution of obstacle strengths are considered. The critical resolved shear stress for an array containing obstacles with a given distribution of strengths is calculated using the sum of the quadratic mean of the stresses for the individual obstacles and is found to be in good agreement with the computer simulation data. Computer simulations of dislocation glide through randomly distributed obstacles containing up to 10^6 obstacles show that the CRSS decreases as the size of the array increases and approaches a limiting value. Histograms of forces and of segment lengths are obtained and compared with theoretical predictions. Effects of array shape and boundary conditions on the dislocation glide are also studied. Analytical and computer simulation results are compared with experimental results obtained on precipitation-, irradiation-, forest-, and impurity cluster-hardening systems and are found to be in good agreement.
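    The superposition rule mentioned in the abstract for mixtures of obstacle strengths can be written compactly as below; the notation is generic and τ_i denotes the critical stress each obstacle population would produce on its own:

```latex
% Quadratic (Pythagorean-type) superposition of obstacle contributions,
% where tau_i is the CRSS that the i-th obstacle population would give alone:
\tau_c \;\approx\; \Bigl( \sum_i \tau_i^{2} \Bigr)^{1/2},
\qquad \text{e.g. } \tau_c \approx \sqrt{\tau_1^{2} + \tau_2^{2}} \text{ for two obstacle types.}
```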

  19. Simulated herbivory advances autumn phenology in Acer rubrum.

    Science.gov (United States)

    Forkner, Rebecca E

    2014-05-01

    To determine the degree to which herbivory contributes to phenotypic variation in autumn phenology for deciduous trees, red maple (Acer rubrum) branches were subjected to low and high levels of simulated herbivory and surveyed at the end of the season to assess abscission and degree of autumn coloration. Overall, branches with simulated herbivory abscised ∼7 % more leaves at each autumn survey date than did control branches within trees. While branches subjected to high levels of damage showed advanced phenology, abscission rates did not differ from those of undamaged branches within trees because heavy damage induced earlier leaf loss on adjacent branch nodes in this treatment. Damaged branches had greater proportions of leaf area colored than undamaged branches within trees, having twice the amount of leaf area colored at the onset of autumn and having ~16 % greater leaf area colored in late October when nearly all leaves were colored. When senescence was scored as the percent of all leaves abscised and/or colored, branches in both treatments reached peak senescence earlier than did control branches within trees: dates of 50 % senescence occurred 2.5 days earlier for low herbivory branches and 9.7 days earlier for branches with high levels of simulated damage. These advanced rates are of the same time length as reported delays in autumn senescence and advances in spring onset due to climate warming. Thus, results suggest that should insect damage increase as a consequence of climate change, it may offset a lengthening of leaf life spans in some tree species.

  20. DCMS: A data analytics and management system for molecular simulation.

    Science.gov (United States)

    Kumar, Anand; Grupcev, Vladimir; Berrada, Meryem; Fogarty, Joseph C; Tu, Yi-Cheng; Zhu, Xingquan; Pandit, Sagar A; Xia, Yuni

    Molecular Simulation (MS) is a powerful tool for studying physical/chemical features of large systems and has seen applications in many scientific and engineering domains. During the simulation process, the experiments generate a very large number of atoms and intend to observe their spatial and temporal relationships for scientific analysis. The sheer data volumes and their intensive interactions impose significant challenges for data accessing, managing, and analysis. To date, existing MS software systems fall short on storage and handling of MS data, mainly because of the lack of a platform to support applications that involve intensive data access and analytical processing. In this paper, we present the database-centric molecular simulation (DCMS) system our team developed in the past few years. The main idea behind DCMS is to store MS data in a relational database management system (DBMS) to take advantage of the declarative query interface (i.e., SQL), data access methods, query processing, and optimization mechanisms of modern DBMSs. A unique challenge is to handle the analytical queries that are often compute-intensive. For that, we developed novel indexing and query processing strategies (including algorithms running on modern co-processors) as integrated components of the DBMS. As a result, researchers can upload and analyze their data using efficient functions implemented inside the DBMS. Index structures are generated to store analysis results that may be interesting to other users, so that the results are readily available without duplicating the analysis. We have developed a prototype of DCMS based on the PostgreSQL system, and experiments using real MS data and workloads show that DCMS significantly outperforms existing MS software systems. We also used it as a platform to test other data management issues such as security and compression.

  1. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Science.gov (United States)

    2010-01-01

    ... minimum of 4 hours of training each year to become familiar with the operator's advanced simulation training program, or changes to it, and to emphasize their respective roles in the program. Training for...

  2. Advanced web metrics with Google Analytics

    CERN Document Server

    Clifton, Brian

    2012-01-01

    Get the latest information about using the #1 web analytics tool from this fully updated guide Google Analytics is the free tool used by millions of web site owners to assess the effectiveness of their efforts. Its revised interface and new features will offer even more ways to increase the value of your web site, and this book will teach you how to use each one to best advantage. Featuring new content based on reader and client requests, the book helps you implement new methods and concepts, track social and mobile visitors, use the new multichannel funnel reporting features, understand which

  3. Predictive Simulation of Material Failure Using Peridynamics -- Advanced Constitutive Modeling, Verification and Validation

    Science.gov (United States)

    2016-03-31

    AFRL-AFOSR-VA-TR-2016-0309. Predictive simulation of material failure using peridynamics: advanced constitutive modeling, verification, and validation. Distribution approved for public release. John T

  4. Analytical mechanics

    CERN Document Server

    Lemos, Nivaldo A

    2018-01-01

    Analytical mechanics is the foundation of many areas of theoretical physics including quantum theory and statistical mechanics, and has wide-ranging applications in engineering and celestial mechanics. This introduction to the basic principles and methods of analytical mechanics covers Lagrangian and Hamiltonian dynamics, rigid bodies, small oscillations, canonical transformations and Hamilton–Jacobi theory. This fully up-to-date textbook includes detailed mathematical appendices and addresses a number of advanced topics, some of them of a geometric or topological character. These include Bertrand's theorem, proof that action is least, spontaneous symmetry breakdown, constrained Hamiltonian systems, non-integrability criteria, KAM theory, classical field theory, Lyapunov functions, geometric phases and Poisson manifolds. Providing worked examples, end-of-chapter problems, and discussion of ongoing research in the field, it is suitable for advanced undergraduate students and graduate students studying analyt...

  5. Evaluation of FTIR-based analytical methods for the analysis of simulated wastes

    International Nuclear Information System (INIS)

    Rebagay, T.V.; Cash, R.J.; Dodd, D.A.; Lockrem, L.L.; Meacham, J.E.; Winkelman, W.D.

    1994-01-01

    Three FTIR-based analytical methods that have the potential to characterize simulated waste tank materials have been evaluated. These include: (1) fiber optics, (2) modular transfer optics using light guides equipped with non-contact sampling peripherals, and (3) photoacoustic spectroscopy. Pertinent instrumentation and experimental procedures for each method are described. The results show that the near-infrared (NIR) region of the infrared spectrum is the region of choice for the measurement of moisture in waste simulants. Differentiation of the NIR spectrum, as a preprocessing step, will improve the analytical result. Preliminary data indicate that prominent combination bands of water and the first overtone band of the ferrocyanide stretching vibration may be utilized to measure water and ferrocyanide species simultaneously. Both near-infrared and mid-infrared spectra must be collected, however, to measure ferrocyanide species unambiguously and accurately. For ease of sample handling and the potential for field or waste tank deployment, the FTIR fiber-optic method is preferred over the other two methods. Modular transfer optics using light guides and photoacoustic spectroscopy may be used as backup systems and for the validation of the fiber-optic data
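    The derivative preprocessing suggested for the NIR spectra can be sketched with a standard Savitzky-Golay filter; the synthetic spectrum, band position, and window settings below are illustrative assumptions, not the instrument parameters used in the evaluation:

```python
import numpy as np
from scipy.signal import savgol_filter

wavenumber = np.linspace(4000, 9000, 1000)        # cm^-1, an illustrative NIR range
# Synthetic absorbance: a broad water combination band near 5200 cm^-1 plus a sloping baseline
spectrum = 0.8 * np.exp(-((wavenumber - 5200) / 150) ** 2) + 1e-4 * wavenumber

# First-derivative preprocessing suppresses the baseline while keeping band information
first_derivative = savgol_filter(spectrum, window_length=15, polyorder=2, deriv=1)
print(first_derivative[:5])
```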

  6. Hybrid and electric advanced vehicle systems (heavy) simulation

    Science.gov (United States)

    Hammond, R. A.; Mcgehee, R. K.

    1981-01-01

    A computer program to simulate hybrid and electric advanced vehicle systems (HEAVY) is described. It is intended for use early in the design process: concept evaluation, alternative comparison, preliminary design, control and management strategy development, component sizing, and sensitivity studies. It allows the designer to quickly, conveniently, and economically predict the performance of a proposed drive train. The user defines the system to be simulated using a library of predefined component models that may be connected to represent a wide variety of propulsion systems. The development of three models is discussed as an example.

  7. Vision and Displays for Military and Security Applications The Advanced Deployable Day/Night Simulation Project

    CERN Document Server

    Niall, Keith K

    2010-01-01

    Vision and Displays for Military and Security Applications presents recent advances in projection technologies and associated simulation technologies for military and security applications. Specifically, this book covers night vision simulation, semi-automated methods in photogrammetry, and the development and evaluation of high-resolution laser projection technologies for simulation. Topics covered include: advances in high-resolution projection, advances in image generation, geographic modeling, and LIDAR imaging, as well as human factors research for daylight simulation and for night vision devices. This title is ideal for optical engineers, simulator users and manufacturers, geomatics specialists, human factors researchers, and for engineers working with high-resolution display systems. It describes leading-edge methods for human factors research, and it describes the manufacture and evaluation of ultra-high resolution displays to provide unprecedented pixel density in visual simulation.

  8. Interim Service ISDN Satellite (ISIS) simulator development for advanced satellite designs and experiments

    Science.gov (United States)

    Pepin, Gerard R.

    1992-01-01

    The simulation development associated with the network models of both the Interim Service Integrated Services Digital Network (ISDN) Satellite (ISIS) and the Full Service ISDN Satellite (FSIS) architectures is documented. The ISIS Network Model design represents satellite systems like the Advanced Communications Technology Satellite (ACTS) orbiting switch. The FSIS architecture, the ultimate aim of this element of the Satellite Communications Applications Research (SCAR) Program, moves all control and switching functions on-board the next generation ISDN communications satellite. The technical and operational parameters for the advanced ISDN communications satellite design will be obtained from the simulation of ISIS and FSIS engineering software models for their major subsystems. Discrete event simulation experiments will be performed with these models using various traffic scenarios, design parameters, and operational procedures. The data from these simulations will be used to determine the engineering parameters for the advanced ISDN communications satellite.

  9. Advances in downstream processing of biologics - Spectroscopy: An emerging process analytical technology.

    Science.gov (United States)

    Rüdt, Matthias; Briskot, Till; Hubbuch, Jürgen

    2017-03-24

    Process analytical technologies (PAT) for the manufacturing of biologics have drawn increased interest in the last decade. Besides being encouraged by the Food and Drug Administration's (FDA's) PAT initiative, PAT promises to improve process understanding, reduce overall production costs and help to implement continuous manufacturing. This article focuses on spectroscopic tools for PAT in downstream processing (DSP). Recent advances and future perspectives will be reviewed. In order to exploit the full potential of gathered data, chemometric tools are widely used for the evaluation of complex spectroscopic information. Thus, an introduction into the field will be given. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
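    As an illustration of the chemometric evaluation step mentioned above, a partial least squares (PLS) calibration relating spectra to an analyte concentration can be sketched as follows; the data are synthetic and the component count is an arbitrary choice for the example:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_samples, n_wavelengths = 120, 300
concentration = rng.uniform(0.1, 10.0, n_samples)
# Synthetic spectra: a concentration-scaled peak plus measurement noise
basis = np.exp(-((np.arange(n_wavelengths) - 150) / 30) ** 2)
spectra = np.outer(concentration, basis) + rng.normal(0, 0.05, (n_samples, n_wavelengths))

X_train, X_test, y_train, y_test = train_test_split(spectra, concentration, random_state=0)
pls = PLSRegression(n_components=3)
pls.fit(X_train, y_train)
print("R^2 on held-out spectra:", round(pls.score(X_test, y_test), 3))
```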

  10. Monte Carlo simulation models of breeding-population advancement.

    Science.gov (United States)

    J.N. King; G.R. Johnson

    1993-01-01

    Five generations of population improvement were modeled using Monte Carlo simulations. The model was designed to address questions that are important to the development of an advanced generation breeding population. Specifically we addressed the effects on both gain and effective population size of different mating schemes when creating a recombinant population for...
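    A stripped-down version of this kind of Monte Carlo breeding simulation is sketched below; the population size, selection intensity, heritability, and mating scheme are placeholder assumptions, not the parameters of the published model:

```python
import numpy as np

rng = np.random.default_rng(5)
pop_size, n_selected, h2 = 400, 40, 0.3          # hypothetical parameters
breeding_values = rng.normal(0.0, 1.0, pop_size)

for generation in range(5):
    # Phenotype = breeding value + environmental noise consistent with heritability h2
    env_sd = np.sqrt((1 - h2) / h2)
    phenotype = breeding_values + rng.normal(0.0, env_sd, pop_size)
    parents = breeding_values[np.argsort(phenotype)[-n_selected:]]   # truncation selection
    # Random mating among selected parents; offspring = mid-parent value + Mendelian sampling
    sires = rng.choice(parents, pop_size)
    dams = rng.choice(parents, pop_size)
    breeding_values = 0.5 * (sires + dams) + rng.normal(0.0, 0.5, pop_size)
    print(f"generation {generation + 1}: mean breeding value = {breeding_values.mean():.3f}")
```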

  11. Assessment of passive drag in swimming by numerical simulation and analytical procedure.

    Science.gov (United States)

    Barbosa, Tiago M; Ramos, Rui; Silva, António J; Marinho, Daniel A

    2018-03-01

    The aim was to compare the passive drag during underwater gliding obtained by numerical simulation and by an analytical procedure. An Olympic swimmer was scanned by computer tomography and modelled gliding at a 0.75-m depth in the streamlined position. Steady-state computational fluid dynamics (CFD) analyses were performed in Fluent. A set of analytical procedures was selected concurrently. Friction drag (D_f), pressure drag (D_pr), total passive drag force (D_f+pr) and drag coefficient (C_D) were computed between 1.3 and 2.5 m·s^-1 by both techniques. D_f+pr ranged from 45.44 to 144.06 N with CFD and from 46.03 to 167.06 N with the analytical procedure (differences: 1.28% to 13.77%). C_D ranged between 0.698 and 0.622 by CFD, and between 0.657 and 0.644 by the analytical procedures (differences: 0.40-6.30%). Linear regression models showed a very high association for D_f+pr plotted in absolute values (R^2 = 0.98) and after log-log transformation (R^2 = 0.99). The C_D also obtained a very high adjustment for both absolute (R^2 = 0.97) and log-log plots (R^2 = 0.97). The bias for D_f+pr was 8.37 N, and 0.076 N after logarithmic transformation. D_f represented between 15.97% and 18.82% of D_f+pr by CFD, and between 14.66% and 16.21% by the analytical procedures. Therefore, despite the bias, analytical procedures offer a feasible way of gathering insight into one's hydrodynamic characteristics.
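    The drag coefficient reported above is linked to the computed forces through the standard definition below (ρ is the water density, S a reference area of the swimmer, v the gliding velocity); this is the conventional relation, not a formula quoted from the paper:

```latex
% Conventional definitions linking total passive drag and drag coefficient:
D_{f+pr} = D_f + D_{pr},
\qquad
C_D = \frac{2\, D_{f+pr}}{\rho\, S\, v^{2}}.
```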

  12. Requirements for advanced simulation of nuclear reactor and chemicalseparation plants.

    Energy Technology Data Exchange (ETDEWEB)

    Palmiotti, G.; Cahalan, J.; Pfeiffer, P.; Sofu, T.; Taiwo, T.; Wei,T.; Yacout, A.; Yang, W.; Siegel, A.; Insepov, Z.; Anitescu, M.; Hovland,P.; Pereira, C.; Regalbuto, M.; Copple, J.; Willamson, M.

    2006-12-11

    This report presents requirements for advanced simulation of nuclear reactor and chemical processing plants that are of interest to the Global Nuclear Energy Partnership (GNEP) initiative. Justification for advanced simulation and some examples of grand challenges that will benefit from it are provided. An integrated software tool, whose main components are based on first principles whenever possible, is proposed as a possible future approach for dealing with the complex problems linked to the simulation of nuclear reactor and chemical processing plants. The main benefits associated with better integrated simulation have been identified as: a reduction of design margins, a decrease in the number of experiments in support of the design process, a shortening of the developmental design cycle, and a better understanding of the physical phenomena and the related underlying fundamental processes. For each component of the proposed integrated software tool, background information, functional requirements, current tools and approaches, and proposed future approaches have been provided. Whenever possible, current uncertainties have been quoted and existing limitations have been presented. Desired target accuracies, with associated benefits to the different aspects of the nuclear reactor and chemical processing plants, were also given. In many cases the possible gains associated with better simulation have been identified, quantified, and translated into economic benefits.

  13. Computer aided design of Langasite resonant cantilevers: analytical models and simulations

    Science.gov (United States)

    Tellier, C. R.; Leblois, T. G.; Durand, S.

    2010-05-01

    Analytical models for the piezoelectric excitation and for the wet micromachining of resonant cantilevers are proposed. Firstly, computations of the metrological performances of micro-resonators allow us to select special cuts and special alignments of the cantilevers. Secondly, the self-elaborated simulator TENSOSIM, based on the kinematic and tensorial model, furnishes the etching shapes of cantilevers. As a result, the number of selected cuts is reduced. Finally, the simulator COMSOL® is used to evaluate the influence of the final etching shape on metrological performances and especially on the resonance frequency. Changes in frequency are evaluated, and deviating behaviours of structures with less favourable built-ins are tested, showing that the X cut is the best cut for LGS resonant cantilevers vibrating in flexural modes (type 1 and type 2) or in torsion mode.

  14. Markov chains analytic and Monte Carlo computations

    CERN Document Server

    Graham, Carl

    2014-01-01

    Markov Chains: Analytic and Monte Carlo Computations introduces the main notions related to Markov chains and provides explanations on how to characterize, simulate, and recognize them. Starting with basic notions, this book leads progressively to advanced and recent topics in the field, allowing the reader to master the main aspects of the classical theory. This book also features: numerous exercises with solutions as well as extended case studies; a detailed and rigorous presentation of Markov chains with discrete time and state space; an appendix presenting probabilistic notions that are nec
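
    As a minimal illustration of simulating a discrete-time Markov chain with a finite state space, the sketch below uses an arbitrary three-state transition matrix that is not taken from the book.

        import numpy as np

        def simulate_chain(P, x0, n_steps, seed=0):
            """Simulate a discrete-time Markov chain with transition matrix P."""
            rng = np.random.default_rng(seed)
            states = np.arange(P.shape[0])
            path = [x0]
            for _ in range(n_steps):
                path.append(rng.choice(states, p=P[path[-1]]))
            return np.array(path)

        # arbitrary 3-state transition matrix (rows sum to 1)
        P = np.array([[0.8, 0.1, 0.1],
                      [0.2, 0.6, 0.2],
                      [0.3, 0.3, 0.4]])
        path = simulate_chain(P, x0=0, n_steps=10000)
        print(np.bincount(path) / path.size)   # empirical occupation vs. stationary distribution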

  15. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus. This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concepts of polar and rectangular coordinates, surfaces and curves, and planes. This book will prove useful to undergraduate trigonometric st

  16. Analytical simulation of RBS spectra of nanowire samples

    Energy Technology Data Exchange (ETDEWEB)

    Barradas, Nuno P., E-mail: nunoni@ctn.ist.utl.pt [Centro de Ciências e Tecnologias Nucleares, Instituto Superior Técnico, Universidade de Lisboa, E.N. 10 ao km 139,7, 2695-066 Bobadela LRS (Portugal); García Núñez, C. [Laboratorio de Electrónica y Semiconductores, Departamento de Física Aplicada, Universidad Autónoma de Madrid, 28049 Madrid (Spain); Redondo-Cubero, A. [Laboratorio de Electrónica y Semiconductores, Departamento de Física Aplicada, Universidad Autónoma de Madrid, 28049 Madrid (Spain); Centro de Micro-Análisis de Materiales, Universidad Autónoma de Madrid, 28049 Madrid (Spain); Shen, G.; Kung, P. [Department of Electrical and Computer Engineering, The University of Alabama, AL 35487 (United States); Pau, J.L. [Laboratorio de Electrónica y Semiconductores, Departamento de Física Aplicada, Universidad Autónoma de Madrid, 28049 Madrid (Spain)

    2016-03-15

    Almost all, if not all, general-purpose codes for analysis of Ion Beam Analysis data were originally developed to handle laterally homogeneous samples only. This is the case for RUMP, NDF, SIMNRA, and even the Monte Carlo code Corteo. General-purpose codes usually include only limited support for lateral inhomogeneity. In this work, we show analytical simulations of samples that consist of a layer of parallel oriented nanowires on a substrate, using a model implemented in NDF. We apply the code to real samples, made of vertical ZnO nanowires on a sapphire substrate. Two configurations of the nanowires were studied: 40 nm diameter, 4.1 μm height, 3.5% surface coverage; and 55 nm diameter, 1.1 μm height, 42% surface coverage. We discuss the accuracy and limits of applicability of the analysis.

  17. Accelerating development of advanced inverters :

    Energy Technology Data Exchange (ETDEWEB)

    Neely, Jason C.; Gonzalez, Sigifredo; Ropp, Michael; Schutz, Dustin

    2013-11-01

    The high penetration of utility-interconnected photovoltaic (PV) systems is causing heightened concern over the effect that variable renewable generation will have on the electrical power system (EPS). These concerns have initiated the need to amend the utility interconnection standard to allow advanced inverter control functionalities that provide: (1) reactive power control for voltage support, (2) real power control for frequency support and (3) better tolerance of grid disturbances. These capabilities are aimed at minimizing the negative impact distributed PV systems may have on EPS voltage and frequency. Unfortunately, these advanced control functions may interfere with island detection schemes, and further development of advanced inverter functions requires a study of the effect of advanced functions on the efficacy of anti-islanding schemes employed in industry. This report summarizes the analytical, simulation and experimental work to study interactions between advanced inverter functions and anti-islanding schemes being employed in distributed PV systems.

  18. The Osseus platform: a prototype for advanced web-based distributed simulation

    Science.gov (United States)

    Franceschini, Derrick; Riecken, Mark

    2016-05-01

    Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.

  19. Open-Source Integrated Design-Analysis Environment For Nuclear Energy Advanced Modeling & Simulation Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    O' Leary, Patrick [Kitware, Inc., Clifton Park, NY (United States)

    2017-01-30

    The framework created through the Open-Source Integrated Design-Analysis Environment (IDAE) for Nuclear Energy Advanced Modeling & Simulation grant has simplified and democratized advanced modeling and simulation in the nuclear energy industry across a range of nuclear engineering applications. It leverages millions of investment dollars from the Department of Energy's Office of Nuclear Energy for modeling and simulation of light water reactors and the Office of Nuclear Energy's research and development. The IDAE framework enhanced Kitware's Computational Model Builder (CMB) while leveraging existing open-source toolkits and creating a graphical end-to-end umbrella guiding end-users and developers through the nuclear energy advanced modeling and simulation lifecycle. In addition, the work delivered strategic advancements in meshing and visualization for ensembles.

  20. The Vienna LTE-advanced simulators up and downlink, link and system level simulation

    CERN Document Server

    Rupp, Markus; Taranetz, Martin

    2016-01-01

    This book introduces the Vienna Simulator Suite for 3rd-Generation Partnership Project (3GPP)-compatible Long Term Evolution-Advanced (LTE-A) simulators and presents applications to demonstrate their uses for describing, designing, and optimizing wireless cellular LTE-A networks. Part One addresses LTE and LTE-A link level techniques. As there has been high demand for the downlink (DL) simulator, it constitutes the central focus of the majority of the chapters. This part of the book reports on relevant highlights, including single-user (SU), multi-user (MU) and single-input-single-output (SISO) as well as multiple-input-multiple-output (MIMO) transmissions. Furthermore, it summarizes the optimal pilot pattern for high-speed communications as well as different synchronization issues. One chapter is devoted to experiments that show how the link level simulator can provide input to a testbed. This section also uses measurements to present and validate fundamental results on orthogonal frequency division multiple...

  1. Analytic analysis of auxetic metamaterials through analogy with rigid link systems

    OpenAIRE

    Rayneau-Kirkhope, Daniel; Zhang, Chengzhao; Theran, Louis; Dias, Marcelo A.

    2017-01-01

    Recent progress in advanced additive manufacturing techniques has stimulated the growth of the field of mechanical metamaterials. One area of particular interest in this subject is the creation of auxetic material properties through elastic instability. This paper focuses on a novel methodology in the analysis of auxetic metamaterials through analogy with rigid link lattice systems. Our analytic methodology gives extremely good agreement with finite element simulations for both the onset of elas...

  2. Advance simulation capability for environmental management (ASCEM) - 59065

    International Nuclear Information System (INIS)

    Dixon, Paul; Keating, Elizabeth; Moulton, David; Williamson, Mark; Collazo, Yvette; Gerdes, Kurt; Freshley, Mark; Gorton, Ian; Meza, Juan

    2012-01-01

    The United States Department of Energy (DOE) Office of Environmental Management (EM) determined that uniform application of advanced modeling in the subsurface could help reduce the cost and risks associated with its environmental cleanup mission. In response to this determination, the EM Office of Technology Innovation and Development (OTID), Groundwater and Soil Remediation (GW&S), began the program Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for integrating data and scientific understanding to enable prediction of contaminant fate and transport in natural and engineered systems. This initiative supports the reduction of uncertainties and risks associated with EM's environmental cleanup and closure programs through better understanding and quantification of subsurface flow and contaminant transport behavior in complex geological systems. This involves the long-term performance of engineered components, including cementitious materials in nuclear waste disposal facilities, that may be sources of future contamination of the subsurface. This paper describes the ASCEM tools and approach and the ASCEM programmatic accomplishments completed in 2010, including recent advances and technology transfer. The US Department of Energy Office of Environmental Management has begun development of an Advanced Simulation Capability for Environmental Management (ASCEM). This program will provide predictions of the end states of contaminated areas, allowing for cost and risk reduction of EM remedial activities. ASCEM will provide the tools and approaches necessary to standardize risk and performance assessments across the DOE complex. Through its Phase One demonstration, the ASCEM team has shown value to the EM community in the areas of High Performance Computing, Data Management, Visualization, and Uncertainty Quantification. In 2012, ASCEM will provide an initial limited release of a community code for

  3. Simulation of reactive geochemical transport in groundwater using a semi-analytical screening model

    Science.gov (United States)

    McNab, Walt W.

    1997-10-01

    A reactive geochemical transport model, based on a semi-analytical solution to the advective-dispersive transport equation in two dimensions, is developed as a screening tool for evaluating the impact of reactive contaminants on aquifer hydrogeochemistry. Because the model utilizes an analytical solution to the transport equation, it is less computationally intensive than models based on numerical transport schemes, is faster, and it is not subject to numerical dispersion effects. Although the assumptions used to construct the model preclude consideration of reactions between the aqueous and solid phases, thermodynamic mineral saturation indices are calculated to provide qualitative insight into such reactions. Test problems involving acid mine drainage and hydrocarbon biodegradation signatures illustrate the utility of the model in simulating essential hydrogeochemical phenomena.
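
    For orientation, the sketch below evaluates a minimal one-dimensional analogue of such a semi-analytical transport screening calculation (the Ogata-Banks solution for a continuous source); the parameter values are illustrative only, and the paper's model is two-dimensional and geochemically reactive.

        import numpy as np
        from scipy.special import erfc

        def ogata_banks(x, t, v, D, c0=1.0):
            """1D advection-dispersion, continuous source at x=0, zero initial concentration."""
            a = (x - v * t) / (2.0 * np.sqrt(D * t))
            b = (x + v * t) / (2.0 * np.sqrt(D * t))
            return 0.5 * c0 * (erfc(a) + np.exp(v * x / D) * erfc(b))

        x = np.linspace(1.0, 100.0, 5)                            # distance downgradient, m
        print(ogata_banks(x, t=365 * 86400.0, v=1e-6, D=1e-5))    # v in m/s, D in m^2/s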

  4. Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, Robert [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-09-11

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC’s business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive

  5. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  6. Advanced Techniques for Reservoir Simulation and Modeling of Non-Conventional Wells

    Energy Technology Data Exchange (ETDEWEB)

    Durlofsky, Louis J.

    2000-08-28

    This project targets the development of (1) advanced reservoir simulation techniques for modeling non-conventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and well index (for use in simulation models), including the effects of wellbore flow; and (3) accurate approaches to account for heterogeneity in the near-well region.

  7. The Advanced Modeling, Simulation and Analysis Capability Roadmap Vision for Engineering

    Science.gov (United States)

    Zang, Thomas; Lieber, Mike; Norton, Charles; Fucik, Karen

    2006-01-01

    This paper summarizes a subset of the Advanced Modeling Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered "to identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The Engineering roadmap element contains 5 sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) Advanced Uncertainty Models, (4) Virtual Testing Models, and (5) Space-Based Robotics Manufacture and Servicing Models.

  8. Advanced Engineering Environments: Implications for Aerospace Manufacturing

    Science.gov (United States)

    Thomas, D.

    2001-01-01

    There are significant challenges facing today's aerospace industry. Global competition, more complex products, geographically-distributed design teams, demands for lower cost, higher reliability and safer vehicles, and the need to incorporate the latest technologies quicker all face the developer of aerospace systems. New information technologies offer promising opportunities to develop advanced engineering environments (AEEs) to meet these challenges. Significant advances in the state-of-the-art of aerospace engineering practice are envisioned in the areas of engineering design and analytical tools, cost and risk tools, collaborative engineering, and high-fidelity simulations early in the development cycle. These advances will enable modeling and simulation of manufacturing methods, which will in turn allow manufacturing considerations to be included much earlier in the system development cycle. Significant cost savings, increased quality, and decreased manufacturing cycle time are expected to result. This paper will give an overview of the NASA's Intelligent Synthesis Environment, the agency initiative to develop an AEE, with a focus on the anticipated benefits in aerospace manufacturing.

  9. Evaluation of the Inertial Response of Variable-Speed Wind Turbines Using Advanced Simulation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Scholbrock, Andrew K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Muljadi, Eduard [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gevorgian, Vahan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wang, Xiao [Northeastern University; Gao, Wenzhong [University of Denver; Yan, Weihang [University of Denver; Wang, Jianhui [Northeastern University

    2017-08-09

    In this paper, we focus on the temporary frequency support effect provided by wind turbine generators (WTGs) through the inertial response. With the implemented inertial control methods, the WTG is capable of increasing its active power output by releasing parts of the stored kinetic energy when the frequency excursion occurs. The active power can be boosted temporarily above the maximum power points, but the rotor speed deceleration follows and an active power output deficiency occurs during the restoration of rotor kinetic energy. In this paper, we evaluate and compare the inertial response induced by two distinct inertial control methods using advanced simulation. In the first stage, the proposed inertial control methods are analyzed in offline simulation. Using an advanced wind turbine simulation program, FAST with TurbSim, the response of the researched wind turbine is comprehensively evaluated under turbulent wind conditions, and the impact on the turbine mechanical components are assessed. In the second stage, the inertial control is deployed on a real 600-kW wind turbine, the three-bladed Controls Advanced Research Turbine, which further verifies the inertial control through a hardware-in-the-loop simulation. Various inertial control methods can be effectively evaluated based on the proposed two-stage simulation platform, which combines the offline simulation and real-time hardware-in-the-loop simulation. The simulation results also provide insights in designing inertial control for WTGs.
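
    A common form of the inertial-response supplement referred to above adds a term proportional to the rate of change of grid frequency to the power reference; the sketch below is a generic df/dt-based example with assumed gains, not the control laws evaluated in the paper.

        def inertial_power_boost(df_dt, f_nom=60.0, k_inertia=10.0, p_rated=600e3):
            """Synthetic-inertia power increment: proportional to the frequency
            derivative, normalised by nominal frequency (gains are illustrative)."""
            return -k_inertia * (df_dt / f_nom) * p_rated

        # e.g. a 0.2 Hz/s frequency drop commands a temporary power boost in watts
        print(inertial_power_boost(-0.2))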

  10. Simulation of an Electromagnetic Acoustic Transducer Array by Using Analytical Method and FDTD

    Directory of Open Access Journals (Sweden)

    Yuedong Xie

    2016-01-01

    Previously, we developed a method based on FEM and FDTD for the study of an Electromagnetic Acoustic Transducer Array (EMAT). This paper presents a new analytical solution to the eddy current problem for the meander coil used in an EMAT, which is adapted from the classic Deeds and Dodd solution originally intended for circular coils. The analytical solution resulting from this novel adaptation exploits the large-radius extrapolation and shows several advantages over the finite element method (FEM), especially in the higher frequency regime. The calculated Lorentz force density from the analytical EM solver is then coupled to the ultrasonic simulations, which exploit the finite-difference time-domain (FDTD) method to describe the propagation of ultrasound waves, in particular Rayleigh waves. A radiation pattern obtained by applying the Hilbert transform to time-domain waveforms is proposed to characterise the sensor in terms of its beam directivity and field distribution along the steering angle, which can produce performance parameters for an EMAT array, facilitating the optimum design of such sensors.
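
    As a minimal, self-contained illustration of the FDTD idea used for the ultrasonic part, the sketch below advances a one-dimensional scalar wave equation with a Gaussian source; the paper's solver is of course multi-dimensional and driven by the computed Lorentz force density.

        import numpy as np

        def fdtd_1d(nx=400, nt=800, c=3000.0, dx=1e-3, cfl=0.9, src=80):
            """Explicit FDTD update for u_tt = c^2 u_xx with a soft Gaussian source."""
            dt = cfl * dx / c                       # CFL-stable time step
            u_prev, u = np.zeros(nx), np.zeros(nx)
            for it in range(nt):
                u_next = np.zeros(nx)
                u_next[1:-1] = (2.0 * u[1:-1] - u_prev[1:-1]
                                + (c * dt / dx) ** 2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]))
                u_next[src] += np.exp(-((it - 30) / 8.0) ** 2)   # illustrative source pulse
                u_prev, u = u, u_next
            return u

        print(np.abs(fdtd_1d()).max())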

  11. Big data, advanced analytics and the future of comparative effectiveness research.

    Science.gov (United States)

    Berger, Marc L; Doban, Vitalii

    2014-03-01

    The intense competition that accompanied the growth of internet-based companies ushered in the era of 'big data' characterized by major innovations in processing of very large amounts of data and the application of advanced analytics including data mining and machine learning. Healthcare is on the cusp of its own era of big data, catalyzed by the changing regulatory and competitive environments, fueled by growing adoption of electronic health records, as well as efforts to integrate medical claims, electronic health records and other novel data sources. Applying the lessons from big data pioneers will require healthcare and life science organizations to make investments in new hardware and software, as well as in individuals with different skills. For life science companies, this will impact the entire pharmaceutical value chain from early research to postcommercialization support. More generally, this will revolutionize comparative effectiveness research.

  12. Multi-purpose use of the advanced CANDU compact simulator

    International Nuclear Information System (INIS)

    Lam, K.Y.; MacBeth, M.J.

    1997-01-01

    A near full-scope dynamic model of a CANDU-PHWR (Canadian Deuterium Uranium Pressurized Heavy Water) nuclear power plant was constructed as a multi-purpose advanced Compact Simulator using CASSIM (Cassiopeia Simulation) development system. This Compact Simulator has played an integral part in the design and verification of the CANDU 900 MW control centre mock-up located in the Atomic Energy of Canada (AECL) design office, providing CANDU plant process dynamic data to the Plant Display System (PDS) and the Distributed Control System (DCS), as well as mock-up panel devices. As a design tool, the Compact Simulator is intended to be used for control strategy development, human factors studies, analysis of overall plant control performance, tuning estimates for major control loops. As a plant commissioning and operational strategy development tool, the simulation is intended to be used to evaluate routine and non-routine operational procedures, practice 'what-if' scenarios for operational strategy development, practice malfunction recovery procedures and verify human factors activities

  13. A CRITICAL STUDY AND COMPARISON OF MANUFACTURING SIMULATION SOFTWARES USING ANALYTIC HIERARCHY PROCESS

    Directory of Open Access Journals (Sweden)

    ASHU GUPTA

    2010-03-01

    In a period of continuous change in the global business environment, organizations, large and small, are finding it increasingly difficult to deal with, and adjust to, the demands for such change. Simulation is a powerful tool that allows designers to imagine new systems and enables them to both quantify and observe behavior. Currently the market offers a variety of simulation software packages. Some are less expensive than others. Some are generic and can be used in a wide variety of application areas while others are more specific. Some have powerful features for modeling while others provide only basic features. Modeling approaches and strategies differ from package to package. Companies are seeking advice about the desirable features of software for manufacturing simulation, depending on the purpose of its use. Because of this, the importance of an adequate approach to simulation software evaluation and comparison is apparent. This paper presents a critical evaluation of four widely used manufacturing simulators: NX-IDEAS, Star-CD, Micro Saint Sharp and ProModel. Following a review of research into simulation software evaluation, an evaluation and comparison of the above simulators is performed. This paper illustrates and assesses the role the Analytic Hierarchy Process (AHP) played in simulation software evaluation and selection. The main purpose of this evaluation and comparison is to discover the suitability of certain types of simulators for particular purposes.
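
    A minimal sketch of the AHP priority and consistency calculation used in such comparisons is given below; the pairwise-comparison matrix is invented for illustration and is not the paper's data.

        import numpy as np

        def ahp_weights(A):
            """Priority vector (principal eigenvector) and consistency ratio of a
            pairwise-comparison matrix A (Saaty's AHP)."""
            vals, vecs = np.linalg.eig(A)
            k = np.argmax(vals.real)
            w = np.abs(vecs[:, k].real)
            w /= w.sum()
            n = A.shape[0]
            ci = (vals[k].real - n) / (n - 1)                  # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]       # random index (Saaty)
            return w, ci / ri                                  # weights, consistency ratio

        # illustrative 4-criterion comparison (e.g. cost, modeling power, ease of use, support)
        A = np.array([[1, 3, 5, 7],
                      [1/3, 1, 3, 5],
                      [1/5, 1/3, 1, 3],
                      [1/7, 1/5, 1/3, 1]], dtype=float)
        w, cr = ahp_weights(A)
        print(np.round(w, 3), round(cr, 3))   # CR < 0.1 indicates acceptable consistency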

  14. A Distributed Simulation Facility to Support Human Factors Research in Advanced Air Transportation Technology

    Science.gov (United States)

    Amonlirdviman, Keith; Farley, Todd C.; Hansman, R. John, Jr.; Ladik, John F.; Sherer, Dana Z.

    1998-01-01

    A distributed real-time simulation of the civil air traffic environment developed to support human factors research in advanced air transportation technology is presented. The distributed environment is based on a custom simulation architecture designed for simplicity and flexibility in human experiments. Standard Internet protocols are used to create the distributed environment, linking an advanced cockpit simulator, an Air Traffic Control simulator, and a pseudo-aircraft control and simulation management station. The pseudo-aircraft control station also functions as a scenario design tool for coordinating human factors experiments. This station incorporates a pseudo-pilot interface designed to reduce workload for human operators piloting multiple aircraft simultaneously in real time. The application of this distributed simulation facility to support a study of the effect of shared information (via air-ground datalink) on pilot/controller shared situation awareness and re-route negotiation is also presented.

  15. Course on Advanced Analytical Chemistry and Chromatography

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Fristrup, Peter; Nielsen, Kristian Fog

    2011-01-01

    Methods of analytical chemistry constitute an integral part of decision making in chemical research, and students must master a high degree of knowledge in order to perform reliable analyses. At the DTU departments of chemistry it was thus decided to develop a course that would be attractive to master students of different directions of study, to Ph.D. students and to professionals who need an update of their current skills and knowledge. A course of 10 ECTS points was devised with the purpose of introducing students to analytical chemistry and chromatography with the aim of including theory

  16. Application of modified analytical function for approximation and computer simulation of diffraction profile

    International Nuclear Information System (INIS)

    Marrero, S. I.; Turibus, S. N.; Assis, J. T. De; Monin, V. I.

    2011-01-01

    Data processing for most diffraction experiments is based on determination of the diffraction line position and measurement of the broadening of the diffraction profile. High precision and digitalisation of these procedures can be achieved by approximating experimental diffraction profiles with analytical functions. Various functions exist for this purpose: simple ones, like the Gauss function, that are not suitable for a wide range of experimental profiles, and good approximating functions that are complicated to use in practice, like the Voigt or PearsonVII functions. The proposed analytical function is a modified Cauchy function with two variable parameters, which allows any experimental diffraction profile to be described. In the presented paper the modified function was applied to the approximation of diffraction lines of steels after various physical and mechanical treatments and to the simulation of diffraction profiles used in the study of stress gradients and distortions of the crystal structure. (Author)
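
    A minimal sketch of fitting a diffraction peak with a two-parameter Cauchy-type (Lorentzian-like) profile by least squares is shown below; the exact functional form and the synthetic data are illustrative assumptions, not the authors' formulation.

        import numpy as np
        from scipy.optimize import curve_fit

        def modified_cauchy(x, x0, w, m, i0):
            """Generalized Cauchy peak with width w and shape exponent m (illustrative form)."""
            return i0 / (1.0 + ((x - x0) / w) ** 2) ** m

        # synthetic "measured" profile with noise (illustrative data only)
        rng = np.random.default_rng(0)
        x = np.linspace(42.0, 46.0, 200)                      # 2-theta, degrees
        y = modified_cauchy(x, 44.0, 0.25, 1.3, 1000.0) + rng.normal(0.0, 10.0, x.size)

        popt, _ = curve_fit(modified_cauchy, x, y, p0=[44.1, 0.3, 1.0, 900.0])
        print(np.round(popt, 3))   # recovered position, width, shape, intensity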

  17. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  18. Equipping simulators with an advanced thermal hydraulics model EDF's experience

    International Nuclear Information System (INIS)

    Soldermann, R.; Poizat, F.; Sekri, A.; Faydide, B.; Dumas, J.M.

    1997-01-01

    The development of an accelerated version of the advanced CATHARE-1 thermal hydraulics code designed for EDF training simulators (CATHARE-SIMU) was successfully completed as early as 1991. Its successful integration as the principal model of the SIPA Post-Accident Simulator meant that its use could be extended to full-scale simulators as part of the renovation of the stock of existing simulators. In order to further extend the field of application to accidents occurring in shutdown states requiring action, and to catch up with developments of the CATHARE code, EDF initiated the SCAR Project designed to adapt CATHARE-2 to simulator requirements (acceleration, parallelization of the computation and extension of the simulation range). In other respects, the installation of SIPA on workstations means that the authors can envisage the application of this remarkable training facility to the understanding of thermal hydraulics accident phenomena

  19. Scientific Discovery through Advanced Computing in Plasma Science

    Science.gov (United States)

    Tang, William

    2005-03-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's "Scientific Discovery through Advanced Computing" (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in super-computing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations

  20. Advanced computations in plasma physics

    International Nuclear Information System (INIS)

    Tang, W.M.

    2002-01-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  1. Analytic modeling, simulation and interpretation of broadband beam coupling impedance bench measurements

    Energy Technology Data Exchange (ETDEWEB)

    Niedermayer, U., E-mail: niedermayer@temf.tu-darmstadt.de [Institut für Theorie Elektromagnetischer Felder (TEMF), Technische Universität Darmstadt, Schloßgartenstraße 8, 64289 Darmstadt (Germany); Eidam, L. [Institut für Theorie Elektromagnetischer Felder (TEMF), Technische Universität Darmstadt, Schloßgartenstraße 8, 64289 Darmstadt (Germany); Boine-Frankenheim, O. [Institut für Theorie Elektromagnetischer Felder (TEMF), Technische Universität Darmstadt, Schloßgartenstraße 8, 64289 Darmstadt (Germany); GSI Helmholzzentrum für Schwerionenforschung, Planckstraße 1, 64291 Darmstadt (Germany)

    2015-03-11

    First, a generalized theoretical approach towards beam coupling impedances and stretched-wire measurements is introduced. Applied to a circularly symmetric setup, this approach allows beam and wire impedances to be compared. The conversion formulas for TEM scattering parameters from measurements to impedances are thoroughly analyzed and compared to the analytical beam impedance solution. A proof of validity for the distributed impedance formula is given. The interaction of the beam or the TEM wave with dispersive material such as ferrite is discussed. The dependence of the obtained beam impedance on the relativistic velocity β is investigated and found to be material-property dependent. Second, numerical simulations of wakefields and scattering parameters are compared. The applicability of scattering parameter conversion formulas for finite device length is investigated. Laboratory measurement results for a circularly symmetric test setup, i.e. a ferrite ring, are shown and compared to analytic and numeric models. The optimization of the measurement process and error reduction strategies are discussed.
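
    For orientation, the sketch below evaluates the two conversion formulas commonly used for stretched-wire bench data of the kind analyzed in the paper, the lumped-element formula and the distributed (log) formula; the characteristic line impedance value and the sample S21 are assumed placeholders.

        import numpy as np

        def wire_impedances(s21_dut, s21_ref, z_line=300.0):
            """Lumped-element and distributed (log) impedance estimates from complex
            TEM transmission coefficients measured with a stretched wire."""
            s = np.asarray(s21_dut) / np.asarray(s21_ref)   # normalise to a reference pipe
            z_lumped = 2.0 * z_line * (1.0 - s) / s
            z_log = -2.0 * z_line * np.log(s)
            return z_lumped, z_log

        zl, zlog = wire_impedances(0.98 * np.exp(-1j * 0.02), 1.0)
        print(np.round(zl, 2), np.round(zlog, 2))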

  2. Advancements Made to the Wingman Software-in-the-Loop (SIL) Simulation: How to Operate the SIL

    Science.gov (United States)

    2017-12-01

    then comparing the positions in the simulation. This required going through the mesh generation and conversion process multiple times. b. One of the... (ARL-TR-8254, US Army Research Laboratory, December 2017: Advancements Made to the Wingman Software-in-the-Loop (SIL) Simulation: How to Operate the SIL.)

  3. A Virtual Engineering Framework for Simulating Advanced Power System

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Stanislav Borodai

    2008-06-18

    This report describes the work performed to provide NETL with VE-Suite-based Virtual Engineering software and enhanced equipment models to support NETL's Advanced Process Engineering Co-simulation (APECS) framework for advanced power generation systems. Enhancements to the software framework facilitated an important link between APECS and the virtual engineering capabilities provided by VE-Suite (e.g., equipment and process visualization, information assimilation). Model enhancements focused on improving predictions for the performance of entrained-flow coal gasifiers and important auxiliary equipment (e.g., Air Separation Units) used in coal gasification systems. In addition, a Reduced Order Model generation tool and software to provide a coupling between APECS/AspenPlus and the GE GateCycle simulation system were developed. CAPE-Open model interfaces were employed where needed. The improved simulation capability is demonstrated on selected test problems. As part of the project, an Advisory Panel was formed to provide guidance on the issues on which to focus the work effort. The Advisory Panel included experts from industry and academia in gasification, CO2 capture issues, and process simulation, as well as representatives from technology developers and the electric utility industry. To optimize the benefit to NETL, REI coordinated its efforts with NETL and NETL-funded projects at Iowa State University, Carnegie Mellon University and ANSYS/Fluent, Inc. The improved simulation capabilities incorporated into APECS will enable researchers and engineers to better understand the interactions of different equipment components, identify weaknesses and processes needing improvement, and thereby allow more efficient, less expensive plants to be developed and brought on-line faster and in a more cost-effective manner. These enhancements to APECS represent an important step toward having a fully integrated environment for performing plant simulation and engineering

  4. Editorial: Advances in Health Education Applying E-Learning, Simulations and Distance Technologies

    Directory of Open Access Journals (Sweden)

    Andre W. Kushniruk

    2011-03-01

    This special issue of the KM&EL international journal is dedicated to coverage of novel advances in health professional education applying e-Learning, simulations and distance education technologies. Modern healthcare is beginning to be transformed through the emergence of new information technologies and rapid advances in health informatics. Advances such as electronic health record systems (EHRs), clinical decision support systems and other advanced information systems such as public health surveillance systems are rapidly being deployed worldwide. The education of health professionals such as medical, nursing and allied health professionals will require an improved understanding of these technologies and how they will transform their healthcare practice. However, currently there is a lack of integration of knowledge and skills related to such technology in health professional education. In this issue of the journal we present articles that describe a set of novel approaches to integrating essential health information technology into the education of health professionals, as well as the use of advanced information technologies and e-Learning approaches for improving health professional education. The approaches range from use of simulations to development of novel Web-based platforms for allowing students to interact with the technologies and healthcare practices that are rapidly changing healthcare.

  5. MERRA Analytic Services

    Science.gov (United States)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here will focus on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (Figures: (A) MERRA/AS software stack; (B) example MERRA/AS interfaces.)
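
    A toy sketch of the MapReduce pattern referred to above, reduced to a single-process Python analogue, is given below; the record structure is invented for illustration, and MERRA/AS itself runs this style of computation on storage-side parallel infrastructure.

        from functools import reduce

        # toy records: (timestep, area-mean temperature in K), standing in for MERRA fields
        records = [(0, 287.1), (1, 287.4), (2, 286.9), (3, 287.8)]

        # map phase: each record contributes a (sum, count) pair
        mapped = map(lambda rec: (rec[1], 1), records)

        # reduce phase: pairwise combination of partial sums and counts
        total, count = reduce(lambda a, b: (a[0] + b[0], a[1] + b[1]), mapped)
        print(total / count)   # time-averaged value, computed in two MapReduce-style phases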

  6. Workplace Skills Taught in a Simulated Analytical Department

    Science.gov (United States)

    Sonchik Marine, Susan

    2001-11-01

    Integration of workplace skills into the academic setting is paramount for any chemical technology program. In addition to the expected chemistry content, courses must build proficiency in oral and written communication skills, computer skills, laboratory safety, and logical troubleshooting. Miami University's Chemical Technology II course is set up as a contract analytical laboratory. Students apply the advanced sampling techniques, quality assurance, standard methods, and statistical analyses they have studied. For further integration of workplace skills, weekly "department meetings" are held where the students, as members of the department, report on their work in progress, present completed projects, and share what they have learned and what problems they have encountered. Information is shared between the experienced members of the department and those encountering problems or starting a new project. The instructor, as department manager, makes announcements, reviews company and department status, and assigns work for the coming week. The department members report results to clients in formal reports or in short memos. Factors affecting the success of the "department meeting" approach include the formality of the meeting room, use of an official agenda, the frequency, time, and duration of the meeting, and accountability of the students.

  7. Effect of Advanced Trauma Life Support program on medical interns' performance in simulated trauma patient management.

    Science.gov (United States)

    Ahmadi, Koorosh; Sedaghat, Mohammad; Safdarian, Mahdi; Hashemian, Amir-Masoud; Nezamdoust, Zahra; Vaseie, Mohammad; Rahimi-Movaghar, Vafa

    2013-01-01

    Since appropriate and timely methods in trauma care have an important impact on patients' outcomes, we evaluated the effect of the Advanced Trauma Life Support (ATLS) program on medical interns' performance in simulated trauma patient management. A descriptive and analytical before-and-after study was conducted on 24 randomly selected undergraduate medical interns from Imam Reza Hospital in Mashhad, Iran. On the first day, we assessed interns' clinical knowledge and their practical skill performance in confronting simulated trauma patients. After 2 days of ATLS training, we repeated the assessment and evaluated their scores again on the fourth day. The pre- and post-ATLS findings were compared using SPSS version 15.0 software; P values < 0.05 were considered statistically significant. Our findings showed that interns' ability in all three tasks improved after the training course. On the fourth day after training, there was a statistically significant increase in interns' clinical knowledge of ATLS procedures, the sequence of procedures, and skill performance in trauma situations (P < 0.001, P = 0.016 and P = 0.01, respectively). The ATLS course has an important role in increasing clinical knowledge and practical skill performance of trauma care in medical interns.
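
    A minimal sketch of such a pre/post comparison in Python is shown below; the study used SPSS, and the scores here are synthetic stand-ins rather than the study's data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        pre = rng.normal(55.0, 8.0, 24)            # hypothetical pre-training scores
        post = pre + rng.normal(10.0, 5.0, 24)     # hypothetical post-training scores

        t_stat, p_paired = stats.ttest_rel(post, pre)      # paired t-test
        w_stat, p_wilcoxon = stats.wilcoxon(post, pre)     # non-parametric alternative
        print(round(p_paired, 4), round(p_wilcoxon, 4))    # compare against alpha = 0.05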

  8. An analytical method to simulate the H I 21-cm visibility signal for intensity mapping experiments

    Science.gov (United States)

    Sarkar, Anjan Kumar; Bharadwaj, Somnath; Marthi, Visweshwar Ram

    2018-01-01

    Simulations play a vital role in testing and validating H I 21-cm power spectrum estimation techniques. Conventional methods use techniques like N-body simulations to simulate the sky signal which is then passed through a model of the instrument. This makes it necessary to simulate the H I distribution in a large cosmological volume, and incorporate both the light-cone effect and the telescope's chromatic response. The computational requirements may be particularly large if one wishes to simulate many realizations of the signal. In this paper, we present an analytical method to simulate the H I visibility signal. This is particularly efficient if one wishes to simulate a large number of realizations of the signal. Our method is based on theoretical predictions of the visibility correlation which incorporate both the light-cone effect and the telescope's chromatic response. We have demonstrated this method by applying it to simulate the H I visibility signal for the upcoming Ooty Wide Field Array Phase I.
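
    The core of such an analytical simulation is drawing Gaussian realizations of the visibilities directly from a prescribed visibility correlation matrix; a sketch of that step follows, where the covariance is a toy placeholder rather than the model of the paper.

        import numpy as np

        def draw_visibilities(cov, n_real=1000, seed=0):
            """Draw complex Gaussian visibility realizations with <V V^H> = cov."""
            rng = np.random.default_rng(seed)
            L = np.linalg.cholesky(cov)            # cov: Hermitian, positive definite
            n = cov.shape[0]
            w = (rng.standard_normal((n, n_real))
                 + 1j * rng.standard_normal((n, n_real))) / np.sqrt(2.0)
            return L @ w

        # toy correlation between a few baseline/frequency samples
        cov = np.array([[1.0, 0.6, 0.2],
                        [0.6, 1.0, 0.6],
                        [0.2, 0.6, 1.0]])
        V = draw_visibilities(cov)
        print(np.round((V @ V.conj().T).real / V.shape[1], 2))   # empirical covariance, close to cov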

  9. Analytic simulation of the Poincare surface of sections for the diamagnetic Kepler problem

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, H; Harada, A; Okazaki, Y [Kyoto Univ. (Japan). Dept. of Physics

    1984-11-11

    The Poincare surface-of-section analysis which the authors previously reported on the diamagnetic Kepler problem (classical hydrogen atom in a uniform magnetic field) in a transition region from regular to chaotic motions is simulated by an analytic means, by taking intersections of the energy integral and the approximate integral Λ of Solovev to obtain sections of the two separate regions of the motion that exist in the limit of a weak magnetic field (B → 0). The origin of the unique hyperbolic point and the separatrix around which the onset of chaos takes place are thus identified. The invariant tori arising near the full chaos are shown to be simulated by this method but with modified parameter values in the expression Λ.

  10. Analytic simulation of the Poincare surface of sections for the diamagnetic Kepler problem

    International Nuclear Information System (INIS)

    Hasegawa, H.; Harada, A.; Okazaki, Y.

    1984-01-01

    The Poincare surface-of-section analysis which the authors previously reported on the diamagnetic Kepler problem (classical hydrogen atom in a uniform magnetic field) in a transition region from regular to chaotic motions is simulated by an analytic means, by taking intersections of the energy integral and the approximate integral Λ of Solovev to obtain sections of the two separate regions of the motion that exist in the limit of a weak magnetic field (B → 0). The origin of the unique hyperbolic point and the separatrix around which the onset of chaos takes place are thus identified. The invariant tori arising near the full chaos are shown to be simulated by this method but with modified parameter values in the expression Λ. (author)

  11. Advances in Computational Fluid-Structure Interaction and Flow Simulation Conference

    CERN Document Server

    Takizawa, Kenji

    2016-01-01

    This contributed volume celebrates the work of Tayfun E. Tezduyar on the occasion of his 60th birthday. The articles it contains were born out of the Advances in Computational Fluid-Structure Interaction and Flow Simulation (AFSI 2014) conference, also dedicated to Prof. Tezduyar and held at Waseda University in Tokyo, Japan on March 19-21, 2014. The contributing authors represent a group of international experts in the field who discuss recent trends and new directions in computational fluid dynamics (CFD) and fluid-structure interaction (FSI). Organized into seven distinct parts arranged by thematic topics, the papers included cover basic methods and applications of CFD, flows with moving boundaries and interfaces, phase-field modeling, computer science and high-performance computing (HPC) aspects of flow simulation, mathematical methods, biomedical applications, and FSI. Researchers, practitioners, and advanced graduate students working on CFD, FSI, and related topics will find this collection to be a defi...

  12. Advanced 3D Photocathode Modeling and Simulations Final Report

    International Nuclear Information System (INIS)

    Dimitre A Dimitrov; David L Bruhwiler

    2005-01-01

    High brightness electron beams required by the proposed Next Linear Collider demand strong advances in photocathode electron gun performance. Significant improvement in the production of such beams with rf photocathode electron guns is hampered by the lack of high-fidelity simulations. The critical missing piece in existing gun codes is a physics-based, detailed treatment of the very complex and highly nonlinear photoemission process

  13. A Multi-Projector Calibration Method for Virtual Reality Simulators with Analytically Defined Screens

    Directory of Open Access Journals (Sweden)

    Cristina Portalés

    2017-06-01

    The geometric calibration of projectors is a demanding task, particularly for the industry of virtual reality simulators. Different methods have been developed during the last decades to retrieve the intrinsic and extrinsic parameters of projectors, most of them being based on planar homographies and some requiring an extended calibration process. The aim of our research work is to design a fast and user-friendly method to provide multi-projector calibration on analytically defined screens, where a sample is shown for a virtual reality Formula 1 simulator that has a cylindrical screen. The proposed method results from the combination of surveying, photogrammetry and image processing approaches, and has been designed by considering the spatial restrictions of virtual reality simulators. The method has been validated from a mathematical point of view, and the complete system—which is currently installed in a shopping mall in Spain—has been tested by different users.

  14. Advancing Simulation-Based Education in Pain Medicine.

    Science.gov (United States)

    Singh, Naileshni; Nielsen, Alison A; Copenhaver, David J; Sheth, Samir J; Li, Chin-Shang; Fishman, Scott M

    2018-02-27

    The Accreditation Council for Graduate Medical Education (ACGME) has recently implemented milestones and competencies as a framework for training fellows in Pain Medicine, but individual programs are left to create educational platforms and assessment tools that meet ACGME standards. In this article, we discuss the concept of milestone-based competencies and the inherent challenges for implementation in pain medicine. We consider simulation-based education (SBE) as a potential tool for the field to meet ACGME goals through advancing novel learning opportunities, engaging in clinically relevant scenarios, and mastering technical and nontechnical skills. The sparse literature on SBE in pain medicine is highlighted, and we describe our pilot experience, which exemplifies a nascent effort that encountered early difficulties in implementing and refining an SBE program. The many complexities in offering a sophisticated simulated pain curriculum that is valid, reliable, feasible, and acceptable to learners and teachers may only be overcome with coordinated and collaborative efforts among pain medicine training programs and governing institutions.

  15. Advances in the realtime simulation of synthetic clutter for radar testing and evaluation

    CSIR Research Space (South Africa)

    Strydom, JJ

    2010-10-01

    Recent developments in processing power have allowed a ground clutter simulation capability to be added to this list. Radar clutter simulation is computationally expensive, as a single range line can contain... and correlation functions require more processing power to simulate.

  16. A simulation-based analytic model of radio galaxies

    Science.gov (United States)

    Hardcastle, M. J.

    2018-04-01

    I derive and discuss a simple semi-analytical model of the evolution of powerful radio galaxies which is not based on assumptions of self-similar growth, but rather implements some insights about the dynamics and energetics of these systems derived from numerical simulations, and can be applied to arbitrary pressure/density profiles of the host environment. The model can qualitatively and quantitatively reproduce the source dynamics and synchrotron light curves derived from numerical modelling. Approximate corrections for radiative and adiabatic losses allow it to predict the evolution of radio spectral index and of inverse-Compton emission both for active and `remnant' sources after the jet has turned off. Code to implement the model is publicly available. Using a standard model with a light relativistic (electron-positron) jet, subequipartition magnetic fields, and a range of realistic group/cluster environments, I simulate populations of sources and show that the model can reproduce the range of properties of powerful radio sources as well as observed trends in the relationship between jet power and radio luminosity, and predicts their dependence on redshift and environment. I show that the distribution of source lifetimes has a significant effect on both the source length distribution and the fraction of remnant sources expected in observations, and so can in principle be constrained by observations. The remnant fraction is expected to be low even at low redshift and low observing frequency due to the rapid luminosity evolution of remnants, and to tend rapidly to zero at high redshift due to inverse-Compton losses.

  17. Advances and challenges in computational plasma science

    International Nuclear Information System (INIS)

    Tang, W M; Chan, V S

    2005-01-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This

  18. Analytical framework for borehole heat exchanger (BHE) simulation influenced by horizontal groundwater flow and complex top boundary conditions

    Science.gov (United States)

    Rivera, Jaime; Blum, Philipp; Bayer, Peter

    2015-04-01

    Borehole heat exchangers (BHE) are the most widely used technologies for tapping low-enthalpy energy resources in the shallow subsurface. Analysis of these systems requires a proper simulation of the relevant processes controlling the transfer of heat between the BHE and the ground. Among the available simulation approaches, analytical methods are broadly accepted, especially when low computational costs and comprehensive analyses are demanded. Moreover, these methods constitute the benchmark solutions to evaluate the performance of more complex numerical models. Within the spectrum of existing (semi-)analytical models, those based on the superposition of problem-specific Green's functions are particularly appealing. Green's functions can be derived, for instance, for nodal or line sources with constant or transient strengths. In the same manner, functional forms can be obtained for scenarios with complex top boundary conditions whose temperature may vary in space and time. Other relevant processes, such as advective heat transport, mechanical dispersion and heat transfer through the unsaturated zone could be incorporated as well. A keystone of the methodology is that individual solutions can be added up invoking the superposition principle. This leads to a flexible and robust framework for studying the interaction of multiple processes on thermal plumes of BHEs. In this contribution, we present a new analytical framework and its verification via comparison with a numerical model. It simulates a BHE as a line source, and it integrates both horizontal groundwater flow and the effect of top boundary effects due to variable land use. All these effects may be implemented as spatially and temporally variable. For validation, the analytical framework is successfully applied to study cases where highly resolved temperature data is available.
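
    As a flavour of the Green's-function superposition at the heart of such frameworks, the sketch below superposes the classical infinite line-source response in time for a piecewise-constant BHE load, using assumed ground properties. It deliberately omits the groundwater advection and land-use boundary terms that distinguish the framework described above; in that framework those effects enter as additional Green's functions added to the same superposition.

```python
import numpy as np
from scipy.special import exp1            # exponential integral E1

# Assumed ground properties (illustrative only).
lam = 2.0                                 # thermal conductivity [W/(m K)]
rho_c = 2.4e6                             # volumetric heat capacity [J/(m3 K)]
alpha = lam / rho_c                       # thermal diffusivity [m2/s]

def ils_step(q, r, t):
    """Temperature change at radius r from a constant line source q [W/m] active for time t [s]."""
    t = np.asarray(t, dtype=float)
    dT = np.zeros_like(t)
    on = t > 0
    dT[on] = q / (4 * np.pi * lam) * exp1(r**2 / (4 * alpha * t[on]))
    return dT

def ils_superposed(loads, switch_times, r, t):
    """Superpose piecewise-constant loads (W/m) switched at switch_times (s), evaluated at times t."""
    dq = np.diff(np.concatenate(([0.0], loads)))     # load increments
    return sum(ils_step(dqi, r, t - t0) for dqi, t0 in zip(dq, switch_times))

day = 86400.0
# 30 W/m for 90 days, then 10 W/m; temperature change 1 m from the borehole after one year.
print(ils_superposed([30.0, 10.0], [0.0, 90 * day], r=1.0, t=np.array([365 * day])))
```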

  19. An Advanced Analytical Chemistry Experiment Using Gas Chromatography-Mass Spectrometry, MATLAB, and Chemometrics to Predict Biodiesel Blend Percent Composition

    Science.gov (United States)

    Pierce, Karisa M.; Schale, Stephen P.; Le, Trang M.; Larson, Joel C.

    2011-01-01

    We present a laboratory experiment for an advanced analytical chemistry course where we first focus on the chemometric technique partial least-squares (PLS) analysis applied to one-dimensional (1D) total-ion-current gas chromatography-mass spectrometry (GC-TIC) separations of biodiesel blends. Then, we focus on n-way PLS (n-PLS) applied to…
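
    A hedged sense of the chemometric step can be given in a few lines with scikit-learn's PLSRegression, here trained on synthetic chromatogram-like traces rather than real GC-TIC data; the original experiment used MATLAB, and the peak shapes, noise level, and number of latent variables below are arbitrary stand-ins.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for GC-TIC traces: two overlapping peak profiles whose
# mixing fraction plays the role of the biodiesel blend percent.
t = np.linspace(0, 1, 300)
peak_bio = np.exp(-(t - 0.35) ** 2 / 0.002)
peak_petro = np.exp(-(t - 0.60) ** 2 / 0.003)

blend = rng.uniform(0, 100, 80)                       # blend percent (B0-B100)
X = np.array([b / 100 * peak_bio + (1 - b / 100) * peak_petro for b in blend])
X += 0.01 * rng.standard_normal(X.shape)              # detector noise

X_tr, X_te, y_tr, y_te = train_test_split(X, blend, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
print("R2 on held-out blends:", round(pls.score(X_te, y_te), 3))
```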

  20. MAESTRO: Methods and Advanced Equipment for Simulation and Treatment in Radio-Oncology

    Science.gov (United States)

    Barthe, Jean; Hugon, Régis; Nicolai, Jean Philippe

    2007-12-01

    The integrated project MAESTRO (Methods and Advanced Equipment for Simulation and Treatment in Radio-Oncology) under contract with the European Commission in life sciences FP6 (LSHC-CT-2004-503564), concerns innovative research to develop and validate in clinical conditions, advanced methods and equipment needed in cancer treatment for new modalities in high-conformal external radiotherapy using electrons, photons and protons beams of high energy.

  1. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Phillips, Julia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wampler, Cheryl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meisner, Robert [National Nuclear Security Administration (NNSA), Washington, DC (United States)

    2010-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality, and scientific details); to quantify critical margins and uncertainties; and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  2. The Advanced Gamma-ray Imaging System (AGIS): Simulation Studies

    Science.gov (United States)

    Fegan, Stephen; Buckley, J. H.; Bugaev, S.; Funk, S.; Konopelko, A.; Maier, G.; Vassiliev, V. V.; Simulation Studies Working Group; AGIS Collaboration

    2008-03-01

    The Advanced Gamma-ray Imaging System (AGIS) is a concept for the next generation instrument in ground-based very high energy gamma-ray astronomy. It has the goal of achieving significant improvement in sensitivity over current experiments. We present the results of simulation studies of various possible designs for AGIS. The primary characteristics of the array performance, collecting area, angular resolution, background rejection, and sensitivity are discussed.

  3. Analytical chemistry requirements for advanced reactors

    International Nuclear Information System (INIS)

    Jayashree, S.; Velmurugan, S.

    2015-01-01

    The nuclear power industry has been developing and improving reactor technology for more than five decades. Newer advanced reactors now being built have simpler designs which reduce capital cost. The greatest departure from most designs now in operation is that many incorporate passive or inherent safety features which require no active controls or operational intervention to avoid accidents in the event of malfunction, and may rely on gravity, natural convection or resistance to high temperatures. India is developing the Advanced Heavy Water Reactor (AHWR) in its plan to utilise thorium in nuclear power program

  4. Global Simulation of Bioenergy Crop Productivity: Analytical Framework and Case Study for Switchgrass

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Shujiang [ORNL; Kline, Keith L [ORNL; Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Nichols, Dr Jeff A [ORNL; Post, Wilfred M [ORNL; Brandt, Craig C [ORNL; Wullschleger, Stan D [ORNL; Wei, Yaxing [ORNL; Singh, Nagendra [ORNL

    2013-01-01

    A global energy crop productivity model that provides geospatially explicit quantitative details on biomass potential and factors affecting sustainability would be useful, but does not exist now. This study describes a modeling platform capable of meeting many challenges associated with global-scale agro-ecosystem modeling. We designed an analytical framework for bioenergy crops consisting of six major components: (i) standardized natural resources datasets, (ii) global field-trial data and crop management practices, (iii) simulation units and management scenarios, (iv) model calibration and validation, (v) high-performance computing (HPC) simulation, and (vi) simulation output processing and analysis. The HPC-Environmental Policy Integrated Climate (HPC-EPIC) model simulated a perennial bioenergy crop, switchgrass (Panicum virgatum L.), estimating feedstock production potentials and effects across the globe. This modeling platform can assess soil C sequestration, net greenhouse gas (GHG) emissions, nonpoint source pollution (e.g., nutrient and pesticide loss), and energy exchange with the atmosphere. It can be expanded to include additional bioenergy crops (e.g., miscanthus, energy cane, and agave) and food crops under different management scenarios. The platform and switchgrass field-trial dataset are available to support global analysis of biomass feedstock production potential and corresponding metrics of sustainability.

  5. Advanced hybrid transient stability and EMT simulation for VSC-HVDC systems

    NARCIS (Netherlands)

    Van Der Meer, A.A.; Gibescu, M.; Van Der Meijden, M.A.M.M.; Kling, W.L.; Ferreira, J.A.

    2015-01-01

    This paper deals with advanced hybrid transient stability and electromagnetic-transient (EMT) simulation of combined ac/dc power systems containing large amounts of renewable energy sources interfaced through voltage-source converter-high-voltage direct current (VSC-HVDC). The concerning transient

  6. Analytical solution and numerical simulation of the liquid nitrogen freezing-temperature field of a single pipe

    Science.gov (United States)

    Cai, Haibing; Xu, Liuxun; Yang, Yugui; Li, Longqi

    2018-05-01

    Artificial liquid nitrogen freezing technology is widely used in urban underground engineering due to its technical advantages, such as simple freezing system, high freezing speed, low freezing temperature, high strength of frozen soil, and absence of pollution. However, technical difficulties such as undefined range of liquid nitrogen freezing and thickness of frozen wall gradually emerge during the application process. Thus, the analytical solution of the freezing-temperature field of a single pipe is established considering the freezing temperature of soil and the constant temperature of freezing pipe wall. This solution is then applied in a liquid nitrogen freezing project. Calculation results show that the radius of freezing front of liquid nitrogen is proportional to the square root of freezing time. The radius of the freezing front also decreases with decreased the freezing temperature, and the temperature gradient of soil decreases with increased distance from the freezing pipe. The radius of cooling zone in the unfrozen area is approximately four times the radius of the freezing front. Meanwhile, the numerical simulation of the liquid nitrogen freezing-temperature field of a single pipe is conducted using the Abaqus finite-element program. Results show that the numerical simulation of soil temperature distribution law well agrees with the analytical solution, further verifies the reliability of the established analytical solution of the liquid nitrogen freezing-temperature field of a single pipe.
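
    The square-root-of-time growth of the freezing front can be illustrated with a textbook quasi-steady energy balance for a single pipe, integrated numerically. This is only a rough sketch with placeholder material properties, not the closed-form solution derived in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder properties for a liquid-nitrogen freezing pipe in soil (not the paper's values).
k_f = 2.5                 # frozen-soil thermal conductivity [W/(m K)]
rho_L = 1.2e8             # volumetric latent heat of the soil water [J/m3]
T_f, T_w = 0.0, -196.0    # freezing temperature and pipe-wall temperature [degC]
r0 = 0.05                 # pipe radius [m]

def front_speed(t, R):
    # Quasi-steady balance: rho_L * R * ln(R/r0) * dR/dt = k_f * (T_f - T_w)
    return k_f * (T_f - T_w) / (rho_L * R * np.log(R / r0))

days = np.array([1, 5, 10, 20, 40]) * 86400.0
sol = solve_ivp(front_speed, (1.0, days[-1]), [r0 * 1.01], t_eval=days, rtol=1e-8)
R = sol.y[0]
print(np.round(R, 3))                              # freezing-front radius [m]
# Nearly constant apart from the slow log term, i.e. R grows roughly as sqrt(t):
print(np.round(R / np.sqrt(days / 86400.0), 3))
```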

  7. A Big Data and Learning Analytics Approach to Process-Level Feedback in Cognitive Simulations.

    Science.gov (United States)

    Pecaric, Martin; Boutis, Kathy; Beckstead, Jason; Pusic, Martin

    2017-02-01

    Collecting and analyzing large amounts of process data for the purposes of education can be considered a big data/learning analytics (BD/LA) approach to improving learning. However, in the education of health care professionals, the application of BD/LA is limited to date. The authors discuss the potential advantages of the BD/LA approach for the process of learning via cognitive simulations. Using the lens of a cognitive model of radiograph interpretation with four phases (orientation, searching/scanning, feature detection, and decision making), they reanalyzed process data from a cognitive simulation of pediatric ankle radiography where 46 practitioners from three expertise levels classified 234 cases online. To illustrate the big data component, they highlight the data available in a digital environment (time-stamped, click-level process data). Learning analytics were illustrated using algorithmic computer-enabled approaches to process-level feedback. For each phase, the authors were able to identify examples of potentially useful BD/LA measures. For orientation, the trackable behavior of re-reviewing the clinical history was associated with increased diagnostic accuracy. For searching/scanning, evidence of skipping views was associated with an increased false-negative rate. For feature detection, heat maps overlaid on the radiograph can provide a metacognitive visualization of common novice errors. For decision making, the measured influence of sequence effects can reflect susceptibility to bias, whereas computer-generated path maps can provide insights into learners' diagnostic strategies. In conclusion, the augmented collection and dynamic analysis of learning process data within a cognitive simulation can improve feedback and prompt more precise reflection on a novice clinician's skill development.
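
    To make the click-level process data concrete, here is a toy pandas sketch that aggregates hypothetical time-stamped events into per-phase dwell times alongside case accuracy. The column names, phases, and values are invented for illustration and do not come from the study.

```python
import pandas as pd

# Hypothetical click-level process data from a cognitive simulation:
# one row per logged event, with phase labels and case outcomes.
log = pd.DataFrame({
    "learner": ["a", "a", "a", "a", "b", "b", "b", "b"],
    "case":    [1, 1, 1, 1, 1, 1, 1, 1],
    "phase":   ["orientation", "search", "feature", "decision"] * 2,
    "t_start": [0.0, 4.2, 11.0, 18.5, 0.0, 2.0, 7.5, 12.0],
    "t_end":   [4.2, 11.0, 18.5, 21.0, 2.0, 7.5, 12.0, 13.5],
    "correct": [1, 1, 1, 1, 0, 0, 0, 0],
})
log["dwell"] = log["t_end"] - log["t_start"]

# Per-learner, per-phase dwell time alongside case accuracy.
summary = (log.groupby(["learner", "phase"])
              .agg(dwell=("dwell", "sum"), accuracy=("correct", "mean"))
              .reset_index())
print(summary)
```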

  8. Advances in ICT for business, industry and public sector

    CERN Document Server

    Olszak, Celina; Pełech-Pilichowski, Tomasz

    2015-01-01

    This contributed volume is a result of discussions held at ABICT’13 (4th International Workshop on Advances in Business ICT) in Krakow, September 8-11, 2013. The book focuses on Advances in Business ICT approached from a multidisciplinary perspective and demonstrates different ideas and tools for developing and supporting organizational creativity, as well as advances in decision support systems. This book is an interesting resource for researchers, analysts and IT professionals including software designers. The book comprises eleven chapters presenting research results on business analytics in organization, business processes modeling, problems with processing big data, nonlinear time structures and nonlinear time ontology application, simulation profiling, signal processing (including change detection problems), text processing and risk analysis.

  9. Advanced radiometric and interferometric millimeter-wave scene simulations

    Science.gov (United States)

    Hauss, B. I.; Moffa, P. J.; Steele, W. G.; Agravante, H.; Davidheiser, R.; Samec, T.; Young, S. K.

    1993-01-01

    Smart munitions and weapons utilize various imaging sensors (including passive IR, active and passive millimeter-wave, and visible wavebands) to detect/identify targets at short standoff ranges and in varied terrain backgrounds. In order to design and evaluate these sensors under a variety of conditions, a high-fidelity scene simulation capability is necessary. Such a capability for passive millimeter-wave scene simulation exists at TRW. TRW's Advanced Radiometric Millimeter-Wave Scene Simulation (ARMSS) code is a rigorous, benchmarked, end-to-end passive millimeter-wave scene simulation code for interpreting millimeter-wave data, establishing scene signatures and evaluating sensor performance. In passive millimeter-wave imaging, resolution is limited due to wavelength and aperture size. Where high resolution is required, the utility of passive millimeter-wave imaging is confined to short ranges. Recent developments in interferometry have made possible high resolution applications on military platforms. Interferometry or synthetic aperture radiometry allows the creation of a high resolution image with a sparsely filled aperture. Borrowing from research work in radio astronomy, we have developed and tested at TRW scene reconstruction algorithms that allow the recovery of the scene from a relatively small number of spatial frequency components. In this paper, the TRW modeling capability is described and numerical results are presented.
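
    The idea of recovering a scene from a small number of spatial-frequency components can be illustrated with a toy numpy example: sample a fraction of a test scene's Fourier plane and invert it to form a "dirty image". This is only a cartoon of aperture synthesis, not the ARMSS code; real interferometric reconstruction adds gridding, weighting, and deconvolution algorithms on top of this step.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy scene: a single compact bright target.
n = 128
y, x = np.mgrid[:n, :n]
scene = np.exp(-((x - 80) ** 2 + (y - 40) ** 2) / 4.0)

# Sparsely sample the scene's spatial-frequency (uv) plane, as a thinned
# (synthetic) aperture would.
uv_full = np.fft.fft2(scene)
keep = rng.random((n, n)) < 0.15                  # retain ~15% of the components
uv_sparse = np.where(keep, uv_full, 0.0)

# Simplest reconstruction: inverse transform of the retained components
# (the "dirty image"); deconvolution would normally be applied afterwards.
dirty = np.real(np.fft.ifft2(uv_sparse)) / keep.mean()

print(np.unravel_index(scene.argmax(), scene.shape),
      np.unravel_index(dirty.argmax(), dirty.shape))   # the bright target shows up at the same location
```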

  10. Simplified analytical model to simulate radionuclide release from radioactive waste trenches

    International Nuclear Information System (INIS)

    Sa, Bernardete Lemes Vieira de

    2001-01-01

    In order to evaluate postclosure off-site doses from low-level radioactive waste disposal facilities, a computer code was developed to simulate the radionuclides released from the waste form, their transport through the vadose zone and their transport in the saturated zone. This paper describes the methodology used to model these processes. The radionuclide release from the waste is calculated using a model based on first-order kinetics, and the transport through porous media was determined using a semi-analytical solution of the mass transport equation, considering the limiting case of unidirectional convective transport with three-dimensional dispersion in an isotropic medium. The results obtained in this work were compared with other codes, showing good agreement. (author)
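
    A minimal sketch of the two ingredients, a first-order release term and an analytical advection-dispersion solution, is given below using the classical one-dimensional Ogata-Banks solution for a continuous source. The paper's semi-analytical model additionally handles three-dimensional dispersion and the vadose zone, and every parameter value here is an assumption chosen only for illustration.

```python
import numpy as np
from scipy.special import erfc

# Assumed illustrative parameters (not from the paper).
lam_r = 1e-2      # first-order release-rate constant [1/yr]
A0 = 1.0e12       # initial inventory in the trench [Bq]
v = 5.0           # groundwater velocity [m/yr]
D = 20.0          # longitudinal dispersion coefficient [m2/yr]

def release_rate(t):
    """First-order leaching: activity released per year at time t."""
    return lam_r * A0 * np.exp(-lam_r * t)

def ogata_banks(x, t, c0=1.0):
    """Relative concentration for a continuous source at x=0 (1-D advection-dispersion)."""
    s = 2.0 * np.sqrt(D * t)
    return 0.5 * c0 * (erfc((x - v * t) / s) + np.exp(v * x / D) * erfc((x + v * t) / s))

t = 50.0                          # years
print(release_rate(t))            # source term [Bq/yr]
print(ogata_banks(x=100.0, t=t))  # relative concentration 100 m downgradient
```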

  11. Distinct neural substrates of visuospatial and verbal-analytic reasoning as assessed by Raven's Advanced Progressive Matrices.

    Science.gov (United States)

    Chen, Zhencai; De Beuckelaer, Alain; Wang, Xu; Liu, Jia

    2017-11-24

    Recent studies revealed spontaneous neural activity to be associated with fluid intelligence (gF) which is commonly assessed by Raven's Advanced Progressive Matrices, and embeds two types of reasoning: visuospatial and verbal-analytic reasoning. With resting-state fMRI data, using global brain connectivity (GBC) analysis which averages functional connectivity of a voxel in relation to all other voxels in the brain, distinct neural correlates of these two reasoning types were found. For visuospatial reasoning, negative correlations were observed in both the primary visual cortex (PVC) and the precuneus, and positive correlations were observed in the temporal lobe. For verbal-analytic reasoning, negative correlations were observed in the right inferior frontal gyrus (rIFG), dorsal anterior cingulate cortex and temporoparietal junction, and positive correlations were observed in the angular gyrus. Furthermore, an interaction between GBC value and type of reasoning was found in the PVC, rIFG and the temporal lobe. These findings suggest that visuospatial reasoning benefits more from elaborate perception to stimulus features, whereas verbal-analytic reasoning benefits more from feature integration and hypothesis testing. In sum, the present study offers, for different types of reasoning in gF, first empirical evidence of separate neural substrates in the resting brain.

  12. Hydrophilic and amphiphilic water pollutants: using advanced analytical methods for classic and emerging contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Giger, Walter [GRC, Giger Research Consulting, Zurich (Switzerland); Eawag, Swiss Federal Institute of Aquatic Science and Technology, Duebendorf (Switzerland)

    2009-01-15

    Organic pollutants are a highly relevant topic in environmental science and technology. This article briefly reviews historic developments, and then focuses on the current state of the art and future perspectives on the qualitative and quantitative trace determination of polar organic contaminants, which are of particular concern in municipal and industrial wastewater effluents, ambient surface waters, run-off waters, atmospheric waters, groundwaters and drinking waters. The pivotal role of advanced analytical methods is emphasized and an overview of some contaminant classes is presented. Some examples of polar water pollutants, which are discussed in a bit more detail here, are chosen from projects tackled by the research group led by the author of this article. (orig.)

  13. Harmonic oscillator in heat bath: Exact simulation of time-lapse-recorded data and exact analytical benchmark statistics

    DEFF Research Database (Denmark)

    Nørrelykke, Simon F; Flyvbjerg, Henrik

    2011-01-01

    The stochastic dynamics of the damped harmonic oscillator in a heat bath is simulated with an algorithm that is exact for time steps of arbitrary size. Exact analytical results are given for correlation functions and power spectra in the form they acquire when computed from experimental time...
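
    The flavour of a time-step-exact update can be shown for the simpler overdamped limit, where the oscillator in a heat bath reduces to an Ornstein-Uhlenbeck process; the update below is exact for any step size. The full algorithm of the paper handles the underdamped two-variable case, which is not reproduced here, and the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)

# Overdamped particle in a harmonic trap coupled to a heat bath (OU process):
#   dx = -(k/gamma) x dt + sqrt(2 D) dW,   with D = kT/gamma
k_over_gamma = 3.0     # relaxation rate [1/s]
D = 0.5                # diffusion coefficient

def exact_step(x, dt):
    """One update that is exact for any dt (no time-discretization error)."""
    decay = np.exp(-k_over_gamma * dt)
    var = D / k_over_gamma * (1.0 - decay**2)
    return x * decay + np.sqrt(var) * rng.standard_normal(np.shape(x))

# Large time steps are fine: the stationary variance D/(k/gamma) is reproduced.
x = np.zeros(100_000)
for _ in range(50):
    x = exact_step(x, dt=10.0)        # dt much larger than the 1/3 s relaxation time
print(np.var(x), D / k_over_gamma)    # both close to 0.1667
```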

  14. Monte Carlo simulations to advance characterisation of landmines by pulsed fast/thermal neutron analysis

    NARCIS (Netherlands)

    Maucec, M.; Rigollet, C.

    The performance of a detection system based on the pulsed fast/thermal neutron analysis technique was assessed using Monte Carlo simulations. The aim was to develop and implement simulation methods, to support and advance the data analysis techniques of the characteristic gamma-ray spectra,

  15. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M; Phillips, J; Hpson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  16. Recent advances in analytical determination of cisplatin and its hydrolysis products

    International Nuclear Information System (INIS)

    Ramos Rodriguez, Yalexmiy; Hernandez Castro, Carlos

    2009-01-01

    Cisplatin (cis-diamminedichloroplatinum [II]) is a coordination compound used in the treatment of several solid tumors. Cisplatin and its hydrolysis products exhibit a great pharmacological effect but are very toxic and probably carcinogenic. The present review summarizes the most important advances of recent years in the techniques employed for the detection and quantification of cisplatin and its hydrolysis products in the different matrices studied. The newly emerging techniques and their recently developed modifications, fundamentally the combined use of separation and detection techniques for the analysis of platinum species, and their impact on the speed, sensitivity and specificity of the analytical determination with regard to the techniques used in the last century, are discussed. High-Performance Liquid Chromatography and Capillary Electrophoresis, coupled with detection methods such as Mass Spectrometry, Inductively Coupled Plasma-Mass Spectrometry, Atomic Absorption Spectrometry and, more recently, High-Field Asymmetric Waveform Ion Mobility Spectrometry, are the methods most employed. The analysis of cisplatin and its hydrolysis products in new and more complex matrices is also presented

  17. Semi-physiologic model validation and bioequivalence trials simulation to select the best analyte for acetylsalicylic acid.

    Science.gov (United States)

    Cuesta-Gragera, Ana; Navarro-Fontestad, Carmen; Mangas-Sanjuan, Victor; González-Álvarez, Isabel; García-Arieta, Alfredo; Trocóniz, Iñaki F; Casabó, Vicente G; Bermejo, Marival

    2015-07-10

    The objective of this paper is to apply a previously developed semi-physiologic pharmacokinetic model implemented in NONMEM to simulate bioequivalence (BE) trials of acetylsalicylic acid (ASA), in order to validate the model performance against ASA human experimental data. ASA is a drug with first-pass hepatic and intestinal metabolism following Michaelis-Menten kinetics that leads to the formation of two main metabolites in two generations (first- and second-generation metabolites). The first aim was to adapt the semi-physiologic model for ASA in NONMEM using ASA pharmacokinetic parameters from the literature, reflecting its sequential metabolism. The second aim was to validate this model by comparing the results obtained in NONMEM simulations with published experimental data at a dose of 1000 mg. The validated model was used to simulate bioequivalence trials at 3 dose schemes (100, 1000 and 3000 mg) and with 6 test formulations with decreasing in vivo dissolution rate constants versus the reference formulation (kD 8-0.25 h-1). Finally, the third aim was to determine which analyte (parent drug, first-generation or second-generation metabolite) was more sensitive to changes in formulation performance. The validation results showed that the concentration-time curves obtained with the simulations closely reproduced the published experimental data, confirming model performance. The parent drug (ASA) was the analyte most sensitive to the decrease in pharmaceutical quality, with the largest decrease in the Cmax and AUC ratios between test and reference formulations.
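
    As a loose illustration of why the parent drug's Cmax is the sensitive metric, the toy one-compartment sketch below varies only the in vivo dissolution rate constant kD of a test formulation against a reference. It ignores the saturable first-pass metabolism and sequential metabolites of the actual semi-physiologic NONMEM model, and all parameter values are invented.

```python
import numpy as np
from scipy.integrate import solve_ivp, trapezoid

# Toy one-compartment PK with first-order dissolution -> absorption -> elimination.
ka, ke, V = 2.0, 0.3, 50.0           # 1/h, 1/h, L (illustrative values only)
dose = 1000.0                        # mg

def pk(t, y, kD):
    solid, gut, central = y
    return [-kD * solid,
            kD * solid - ka * gut,
            ka * gut - ke * central]

def cmax_auc(kD, t_end=48.0):
    t = np.linspace(0, t_end, 2000)
    sol = solve_ivp(pk, (0, t_end), [dose, 0.0, 0.0], args=(kD,), t_eval=t, rtol=1e-8)
    conc = sol.y[2] / V
    return conc.max(), trapezoid(conc, t)

ref = cmax_auc(kD=8.0)               # reference formulation
for kD in (4.0, 1.0, 0.25):          # slower-dissolving test formulations
    test = cmax_auc(kD)
    # Cmax ratio drops with slower dissolution while the AUC ratio stays near 1.
    print(kD, round(test[0] / ref[0], 3), round(test[1] / ref[1], 3))
```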

  18. Development and applications of Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Y., E-mail: yican.wu@fds.org.cn [Inst. of Nuclear Energy Safety Technology, Hefei, Anhui (China)

    2015-07-01

    Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems (SuperMC) is a CAD-based Monte Carlo (MC) program for integrated simulation of nuclear system by making use of hybrid MC-deterministic method and advanced computer technologies. The main usability features are automatic modeling of geometry and physics, visualization and virtual simulation and cloud computing service. SuperMC 2.3, the latest version, can perform coupled neutron and photon transport calculation. SuperMC has been verified by more than 2000 benchmark models and experiments, and has been applied in tens of major nuclear projects, such as the nuclear design and analysis of International Thermonuclear Experimental Reactor (ITER) and China Lead-based reactor (CLEAR). Development and applications of SuperMC are introduced in this presentation. (author)

  19. Development and applications of Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems

    International Nuclear Information System (INIS)

    Wu, Y.

    2015-01-01

    Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems (SuperMC) is a CAD-based Monte Carlo (MC) program for integrated simulation of nuclear system by making use of hybrid MC-deterministic method and advanced computer technologies. The main usability features are automatic modeling of geometry and physics, visualization and virtual simulation and cloud computing service. SuperMC 2.3, the latest version, can perform coupled neutron and photon transport calculation. SuperMC has been verified by more than 2000 benchmark models and experiments, and has been applied in tens of major nuclear projects, such as the nuclear design and analysis of International Thermonuclear Experimental Reactor (ITER) and China Lead-based reactor (CLEAR). Development and applications of SuperMC are introduced in this presentation. (author)

  20. Global Simulation of Bioenergy Crop Productivity: Analytical framework and Case Study for Switchgrass

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Nichols, Jeff A. {Cyber Sciences} [ORNL; Post, Wilfred M [ORNL; Wang, Dali [ORNL; Wullschleger, Stan D [ORNL; Kline, Keith L [ORNL; Wei, Yaxing [ORNL; Singh, Nagendra [ORNL; Kang, Shujiang [ORNL

    2014-01-01

    Contemporary global assessments of the deployment potential and sustainability aspects of biofuel crops lack quantitative details. This paper describes an analytical framework capable of meeting the challenges associated with global-scale agro-ecosystem modeling. We designed a modeling platform for bioenergy crops consisting of five major components: (i) standardized global natural resources and management data sets, (ii) global simulation units and management scenarios, (iii) model calibration and validation, (iv) high-performance computing (HPC) modeling, and (v) simulation output processing and analysis. A case study with the HPC-Environmental Policy Integrated Climate model (HPC-EPIC) simulating a perennial bioenergy crop, switchgrass (Panicum virgatum L.), and a global biomass feedstock analysis on grassland demonstrates the application of this platform. The results illustrate the biomass feedstock variability of switchgrass and provide insights on how the modeling platform can be expanded to better assess sustainable production criteria and other biomass crops. Feedstock potentials on global grasslands and within different countries are also shown. Future efforts involve developing databases of productivity, implementing global simulations for other bioenergy crops (e.g. miscanthus, energy cane and agave), and assessing environmental impacts under various management regimes. We anticipate that this platform will provide an exemplary tool and assessment data for international communities to conduct global analyses of biofuel biomass feedstocks and sustainability.

  1. The role of numerical simulation for the development of an advanced HIFU system

    Science.gov (United States)

    Okita, Kohei; Narumi, Ryuta; Azuma, Takashi; Takagi, Shu; Matumoto, Yoichiro

    2014-10-01

    High-intensity focused ultrasound (HIFU) has been used clinically and is in clinical trials to treat various diseases. An advanced HIFU system employs ultrasound techniques for guidance during HIFU treatment instead of the magnetic resonance imaging used in current HIFU systems. HIFU beam imaging for monitoring the HIFU beam and localized motion imaging for validating the treatment of tissue are briefly introduced as real-time ultrasound monitoring techniques. Numerical simulations have a great impact on the development of real-time ultrasound monitoring as well as on the improvement of the safety and efficacy of treatment in advanced HIFU systems. A HIFU simulator was developed to reproduce ultrasound propagation through the body in consideration of the elasticity of tissue, and was validated by comparison with in vitro experiments in which the ultrasound emitted from a phased-array transducer propagates through an acrylic plate acting as a bone phantom. As a result, the defocus and distortion of the ultrasound propagating through the acrylic plate in the simulation quantitatively agree with those in the experimental results. Therefore, the HIFU simulator accurately reproduces ultrasound propagation through a medium whose shape and physical properties are well known. In addition, it is experimentally confirmed that simulation-assisted focus control of the phased-array transducer enables efficient assignment of the focus to the target. Simulation-assisted focus control can contribute to the design of transducers and to treatment planning.

  2. Discussion and analytical test for inclusion of advanced field and boundary condition in theory of free electron lasers

    Science.gov (United States)

    Niknejadi, Pardis; Madey, John M. J.

    2017-09-01

    By the covariant statement of the distance in space-time separating transmitter and receivers, the emission and absorption of the retarded and advanced waves are all simultaneous. In other words, for signals carried on electromagnetic waves (advanced or retarded), the invariant interval (c dt)² − dr² between the emission of a wave and its absorption at the non-reflecting boundary is always identically zero. Utilizing this principle, we have previously explained the advantages of including the coherent radiation reaction force as part of the solution to the boundary value problem for FELs that radiate into "free space" (Self-Amplified Spontaneous Emission (SASE) FELs) and discussed how the advanced field of the absorber can interact with the radiating particles at the time of emission. Here we present an analytical test which verifies that a multilayer mirror can act as a band-pass filter and can contribute to microbunching in the electron beam. We will discuss the motivation, conditions and requirements, and the method for testing this effect.

  3. SIROCCO project: 15 advanced instructor desk and 4 simulated control room for 900MW and 1300MW EDF power plant simulators

    International Nuclear Information System (INIS)

    Alphonse, J.; Roth, P.; Sicard, Y.; Rudelli, P.

    2006-01-01

    This presentation describes the fifteen advanced instructor stations and four simulated control rooms delivered to EDF in the frame of the SIROCCO project by the consortium formed by ATOS Origin and CORYS Tess for Electricite de France (EDF). These instructor stations are installed on fifteen replica training simulators located at different sites throughout France for the purpose of improving the job-related training of the EDF PWR nuclear power plant operating teams. This covers all 900 MW and 1300 MW nuclear power plants of EDF. The simulated control rooms are installed on maintenance platforms located at EDF and consortium facilities. The consortium uses them to maintain and upgrade the simulators. EDF uses them to validate the upgrades delivered by the consortium before on-site installation and to perform engineering analyses. This presentation sets out successively: - the major advantages of the generic and configurable connected module concept for flexible and quick adaptation to different simulators; - the innovative functionalities of the advanced Instructor Desk (IS), which make the instructor's tasks of preparation, monitoring and post-analysis of a training session easier and more homogeneous; - the use of the Simulated Control Room (SCR) for training purposes but also for maintenance and design studies for upgrades of existing control rooms

  4. The Advanced Gamma-ray Imaging System (AGIS): Simulation Studies

    OpenAIRE

    Maier, G.; Collaboration, for the AGIS

    2009-01-01

    The Advanced Gamma-ray Imaging System (AGIS) is a next-generation ground-based gamma-ray observatory being planned in the U.S. The anticipated sensitivity of AGIS is about one order of magnitude better than the sensitivity of current observatories, allowing it to measure gamma-ray emission from a large number of Galactic and extra-galactic sources. We present here results of simulation studies of various possible designs for AGIS. The primary characteristics of the array performance - collect...

  5. A novel fast and accurate pseudo-analytical simulation approach for MOAO

    KAUST Repository

    Gendron, É .; Charara, Ali; Abdelfattah, Ahmad; Gratadour, D.; Keyes, David E.; Ltaief, Hatem; Morel, C.; Vidal, F.; Sevin, A.; Rousset, G.

    2014-01-01

    Multi-object adaptive optics (MOAO) is a novel adaptive optics (AO) technique for wide-field multi-object spectrographs (MOS). MOAO aims at applying dedicated wavefront corrections to numerous separated tiny patches spread over a large field of view (FOV), limited only by that of the telescope. The control of each deformable mirror (DM) is done individually using a tomographic reconstruction of the phase based on measurements from a number of wavefront sensors (WFS) pointing at natural and artificial guide stars in the field. We have developed a novel hybrid, pseudo-analytical simulation scheme, somewhere in between the end-to- end and purely analytical approaches, that allows us to simulate in detail the tomographic problem as well as noise and aliasing with a high fidelity, and including fitting and bandwidth errors thanks to a Fourier-based code. Our tomographic approach is based on the computation of the minimum mean square error (MMSE) reconstructor, from which we derive numerically the covariance matrix of the tomographic error, including aliasing and propagated noise. We are then able to simulate the point-spread function (PSF) associated to this covariance matrix of the residuals, like in PSF reconstruction algorithms. The advantage of our approach is that we compute the same tomographic reconstructor that would be computed when operating the real instrument, so that our developments open the way for a future on-sky implementation of the tomographic control, plus the joint PSF and performance estimation. The main challenge resides in the computation of the tomographic reconstructor which involves the inversion of a large matrix (typically 40 000 × 40 000 elements). To perform this computation efficiently, we chose an optimized approach based on the use of GPUs as accelerators and using an optimized linear algebra library: MORSE providing a significant speedup against standard CPU oriented libraries such as Intel MKL. Because the covariance matrix is

  6. A novel fast and accurate pseudo-analytical simulation approach for MOAO

    KAUST Repository

    Gendron, É.

    2014-08-04

    Multi-object adaptive optics (MOAO) is a novel adaptive optics (AO) technique for wide-field multi-object spectrographs (MOS). MOAO aims at applying dedicated wavefront corrections to numerous separated tiny patches spread over a large field of view (FOV), limited only by that of the telescope. The control of each deformable mirror (DM) is done individually using a tomographic reconstruction of the phase based on measurements from a number of wavefront sensors (WFS) pointing at natural and artificial guide stars in the field. We have developed a novel hybrid, pseudo-analytical simulation scheme, somewhere in between the end-to- end and purely analytical approaches, that allows us to simulate in detail the tomographic problem as well as noise and aliasing with a high fidelity, and including fitting and bandwidth errors thanks to a Fourier-based code. Our tomographic approach is based on the computation of the minimum mean square error (MMSE) reconstructor, from which we derive numerically the covariance matrix of the tomographic error, including aliasing and propagated noise. We are then able to simulate the point-spread function (PSF) associated to this covariance matrix of the residuals, like in PSF reconstruction algorithms. The advantage of our approach is that we compute the same tomographic reconstructor that would be computed when operating the real instrument, so that our developments open the way for a future on-sky implementation of the tomographic control, plus the joint PSF and performance estimation. The main challenge resides in the computation of the tomographic reconstructor which involves the inversion of a large matrix (typically 40 000 × 40 000 elements). To perform this computation efficiently, we chose an optimized approach based on the use of GPUs as accelerators and using an optimized linear algebra library: MORSE providing a significant speedup against standard CPU oriented libraries such as Intel MKL. Because the covariance matrix is
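
    Stripped of the scale and the GPU machinery, the tomographic step reduces to forming an MMSE reconstructor W = C_ts (C_ss + C_n)^-1 and applying it to the wavefront-sensor measurements. The numpy/scipy sketch below does this with small random stand-in covariances; in the real system these matrices come from the turbulence profile and the guide-star/WFS geometry, and computing the reconstructor at 40 000 x 40 000 scale is what the GPU-accelerated MORSE library is used for.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(3)

n_meas, n_act = 400, 120             # WFS measurements, DM actuators (toy sizes)

# Stand-in covariance matrices; a real system derives these from turbulence
# profiles and the guide-star/WFS geometry.
A = rng.standard_normal((n_meas, n_meas + 50))
C_ss = A @ A.T / A.shape[1]                        # measurement covariance (SPD)
C_ts = rng.standard_normal((n_act, n_meas)) * 0.1  # target/measurement cross-covariance
C_noise = 0.05 * np.eye(n_meas)

# MMSE reconstructor: W = C_ts (C_ss + C_noise)^-1, via a Cholesky solve.
cf = cho_factor(C_ss + C_noise, lower=True)
W = cho_solve(cf, C_ts.T).T                        # shape (n_act, n_meas)

slopes = rng.standard_normal(n_meas)               # one frame of WFS measurements
commands = W @ slopes                              # DM command estimate
print(W.shape, commands.shape)
```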

  7. New Developments in the Simulation of Advanced Accelerator Concepts

    International Nuclear Information System (INIS)

    Paul, K.; Cary, J.R.; Cowan, B.; Bruhwiler, D.L.; Geddes, C.G.R.; Mullowney, P.J.; Messmer, P.; Esarey, E.; Cormier-Michel, E.; Leemans, W.P.; Vay, J.-L.

    2008-01-01

    Improved computational methods are essential to the diverse and rapidly developing field of advanced accelerator concepts. We present an overview of some computational algorithms for laser-plasma concepts and high-brightness photocathode electron sources. In particular, we discuss algorithms for reduced laser-plasma models that can be orders of magnitude faster than their higher-fidelity counterparts, as well as important on-going efforts to include relevant additional physics that has been previously neglected. As an example of the former, we present 2D laser wakefield accelerator simulations in an optimal Lorentz frame, demonstrating >10 GeV energy gain of externally injected electrons over a 2 m interaction length, showing good agreement with predictions from scaled simulations and theory, with a speedup factor of ∼2,000 as compared to standard particle-in-cell.

  8. Interoperable mesh and geometry tools for advanced petascale simulations

    International Nuclear Information System (INIS)

    Diachin, L; Bauer, A; Fix, B; Kraftcheck, J; Jansen, K; Luo, X; Miller, M; Ollivier-Gooch, C; Shephard, M S; Tautges, T; Trease, H

    2007-01-01

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. The Center for Interoperable Technologies for Advanced Petascale Simulations (ITAPS) will deliver interoperable and interchangeable mesh, geometry, and field manipulation services that are of direct use to SciDAC applications. The premise of our technology development goal is to provide such services as libraries that can be used with minimal intrusion into application codes. To develop these technologies, we focus on defining a common data model and data-structure neutral interfaces that unify a number of different services such as mesh generation and improvement, front tracking, adaptive mesh refinement, shape optimization, and solution transfer operations. We highlight the use of several ITAPS services in SciDAC applications

  9. Issues affecting advanced passive light-water reactor safety analysis

    International Nuclear Information System (INIS)

    Beelman, R.J.; Fletcher, C.D.; Modro, S.M.

    1992-01-01

    Next generation commercial reactor designs emphasize enhanced safety through improved safety system reliability and performance by means of system simplification and reliance on immutable natural forces for system operation. Simulating the performance of these safety systems will be central to analytical safety evaluation of advanced passive reactor designs. Yet the characteristically small driving forces of these safety systems pose challenging computational problems to current thermal-hydraulic systems analysis codes. Additionally, the safety systems generally interact closely with one another, requiring accurate, integrated simulation of the nuclear steam supply system, engineered safeguards and containment. Furthermore, numerical safety analysis of these advanced passive reactor designs will necessitate simulation of long-duration, slowly-developing transients compared with current reactor designs. The composite effects of small computational inaccuracies on induced system interactions and perturbations over long periods may well lead to predicted results which are significantly different from those that would otherwise be expected or might actually occur. Comparisons between the engineered safety features of competing US advanced light water reactor designs and analogous present day reactor designs are examined relative to the adequacy of existing thermal-hydraulic safety codes in predicting the mechanisms of passive safety. Areas where existing codes might require modification, extension or assessment relative to passive safety designs are identified. Conclusions concerning the applicability of these codes to advanced passive light water reactor safety analysis are presented

  10. Plasma physics via particle simulation

    International Nuclear Information System (INIS)

    Birdsall, C.K.

    1981-01-01

    Plasmas are studied by following the motion of many particles in applied and self-fields, analytically, experimentally and computationally. Plasmas for magnetic fusion energy devices are very hot, nearly collisionless and magnetized, with scale lengths of many ion gyroradii and Debye lengths. Analytic studies of such plasmas are very difficult because the plasma is nonuniform, anisotropic and nonlinear. Experimental studies have become very expensive in time and money as the size, density and temperature approach fusion reactor values. Computational studies using many particles and/or fluids have complemented both theory and experiment for many years and have progressed to fully three-dimensional electromagnetic models, albeit with hours of running time on the fastest, largest computers. Particle simulation methods are presented in some detail, showing the particle advance from acceleration to velocity to position, followed by calculation of the fields from the charge and current densities, then further particle advance, and so on. Limitations due to the time stepping and the use of a spatial grid are given, to avoid inaccuracies and instabilities. Examples are given for a one-dimensional electrostatic program, for an orbit-averaging program, and for a three-dimensional electromagnetic program. Applications of particle simulations of plasmas in magnetic and inertial fusion devices continue to grow, as do applications to plasmas and beams in peripheral devices such as sources, accelerators, and converters. (orig.)
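
    The particle-advance/field-solve cycle described above can be condensed into a minimal one-dimensional electrostatic particle-in-cell loop. The sketch below uses normalized units, cloud-in-cell weighting, an FFT Poisson solve, and a leapfrog-style push with a two-beam initial condition; it is only a cartoon of the method, not any production code referred to in the text.

```python
import numpy as np

# Minimal 1-D electrostatic PIC in normalized units (omega_p = 1, epsilon_0 = 1):
# deposit charge -> FFT Poisson solve -> gather E -> push, repeated every step.
rng = np.random.default_rng(0)
L, ng, n_p, dt, steps = 2 * np.pi, 64, 20_000, 0.1, 200
dx = L / ng

# Two cold counter-streaming electron beams with a small seed perturbation.
x = rng.uniform(0, L, n_p)
v = np.where(np.arange(n_p) % 2 == 0, 1.0, -1.0) + 0.001 * np.sin(x)

k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)   # wavenumbers for the Poisson solve
k[0] = 1.0                                 # avoid divide-by-zero; k=0 mode is zeroed below

def field(x):
    # Cloud-in-cell deposit of electron density, with a neutralizing ion background of 1.
    g = x / dx
    i0 = np.floor(g).astype(int) % ng
    w1 = g - np.floor(g)
    n = np.bincount(i0, 1 - w1, ng) + np.bincount((i0 + 1) % ng, w1, ng)
    n *= ng / n_p                          # normalize so the mean electron density is 1
    rho = 1.0 - n
    phi_k = np.fft.fft(rho) / k**2         # Poisson: -k^2 phi_k = -rho_k
    phi_k[0] = 0.0
    E_grid = np.real(np.fft.ifft(-1j * k * phi_k))   # E = -d(phi)/dx
    # Gather the field back to the particles with the same CIC weights.
    return (1 - w1) * E_grid[i0] + w1 * E_grid[(i0 + 1) % ng]

for _ in range(steps):                     # push (electron charge/mass = -1)
    v -= field(x) * dt
    x = (x + v * dt) % L

print("mean squared field at the end:", np.mean(field(x) ** 2))
```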

  11. The Advanced Gamma-ray Imaging System (AGIS)-Simulation Studies

    Science.gov (United States)

    Maier, G.; Buckley, J.; Bugaev, V.; Fegan, S.; Funk, S.; Konopelko, A.; Vassiliev, V. V.

    2008-12-01

    The Advanced Gamma-ray Imaging System (AGIS) is a US-led concept for a next-generation instrument in ground-based very-high-energy gamma-ray astronomy. The most important design requirement for AGIS is a sensitivity about 10 times greater than that of current observatories such as VERITAS, H.E.S.S. or MAGIC. We present results of simulation studies of various possible designs for AGIS. The primary characteristics of the array performance, collecting area, angular resolution, background rejection, and sensitivity are discussed.

  12. Planning of development strategy for establishment of advanced simulation of nuclear system

    International Nuclear Information System (INIS)

    Chung, Bubdong; Ko, Wonil; Kwon, Junhyun

    2013-12-01

    In this report, the long-term development plan in each technical area is proposed, together with a plan for a coupled code system. A consolidated code system for safety analysis is proposed for future needs, and the computing hardware needed for advanced simulation is also proposed. The best approach for future safety-analysis simulation capabilities may be a dual-path program, i.e., development programs for an integrated analysis tool and for multi-scale/multi-physics analysis tools, where the former aims at reducing uncertainty and the latter at enhancing accuracy. The integrated analysis tool with risk-informed safety margin quantification requires a significant extension of the phenomenological and geometric capabilities of existing reactor safety analysis software, enabling detailed simulations that reduce the uncertainties. For the multi-scale, multi-physics analysis tools, simplifications of complex phenomenological models and dependencies have been made in current safety analyses to accommodate computer hardware limitations. With the advent of modern computer hardware, these limitations may be removed to permit greater accuracy in representing the physical behavior of materials in design-basis and beyond-design-basis conditions, and hence more accurate assessment of the true safety margins based on a first-principles methodology. These proposals can be used to develop the advanced simulation project and to formulate the organization and establishment of a high-performance computing system at KAERI.

  13. Development and utilization of simulator training replay system

    International Nuclear Information System (INIS)

    Suzuki, Koichi; Noji, Kunio

    1998-01-01

    The BWR Operator Training Center Corporation (BTC) has introduced an advanced training system called the Simulator Training Replay System. The intention of introducing this system is to enhance the overall effectiveness of simulator training by means of: (i) sufficient analytical pre- and post-studies in the classroom, enabling instructors to use the classroom for explanation and discussion with an optimized system that is closely correlated with the full-scope simulator, and (ii) sufficient practical operation training using the full-scope simulator without excessive time constraints. With this system, operational data and video images recorded during simulator training can be reproduced in the classroom. Instructors use this system with their trainees before and after simulator training for pre- and post-studies in the classroom. (author)

  14. Validation of the USNTPS simulator for the advanced flight controls design exercise

    OpenAIRE

    Jurta, Daniel S.

    2005-01-01

    This thesis explores the fidelity of the ground based simulator used at USNTPS during the Advanced Flight Controls Design exercise. A Simulink model is developed as a test platform and used to compare the longitudinal flight characteristics of the simulator. The model is also compared to the same characteristics of a Learjet in the approach configuration. The Simulink model is modified with the aim of yielding a better training aid for the students as well as providing a means of comparison b...

  15. Analytical and numerical study of graphite IG110 parts in advanced reactor under high temperature and irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Gao, Jinling, E-mail: Jinling_Gao@yeah.net; Yao, Wenjuan, E-mail: wj_yao@yeah.net; Ma, Yudong

    2016-08-15

    Graphical abstract: An analytical model and a numerical procedure are developed to study the mechanical response of IG-110 graphite bricks in HTGR subjected to high temperature and irradiation. The calculation results show good agreement with each other. Rational suggestions on the calculation and design of the IG-110 graphite structure are proposed based on the sensitivity analyses including temperature, irradiation dimensional change, creep and Poisson’s ratio. - Highlights: • Analytical solution of stress and displacement of IG-110 graphite components in HTGR. • Finite element procedure developed for stress analysis of HTGR graphite components. • Parameter analysis of mechanical response of graphite components during the whole life of the reflector. - Abstract: Structural design of nuclear power plant projects is an important sub-discipline of civil engineering. Especially after the appearance of the fourth-generation advanced high temperature gas-cooled reactor, structural mechanics in reactor technology has become a popular subject in structural engineering. As basic components of the reflector in the reactor, graphite bricks are subjected to high temperature and irradiation, and the stress field of the graphite structures determines the integrity of the reflector and strongly affects the safety of the whole structure. In this paper, based on the assumptions of elasticity, the side reflector is regarded approximately as a straight cylindrical structure and primary creep strain is ignored. An analytical study of the stress in IG110 graphite parts is presented. Meanwhile, a finite element procedure for calculating stresses in the IG110 graphite structure exposed to high temperature and irradiation is developed. Subsequently, the numerical solution of the stress in the IG110 graphite structure is obtained. The analytical solution agrees well with the numerical solution, which indicates that the analytical derivation is accurate. Finally, the influence of temperature, irradiation dimensional change, creep and Poisson's ratio

  16. Advances in thermal hydraulic and neutronic simulation for reactor analysis and safety

    International Nuclear Information System (INIS)

    Tentner, A.M.; Blomquist, R.N.; Canfield, T.R.; Ewing, T.F.; Garner, P.L.; Gelbard, E.M.; Gross, K.C.; Minkoff, M.; Valentin, R.A.

    1993-01-01

    This paper describes several large-scale computational models developed at Argonne National Laboratory for the simulation and analysis of thermal-hydraulic and neutronic events in nuclear reactors and nuclear power plants. The impact of advanced parallel computing technologies on these computational models is emphasized

  17. Recent Advances in the Analysis of Macromolecular Interactions Using the Matrix-Free Method of Sedimentation in the Analytical Ultracentrifuge

    Directory of Open Access Journals (Sweden)

    Stephen E. Harding

    2015-03-01

    Full Text Available Sedimentation in the analytical ultracentrifuge is a matrix-free solution technique, with no immobilisation, columns, or membranes required, and can be used to study self-association and complex or “hetero”-interactions, stoichiometry, reversibility and interaction strength of a wide variety of macromolecular types and across a very large dynamic range (dissociation constants from 10⁻¹² M to 10⁻¹ M). We extend an earlier review specifically highlighting advances in sedimentation velocity and sedimentation equilibrium in the analytical ultracentrifuge applied to protein interactions and mucoadhesion, and review recent applications in protein self-association (tetanus toxoid, agrin), protein-like carbohydrate association (aminocelluloses), carbohydrate-protein interactions (polysaccharide-gliadin), nucleic acid-protein interactions (G-duplexes), nucleic acid-carbohydrate interactions (DNA-chitosan) and, finally, carbohydrate-carbohydrate interactions (xanthan-chitosan) and a ternary polysaccharide complex.

  18. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    Science.gov (United States)

    Turinsky, Paul J.; Kothe, Douglas B.

    2016-05-01

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, and all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M

  19. Analytical validation of a novel multiplex test for detection of advanced adenoma and colorectal cancer in symptomatic patients.

    Science.gov (United States)

    Dillon, Roslyn; Croner, Lisa J; Bucci, John; Kairs, Stefanie N; You, Jia; Beasley, Sharon; Blimline, Mark; Carino, Rochele B; Chan, Vicky C; Cuevas, Danissa; Diggs, Jeff; Jennings, Megan; Levy, Jacob; Mina, Ginger; Yee, Alvin; Wilcox, Bruce

    2018-05-30

    Early detection of colorectal cancer (CRC) is key to reducing associated mortality. Despite the importance of early detection, approximately 40% of individuals in the United States between the ages of 50 and 75 have never been screened for CRC. The low compliance with colonoscopy and fecal-based screening may be addressed with a non-invasive alternative such as a blood-based test. We describe here the analytical validation of a multiplexed blood-based assay that measures the plasma concentrations of 15 proteins to assess advanced adenoma (AA) and CRC risk in symptomatic patients. The test was developed on an electrochemiluminescent immunoassay platform employing four multi-marker panels, to be implemented in the clinic as a laboratory developed test (LDT). Under the Clinical Laboratory Improvement Amendments (CLIA) and College of American Pathologists (CAP) regulations, a United States-based clinical laboratory utilizing an LDT must establish performance characteristics relating to analytical validity prior to releasing patient test results. This report describes a series of studies demonstrating the precision, accuracy, analytical sensitivity, and analytical specificity for each of the 15 assays, as required by CLIA/CAP. In addition, the report describes studies characterizing each of the assays' dynamic range, parallelism, tolerance to common interfering substances, spike recovery, and stability to sample freeze-thaw cycles. Upon completion of the analytical characterization, a clinical accuracy study was performed to evaluate concordance of AA and CRC classifier model calls using the analytical method intended for use in the clinic. Of 434 symptomatic patient samples tested, the percent agreement with original CRC and AA calls was 87% and 92%, respectively. All studies followed CLSI guidelines and met the regulatory requirements for implementation of a new LDT. The results provide the analytical evidence to support the implementation of the novel multi-marker test as

  20. SIMULATION OF ANALYTICAL TRANSIENT WAVE DUE TO DOWNWARD BOTTOM THRUST

    Directory of Open Access Journals (Sweden)

    Sugih Sudharma Tjandra

    2015-11-01

    Full Text Available The generation process is an important part of understanding waves, especially tsunamis. A large earthquake under the sea is one major cause of tsunamis. The sea surface deforms in response to the sea bottom motion caused by the earthquake. An analytical description of the surface wave generated by bottom motion can be obtained from the linearized dispersive model. For a bottom motion in the form of a downward movement, the result is expressed in terms of an improper integral. Here, we focus on analyzing the convergence of this integral; the improper integral is then approximated by a finite integral so that it can be evaluated numerically. Further, we simulate the free surface elevation for three different types of bottom motion, classified as impulsive, intermediate, and slow movements. We demonstrate that the wave propagates to the right, with a depression as the leading wave, followed by subsequent wave crests. This phenomenon is often observed in most tsunami events.
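
    The record's key numerical step, replacing an improper wavenumber integral from the linearized dispersive model by a finite one that can be evaluated numerically, can be sketched as follows. The integrand used here is a generic decaying oscillatory stand-in (the paper's actual integrand is not reproduced), and the truncation strategy and tolerance are assumptions.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative only: a generic decaying, oscillatory integrand stands in for
# the linearized dispersive-model integrand; x, t, h and the decay factor are
# hypothetical.
def integrand(k, x=1.0, t=2.0, h=1.0):
    omega = np.sqrt(9.81 * k * np.tanh(k * h))   # linear dispersion relation
    return np.exp(-k) * np.cos(k * x) * np.cos(omega * t)

# Replace the improper integral over [0, inf) by [0, K] and double K until
# successive approximations agree to a chosen tolerance.
prev, K = None, 10.0
while True:
    val, _ = quad(integrand, 0.0, K, limit=500)
    if prev is not None and abs(val - prev) < 1e-8:
        break
    prev, K = val, 2.0 * K
print(f"converged truncation K = {K}, integral ~ {val:.6e}")
```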

  1. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    International Nuclear Information System (INIS)

    McCoy, M; Kusnezov, D; Bikkel, T; Hopson, J

    2007-01-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  2. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    Energy Technology Data Exchange (ETDEWEB)

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  3. Gamma-gamma density and lithology tools simulation based on GEANT4 advanced low energy Compton scattering (GALECS) package

    International Nuclear Information System (INIS)

    Esmaeili-sani, Vahid; Moussavi-zarandi, Ali; Boghrati, Behzad; Afarideh, Hossein

    2012-01-01

    Geophysical bore-hole data represent the physical properties of rocks, such as density and formation lithology, as a function of depth in a well. Properties of rocks are obtained from gamma ray transport logs. Transport of gamma rays from a 137Cs point gamma source situated in a bore-hole tool, through rock media to detectors, has been simulated using the GEANT4 radiation transport code. Advanced Compton scattering concepts were used to obtain better analyses of the well formation. The simulation and understanding of advanced Compton scattering depend strongly on how accurately the effects of Doppler broadening and Rayleigh scattering are taken into account. A Monte Carlo package that simulates gamma-gamma well logging tools based on GEANT4 advanced low energy Compton scattering (GALECS) is presented.

  4. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides; Simulacion Monte Carlo: herramienta para la calibracion en determinaciones analiticas de radionucleidos

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, Jorge A. Carrazana; Ferrera, Eduardo A. Capote; Gomez, Isis M. Fernandez; Castro, Gloria V. Rodriguez; Ricardo, Niury Martinez, E-mail: cphr@cphr.edu.cu [Centro de Proteccion e Higiene de las Radiaciones (CPHR), La Habana (Cuba)

    2013-07-01

    This work shows how the traceability of analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for the application of corrections for differences in chemical composition, density and height of the samples analyzed. Likewise, the results obtained by the LVRA in two exercises organized by the International Atomic Energy Agency (IAEA) are presented. In these exercises (an intercomparison and a proficiency test), all reported analytical results were obtained based on efficiency calibrations by Monte Carlo simulation using the DETEFF program.

  5. A generic analytical foot rollover model for predicting translational ankle kinematics in gait simulation studies.

    Science.gov (United States)

    Ren, Lei; Howard, David; Ren, Luquan; Nester, Chris; Tian, Limei

    2010-01-19

    The objective of this paper is to develop an analytical framework for representing the ankle-foot kinematics by modelling the foot as a rollover rocker, which can not only be used as a generic tool for general gait simulation but also allows for case-specific modelling if required. Previously, the rollover models used in gait simulation have often been based on specific functions that have usually been of a simple form. In contrast, the analytical model described here is in a general form in which the effective foot rollover shape can be represented by any polar function rho = rho(phi). Furthermore, a normalized generic foot rollover model has been established based on a normative foot rollover shape dataset of 12 normal healthy subjects. To evaluate model accuracy, the predicted ankle motions and the centre of pressure (CoP) were compared with measurement data for both subject-specific and general cases. The results demonstrated that the ankle joint motions in both vertical and horizontal directions (relative RMSE approximately 10%) and the CoP (relative RMSE approximately 15% for most of the subjects) are accurately predicted over most of the stance phase (from 10% to 90% of stance). However, we found that the foot cannot be very accurately represented by a rollover model just after heel strike (HS) and just before toe off (TO), probably due to shear deformation of the foot plantar tissues (ankle motion can occur without any foot rotation). The proposed foot rollover model can be used in both inverse and forward dynamics gait simulation studies and may also find applications in rehabilitation engineering. Copyright 2009 Elsevier Ltd. All rights reserved.
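
    A hedged sketch of the idea of describing the effective rollover shape by a polar function rho(phi): the shape, the parameter values and the rolling-without-slip arc-length calculation below are illustrative assumptions, not the published model's kinematic relations.

```python
import numpy as np

# Illustrative sketch: evaluate a hypothetical rollover shape given as a polar
# function rho(phi) and compute the rolled arc length versus foot angle.
def rho(phi, R=0.3, a=0.02):
    """Hypothetical effective rollover shape (metres): a near-circular rocker."""
    return R + a * np.cos(2.0 * phi)

phi = np.linspace(-0.4, 0.4, 401)                 # foot rotation range (rad)
drho = np.gradient(rho(phi), phi)                 # d(rho)/d(phi), numerically
ds = np.sqrt(rho(phi) ** 2 + drho ** 2)           # polar arc-length element
cop_progress = np.cumsum(ds) * (phi[1] - phi[0])  # rolled distance vs. angle

# Under a rolling-without-slip assumption the centre of pressure advances by
# the rolled arc length; for the circular limit (a = 0) this reduces to R*phi.
print(cop_progress[-1])
```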

  6. Fast analytical scatter estimation using graphics processing units.

    Science.gov (United States)

    Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris

    2015-01-01

    To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first-order scatter in cone-beam image reconstruction improves the contrast-to-noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and, with further acceleration and a method to account for multiple scatter, may be useful for practical scatter correction schemes.

  7. Analytical, numerical, and experimental simulation of tornado flows

    International Nuclear Information System (INIS)

    Bautin, S.P.; Krutova, I.Yu.; Obukhov, A.G.

    2015-01-01

    It has been proven that this problem with analytic input data near the point under consideration has a unique analytic solution representable in the form of a convergent series. The analysis of the first coefficients of this series has shown that the circular motion of the gas swirling in the positive direction in the Northern Hemisphere and in the negative direction in the Southern Hemisphere arises immediately at the beginning of the radial flow into the cylinder [ru]

  8. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  9. Simulator for candu600 fuel handling system. the experimental model

    International Nuclear Information System (INIS)

    Marinescu, N.; Predescu, D.; Valeca, S.

    2013-01-01

    A main way to increase nuclear plant safety is related to the selection and continuous training of the operating staff. To this end, computer programs for training, testing and evaluation of the knowledge gained, or training simulators including advanced analytical models of the technological systems, are used. The Institute for Nuclear Research from Pitesti, Romania intends to design and build a Fuel Handling Simulator at its F/M Head Test Rig facility, which will be used for training of operating personnel. This paper presents the simulated system, the advantages of using the simulator, and the experimental model of the simulator, which has been built to allow setting of the requirements and fabrication details, especially for the software kit that will be designed and implemented on the main simulator. (authors)

  10. Analytical simulation of the cantilever-type energy harvester

    Directory of Open Access Journals (Sweden)

    Jie Mei

    2016-01-01

    Full Text Available This article describes an analytical model of the cantilever-type energy harvester based on Euler–Bernoulli beam theory. Starting from the Hamiltonian form of the total energy equation, the bending mode shapes and electromechanical dynamic equations are derived. By solving the constitutive electromechanical dynamic equation, the frequency transfer function of the output voltage and power can be obtained. Through a case study of a unimorph piezoelectric energy harvester, this analytical modelling method has been validated by the finite element method.
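
    A minimal lumped single-mode sketch of the voltage transfer function of a piezoelectric cantilever harvester, written in the standard textbook form rather than the article's distributed-parameter derivation; all parameter values (modal mass, stiffness, coupling, capacitance, load resistance) are hypothetical.

```python
import numpy as np

# Single-mode lumped model under base acceleration (illustrative parameters):
#   m x'' + c x' + k x - theta v = -m*gamma*a_base
#   Cp v' + v/R + theta x' = 0
m, k, zeta = 1e-3, 40.0, 0.02            # modal mass (kg), stiffness (N/m), damping ratio
theta, Cp, R = 1e-4, 50e-9, 1e5          # coupling (N/V), capacitance (F), load (ohm)
c = 2.0 * zeta * np.sqrt(k * m)          # viscous damping coefficient
gamma = 1.0                              # modal forcing correction factor

w = 2 * np.pi * np.linspace(1, 200, 2000)           # angular frequency sweep
num = 1j * w * theta * m * gamma
den = (k - m * w**2 + 1j * c * w) * (1.0 / R + 1j * w * Cp) + 1j * w * theta**2
V_per_A = num / den                                  # voltage per unit base acceleration
P = np.abs(V_per_A) ** 2 / R                         # power per squared unit acceleration
print("peak |V/A| =", np.abs(V_per_A).max(), "V per m/s^2")
```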

  11. Gamma-gamma density and lithology tools simulation based on GEANT4 advanced low energy Compton scattering (GALECS) package

    Energy Technology Data Exchange (ETDEWEB)

    Esmaeili-sani, Vahid, E-mail: vaheed_esmaeely80@yahoo.com [Department of Nuclear Engineering and Physics, Amirkabir University of Technology, P.O. Box 4155-4494, Tehran (Iran, Islamic Republic of); Moussavi-zarandi, Ali; Boghrati, Behzad; Afarideh, Hossein [Department of Nuclear Engineering and Physics, Amirkabir University of Technology, P.O. Box 4155-4494, Tehran (Iran, Islamic Republic of)

    2012-02-01

    Geophysical bore-hole data represent the physical properties of rocks, such as density and formation lithology, as a function of depth in a well. Properties of rocks are obtained from gamma ray transport logs. Transport of gamma rays from a 137Cs point gamma source situated in a bore-hole tool, through rock media to detectors, has been simulated using the GEANT4 radiation transport code. Advanced Compton scattering concepts were used to obtain better analyses of the well formation. The simulation and understanding of advanced Compton scattering depend strongly on how accurately the effects of Doppler broadening and Rayleigh scattering are taken into account. A Monte Carlo package that simulates gamma-gamma well logging tools based on GEANT4 advanced low energy Compton scattering (GALECS) is presented.

  12. Valve-specific, analytic-phenomenological modelling of spray dispersion in zero-dimensional simulation; Ventilspezifische, analytisch-phaenomenologische Modellierung der Sprayausbreitung fuer die nulldimensionale Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Schuerg, F.; Arndt, S. [Robert Bosch GmbH, Stuttgart (Germany); Weigand, B. [Stuttgart Univ. (Germany). Inst. fuer Thermodynamik der Luft- und Raumfahrt

    2007-07-01

    Spray-guided combustion processes for gasoline direct injection offer a great fuel saving potential. The quality of mixture formation has direct impact on combustion and emissions and ultimately on the technical feasibility of the consumption advantage. Therefore, it is very important to select the optimal mixture formation strategy. A systematic optimization of the mixture formation process based on experiments or three-dimensional computational fluid dynamics requires tremendous effort. An efficient alternative is the application-oriented, zero-dimensional numerical simulation of mixture formation. With a systemic model formulation in terms of global thermodynamic and fluid mechanical balance equations, the presented simulation model considers all relevant aspects of the mixture formation process. A comparison with measurements in a pressure/temperature chamber using laser-induced exciplex fluorescence tomography revealed a very satisfactory agreement between simulation and experiment. The newly developed, analytic-phenomenological spray propagation model precisely captures the injector-specific mixture formation characteristics of an annular-orifice injector in terms of penetration and volume. Vaporization rate and mean air/fuel ratio as the key quantities of mixture formation are correctly reproduced. Thus, the simulation model is suited to numerically assess the quality and to optimize the strategy of mixture formation. (orig.)

  13. Investigation of Alien Wavelength Quality in Live Multi-Domain, Multi-Vendor Link Using Advanced Simulation Tool

    DEFF Research Database (Denmark)

    Petersen, Martin Nordal; Nuijts, Roeland; Bjorn, Lars Lange

    2014-01-01

    This article presents an advanced optical model for simulation of alien wavelengths in multi-domain and multi-vendor dense wavelength-division multiplexing networks. The model aids optical network planners with a better understanding of the non-linear effects present in dense wavelength-division multiplexing systems and better utilization of alien wavelengths in future applications. The limiting physical effects for alien wavelengths are investigated in relation to power levels, channel spacing, and other factors. The simulation results are verified through an experimental setup in a live multi-domain, multi-vendor link.

  14. Comparing semi-analytic particle tagging and hydrodynamical simulations of the Milky Way's stellar halo

    Science.gov (United States)

    Cooper, Andrew P.; Cole, Shaun; Frenk, Carlos S.; Le Bret, Theo; Pontzen, Andrew

    2017-08-01

    Particle tagging is an efficient, but approximate, technique for using cosmological N-body simulations to model the phase-space evolution of the stellar populations predicted, for example, by a semi-analytic model of galaxy formation. We test the technique developed by Cooper et al. (which we call stings here) by comparing particle tags with stars in a smooth particle hydrodynamic (SPH) simulation. We focus on the spherically averaged density profile of stars accreted from satellite galaxies in a Milky Way (MW)-like system. The stellar profile in the SPH simulation can be recovered accurately by tagging dark matter (DM) particles in the same simulation according to a prescription based on the rank order of particle binding energy. Applying the same prescription to an N-body version of this simulation produces a density profile differing from that of the SPH simulation by ≲10 per cent on average between 1 and 200 kpc. This confirms that particle tagging can provide a faithful and robust approximation to a self-consistent hydrodynamical simulation in this regime (in contradiction to previous claims in the literature). We find only one systematic effect, likely due to the collisionless approximation, namely that massive satellites in the SPH simulation are disrupted somewhat earlier than their collisionless counterparts. In most cases, this makes remarkably little difference to the spherically averaged distribution of their stellar debris. We conclude that, for galaxy formation models that do not predict strong baryonic effects on the present-day DM distribution of MW-like galaxies or their satellites, differences in stellar halo predictions associated with the treatment of star formation and feedback are much more important than those associated with the dynamical limitations of collisionless particle tagging.
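
    The rank-order tagging prescription can be sketched very simply: assign a satellite's newly formed stellar mass to its most-bound dark matter particles. The sketch below is not the stings code; the most-bound fraction, the equal per-particle weighting and the input arrays are illustrative assumptions.

```python
import numpy as np

# Minimal illustration of rank-order particle tagging for one satellite at
# one snapshot (illustrative only).
def tag_particles(binding_energy, new_stellar_mass, f_mb=0.01):
    """Return per-particle stellar mass tags."""
    n_tag = max(1, int(f_mb * len(binding_energy)))
    # most-bound particles = lowest (most negative) total energy
    idx = np.argsort(binding_energy)[:n_tag]
    tags = np.zeros(len(binding_energy))
    tags[idx] = new_stellar_mass / n_tag     # equal weight per tagged particle
    return tags

rng = np.random.default_rng(0)
E = rng.normal(-1.0, 0.3, size=100000)       # fake binding energies
stellar_tags = tag_particles(E, new_stellar_mass=1e7, f_mb=0.01)
print(stellar_tags.sum())                     # equals the assigned stellar mass
```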

  15. Advanced computational simulations of water waves interacting with wave energy converters

    Science.gov (United States)

    Pathak, Ashish; Freniere, Cole; Raessi, Mehdi

    2017-03-01

    Wave energy converter (WEC) devices harness the renewable ocean wave energy and convert it into useful forms of energy, e.g. mechanical or electrical. This paper presents an advanced 3D computational framework to study the interaction between water waves and WEC devices. The computational tool solves the full Navier-Stokes equations and considers all important effects impacting the device performance. To enable large-scale simulations in fast turnaround times, the computational solver was developed in an MPI parallel framework. A fast multigrid preconditioned solver is introduced to solve the computationally expensive pressure Poisson equation. The computational solver was applied to two surface-piercing WEC geometries: bottom-hinged cylinder and flap. Their numerically simulated response was validated against experimental data. Additional simulations were conducted to investigate the applicability of Froude scaling in predicting full-scale WEC response from the model experiments.

  16. GNU polyxmass: a software framework for mass spectrometric simulations of linear (bio-polymeric analytes

    Directory of Open Access Journals (Sweden)

    Rusconi Filippo

    2006-04-01

    Full Text Available Abstract Background: Nowadays, a variety of (bio-)polymers can be analyzed by mass spectrometry. The detailed interpretation of the spectra requires a huge number of "hypothesis cycles", comprising the following three actions: 1) put forth a structural hypothesis, 2) test it, 3) (in)validate it. This time-consuming and painstaking data scrutiny is alleviated by using specialized software tools. However, all the software tools available to date are polymer chemistry-specific. This imposes a heavy overhead on researchers who do mass spectrometry on a variety of (bio-)polymers, as each polymer type will require a different software tool to perform data simulations and analyses. We developed a software framework to address the lack of an integrated software framework able to deal with different polymer chemistries. Results: The GNU polyxmass software framework performs common (bio-)chemical simulations, along with simultaneous mass spectrometric calculations, for any kind of linear (bio-)polymeric analyte (DNA, RNA, saccharides or proteins). The framework is organized into three modules, all accessible from one single binary program. The modules let the user 1) define brand new polymer chemistries, 2) perform quick mass calculations using a desktop calculator paradigm, 3) graphically edit polymer sequences and perform (bio-)chemical/mass spectrometric simulations. Any aspect of the mass calculations, polymer chemistry reactions or graphical polymer sequence editing is configurable. Conclusion: The scientist who uses mass spectrometry to characterize (bio-)polymeric analytes of different chemistries is provided with a single software framework for his data prediction/analysis needs, whatever the polymer chemistry involved.

  17. CLASS: Core Library for Advanced Scenario Simulations

    International Nuclear Information System (INIS)

    Mouginot, B.; Thiolliere, N.

    2015-01-01

    The nuclear reactor simulation community has to perform complex electronuclear scenario simulations. To avoid the constraints of existing powerful scenario software such as COSI, VISION or FAMILY, the open source Core Library for Advanced Scenario Simulation (CLASS) has been developed. The main asset of CLASS is its ability to include any type of reactor, whether the system is innovative or standard. A reactor is fully described by its evolution database, which should contain a set of different validated fuel compositions in order to simulate transitional scenarios. CLASS aims to be a useful tool to study scenarios involving Generation-IV reactors as well as innovative fuel cycles, like the thorium cycle. In addition to all the standard key objects required by an electronuclear scenario simulation (the isotopic vector, the reactor, the fuel storage and the fabrication units), CLASS also integrates two new specific modules: fresh fuel evolution and recycled fuel fabrication. The first module, dealing with fresh fuel evolution, is implemented in CLASS by solving the Bateman equations built from database-induced cross-sections. The second module, which incorporates the fabrication of recycled fuel into CLASS, can be driven by user priorities and/or algorithms. By default, it uses a linear Pu-equivalent method, which allows predicting, from the isotopic composition, the maximum burn-up accessible for a given type of fuel. This paper presents the basis of a CLASS scenario, the fuel method applied to a MOX fuel, and an evolution module benchmark based on the French electronuclear fleet from 1977 to 2012. Results of the CLASS calculation were compared with the inventory made and published by the ANDRA organisation in 2012. For used UOX fuels, ANDRA reported 12006 tonnes of heavy metal in stock, including cooling, versus 18500 tonnes of heavy metal predicted by CLASS. The large difference is easily explained by the presence of 56 tonnes of plutonium already separated
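
    The fresh-fuel evolution module is described as solving Bateman equations built from database cross-sections. A minimal sketch of that kind of calculation, not the CLASS implementation, is a tiny depletion chain solved with a matrix exponential; the chain, the one-group cross-sections and the flux below are hypothetical placeholders.

```python
import numpy as np
from scipy.linalg import expm

# Illustrative only: solve dN/dt = A*N for a two-nuclide chain with a matrix
# exponential.  Flux and one-group cross-sections are placeholder values.
phi = 3e14                      # one-group neutron flux (n/cm^2/s)
barn = 1e-24                    # cm^2
# species: U-238 -(capture)-> Pu-239 -(absorption removal)->
sigma_c_u8, sigma_a_pu9 = 2.7 * barn, 1270.0 * barn
A = np.array([[-sigma_c_u8 * phi, 0.0],
              [ sigma_c_u8 * phi, -sigma_a_pu9 * phi]])
N0 = np.array([2.2e22, 0.0])    # initial number densities (atoms/cm^3)

t = 3.15e7                      # one year of irradiation, in seconds
N = expm(A * t) @ N0
print("U-238, Pu-239 densities after 1 year:", N)
```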

  18. Advanced char burnout models for the simulation of pulverized coal fired boilers

    Energy Technology Data Exchange (ETDEWEB)

    T. Severin; S. Wirtz; V. Scherer [Ruhr-University, Bochum (Germany). Institute of Energy Plant Technology (LEAT)

    2005-07-01

    The numerical simulation of coal combustion processes is widely used as an efficient means to predict burner or system behaviour. In this paper an approach to improve CFD simulations of pulverized coal fired boilers with advanced coal combustion models is presented. In simple coal combustion models, first-order Arrhenius rate equations are used for devolatilization and char burnout. The accuracy of such simple models is sufficient for the basic aspects of heat release. The prediction of carbon-in-ash is one aspect of special interest in the simulation of pulverized coal fired boilers. To determine the carbon-in-ash levels in the fly ash of coal fired furnaces, the char burnout model has to be more detailed. It was tested to what extent changing operating conditions affect the carbon-in-ash prediction of the simulation. To run several test cases in a short time, a simplified cell-net model was applied. To use a cell-net model for simulations of pulverized coal fired boilers, it was coupled with a Lagrangian particle model, which is also used in CFD simulations. 18 refs., 5 figs., 5 tabs.
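
    A sketch of the simple first-order Arrhenius char burnout model mentioned above (not the advanced model of the paper): the pre-exponential factor, activation energy, particle temperatures and residence time are hypothetical, and carbon-in-ash is taken here as the unburnt char fraction in the residue.

```python
import numpy as np

# Illustrative first-order Arrhenius char burnout, dm/dt = -k(T)*m, integrated
# with explicit Euler steps.  All constants are hypothetical placeholders.
A_pre, E_a, Rgas = 250.0, 8.0e4, 8.314      # 1/s, J/mol, J/(mol K)

def burnout(T_particle, residence_time, dt=1e-3, m_char0=1.0, m_ash=0.1):
    """Return the unburnt carbon fraction in the residue after burnout."""
    k = A_pre * np.exp(-E_a / (Rgas * T_particle))
    m, t = m_char0, 0.0
    while t < residence_time:
        m -= k * m * dt
        t += dt
    return m / (m + m_ash)

for T in (1400.0, 1600.0, 1800.0):          # particle temperatures in K
    print(T, burnout(T, residence_time=2.0))
```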

  19. Advances in the testing and evaluation of airborne radar through realtime simulation of synthetic clutter

    CSIR Research Space (South Africa)

    Strydom, JJ

    2011-11-01

    Full Text Available Presentation on advances in the testing and evaluation of airborne radar through realtime simulation of synthetic clutter, presented by Jurgen Strydom (Systems Engineer & Signal Analyst, Experimental EW Systems, CSIR, jjstrydom@csir.co.za) with co-author Jacques Cilliers (CSIR) at the 48th AOC Conference. Technological advancements and challenges in the simulation of clutter for an airborne radar platform are discussed.

  20. Application of nanotechnology in miniaturized systems and its use for advanced analytics and diagnostics - an updated review.

    Science.gov (United States)

    Sandetskaya, Natalia; Allelein, Susann; Kuhlmeier, Dirk

    2013-12-01

    A combination of Micro-Electro-Mechanical Systems and nanoscale structures allows for the creation of novel miniaturized devices, which broaden the boundaries of diagnostic approaches. Some materials possess unique properties at the nanolevel, which are different from those of bulk materials. In the last few years these properties, as well as methods for the production, design and operation of nanoobjects, have become a focus of interest for many researchers. Intensive research and development work has resulted in numerous inventions exploiting nanotechnology in miniaturized systems. Modern technical and laboratory equipment allows for the precise control of such devices, making them suitable for sensitive and accurate detection of analytes. The current review highlights recent patents in the field of nanotechnology in microdevices, applicable for medical, environmental or food analysis. The paper covers the structural and functional basis of such systems and describes specific embodiments in three principal branches: application of nanoparticles, nanofluidics, and nanosensors in miniaturized systems for advanced analytics and diagnostics. This overview is an update of an earlier review article.

  1. Improving advanced cardiovascular life support skills in medical students: simulation-based education approach

    Directory of Open Access Journals (Sweden)

    Hamidreza Reihani

    2015-01-01

    Full Text Available Objective: In this trial, we intend to assess the effect of a simulation-based education approach on advanced cardiovascular life support skills among medical students. Methods: Through a convenience sampling method, 40 interns of Mashhad University of Medical Sciences in their emergency medicine rotation (from September to December 2012) participated in this study. Advanced Cardiovascular Life Support (ACLS) workshops with pretest and post-test exams were performed. Workshops and checklists for the pretest and post-test exams were designed according to the latest American Heart Association (AHA) guidelines. Results: The total score of the students increased significantly after the workshops (from 24.6 out of 100 to 78.6 out of 100). This demonstrates a 53.9% improvement in the skills after the simulation-based education (P < 0.001). Also, the mean score of each station showed a significant improvement (P < 0.001). Conclusion: Pretests showed that interns had poor performance in practical clinical matters while their scientific knowledge, such as ECG interpretation, was acceptable. The overall results of the study highlight that a simulation-based education approach is highly effective in improving ACLS skills among medical students.

  2. One-dimensional model of interacting-step fluctuations on vicinal surfaces: Analytical formulas and kinetic Monte Carlo simulations

    Science.gov (United States)

    Patrone, Paul N.; Einstein, T. L.; Margetis, Dionisios

    2010-12-01

    We study analytically and numerically a one-dimensional model of interacting line defects (steps) fluctuating on a vicinal crystal. Our goal is to formulate and validate analytical techniques for approximately solving systems of coupled nonlinear stochastic differential equations (SDEs) governing fluctuations in surface motion. In our analytical approach, the starting point is the Burton-Cabrera-Frank (BCF) model by which step motion is driven by diffusion of adsorbed atoms on terraces and atom attachment-detachment at steps. The step energy accounts for entropic and nearest-neighbor elastic-dipole interactions. By including Gaussian white noise to the equations of motion for terrace widths, we formulate large systems of SDEs under different choices of diffusion coefficients for the noise. We simplify this description via (i) perturbation theory and linearization of the step interactions and, alternatively, (ii) a mean-field (MF) approximation whereby widths of adjacent terraces are replaced by a self-consistent field but nonlinearities in step interactions are retained. We derive simplified formulas for the time-dependent terrace-width distribution (TWD) and its steady-state limit. Our MF analytical predictions for the TWD compare favorably with kinetic Monte Carlo simulations under the addition of a suitably conservative white noise in the BCF equations.
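
    A minimal Euler-Maruyama sketch of a system of coupled terrace-width SDEs driven by Gaussian white noise. The simplified nearest-neighbour restoring force, noise amplitude and positivity safeguard below are illustrative assumptions standing in for the BCF-derived dynamics and step interactions of the paper.

```python
import numpy as np

# Euler-Maruyama integration of a ring of coupled terrace-width SDEs
# (illustrative only; not the authors' equations).
rng = np.random.default_rng(1)
N, dt, steps = 100, 0.05, 20000
kappa, noise = 1.0, 0.05                  # relaxation strength, noise amplitude
w = np.ones(N)                            # terrace widths, mean normalised to 1

for _ in range(steps):
    neighbours = np.roll(w, 1) + np.roll(w, -1)
    drift = kappa * (neighbours - 2.0 * w)             # relax toward neighbour mean
    w += drift * dt + noise * np.sqrt(dt) * rng.standard_normal(N)
    w = np.clip(w, 1e-3, None)   # crude stand-in for the step repulsion
                                 # that prevents terrace collapse

# steady-state terrace-width distribution (TWD), normalised by the mean width
hist, edges = np.histogram(w / w.mean(), bins=40, density=True)
print("TWD peak near", edges[np.argmax(hist)])
```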

  3. Oxygen ordering in YBa2Cu3O6+x using Monte Carlo simulation and analytic theory

    DEFF Research Database (Denmark)

    Mønster, D.; Lindgård, Per-Anker; Andersen, N.H.

    2001-01-01

    We have simulated the phase diagram and structural properties of the oxygen ordering in YBa2Cu3O6+x, testing simple extensions of the asymmetric next-nearest-neighbor Ising (ASYNNNI) model. In a preliminary paper [Phys. Rev. B 60, 110 (1999)] we demonstrated that the inclusion of a single further... on a nano scale into box-like domains and anti-domains of typical average dimension (10a,30b,2c). Theory and model simulations demonstrate that the distribution of such domains causes deviations from Lorentzian line shapes, and not the Porod effect. Analytic theory is used to estimate the effect of a range... of values of the interaction parameters used, as well as the effect of an extension to include infinite-ranged interactions. In the experiments a large gap is found between the onset temperatures of the ortho-I and ortho-II orders at x=0.5. This cannot be fully reproduced in the simulations. The simulations...

  4. Conjugate heat transfer simulations of advanced research reactor fuel

    Energy Technology Data Exchange (ETDEWEB)

    Piro, M.H.A., E-mail: pirom@aecl.ca; Leitch, B.W.

    2014-07-01

    Highlights: • Temperature predictions are enhanced by coupling heat transfer in solid and fluid zones. • Seven different cases are considered to observe trends in predicted temperature and pressure. • The seven cases consider high/medium/low power, flow, burnup, fuel material and geometry. • Simulations provide temperature predictions for performance/safety. Boiling is unlikely. • Simulations demonstrate that a candidate geometry can enhance performance/safety. - Abstract: The current work presents numerical simulations of coupled fluid flow and heat transfer of advanced U–Mo/Al and U–Mo/Mg research reactor fuels in support of performance and safety analyses. The objective of this study is to enhance predictions of the flow regime and fuel temperatures through high fidelity simulations that better capture various heat transfer pathways and with a more realistic geometric representation of the fuel assembly in comparison to previous efforts. Specifically, thermal conduction, convection and radiation mechanisms are conjugated between the solid and fluid regions. Also, a complete fuel element assembly is represented in three dimensional space, permitting fluid flow and heat transfer to be simulated across the entire domain. Seven case studies are examined that vary the coolant inlet conditions, specific power, and burnup to investigate the predicted changes in the pressure drop in the coolant and the fuel, clad and coolant temperatures. In addition, an alternate fuel geometry is considered with helical fins (replacing straight fins in the existing design) to investigate the relative changes in predicted fluid and solid temperatures. Numerical simulations predict that the clad temperature is sensitive to changes in the thermal boundary layer in the coolant, particularly in simultaneously developing flow regions, while the temperature in the fuel is anticipated to be unaffected. Finally, heat transfer between fluid and solid regions is enhanced with

  5. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  6. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    Science.gov (United States)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

    Models that aim to optimize the design of supply chain networks have gained increasing interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such an optimization problem. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures that continue until the difference between subsequent solutions satisfies the pre-determined termination criteria. The effectiveness of the proposed approach is illustrated by an example, which shows results closer to optimal with much faster solving time than those obtained from a conventional simulation-based optimization model. The efficacy of this proposed hybrid approach is promising, and it can be applied as a powerful tool in designing a real supply chain network. It also provides the possibility to model and solve more realistic problems, which incorporate dynamism and uncertainty.
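
    A skeleton of the iterative hybrid scheme described above, assuming a toy "analytical" design rule and a toy stochastic simulation in place of a real MILP solver and discrete-event model; the safety factor, demand distribution, cost coefficients and termination tolerance are hypothetical.

```python
import random

# Toy placeholders for the two halves of the hybrid loop (illustrative only).
def solve_analytical_model(expected_demand, safety_factor=1.25):
    # crude stand-in for a MILP/network-design step
    return safety_factor * expected_demand

def run_simulation(stock_level, n_replications=500):
    # stand-in for a discrete-event simulation: returns refined demand estimate
    # and the simulated average cost of the candidate design
    rng = random.Random(42)
    costs, demands = [], []
    for _ in range(n_replications):
        d = rng.gauss(100.0, 20.0)                      # stochastic demand
        costs.append(max(stock_level - d, 0) * 1.0 + max(d - stock_level, 0) * 9.0)
        demands.append(d)
    return sum(demands) / n_replications, sum(costs) / n_replications

expected_demand, prev_cost = 80.0, None
for _ in range(20):
    design = solve_analytical_model(expected_demand)    # analytical step
    expected_demand, cost = run_simulation(design)      # simulation feedback
    if prev_cost is not None and abs(cost - prev_cost) < 1e-3 * prev_cost:
        break                                           # termination criterion met
    prev_cost = cost
print("stock level:", round(design, 1), "expected cost:", round(cost, 1))
```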

  7. Social network data analytics

    CERN Document Server

    Aggarwal, Charu C

    2011-01-01

    Social network analysis applications have experienced tremendous advances within the last few years due in part to increasing trends towards users interacting with each other on the internet. Social networks are organized as graphs, and the data on social networks takes on the form of massive streams, which are mined for a variety of purposes. Social Network Data Analytics covers an important niche in the social network analytics field. This edited volume, contributed by prominent researchers in this field, presents a wide selection of topics on social network data mining such as Structural Pr

  8. Dynamic Impact Testing and Model Development in Support of NASA's Advanced Composites Program

    Science.gov (United States)

    Melis, Matthew E.; Pereira, J. Michael; Goldberg, Robert; Rassaian, Mostafa

    2018-01-01

    The purpose of this paper is to provide an executive overview of the HEDI effort for NASA's Advanced Composites Program and establish the foundation for the remaining papers to follow in the 2018 SciTech special session NASA ACC High Energy Dynamic Impact. The paper summarizes the work done for the Advanced Composites Program to advance our understanding of the behavior of composite materials during high energy impact events and to advance the ability of analytical tools to provide predictive simulations. The experimental program carried out at GRC is summarized and a status on the current development state for MAT213 will be provided. Future work will be discussed as the HEDI effort transitions from fundamental analysis and testing to investigating sub-component structural concept response to impact events.

  9. Using CONFIG for Simulation of Operation of Water Recovery Subsystems for Advanced Control Software Evaluation

    Science.gov (United States)

    Malin, Jane T.; Flores, Luis; Fleming, Land; Throop, Daiv

    2002-01-01

    A hybrid discrete/continuous simulation tool, CONFIG, has been developed to support evaluation of the operability of life support systems. CONFIG simulates operations scenarios in which flows and pressures change continuously while system reconfigurations occur as discrete events. In simulations, intelligent control software can interact dynamically with hardware system models. CONFIG simulations have been used to evaluate control software and intelligent agents for automating life support system operations. A CONFIG model of an advanced biological water recovery system has been developed to interact with intelligent control software that is being used in a water system test at NASA Johnson Space Center.

  10. Advanced High and Low Fidelity HPC Simulations of FCS Concept Designs for Dynamic Systems

    National Research Council Canada - National Science Library

    Sandhu, S. S; Kanapady, R; Tamma, K. K

    2004-01-01

    ...) resources of many Army initiatives. In this paper we present a new and advanced HPC based rigid and flexible modeling and simulation technology capable of adaptive high/low fidelity modeling that is useful in the initial design concept...

  11. Advanced Engineering Environments for Space Transportation System Development

    Science.gov (United States)

    Thomas, L. Dale; Smith, Charles A.; Beveridge, James

    2000-01-01

    There are significant challenges facing today's launch vehicle industry. Global competition, more complex products, geographically-distributed design teams, demands for lower cost, higher reliability and safer vehicles, and the need to incorporate the latest technologies quicker, all face the developer of a space transportation system. Within NASA, multiple technology development and demonstration projects are underway toward the objectives of safe, reliable, and affordable access to space. New information technologies offer promising opportunities to develop advanced engineering environments to meet these challenges. Significant advances in the state-of-the-art of aerospace engineering practice are envisioned in the areas of engineering design and analytical tools, cost and risk tools, collaborative engineering, and high-fidelity simulations early in the development cycle. At the Marshall Space Flight Center, work has begun on development of an advanced engineering environment specifically to support the design, modeling, and analysis of space transportation systems. This paper will give an overview of the challenges of developing space transportation systems in today's environment and subsequently discuss the advanced engineering environment and its anticipated benefits.

  12. Advanced Research and Data Methods in Women's Health: Big Data Analytics, Adaptive Studies, and the Road Ahead.

    Science.gov (United States)

    Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika

    2017-02-01

    Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.

  13. Computer simulation of a 20-kHz power system for advanced launch systems

    Science.gov (United States)

    Sudhoff, S. D.; Wasynczuk, O.; Krause, P. C.; Kenny, B. H.

    1993-01-01

    The performance of two 20-kHz actuator power systems being built for an advanced launch system is evaluated for a typical launch scenario using an end-to-end system simulation. Aspects of system performance ranging from the switching of the power electronic devices to the vehicle aerodynamics are represented in the simulation. It is shown that both systems adequately stabilize the vehicle against a wind gust during launch. However, it is also shown that in both cases there are bus voltage and current fluctuations which make system power quality a concern.

  14. Massive quiescent galaxies at z > 3 in the Millennium simulation populated by a semi-analytic galaxy formation model

    Science.gov (United States)

    Rong, Yu; Jing, Yingjie; Gao, Liang; Guo, Qi; Wang, Jie; Sun, Shuangpeng; Wang, Lin; Pan, Jun

    2017-10-01

    We take advantage of the statistical power of the large-volume dark-matter-only Millennium simulation (MS), combined with a sophisticated semi-analytic galaxy formation model, to explore whether the recently reported z = 3.7 quiescent galaxy ZF-COSMOS-20115 (ZF) can be accommodated in current galaxy formation models. In our model, a population of quiescent galaxies with stellar masses and star formation rates comparable to those of ZF naturally emerges at redshifts z < 4. Although these z > 3.5 massive QGs are rare (about 2 per cent of the galaxies with similar stellar masses), the existing AGN feedback model implemented in the semi-analytic galaxy formation model can successfully explain the formation of these high-redshift QGs, as it does for their lower-redshift counterparts.

  15. Ion heating and energy partition at the heliospheric termination shock: hybrid simulations and analytical model

    Energy Technology Data Exchange (ETDEWEB)

    Gary, S Peter [Los Alamos National Laboratory; Winske, Dan [Los Alamos National Laboratory; Wu, Pin [BOSTON UNIV.; Schwadron, N A [BOSTON UNIV.; Lee, M [UNIV OF NEW HAMPSHIRE

    2009-01-01

    The Los Alamos hybrid simulation code is used to examine heating and the partition of dissipation energy at the perpendicular heliospheric termination shock in the presence of pickup ions. The simulations are one-dimensional in space but three-dimensional in field and velocity components, and are carried out for a range of values of pickup ion relative density. Results from the simulations show that because the solar wind ions are relatively cold upstream, the temperature of these ions is raised by a relatively larger factor than the temperature of the pickup ions. An analytic model for energy partition is developed on the basis of the Rankine-Hugoniot relations and a polytropic energy equation. The polytropic index {gamma} used in the Rankine-Hugoniot relations is varied to improve agreement between the model and the simulations concerning the fraction of downstream heating in the pickup ions as well as the compression ratio at the shock. When the pickup ion density is less than 20%, the polytropic index is about 5/3, whereas for pickup ion densities greater than 20%, the polytropic index tends toward 2.2, suggesting a fundamental change in the character of the shock, as seen in the simulations, when the pickup ion density is large. The model and the simulations both indicate for the upstream parameters chosen for Voyager 2 conditions that the pickup ion density is about 25% and the pickup ions gain the larger share (approximately 90%) of the downstream thermal pressure, consistent with Voyager 2 observations near the shock.
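
    For orientation, the gas-dynamic Rankine-Hugoniot relation underlying the analytic model above ties the shock compression ratio r to the upstream Mach number M1 through the polytropic index γ (shown here as standard single-fluid background, not the authors' full pickup-ion model):

      r = \frac{\rho_2}{\rho_1} = \frac{(\gamma + 1)\, M_1^{2}}{(\gamma - 1)\, M_1^{2} + 2}

    Raising γ from 5/3 toward 2.2, as reported above for pickup-ion densities greater than 20%, lowers the compression ratio at a given upstream Mach number, which is how the model tracks the change in shock character seen in the simulations.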

  16. Science based integrated approach to advanced nuclear fuel development - integrated multi-scale multi-physics hierarchical modeling and simulation framework Part III: cladding

    International Nuclear Information System (INIS)

    Tome, Carlos N.; Caro, J.A.; Lebensohn, R.A.; Unal, Cetin; Arsenlis, A.; Marian, J.; Pasamehmetoglu, K.

    2010-01-01

    Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model the nuclear fuel systems to develop predictive tools is critical. Not only are fabrication and performance models needed to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. In this paper we review the current status of the advanced modeling and simulation of nuclear reactor cladding, with emphasis on what is available and what is to be developed in each scale of the project, how we propose to pass information from one scale to the next, and what experimental information is required for benchmarking and advancing the modeling at each scale level.

  17. Advanced graphical user interface for multi-physics simulations using AMST

    Science.gov (United States)

    Hoffmann, Florian; Vogel, Frank

    2017-07-01

    Numerical modelling of particulate matter has gained much popularity in recent decades. Advanced Multi-physics Simulation Technology (AMST) is a state-of-the-art three-dimensional numerical modelling technique combining the eXtended Discrete Element Method (XDEM) with Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) [1]. One major limitation of this code is the lack of a graphical user interface (GUI), meaning that all pre-processing has to be done directly in an HDF5 file. This contribution presents the first graphical pre-processor developed for AMST.

  18. Reliability of stellar inclination estimated from asteroseismology: analytical criteria, mock simulations and Kepler data analysis

    Science.gov (United States)

    Kamiaka, Shoya; Benomar, Othman; Suto, Yasushi

    2018-05-01

    Advances in asteroseismology of solar-like stars now provide a unique method to estimate the stellar inclination i⋆. This enables the evaluation of the spin-orbit angle of transiting planetary systems, in a complementary fashion to the Rossiter-McLaughlin effect, a well-established method to estimate the projected spin-orbit angle λ. Although the asteroseismic method has been broadly applied to the Kepler data, its reliability has yet to be assessed intensively. In this work, we evaluate the accuracy of i⋆ from asteroseismology of solar-like stars using 3000 simulated power spectra. We find that the low signal-to-noise ratio of the power spectra induces a systematic under-estimate (over-estimate) bias for stars with high (low) inclinations. We derive analytical criteria for a reliable asteroseismic estimate, which indicate that reliable measurements are possible in the range of 20° ≲ i⋆ ≲ 80° only for stars with high signal-to-noise ratio. We also analyse and measure the stellar inclination of 94 Kepler main-sequence solar-like stars, among which 33 are planetary hosts. According to our reliability criteria, a third of them (9 with planets, 22 without) have accurate stellar inclinations. Comparison of our asteroseismic estimates of v sin i⋆ against spectroscopic measurements indicates that the latter suffer from a large uncertainty, possibly due to the modeling of macro-turbulence, especially for stars with projected rotation speed v sin i⋆ ≲ 5 km/s. This reinforces earlier claims, and the stellar inclination estimated from the combination of measurements from spectroscopy and photometric variation for slowly rotating stars needs to be interpreted with caution.
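
    For context, two standard asteroseismic relations (quoted here as general background, not verbatim from the paper) show how i⋆ and v sin i⋆ are extracted: for dipole (l = 1) modes of a slowly, rigidly rotating star, the relative visibilities of the multiplet components and the rotational splitting δν_s give

      \mathcal{E}_{1,0}(i_\star) = \cos^{2} i_\star, \qquad \mathcal{E}_{1,\pm 1}(i_\star) = \tfrac{1}{2}\sin^{2} i_\star, \qquad v \sin i_\star \approx 2\pi R_\star\, \delta\nu_s \sin i_\star .

    Because the visibilities vary most slowly near i⋆ = 0° and 90°, the inclination is hardest to constrain there, which is broadly consistent with the biases at low and high inclinations reported above.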

  19. Advancement in tritium transport simulations for solid breeding blanket system

    Energy Technology Data Exchange (ETDEWEB)

    Ying, Alice, E-mail: ying@fusion.ucla.edu [Mechanical and Aerospace Engineering Department, UCLA, Los Angeles, CA 90095 (United States); Zhang, Hongjie [Mechanical and Aerospace Engineering Department, UCLA, Los Angeles, CA 90095 (United States); Merrill, Brad J. [Idaho National Laboratory, Idaho Falls, ID 83415 (United States); Ahn, Mu-Young [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2016-11-01

    In this paper, advances in tritium transport simulations are demonstrated for the solid breeder blanket HCCR TBS, where multi-physics and detailed engineering descriptions are considered using a commercial simulation code. The physics involved includes compressible purge-gas fluid flow, heat transfer, chemical reaction, the isotope swamping effect, and mass transport of tritium isotopes. The strategy adopted here is to develop numerical procedures and techniques that allow critical details of material, geometric and operational heterogeneity in a most complete engineering description of the TBS to be incorporated into the simulation. Our application focuses on transient assessment, in view of ITER's pulsed operation. An immediate advantage is a more realistic predictive and design analysis tool that accounts for the temperature variations induced by pulsed operation, which impact the helium purge gas flow as well as the time and space evolution of the Q2 composition concentration in the breeding regions. This affords a more accurate prediction of tritium permeation into the He coolant by accounting for the correct temperature and partial-pressure effects and realistic diffusion paths. The analysis also shows that introducing a by-pass line in the TES loop to accommodate ITER pulsed operation allows the tritium extraction design to be more cost effective.

  20. Comprehensive simulation-enhanced training curriculum for an advanced minimally invasive procedure: a randomized controlled trial.

    Science.gov (United States)

    Zevin, Boris; Dedy, Nicolas J; Bonrath, Esther M; Grantcharov, Teodor P

    2017-05-01

    There is no comprehensive simulation-enhanced training curriculum to address cognitive, psychomotor, and nontechnical skills for an advanced minimally invasive procedure. The objectives were: (1) to develop and provide evidence of validity for a comprehensive simulation-enhanced training (SET) curriculum for an advanced minimally invasive procedure; (2) to demonstrate transfer of acquired psychomotor skills from a simulation laboratory to a live porcine model; and (3) to compare training outcomes of the SET curriculum group and a chief resident group. Setting: University. This prospective single-blinded, randomized, controlled trial allocated 20 intermediate-level surgery residents to receive either conventional training (control) or SET curriculum training (intervention). The SET curriculum consisted of cognitive, psychomotor, and nontechnical training modules. Psychomotor skill in a live anesthetized porcine model in the OR was the primary outcome. Knowledge of advanced minimally invasive and bariatric surgery and nontechnical skills in a simulated OR crisis scenario were the secondary outcomes. Residents in the SET curriculum group went on to perform a laparoscopic jejunojejunostomy in the OR. Cognitive, psychomotor, and nontechnical skills of the SET curriculum group were also compared to a group of 12 chief surgery residents. The SET curriculum group demonstrated superior psychomotor skills in the live porcine model (56 [47-62] versus 44 [38-53], P < .05) and comparable psychomotor skills in the live porcine model and in the OR in a human patient (56 [47-62] versus 63 [61-68]; P = .21). The SET curriculum group demonstrated inferior knowledge (13 [11-15] versus 16 [14-16]; P < .05), equivalent psychomotor skill (63 [61-68] versus 68 [62-74]; P = .50), and superior nontechnical skills (41 [38-45] versus 34 [27-35], P < .01) compared with the chief resident group. Completion of the SET curriculum resulted in superior training outcomes compared with conventional surgery training. Implementation of the SET curriculum can standardize training

  1. Numerical simulation and experimental validation of the three-dimensional flow field and relative analyte concentration distribution in an atmospheric pressure ion source.

    Science.gov (United States)

    Poehler, Thorsten; Kunte, Robert; Hoenen, Herwart; Jeschke, Peter; Wissdorf, Walter; Brockmann, Klaus J; Benter, Thorsten

    2011-11-01

    In this study, the validation and analysis of steady state numerical simulations of the gas flows within a multi-purpose ion source (MPIS) are presented. The experimental results were obtained with particle image velocimetry (PIV) measurements in a non-scaled MPIS. Two-dimensional time-averaged velocity and turbulent kinetic energy distributions are presented for two dry gas volume flow rates. The numerical results of the validation simulations are in very good agreement with the experimental data. All significant flow features have been correctly predicted within the accuracy of the experiments. For technical reasons, the experiments were conducted at room temperature. Thus, numerical simulations of ionization conditions at two operating points of the MPIS are also presented. It is clearly shown that the dry gas volume flow rate has the most significant impact on the overall flow pattern within the APLI source; far less critical is the (larger) nebulization gas flow. In addition to the approximate solution of the Reynolds-Averaged Navier-Stokes equations, a transport equation for the relative analyte concentration has been solved. The results yield information on the three-dimensional analyte distribution within the source. It becomes evident that for ion transport into the MS ion transfer capillary, electromagnetic forces are at least as important as fluid dynamic forces. However, only the fluid dynamics determines the three-dimensional distribution of analyte gas. Thus, local flow phenomena in close proximity to the spray shield strongly impact the ionization efficiency.
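
    The "transport equation for the relative analyte concentration" mentioned above can be written, in generic steady-state RANS form (a sketch of the usual passive-scalar closure, not necessarily the exact formulation used by the authors), as

      \nabla \cdot (\mathbf{u}\, c) = \nabla \cdot \big[ (D + D_t)\, \nabla c \big],

    where c is the relative analyte concentration, u the Reynolds-averaged velocity field, D the molecular diffusivity and D_t a turbulent diffusivity supplied by the turbulence model. Since c is a passive scalar, the flow field alone fixes the three-dimensional analyte distribution, in line with the conclusion drawn above.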

  2. Simulating QCD at finite density

    CERN Document Server

    de Forcrand, Philippe

    2009-01-01

    In this review, I recall the nature and the inevitability of the "sign problem" which plagues attempts to simulate lattice QCD at finite baryon density. I present the main approaches used to circumvent the sign problem at small chemical potential. I sketch how one can predict analytically the severity of the sign problem, as well as the numerically accessible range of baryon densities. I review progress towards the determination of the pseudo-critical temperature T_c(mu), and towards the identification of a possible QCD critical point. Some promising advances with non-standard approaches are reviewed.
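
    The origin of the sign problem discussed in this review can be stated compactly. By the standard γ5-hermiticity of the lattice Dirac operator (general background, not a result specific to the review),

      \det M(\mu)^{*} = \det M(-\mu^{*}),

    so the fermion determinant is real only at zero or purely imaginary chemical potential; for real μ ≠ 0 it is generically complex and cannot serve directly as a Monte Carlo probability weight, which is what forces the workarounds surveyed above.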

  3. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    Energy Technology Data Exchange (ETDEWEB)

    Turinsky, Paul J., E-mail: turinsky@ncsu.edu [North Carolina State University, PO Box 7926, Raleigh, NC 27695-7926 (United States); Kothe, Douglas B., E-mail: kothe@ornl.gov [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831-6164 (United States)

    2016-05-15

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, and all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics “core simulator” based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL

  4. Systematic Review of Patient-Specific Surgical Simulation: Toward Advancing Medical Education.

    Science.gov (United States)

    Ryu, Won Hyung A; Dharampal, Navjit; Mostafa, Ahmed E; Sharlin, Ehud; Kopp, Gail; Jacobs, William Bradley; Hurlbert, Robin John; Chan, Sonny; Sutherland, Garnette R

    Simulation-based education has been shown to be an effective tool to teach foundational technical skills in various surgical specialties. However, most of the current simulations are limited to generic scenarios and do not allow continuation of the learning curve beyond basic technical skills to prepare for more advanced expertise, such as patient-specific surgical planning. The objective of this study was to evaluate the current medical literature with respect to the utilization and educational value of patient-specific simulations for surgical training. We performed a systematic review of the literature using Pubmed, Embase, and Scopus focusing on themes of simulation, patient-specific, surgical procedure, and education. The study included randomized controlled trials, cohort studies, and case-control studies published between 2005 and 2016. Two independent reviewers (W.H.R. and N.D) conducted the study appraisal, data abstraction, and quality assessment of the studies. The search identified 13 studies that met the inclusion criteria; 7 studies employed computer simulations and 6 studies used 3-dimensional (3D) synthetic models. A number of surgical specialties evaluated patient-specific simulation, including neurosurgery, vascular surgery, orthopedic surgery, and interventional radiology. However, most studies were small in size and primarily aimed at feasibility assessments and early validation. Early evidence has shown feasibility and utility of patient-specific simulation for surgical education. With further development of this technology, simulation-based education may be able to support training of higher-level competencies outside the clinical setting to aid learners in their development of surgical skills.

  5. Simulation of advanced accumulator and its application in CPR1000 LBLOCA analysis

    International Nuclear Information System (INIS)

    Hu, Hongwei; Shan, Jianqiang; Gou, Junli; Cao, Jianhua; Shen, Yonggang; Fu, Xiangang

    2014-01-01

    Highlights: • The analysis model was developed for the advanced accumulator. • The sensitivity analysis of each key parameter was performed. • The LBLOCA was analyzed for the CPR1000 with advanced accumulator. • The analysis shows that the advanced accumulator can improve CPR1000 safety performance. - Abstract: The advanced accumulator is designed to improve the safety and reliability of CPR1000 by providing a small injection flow to keep the reactor core in a flooded condition. Thus, the core remains in a cooled state without the intervention of low pressure safety injection, and the startup grace time of the low pressure safety injection pump can be greatly extended. A new model for the advanced accumulator, based on the basic conservation equations, is developed and incorporated into RELAP5/MOD 3.3. The simulation of the advanced accumulator shows that its behavior satisfies the primary design target. There is a large flow in the advanced accumulator at the initial stage. When the accumulator water level falls below the standpipe, a vortex appears in the damper, which results in a large pressure drop and a small flow. A sensitivity analysis was then performed and the major factors affecting the flow rate of the advanced accumulator were identified, including the damper diameter, the initial volume ratio of the water and the nitrogen, and the diameter ratio of the standpipe and the small pipe. Additionally, the primary coolant loop cold leg double-ended guillotine break LBLOCA in CPR1000 with the advanced accumulator is analyzed. The results show that the criterion for the maximum cladding temperature limit (1477 K) (NRC, 1992) can be met even with a 200 s delay in the startup of the low pressure safety injection. From this point of view, the passive advanced accumulator can provide a longer grace time for LPSI. Thus the reliability, safety and economy of the reactor system can be improved.

  6. Application of analytical procedure on system reliability, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    2000-01-01

    At the Ship Research Institute, research and development of the GO-FLOW procedure, a system reliability analysis method that forms a main part of probabilistic safety assessment (PSA), has been promoted with various advanced functions. In this study, intended as a fundamental upgrade of the GO-FLOW procedure and as an important evaluation technique for executing PSA at levels lower than level 3, a safety assessment system using GO-FLOW was developed, together with an analytical capability coupling dynamic behavior analysis with the physical behavior of the system under stochastic phenomenon changes. In fiscal year 1998, functions such as the addition of dependences between headings, rearrangement in time order, assignment of the same heading to plural positions, and calculation of occurrence frequency with elapsed time were prepared and verified. Regarding the accident sequence simulation analysis function, it was confirmed that the analysis covers all of the main accident sequences in the improved marine reactor MRX. In addition, a function for producing the input data for analysis nearly automatically was also prepared. As a result, the problem that conventional analysis results were not always easy to understand for anyone but a PSA expert was solved, and understanding of accident phenomena, verification of the validity of the analysis, feedback to analysis, and feedback to design can easily be carried out. (G.K.)

  7. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  8. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  9. Recent analytical applications of magnetic nanoparticles

    Directory of Open Access Journals (Sweden)

    Mohammad Faraji

    2016-07-01

    Analytical chemistry has experienced, as well as other areas of science, a big change due to the needs and opportunities provided by analytical nanoscience and nanotechnology. Nanotechnology is increasingly proving to be a powerful ally of analytical chemistry in achieving its objectives and simplifying analytical processes. Moreover, the information needs arising from the growing nanotechnological activity are opening an exciting new field of action for analytical chemists. Magnetic nanoparticles have been used in various fields owing to their unique properties, including a large specific surface area and simple separation with magnetic fields. For analytical applications, they have been used mainly in sample preparation techniques (magnetic solid phase extraction with different advanced functional groups: layered double hydroxide, β-cyclodextrin, carbon nanotube, graphene, polymer, octadecylsilane) and their automation, microextraction techniques, enantioseparation and chemosensors. This review summarizes the basic principles and achievements of magnetic nanoparticles in sample preparation techniques, enantioseparation and chemosensors. In addition, some selected articles recently published (2010-2016) are reviewed and discussed.

  10. Advanced techniques for analytic liquid wastes management in the Rokkasho reprocessing plant

    International Nuclear Information System (INIS)

    Madic, C.; Moulin, J.P.; Runge, S.; Schott, R.; Kashiwai, T.; Hayashi, M.

    1991-01-01

    The JNFS Rokkasho reprocessing plant is a large scale commercial reprocessing plant. Liquid waste treatment relies on concentration by evaporation. The management of liquid wastes is rather sophisticated and implies, besides the organic wastes, sorting the streams into process and non-process, acidic and salt-bearing, tritiated and low-tritiated, and also according to their level of activity. Particular attention had to be paid to the analytical wastes, as they contain not only a significant amount of radioactivity but also some fissile material and exotic chemicals which are useful for analytical purposes but unwanted in the main process, mainly because of their corrosive and chelating properties. The analytical wastes are sorted out according to their activity level and fissile material content. On the one hand, a specific process has been developed to recover the bulk of the plutonium from the analytical wastes. On the other hand, the foreseeable amount of unwanted chemicals (such as chloride ions) has been drastically reduced by carefully selecting all the analytical methods, either by modifying already known methods or, in some cases, by working out new methods.

  11. Analytical researches on the accelerating structures, wakefields, and beam dynamics for future linear colliders

    International Nuclear Information System (INIS)

    Gao, J.

    1996-01-01

    The research work presented in this memoir is oriented not only towards the R and D programs for future linear colliders, but also towards pedagogic purposes. The first part of this memoir (from Chapter 2 to Chapter 9) establishes an analytical framework for disk-loaded slow-wave accelerating structures which can serve as an advanced course for students who have received some basic training in linear accelerator theory. The analytical formulae derived in this part clearly describe the properties of the disk-loaded accelerating structures, such as group velocity, shunt impedance, coupling coefficients κ and β, loss factors, and wakefields. The second part (from Chapter 11 to Chapter 13) gives the beam dynamics simulations and the final proposal of an S-Band Superconducting Linear Collider (SSLC), which is aimed at avoiding the dark current problem in the TESLA project. This memoir does not include all the work conducted since April 1992, such as beam dynamics simulations for the CLIC Test Facility (CTF-2) and the design of High Charge Structures (HCS) (11π/12 mode) for CTF-2, in order to make the memoir more harmonious, coherent and continuous. (author)

  12. ARIANNE. Analytical uncertainties. Simulation of influential factors in the final isotopic inventory; ARIANNE. Incertidumbres analiticas. Factores de simulacion influyentes en el inventario de la isotopia final

    Energy Technology Data Exchange (ETDEWEB)

    Morales Prieto, M.; Ortega Saiz, P.

    2011-07-01

    Analysis of the analytical uncertainties of the methodology for simulating the processes that determine the final isotopic inventory of spent fuel; the ARIANE experiment explores the burnup simulation part.

  13. Decision making in trauma settings: simulation to improve diagnostic skills.

    Science.gov (United States)

    Murray, David J; Freeman, Brad D; Boulet, John R; Woodhouse, Julie; Fehr, James J; Klingensmith, Mary E

    2015-06-01

    In the setting of acute injury, a wrong, missed, or delayed diagnosis can impact survival. Clinicians rely on pattern recognition and heuristics to rapidly assess injuries, but an overreliance on these approaches can result in a diagnostic error. Simulation has been advocated as a method for practitioners to learn how to recognize the limitations of heuristics and develop better diagnostic skills. The objective of this study was to determine whether simulation could be used to provide teams with experience in managing scenarios that require the use of heuristic as well as analytic diagnostic skills to effectively recognize and treat potentially life-threatening injuries. Ten scenarios were developed to assess the ability of trauma teams to provide initial care to a severely injured patient. Seven standard scenarios simulated severe injuries that once diagnosed could be effectively treated using standard Advanced Trauma Life Support algorithms. Because diagnostic error occurs more commonly in complex clinical settings, 3 complex scenarios required teams to use more advanced diagnostic skills to uncover a coexisting condition and treat the patient. Teams composed of 3 to 5 practitioners were evaluated in the performance of 7 (of 10) randomly selected scenarios (5 standard, 2 complex). Expert raters scored teams using standardized checklists and global scores. Eighty-three surgery, emergency medicine, and anesthesia residents constituted 21 teams. Expert raters were able to reliably score the scenarios. Teams accomplished fewer checklist actions and received lower global scores on the 3 analytic scenarios (73.8% [12.3%] and 5.9 [1.6], respectively) compared with the 7 heuristic scenarios (83.2% [11.7%] and 6.6 [1.3], respectively; P < .05). Teams effectively managed the heuristic scenarios but were less effective when managing the scenarios that required a more analytic approach. Simulation can be used to provide teams with decision-making experiences in trauma settings and could be used to improve

  14. Atmospheric Corrosion Behavior and Mechanism of a Ni-Advanced Weathering Steel in Simulated Tropical Marine Environment

    Science.gov (United States)

    Wu, Wei; Zeng, Zhongping; Cheng, Xuequn; Li, Xiaogang; Liu, Bo

    2017-12-01

    Corrosion behavior of Ni-advanced weathering steel, as well as carbon steel and conventional weathering steel, in a simulated tropical marine atmosphere was studied by field exposure and indoor simulation tests. Meanwhile, morphology and composition of corrosion products formed on the exposed steels were surveyed through scanning electron microscopy, energy-dispersive x-ray spectroscopy and x-ray diffraction. Results indicated that the additive Ni in weathering steel played an important role during the corrosion process, which took part in the formation of corrosion products, enriched in the inner rust layer and promoted the transformation from loose γ-FeOOH to dense α-FeOOH. As a result, the main aggressive ion, i.e., Cl-, was effectively separated in the outer rust layer which leads to the lowest corrosion rate among these tested steels. Thus, the resistance of Ni-advanced weathering steel to atmospheric corrosion was significantly improved in a simulated tropical marine environment.

  15. A REVIEW ON PREDICTIVE ANALYTICS IN DATA MINING

    OpenAIRE

    Arumugam.S

    2016-01-01

    The main process of data mining is to collect, extract and store valuable information, and nowadays it is actively carried out by many enterprises. Predictive analytics is the branch of advanced analytics that is mainly used to make predictions about unknown future events. Predictive analytics uses various techniques from machine learning, statistics, data mining, modeling, and artificial intelligence to analyze current data and make predictions about futu...

  16. Advances in Integrated Vehicle Thermal Management and Numerical Simulation

    Directory of Open Access Journals (Sweden)

    Yan Wang

    2017-10-01

    With the increasing demands for vehicle dynamic performance, economy, safety and comfort, and with ever stricter laws concerning energy conservation and emissions, vehicle power systems are becoming much more complex. To pursue high efficiency and light weight in automobile design, the power system and its vehicle integrated thermal management (VITM) system have attracted widespread attention as major components of modern vehicle technology. For the internal combustion engine vehicle (ICEV), integrated thermal management (ITM) mainly covers internal combustion engine (ICE) cooling, turbocharger cooling, exhaust gas recirculation (EGR) cooling, lubrication cooling and air conditioning (AC) or heat pump (HP) systems. For electric vehicles (EVs), ITM mainly includes battery cooling/preheating, electric machine (EM) cooling and AC or HP. With rational, effective and comprehensive control over the aforementioned dynamic devices and thermal components, a modern VITM system can realize collaborative optimization of multiple thermodynamic processes from the perspective of system integration. Furthermore, computer-aided calculation and numerical simulation have become significant design methods, especially for complex VITM systems. 1D programming can couple multiple thermal components, while 3D simulation enables structured and modular design. Additionally, co-simulation can virtually reproduce various thermo-hydraulic behaviors under transient vehicle operating conditions. This article reviews relevant research work and current advances in the ever-broadening field of modern vehicle thermal management (VTM). Based on systematic summaries of the design methods and applications of ITM, future tasks and proposals are presented. This article aims to promote innovation in ITM, strengthen precise control and performance prediction capability, and thereby enhance the level of research and development (R&D).

  17. Advanced Models and Algorithms for Self-Similar IP Network Traffic Simulation and Performance Analysis

    Science.gov (United States)

    Radev, Dimitar; Lokshina, Izabella

    2010-11-01

    The paper examines self-similar (or fractal) properties of real communication network traffic data over a wide range of time scales. These self-similar properties are very different from the properties of traditional models based on Poisson and Markov-modulated Poisson processes. Advanced fractal models of sequential generators and fixed-length sequence generators, and efficient algorithms used to simulate the self-similar behavior of IP network traffic data, are developed and applied. Numerical examples are provided, and simulation results are obtained and analyzed.
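
    As a hedged illustration of the kind of generator the paper studies (a minimal Python sketch, not the authors' algorithms): aggregating many ON/OFF sources whose sojourn times follow a heavy-tailed Pareto law with 1 < α < 2 is a standard way to obtain approximately self-similar, long-range-dependent traffic.

      # Minimal sketch: self-similar-like traffic from aggregated Pareto ON/OFF sources.
      # All names and parameter values here are illustrative assumptions.
      import numpy as np

      def pareto_duration(alpha, x_min, rng):
          """One Pareto(alpha) duration with minimum x_min (heavy-tailed for alpha < 2)."""
          return x_min * (1.0 - rng.random()) ** (-1.0 / alpha)

      def onoff_traffic(n_sources=200, n_slots=10000, alpha=1.5, x_min=1.0, seed=0):
          """Return the number of active sources in each time slot."""
          rng = np.random.default_rng(seed)
          load = np.zeros(n_slots)
          for _ in range(n_sources):
              t, on = 0.0, bool(rng.integers(2))   # random initial state
              while t < n_slots:
                  d = pareto_duration(alpha, x_min, rng)
                  if on:                            # mark the slots covered by an ON period
                      load[int(t):min(int(t + d) + 1, n_slots)] += 1
                  t += d
                  on = not on
          return load

      traffic = onoff_traffic()
      print(traffic.mean(), traffic.std())

    The per-slot load produced this way can then be handed to a Hurst-parameter estimator (rescaled-range or variance-time analysis, for example) to quantify how close the synthetic trace is to the self-similar behavior observed in real IP traffic.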

  18. Repository simulation model: Final report

    International Nuclear Information System (INIS)

    1988-03-01

    This report documents the application of computer simulation for the design analysis of the nuclear waste repository's waste handling and packaging operations. The Salt Repository Simulation Model was used to evaluate design alternatives during the conceptual design phase of the Salt Repository Project. Code development and verification was performed by the Office of Nuclear Waste Isolation (ONWL). The focus of this report is to relate the experience gained during the development and application of the Salt Repository Simulation Model to future repository design phases. Design of the repository's waste handling and packaging systems will require sophisticated analysis tools to evaluate complex operational and logistical design alternatives. Selection of these design alternatives in the Advanced Conceptual Design (ACD) and License Application Design (LAD) phases must be supported by analysis to demonstrate that the repository design will cost effectively meet DOE's mandated emplacement schedule and that uncertainties in the performance of the repository's systems have been objectively evaluated. Computer simulation of repository operations will provide future repository designers with data and insights that no other analytical form of analysis can provide. 6 refs., 10 figs

  19. The role of specific visual subfields in collisions with oncoming cars during simulated driving in patients with advanced glaucoma.

    Science.gov (United States)

    Kunimatsu-Sanuki, Shiho; Iwase, Aiko; Araie, Makoto; Aoki, Yuki; Hara, Takeshi; Fukuchi, Takeo; Udagawa, Sachiko; Ohkubo, Shinji; Sugiyama, Kazuhisa; Matsumoto, Chota; Nakazawa, Toru; Yamaguchi, Takuhiro; Ono, Hiroshi

    2017-07-01

    To assess the role of specific visual subfields in collisions with oncoming cars during simulated driving in patients with advanced glaucoma. Normal subjects and patients whose glaucoma was classified as advanced on the basis of mean deviation were enrolled. Five of the 100 patients with advanced glaucoma experienced simulator sickness during the main test and were thus excluded. In total, 95 patients with advanced glaucoma and 43 normal subjects completed the main test of DS. Advanced glaucoma patients had significantly more collisions than normal subjects in one or both DS scenarios. Patients with advanced glaucoma who were involved in collisions were older (p=0.050) and had worse visual acuity in the better eye than those who were not.

  20. Proceedings of the Indian Analytical Science Congress: analytical science for innovations in green energy, technology and industry - souvenir

    International Nuclear Information System (INIS)

    2013-01-01

    The theme of IASC - 2013 is 'Analytical Science for Innovations in Green Energy, Technology and Industry'. This theme was chosen to emphasize the unprecedented opportunities for analytical science and technology in the field of green energy, technology and industry, while at the same time recognizing the special challenges faced by analytical science in this field. The objective of the conference is to advance research, development and innovation in the analytical sciences for the benefit of their application in the areas of green science and technology. The growing role of analytical science in green energy, technology and industry is significant. The next few years will witness more momentous achievements of analytical science as well as its application in green energy, technology and industry, contributing towards the benefit of mankind in terms of a healthy, productive, long and comfortable life. Papers relevant to INIS are indexed separately

  1. Analytic and numerical realizations of a disc galaxy

    Science.gov (United States)

    Stringer, M. J.; Brooks, A. M.; Benson, A. J.; Governato, F.

    2010-09-01

    Recent focus on the importance of cold, unshocked gas accretion in galaxy formation - not explicitly included in semi-analytic studies - motivates the following detailed comparison between two inherently different modelling techniques: direct hydrodynamical simulation and semi-analytic modelling. By analysing the physical assumptions built into the GASOLINE simulation, formulae for the emergent behaviour are derived which allow immediate and accurate translation of these assumptions to the GALFORM semi-analytic model. The simulated halo merger history is then extracted and evolved using these equivalent equations, predicting a strikingly similar galactic system. This exercise demonstrates that it is the initial conditions and physical assumptions which are responsible for the predicted evolution, not the choice of modelling technique. On this level playing field, a previously published GALFORM model is applied (including additional physics such as chemical enrichment and feedback from active galactic nuclei) which leads to starkly different predictions.

  2. A comparison of two nodal codes : Advanced nodal code (ANC) and analytic function expansion nodal (AFEN) code

    International Nuclear Information System (INIS)

    Chung, S.K.; Hah, C.J.; Lee, H.C.; Kim, Y.H.; Cho, N.Z.

    1996-01-01

    Modern nodal methods usually employ the transverse integration technique in order to reduce a multi-dimensional diffusion equation to one-dimensional diffusion equations. The use of the transverse integration technique requires two major approximations: a transverse leakage approximation and a one-dimensional flux approximation. Both the transverse leakage and the one-dimensional flux are approximated by polynomials. ANC (Advanced Nodal Code), developed by Westinghouse, employs a modern nodal expansion method for the flux calculation, the equivalence theory for homogenization error reduction and a group theory for pin power recovery. Unlike the conventional modern nodal methods, the AFEN (Analytic Function Expansion Nodal) method expands the homogeneous flux distribution within a node into non-separable analytic basis functions, which eliminates the two major approximations of the modern nodal methods. A comparison study of AFEN with ANC has been performed to see the applicability of AFEN to commercial PWRs and different types of reactors such as MOX-fueled reactors. The comparison results demonstrate that the AFEN methodology is accurate enough to apply to commercial PWR analysis. The results show that AFEN provides very accurate results (core multiplication factor and assembly power distribution) for cores that exhibit strong flux gradients, as in a MOX-loaded core. (author)
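
    For readers unfamiliar with the transverse integration step described above: integrating the two-dimensional, one-group diffusion equation over a node in the y-direction gives the schematic one-dimensional form

      -D \frac{d^{2} \bar{\phi}_x(x)}{dx^{2}} + \Sigma_r\, \bar{\phi}_x(x) = \bar{Q}_x(x) - \frac{1}{h_y} L_y(x),

    where \bar{\phi}_x is the y-averaged flux, \bar{Q}_x the transverse-averaged source and L_y(x) the transverse leakage. It is the polynomial representation of \bar{\phi}_x and L_y that AFEN avoids by expanding the intra-node flux directly in non-separable analytic basis functions.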

  3. Annual Performance Assessment of Complex Fenestration Systems in Sunny Climates Using Advanced Computer Simulations

    Directory of Open Access Journals (Sweden)

    Chantal Basurto

    2015-12-01

    Complex Fenestration Systems (CFS) are advanced daylighting systems that are placed on the upper part of a window to improve the indoor daylight distribution within rooms. Due to their double function of daylight redirection and solar protection, they are considered a solution to mitigate the unfavorable effects of the admission of direct sunlight in buildings located in prevailingly sunny climates (risk of glare and overheating). Accordingly, an adequate assessment of their performance should include an annual evaluation of the main aspects relevant to the use of daylight in such regions: the indoor illuminance distribution, and the thermal and visual comfort of the occupants. Such an evaluation is possible with the use of computer simulations combined with the bi-directional scattering distribution function (BSDF) data of these systems. This study explores the use of available methods to assess the visible and thermal annual performance of five different CFS using advanced computer simulations. To obtain the results, on-site daylight monitoring was carried out in a building located in a predominantly sunny climate, and the collected data were used to create and calibrate a virtual model used to carry out the simulations. The results can be employed to select the CFS that best improves the visual and thermal interior environment for the occupants.

  4. Advanced CFD simulation for the assessment of nuclear safety issues at EDF. Some examples

    International Nuclear Information System (INIS)

    Vare, Christophe

    2014-01-01

    EDF R and D has computing power that puts it amongst the top industrial research centers in the world. Its supercomputers and in-house codes, as well as its experts, represent important capabilities to support EDF activities (safety analyses, support to the design of new reactors, analysis of accidental situations not reproducible by experiments, better understanding of physics or complex system response, effects of uncertainties and identification of prominent parameters, qualification and optimization of processes and materials...). Advanced numerical simulation is a powerful tool allowing EDF to increase its competitiveness and improve its performance and the safety of its plants. On this issue, EDF chose to develop its own in-house codes instead of using commercial software, in order to be able to capitalize on its expertise and methodologies. This choice also allowed easier technology transfer to the concerned business units or engineering divisions, fast adaptation of our simulation tools to emerging needs, and the development of specific physics or functionalities not addressed by the commercial offer. During the last ten years, EDF has decided to open its in-house codes through the Open Source route. This is the case for Code_Aster (structural analysis), Code_Saturne (computational fluid dynamics, CFD), TELEMAC (flow calculations in aquatic environments), SALOME (generic platform for pre- and post-processing) and SYRTHES (heat transfer in complex geometries), among others. The three open source codes Code_Aster, Code_Saturne and TELEMAC are certified by the French Nuclear Regulatory Authority for many «Important to Safety» studies. Advanced simulation, which treats complex, multi-field and multi-physics problems, is of great importance for the assessment of nuclear safety issues. This paper will present two examples of advanced simulation using Code_Saturne for safety issues of nuclear power plants in the fields of R and D and

  5. Recent Advances in Portable Analytical Electromigration Devices

    Directory of Open Access Journals (Sweden)

    Ann Van Schepdael

    2016-01-01

    This article presents an overview of recent advances in the field of portable capillary electrophoresis and microchip electrophoresis equipment during the period 2013 to mid-2015. Instrumental achievements in the separation as well as the detection part of the equipment are discussed. Several applications from a variety of fields are described.

  6. Optimization of the SNS magnetism reflectometer neutron-guide optics using Monte Carlo simulations

    CERN Document Server

    Klose, F

    2002-01-01

    The magnetism reflectometer at the spallation neutron source SNS will employ advanced neutron optics to achieve high data rate, improved resolution, and extended dynamic range. Optical components utilized will include a multi-channel polygonal curved bender and a tapered neutron-focusing guide section. The results of a neutron beam interacting with these devices are rather complex. Additional complexity arises due to the spectral/time-emission profile of the moderator and non-perfect neutron optical coatings. While analytic formulae for the individual components provide some design guidelines, a realistic performance assessment of the whole instrument can only be achieved by advanced simulation methods. In this contribution, we present guide optics optimizations for the magnetism reflectometer using Monte Carlo simulations. We compare different instrument configurations and calculate the resulting data rates. (orig.)

  7. Comparison of high-accuracy numerical simulations of black-hole binaries with stationary-phase post-Newtonian template waveforms for initial and advanced LIGO

    International Nuclear Information System (INIS)

    Boyle, Michael; Brown, Duncan A; Pekowsky, Larne

    2009-01-01

    We study the effectiveness of stationary-phase approximated post-Newtonian waveforms currently used by ground-based gravitational-wave detectors to search for the coalescence of binary black holes by comparing them to an accurate waveform obtained from a numerical simulation of an equal-mass non-spinning binary black hole inspiral, merger and ringdown. We perform this study for the initial- and advanced-LIGO detectors. We find that overlaps between the templates and signal can be improved by integrating the matched filter to higher frequencies than used currently. We propose simple analytic frequency cutoffs for both initial and advanced LIGO, which achieve nearly optimal matches, and can easily be extended to unequal-mass, spinning systems. We also find that templates that include terms in the phase evolution up to 3.5 post-Newtonian (pN) order are nearly always better, and rarely significantly worse, than 2.0 pN templates currently in use. For initial LIGO we recommend a strategy using templates that include a recently introduced pseudo-4.0 pN term in the low-mass (M ≤ 35 M⊙) region, and 3.5 pN templates allowing unphysical values of the symmetric reduced mass η above this. This strategy always achieves overlaps within 0.3% of the optimum, for the data used here. For advanced LIGO we recommend a strategy using 3.5 pN templates up to M = 12 M⊙, 2.0 pN templates up to M = 21 M⊙, pseudo-4.0 pN templates up to 65 M⊙, and 3.5 pN templates with unphysical η for higher masses. This strategy always achieves overlaps within 0.7% of the optimum for advanced LIGO.
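
    At leading (Newtonian) order the stationary-phase approximated templates referred to above take the familiar frequency-domain form (higher pN corrections, including the 3.5 pN and pseudo-4.0 pN terms discussed in the paper, enter inside the bracket):

      \tilde{h}(f) \propto f^{-7/6} e^{i\Psi(f)}, \qquad \Psi(f) = 2\pi f t_c - \phi_c - \frac{\pi}{4} + \frac{3}{128\,(\pi \mathcal{M} f)^{5/3}}\left[ 1 + \cdots \right],

    with \mathcal{M} the chirp mass; the frequency cutoffs proposed in the paper simply set the upper limit of the matched-filter integral of such templates against the data.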

  8. Numerical simulation of abutment pressure redistribution during face advance

    Science.gov (United States)

    Klishin, S. V.; Lavrikov, S. V.; Revuzhenko, A. F.

    2017-12-01

    The paper presents numerical simulation data on the abutment pressure redistribution in rock mass during face advance, including isolines of maximum shear stress and pressure epures. The stress state of rock in the vicinity of a breakage heading is calculated by the finite element method using a 2D nonlinear model of a structurally heterogeneous medium with regard to plasticity and internal self-balancing stress. The thus calculated stress field is used as input data for 3D discrete element modeling of the process. The study shows that the abutment pressure increases as the roof span extends and that the distance between the face breast and the peak point of this pressure depends on the elastoplastic properties and internal self-balancing stress of a rock medium.

  9. TrajAnalytics: A Web-Based Visual Analytics Software of Urban Trajectory

    OpenAIRE

    Zhao, Ye; AL-Dohuki, Shamal; Eynon, Thomas; Kamw, Farah; Sheets, David; Ma, Chao; Ye, Xinyue; Hu, Yueqi; Feng, Tinghao; Yang, Jing

    2017-01-01

    Advanced technologies in sensing and computing have created urban trajectory datasets of humans and vehicles travelling over urban road networks. Understanding and analyzing the large-scale, complex data reflecting city dynamics is of great importance to enhance both human lives and urban environments. Domain practitioners, researchers, and decision-makers need to store, manage, query and visualize such big datasets. We develop a software system named TrajAnalytics, which explicitly supports ...

  10. Extraction, Analytical and Advanced Methods for Detection of Allura Red AC (E129 in Food and Beverages Products

    Directory of Open Access Journals (Sweden)

    Shafiquzzaman eSiddiquee

    2016-05-01

    Full Text Available Allura Red AC (E129) is an azo dye that is widely used in drinks, juices, bakery, meat and sweets products. High consumption of Allura Red has been claimed to have adverse effects on human health, including allergies, food intolerance, cancer, multiple sclerosis, attention deficit hyperactivity disorder (ADHD), brain damage, nausea, cardiac disease and asthma, due to the reaction of aromatic azo compounds (R = R' = aromatic). Several countries have banned or strictly controlled the use of Allura Red in food and beverage products. This review paper critically summarizes the available analytical and advanced methods for the determination of Allura Red and also concisely discusses the acceptable daily intake (ADI), toxicology and extraction methods.

  11. Analytical solution of population balance equation involving ...

    Indian Academy of Sciences (India)

    This paper presents an effective analytical simulation to solve population .... considering spatial dependence and growth, based on the so-called LPA formulation as .... But the particle size distribution is defined so that n(v,t) dx is the number of ..... that was made beforehand in the construction of the analytical solutions ...

  12. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim; Muller, Hendrik; Adam, Frederick M.; Panda, Saroj K.; Witt, Matthias; Al-Hajji, Adnan A.; Sarathy, Mani

    2015-01-01

    cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated

  13. Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces

    Science.gov (United States)

    Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana

    The aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at the visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive virtual reality system capable of immersing trainees in a virtual environment where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user’s “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system is encouraging.

  14. Experimental tests and qualification of analytical methods to address thermohydraulic phenomena in advanced water cooled reactors. Proceedings of a technical committee meeting

    International Nuclear Information System (INIS)

    2000-05-01

    Worldwide there is considerable experience in nuclear power technology, especially in water cooled reactor technology. Of the operating plants, in September 1998, 346 were light water reactors (LWRs) totalling 306 GW(e) and 29 were heavy water reactors (HWRs) totalling 15 GW(e). The accumulated experience and lessons learned from these plants are being incorporated into new advanced reactor designs. Utility requirements documents have been formulated to guide these design activities by incorporating this experience, and results from research and development programmes, with the aim of reducing costs and licensing uncertainties by establishing the technical bases for the new designs. Common goals for advanced designs are high availability, user-friendly features, competitive economics and compliance with internationally recognized safety objectives. Large water cooled reactors with power outputs of 1300 MW(e) and above, which possess inherent safety characteristics (e.g. negative Doppler moderator temperature coefficients, and negative moderator void coefficient) and incorporate proven, active engineered systems to accomplish safety functions are being developed. Other designs with power outputs from, for example, 220 MW(e) up to about 1300 MW(e) which also possess inherent safety characteristics and which place more emphasis on utilization of passive safety systems are being developed. Passive systems are based on natural forces and phenomena such as natural convection and gravity, making safety functions less dependent on active systems and components like pumps and diesel generators. In some cases, further experimental tests for the thermohydraulic conditions of interest in advanced designs can provide improved understanding of the phenomena. Further, analytical methods to predict reactor thermohydraulic behaviour can be qualified for use by comparison with the experimental results. These activities should ultimately result in more economical designs. The

  15. Tools for advanced simulations to nuclear propulsion systems in rockets

    International Nuclear Information System (INIS)

    Torres Sepulveda, A.; Perez Vara, R.

    2004-01-01

    While chemical propulsion rockets have dominated space exploration, other forms of rocket propulsion based on nuclear power, electrostatic and magnetic drive, and other principles besides chemical reactions, have been considered from the earliest days of the field. The goal of most of these advanced rocket propulsion schemes is improved efficiency through higher exhaust velocities, in order to reduce the amount of fuel the rocket vehicle needs to carry, though generally at the expense of high thrust. Nuclear propulsion seems to be the most promising short-term technology to plan realistic interplanetary missions. The development of a nuclear electric propulsion spacecraft shall require the development of models to analyse the mission and to understand the interaction between the related subsystems (nuclear reactor, electrical converter, power management and distribution, and electric propulsion) during the different phases of the mission. This paper explores the modelling of a nuclear electric propulsion (NEP) spacecraft type using EcosimPro simulation software. This software is a multi-disciplinary simulation tool with a powerful object-oriented simulation language and state-of-the-art solvers. EcosimPro is the recommended ESA simulation tool for Environmental Control and Life Support Systems (ECLSS) and has been used successfully within the framework of the European activities of the International Space Station programme. Furthermore, propulsion libraries for chemical and electrical propulsion are currently being developed under ESA contracts to set this tool as standard usage in the propulsion community. At present, there is not any workable NEP spacecraft, but a standardized-modular, multi-purpose interplanetary spacecraft for post-2000 missions, called ISC-2000, has been proposed in reference. The simulation model presented in this paper is based on the preliminary designs for this spacecraft. (Author)

  16. [Final goal and problems in clinical chemistry examination measured by advanced analytical instruments].

    Science.gov (United States)

    Sasaki, M; Hashimoto, E

    1993-07-01

    In the field of clinical chemistry in Japan, the automation of analytical instruments first appeared in the 1960s with the rapid development of the electronics industry. After a series of improvements and modifications over the past thirty years, these analytical instruments have become excellent multifunctional systems. As a result of these developments, it is now well recognized that automated analytical instruments are indispensable for managing the modern clinical laboratory. On the other hand, these automated instruments have uncovered various problems that went undetected when manually operated instruments were used. For instance, variation among commercially available standard solutions, due to the lack of government control, causes different values to be obtained at different institutions. In addition, there are many problems such as a shortage of medical technologists, complications in sample handling and increased labor costs. Furthermore, inadequate maintenance causes frequent erroneous laboratory reports despite the latest, most efficient analytical instruments being installed. Thus, the working process in the clinical laboratory must be systematized for rapidity and effectiveness. In the present report, we review the developmental history of automated analytical instruments, discuss the problems in creating an effective clinical laboratory and explore ways to deal with these emerging issues in clinical laboratory automation.

  17. Antioxidant phytochemicals in fresh produce: exploitation of genotype variation and advancements in analytical protocols

    Science.gov (United States)

    Manganaris, George A.; Goulas, Vlasios; Mellidou, Ifigeneia; Drogoudi, Pavlina

    2017-12-01

    Horticultural commodities (fruit and vegetables) are the major dietary source of several bioactive compounds of high nutraceutical value for humans, including polyphenols, carotenoids and vitamins. The aim of the current review was dual. Firstly, towards the eventual enhancement of horticultural crops with bio-functional compounds, the natural genetic variation in antioxidants found in different species and cultivar/genotypes is underlined. Notably, some landraces and/or traditional cultivars have been characterized by substantially higher phytochemical content, i.e. small tomato of Santorini island (cv. ‘Tomataki Santorinis’) possesses appreciably high amounts of ascorbic acid. The systematic screening of key bioactive compounds in a wide range of germplasm for the identification of promising genotypes and the restoration of key gene fractions from wild species and landraces may help in reducing the loss of agro-biodiversity, creating a healthier ‘gene pool’ as the basis of future adaptation. Towards this direction, large scale comparative studies in different cultivars/genotypes of a given species provide useful insights about the ones of higher nutritional value. Secondly, the advancements in the employment of analytical techniques to determine the antioxidant potential through a convenient, easy and fast way are outlined. Such analytical techniques include electron paramagnetic resonance (EPR) and infrared (IR) spectroscopy, electrochemical and chemometric methods, flow injection analysis (FIA), optical sensors and high resolution screening (HRS). Taking into consideration that fruits and vegetables are complex mixtures of water- and lipid-soluble antioxidants, the exploitation of chemometrics to develop “omics” platforms (i.e. metabolomics, foodomics) is a promising tool for researchers to decode and/or predict antioxidant activity of fresh produce. For industry, the use of cheap and rapid optical sensors and IR spectroscopy is recommended to

  18. Antioxidant Phytochemicals in Fresh Produce: Exploitation of Genotype Variation and Advancements in Analytical Protocols

    Directory of Open Access Journals (Sweden)

    George A. Manganaris

    2018-02-01

    Full Text Available Horticultural commodities (fruit and vegetables) are the major dietary source of several bioactive compounds of high nutraceutical value for humans, including polyphenols, carotenoids and vitamins. The aim of the current review was dual. Firstly, toward the eventual enhancement of horticultural crops with bio-functional compounds, the natural genetic variation in antioxidants found in different species and cultivars/genotypes is underlined. Notably, some landraces and/or traditional cultivars have been characterized by substantially higher phytochemical content, i.e., small tomato of Santorini island (cv. “Tomataki Santorinis”) possesses appreciably high amounts of ascorbic acid (AsA). The systematic screening of key bioactive compounds in a wide range of germplasm for the identification of promising genotypes and the restoration of key gene fractions from wild species and landraces may help in reducing the loss of agro-biodiversity, creating a healthier “gene pool” as the basis of future adaptation. Toward this direction, large scale comparative studies in different cultivars/genotypes of a given species provide useful insights about the ones of higher nutritional value. Secondly, the advancements in the employment of analytical techniques to determine the antioxidant potential through a convenient, easy and fast way are outlined. Such analytical techniques include electron paramagnetic resonance (EPR) and infrared (IR) spectroscopy, electrochemical, and chemometric methods, flow injection analysis (FIA), optical sensors, and high resolution screening (HRS). Taking into consideration that fruits and vegetables are complex mixtures of water- and lipid-soluble antioxidants, the exploitation of chemometrics to develop “omics” platforms (i.e., metabolomics, foodomics) is a promising tool for researchers to decode and/or predict antioxidant activity of fresh produce. For industry, the use of optical sensors and IR spectroscopy is recommended to

  19. 2D fluid-analytical simulation of electromagnetic effects in low pressure, high frequency electronegative capacitive discharges

    International Nuclear Information System (INIS)

    Kawamura, E; Lichtenberg, A J; Lieberman, M A; Marakhtanov, A M

    2016-01-01

    A fast 2D axisymmetric fluid-analytical multifrequency capacitively coupled plasma (CCP) reactor code is used to study center high nonuniformity in a low pressure electronegative chlorine discharge. In the code, a time-independent Helmholtz wave equation is used to solve for the capacitive fields in the linearized frequency domain. This eliminates the time dependence from the electromagnetic (EM) solve, greatly speeding up the simulations at the cost of neglecting higher harmonics. However, since the code allows up to three driving frequencies, we can add the two most important harmonics to the CCP simulations as the second and third input frequencies. The amplitude and phase of these harmonics are estimated by using a recently developed 1D radial nonlinear transmission line (TL) model of a highly asymmetric cylindrical discharge (Lieberman et al 2015 Plasma Sources Sci. Technol. 24 055011). We find that at higher applied frequencies, the higher harmonics contribute significantly to the center high nonuniformity due to their shorter plasma wavelengths. (paper)
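
    The key idea in this record is replacing a time-stepped electromagnetic solve with a single time-independent Helmholtz solve in the linearized frequency domain. The sketch below is only an illustration of that step in 1D with finite differences; the geometry, complex wavenumber, source profile and boundary conditions are all assumed values, not the reactor model of the paper.

```python
# 1D finite-difference solve of a Helmholtz equation u'' + k^2 u = -s(x) with
# u = 0 at both ends: one linear solve in the frequency domain replaces time
# stepping. All parameters below are illustrative assumptions.
import numpy as np

n, L = 400, 0.3                          # grid points and domain length [m] (assumed)
x = np.linspace(0.0, L, n)
h = x[1] - x[0]
k = 50.0 + 2.0j                          # assumed complex wavenumber (Im part models damping)
s = np.exp(-((x - 0.5 * L) / 0.02) ** 2) # assumed localized source profile

A = np.zeros((n, n), dtype=complex)
b = -(h ** 2) * s.astype(complex)
for i in range(1, n - 1):
    A[i, i - 1] = 1.0
    A[i, i] = -2.0 + (k * h) ** 2
    A[i, i + 1] = 1.0
A[0, 0] = A[-1, -1] = 1.0                # Dirichlet boundaries: u = 0 at both walls
b[0] = b[-1] = 0.0

u = np.linalg.solve(A, b)                # single linear solve, no time dependence
print("peak |u| =", np.abs(u).max())
```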

  20. A new DG nanoscale TFET based on MOSFETs by using source gate electrode: 2D simulation and an analytical potential model

    Science.gov (United States)

    Ramezani, Zeinab; Orouji, Ali A.

    2017-08-01

    This paper proposes and investigates a double-gate (DG) MOSFET that emulates a tunnel field effect transistor (M-TFET). This novel concept turns a double-gate MOSFET into a device that behaves as a tunneling field effect transistor through work-function engineering. In the proposed structure, in addition to the main gate, another gate is placed over the source region with zero applied voltage and a proper work function to convert the source region from N+ to P+. We examine the impact of varying the source-gate work function and the source doping on the device parameters. The simulation results indicate that the M-TFET is well suited for switching applications. We also present a two-dimensional analytical potential model of the proposed structure, obtained by solving Poisson's equation in the x and y directions; the electric field is then obtained by differentiating the potential profile. To validate the model, the analytical results are compared with simulations from the SILVACO ATLAS device simulator.
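
    The final step described above, obtaining the electric field by differentiating the potential profile, can be shown with a small numerical example. The 2D potential below is a synthetic placeholder, not the solution of the paper's Poisson model; only the differentiation step is illustrated.

```python
# Sketch: given a sampled 2D channel potential phi(x, y), the electric field
# follows from E = -grad(phi). The potential here is a synthetic placeholder.
import numpy as np

x = np.linspace(0.0, 50e-9, 201)                  # assumed 50 nm channel length
y = np.linspace(0.0, 10e-9, 41)                   # assumed 10 nm depth
X, Y = np.meshgrid(x, y, indexing="ij")
phi = 0.3 * np.sin(np.pi * X / x[-1]) * np.exp(-Y / 5e-9)   # placeholder potential [V]

dphi_dx, dphi_dy = np.gradient(phi, x, y)         # central differences on the grid
Ex, Ey = -dphi_dx, -dphi_dy
print("peak |Ex| along the channel: %.3e V/m" % np.abs(Ex).max())
```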

  1. Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases

    Science.gov (United States)

    Amos, Christopher I.; Bafna, Vineet; Hauser, Elizabeth R.; Hernandez, Ryan D.; Li, Chun; Liberles, David A.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Papanicolaou, George J.; Peng, Bo; Ritchie, Marylyn D.; Rosenfeld, Gabriel; Witte, John S.

    2014-01-01

    Genetic simulation programs are used to model data under specified assumptions to facilitate the understanding and study of complex genetic systems. Standardized data sets generated using genetic simulation are essential for the development and application of novel analytical tools in genetic epidemiology studies. With continuing advances in high-throughput genomic technologies and generation and analysis of larger, more complex data sets, there is a need for updating current approaches in genetic simulation modeling. To provide a forum to address current and emerging challenges in this area, the National Cancer Institute (NCI) sponsored a workshop, entitled “Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases” at the National Institutes of Health (NIH) in Bethesda, Maryland on March 11-12, 2014. The goals of the workshop were to: (i) identify opportunities, challenges and resource needs for the development and application of genetic simulation models; (ii) improve the integration of tools for modeling and analysis of simulated data; and (iii) foster collaborations to facilitate development and applications of genetic simulation. During the course of the meeting the group identified challenges and opportunities for the science of simulation, software and methods development, and collaboration. This paper summarizes key discussions at the meeting, and highlights important challenges and opportunities to advance the field of genetic simulation. PMID:25371374

  2. A SIMPLE ANALYTICAL METHOD TO DETERMINE SOLAR ENERGETIC PARTICLES' MEAN FREE PATH

    International Nuclear Information System (INIS)

    He, H.-Q.; Qin, G.

    2011-01-01

    To obtain the mean free path of solar energetic particles (SEPs) for a solar event, one usually has to fit time profiles of both flux and anisotropy from spacecraft observations to numerical simulations of SEPs' transport processes. This method can be called a simulation method. But a reasonably good fitting needs a lot of simulations, which demand a large amount of calculation resources. Sometimes, it is necessary to find an easy way to obtain the mean free path of SEPs quickly, for example, in space weather practice. Recently, Shalchi et al. provided an approximate analytical formula of SEPs' anisotropy time profile as a function of particles' mean free path for impulsive events. In this paper, we determine SEPs' mean free path by fitting the anisotropy time profiles from Shalchi et al.'s analytical formula to spacecraft observations. This new method can be called an analytical method. In addition, we obtain SEPs' mean free path with the traditional simulation methods. Finally, we compare the mean free path obtained with the simulation method to that of the analytical method to show that the analytical method, with some minor modifications, can give us a good, quick approximation of SEPs' mean free path for impulsive events.
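
    The "analytical method" of this record amounts to a least-squares fit of an anisotropy-versus-time formula to observed profiles. The sketch below shows only that fitting step; the model function is a placeholder exponential with an arbitrary scaling, not the Shalchi et al. formula, and the "observations" are synthetic.

```python
# Fit a placeholder anisotropy-vs-time model to synthetic data to extract a
# mean-free-path parameter, illustrating the analytical (fitting) method.
import numpy as np
from scipy.optimize import curve_fit

def anisotropy_model(t, mfp, a0):
    # placeholder: shorter mean free path -> faster anisotropy decay;
    # the factor 20.0 is an arbitrary scaling, not a physical constant
    return a0 * np.exp(-t / (20.0 * mfp))

t_obs = np.linspace(0.5, 20.0, 40)               # hours after onset (synthetic)
rng = np.random.default_rng(0)
a_obs = anisotropy_model(t_obs, 0.3, 2.0) + 0.02 * rng.standard_normal(t_obs.size)

popt, pcov = curve_fit(anisotropy_model, t_obs, a_obs, p0=[0.1, 1.0])
print("fitted mean free path (placeholder units):", popt[0])
```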

  3. Modeling And Simulation Of Highly Advanced Multilevel Inverter For Speed Control Of Induction Motor

    Directory of Open Access Journals (Sweden)

    Ravi Raj

    2017-02-01

    Full Text Available In this paper, the problem of reducing power dissipation in a single-phase induction motor with DC sources is addressed by controlling the motor speed with a highly advanced 9-level multilevel inverter having approximately zero harmonics. As the demand for power increases day by day, advanced electrical equipment with high efficiency and low power dissipation is needed, making such an advanced inverter necessary. A multilevel inverter with up to 9 levels is designed using IGBTs (insulated-gate bipolar transistors) in MATLAB; it has negligible total harmonic distortion (THD) and is therefore used to control the speed of the single-phase induction motor that is widely used in daily applications. Several informative simulation results verify the validity of the proposed model.
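
    The claim of near-zero harmonics can be checked numerically by taking the spectrum of a staircase (multilevel) output waveform. The sketch below uses arbitrary assumed switching angles for a 9-level, quarter-wave-symmetric waveform; it is not the design of the paper, only an illustration of how THD is estimated.

```python
# Estimate the total harmonic distortion (THD) of a multilevel staircase
# waveform via an FFT. Switching angles are illustrative assumptions.
import numpy as np

def staircase(theta, angles):
    """Quarter-wave-symmetric staircase voltage: one unit step per switching angle."""
    phase = theta % np.pi                              # position within each half-cycle
    level = np.zeros_like(theta)
    for a in angles:
        level += ((phase >= a) & (phase <= np.pi - a)).astype(float)
    return np.sign(np.sin(theta)) * level

angles = np.radians([5.0, 15.0, 27.0, 42.0])           # assumed angles (4 steps -> 9 levels)
theta = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)
v = staircase(theta, angles)

mags = np.abs(np.fft.rfft(v)) / (len(v) / 2.0)         # single-sided harmonic amplitudes
fundamental = mags[1]                                  # bin 1 = one cycle over the window
thd = np.sqrt(np.sum(mags[2:] ** 2)) / fundamental
print("estimated THD of the staircase waveform: %.1f %%" % (100.0 * thd))
```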

  4. An advanced analytical solution for pressure build-up during CO2 injection into infinite saline aquifers: The role of compressibility

    Science.gov (United States)

    Wu, Haiqing; Bai, Bing; Li, Xiaochun

    2018-02-01

    Existing analytical or approximate solutions that are appropriate for describing the migration mechanics of CO2 and the evolution of fluid pressure in reservoirs do not consider the high compressibility of CO2, which reduces their calculation accuracy and application value. Therefore, this work first derives a new governing equation that represents the movement of complex fluids in reservoirs, based on the equation of continuity and the generalized Darcy's law. A more rigorous definition of the coefficient of compressibility of fluid is then presented, and a power function model (PFM) that characterizes the relationship between the physical properties of CO2 and the pressure is derived. Meanwhile, to avoid the difficulty of determining the saturation of fluids, a method that directly assumes the average relative permeability of each fluid phase in different fluid domains is proposed, based on the theory of gradual change. An advanced analytical solution is obtained that includes both the partial miscibility and the compressibility of CO2 and brine in evaluating the evolution of fluid pressure by integrating within different regions. Finally, two typical sample analyses are used to verify the reliability, improved nature and universality of this new analytical solution. Based on the physical characteristics and the results calculated for the examples, this work elaborates the concept and basis of partitioning for use in further work.
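
    The compressibility definition and the power-function model (PFM) mentioned above can be illustrated with a short worked example. The coefficients below are assumed for illustration only and do not reproduce the paper's fitted CO2 property model.

```python
# Isothermal compressibility c = (1/rho) * d(rho)/dP for an assumed power-law
# density-pressure relation rho = a * P**b (a stand-in for a PFM-type model).
# For this form, d(rho)/dP = a*b*P**(b-1), so c reduces to b / P.
import numpy as np

a, b = 25.0, 0.7                       # assumed PFM coefficients (illustrative)
P = np.linspace(8e6, 20e6, 7)          # assumed pressure range [Pa]
rho = a * P ** b

c_analytic = b / P                     # closed-form compressibility of the power law
c_numeric = np.gradient(rho, P) / rho  # finite-difference check on the sampled curve

for Pi, ca, cn in zip(P, c_analytic, c_numeric):
    print(f"P = {Pi/1e6:5.1f} MPa   c_analytic = {ca:.3e}   c_numeric = {cn:.3e} 1/Pa")
```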

  5. Advanced simulators for France's NPPs

    International Nuclear Information System (INIS)

    Zerbino, H.; Renault, P.

    1997-01-01

    The training capabilities of the new generation of full-scope simulators have been greatly enhanced by the massive application of graphic information displays. In a parallel development, the simulation models have attained such a level of performance that real-time simulators are increasingly becoming ideal tools for certain engineering tasks. Their applications should soon extend well beyond the training activities to which they have been restricted in the past

  6. Strategic Plan for Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    Energy Technology Data Exchange (ETDEWEB)

    Kimberlyn C. Mousseau

    2011-10-01

    The Nuclear Energy Computational Fluid Dynamics Advanced Modeling and Simulation (NE-CAMS) system is being developed at the Idaho National Laboratory (INL) in collaboration with Bettis Laboratory, Sandia National Laboratory (SNL), Argonne National Laboratory (ANL), Utah State University (USU), and other interested parties with the objective of developing and implementing a comprehensive and readily accessible data and information management system for computational fluid dynamics (CFD) verification and validation (V&V) in support of nuclear energy systems design and safety analysis. The two key objectives of the NE-CAMS effort are to identify, collect, assess, store and maintain high resolution and high quality experimental data and related expert knowledge (metadata) for use in CFD V&V assessments specific to the nuclear energy field and to establish a working relationship with the U.S. Nuclear Regulatory Commission (NRC) to develop a CFD V&V database, including benchmark cases, that addresses and supports the associated NRC regulations and policies on the use of CFD analysis. In particular, the NE-CAMS system will support the Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program, which aims to develop and deploy advanced modeling and simulation methods and computational tools for reliable numerical simulation of nuclear reactor systems for design and safety analysis. Primary NE-CAMS Elements There are four primary elements of the NE-CAMS knowledge base designed to support computer modeling and simulation in the nuclear energy arena as listed below. Element 1. The database will contain experimental data that can be used for CFD validation that is relevant to nuclear reactor and plant processes, particularly those important to the nuclear industry and the NRC. Element 2. Qualification standards for data evaluation and classification will be incorporated and applied such that validation data sets will result in well

  7. Leningrad NPP full scope and analytical simulators as tools for MMI improvement and operator support systems development and testing

    International Nuclear Information System (INIS)

    Rakitin, I.D.; Malkin, S.D.; Shalia, V.V.; Fedorov, E.M.; Lebedev, N.N.; Khoudiakov, M.M.

    1999-01-01

    The Training Support Center (TSC) created at the Leningrad NPP (LNPP), Sosnovy Bor, Russia, incorporates full-scope and analytical simulators working in parallel with prototypes of expert and interactive systems, providing a new scope of R and D work on MMI improvement for the developer as well as for the user. The paper describes the possibilities for developing, adjusting and testing any new or upgraded Operators' Support System before its installation in the reference unit's control room. These simulators model a wide range of accidents and transients and provide, through special software and ETHERNET links, data communications with the prototypes of the Operators' Support Systems. As an example, the paper describes the development and adjustment of two state-of-the-art Operators' Support Systems using the simulators. These systems have been developed jointly by the RRC KI and LNPP teams. (author)

  8. Born analytical or adopted over time? a study investigating if new analytical tools can ensure the survival of market oriented startups.

    OpenAIRE

    Skogen, Hege Janson; De la Cruz, Kai

    2017-01-01

    Master's thesis (MSc) in Strategic Marketing Management - Handelshøyskolen BI, 2017. This study investigates whether the prevalence of technological advances within quantitative analytics moderates the effect market orientation has on firm performance, and whether startups can take advantage of the potential opportunities to ensure their own survival. For this purpose, the authors review previous literature on market orientation, startups, marketing analytics, an...

  9. Simulation for Supporting Scale-Up of a Fluidized Bed Reactor for Advanced Water Oxidation

    Directory of Open Access Journals (Sweden)

    Farhana Tisa

    2014-01-01

    Full Text Available Simulation of a fluidized bed reactor (FBR) was accomplished for treating wastewater using the Fenton reaction, which is an advanced oxidation process (AOP). The simulation was performed to determine the characteristics of FBR performance, the concentration profile of the contaminants, and various prominent hydrodynamic properties (e.g., Reynolds number, velocity, and pressure) in the reactor. The simulation was implemented for a 2.8 L working volume using hydrodynamic correlations, the continuity equation, and simplified kinetic information for phenol degradation as a model. The simulation shows that, using Fe3+ and Fe2+ mixtures as catalyst, TOC degradation of up to 45% was achieved for a contaminant range of 40–90 mg/L within 60 min. The concentration profiles and hydrodynamic characteristics were also generated. A subsequent scale-up study was conducted using the similitude method. The analysis shows that the developed models are applicable up to a 10 L working volume. The study proves that, with appropriate modeling and simulation, data can be predicted for designing and operating an FBR for wastewater treatment.
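
    Two of the quantities mentioned in this record, the contaminant degradation profile and the Reynolds number, lend themselves to a quick back-of-the-envelope check. The sketch below assumes a simple first-order TOC decay calibrated to the ~45% removal in 60 min quoted in the abstract, and arbitrary fluid and particle properties; none of the values are taken from the 2.8 L reactor study.

```python
# First-order TOC decay and a particle Reynolds number, with assumed values.
import numpy as np

# TOC(t) = TOC0 * exp(-k*t); pick k so that ~45% is removed in 60 min
TOC0, removal, t_total = 90.0, 0.45, 60.0      # mg/L, fraction, minutes (assumed)
k = -np.log(1.0 - removal) / t_total           # 1/min
print("assumed rate constant k = %.4f 1/min" % k)
print("TOC after 30 min = %.1f mg/L" % (TOC0 * np.exp(-k * 30.0)))

# Particle Reynolds number Re = rho * u * d_p / mu (all values assumed)
rho, u, d_p, mu = 1000.0, 0.05, 5e-4, 1e-3     # kg/m3, m/s, m, Pa.s
print("particle Reynolds number Re = %.1f" % (rho * u * d_p / mu))
```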

  10. A study of reset mode in advanced alarm system simulator

    International Nuclear Information System (INIS)

    Yenn, T. C.; Hwang, S. L.; Huang, F. H.; Yu, A. C.; Hsu, C. C.; Huang, H. W.

    2006-01-01

    Automation functions have been widely applied in the main control rooms of nuclear power plants. This raises a new issue of human-automation interaction, which concerns human operational performance in automated systems. The focus of this research is the automatic alarm reset in the advanced alarm system (AAS) of the Advanced Nuclear Power Plant in Taiwan. Since alarms are crucial for understanding the status of the plant, and since the reset function of the alarm system will be changed from fully manual to fully automatic, it is very important to test and evaluate the performance and effects of the reset modes in the AAS. The purpose of this paper is to evaluate the impact of the auto-reset alarm system on plant performance and on operators' preference and task load. A dynamic simulator of an AAS was developed to compare the manual and automatic reset functions of the alarm system in terms of task performance and subjective ratings of task workload, comprehension, and preference. The simulation includes a PCTRAN model and alarm processing software. The final results revealed that, using the auto-reset mode, participants had a lower task load index (TLX) for effort in the first test trial and were more satisfied in the multiple-task condition. In contrast, using the manual reset mode, participants were more satisfied with alarm handling, monitoring, and decision making. In other words, each reset mode in the study has unique features to assist the operator, but neither is sufficient on its own. The reset function in the AAS should therefore be very flexible. Additionally, the experimental results pointed out that the user interfaces need to be improved. These experiences will be helpful for human factors verification and validation in the near future. (authors)

  11. The importance of simulation facilities for the development of review criteria for advanced human system interfaces

    International Nuclear Information System (INIS)

    O'Hara, J.M.; Wachtel, J.

    1994-01-01

    Advanced control room (ACR) concepts are being developed in the commercial nuclear industry as part of future reactor designs. The ACRs will use advanced human-system interface (HSI) technologies that may have significant implications for plant safety in that they will affect the operator's overall role (function) in the system, the method of information presentation, the ways in which the operator interacts with the system, and the requirements on the operator to understand and supervise an increasingly complex system. The U.S. Nuclear Regulatory Commission (NRC) reviews the HSI aspects of control rooms to ensure that they are designed to good human factors engineering principles and that operator performance and reliability are appropriately supported to protect public health and safety. The NRC is developing guidelines to support their review of these advanced designs. As part of this effort, a methodology for guidance development was established, and topics in need of further research were identified. Simulators of various kinds are likely to play important roles in the development of review guidelines and in the evaluation of ACRs. This paper describes a general approach to review criteria development, and discusses the role of simulators in addressing research needs

  12. Study of advanced fuel system concepts for commercial aircraft

    Science.gov (United States)

    Coffinberry, G. A.

    1985-01-01

    An analytical study was performed in order to assess relative performance and economic factors involved with alternative advanced fuel systems for future commercial aircraft operating with broadened property fuels. The DC-10-30 wide-body tri-jet aircraft and the CF6-80X engine were used as a baseline design for the study. Three advanced systems were considered and were specifically aimed at addressing freezing point, thermal stability and lubricity fuel properties. Actual DC-10-30 routes and flight profiles were simulated by computer modeling and resulted in prediction of aircraft and engine fuel system temperatures during a nominal flight and during statistical one-day-per-year cold and hot flights. Emergency conditions were also evaluated. Fuel consumption and weight and power extraction results were obtained. An economic analysis was performed for new aircraft and systems. Advanced system means for fuel tank heating included fuel recirculation loops using engine lube heat and generator heat. Environmental control system bleed air heat was used for tank heating in a water recirculation loop. The results showed that fundamentally all of the three advanced systems are feasible but vary in their degree of compatibility with broadened-property fuel.

  13. Development of simulation tools for improvement of measurement accuracy and efficiency in ultrasonic testing. Part 2. Development of fast simulator based on analytical approach

    International Nuclear Information System (INIS)

    Yamada, Hisao; Fukutomi, Hiroyuki; Lin, Shan; Ogata, Takashi

    2008-01-01

    CRIEPI developed a high-speed simulation method to predict B-scope images of crack-like defects under ultrasonic testing. The method is based on the geometrical theory of diffraction (GTD) to follow the ultrasonic waves transmitted from the angle probe and, with the aid of reciprocity relations, derives analytical equations for the echoes received by the probe. The tip and mirror echoes from a slit of arbitrary angle through the thickness of the test article and of arbitrary depth can be calculated by this method. The main objective of the study is to develop a high-speed simulation tool to generate B-scope displays of crack-like defects. This was achieved for simple slits in geometry-change regions by prototype software based on the method. Fairly complete B-scope images for slits could be obtained in about a minute on a current personal computer. The numerical predictions for surface-opening slits were in excellent agreement with the corresponding experimental measurements. (author)

  14. Predictive analytics and child protection: constraints and opportunities.

    Science.gov (United States)

    Russell, Jesse

    2015-08-01

    This paper considers how predictive analytics might inform, assist, and improve decision making in child protection. Predictive analytics represents recent increases in data quantity and data diversity, along with advances in computing technology. While the use of data and statistical modeling is not new to child protection decision making, its use in child protection is experiencing growth, and efforts to leverage predictive analytics for better decision-making in child protection are increasing. Past experiences, constraints and opportunities are reviewed. For predictive analytics to make the most impact on child protection practice and outcomes, it must embrace established criteria of validity, equity, reliability, and usefulness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Analytical energy spectrum for hybrid mechanical systems

    International Nuclear Information System (INIS)

    Zhong, Honghua; Xie, Qiongtao; Lee, Chaohong; Guan, Xiwen; Gao, Kelin; Batchelor, Murray T

    2014-01-01

    We investigate the energy spectrum for hybrid mechanical systems described by non-parity-symmetric quantum Rabi models. A set of analytical solutions in terms of the confluent Heun functions and their analytical energy spectrum is obtained. The analytical energy spectrum includes regular and exceptional parts, which are both confirmed by direct numerical simulation. The regular part is determined by the zeros of the Wronskian for a pair of analytical solutions. The exceptional part is relevant to the isolated exact solutions and its energy eigenvalues are obtained by analyzing the truncation conditions for the confluent Heun functions. By analyzing the energy eigenvalues for exceptional points, we obtain the analytical conditions for the energy-level crossings, which correspond to two-fold energy degeneracy. (paper)

  16. Cryptography based on neural networks - analytical results

    International Nuclear Information System (INIS)

    Rosen-Zvi, Michal; Kanter, Ido; Kinzel, Wolfgang

    2002-01-01

    The mutual learning process between two parity feed-forward networks with discrete and continuous weights is studied analytically, and we find that the number of steps required to achieve full synchronization between the two networks in the case of discrete weights is finite. The synchronization process is shown to be non-self-averaging and the analytical solution is based on random auxiliary variables. The learning time of an attacker that is trying to imitate one of the networks is examined analytically and is found to be much longer than the synchronization time. Analytical results are found to be in agreement with simulations. (letter to the editor)
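
    The mutual-learning setup analyzed in this record can be sketched with two small tree parity machines that exchange outputs on common random inputs and update their discrete weights only when those outputs agree, until the weight vectors coincide. The network sizes, the Hebbian rule variant and the step cap below are illustrative choices, not the exact model parameters of the paper.

```python
# Minimal mutual-learning sketch: two tree parity machines with discrete
# weights synchronize by exchanging outputs on common random inputs.
import numpy as np

K, N, L = 3, 20, 3                        # hidden units, inputs per unit, weight bound (assumed)
rng = np.random.default_rng(1)

def tpm_output(w, x):
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1                # break ties deterministically
    return sigma, int(np.prod(sigma))

def hebbian_update(w, x, sigma, tau):
    # update only the hidden units whose output agrees with the network output
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + sigma[k] * x[k], -L, L)

wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))

steps = 0
while not np.array_equal(wA, wB) and steps < 200_000:
    x = rng.choice([-1, 1], size=(K, N))  # common public input
    sA, tA = tpm_output(wA, x)
    sB, tB = tpm_output(wB, x)
    if tA == tB:                          # mutual learning: update only on agreement
        hebbian_update(wA, x, sA, tA)
        hebbian_update(wB, x, sB, tB)
    steps += 1

print("synchronized:", np.array_equal(wA, wB), "after", steps, "exchanged inputs")
```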

  17. Advanced Simulation Capability for Environmental Management (ASCEM): Early Site Demonstration

    International Nuclear Information System (INIS)

    Meza, Juan; Hubbard, Susan; Freshley, Mark D.; Gorton, Ian; Moulton, David; Denham, Miles E.

    2011-01-01

    The U.S. Department of Energy Office of Environmental Management, Technology Innovation and Development (EM-32), is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high performance computing tool will facilitate integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. As part of the initial development process, a series of demonstrations were defined to test ASCEM components and provide feedback to developers, engage end users in applications, and lead to an outcome that would benefit the sites. The demonstration was implemented for a sub-region of the Savannah River Site General Separations Area that includes the F-Area Seepage Basins. The physical domain included the unsaturated and saturated zones in the vicinity of the seepage basins and Fourmile Branch, using an unstructured mesh fit to the hydrostratigraphy and topography of the site. The calculations modeled variably saturated flow and the resulting flow field was used in simulations of the advection of non-reactive species and the reactive-transport of uranium. As part of the demonstrations, a new set of data management, visualization, and uncertainty quantification tools were developed to analyze simulation results and existing site data. These new tools can be used to provide summary statistics, including information on which simulation parameters were most important in the prediction of uncertainty and to visualize the relationships between model input and output.

  18. Advanced design of local ventilation systems

    Energy Technology Data Exchange (ETDEWEB)

    Kulmala, I. [VTT Manufacturing Technology, Espoo (Finland). Safety Technology

    1997-12-31

    Local ventilation is widely used in industry for controlling airborne contaminants. However, the present design practices of local ventilation systems are mainly based on empirical equations and do not take quantitatively into account the various factors affecting the performance of these systems. The aim of this study was to determine the applicability and limitations of more advanced fluid mechanical methods to the design and development of local ventilation systems. The most important factors affecting the performance of local ventilation systems were determined and their effect was studied in a systematic manner. The numerical calculations were made with the FLUENT computer code and they were verified by laboratory experiments, previous measurements or analytical solutions. The results proved that the numerical calculations can provide a realistic simulation of exhaust openings, effects of ambient air flows and wake regions. The experiences with the low-velocity local supply air showed that these systems can also be modelled fairly well. The results were used to improve the efficiency and thermal comfort of a local ventilation unit and to increase the effective control range of exhaust hoods. In the simulation of the interaction of a hot buoyant source and local exhaust, the predicted capture efficiencies were clearly higher than those observed experimentally. The deviations between measurements and non-isothermal flow calculations may have partly been caused by the inability to achieve grid-independent solutions. CFD simulation is an advanced and flexible tool for designing and developing local ventilation. The simulations can provide insight into the time-averaged flow field, which may assist us in understanding the observed phenomena and in explaining experimental results. However, for successful calculations the applicability and limitations of the models must be known. (orig.) 78 refs.

  19. Analytical modeling and simulation of subthreshold behavior in nanoscale dual material gate AlGaN/GaN HEMT

    Science.gov (United States)

    Kumar, Sona P.; Agrawal, Anju; Chaujar, Rishu; Gupta, Mridula; Gupta, R. S.

    2008-07-01

    A two-dimensional (2-D) analytical model for a Dual Material Gate (DMG) AlGaN/GaN High Electron Mobility Transistor (HEMT) has been developed to demonstrate the unique attributes of this device structure in suppressing short channel effects (SCEs). The model accurately predicts the channel potential, electric field variation along the channel, and sub-threshold drain current, taking into account the effect of lengths of the two gate metals, their work functions, barrier layer thicknesses, and applied drain biases. It is seen that the SCEs and hot carrier effects in DMG AlGaN/GaN HEMT are suppressed due to the work function difference of the two metal gates, thereby screening the drain potential variations by the gate near the drain. Besides, a more uniform electric field along the channel leads to improved carrier transport efficiency. The accuracy of the results obtained from our analytical model has been verified using ATLAS device simulations.

  20. 3rd International Workshop on Advances in Simulation-Driven Optimization and Modeling

    CERN Document Server

    Leifsson, Leifur; Yang, Xin-She

    2016-01-01

    This edited volume is devoted to the now-ubiquitous use of computational models across most disciplines of engineering and science, led by a trio of world-renowned researchers in the field. Focused on recent advances of modeling and optimization techniques aimed at handling computationally-expensive engineering problems involving simulation models, this book will be an invaluable resource for specialists (engineers, researchers, graduate students) working in areas as diverse as electrical engineering, mechanical and structural engineering, civil engineering, industrial engineering, hydrodynamics, aerospace engineering, microwave and antenna engineering, ocean science and climate modeling, and the automotive industry, where design processes are heavily based on CPU-heavy computer simulations. Various techniques, such as knowledge-based optimization, adjoint sensitivity techniques, and fast replacement models (to name just a few) are explored in-depth along with an array of the latest techniques to optimize the...

  1. Advances in learning analytics and educational data mining

    NARCIS (Netherlands)

    Vahdat, Mehrnoosh; Ghio, A; Oneto, L.; Anguita, D.; Funk, M.; Rauterberg, G.W.M.

    2015-01-01

    The growing interest in recent years towards Learning Analytics (LA) and Educational Data Mining (EDM) has enabled novel approaches and advancements in educational settings. The wide variety of research and practice in this context has enforced important possibilities and applications from

  2. High performance pseudo-analytical simulation of multi-object adaptive optics over multi-GPU systems

    KAUST Repository

    Abdelfattah, Ahmad; Gendron, É ric; Gratadour, Damien; Keyes, David E.; Ltaief, Hatem; Sevin, Arnaud; Vidal, Fabrice

    2014-01-01

    Multi-object adaptive optics (MOAO) is a novel adaptive optics (AO) technique dedicated to the special case of wide-field multi-object spectrographs (MOS). It applies dedicated wavefront corrections to numerous independent tiny patches spread over a large field of view (FOV). The control of each deformable mirror (DM) is done individually using a tomographic reconstruction of the phase based on measurements from a number of wavefront sensors (WFS) pointing at natural and artificial guide stars in the field. The output of this study helps the design of a new instrument called MOSAIC, a multi-object spectrograph proposed for the European Extremely Large Telescope (E-ELT). We have developed a novel hybrid pseudo-analytical simulation scheme that allows us to accurately simulate in detail the tomographic problem. The main challenge resides in the computation of the tomographic reconstructor, which involves pseudo-inversion of a large dense symmetric matrix. The pseudo-inverse is computed using an eigenvalue decomposition, based on the divide and conquer algorithm, on multicore systems with multi-GPUs. Thanks to a new symmetric matrix-vector product (SYMV) multi-GPU kernel, our overall implementation scores significant speedups over standard numerical libraries on multicore, like Intel MKL, and up to 60% speedups over the standard MAGMA implementation on 8 Kepler K20c GPUs. At 40,000 unknowns, this appears to be the largest-scale tomographic AO matrix solver submitted to computation, to date, to our knowledge and opens new research directions for extreme scale AO simulations. © 2014 Springer International Publishing Switzerland.
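
    The core linear-algebra step described in this record is the pseudo-inversion of a large dense symmetric matrix through its eigenvalue decomposition. The sketch below reproduces only that step with NumPy on a small random matrix; it is not the multi-GPU MAGMA/SYMV pipeline of the paper, and the matrix size and tolerance are assumptions.

```python
# Pseudo-inverse of a symmetric matrix via its eigendecomposition, with
# near-null eigenvalues filtered out. Toy-sized stand-in for the tomographic
# reconstructor computation described above.
import numpy as np

n = 500                                      # toy size; the paper targets ~40,000 unknowns
rng = np.random.default_rng(0)
B = rng.standard_normal((n, n))
C = B @ B.T                                  # symmetric positive semi-definite test matrix

eigval, eigvec = np.linalg.eigh(C)           # symmetric eigensolver
tol = eigval.max() * n * np.finfo(float).eps
inv_val = np.where(eigval > tol, 1.0 / eigval, 0.0)   # drop near-null modes
C_pinv = (eigvec * inv_val) @ eigvec.T       # V diag(1/lambda) V^T

err = np.linalg.norm(C @ C_pinv @ C - C) / np.linalg.norm(C)
print("relative reconstruction error:", err)
```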

  3. Simulating and Communicating Outcomes in Disaster Management Situations

    Directory of Open Access Journals (Sweden)

    Michal Lichter

    2015-09-01

    Full Text Available An important but overlooked component of disaster management is raising the awareness and preparedness of potential stakeholders. We show how recent advances in agent-based modeling and geo-information analytics can be combined to this effect. Using a dynamic simulation model, we estimate the long run outcomes of two very different urban disasters with severe consequences: an earthquake and a missile attack. These differ in terms of duration, intensity, permanence, and focal points. These hypothetical shocks are simulated for the downtown area of Jerusalem. Outcomes are compared in terms of their potential for disaster mitigation. The spatial and temporal dynamics of the simulation yield rich outputs. Web-based mapping is used to visualize these results and communicate risk to policy makers, planners, and the informed public. The components and design of this application are described. Implications for participatory disaster management and planning are discussed.

  4. Semi-Analytical Benchmarks for MCNP6

    Energy Technology Data Exchange (ETDEWEB)

    Grechanuk, Pavel Aleksandrovi [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-11-07

    Code verification is an extremely important process that involves proving or disproving the validity of code algorithms by comparing them against analytical results of the underlying physics or mathematical theory on which the code is based. Monte Carlo codes such as MCNP6 must undergo verification and testing upon every release to ensure that the codes are properly simulating nature. Specifically, MCNP6 has multiple sets of problems with known analytic solutions that are used for code verification. Monte Carlo codes primarily specify either current boundary sources or a volumetric fixed source, either of which can be very complicated functions of space, energy, direction and time. Thus, most of the challenges with modeling analytic benchmark problems in Monte Carlo codes come from identifying the correct source definition to properly simulate the correct boundary conditions. The problems included in this suite all deal with mono-energetic neutron transport without energy loss, in a homogeneous material. The variables that differ between the problems are source type (isotropic/beam), medium dimensionality (infinite/semi-infinite), etc.
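
    The kind of verification problem described in this record, mono-energetic transport without energy loss in a homogeneous medium, has a textbook analytic answer against which a Monte Carlo estimate can be checked. The sketch below compares sampled free-flight distances against exp(-Σt·x) for uncollided transmission through a slab; the cross section and slab thickness are assumed, and this is not an MCNP6 input or benchmark from the suite.

```python
# Monte Carlo estimate of uncollided transmission through a slab vs the
# analytic result exp(-sigma_t * x), as a minimal code-verification example.
import numpy as np

sigma_t = 0.5        # total macroscopic cross section [1/cm] (assumed)
x_slab = 3.0         # slab thickness [cm] (assumed)
n = 1_000_000

rng = np.random.default_rng(42)
flight = -np.log(rng.random(n)) / sigma_t          # sampled free-flight distances
mc_transmission = np.mean(flight > x_slab)
analytic = np.exp(-sigma_t * x_slab)

print("Monte Carlo :", mc_transmission)
print("analytic    :", analytic)
print("relative err:", abs(mc_transmission - analytic) / analytic)
```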

  5. Analytical study on model tests of soil-structure interaction

    International Nuclear Information System (INIS)

    Odajima, M.; Suzuki, S.; Akino, K.

    1987-01-01

    Since nuclear power plant (NPP) structures are stiff, heavy and partly-embedded, the behavior of those structures during an earthquake depends on the vibrational characteristics of not only the structure but also the soil. Accordingly, seismic response analyses considering the effects of soil-structure interaction (SSI) are extremely important for seismic design of NPP structures. Many studies have been conducted on analytical techniques concerning SSI and various analytical models and approaches have been proposed. Based on the studies, SSI analytical codes (computer programs) for NPP structures have been improved at JINS (Japan Institute of Nuclear Safety), one of the departments of NUPEC (Nuclear Power Engineering Test Center) in Japan. These codes are soil-spring lumped-mass code (SANLUM), finite element code (SANSSI), thin layered element code (SANSOL). In proceeding with the improvement of the analytical codes, in-situ large-scale forced vibration SSI tests were performed using models simulating light water reactor buildings, and simulation analyses were performed to verify the codes. This paper presents an analytical study to demonstrate the usefulness of the codes

  6. Remote Internet access to advanced analytical facilities: a new approach with Web-based services.

    Science.gov (United States)

    Sherry, N; Qin, J; Fuller, M Suominen; Xie, Y; Mola, O; Bauer, M; McIntyre, N S; Maxwell, D; Liu, D; Matias, E; Armstrong, C

    2012-09-04

    Over the past decade, the increasing availability of the World Wide Web has held out the possibility that the efficiency of scientific measurements could be enhanced in cases where experiments were being conducted at distant facilities. Examples of early successes have included X-ray diffraction (XRD) experimental measurements of protein crystal structures at synchrotrons and access to scanning electron microscopy (SEM) and NMR facilities by users from institutions that do not possess such advanced capabilities. Experimental control, visual contact, and receipt of results have used some form of X forwarding and/or VNC (virtual network computing) software that transfers the screen image of a server at the experimental site to that of the users' home site. A more recent development is a web services platform called Science Studio that provides teams of scientists with secure links to experiments at one or more advanced research facilities. The software provides a widely distributed team with a set of controls and screens to operate, observe, and record essential parts of the experiment. As well, Science Studio provides high speed network access to computing resources to process the large data sets that are often involved in complex experiments. The simple web browser and the rapid transfer of experimental data to a processing site allow efficient use of the facility and assist decision making during the acquisition of the experimental results. The software provides users with a comprehensive overview and record of all parts of the experimental process. A prototype network is described involving X-ray beamlines at two different synchrotrons and an SEM facility. An online parallel processing facility has been developed that analyzes the data in near-real time using stream processing. Science Studio can be expanded to include many other analytical applications, providing teams of users with rapid access to processed results along with the means for detailed

  7. Improved steamflood analytical model

    Energy Technology Data Exchange (ETDEWEB)

    Chandra, S.; Mamora, D.D. [Society of Petroleum Engineers, Richardson, TX (United States)]|[Texas A and M Univ., TX (United States)

    2005-11-01

    Predicting the performance of steam flooding can help in the proper execution of enhanced oil recovery (EOR) processes. The Jones model is often used for analytical steam flooding performance prediction, but it does not accurately predict oil production peaks. In this study, an improved steam flood model was developed by modifying 2 of the 3 components of the capture factor in the Jones model. The modifications were based on simulation results from a Society of Petroleum Engineers (SPE) comparative project case model. The production performance of a 5-spot steamflood pattern unit was simulated and compared with results obtained from the Jones model. Three reservoir types were simulated through the use of 3-D Cartesian black oil models. In order to correlate the simulation and the Jones analytical model results for the start and height of the production peak, the dimensionless steam zone size was modified to account for a decrease in oil viscosity during steam flooding and its dependence on the steam injection rate. In addition, the dimensionless volume of displaced oil produced was modified from its square-root format to an exponential form. The modified model improved results for production performance by up to 20 years of simulated steam flooding, compared to the Jones model. Results agreed with simulation results for 13 different cases, including 3 different sets of reservoir and fluid properties. Reservoir engineers will benefit from the improved accuracy of the model. Oil displacement calculations were based on methods proposed in earlier research, in which the oil displacement rate is a function of cumulative oil steam ratio. The cumulative oil steam ratio is a function of overall thermal efficiency. Capture factor component formulae were presented, as well as charts of oil production rates and cumulative oil-steam ratios for various reservoirs. 13 refs., 4 tabs., 29 figs.

  8. Coupling an analytical description of anti-scatter grids with simulation software of radiographic systems using Monte Carlo code

    International Nuclear Information System (INIS)

    Rinkel, J.; Dinten, J.M.; Tabary, J.

    2004-01-01

    The use of focused anti-scatter grids on digital radiographic systems with two-dimensional detectors produces acquisitions with a decreased scatter to primary ratio and thus improved contrast and resolution. Simulation software is of great interest in optimizing grid configuration according to a specific application. Classical simulators are based on complete detailed geometric descriptions of the grid. They are accurate but very time consuming since they use Monte Carlo code to simulate scatter within the high-frequency grids. We propose a new practical method which couples an analytical simulation of the grid interaction with a radiographic system simulation program. First, a two dimensional matrix of probability depending on the grid is created offline, in which the first dimension represents the angle of impact with respect to the normal to the grid lines and the other the energy of the photon. This matrix of probability is then used by the Monte Carlo simulation software in order to provide the final scattered flux image. To evaluate the gain of CPU time, we define the increasing factor as the increase of CPU time of the simulation with as opposed to without the grid. Increasing factors were calculated with the new model and with classical methods representing the grid with its CAD model as part of the object. With the new method, increasing factors are shorter by one to two orders of magnitude compared with the second one. These results were obtained with a difference in calculated scatter of less than five percent between the new and the classical method. (authors)
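
    The speedup described in this record comes from reducing the grid to a precomputed 2D probability matrix indexed by incidence angle and photon energy, which the Monte Carlo transport loop then samples instead of tracking photons through the grid geometry. The sketch below illustrates that lookup-table idea only; the transmission probabilities are a smooth synthetic placeholder, not a characterization of any real anti-scatter grid.

```python
# Offline 2D transmission-probability table (angle x energy) for a grid, and a
# sampling helper a Monte Carlo loop could call instead of tracking the grid.
import numpy as np

angles = np.linspace(0.0, 30.0, 31)           # incidence angle bins [deg] (assumed)
energies = np.linspace(20.0, 120.0, 51)       # photon energy bins [keV] (assumed)
A, E = np.meshgrid(angles, energies, indexing="ij")
# placeholder model: transmission drops with angle, rises slowly with energy
p_transmit = np.clip(np.exp(-A / 10.0) * (0.5 + 0.004 * E), 0.0, 1.0)

def grid_transmits(angle_deg, energy_kev, rng):
    """Sample whether a photon with the given angle/energy passes the grid."""
    i = np.clip(np.searchsorted(angles, angle_deg, side="right") - 1, 0, len(angles) - 1)
    j = np.clip(np.searchsorted(energies, energy_kev, side="right") - 1, 0, len(energies) - 1)
    return rng.random() < p_transmit[i, j]

rng = np.random.default_rng(3)
kept = sum(grid_transmits(12.0, 70.0, rng) for _ in range(10000))
print("fraction of 70 keV photons at 12 deg transmitted:", kept / 10000.0)
```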

  9. Milestone M4900: Simulant Mixing Analytical Results

    Energy Technology Data Exchange (ETDEWEB)

    Kaplan, D.I.

    2001-07-26

    This report addresses Milestone M4900, "Simulant Mixing Sample Analysis Results," and contains the data generated during the "Mixing of Process Heels, Process Solutions, and Recycle Streams: Small-Scale Simulant" task. The Task Technical and Quality Assurance Plan for this task is BNF-003-98-0079A. A report with a narrative description and discussion of the data will be issued separately.

  10. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  11. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  12. Crystal growth of pure substances: Phase-field simulations in comparison with analytical and experimental results

    Science.gov (United States)

    Nestler, B.; Danilov, D.; Galenko, P.

    2005-07-01

    A phase-field model for non-isothermal solidification in multicomponent systems [SIAM J. Appl. Math. 64 (3) (2004) 775-799] consistent with the formalism of classic irreversible thermodynamics is used for numerical simulations of crystal growth in a pure material. The relation of this approach to the phase-field model by Bragard et al. [Interface Science 10 (2-3) (2002) 121-136] is discussed. 2D and 3D simulations of dendritic structures are compared with the analytical predictions of the Brener theory [Journal of Crystal Growth 99 (1990) 165-170] and with recent experimental measurements of solidification in pure nickel [Proceedings of the TMS Annual Meeting, March 14-18, 2004, pp. 277-288; European Physical Journal B, submitted for publication]. 3D morphology transitions are obtained for variations in surface energy and kinetic anisotropies at different undercoolings. In computations, we investigate the convergence behaviour of a standard phase-field model and of its thin interface extension at different undercoolings and at different ratios between the diffuse interface thickness and the atomistic capillary length. The influence of the grid anisotropy is accurately analyzed for a finite difference method and for an adaptive finite element method in comparison.

  13. Advancing Concentrating Solar Power Research (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2014-02-01

    Researchers at the National Renewable Energy Laboratory (NREL) provide scientific, engineering, and analytical expertise to help advance innovation in concentrating solar power (CSP). This fact sheet summarizes how NREL is advancing CSP research.

  14. A computer program for estimating the power-density spectrum of advanced continuous simulation language generated time histories

    Science.gov (United States)

    Dunn, H. J.

    1981-01-01

    A computer program for performing frequency analysis of time history data is presented. The program uses circular convolution and the fast Fourier transform to calculate the power density spectrum (PDS) of time history data. The program interfaces with the advanced continuous simulation language (ACSL) so that a frequency analysis may be performed on ACSL-generated simulation variables. An example of the calculation of the PDS of a Van der Pol oscillator is presented.
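    The original program is ACSL- and Fortran-era and is not reproduced in the record; as a rough modern analogue of the same workflow - integrate a Van der Pol oscillator, then estimate the power density spectrum of the resulting time history with an FFT-based method - a short Python sketch using SciPy (not ACSL) might look like this. The sampling rate, integration settings, and oscillator parameter are arbitrary choices.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.signal import periodogram

mu = 1.0  # Van der Pol damping parameter (arbitrary)

def van_der_pol(t, y):
    x, v = y
    return [v, mu * (1.0 - x**2) * v - x]

fs = 100.0                                   # sampling rate of the stored history, Hz
t_eval = np.arange(0.0, 200.0, 1.0 / fs)
sol = solve_ivp(van_der_pol, (0.0, 200.0), [2.0, 0.0], t_eval=t_eval, max_step=0.01)

# FFT-based estimate of the power density spectrum of x(t).
freqs, psd = periodogram(sol.y[0], fs=fs, window="hann", detrend="linear")
print(f"dominant frequency ~ {freqs[np.argmax(psd[1:]) + 1]:.3f} Hz")
```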

  15. Study on advancement of in vivo counting using mathematical simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kinase, Sakae [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-05-01

    To obtain an assessment of the committed effective dose, individual monitoring for the estimation of intakes of radionuclides is required. For individual monitoring of exposure to intakes of radionuclides, direct measurement of radionuclides in the body - in vivo counting - is very useful. To advance precision in vivo counting that fulfills the requirements of the ICRP 1990 recommendations, some problems, such as the investigation of uncertainties in estimates of body burdens by in vivo counting, and the selection of the way to improve the precision, have been studied. In the present study, a calibration technique for in vivo counting applications using Monte Carlo simulation was developed. The advantage of the technique is that counting efficiency can be obtained for various shapes and sizes that are very difficult to change for physical phantoms. To validate the calibration technique, the response functions and counting efficiencies of a whole-body counter installed in JAERI were evaluated using the simulation and measurements. Consequently, the calculations are in good agreement with the measurements. The method for the determination of counting efficiency curves as a function of energy was developed using the present technique, and a physique correction equation was derived from the relationship between parameters of the correction factor and counting efficiencies of the JAERI whole-body counter. The uncertainties in body burdens of 137Cs estimated with the JAERI whole-body counter were also investigated using the Monte Carlo simulation and measurements. It was found that the uncertainties of body burdens estimated with the whole-body counter are strongly dependent on various sources of uncertainty such as the radioactivity distribution within the body and counting statistics. Furthermore, the evaluation method of the peak efficiencies of a Ge semi-conductor detector was developed by Monte Carlo simulation for optimum arrangement of Ge semi-conductor detectors for

  16. Recent topics in differential and analytic geometry

    CERN Document Server

    Ochiai, T

    1990-01-01

    Advanced Studies in Pure Mathematics, Volume 18-I: Recent Topics in Differential and Analytic Geometry presents the developments in the field of analytical and differential geometry. This book provides some generalities about bounded symmetric domains.Organized into two parts encompassing 12 chapters, this volume begins with an overview of harmonic mappings and holomorphic foliations. This text then discusses the global structures of a compact Kähler manifold that is locally decomposable as an isometric product of Ricci-positive, Ricci-negative, and Ricci-flat parts. Other chapters con

  17. A simple analytical scaling method for a scaled-down test facility simulating SB-LOCAs in a passive PWR

    International Nuclear Information System (INIS)

    Lee, Sang Il

    1992-02-01

    A simple analytical scaling method is developed for a scaled-down test facility simulating SB-LOCAs in a passive PWR. The whole scenario of a SB-LOCA is divided into two phases on the basis of the pressure trend: the depressurization phase and the pot-boiling phase. The pressure and the core mixture level are selected as the most critical parameters to be preserved between the prototype and the scaled-down model. In each phase the highly important phenomena influencing the critical parameters are identified, and the scaling parameters governing these phenomena are generated by the present method. To validate the models used, the Marviken CFT and the 336-rod-bundle experiment are simulated. The models overpredict both the pressure and the two-phase mixture level, but show at least qualitative agreement with the experimental results. In order to validate whether the scaled-down model well represents the important phenomena, we simulate the nondimensional pressure response of a cold-leg 4-inch break transient for AP-600 and the scaled-down model. The results of the present method are in excellent agreement with those of AP-600. It can be concluded that the present method is suitable for scaling the test facility simulating SB-LOCAs in a passive PWR

  18. Accelerating Project and Process Improvement using Advanced Software Simulation Technology: From the Office to the Enterprise

    Science.gov (United States)

    2010-04-29

    Larry Smith, Software Technology Support Center, 517 SMXS/MXDEA, 6022 Fir Avenue, Hill AFB, UT 84056; report dated 2010.

  19. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    Science.gov (United States)

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation image. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, the recent advances in CT imaging technique and 3D visualization of the hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction technique, contrast-enhanced techniques, new application of advanced CT scan techniques, and new virtual reality simulation and navigation imaging. © 2014 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  20. Advances in snow cover distributed modelling via ensemble simulations and assimilation of satellite data

    Science.gov (United States)

    Revuelto, J.; Dumont, M.; Tuzet, F.; Vionnet, V.; Lafaysse, M.; Lecourt, G.; Vernay, M.; Morin, S.; Cosme, E.; Six, D.; Rabatel, A.

    2017-12-01

    Nowadays snowpack models show a good capability in simulating the evolution of snow in mountain areas. However, singular deviations of the meteorological forcing and shortcomings in the modelling of snow physical processes, when accumulated over a snow season, can produce large deviations from the real snowpack state. These deviations are usually assessed with on-site observations from automatic weather stations. Nevertheless, the location of these stations can strongly influence the results of such evaluations, since local topography may have a marked influence on snowpack evolution. Although evaluations of snowpack models against automatic weather stations usually reveal good results, there is a lack of large-scale evaluations of simulation results on heterogeneous alpine terrain subject to local topographic effects. This work firstly presents a complete evaluation of the detailed snowpack model Crocus over an extended mountain area, the Arve upper catchment (western European Alps). This catchment has a wide elevation range with a large area above 2000 m a.s.l. and/or glaciated. The evaluation compares results obtained with distributed and semi-distributed simulations (the latter currently used in operational forecasting). Daily observations of the snow-covered area from the MODIS satellite sensor, the seasonal glacier surface mass balance evolution measured at more than 65 locations, and the glaciers' annual equilibrium line altitude from Landsat/Spot/Aster satellites have been used for model evaluation. Additionally, the latest advances in producing ensemble snowpack simulations for assimilating satellite reflectance data over extended areas will be presented. These advances comprise the generation of an ensemble of downscaled high-resolution meteorological forcing from meso-scale meteorological models and the application of a particle filter scheme for assimilating satellite observations. Although the results are preliminary, they show a good
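    The ensemble and assimilation machinery described above is not detailed in the record; purely as an illustration of the kind of particle-filter update step mentioned (weighting an ensemble of snowpack states by the likelihood of a satellite observation and resampling), here is a schematic Python sketch. The state variable, the observation operator, the error level, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

n_particles = 100
# Hypothetical ensemble of snowpack states (snow water equivalent in mm),
# e.g. produced by running the model with perturbed meteorological forcings.
particles = rng.normal(loc=250.0, scale=60.0, size=n_particles)

def observation_operator(swe):
    """Invented mapping from model state to a satellite-like reflectance."""
    return 0.9 - 0.4 * np.exp(-swe / 150.0)

def assimilate(particles, obs, obs_sigma=0.02):
    """One bootstrap particle-filter step: weight by likelihood, then resample."""
    predicted = observation_operator(particles)
    weights = np.exp(-0.5 * ((obs - predicted) / obs_sigma) ** 2)
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

observed_reflectance = 0.82
particles = assimilate(particles, observed_reflectance)
print(f"posterior SWE estimate: {particles.mean():.1f} +/- {particles.std():.1f} mm")
```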

  1. Advanced Research and Education in Electrical Drives by Using Digital Real-Time Hardware-in-the-Loop Simulation

    DEFF Research Database (Denmark)

    Bojoi, R.; Profumo, F.; Griva, G.

    2002-01-01

    The authors present in this paper a digital real-time hardware-in-the-loop simulation of a three-phase induction motor drive. The main real-time simulation tool is the dSPACE DS1103 PPC Controller Board which simulates the power and signal conditioning parts. The control algorithm of the virtual...... drive has been implemented on the Evaluation Board of the TMS320F240 DSP. The experimental results validate this solution as a powerful tool to be used in research and advanced education. Thus, the students can put the theory into practice without spending too much time on details concerning the hardware...

  2. Using Activity Metrics for DEVS Simulation Profiling

    Directory of Open Access Journals (Sweden)

    Muzy A.

    2014-01-01

    Activity metrics can be used to profile DEVS models before and during the simulation. It is critical to get good activity metrics of models before and during their simulation. Having a means to compute the a-priori activity of components (analytic activity) may be worthwhile when simulating a model (or parts of it) for the first time. Afterwards, during the simulation, the analytic activity can be corrected using the dynamic one. In this paper, we introduce the McCabe cyclomatic complexity metric (MCA) to compute analytic activity. Both static and simulation activity metrics have been implemented through a plug-in of the DEVSimPy (DEVS Simulator in Python language) environment and applied to DEVS models.
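    The DEVSimPy plug-in is not shown in the record; as a toy illustration of the static side of the idea - deriving a McCabe-style cyclomatic complexity count for a transition function and using it as an a-priori activity estimate - consider the following Python sketch. The counting rules are simplified and the example transition function is invented.

```python
import ast
import inspect

def cyclomatic_complexity(func):
    """Simplified McCabe count: 1 + number of branching constructs."""
    tree = ast.parse(inspect.getsource(func))
    branch_nodes = (ast.If, ast.For, ast.While, ast.And, ast.Or,
                    ast.ExceptHandler, ast.IfExp)
    return 1 + sum(isinstance(node, branch_nodes) for node in ast.walk(tree))

def delta_int(state):
    # Example internal transition function of a DEVS atomic model (invented).
    if state["phase"] == "busy":
        state["queue"] -= 1
        if state["queue"] > 0:
            state["phase"] = "busy"
        else:
            state["phase"] = "idle"
    return state

print(cyclomatic_complexity(delta_int))  # -> 3
```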

  3. Analytical system dynamics modeling and simulation

    CERN Document Server

    Fabien, Brian C

    2008-01-01

    This book offers a modeling technique based on Lagrange's energy method and includes 125 worked examples. Using this technique, one can model and simulate systems as diverse as a six-link, closed-loop mechanism or a transistor power amplifier.

  4. Big Data Analytics for Industrial Process Control

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schioler, Henrik; Kulahci, Murat

    2017-01-01

    Today, in modern factories, each step in manufacturing produces a bulk of valuable as well as highly precise information. This provides a great opportunity for understanding the hidden statistical dependencies in the process. Systematic analysis and utilization of advanced analytical methods can...

  5. Optical trapping for analytical biotechnology.

    Science.gov (United States)

    Ashok, Praveen C; Dholakia, Kishan

    2012-02-01

    We describe the exciting advances of using optical trapping in the field of analytical biotechnology. This technique has opened up opportunities to manipulate biological particles at the single cell or even at subcellular levels which has allowed an insight into the physical and chemical mechanisms of many biological processes. The ability of this technique to manipulate microparticles and measure pico-Newton forces has found several applications such as understanding the dynamics of biological macromolecules, cell-cell interactions and the micro-rheology of both cells and fluids. Furthermore we may probe and analyse the biological world when combining trapping with analytical techniques such as Raman spectroscopy and imaging. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. A numerical investigation on the efficiency of range extending systems using Advanced Vehicle Simulator

    Science.gov (United States)

    Varnhagen, Scott; Same, Adam; Remillard, Jesse; Park, Jae Wan

    2011-03-01

    Series plug-in hybrid electric vehicles of varying engine configuration and battery capacity are modeled using the Advanced Vehicle Simulator (ADVISOR). The performance of these vehicles is analyzed on the basis of energy consumption and greenhouse gas emissions on the tank-to-wheel and well-to-wheel paths. Both city and highway driving conditions are considered during the simulation. When simulated on the well-to-wheel path, it is shown that the range extender with a Wankel rotary engine consumes less energy and emits fewer greenhouse gases compared to the other systems with reciprocating engines during many driving cycles. The rotary engine has a higher power-to-weight ratio and lower noise, vibration and harshness compared to conventional reciprocating engines, although it performs less efficiently. The benefits of a Wankel engine make it an attractive option for use as a range extender in a plug-in hybrid electric vehicle.

  7. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  8. Impact of an Advanced Cardiac Life Support Simulation Laboratory Experience on Pharmacy Student Confidence and Knowledge.

    Science.gov (United States)

    Maxwell, Whitney D; Mohorn, Phillip L; Haney, Jason S; Phillips, Cynthia M; Lu, Z Kevin; Clark, Kimberly; Corboy, Alex; Ragucci, Kelly R

    2016-10-25

    Objective. To assess the impact of an advanced cardiac life support (ACLS) simulation on pharmacy student confidence and knowledge. Design. Third-year pharmacy students participated in a simulation experience that consisted of team roles training, high-fidelity ACLS simulations, and debriefing. Students completed a pre/postsimulation confidence and knowledge assessment. Assessment. Overall, student knowledge assessment scores and student confidence scores improved significantly. Student confidence and knowledge changes from baseline were not significantly correlated. Conversely, a significant, weak positive correlation between presimulation studying and both presimulation confidence and presimulation knowledge was discovered. Conclusions. Overall, student confidence and knowledge assessment scores in ACLS significantly improved from baseline; however, student confidence and knowledge were not significantly correlated.

  9. Unpacking complexities of managerial subjectivity: An analytic fixation on constitutive dynamics

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2012-01-01

    , and the analytic challenges of discourse/Discourse-distinctions and avoiding agency-structure-dualism. This paper proposes an integral conceptualization of subjectification that directs analytic attention to the complex constitutive dynamics of organizational discourses and agency normative to organizational...... is discussed with a case-study of public managers in collaborative governance processes in the Danish day-care sector. With complex-sensitive analytics the paper contributes to the ‘plurivocal’ debate on advancing organizational discourse approaches....

  10. Configuration and validation of an analytical model predicting secondary neutron radiation in proton therapy using Monte Carlo simulations and experimental measurements.

    Science.gov (United States)

    Farah, J; Bonfrate, A; De Marzi, L; De Oliveira, A; Delacroix, S; Martinetti, F; Trompier, F; Clairand, I

    2015-05-01

    This study focuses on the configuration and validation of an analytical model predicting leakage neutron doses in proton therapy. Using Monte Carlo (MC) calculations, a facility-specific analytical model was built to reproduce out-of-field neutron doses while separately accounting for the contribution of intra-nuclear cascade, evaporation, epithermal and thermal neutrons. This model was first trained to reproduce in-water neutron absorbed doses and in-air neutron ambient dose equivalents, H*(10), calculated using MCNPX. Its capacity in predicting out-of-field doses at any position not involved in the training phase was also checked. The model was next expanded to enable a full 3D mapping of H*(10) inside the treatment room, tested in a clinically relevant configuration and finally consolidated with experimental measurements. Following the literature approach, the work first proved that it is possible to build a facility-specific analytical model that efficiently reproduces in-water neutron doses and in-air H*(10) values with a maximum difference less than 25%. In addition, the analytical model succeeded in predicting out-of-field neutron doses in the lateral and vertical direction. Testing the analytical model in clinical configurations proved the need to separate the contribution of internal and external neutrons. The impact of modulation width on stray neutrons was found to be easily adjustable while beam collimation remains a challenging issue. Finally, the model performance agreed with experimental measurements with satisfactory results considering measurement and simulation uncertainties. Analytical models represent a promising solution that substitutes for time-consuming MC calculations when assessing doses to healthy organs. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  11. X-ray optics simulation and beamline design for the APS upgrade

    Science.gov (United States)

    Shi, Xianbo; Reininger, Ruben; Harder, Ross; Haeffner, Dean

    2017-08-01

    The upgrade of the Advanced Photon Source (APS) to a Multi-Bend Achromat (MBA) will increase the brightness of the APS by between two and three orders of magnitude. The APS upgrade (APS-U) project includes a list of feature beamlines that will take full advantage of the new machine. Many of the existing beamlines will be also upgraded to profit from this significant machine enhancement. Optics simulations are essential in the design and optimization of these new and existing beamlines. In this contribution, the simulation tools used and developed at APS, ranging from analytical to numerical methods, are summarized. Three general optical layouts are compared in terms of their coherence control and focusing capabilities. The concept of zoom optics, where two sets of focusing elements (e.g., CRLs and KB mirrors) are used to provide variable beam sizes at a fixed focal plane, is optimized analytically. The effects of figure errors on the vertical spot size and on the local coherence along the vertical direction of the optimized design are investigated.
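    The analytical zoom-optics optimization itself is not given in the abstract; the thin-lens sketch below is only meant to illustrate the underlying idea that two focusing stages can trade focal length against demagnification while keeping the focus at a fixed plane. Distances, focal lengths, and the simple sign convention are invented and do not describe the APS-U layout.

```python
# Thin-lens sketch of a two-stage "zoom" layout: the first focusing element is
# tuned while the second is re-tuned so that the focus stays at a fixed plane,
# trading off the overall magnification (and hence the focal spot size).
# All distances are invented (metres along a hypothetical beamline).
z_source, z1, z2, z_focus = 0.0, 30.0, 60.0, 70.0

def zoom(f1):
    p1 = z1 - z_source
    q1 = 1.0 / (1.0 / f1 - 1.0 / p1)          # image distance of stage 1
    p2 = (z2 - z1) - q1                        # object distance for stage 2
    q2 = z_focus - z2                          # fixed final image distance
    f2 = 1.0 / (1.0 / p2 + 1.0 / q2)           # required stage-2 focal length
    magnification = (q1 / p1) * (q2 / p2)
    return f2, abs(magnification)

for f1 in (10.0, 12.0, 14.0):
    f2, m = zoom(f1)
    print(f"f1 = {f1:4.1f} m  ->  f2 = {f2:6.2f} m, |magnification| = {m:.3f}")
```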

  12. Analytical methods for the determination of some elements and Fe+2 to Fe+3 ratio in simulated sludges and Synroc formulations

    International Nuclear Information System (INIS)

    Lim, R.

    1981-10-01

    Analytical methods for the determination of Fe, Al, Mn, Ca, Ni, Na, Sr, Cs, Ti, and Ba in simulated sludges and Synroc formulations are discussed. These are the elements that may be determined by atomic absorption spectroscopy, AAS. AAS methods are complicated by the dissolution methods used. These problems are discussed. In addition, the method used for the determination of the Fe+2 to Fe+3 ratio is presented

  13. Exploring simulated early star formation in the context of the ultrafaint dwarf galaxies

    Science.gov (United States)

    Corlies, Lauren; Johnston, Kathryn V.; Wise, John H.

    2018-04-01

    Ultrafaint dwarf galaxies (UFDs) are typically assumed to have simple, stellar populations with star formation ending at reionization. Yet as the observations of these galaxies continue to improve, their star formation histories (SFHs) are revealed to be more complicated than previously thought. In this paper, we study how star formation, chemical enrichment, and mixing proceed in small, dark matter haloes at early times using a high-resolution, cosmological, hydrodynamical simulation. The goals are to inform the future use of analytic models and to explore observable properties of the simulated haloes in the context of UFD data. Specifically, we look at analytic approaches that might inform metal enrichment within and beyond small galaxies in the early Universe. We find that simple assumptions for modelling the extent of supernova-driven winds agree with the simulation on average, whereas inhomogeneous mixing and gas flows have a large effect on the spread in simulated stellar metallicities. In the context of the UFDs, this work demonstrates that simulations can form haloes with a complex SFH and a large spread in the metallicity distribution function within a few hundred Myr in the early Universe. In particular, bursty and continuous star formation are seen in the simulation and both scenarios have been argued from the data. Spreads in the simulated metallicities, however, remain too narrow and too metal-rich when compared to the UFDs. Future work is needed to help reduce these discrepancies and advance our interpretation of the data.

  14. Co-simulation of dynamic systems in parallel and serial model configurations

    International Nuclear Information System (INIS)

    Sweafford, Trevor; Yoon, Hwan Sik

    2013-01-01

    Recent advancements in simulation software and computation hardware make it feasible to simulate complex dynamic systems comprised of multiple submodels developed in different modeling languages. The so-called co-simulation enables one to study various aspects of a complex dynamic system with heterogeneous submodels in a cost-effective manner. Among several different model configurations for co-simulation, the synchronized parallel configuration is expected to expedite the simulation process by simulating multiple submodels concurrently on a multi-core processor. In this paper, computational accuracy as well as computation time are studied for three different co-simulation frameworks: integrated, serial, and parallel. For this purpose, analytical evaluations of the three different methods are made using the explicit Euler method, and they are then applied to two-DOF mass-spring systems. The results show that while the parallel simulation configuration produces the same accurate results as the integrated configuration, results of the serial configuration show a slight deviation. It is also shown that the computation time can be reduced by running the simulation in the parallel configuration. Therefore, it can be concluded that the synchronized parallel simulation methodology is the best for both simulation accuracy and time efficiency.
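    The paper's models are not reproduced in the record; the following Python sketch only illustrates, with an invented two-DOF mass-spring chain and explicit Euler, why a parallel (start-of-step) exchange of coupling variables reproduces the monolithic update exactly while a serial exchange introduces a small deviation, which is consistent with the accuracy trend reported above. Masses, stiffnesses, and step sizes are arbitrary.

```python
# Invented two-DOF chain: wall -- k1 -- m1 -- k2 -- m2, split into two submodels.
m1 = m2 = 1.0
k1 = k2 = 10.0
dt, steps = 1e-3, 5000

def simulate(mode):
    """Explicit Euler with two coupling schemes:
    'parallel' -> submodel 2 sees submodel 1's displacement from the step start
                  (identical to the monolithic explicit-Euler update here);
    'serial'   -> submodel 2 sees submodel 1's freshly updated displacement."""
    x1 = v1 = v2 = 0.0
    x2 = 0.1                                  # initial displacement of mass 2
    for _ in range(steps):
        x1_seen_by_2 = x1                     # coupling variable at step start
        a1 = (-k1 * x1 + k2 * (x2 - x1)) / m1
        x1, v1 = x1 + dt * v1, v1 + dt * a1   # submodel 1 advances first
        if mode == "serial":
            x1_seen_by_2 = x1                 # updated value leaks into submodel 2
        a2 = -k2 * (x2 - x1_seen_by_2) / m2
        x2, v2 = x2 + dt * v2, v2 + dt * a2
    return x1, x2

for mode in ("parallel", "serial"):
    print(mode, simulate(mode))
```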

  15. Recent Advances in Paper-Based Sensors

    Directory of Open Access Journals (Sweden)

    Edith Chow

    2012-08-01

    Paper-based sensors are a new alternative technology for fabricating simple, low-cost, portable and disposable analytical devices for many application areas including clinical diagnosis, food quality control and environmental monitoring. The unique properties of paper which allow passive liquid transport and compatibility with chemicals/biochemicals are the main advantages of using paper as a sensing platform. Depending on the main goal to be achieved in paper-based sensors, the fabrication methods and the analysis techniques can be tuned to fulfill the needs of the end-user. Current paper-based sensors are focused on microfluidic delivery of solution to the detection site whereas more advanced designs involve complex 3-D geometries based on the same microfluidic principles. Although paper-based sensors are very promising, they still suffer from certain limitations such as accuracy and sensitivity. However, it is anticipated that in the future, with advances in fabrication and analytical techniques, there will be more new and innovative developments in paper-based sensors. These sensors could better meet the current objectives of a viable low-cost and portable device in addition to offering high sensitivity and selectivity, and multiple analyte discrimination. This paper is a review of recent advances in paper-based sensors and covers the following topics: existing fabrication techniques, analytical methods and application areas. Finally, the present challenges and future outlooks are discussed.

  16. Order statistics and energy-ordered histograms: an analytical approach to continuum gamma-ray spectra

    International Nuclear Information System (INIS)

    Urrego, J.P.; Cristancho, F.

    2001-01-01

    Fusion-evaporation heavy ion collisions have enabled us to explore new regions of the E - I phase space, particularly high spin and excitation energy regions, where level densities are so high that modern detectors are unable to resolve individual gamma-ray transitions, and consequently the resulting spectrum is continuous and undoubtedly contains a lot of new physics. In spite of that, very few experiments have been designed to draw conclusions about the behavior of nuclei in the continuum; thus, in order to obtain continuum spectroscopy, it is necessary to resort to numerical simulations. In this sense GAMBLE - a Monte Carlo based code - is a powerful tool that, with some modifications, allows us to test a new method to analyze the outcome of experiments focused on the properties of phase space regions in the nuclear continuum: the use of Energy-Ordered Spectra (EOS). Suppose that an experiment collects all the gamma radiation emitted by a specific nucleus within a fixed intrinsic excitation energy range and that the different EOS are constructed. Although it has been shown that comparisons between such EOS and Monte Carlo simulations give information about the level density and the strength function, their interpretation is not straightforward because of the large number of input values needed in a code like GAMBLE. On the other hand, if we could have an analytical description of the EOS, the understanding of the underlying physics would be simpler because one could control exactly the involved variables and eventually simulation would be unnecessary. Promising advances in that direction come from the mathematical theory of Order Statistics (OS). In this work the modified code GAMBLE is described and some simulated EOS for 170Hf are shown. The simulations are made with different formulations for both the level density (Fermi gas at constant and variable temperature) and the gamma strength function (GDR, single particle). Further, it is described in detail how OS are employed in the
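    GAMBLE itself is not available here; as a toy illustration of what an energy-ordered spectrum is - the distribution of the k-th largest gamma-ray energy in each simulated cascade - the following Python sketch builds EOS from a completely artificial cascade model (Poisson multiplicity, exponential energies). It carries no nuclear physics and is only meant to make the construction concrete.

```python
import numpy as np

rng = np.random.default_rng(2)

def toy_cascade():
    """Artificial gamma cascade: a random number of transitions whose energies
    come from a simple statistical draw (no real level density or strength)."""
    multiplicity = rng.poisson(6) + 1
    return rng.exponential(scale=0.8, size=multiplicity)  # MeV

n_events = 20000
cascades = [np.sort(toy_cascade())[::-1] for _ in range(n_events)]

# Energy-ordered spectra: one histogram per order statistic, i.e. the spectrum
# of the highest-energy gamma of each event, of the second highest, and so on.
bins = np.linspace(0.0, 6.0, 121)
for order in range(3):
    energies = [c[order] for c in cascades if len(c) > order]
    counts, _ = np.histogram(energies, bins=bins)       # the EOS of this order
    peak = bins[np.argmax(counts)]
    print(f"order {order + 1}: {len(energies)} entries, "
          f"mean {np.mean(energies):.2f} MeV, most probable bin ~ {peak:.2f} MeV")
```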

  17. Discrete-event simulation of coordinated multi-point joint transmission in LTE-Advanced with constrained backhaul

    DEFF Research Database (Denmark)

    Artuso, Matteo; Christiansen, Henrik Lehrmann

    2014-01-01

    Inter-cell interference in LTE-Advanced can be mitigated using coordinated multi-point (CoMP) techniques with joint transmission of user data. However, this requires tight coordination of the eNodeBs, using the X2 interface. In this paper we use discrete-event simulation to evaluate the latency...... requirements for the X2 interface and investigate the consequences of a constrained backhaul. Our simulation results show a gain of the system throughput of up to 120% compared to the case without CoMP for low-latency backhaul. With X2 latencies above 5 ms CoMP is no longer a benefit to the network....

  18. Emerging Cyber Infrastructure for NASA's Large-Scale Climate Data Analytics

    Science.gov (United States)

    Duffy, D.; Spear, C.; Bowen, M. K.; Thompson, J. H.; Hu, F.; Yang, C. P.; Pierce, D.

    2016-12-01

    The resolution of NASA climate and weather simulations has grown dramatically over the past few years, with the highest-fidelity models reaching down to 1.5 km global resolutions. With each doubling of the resolution, the resulting data sets grow by a factor of eight in size. As the climate and weather models push the envelope even further, a new infrastructure to store data and provide large-scale data analytics is necessary. The NASA Center for Climate Simulation (NCCS) has deployed the Data Analytics Storage Service (DASS) that combines scalable storage with the ability to perform in-situ analytics. Within this system, large, commonly used data sets are stored in a POSIX file system (write once/read many); examples of data stored include Landsat, MERRA2, observing system simulation experiments, and high-resolution downscaled reanalysis. The total size of this repository is on the order of 15 petabytes of storage. In addition to the POSIX file system, the NCCS has deployed file system connectors to enable emerging analytics built on top of the Hadoop File System (HDFS) to run on the same storage servers within the DASS. Coupled with a custom spatiotemporal indexing approach, users can now run emerging analytical operations built on MapReduce and Spark on the same data files stored within the POSIX file system without having to make additional copies. This presentation will discuss the architecture of this system and present benchmark performance measurements from traditional TeraSort and Wordcount to large-scale climate analytical operations on NetCDF data.

  19. Big Data Analytics in Chemical Engineering.

    Science.gov (United States)

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-06-07

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.

  20. An Advanced Simulation Framework for Parallel Discrete-Event Simulation

    Science.gov (United States)

    Li, P. P.; Tyrrell, R. Yeung D.; Adhami, N.; Li, T.; Henry, H.

    1994-01-01

    Discrete-event simulation (DEVS) users have long been faced with a three-way trade-off of balancing execution time, model fidelity, and number of objects simulated. Because of the limits of computer processing power the analyst is often forced to settle for less than desired performances in one or more of these areas.

  1. Analytical reverse time migration: An innovation in imaging of infrastructures using ultrasonic shear waves.

    Science.gov (United States)

    Asadollahi, Aziz; Khazanovich, Lev

    2018-04-11

    The emergence of ultrasonic dry point contact (DPC) transducers that emit horizontal shear waves has enabled efficient collection of high-quality data in the context of a nondestructive evaluation of concrete structures. This offers an opportunity to improve the quality of evaluation by adapting advanced imaging techniques. Reverse time migration (RTM) is a simulation-based reconstruction technique that offers advantages over conventional methods, such as the synthetic aperture focusing technique. RTM is capable of imaging boundaries and interfaces with steep slopes and the bottom boundaries of inclusions and defects. However, this imaging technique requires a massive amount of memory and its computation cost is high. In this study, both bottlenecks of the RTM are resolved when shear transducers are used for data acquisition. An analytical approach was developed to obtain the source and receiver wavefields needed for imaging using reverse time migration. It is shown that the proposed analytical approach not only eliminates the high memory demand, but also drastically reduces the computation time from days to minutes. Copyright © 2018 Elsevier B.V. All rights reserved.
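    The analytical wavefield computation that removes RTM's memory and run-time bottlenecks is the paper's contribution and is not reproduced here; the sketch below only recalls the generic zero-lag cross-correlation imaging condition that any RTM variant ultimately evaluates, using random placeholder arrays in place of real source and receiver wavefields.

```python
import numpy as np

rng = np.random.default_rng(3)

# Placeholder wavefields on a 2-D grid over nt time samples. In practice the
# source wavefield S and the back-propagated receiver wavefield R would come
# from (analytical or numerical) wave simulations, not random numbers.
nt, nz, nx = 400, 64, 64
S = rng.standard_normal((nt, nz, nx))
R = rng.standard_normal((nt, nz, nx))

# Zero-lag cross-correlation imaging condition: reflectors appear where the
# forward-propagated source field and the time-reversed receiver field coincide.
image = np.einsum("tij,tij->ij", S, R)

# Optional source-illumination normalization to balance amplitudes.
illumination = np.einsum("tij,tij->ij", S, S)
image_normalized = image / (illumination + 1e-12)
print(image_normalized.shape)
```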

  2. Earthquake simulation, actual earthquake monitoring and analytical methods for soil-structure interaction investigation

    Energy Technology Data Exchange (ETDEWEB)

    Tang, H T [Seismic Center, Electric Power Research Institute, Palo Alto, CA (United States)

    1988-07-01

    Approaches for conducting in-situ soil-structure interaction experiments are discussed. High explosives detonated under the ground can generate strong ground motion to induce soil-structure interaction (SSI). The explosive-induced data are useful in studying the dynamic characteristics of the soil-structure system associated with the inertial aspect of the SSI problem. The plane waves generated by the explosives cannot adequately address the kinematic interaction associated with actual earthquakes because of the difference in wave fields and their effects. Earthquake monitoring is ideal for obtaining SSI data that can address all aspects of the SSI problem. The only limitation is the level of excitation that can be obtained. Neither the simulated earthquake experiments nor the earthquake monitoring experiments can have exact similitude if reduced-scale test structures are used. If gravity effects are small, reasonable correlations between the scaled model and the prototype can be obtained provided that the input motion can be scaled appropriately. The key product of the in-situ experiments is the database that can be used to qualify analytical methods for prototypical applications. (author)

  3. Advanced Hydroinformatic Techniques for the Simulation and Analysis of Water Supply and Distribution Systems

    OpenAIRE

    Herrera, Manuel; Meniconi, Silvia; Alvisi, Stefano; Izquierdo, Joaquin

    2018-01-01

    This document is intended to be a presentation of the Special Issue “Advanced Hydroinformatic Techniques for the Simulation and Analysis of Water Supply and Distribution Systems”. The final aim of this Special Issue is to propose a suitable framework supporting insightful hydraulic mechanisms to aid the decision-making processes of water utility managers and practitioners. Its 18 peer-reviewed articles present topics as varied as: water distribution system design, optimization of network perf...

  4. Analytical methodologies for aluminium speciation in environmental and biological samples--a review.

    Science.gov (United States)

    Bi, S P; Yang, X D; Zhang, F P; Wang, X L; Zou, G W

    2001-08-01

    It is recognized that aluminium (Al) is a potential environmental hazard. Acidic deposition has been linked to increased Al concentrations in natural waters. Elevated levels of Al might have serious consequences for biological communities. Of particular interest is the speciation of Al in aquatic environments, because Al toxicity depends on its forms and concentrations. In this paper, advances in analytical methodologies for Al speciation in environmental and biological samples during the past five years are reviewed. Concerns about the specific problems of Al speciation and highlights of some important methods are elucidated in sections devoted to hybrid techniques (HPLC or FPLC coupled with ET-AAS, ICP-AES, or ICP-MS), flow-injection analysis (FIA), nuclear magnetic resonance (27Al NMR), electrochemical analysis, and computer simulation. More than 130 references are cited.

  5. A development of simulation and analytical program for through-diffusion experiments for a single layer of diffusion media

    International Nuclear Information System (INIS)

    Sato, Haruo

    2001-01-01

    A program (TDROCK1.FOR) for simulation and analysis of through-diffusion experiments for a single layer of diffusion media was developed. The program was written in the Pro-Fortran language, which is suitable for scientific and technical calculations, and the relatively simple explicit difference method was adopted for the analysis. In the analysis, the solute concentration in the tracer cell as a function of time, which could not be treated to date, can be input, and the decrease in the solute concentration with time by diffusion from the tracer cell to the measurement cell, the solute concentration distribution in the porewater of the diffusion media, and the solute concentration in the measurement cell as a function of time can be calculated. In addition, the solution volume in both cells and the diameter and thickness of the diffusion media are also variable as input conditions. The simulation program could well reproduce measured results by simulating the solute concentration in the measurement cell as a function of time for a case in which the apparent and effective diffusion coefficients were already known. Based on this, the availability and applicability of this program to actual analysis and simulation were confirmed. This report describes the theoretical treatment of through-diffusion experiments for a single layer of diffusion media, the analytical model, an example of the source program, and the manual. (author)
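    TDROCK1.FOR itself is not reproduced in the record; the following Python sketch is only a schematic analogue of an explicit finite-difference through-diffusion calculation with a finite tracer cell, a single diffusion layer, and a measurement cell. The geometry, effective diffusion coefficient, capacity factor, and cell volumes are invented, and the boundary treatment is deliberately simplified.

```python
import numpy as np

# Schematic through-diffusion setup: tracer cell | porous sample | measurement cell.
L = 0.005                   # sample thickness, m (invented)
n = 50                      # grid cells across the sample
De = 1.0e-10                # effective diffusion coefficient, m^2/s (invented)
alpha = 0.3                 # rock capacity factor (invented)
A = 1.0e-3                  # cross-sectional area, m^2 (invented)
V_tracer = V_meas = 1.0e-4  # reservoir volumes, m^3 (invented)

dx = L / n
dt = 0.4 * alpha * dx**2 / De          # within the explicit stability limit

c = np.zeros(n)                        # pore-water concentration in the sample
c_tracer, c_meas = 1.0, 0.0            # normalized reservoir concentrations

steps = int(30 * 24 * 3600 / dt)       # 30 days of simulated time
for _ in range(steps):
    c_ext = np.concatenate(([c_tracer], c, [c_meas]))
    flux = -De * np.diff(c_ext) / dx                   # Fickian flux at interfaces
    c += (dt / (alpha * dx)) * (flux[:-1] - flux[1:])  # explicit update of the sample
    c_tracer -= dt * A * flux[0] / V_tracer            # tracer cell loses solute
    c_meas += dt * A * flux[-1] / V_meas               # measurement cell gains solute

print(f"after 30 days: tracer cell {c_tracer:.3f}, measurement cell {c_meas:.3f}")
```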

  6. Advancement of DOE's EnergyPlus Building Energy Simulation Program

    Energy Technology Data Exchange (ETDEWEB)

    Gu, Lixing [Florida Solar Energy Center, Cocoa, FL (United States); Shirey, Don [Florida Solar Energy Center, Cocoa, FL (United States); Raustad, Richard [Florida Solar Energy Center, Cocoa, FL (United States); Nigusse, Bereket [Florida Solar Energy Center, Cocoa, FL (United States); Sharma, Chandan [Florida Solar Energy Center, Cocoa, FL (United States); Lawrie, Linda [DHL Consulting, Bonn (Germany); Strand, Rick [Univ. of Illinois, Champaign, IL (United States); Pedersen, Curt [COPA, Panama City (Panama); Fisher, Dan [Oklahoma State Univ., Stillwater, OK (United States); Lee, Edwin [Oklahoma State Univ., Stillwater, OK (United States); Witte, Mike [GARD Analytics, Arlington Heights, IL (United States); Glazer, Jason [GARD Analytics, Arlington Heights, IL (United States); Barnaby, Chip [Wrightsoft, Lexington, MA (United States)

    2011-09-30

    EnergyPlus™ is a new generation computer software analysis tool that has been developed, tested, and commercialized to support DOE's Building Technologies (BT) Program in terms of whole-building, component, and systems R&D (http://www.energyplus.gov). It is also being used to support evaluation and decision making of zero energy building (ZEB) energy efficiency and supply technologies during new building design and existing building retrofits. The 5-year project was managed by the National Energy Technology Laboratory and was divided into 5 budget periods between 2006 and 2011. During the project period, 11 versions of EnergyPlus were released. This report summarizes work performed by an EnergyPlus development team led by the University of Central Florida's Florida Solar Energy Center (UCF/FSEC). The team members consist of DHL Consulting, C. O. Pedersen Associates, University of Illinois at Urbana-Champaign, Oklahoma State University, GARD Analytics, Inc., and WrightSoft Corporation. The project tasks involved new feature development, testing and validation, user support and training, and general EnergyPlus support. The team developed 146 new features during the 5-year period to advance the EnergyPlus capabilities. Annual contributions of new features are 7 in budget period 1, 19 in period 2, 36 in period 3, 41 in period 4, and 43 in period 5, respectively. The testing and validation task focused on running the test suite and publishing reports, developing new IEA test suite cases, testing and validating new source code, addressing change requests, and creating and testing the installation package. The user support and training task provided support for users and interface developers, and organized and taught workshops. The general support task involved upgrading StarTeam (team sharing) software and updating existing utility software. The project met the DOE objectives and completed all tasks successfully. Although the EnergyPlus software was enhanced

  7. Analytical control in metallurgical processes

    International Nuclear Information System (INIS)

    Coedo, A.G.; Dorado, M.T.; Padilla, I.

    1998-01-01

    This paper illustrates the role of analysis in enabling the metallurgical industry to meet quality demands. For example, the steel industry faces demands from the automotive, aerospace, power generation and tinplate packaging industries, as well as environmental issues near steel plants. Although chemical analysis technology continues to advance, achieving improved speed, precision and accuracy at lower levels of detection, the competitiveness of manufacturing industry continues to drive property demands at least at the same rate. Narrower specification ranges, lower levels of residual elements and economic pressures prescribe faster process routes, all of which lead to increased demands on the analytical function. These demands are illustrated by examples from several market sectors in which customer issues are considered together with their analytical implications. (Author) 5 refs.

  8. Analytical model of impedance in elliptical beam pipes

    CERN Document Server

    Pesah, Arthur Chalom

    2017-01-01

    Beam instabilities are among the main limitations in building higher intensity accelerators. Having a good impedance model for every accelerator is necessary in order to build components that minimize the probability of instabilities caused by the beam-environment interaction and to understand which component to change when the intensity is increased. Most accelerator components have their impedance simulated with the finite element method (using software such as CST Studio), but simple components such as circular or flat pipes are modeled analytically, with reduced computation time and increased precision compared to their simulated models. Elliptical beam pipes, while being simple components present in some accelerators, still lack a good analytical model working for the whole range of velocities and frequencies. In this report, we present a general framework to study the impedance of elliptical pipes analytically. We developed a model for both longitudinal and transverse impedance, first in the case of...
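    The elliptical-pipe model developed in the report is not reproduced in the record; for context, the sketch below just evaluates the classical thick-wall resistive-wall longitudinal impedance of a circular pipe, which is the kind of closed-form expression available for the simplest geometries (one common convention; material and geometry values are illustrative only).

```python
import numpy as np

# Classical thick-wall resistive-wall longitudinal impedance of a circular pipe,
# in one common convention: Z(omega) = (1 + 1j) * (L / (2*pi*b)) * sqrt(mu0*rho*omega/2).
mu0 = 4e-7 * np.pi          # vacuum permeability, H/m
rho = 1.7e-8                # resistivity of copper, Ohm*m
b = 0.02                    # pipe radius, m (illustrative)
L = 1.0                     # pipe length, m (illustrative)

def z_long(f_hz):
    omega = 2.0 * np.pi * f_hz
    return (1.0 + 1.0j) * (L / (2.0 * np.pi * b)) * np.sqrt(mu0 * rho * omega / 2.0)

for f in (1e6, 1e8, 1e10):
    z = z_long(f)
    print(f"f = {f:8.1e} Hz  ->  Z = {z.real:.4f} {z.imag:+.4f}j Ohm")
```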

  9. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    Science.gov (United States)

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are showing an increasing interest in the scientific community, especially in food safety, environmental, and forensic investigation areas where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at the early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications with particular attention to spray-based MS methods and food analysis issues. The survey will attempt to cover the state of the art from 2012 up to 2017.

  10. Advances in radiation-hydrodynamics and atomic physics simulation for current and new neutron-less targets

    International Nuclear Information System (INIS)

    Velarde, G.; Minguez, E.; Bravo, E.

    2003-01-01

    We present advances in advanced fusion cycles, atomic physics and radiation hydrodynamics. With the ARWEN code we analyze a target design for ICF based on jet production. ARWEN is a 2D adaptive mesh refinement fluid dynamics and multigroup radiation transport code. We are also designing, using ARWEN, a target for laboratory simulation of astrophysical phenomena. We feature an experimental device to reproduce collisions of two shock waves, scaled to roughly represent cosmic supernova remnants. Opacity calculations are obtained with the ANALOP code, which uses parametric potentials fitted to self-consistent potentials. It includes temperature and density effects through a linearized Debye-Hueckel treatment and it treats excited configurations and H+He-like lines. Advanced fusion cycles, such as the aneutronic proton-boron 11 reaction, require very high ignition temperatures. Plasma conditions for a fusion-burning wave to propagate at such temperatures are rather extreme and complex, because of the overlapping effects of the main energy transport mechanisms. Calculations on the most appropriate ICF regimes for this purpose are presented. (author)

  11. Recent advances in nuclear power plant simulation

    International Nuclear Information System (INIS)

    Zerbino, H.; Plisson, P.; Friant, J.Y.

    1997-01-01

    The field of industrial simulation has experienced very significant progress in recent years, and power plant simulation in particular has been an extremely active area. Improvements may be recorded in practically all simulator subsystems. In Europe, the construction of new full- or optimized-scope nuclear power plant simulators during the middle 1990's has been remarkably intense. In fact, it is possible to identify a distinct simulator generation, which constitutes a new de facto simulation standard. Thomson Training and Simulation has taken part in these developments by designing, building, and validating several of these new simulators for Dutch, German and French nuclear power plants. Their characteristics are discussed in this paper. The following main trends may be identified: Process modeling is clearly evolving towards obtaining engineering-grade performance, even under the added constraints of real-time operation and a very wide range of operating conditions to be covered; Massive use of modern graphic user interfaces (GUI) ensures an unprecedented flexibility and user-friendliness for the Instructor Station; The massive use of GUIs also allows the development of Trainee Stations (TS), which significantly enhance the in-depth training value of the simulators; The development of powerful Software Development Environments (SDE) enables the simulator maintenance teams to keep abreast of modifications carried out in the reference plants; Finally, simulator maintenance and its compliance with simulator fidelity requirements are greatly enhanced by integrated Configuration Management Systems (CMS). In conclusion, the power plant simulation field has attained a strong level of maturity, which befits its approximately forty years of service to the power generation industry. (author)

  12. Systemic Analysis, Mapping, Modeling, and Simulation of the Advanced Accelerator Applications Program

    International Nuclear Information System (INIS)

    Guan, Yue; Laidler, James J.; Morman, James A.

    2002-01-01

    Advanced chemical separations methods envisioned for use in the Department of Energy Advanced Accelerator Applications (AAA) program have been studied using the Systemic Analysis, Mapping, Modeling, and Simulation (SAMMS) method. This integrated and systematic method considers all aspects of the studied process as one dynamic and inter-dependent system. This particular study focuses on two subjects: the chemical separation processes for treating spent nuclear fuel, and the associated non-proliferation implications of such processing. Two levels of chemical separation models are developed: level 1 models treat the chemical process stages by groups; and level 2 models depict the details of each process stage. Models to estimate the proliferation risks based on proliferation barrier assessment are also developed. This paper describes the research conducted for the single-stratum design in the AAA program. Further research conducted for the multi-strata designs will be presented later. The method and models described in this paper can help in the design of optimized processes that fulfill the chemical separation process specifications and non-proliferation requirements. (authors)

  13. Simulation of sampling effects in FPAs

    Science.gov (United States)

    Cook, Thomas H.; Hall, Charles S.; Smith, Frederick G.; Rogne, Timothy J.

    1991-09-01

    The use of multiplexers and large focal plane arrays in advanced thermal imaging systems has drawn renewed attention to sampling and aliasing issues in imaging applications. As evidenced by discussions in a recent workshop, there is no clear consensus among experts whether aliasing in sensor designs can be readily tolerated, or must be avoided at all costs. Further, there is no straightforward, analytical method that can answer the question, particularly when considering image interpreters as different as humans and autonomous target recognizers (ATR). However, the means exist for investigating sampling and aliasing issues through computer simulation. The U.S. Army Tank-Automotive Command (TACOM) Thermal Image Model (TTIM) provides realistic sensor imagery that can be evaluated by both human observers and ATRs. This paper briefly describes the history and current status of TTIM, explains the simulation of FPA sampling effects, presents validation results of the FPA sensor model, and demonstrates the utility of TTIM for investigating sampling effects in imagery.
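
    The underlying sampling issue can be illustrated with a toy one-dimensional sketch (not part of TTIM, and the numbers are illustrative): a scene harmonic above the detector Nyquist frequency folds back to a lower apparent frequency once sampled by the focal plane array.

```python
import numpy as np

# Toy illustration of aliasing: a 1D "scene" harmonic sampled by a coarse
# detector grid appears at a folded (aliased) spatial frequency.
f_scene = 0.8          # scene frequency, cycles per detector pitch (> Nyquist = 0.5)
n_pixels = 64          # number of detector samples
x = np.arange(n_pixels)                  # detector centers, one pitch apart
samples = np.cos(2 * np.pi * f_scene * x)

# The sampled signal is indistinguishable from a cosine at the folded frequency.
f_alias = abs(f_scene - round(f_scene))  # 0.8 cycles/pitch folds to 0.2
aliased = np.cos(2 * np.pi * f_alias * x)
print("max difference between true samples and aliased cosine:",
      np.max(np.abs(samples - aliased)))
```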

  14. Application of advanced nuclear and instrumental analytical techniques for characterisation of environmental materials

    International Nuclear Information System (INIS)

    Sudersanan, M.; Pawaskar, P.B.; Kayasth, S.R.; Kumar, S.C.

    2002-01-01

    Full text: Increasing realisation about the toxic effects of metal ions in environmental materials has given an impetus to research on analytical techniques for their characterization. The large number of analytes present at very low levels has necessitated the use of sensitive, selective and element-specific techniques for their characterization. Concerns about the precision and accuracy of such analyses, which have socio-economic bearing, have emphasized the use of Certified Reference Materials and of a multi-technique approach for the unambiguous characterization of analytes. The recent work carried out at the Analytical Chemistry Division, BARC on these aspects is presented in this paper. Increasing use of fossil fuels has led to the generation of large quantities of fly ash, which poses problems of safe disposal. The utilization of these materials for land filling is an attractive option, but the presence of trace amounts of toxic metals like mercury, arsenic, lead, etc. may cause environmental problems. In view of the inhomogeneous nature of the material, efficient sample processing is an important factor, in addition to the validation of the results by the use of proper standards. Analysis was carried out on fly ash samples received as reference materials and also as samples from commercial sources, using a combination of nuclear techniques like INAA and RNAA as well as other techniques like AAS, ICPAES, cold vapour AAS for mercury and the hydride generation technique for arsenic. Similar analysis using nuclear techniques was employed for the characterization of air particulates. Biological materials often serve as sensitive indicator materials for pollution measurements. They are also employed for studies on the uptake of toxic metals like U, Th, Cd, Pb, Hg, etc. The presence of large amounts of organic materials in them necessitates an appropriate sample dissolution procedure. In view of the possibility of loss of certain analytes like Cd, Hg, As, by high

  15. Education and training for operators using a full scope simulator and its upgrading program in JOYO

    International Nuclear Information System (INIS)

    Sawada, Makoto; Terano, Toshihiro; Hunaki, Isao

    1996-01-01

    The JOYO full-scope operator training simulator, installed in 1983, is being used with a high average unit availability factor of more than 70% per annum. The education and training of operators using it has contributed greatly to the safe operation of the experimental fast reactor JOYO. The simulator, consisting mainly of five control panels, a computer system with two computers, and an instructor's console, is able to simulate plant behavior and sequential processes in real time under normal or anomalous conditions. In line with the JOYO MK-III project, which enhances the irradiation capability of JOYO, an upgrading program for the simulator is now proceeding, with the aim of advancing its efficient usage by improving the simulator's training functions and analytical accuracy. (author)

  16. Mission simulation as an approach to develop requirements for automation in Advanced Life Support Systems

    Science.gov (United States)

    Erickson, J. D.; Eckelkamp, R. E.; Barta, D. J.; Dragg, J.; Henninger, D. L. (Principal Investigator)

    1996-01-01

    This paper examines mission simulation as an approach to develop requirements for automation and robotics for Advanced Life Support Systems (ALSS). The focus is on requirements and applications for command and control, control and monitoring, situation assessment and response, diagnosis and recovery, adaptive planning and scheduling, and other automation applications in addition to mechanized equipment and robotics applications to reduce the excessive human labor requirements to operate and maintain an ALSS. Based on principles of systems engineering, an approach is proposed to assess requirements for automation and robotics using mission simulation tools. First, the story of a simulated mission is defined in terms of processes with attendant types of resources needed, including options for use of automation and robotic systems. Next, systems dynamics models are used in simulation to reveal the implications for selected resource allocation schemes in terms of resources required to complete operational tasks. The simulations not only help establish ALSS design criteria, but also may offer guidance to ALSS research efforts by identifying gaps in knowledge about procedures and/or biophysical processes. Simulations of a planned one-year mission with 4 crewmembers in a Human Rated Test Facility are presented as an approach to evaluation of mission feasibility and definition of automation and robotics requirements.

  17. Monte Carlo simulation applied to alpha spectrometry

    International Nuclear Information System (INIS)

    Baccouche, S.; Gharbi, F.; Trabelsi, A.

    2007-01-01

    Alpha particle spectrometry is a widely-used analytical method, in particular when we deal with pure alpha emitting radionuclides. Monte Carlo simulation is an adequate tool to investigate the influence of various phenomena on this analytical method. We performed an investigation of those phenomena using the simulation code GEANT of CERN. The results concerning the geometrical detection efficiency in different measurement geometries agree with analytical calculations. This work confirms that Monte Carlo simulation of solid angle of detection is a very useful tool to determine with very good accuracy the detection efficiency.
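
    As a hedged illustration of the geometrical part of such a study (not the GEANT calculation itself), the fractional solid angle subtended by a coaxial circular detector above a point source can be estimated by sampling isotropic emission directions and, for this simple on-axis case, checked against the closed-form result. The geometry values below are arbitrary examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_geometric_efficiency(det_radius, source_height, n=1_000_000):
    """Monte Carlo estimate of the fractional solid angle (geometrical
    detection efficiency) for a point source on the axis of a circular
    detector located at distance source_height."""
    # Isotropic emission over the upper hemisphere: cos(theta) uniform in [0, 1).
    cos_t = rng.random(n)
    sin_t = np.sqrt(1.0 - cos_t**2)
    # A ray hits the detector plane inside the radius iff h*tan(theta) <= R,
    # written without division to avoid cos(theta) = 0 issues.
    hits = np.count_nonzero(source_height * sin_t <= det_radius * cos_t)
    # The downward-going half of all emissions can never reach the detector.
    return 0.5 * hits / n

def analytic_efficiency(det_radius, source_height):
    """Closed-form fractional solid angle for the same on-axis geometry."""
    return 0.5 * (1.0 - source_height / np.hypot(det_radius, source_height))

print(mc_geometric_efficiency(2.0, 1.0))   # ~0.276
print(analytic_efficiency(2.0, 1.0))       # 0.27639...
```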

  18. Assessment of driving-related performance in chronic whiplash using an advanced driving simulator.

    Science.gov (United States)

    Takasaki, Hiroshi; Treleaven, Julia; Johnston, Venerina; Rakotonirainy, Andry; Haines, Andrew; Jull, Gwendolen

    2013-11-01

    Driving is often nominated as problematic by individuals with chronic whiplash associated disorders (WAD), yet driving-related performance has not been evaluated objectively. The purpose of this study was to test driving-related performance in persons with chronic WAD against healthy controls of similar age, gender and driving experience to determine if driving-related performance in the WAD group was sufficiently impaired to recommend fitness to drive assessment. Driving-related performance was assessed using an advanced driving simulator during three driving scenarios: freeway, residential and a central business district (CBD). Total driving duration was approximately 15 min. Five driving tasks which could cause a collision (critical events) were included in the scenarios. In addition, the effect of divided attention (identify red dots projected onto side or rear view mirrors) was assessed three times in each scenario. Driving performance was measured using the simulator performance index (SPI) which is calculated from 12 measures. Z-scores for all SPI measures were calculated for each WAD subject based on mean values of the control subjects. The z-scores were then averaged for the WAD group. A z-score of ≤-2 indicated a driving failing grade in the simulator. The number of collisions over the five critical events was compared between the WAD and control groups as was reaction time and missed response ratio in identifying the red dots. Seventeen WAD and 26 control subjects commenced the driving assessment. Demographic data were comparable between the groups. All subjects completed the freeway scenario but four withdrew during the residential and eight during the CBD scenario because of motion sickness. All scenarios were completed by 14 WAD and 17 control subjects. Mean z-scores for the SPI over the three scenarios were statistically lower in the WAD group (-0.3±0.3; Pdriving. There were no differences in the reaction time and missed response ratio in divided

  19. Advanced gadolinia core and Toshiba advanced reactor management system

    International Nuclear Information System (INIS)

    Miyamoto, Toshiki; Yoshioka, Ritsuo; Ebisuya, Mitsuo

    1988-01-01

    At the Hamaoka Nuclear Power Station, Unit No. 3, advanced core design and core management technology have been adopted, significantly improving plant availability, operability and reliability. The outstanding technologies are the advanced gadolinia core (AGC) which utilizes gadolinium for the axial power distribution control, and Toshiba advanced reactor management system (TARMS) which uses a three-dimensional core physics simulator to calculate the power distribution. Presented here are the effects of these advanced technologies as observed during field testing. (author)

  20. Generation of large scale urban environments to support advanced sensor and seeker simulation

    Science.gov (United States)

    Giuliani, Joseph; Hershey, Daniel; McKeown, David, Jr.; Willis, Carla; Van, Tan

    2009-05-01

    One of the key aspects for the design of a next generation weapon system is the need to operate in cluttered and complex urban environments. Simulation systems rely on accurate representation of these environments and require automated software tools to construct the underlying 3D geometry and associated spectral and material properties that are then formatted for various objective seeker simulation systems. Under an Air Force Small Business Innovative Research (SBIR) contract, we have developed an automated process to generate 3D urban environments with user defined properties. These environments can be composed from a wide variety of source materials, including vector source data, pre-existing 3D models, and digital elevation models, and rapidly organized into a geo-specific visual simulation database. This intermediate representation can be easily inspected in the visible spectrum for content and organization and interactively queried for accuracy. Once the database contains the required contents, it can then be exported into specific synthetic scene generation runtime formats, preserving the relationship between geometry and material properties. To date an exporter for the Irma simulation system developed and maintained by AFRL/Eglin has been created and a second exporter to Real Time Composite Hardbody and Missile Plume (CHAMP) simulation system for real-time use is currently being developed. This process supports significantly more complex target environments than previous approaches to database generation. In this paper we describe the capabilities for content creation for advanced seeker processing algorithms simulation and sensor stimulation, including the overall database compilation process and sample databases produced and exported for the Irma runtime system. We also discuss the addition of object dynamics and viewer dynamics within the visual simulation into the Irma runtime environment.

  1. Strategic Plan for Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    Energy Technology Data Exchange (ETDEWEB)

    Rich Johnson; Kimberlyn C. Mousseau; Hyung Lee

    2011-09-01

    NE-KAMS knowledge base will assist computational analysts, physics model developers, experimentalists, nuclear reactor designers, and federal regulators by: (1) Establishing accepted standards, requirements and best practices for V&V and UQ of computational models and simulations, (2) Establishing accepted standards and procedures for qualifying and classifying experimental and numerical benchmark data, (3) Providing readily accessible databases for nuclear energy related experimental and numerical benchmark data that can be used in V&V assessments and computational methods development, (4) Providing a searchable knowledge base of information, documents and data on V&V and UQ, and (5) Providing web-enabled applications, tools and utilities for V&V and UQ activities, data assessment and processing, and information and data searches. From its inception, NE-KAMS will directly support nuclear energy research, development and demonstration programs within the U.S. Department of Energy (DOE), including the Consortium for Advanced Simulation of Light Water Reactors (CASL), the Nuclear Energy Advanced Modeling and Simulation (NEAMS), the Light Water Reactor Sustainability (LWRS), the Small Modular Reactors (SMR), and the Next Generation Nuclear Power Plant (NGNP) programs. These programs all involve computational modeling and simulation (M&S) of nuclear reactor systems, components and processes, and it is envisioned that NE-KAMS will help to coordinate and facilitate collaboration and sharing of resources and expertise for V&V and UQ across these programs. In addition, from the outset, NE-KAMS will support the use of computational M&S in the nuclear industry by developing guidelines and recommended practices aimed at quantifying the uncertainty and assessing the applicability of existing analysis models and methods. The NE-KAMS effort will initially focus on supporting the use of computational fluid dynamics (CFD) and thermal hydraulics (T/H) analysis for M&S of nuclear

  2. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in programming languages like Python and R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebook to perform analysis tasks or reproduce research results much more easily.
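
    A minimal sketch of the kind of practice described, assuming a pandas DataFrame of patient records and the ipywidgets library in a Jupyter Notebook; the column names and data are hypothetical, not taken from the paper.

```python
# Minimal sketch: turning an analysis function into an interactive Jupyter "APP"
# with ipywidgets (column names and data are hypothetical).
import pandas as pd
from ipywidgets import interact, IntSlider

records = pd.DataFrame({
    "age":       [34, 52, 47, 61, 29, 73],
    "sbp":       [118, 141, 133, 150, 112, 160],   # systolic blood pressure
    "diagnosis": ["A", "B", "A", "B", "A", "B"],
})

def summarize(min_age=40):
    """Real-time summary that re-runs whenever the slider moves."""
    subset = records[records["age"] >= min_age]
    return subset.groupby("diagnosis")["sbp"].agg(["count", "mean"])

# One line converts the pipeline into a widget-driven mini application.
interact(summarize, min_age=IntSlider(min=20, max=80, step=5, value=40))
```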

  3. Analytical dose modeling for preclinical proton irradiation of millimetric targets.

    Science.gov (United States)

    Vanstalle, Marie; Constanzo, Julie; Karakaya, Yusuf; Finck, Christian; Rousseau, Marc; Brasse, David

    2018-01-01

    Due to the considerable development of proton radiotherapy, several proton platforms have emerged to irradiate small animals in order to study the biological effectiveness of proton radiation. A dedicated analytical treatment planning tool was developed in this study to accurately calculate the delivered dose given the specific constraints imposed by the small dimensions of the irradiated areas. The treatment planning system (TPS) developed in this study is based on an analytical formulation of the Bragg peak and uses experimental range values of protons. The method was validated after comparison with experimental data from the literature and then compared to Monte Carlo simulations conducted using Geant4. Three examples of treatment planning, performed with phantoms made of water targets and bone-slab insert, were generated with the analytical formulation and Geant4. Each treatment planning was evaluated using dose-volume histograms and gamma index maps. We demonstrate the value of the analytical function for mouse irradiation, which requires a targeting accuracy of 0.1 mm. Using the appropriate database, the analytical modeling limits the errors caused by misestimating the stopping power. For example, 99% of a 1-mm tumor irradiated with a 24-MeV beam receives the prescribed dose. The analytical dose deviations from the prescribed dose remain within the dose tolerances stated by report 62 of the International Commission on Radiation Units and Measurements for all tested configurations. In addition, the gamma index maps show that the highly constrained targeting accuracy of 0.1 mm for mouse irradiation leads to a significant disagreement between Geant4 and the reference. This simulated treatment planning is nevertheless compatible with a targeting accuracy exceeding 0.2 mm, corresponding to rat and rabbit irradiations. Good dose accuracy for millimetric tumors is achieved with the analytical calculation used in this work. These volume sizes are typical in mouse
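
    The paper's TPS is built on experimental range data and its own Bragg-peak formulation; as an illustrative stand-in only, the classical Bragg-Kleeman power law R = alpha * E^p relates beam energy to range in water. The coefficients below are commonly quoted approximate fit values, not the authors' database.

```python
# Bragg-Kleeman power law R = alpha * E**p for protons in water.
# alpha and p are commonly quoted approximate fit values (an assumption here,
# not the experimental range database used by the authors' TPS).
ALPHA = 0.0022   # cm / MeV**p
P = 1.77

def proton_range_cm(energy_mev):
    """Approximate CSDA range in water for a proton of the given energy."""
    return ALPHA * energy_mev**P

def required_energy_mev(range_cm):
    """Invert the power law: beam energy needed to reach a given depth."""
    return (range_cm / ALPHA) ** (1.0 / P)

print(f"24 MeV beam range : {proton_range_cm(24.0) * 10:.1f} mm")   # ~6 mm
print(f"energy for 5 mm   : {required_energy_mev(0.5):.1f} MeV")
```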

  4. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah; Carns, Philip; Ross, Robert; Li, Jianping Kelvin; Ma, Kwan-Liu

    2016-11-13

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a

  5. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hendrickson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  6. Rate of detection of advanced neoplasms in proximal colon by simulated sigmoidoscopy vs fecal immunochemical tests.

    Science.gov (United States)

    Castells, Antoni; Quintero, Enrique; Álvarez, Cristina; Bujanda, Luis; Cubiella, Joaquín; Salas, Dolores; Lanas, Angel; Carballo, Fernando; Morillas, Juan Diego; Hernández, Cristina; Jover, Rodrigo; Hijona, Elizabeth; Portillo, Isabel; Enríquez-Navascués, José M; Hernández, Vicent; Martínez-Turnes, Alfonso; Menéndez-Villalva, Carlos; González-Mao, Carmen; Sala, Teresa; Ponce, Marta; Andrés, Mercedes; Teruel, Gloria; Peris, Antonio; Sopeña, Federico; González-Rubio, Francisca; Seoane-Urgorri, Agustín; Grau, Jaume; Serradesanferm, Anna; Pozo, Àngels; Pellisé, Maria; Balaguer, Francesc; Ono, Akiko; Cruzado, José; Pérez-Riquelme, Francisco; Alonso-Abreu, Inmaculada; Carrillo-Palau, Marta; de la Vega-Prieto, Mariola; Iglesias, Rosario; Amador, Javier; Blanco, José Manuel; Sastre, Rocio; Ferrándiz, Juan; González-Hernández, Ma José; Andreu, Montserrat; Bessa, Xavier

    2014-10-01

    We compared the ability of biennial fecal immunochemical testing (FIT) and one-time sigmoidoscopy to detect colon side-specific advanced neoplasms in a population-based, multicenter, nationwide, randomized controlled trial. We identified asymptomatic men and women, 50-69 years old, through community health registries and randomly assigned them to groups that received a single colonoscopy examination or biennial FIT. Sigmoidoscopy yield was simulated from results obtained from the colonoscopy group, according to the criteria proposed in the UK Flexible Sigmoidoscopy Trial for colonoscopy referral. Patients who underwent FIT and were found to have ≥75 ng hemoglobin/mL were referred for colonoscopy. Data were analyzed from 5059 subjects in the colonoscopy group and 10,507 in the FIT group. The main outcome was rate of detection of any advanced neoplasm proximal to the splenic flexure. Advanced neoplasms were detected in 317 subjects (6.3%) in the sigmoidoscopy simulation group compared with 288 (2.7%) in the FIT group (odds ratio for sigmoidoscopy, 2.29; 95% confidence interval, 1.93-2.70; P = .0001). Sigmoidoscopy also detected advanced distal neoplasia in a higher percentage of patients than FIT (odds ratio, 2.61; 95% confidence interval, 2.20-3.10; P = .0001). The methods did not differ significantly in identifying patients with advanced proximal neoplasms (odds ratio, 1.17; 95% confidence interval, 0.78-1.76; P = .44). This was probably due to the lower performance of both strategies in detecting patients with proximal lesions (sigmoidoscopy detected these in 19.1% of patients and FIT in 14.9% of patients) vs distal ones (sigmoidoscopy detected these in 86.8% of patients and FIT in 33.5% of patients). Sigmoidoscopy, but not FIT, detected proximal lesions in lower percentages of women (especially those 50-59 years old) than men. Sigmoidoscopy and FIT have similar limitations in detecting advanced proximal neoplasms, which depend on patients' characteristics
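
    The reported odds ratios can be reproduced approximately from the raw counts with the standard 2x2 calculation; the sketch below is unadjusted (Woolf logit confidence interval), so it will not exactly match the published, possibly adjusted, estimates.

```python
import math

def odds_ratio_ci(a, n1, b, n2, z=1.96):
    """Unadjusted odds ratio and 95% CI for events a/n1 vs b/n2
    (Woolf logit method)."""
    c, d = n1 - a, n2 - b
    or_ = (a / c) / (b / d)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Any advanced neoplasm: 317/5059 (simulated sigmoidoscopy) vs 288/10507 (FIT).
print(odds_ratio_ci(317, 5059, 288, 10507))   # roughly OR ~ 2.4, unadjusted
```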

  7. Experimental and analytical investigation on metal damage suffered from simulated lightning currents

    Science.gov (United States)

    Yakun, LIU; Zhengcai, FU; Quanzhen, LIU; Baoquan, LIU; Anirban, GUHA

    2017-12-01

    The damage to two typical metal materials, Al alloy 3003 and steel alloy Q235B, subjected to four representative lightning current components is investigated by laboratory and analytical studies to provide fundamental data for lightning protection. The four lightning components simulating natural lightning consist of the first return stroke, the continuing current of the interval stroke, the long continuing current, and the subsequent stroke, with amplitudes of 200 kA, 8 kA, 400 A, and 100 kA, respectively. The damage depth and area caused by the different lightning components are measured with an ultrasonic scanning system, and the temperature rise is measured with a thermal imaging camera. The results show that, for both Al 3003 and steel Q235B, the first return stroke component results in the largest damage area, with a damage depth of at most 0.02 mm. The long continuing current component leads to the deepest damage, 3.3 mm for Al 3003, and a much higher temperature rise than the other components. The correlation analysis between the damage results and the lightning parameters indicates that the damage depth is positively correlated with the charge transfer, the damage area is mainly determined by the current amplitude, and the temperature rise increases linearly with the charge transferred.
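
    The correlation with charge transfer refers to the charge Q = ∫ i dt delivered by each component. A minimal bookkeeping sketch for an idealized double-exponential impulse follows; the amplitude and time constants are illustrative placeholders, not the standard 200 kA / 8 kA / 400 A / 100 kA test waveforms themselves.

```python
import numpy as np

def trapezoid(y, x):
    """Simple trapezoidal integration of sampled data."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Idealized double-exponential impulse i(t) = I0*(exp(-t/tau1) - exp(-t/tau2));
# parameters are illustrative only.
t = np.linspace(0.0, 2.0e-3, 200_001)                    # 0 .. 2 ms
i = 2.0e5 * (np.exp(-t / 70e-6) - np.exp(-t / 2e-6))     # amperes

charge = trapezoid(i, t)        # Q = ∫ i dt, coulombs (drives deep melting)
action = trapezoid(i**2, t)     # ∫ i^2 dt, A^2·s (drives resistive heating)
print(f"charge transfer ~ {charge:.1f} C, action integral ~ {action:.3e} A^2*s")
```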

  8. Development of advanced spent fuel management process. The fabrication and oxidation behavior of simulated metallized spent fuel

    Energy Technology Data Exchange (ETDEWEB)

    Ro, Seung Gy; Shin, Y.J.; You, G.S.; Joo, J.S.; Min, D.K.; Chun, Y.B.; Lee, E.P.; Seo, H.S.; Ahn, S.B

    1999-03-01

    Simulated metallized spent fuel ingots were fabricated, and their oxidation rates and activation energies were evaluated under several temperature conditions, to develop an advanced spent fuel management process. The alloying characteristics of some elements with uranium metal were also examined. (Author). 3 refs., 1 tab., 36 figs.

  9. Analytical modelling and study of the stability characteristics of the Advanced Heavy Water Reactor

    International Nuclear Information System (INIS)

    Nayak, A.K.; Vijayan, P.K.; Saha, D.

    2000-04-01

    An analytical model has been developed to study the thermohydraulic and neutronic-coupled density-wave instability in the Indian Advanced Heavy Water Reactor (AHWR) which is a natural circulation pressure tube type boiling water reactor. The model considers a point kinetics model for the neutron dynamics and a lumped parameter model for the fuel thermal dynamics along with the conservation equations of mass, momentum and energy and equation of state for the coolant. In addition, to study the effect of neutron interactions between different parts of the core, the model considers a coupled multipoint kinetics equation in place of simple point kinetics equation. Linear stability theory was applied to reveal the instability of in-phase and out-of-phase modes in the boiling channels of the AHWR. The results indicate that the design configuration considered may experience both Ledinegg and Type I and Type II density-wave instabilities depending on the operating condition. Some methods of suppressing these instabilities were found out. In addition, it was found that the stability behavior of the reactor is greatly influenced by the void reactivity coefficient, fuel time constant, radial power distribution and channel inlet orificing. The delayed neutrons were found to have strong influence on the Type I and Type II instabilities. Decay ratio maps were predicted considering various operating parameters of the reactor, which are useful for its design. (author)
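
    A minimal single-point-kinetics sketch with one delayed-neutron group and a step reactivity insertion is given below for orientation; the parameters are illustrative textbook values, not AHWR design data, and the paper's coupled multipoint, thermal-hydraulic model is far richer.

```python
# One-group point kinetics: dn/dt = ((rho - beta)/LAMBDA)*n + lam*C
#                           dC/dt = (beta/LAMBDA)*n - lam*C
# Parameters are illustrative, not AHWR design values.
BETA, LAM, LAMBDA = 0.0065, 0.08, 1.0e-3   # delayed fraction, decay const (1/s), generation time (s)

def step_reactivity_response(rho, t_end=10.0, dt=1.0e-5):
    """Explicit time-integration of the neutron density after a step
    reactivity insertion rho, starting from steady state n = 1."""
    n, c = 1.0, BETA / (LAM * LAMBDA)       # steady-state precursor concentration
    for _ in range(int(t_end / dt)):
        dn = ((rho - BETA) / LAMBDA) * n + LAM * c
        dc = (BETA / LAMBDA) * n - LAM * c
        n += dt * dn
        c += dt * dc
    return n

print(step_reactivity_response(rho=0.001))   # sub-prompt-critical: slow, stable rise
```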

  10. A new unconditionally stable and consistent quasi-analytical in-stream water quality solution scheme for CSTR-based water quality simulators

    Science.gov (United States)

    Woldegiorgis, Befekadu Taddesse; van Griensven, Ann; Pereira, Fernando; Bauwens, Willy

    2017-06-01

    Most common numerical solutions used in CSTR-based in-stream water quality simulators are susceptible to instabilities and/or solution inconsistencies. Usually, they cope with instability problems by adopting computationally expensive small time steps. However, some simulators use fixed computation time steps and hence do not have the flexibility to do so. This paper presents a novel quasi-analytical solution for CSTR-based water quality simulators of an unsteady system. The robustness of the new method is compared with the commonly used fourth-order Runge-Kutta methods, the Euler method and three versions of the SWAT model (SWAT2012, SWAT-TCEQ, and ESWAT). The performance of each method is tested for different hypothetical experiments. Besides the hypothetical data, a real case study is used for comparison. The growth factors we derived as stability measures for the different methods and the R-factor—considered as a consistency measure—turned out to be very useful for determining the most robust method. The new method outperformed all the numerical methods used in the hypothetical comparisons. The application for the Zenne River (Belgium) shows that the new method provides stable and consistent BOD simulations whereas the SWAT2012 model is shown to be unstable for the standard daily computation time step. The new method unconditionally simulates robust solutions. Therefore, it is a reliable scheme for CSTR-based water quality simulators that use first-order reaction formulations.
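
    The flavour of such a quasi-analytical step can be shown on a single CSTR with one first-order decay reaction: over a time step with frozen inputs the mass balance has an exact exponential solution that stays bounded for any step size, unlike explicit Euler. This is only a toy sketch with made-up parameters, not the scheme derived in the paper.

```python
import math

def exact_step(c, dt, q, v, c_in, k):
    """Exact (analytical) update of dC/dt = (Q/V)*(C_in - C) - k*C over one
    time step with inputs held constant."""
    a = q / v + k                     # total first-order loss rate
    c_ss = (q / v) * c_in / a         # steady state for frozen inputs
    return c_ss + (c - c_ss) * math.exp(-a * dt)

def euler_step(c, dt, q, v, c_in, k):
    """Explicit Euler update of the same balance (can go unstable)."""
    return c + dt * ((q / v) * (c_in - c) - k * c)

# Large daily time step with fast kinetics: Euler diverges, the exact step does not.
q, v, c_in, k, dt = 5.0e4, 1.0e4, 8.0, 0.5, 1.0   # m3/d, m3, mg/L, 1/d, d
c_exact = c_euler = 2.0
for _ in range(5):
    c_exact = exact_step(c_exact, dt, q, v, c_in, k)
    c_euler = euler_step(c_euler, dt, q, v, c_in, k)
print(c_exact, c_euler)   # ~7.3 mg/L vs a wildly oscillating Euler value
```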

  11. Design and development of a virtual reality simulator for advanced cardiac life support training.

    Science.gov (United States)

    Vankipuram, Akshay; Khanal, Prabal; Ashby, Aaron; Vankipuram, Mithra; Gupta, Ashish; DrummGurnee, Denise; Josey, Karen; Smith, Marshall

    2014-07-01

    The use of virtual reality (VR) training tools for medical education could lead to improvements in the skills of clinicians while providing economic incentives for healthcare institutions. The use of VR tools can also mitigate some of the drawbacks currently associated with providing medical training in a traditional clinical environment such as scheduling conflicts and the need for specialized equipment (e.g., high-fidelity manikins). This paper presents the details of the framework and the development methodology associated with a VR-based training simulator for advanced cardiac life support, a time-critical, team-based medical scenario. In addition, we also report the key findings of a usability study conducted to assess the efficacy of various features of this VR simulator through a postuse questionnaire administered to various care providers. The usability questionnaires were completed by two groups that used two different versions of the VR simulator. One version was the VR trainer with all its features; the other was a minified version with certain immersive features disabled. We found an increase in usability scores from the minified group to the full VR group.

  12. Time-domain hybrid method for simulating large amplitude motions of ships advancing in waves

    Directory of Open Access Journals (Sweden)

    Shukui Liu

    2011-03-01

    Full Text Available Typical results obtained by a newly developed, nonlinear time domain hybrid method for simulating large amplitude motions of ships advancing with constant forward speed in waves are presented. The method is hybrid in that it combines a time-domain transient Green function method and a Rankine source method. The present approach employs a simple double integration algorithm with respect to time to simulate the free-surface boundary condition. During the simulation, the diffraction and radiation forces are computed by pressure integration over the mean wetted surface, whereas the incident wave and hydrostatic restoring forces/moments are calculated on the instantaneously wetted surface of the hull. Typical numerical results of application of the method to the seakeeping performance of a standard containership, namely the ITTC S175, are herein presented. Comparisons have been made between the results from the present method, the frequency domain 3D panel method (NEWDRIFT) of NTUA-SDL and available experimental data, and good agreement has been observed for all studied cases between the results of the present method and the other comparable data.

  13. Using simulation to improve the cognitive and psychomotor skills of novice students in advanced laparoscopic surgery: a meta-analysis.

    Science.gov (United States)

    Al-Kadi, Azzam S; Donnon, Tyrone

    2013-01-01

    Advances in simulation technologies have enhanced the ability to introduce the teaching and learning of laparoscopic surgical skills to novice students. In this meta-analysis, a total of 18 randomized controlled studies were identified that specifically looked at training novices in comparison with a control group as it pertains to knowledge retention, time to completion and suturing and knotting skills. The combined random-effects effect sizes (ESs) showed that novice students who trained on laparoscopic simulators developed considerably better laparoscopic suturing and knot tying skills (d = 1.96, p < 0.01), made fewer errors (d = 2.13, p < 0.01), and retained more knowledge (d = 1.57, p < 0.01) than their respective control groups, and were significantly faster on time to completion (d = 1.98, p < 0.01). As illustrated in the corresponding Forest plots, the majority of the primary study outcomes included in this meta-analysis show statistically significant support (p < 0.05) for the use of laparoscopic simulators for novice student training on both knowledge and advanced surgical skill development (28 of 35 outcomes, 80%). The findings of this meta-analysis strongly support the use of simulators for teaching laparoscopic surgery skills to novice students in surgical residency programs.
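
    The pooling step behind such combined effect sizes is typically the DerSimonian-Laird random-effects estimator. The sketch below uses hypothetical per-study Cohen's d values and variances, not the 18 studies analysed in the paper.

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled effect size with the DerSimonian-Laird tau^2."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * d for wi, d in zip(w, effects)) / sum(w)
    q = sum(wi * (d - fixed) ** 2 for wi, d in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                     # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * d for wi, d in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical per-study standardized mean differences (Cohen's d) and variances.
d_values = [2.4, 1.1, 1.9, 2.6, 1.5]
var_d    = [0.30, 0.22, 0.25, 0.40, 0.28]
print(dersimonian_laird(d_values, var_d))   # pooled d with its 95% CI
```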

  14. The Convergence of High Performance Computing and Large Scale Data Analytics

    Science.gov (United States)

    Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.

    2015-12-01

    As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.
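
    A minimal sketch of the kind of spatiotemporal index described: binned space/time keys stored in a relational table that maps a query to the data chunks holding the matching records. The table layout, bin sizes and HDFS paths below are hypothetical, not the NCCS schema.

```python
import sqlite3

# In-memory stand-in for the relational spatiotemporal index (schema and
# paths are hypothetical).
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE chunk_index (
                 year INTEGER, month INTEGER,
                 lat_bin INTEGER, lon_bin INTEGER,
                 path TEXT)""")
db.executemany(
    "INSERT INTO chunk_index VALUES (?, ?, ?, ?, ?)",
    [(2015, 7, 4, -8, "hdfs:///merra/2015/07/chunk_4_-8.nc"),
     (2015, 7, 5, -8, "hdfs:///merra/2015/07/chunk_5_-8.nc"),
     (2015, 8, 4, -8, "hdfs:///merra/2015/08/chunk_4_-8.nc")])

def locate(year, month, lat, lon, bin_deg=10):
    """Map a point query to the stored chunks covering it."""
    rows = db.execute(
        "SELECT path FROM chunk_index WHERE year=? AND month=? "
        "AND lat_bin=? AND lon_bin=?",
        (year, month, int(lat // bin_deg), int(lon // bin_deg)))
    return [r[0] for r in rows]

print(locate(2015, 7, 42.3, -71.1))   # -> the July chunk covering (42N, 71W)
```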

  15. Communicating Climate Change through ICT-Based Visualization: Towards an Analytical Framework

    Directory of Open Access Journals (Sweden)

    Björn-Ola Linnér

    2013-11-01

    Full Text Available The difficulties in communicating climate change science to the general public are often highlighted as one of the hurdles for support of enhanced climate action. The advances of interactive visualization using information and communication technology (ICT) are claimed to be a game-changer in our ability to communicate complex issues. However, new analytical frameworks are warranted to analyse the role of such technologies. This paper develops a novel framework for analyzing the content, form, context and relevance of ICT-based visualization of climate change, based on insights from literature on climate change communication. Thereafter, we exemplify the analytical framework by applying it to a pilot case of ICT-based climate visualization in a GeoDome. Possibilities to use affordable advanced ICT-based visualization devices in science and policy communication are rapidly expanding. We thus see wider implications and applications of the analytical framework not only for other ICT environments but also other issue areas in sustainability communication.

  16. A first course in ordinary differential equations analytical and numerical methods

    CERN Document Server

    Hermann, Martin

    2014-01-01

    This book presents a modern introduction to analytical and numerical techniques for solving ordinary differential equations (ODEs). In contrast to the traditional theorem-and-proof format, the book focuses on analytical and numerical methods. The book supplies a variety of problems and examples, ranging from the elementary to the advanced level, to introduce and study the mathematics of ODEs. The analytical part of the book deals with solution techniques for scalar first-order and second-order linear ODEs, and systems of linear ODEs, with a special focus on the Laplace transform, operator techniques and power series solutions. In the numerical part, theoretical and practical aspects of Runge-Kutta methods for solving initial-value problems and shooting methods for linear two-point boundary-value problems are considered. The book is intended as a primary text for courses on the theory of ODEs and numerical treatment of ODEs for advanced undergraduate and early graduate students. It is assumed t...
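
    The interplay between the analytical and numerical halves of such a course can be illustrated on the scalar test problem y' = -2y, y(0) = 1, whose exact solution exp(-2t) the classical fourth-order Runge-Kutta method reproduces to high accuracy. This sketch is an illustration of the method class the book covers, not material from the book itself.

```python
import math

def rk4(f, t0, y0, t_end, n_steps):
    """Classical fourth-order Runge-Kutta method for a scalar ODE y' = f(t, y)."""
    h = (t_end - t0) / n_steps
    t, y = t0, y0
    for _ in range(n_steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

f = lambda t, y: -2.0 * y                     # y' = -2y, exact solution exp(-2t)
approx = rk4(f, 0.0, 1.0, 1.0, 20)
print(approx, math.exp(-2.0), abs(approx - math.exp(-2.0)))   # tiny error
```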

  17. High Level Requirements for the Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS)

    Energy Technology Data Exchange (ETDEWEB)

    Rich Johnson; Hyung Lee; Kimberlyn C. Mousseau

    2011-09-01

    The US Department of Energy, Office of Nuclear Energy (DOE-NE), has been tasked with the important mission of ensuring that nuclear energy remains a compelling and viable energy source in the U.S. The motivations behind this mission include cost-effectively meeting the expected increases in the power needs of the country, reducing carbon emissions and reducing dependence on foreign energy sources. In the near term, to ensure that nuclear power remains a key element of U.S. energy strategy and portfolio, the DOE-NE will be working with the nuclear industry to support safe and efficient operations of existing nuclear power plants. In the long term, to meet the increasing energy needs of the U.S., the DOE-NE will be investing in research and development (R&D) and working in concert with the nuclear industry to build and deploy new, safer and more efficient nuclear power plants. The safe and efficient operations of existing nuclear power plants and designing, licensing and deploying new reactor designs, however, will require focused R&D programs as well as the extensive use and leveraging of advanced modeling and simulation (M&S). M&S will play a key role in ensuring safe and efficient operations of existing and new nuclear reactors. The DOE-NE has been actively developing and promoting the use of advanced M&S in reactor design and analysis through its R&D programs, e.g., the Nuclear Energy Advanced Modeling and Simulation (NEAMS) and Consortium for Advanced Simulation of Light Water Reactors (CASL) programs. Also, nuclear reactor vendors are already using CFD and CSM, for design, analysis, and licensing. However, these M&S tools cannot be used with confidence for nuclear reactor applications unless accompanied and supported by verification and validation (V&V) and uncertainty quantification (UQ) processes and procedures which provide quantitative measures of uncertainty for specific applications. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation

  18. Consistent constitutive modeling of metallic target penetration using empirical, analytical, and numerical penetration models

    Directory of Open Access Journals (Sweden)

    John (Jack) P. Riegel III

    2016-04-01

    Full Text Available Historically, there has been little correlation between the material properties used in (1) empirical formulae, (2) analytical formulations, and (3) numerical models. The various regressions and models may each provide excellent agreement for the depth of penetration into semi-infinite targets. But the input parameters for the empirically based procedures may have little in common with either the analytical model or the numerical model. This paper builds on previous work by Riegel and Anderson (2014) to show how the Effective Flow Stress (EFS) strength model, based on empirical data, can be used as the average flow stress in the analytical Walker–Anderson Penetration model (WAPEN) (Anderson and Walker, 1991) and how the same value may be utilized as an effective von Mises yield strength in numerical hydrocode simulations to predict the depth of penetration for eroding projectiles at impact velocities in the mechanical response regime of the materials. The method has the benefit of allowing the three techniques (empirical, analytical, and numerical) to work in tandem. The empirical method can be used for many shot line calculations, but more advanced analytical or numerical models can be employed when necessary to address specific geometries such as edge effects or layering that are not treated by the simpler methods. Developing complete constitutive relationships for a material can be costly. If the only concern is depth of penetration, such a level of detail may not be required. The effective flow stress can be determined from a small set of depth of penetration experiments in many cases, especially for long penetrators such as the L/D = 10 ones considered here, making it a very practical approach. In the process of performing this effort, the authors considered numerical simulations by other researchers based on the same set of experimental data that the authors used for their empirical and analytical assessment. The goals were to establish a
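
    For long-rod penetrators of the L/D = 10 class at high velocity, the classical strengthless hydrodynamic limit P/L = sqrt(rho_p/rho_t) is the usual zeroth-order reference against which strength-corrected models such as EFS/WAPEN are judged. The sketch below implements only that classical limit (not the EFS or WAPEN models), with nominal handbook densities.

```python
import math

def hydrodynamic_penetration(rod_length_mm, rho_projectile, rho_target):
    """Classical hydrodynamic limit P = L * sqrt(rho_p / rho_t); strength
    effects (what EFS and WAPEN actually model) shift results away from this."""
    return rod_length_mm * math.sqrt(rho_projectile / rho_target)

# Nominal densities (g/cm^3): tungsten heavy alloy rod into rolled steel.
print(hydrodynamic_penetration(rod_length_mm=100.0,
                               rho_projectile=17.6, rho_target=7.85))
# -> ~150 mm of penetration at the strengthless limit
```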

  19. Analytical calculation of electrolyte water content of a Proton Exchange Membrane Fuel Cell for on-board modelling applications

    Science.gov (United States)

    Ferrara, Alessandro; Polverino, Pierpaolo; Pianese, Cesare

    2018-06-01

    This paper proposes an analytical model of the water content of the electrolyte of a Proton Exchange Membrane Fuel Cell. The model relies on several simplifying assumptions, which make it suitable for on-board/online water management applications while ensuring good accuracy for the considered phenomena with respect to advanced numerical solutions. The resulting analytical solution for the electrolyte water content is compared with that obtained by means of a complex numerical approach used to solve the same mathematical problem. The results show that the mean error is below 5% for electrode water content values ranging from 2 to 15 (given as boundary conditions), and does not exceed 0.26% for electrode water content above 5. These results demonstrate the capability of the solution to correctly model the electrolyte water content at any operating condition, with a view to embedding it into more complex frameworks (e.g., cell or stack models) for fuel cell simulation, monitoring, control, diagnosis and prognosis.
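
    A common point of reference for such models is the classical Springer et al. (1991) correlation relating membrane water content λ to water vapour activity a at the electrodes. The sketch below uses that published cubic fit as an illustration; it is not the analytical electrolyte solution derived in this paper.

```python
def springer_lambda(activity):
    """Membrane water content vs water vapour activity (Springer et al. 1991
    cubic fit for Nafion, valid for 0 < a <= 1)."""
    a = activity
    return 0.043 + 17.81 * a - 39.85 * a**2 + 36.0 * a**3

# Water content rises from a nearly dry membrane to about 14 at saturation.
for a in (0.2, 0.5, 0.8, 1.0):
    print(f"a = {a:.1f} -> lambda = {springer_lambda(a):.2f}")
```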

  20. Advanced Tokamak Stability Theory

    Science.gov (United States)

    Zheng, Linjin

    2015-03-01

    The intention of this book is to introduce advanced tokamak stability theory. We start with the derivation of the Grad-Shafranov equation and the construction of various toroidal flux coordinates. An analytical tokamak equilibrium theory is presented to demonstrate the Shafranov shift and how the toroidal hoop force can be balanced by the application of a vertical magnetic field in tokamaks. In addition to advanced theories, this book also discusses the intuitive physics pictures for various experimentally observed phenomena.

  1. Advanced Analytic Treatment and Efficient Computation of the Diffraction Integrals in the Extended Nijboer-Zernike Theory

    Science.gov (United States)

    van Haver, S.; Janssen, A. J. E. M.

    2013-07-01

    The computational methods for the diffraction integrals that occur in the Extended Nijboer-Zernike (ENZ-) approach to circular, aberrated, defocused optical systems are reviewed and updated. In the ENZ-approach, the Debye approximation of Rayleigh's integral for the through-focus, complex, point-spread function is evaluated in semi-analytic form. To this end, the generalized pupil function, comprising phase aberrations as well as amplitude non-uniformities, is assumed to be expanded into a series of Zernike circle polynomials, and the contribution of each of these Zernike terms to the diffraction integral is expressed in the form of a rapidly converging series (containing power functions and/or Bessel functions of various kinds). The procedure of expressing the through-focus point-spread function in terms of Zernike expansion coefficients of the pupil function can be reversed and has led to the ENZ-method of retrieval of pupil functions from measured through-focus (intensity) point-spread functions. The review and update concern the computation for systems ranging from as basic as having low NA and small defocus parameter to high-NA systems, with vector fields and polarization, meant for imaging of extended objects into a multi-layered focal region. In the period 2002-2010, the evolution of the form of the diffraction integral (DI) was dictated by the agenda of the ENZ-team in which a next instance of the DI was handled by amending the computation scheme of the previous one. This has resulted in a variety of ad hoc measures, lack of transparency of the schemes, and sometimes prohibitively slow computer codes. It is the aim of the present paper to reconstruct the whole building of computation methods, using consistently more advanced mathematical tools. These tools are: explicit Zernike expansion of the focal factor in the DI; Clebsch-Gordan coefficients for the omnipresent problem of linearizing products of Zernike circle polynomials; recursions for Bessel
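
    The building blocks of the ENZ expansion are the Zernike circle polynomials. The sketch below evaluates the standard radial part R_n^m(ρ) directly from its factorial sum, which the paper's more advanced recursions compute far more efficiently; it is an illustration of the definition, not the authors' scheme.

```python
from math import factorial

def zernike_radial(n, m, rho):
    """Standard Zernike radial polynomial R_n^m(rho), with n >= |m| and n - |m| even."""
    m = abs(m)
    if (n - m) % 2:
        return 0.0
    total = 0.0
    for k in range((n - m) // 2 + 1):
        total += ((-1) ** k * factorial(n - k)
                  / (factorial(k)
                     * factorial((n + m) // 2 - k)
                     * factorial((n - m) // 2 - k))) * rho ** (n - 2 * k)
    return total

# Spot checks: R_2^0(rho) = 2*rho^2 - 1, and every radial polynomial equals 1 at rho = 1.
print(zernike_radial(2, 0, 0.5), 2 * 0.5**2 - 1)   # -0.5 -0.5
print(zernike_radial(4, 0, 1.0))                   # 1.0
```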

  2. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  3. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  4. Strategic analytics: towards fully embedding evidence in healthcare decision-making.

    Science.gov (United States)

    Garay, Jason; Cartagena, Rosario; Esensoy, Ali Vahit; Handa, Kiren; Kane, Eli; Kaw, Neal; Sadat, Somayeh

    2015-01-01

    Cancer Care Ontario (CCO) has implemented multiple information technology solutions and collected health-system data to support its programs. There is now an opportunity to leverage these data and perform advanced end-to-end analytics that inform decisions around improving health-system performance. In 2014, CCO engaged in an extensive assessment of its current data capacity and capability, with the intent to drive increased use of data for evidence-based decision-making. The breadth and volume of data at CCO uniquely places the organization to contribute to not only system-wide operational reporting, but more advanced modelling of current and future state system management and planning. In 2012, CCO established a strategic analytics practice to assist the agency's programs contextualize and inform key business decisions and to provide support through innovative predictive analytics solutions. This paper describes the organizational structure, services and supporting operations that have enabled progress to date, and discusses the next steps towards the vision of embedding evidence fully into healthcare decision-making. Copyright © 2014 Longwoods Publishing.

  5. Monte Carlo and analytical model predictions of leakage neutron exposures from passively scattered proton therapy

    International Nuclear Information System (INIS)

    Pérez-Andújar, Angélica; Zhang, Rui; Newhauser, Wayne

    2013-01-01

    Purpose: Stray neutron radiation is of concern after radiation therapy, especially in children, because of the high risk it might carry for secondary cancers. Several previous studies predicted the stray neutron exposure from proton therapy, mostly using Monte Carlo simulations. Promising attempts to develop analytical models have also been reported, but these were limited to only a few proton beam energies. The purpose of this study was to develop an analytical model to predict leakage neutron equivalent dose from passively scattered proton beams in the 100-250-MeV interval. Methods: To develop and validate the analytical model, the authors used values of equivalent dose per therapeutic absorbed dose (H/D) predicted with Monte Carlo simulations. The authors also characterized the behavior of the mean neutron radiation-weighting factor, w_R, as a function of depth in a water phantom and distance from the beam central axis. Results: The simulated and analytical predictions agreed well. On average, the percentage difference between the analytical model and the Monte Carlo simulations was 10% for the energies and positions studied. The authors found that w_R was highest at the shallowest depth and decreased with depth until around 10 cm, where it started to increase slowly with depth. This was consistent among all energies. Conclusion: Simple analytical methods are promising alternatives to complex and slow Monte Carlo simulations to predict H/D values. The authors' results also provide improved understanding of the behavior of w_R which strongly depends on depth, but is nearly independent of lateral distance from the beam central axis

  6. A Mobile-based Platform for Big Load Profiles Data Analytics in Non-Advanced Metering Infrastructures

    Directory of Open Access Journals (Sweden)

    Moussa Sherin

    2016-01-01

    Full Text Available With the rapid increase of electricity demand around the world due to industrialization and urbanization, precise knowledge of consumers' consumption patterns has become a valuable asset for electricity providers in the current competitive electricity market. It allows them to provide satisfactory services at times of load peaks and to control fraud and abuse cases. Despite this crucial necessity, such knowledge is currently very hard to obtain in many developing countries, since smart meters or advanced metering infrastructures (AMIs) are not yet deployed there to monitor and report energy usage, whereas information and communication technologies have spread widely in these nations, putting smart devices in the hands of much of the population. In this paper, we present mobile-based BLPDA, a novel platform for big data analytics of consumers' load profiles (LPs) in the absence of AMIs. The proposed platform utilizes mobile computing to collect consumers' consumption data, build their LPs, and analyze the aggregated usage data, allowing electricity providers to gain better visibility for an enhanced decision-making process. The experimental results emphasize the effectiveness of our platform as an adequate, low-cost alternative to AMIs in developing countries.
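
    A minimal sketch of the load-profile construction step, assuming per-consumer usage readings collected from the mobile clients into a pandas DataFrame; the column names, timestamps and values are hypothetical, not data from the BLPDA platform.

```python
import pandas as pd

# Hypothetical usage records reported by the mobile clients.
readings = pd.DataFrame({
    "consumer": ["c1", "c1", "c1", "c2", "c2", "c2"],
    "timestamp": pd.to_datetime(["2016-01-04 08:00", "2016-01-04 09:00",
                                 "2016-01-05 08:00", "2016-01-04 08:00",
                                 "2016-01-04 09:00", "2016-01-05 09:00"]),
    "kwh": [1.2, 2.4, 1.1, 0.6, 0.9, 0.8],
})

# Load profile: average consumption per consumer and hour of day.
profile = (readings
           .assign(hour=readings["timestamp"].dt.hour)
           .groupby(["consumer", "hour"])["kwh"]
           .mean()
           .unstack(fill_value=0.0))
print(profile)

# System-level aggregate used to spot load peaks across all consumers.
print(profile.sum(axis=0).idxmax(), "is the peak hour")
```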

  7. Analytical Evaluation of the Performance of Proportional Fair Scheduling in OFDMA-Based Wireless Systems

    Directory of Open Access Journals (Sweden)

    Mohamed H. Ahmed

    2012-01-01

    Full Text Available This paper provides an analytical evaluation of the performance of proportional fair (PF) scheduling in Orthogonal Frequency-Division Multiple Access (OFDMA) wireless systems. OFDMA represents a promising multiple access scheme for transmission over wireless channels, as it combines orthogonal frequency-division multiplexing (OFDM) modulation with subcarrier allocation. PF scheduling, on the other hand, is an efficient resource allocation scheme with good fairness characteristics. Consequently, OFDMA with PF scheduling represents an attractive solution to deliver high data rate services to multiple users simultaneously with a high degree of fairness. We investigate a two-dimensional (time slot and frequency subcarrier) PF scheduling algorithm for OFDMA systems and evaluate its performance analytically and by simulations. We derive approximate closed-form expressions for the average throughput, throughput fairness index, and packet delay. Computer simulations are used for verification. The analytical results agree well with the simulation results, confirming the accuracy of the analytical expressions.
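
    The closed-form expressions derived in the paper are not reproduced here; the following is a minimal Python sketch of the two-dimensional proportional fair rule the paper analyzes: on each subcarrier of each slot, the scheduler serves the user maximizing instantaneous achievable rate divided by its exponentially averaged throughput. The channel model, rates, and averaging constant are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)
      n_users, n_subcarriers, n_slots = 4, 8, 1000
      t_c = 100.0                       # throughput averaging time constant
      avg_thr = np.ones(n_users)        # exponentially averaged throughput per user
      served = np.zeros(n_users)

      for _ in range(n_slots):
          # per-user, per-subcarrier achievable rates (illustrative fading model)
          rates = rng.exponential(1.0, size=(n_users, n_subcarriers))
          slot_thr = np.zeros(n_users)
          for k in range(n_subcarriers):
              user = np.argmax(rates[:, k] / avg_thr)   # PF metric
              slot_thr[user] += rates[user, k]
          served += slot_thr
          avg_thr = (1 - 1 / t_c) * avg_thr + (1 / t_c) * slot_thr

      throughput = served / n_slots
      fairness = throughput.sum() ** 2 / (n_users * (throughput ** 2).sum())  # Jain's index
      print("per-user throughput:", np.round(throughput, 3))
      print("Jain fairness index:", round(fairness, 3))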

  8. New Analytical Methods for the Surface/ Interface and the Micro-Structures in Advanced Nanocomposite Materials by Synchrotron Radiation

    Directory of Open Access Journals (Sweden)

    K. Nakamae

    2010-12-01

    Full Text Available Analytical methods for the surface/interface structure and micro-structure of advanced nanocomposite materials using synchrotron radiation are introduced. Recent results obtained with energy-tunable, highly collimated brilliant X-rays and high-accuracy in-situ wide-angle/small-angle X-ray diffraction are reviewed. It is shown that small-angle X-ray scattering is one of the best methods to characterize nanoparticle dispersibility and filler aggregate/agglomerate structures, and for in-situ observation of hierarchical structure deformation in filled rubber under cyclic stretching. Grazing-incidence (small- and wide-angle) X-ray scattering is powerful for analyzing the sintering process of metal nanoparticles by in-situ observation, as well as the orientation of polymer molecules and the crystalline orientation in the very thin surface layer (ca. 7 nm) of a polymer film. The interaction and conformation of molecules adsorbed at an interface can be investigated using high-energy X-ray XPS, which probes to sufficient depth (ca. 9 µm).

  9. Advanced numerical simulation based on a non-local micromorphic model for metal forming processes

    Directory of Open Access Journals (Sweden)

    Diamantopoulou Evangelia

    2016-01-01

    Full Text Available An advanced numerical methodology is developed for metal forming simulation based on thermodynamically consistent nonlocal constitutive equations accounting for various fully coupled mechanical phenomena under finite strain in the framework of micromorphic continua. The numerical implementation into ABAQUS/Explicit is made for 2D quadrangular elements via the VUEL user subroutine. Simple examples involving a damaged area are presented in order to show the ability of the proposed methodology to describe the independence of the solution from the spatial discretization.

  10. Simulation and Advanced Practice Nursing Education

    Science.gov (United States)

    Blue, Dawn I.

    2016-01-01

    This quantitative study compared changes in level of confidence resulting from participation in simulation or traditional instructional methods for BSN (Bachelor of Science in Nursing) to DNP (Doctor of Nursing Practice) students in a nurse practitioner course when they entered the clinical practicum. Simulation has been used in many disciplines…

  11. Analytical theory of intensity fluctuations in SASE

    Energy Technology Data Exchange (ETDEWEB)

    Yu, L.H.; Krinsky, S. [Brookhaven National Lab., Upton, NY (United States). National Synchrotron Light Source

    1997-07-01

    Recent advances in SASE experiments stimulate interest in quantitative comparison of measurements with theory. Extending the previous analysis of the SASE intensity in guided modes, the authors provide an analytical description of the intensity fluctuations by calculating intensity correlation functions in the frequency domain. Comparison of the results with experiment yields new insight into the SASE process.

  12. Simulation study on the cold neutron guides in China advanced research reactor

    International Nuclear Information System (INIS)

    Guo Liping; Yang Tonghua; Wang Hongli; Sun Kai; Zhao Zhixiang

    2003-01-01

    The designs of the two cold neutron guides, CNG1 and CNG2, to be built in the China advanced research reactor (CARR) are studied with the Monte Carlo simulation technique. The neutron flux density at the exit of both guides can reach above 1×10^9 cm^-2·s^-1 under the assumed flux spectrum of the cold neutron source. The transmission efficiency is 50% and 42%, and the maximum divergence is about 2.2 degrees and 1.9 degrees, respectively, for CNG1 and CNG2. The neutron distribution along the horizontal direction is quite uniform for both guides, with a maximum fluctuation of less than 3%. Gravity can affect the neutron distribution along the vertical direction considerably.
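
    As an illustration of the kind of calculation such a design study involves, the sketch below is a highly simplified 2D Monte Carlo estimate of straight-guide transmission: neutrons are launched with random entrance position and divergence, wall reflections are counted by unfolding the trajectory, and neutrons steeper than the critical angle are absorbed at the first wall contact. The critical-angle rule of thumb (~0.1 degree per Angstrom for natural Ni, scaled by the supermirror m-value), geometry, and reflectivity are illustrative assumptions, not the CARR design values.

      import numpy as np

      rng = np.random.default_rng(1)

      def guide_transmission(n=500_000, length=30.0, width=0.05,
                             wavelength=4.0, m=2.0, reflectivity=0.95):
          # critical angle: ~0.1 deg per Angstrom for natural Ni, scaled by m (assumption)
          theta_c = np.radians(0.1 * wavelength * m)
          x0 = rng.uniform(0.0, width, n)                     # entrance position (m)
          theta = rng.uniform(-2 * theta_c, 2 * theta_c, n)   # entrance divergence (rad)
          u = x0 + length * np.tan(theta)                     # unfolded exit coordinate
          n_refl = np.abs(np.floor(u / width)).astype(int)    # number of wall bounces
          survives = (np.abs(theta) <= theta_c) | (n_refl == 0)
          weight = np.where(survives, reflectivity ** n_refl, 0.0)
          return weight.mean()

      print(f"transmission ~ {guide_transmission():.2f}")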

  13. -Omic and Electronic Health Record Big Data Analytics for Precision Medicine.

    Science.gov (United States)

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D

    2017-02-01

    Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, associated challenges, and data analytics including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Big data analytics is able to address -omic and EHR data challenges for a paradigm shift toward precision medicine. Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes. It has a long-lasting societal impact.

  14. Advanced scientific computational methods and their applications to nuclear technologies. (3) Introduction of continuum simulation methods and their applications (3)

    International Nuclear Information System (INIS)

    Satake, Shin-ichi; Kunugi, Tomoaki

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies was therefore prepared in serial form. This is the third issue, introducing continuum simulation methods and their applications. Spectral methods and multi-interface calculation methods in fluid dynamics are reviewed. (T. Tanaka)

  15. Simulation of hybrid vehicle propulsion with an advanced battery model

    Energy Technology Data Exchange (ETDEWEB)

    Nallabolu, S.; Kostetzer, L.; Rudnyi, E. [CADFEM GmbH, Grafing (Germany); Geppert, M.; Quinger, D. [LION Smart GmbH, Frieding (Germany)

    2011-07-01

    In recent years there has been increasing concern about global warming and greenhouse gas emissions. In addition to the environmental issues, the predicted scarcity of oil supplies and the dramatic increase in oil prices put new demands on vehicle design. As a result, energy efficiency and reduced emissions have become main selling points for automobiles. Hybrid electric vehicles (HEV) have therefore become an interesting technology for governments and the automotive industry. HEV are more complicated than conventional vehicles because they contain more electrical components, such as electric machines, power electronics, electronic continuously variable transmissions (CVT), and embedded powertrain controllers. Advanced energy storage devices and energy converters, such as Li-ion batteries, ultracapacitors, and fuel cells, are also considered. A detailed vehicle model is necessary for energy flow analysis and vehicle performance simulation. Computer simulation is indispensable for examining the vast hybrid electric vehicle design space with the aim of predicting vehicle performance over driving profiles and estimating fuel consumption and pollutant emissions. Various types of mathematical models and simulators are available for system simulation of vehicle propulsion. One standard method for modeling the complete vehicle powertrain is ''backward quasistatic modeling''. In this method, vehicle subsystems are defined by empirical models in the form of look-up tables and efficiency maps, and the interaction between adjacent subsystems of the vehicle is defined through the amount of power flow. Modeling of vehicle subsystems such as the motor, engine, gearbox, and battery under this technique is based on block diagrams. The vehicle model is applied in two case studies to evaluate vehicle performance and fuel consumption. In the first case study the effect
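
    To make the ''backward quasistatic'' idea concrete, the sketch below propagates a wheel torque/speed demand backward through a fixed gear ratio and a motor efficiency map (a look-up table) to obtain battery power for one time step. The map values, gear ratio, and nearest-node lookup are illustrative placeholders, not data from the cited study.

      import numpy as np

      # Illustrative motor efficiency map: rows = torque grid, columns = speed grid
      torque_grid = np.array([0.0, 50.0, 100.0, 150.0, 200.0])     # N*m
      speed_grid = np.array([0.0, 100.0, 200.0, 300.0, 400.0])     # rad/s
      eff_map = np.array([
          [0.70, 0.80, 0.85, 0.86, 0.85],
          [0.75, 0.86, 0.90, 0.91, 0.90],
          [0.78, 0.88, 0.92, 0.93, 0.92],
          [0.78, 0.88, 0.92, 0.93, 0.91],
          [0.76, 0.86, 0.90, 0.91, 0.89],
      ])

      def battery_power(wheel_torque, wheel_speed, gear_ratio=9.0, gear_eff=0.97):
          # Backward step: wheel demand -> gearbox -> motor map -> battery power (W)
          motor_speed = wheel_speed * gear_ratio
          motor_torque = wheel_torque / (gear_ratio * gear_eff)
          # nearest-node table lookup (a real model would interpolate the map)
          i = np.argmin(np.abs(torque_grid - motor_torque))
          j = np.argmin(np.abs(speed_grid - motor_speed))
          eff = eff_map[i, j]
          mech_power = motor_speed * motor_torque
          # motoring: battery supplies the losses; regenerating: battery absorbs less
          return mech_power / eff if mech_power >= 0 else mech_power * eff

      print(f"battery power ~ {battery_power(wheel_torque=800.0, wheel_speed=30.0):.0f} W")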

  16. Multispectral analytical image fusion

    International Nuclear Information System (INIS)

    Stubbings, T.C.

    2000-04-01

    With new and advanced analytical imaging methods emerging, the limits of physical analysis capabilities and of data acquisition quantities are constantly being pushed, placing high demands on the field of scientific data processing and visualisation. Physical analysis methods like Secondary Ion Mass Spectrometry (SIMS) or Auger Electron Spectroscopy (AES) and others are capable of delivering high-resolution multispectral two-dimensional and three-dimensional image data; usually this multispectral data is available in the form of n separate image files, each showing one element or another singular aspect of the sample. There is a strong need for digital image processing methods that enable the analytical scientist, routinely confronted with such amounts of data, to gain rapid insight into the composition of the sample examined, to filter the relevant data, and to integrate the information of numerous separate multispectral images to obtain the complete picture. Sophisticated image processing methods like classification and fusion provide possible solution approaches to this challenge. Classification is a treatment by multivariate statistical means in order to extract analytical information. Image fusion, on the other hand, denotes a process where images obtained from various sensors or at different moments in time are combined to provide a more complete picture of a scene or object under investigation. Both techniques are important for the task of information extraction and integration, and often one technique depends on the other. The overall aim of this thesis is therefore to evaluate the possibilities of both techniques regarding the task of analytical image processing and to find solutions for the integration and condensation of multispectral analytical image data in order to facilitate the interpretation of the enormous amounts of data routinely acquired by modern physical analysis instruments. (author)
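
    As a toy illustration of the two operations described above, the sketch below fuses a stack of per-element maps into one weighted composite and classifies per-pixel spectra with a plain k-means loop. The synthetic data, channel weights, and cluster count are illustrative; real SIMS/AES stacks and production classifiers would of course be richer.

      import numpy as np

      rng = np.random.default_rng(2)

      # Stand-in for n separate elemental maps: stack of shape (channels, height, width)
      n_channels, h, w = 4, 64, 64
      stack = rng.random((n_channels, h, w))
      stack[1, 16:48, 16:48] += 2.0                      # synthetic feature in channel 1

      # Fusion: normalize each channel and combine with weights into one composite image
      lo = stack.min(axis=(1, 2), keepdims=True)
      hi = stack.max(axis=(1, 2), keepdims=True)
      norm = (stack - lo) / (hi - lo)
      weights = np.array([0.1, 0.6, 0.2, 0.1])
      fused = np.tensordot(weights, norm, axes=1)        # shape (h, w)

      # Classification: k-means on per-pixel spectra (multivariate treatment)
      pixels = norm.reshape(n_channels, -1).T            # (h*w, n_channels)
      k = 2
      centers = pixels[rng.choice(len(pixels), k, replace=False)]
      for _ in range(20):
          labels = np.argmin(((pixels[:, None, :] - centers) ** 2).sum(-1), axis=1)
          centers = np.array([pixels[labels == c].mean(axis=0) for c in range(k)])

      print("fused image shape:", fused.shape, "| class counts:", np.bincount(labels))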

  17. Applying Advanced Analytical Approaches to Characterize the Impact of Specific Clinical Gaps and Profiles on the Management of Rheumatoid Arthritis.

    Science.gov (United States)

    Ruiz-Cordell, Karyn D; Joubin, Kathy; Haimowitz, Steven

    2016-01-01

    The goal of this study was to add a predictive modeling approach to the meta-analysis of continuing medical education curricula to determine whether this technique can be used to better understand clinical decision making. Using the education of rheumatologists on rheumatoid arthritis management as a model, this study demonstrates how the combined methodology has the ability to not only characterize learning gaps but also identify those proficiency areas that have the greatest impact on clinical behavior. The meta-analysis included seven curricula with 25 activities. Learners who identified as rheumatologists were evaluated across multiple learning domains, using a uniform methodology to characterize learning gains and gaps. A performance composite variable (called the treatment individualization and optimization score) was then established as a target upon which predictive analytics were conducted. Significant predictors of the target included items related to the knowledge of rheumatologists and confidence concerning 1) treatment guidelines and 2) tests that measure disease activity. In addition, a striking demographic predictor related to geographic practice setting was also identified. The results demonstrate the power of advanced analytics to identify key predictors that influence clinical behaviors. Furthermore, the ability to provide an expected magnitude of change if these predictors are addressed has the potential to substantially refine educational priorities to those drivers that, if targeted, will most effectively overcome clinical barriers and lead to the greatest success in achieving treatment goals.

  18. Analytical mechanics for relativity and quantum mechanics

    CERN Document Server

    Johns, Oliver Davis

    2011-01-01

    Analytical Mechanics for Relativity and Quantum Mechanics is an innovative and mathematically sound treatment of the foundations of analytical mechanics and the relation of classical mechanics to relativity and quantum theory. It is intended for use at the introductory graduate level. A distinguishing feature of the book is its integration of special relativity into the teaching of classical mechanics. After a thorough review of the traditional theory, Part II of the book introduces extended Lagrangian and Hamiltonian methods that treat time as a transformable coordinate rather than the fixed parameter of Newtonian physics. Advanced topics such as covariant Lagrangians and Hamiltonians, canonical transformations, and Hamilton-Jacobi methods are simplified by the use of this extended theory. And the definition of canonical transformation no longer excludes the Lorentz transformation of special relativity. This is also a book for those who study analytical mechanics to prepare for a critical exploration of quantum...

  19. MASCOTTE: analytical model of eddy current signals

    International Nuclear Information System (INIS)

    Delsarte, G.; Levy, R.

    1992-01-01

    Tube examination is a major application of the eddy current technique in the nuclear and petrochemical industries. Such examination configurations being specially adapted to analytical modes, a physical model is developed on portable computers. It includes simple approximations made possible by the effective conditions of the examinations. The eddy current signal is described by an analytical formulation that takes into account the tube dimensions, the sensor conception, the physical characteristics of the defect and the examination parameters. Moreover, the model makes it possible to associate real signals and simulated signals

  20. Advanced Simulation and Computing Business Plan

    Energy Technology Data Exchange (ETDEWEB)

    Rummel, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners—partners on whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  1. Analytic equation of state for FCC C60 solid based on analytic mean-field potential approach

    International Nuclear Information System (INIS)

    Sun Jiuxun

    2006-01-01

    The analytic mean-field approach (AMFP) was applied to the FCC C60 solid. For the intermolecular forces the Girifalco potential has been utilized. The analytic expressions for the Helmholtz free energy, internal energy and equation of state have been derived. The numerical results for the thermodynamic quantities are compared with molecular dynamics (MD) simulations and the unsymmetrized self-consistent field approach (CUSF) in the literature. It is shown that our AMFP results are in good agreement with the MD data both at low and high temperatures. The results of CUSF are in accordance with the AMFP at low temperature, but at high temperature the difference becomes prominent. In particular, the AMFP predicts that the FCC C60 solid is stable up to 2202 K, the spinodal temperature, in good agreement with 2320 K from the MD simulation, whereas CUSF gives just 1916 K, a temperature evidently lower than the MD data. The AMFP qualifies as a useful approach that can reasonably account for the anharmonic effects at high temperature.

  2. Advances in classical and analytical mechanics: A reviews of author’s results

    Directory of Open Access Journals (Sweden)

    Hedrih-Stevanović Katica R.

    2013-01-01

    Full Text Available A review, reflecting the author's subjective choice, of the author's scientific results in the areas of classical mechanics, analytical mechanics of discrete hereditary systems, analytical mechanics of discrete fractional-order system vibrations, elastodynamics, nonlinear dynamics, and hybrid system dynamics is presented. The main original results are presented through the mathematical methods of mechanics, with examples of applications to solving problems of real mechanical system dynamics abstracted to theoretical models of discrete or continuum mechanical systems, as well as hybrid systems. The paper also presents a series of methods and scientific results authored by professors Mitropolyski, Andjelić and Rašković, as well as the author's original research results obtained with the methods of her professors. The vector method based on mass inertia moment vectors and corresponding deviational vector components for a pole and an oriented axis, defined in 1991 by K. Hedrih, is presented. Results in the construction of the analytical dynamics of hereditary discrete systems, obtained in collaboration with O. A. Gorosho, are presented, as well as a selection of results by the author's postgraduate students and doctoral candidates in the area of nonlinear dynamics. A list of scientific projects headed by the author is given, together with a list of doctoral dissertations and magister of science theses containing research results obtained under the supervision of the author or of her first doctoral candidates. [Projekat Ministarstva nauke Republike Srbije, br. ON174001: Dynamics of hybrid systems with complex structures

  3. Distributed data networks: a blueprint for Big Data sharing and healthcare analytics.

    Science.gov (United States)

    Popovic, Jennifer R

    2017-01-01

    This paper defines the attributes of distributed data networks and outlines the data and analytic infrastructure needed to build and maintain a successful network. We use examples from one successful implementation of a large-scale, multisite, healthcare-related distributed data network, the U.S. Food and Drug Administration-sponsored Sentinel Initiative. Analytic infrastructure-development concepts are discussed from the perspective of promoting six pillars of analytic infrastructure: consistency, reusability, flexibility, scalability, transparency, and reproducibility. This paper also introduces one use case for machine learning algorithm development to fully utilize and advance the portfolio of population health analytics, particularly those using multisite administrative data sources. © 2016 New York Academy of Sciences.

  4. Analytical Chemistry Division annual progress report for period ending December 31, 1988

    Energy Technology Data Exchange (ETDEWEB)

    1988-05-01

    The Analytical Chemistry Division of Oak Ridge National Laboratory (ORNL) is a large and diversified organization. As such, it serves a multitude of functions for a clientele that exists both in and outside of ORNL. These functions fall into the following general categories: (1) Analytical Research, Development, and Implementation. The division maintains a program to conceptualize, investigate, develop, assess, improve, and implement advanced technology for chemical and physicochemical measurements. Emphasis is on problems and needs identified with ORNL and Department of Energy (DOE) programs; however, attention is also given to advancing the analytical sciences themselves. (2) Programmatic Research, Development, and Utilization. The division carries out a wide variety of chemical work that typically involves analytical research and/or development plus the utilization of analytical capabilities to expedite programmatic interests. (3) Technical Support. The division performs chemical and physicochemical analyses of virtually all types. The Analytical Chemistry Division is organized into four major sections, each of which may carry out any of the three types of work mentioned above. Chapters 1 through 4 of this report highlight progress within the four sections during the period January 1 to December 31, 1988. A brief discussion of the division's role in an especially important environmental program is given in Chapter 5. Information about quality assurance, safety, and training programs is presented in Chapter 6, along with a tabulation of analyses rendered. Publications, oral presentations, professional activities, educational programs, and seminars are cited in Chapters 7 and 8.

  5. Analytical Chemistry Division annual progress report for period ending December 31, 1988

    International Nuclear Information System (INIS)

    1988-05-01

    The Analytical Chemistry Division of Oak Ridge National Laboratory (ORNL) is a large and diversified organization. As such, it serves a multitude of functions for a clientele that exists both in and outside of ORNL. These functions fall into the following general categories: (1) Analytical Research, Development, and Implementation. The division maintains a program to conceptualize, investigate, develop, assess, improve, and implement advanced technology for chemical and physicochemical measurements. Emphasis is on problems and needs identified with ORNL and Department of Energy (DOE) programs; however, attention is also given to advancing the analytical sciences themselves. (2) Programmatic Research, Development, and Utilization. The division carries out a wide variety of chemical work that typically involves analytical research and/or development plus the utilization of analytical capabilities to expedite programmatic interests. (3) Technical Support. The division performs chemical and physicochemical analyses of virtually all types. The Analytical Chemistry Division is organized into four major sections, each of which may carry out any of the three types of work mentioned above. Chapters 1 through 4 of this report highlight progress within the four sections during the period January 1 to December 31, 1988. A brief discussion of the division's role in an especially important environmental program is given in Chapter 5. Information about quality assurance, safety, and training programs is presented in Chapter 6, along with a tabulation of analyses rendered. Publications, oral presentations, professional activities, educational programs, and seminars are cited in Chapters 7 and 8

  6. Analytical research using synchrotron radiation based techniques

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2015-01-01

    There are many Synchrotron Radiation (SR) based techniques such as X-ray Absorption Spectroscopy (XAS), X-ray Fluorescence Analysis (XRF), SR-Fourier-transform Infrared (SRFTIR), Hard X-ray Photoelectron Spectroscopy (HAXPS) etc. which are increasingly being employed worldwide in analytical research. With the advent of modern synchrotron sources these analytical techniques have been further revitalized and have paved the way for new techniques such as microprobe XRF and XAS, FTIR microscopy, Hard X-ray Photoelectron Spectroscopy (HAXPS) etc. The talk will cover mainly two techniques, XRF and XAS, illustrating their capabilities in analytical research. XRF spectroscopy: XRF spectroscopy is an analytical technique which involves the detection of emitted characteristic X-rays following excitation of the elements within the sample. While electron, particle (protons or alpha particles), or X-ray beams can be employed as the exciting source for this analysis, the use of X-ray beams from a synchrotron source has been instrumental in the advancement of the technique in the areas of microprobe XRF imaging and trace-level compositional characterisation of samples. Synchrotron radiation induced X-ray emission spectroscopy has become competitive with the earlier microprobe and nanoprobe techniques following the advancements in manipulating and detecting these X-rays. There are two important features that contribute to the superb elemental sensitivities of microprobe SR induced XRF: (i) the absence of the continuum (Bremsstrahlung) background radiation that is a feature of spectra obtained from charged particle beams, and (ii) the increased X-ray flux on the sample associated with the use of tunable third generation synchrotron facilities. Detection sensitivities have been reported in the ppb range, with values of 10^-17 g - 10^-14 g (depending on the particular element and matrix). Keeping in mind its demand, a microprobe XRF beamline has been set up by RRCAT at the Indus-2 synchrotron

  7. Development of Computational Approaches for Simulation and Advanced Controls for Hybrid Combustion-Gasification Chemical Looping

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, Abhinaya; Lou, Xinsheng; Neuschaefer, Carl; Chaudry, Majid; Quinn, Joseph

    2012-07-31

    This document provides the results of the project through September 2009. The Phase I project has recently been extended from September 2009 to March 2011. The project extension will begin work on Chemical Looping (CL) Prototype modeling and advanced control design exploration in preparation for a scale-up phase. The results to date include: successful development of dual loop chemical looping process models and dynamic simulation software tools, development and test of several advanced control concepts and applications for Chemical Looping transport control and investigation of several sensor concepts and establishment of two feasible sensor candidates recommended for further prototype development and controls integration. There are three sections in this summary and conclusions. Section 1 presents the project scope and objectives. Section 2 highlights the detailed accomplishments by project task area. Section 3 provides conclusions to date and recommendations for future work.

  8. Handbook of Advanced Magnetic Materials

    CERN Document Server

    Liu, Yi; Shindo, Daisuke

    2006-01-01

    From high-capacity, inexpensive hard drives to mag-lev trains, recent achievements in magnetic materials research have made the dreams of a few decades ago reality. The objective of Handbook of Advanced Magnetic Materials is to provide a timely, comprehensive review of recent progress in magnetic materials research. This broad yet detailed reference consists of four volumes: 1.) Nanostructured advanced magnetic materials, 2.) Characterization and simulation of advanced magnetic materials, 3.) Processing of advanced magnetic materials, and 4.) Properties and applications of advanced magnetic materials The first volume documents and explains recent development of nanostructured magnetic materials, emphasizing size effects. The second volume provides a comprehensive review of both experimental methods and simulation techniques for the characterization of magnetic materials. The third volume comprehensively reviews recent developments in the processing and manufacturing of advanced magnetic materials. With the co...

  9. A Performance Analytical Strategy for Network-on-Chip Router with Input Buffer Architecture

    Directory of Open Access Journals (Sweden)

    WANG, J.

    2012-11-01

    Full Text Available In this paper, a performance analytical strategy is proposed for a Network-on-Chip router with input buffer architecture. First, an analytical model is developed based on a semi-Markov process. For a non-work-conserving router with small buffer size, the model can be used to analyze the schedule delay and the average service time for each buffer, given the related parameters. Then, the average packet delay in the router is calculated using the model. Finally, we validate the effectiveness of our strategy by simulation. By comparing our analytical results to simulation results, we show that our strategy successfully captures the Network-on-Chip router performance and performs better than the state-of-the-art approach. Therefore, our strategy can serve as an efficient performance analysis tool for Network-on-Chip design.
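
    The paper's semi-Markov buffer model is not reproduced here; as a much simpler stand-in for the same kind of analytical-versus-simulation check, the sketch below compares the textbook M/M/1 mean sojourn time with a discrete-event (Lindley recursion) simulation of a single queue. Arrival and service rates are illustrative.

      import numpy as np

      rng = np.random.default_rng(3)

      lam, mu, n = 0.6, 1.0, 200_000                  # arrival rate, service rate, packets
      interarrivals = rng.exponential(1.0 / lam, n)
      services = rng.exponential(1.0 / mu, n)

      # Lindley recursion for waiting times in a single FIFO buffer
      waits = np.zeros(n)
      for i in range(1, n):
          waits[i] = max(0.0, waits[i - 1] + services[i - 1] - interarrivals[i])

      sim_delay = (waits + services).mean()           # waiting + service time
      analytic_delay = 1.0 / (mu - lam)               # M/M/1 mean sojourn time
      print(f"simulated delay ~ {sim_delay:.2f}, analytical ~ {analytic_delay:.2f}")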

  10. Analytical study in 1D nuclear waste migration

    International Nuclear Information System (INIS)

    Perez Guerrero, Jesus S.; Heilbron Filho, Paulo L.; Romani, Zrinka V.

    1999-01-01

    The nuclear waste migration phenomena are governed mainly by a diffusive-convective equation that includes the effects of hydrodynamic dispersion (mechanical dispersion and molecular diffusion), radioactive decay and chemical interaction. For some special problems (depending on the boundary conditions and when the domain is considered infinite or semi-infinite) an analytical solution may be obtained using classical analytical methods such as the Laplace transform or separation of variables. The hybrid Generalized Integral Transform Technique (GITT) is a powerful tool that can be applied to linear diffusive-convective problems to obtain formal analytical solutions. The aim of this work is to illustrate that the GITT may be used to obtain a formal analytical solution for the study of the migration of radioactive waste in saturated porous media. A test case considering the 241Am radionuclide is presented. (author)
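
    For a sense of what such an analytical solution looks like, the sketch below evaluates a standard textbook closed-form solution (van Genuchten-type) of the 1D advection-dispersion equation with first-order decay and a constant-concentration inlet; it is not the GITT solution of the paper, and the transport parameters are illustrative (only the 241Am half-life of about 432 years is a physical constant).

      import numpy as np
      from scipy.special import erfc

      def c_profile(x, t, v=10.0, D=50.0, lam=np.log(2) / 432.0, c0=1.0):
          # Normalized concentration for dC/dt = D d2C/dx2 - v dC/dx - lam*C,
          # with C(0,t) = c0 and C(x,0) = 0 (units: metres and years, illustrative).
          u = v * np.sqrt(1.0 + 4.0 * lam * D / v**2)
          term1 = np.exp(x * (v - u) / (2.0 * D)) * erfc((x - u * t) / (2.0 * np.sqrt(D * t)))
          term2 = np.exp(x * (v + u) / (2.0 * D)) * erfc((x + u * t) / (2.0 * np.sqrt(D * t)))
          return 0.5 * c0 * (term1 + term2)

      x = np.linspace(0.0, 500.0, 6)
      print(np.round(c_profile(x, t=100.0), 4))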

  11. A graph algebra for scalable visual analytics.

    Science.gov (United States)

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
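
    The sketch below illustrates, in the simplest possible form, the two atomic operators the abstract names, selection and aggregation, on a plain node/edge-list graph; the attribute names and data are illustrative and are not taken from the paper's framework.

      from collections import defaultdict

      nodes = {
          "a": {"type": "host", "site": 1}, "b": {"type": "host", "site": 1},
          "c": {"type": "router", "site": 2}, "d": {"type": "host", "site": 2},
      }
      edges = [("a", "b", 3), ("a", "c", 1), ("b", "c", 2), ("c", "d", 5)]

      def select(nodes, edges, predicate):
          # keep nodes satisfying the predicate, plus the edges among them
          kept = {n: attrs for n, attrs in nodes.items() if predicate(attrs)}
          return kept, [(u, v, w) for u, v, w in edges if u in kept and v in kept]

      def aggregate(nodes, edges, key):
          # collapse nodes sharing the same key into super-nodes; sum edge weights
          group = {n: key(attrs) for n, attrs in nodes.items()}
          agg_edges = defaultdict(int)
          for u, v, w in edges:
              gu, gv = group[u], group[v]
              if gu != gv:
                  agg_edges[tuple(sorted((gu, gv)))] += w
          return sorted(set(group.values())), dict(agg_edges)

      print("selected:", select(nodes, edges, lambda a: a["type"] == "host"))
      print("aggregated:", aggregate(nodes, edges, key=lambda a: a["site"]))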

  12. Piezoresistive Cantilever Performance-Part I: Analytical Model for Sensitivity.

    Science.gov (United States)

    Park, Sung-Jin; Doll, Joseph C; Pruitt, Beth L

    2010-02-01

    An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors.

  13. Piezoresistive Cantilever Performance—Part I: Analytical Model for Sensitivity

    Science.gov (United States)

    Park, Sung-Jin; Doll, Joseph C.; Pruitt, Beth L.

    2010-01-01

    An accurate analytical model for the change in resistance of a piezoresistor is necessary for the design of silicon piezoresistive transducers. Ion implantation requires a high-temperature oxidation or annealing process to activate the dopant atoms, and this treatment results in a distorted dopant profile due to diffusion. Existing analytical models do not account for the concentration dependence of piezoresistance and are not accurate for nonuniform dopant profiles. We extend previous analytical work by introducing two nondimensional factors, namely, the efficiency and geometry factors. A practical benefit of this efficiency factor is that it separates the process parameters from the design parameters; thus, designers may address requirements for cantilever geometry and fabrication process independently. To facilitate the design process, we provide a lookup table for the efficiency factor over an extensive range of process conditions. The model was validated by comparing simulation results with the experimentally determined sensitivities of piezoresistive cantilevers. We performed 9200 TSUPREM4 simulations and fabricated 50 devices from six unique process flows; we systematically explored the design space relating process parameters and cantilever sensitivity. Our treatment focuses on piezoresistive cantilevers, but the analytical sensitivity model is extensible to other piezoresistive transducers such as membrane pressure sensors. PMID:20336183

  14. Recent advances in computational-analytical integral transforms for convection-diffusion problems

    Science.gov (United States)

    Cotta, R. M.; Naveira-Cotta, C. P.; Knupp, D. C.; Zotin, J. L. Z.; Pontes, P. C.; Almeida, A. P.

    2017-10-01

    A unifying overview of the Generalized Integral Transform Technique (GITT) as a computational-analytical approach for solving convection-diffusion problems is presented. This work is aimed at bringing together some of the most recent developments on both accuracy and convergence improvements to this well-established hybrid numerical-analytical methodology for partial differential equations. Special emphasis is given to novel algorithm implementations, all directly connected to enhancing the eigenfunction expansion basis, such as a single domain reformulation strategy for handling complex geometries, an integral balance scheme in dealing with multiscale problems, the adoption of convective eigenvalue problems in formulations with significant convection effects, and the direct integral transformation of nonlinear convection-diffusion problems based on nonlinear eigenvalue problems. Then, selected examples are presented that illustrate the improvement achieved in each class of extension, in terms of convergence acceleration and accuracy gain, which are related to conjugated heat transfer in complex or multiscale microchannel-substrate geometries, the multidimensional Burgers equation model, and diffusive metal extraction through polymeric hollow fiber membranes. Numerical results are reported for each application and, where appropriate, critically compared against the traditional GITT scheme without convergence enhancement schemes and commercial or dedicated purely numerical approaches.

  15. Simulations of Liners and Test Objects for a New Atlas Advanced Radiography Source

    International Nuclear Information System (INIS)

    Morgan, D. V.; Iversen, S.; Hilko, R. A.

    2002-01-01

    The Advanced Radiographic Source (ARS) will improve the data significantly due to its smaller source width. Because of the enhanced ARS output, larger source-to-object distances are a reality. The harder ARS source will allow radiography of thick high-Z targets. The five different spectral simulations resulted in similar imaging detector weighted transmission. This work used a limited set of test objects and imaging detectors. Other test objects and imaging detectors could possibly change the MVp-sensitivity result. The effect of material motion blur must be considered for the ARS due to the expected smaller X-ray source size. This study supports the original 1.5-MVp value

  16. Qualification of RETRAN for simulator applications

    International Nuclear Information System (INIS)

    Harrison, J.F.

    1988-01-01

    The use of full-scope control room replica simulators increased substantially following the accident at Three Mile Island Unit 2. The technical capability required to represent severe events has been included, in varying degrees, in most simulators purchased since the TMI-2 accident. The ability of the instructor to create a large variety of combinations of malfunctions has also greatly expanded. The nuclear industry has developed a standard which establishes the minimum functional requirements for full-scope nuclear control room simulators used for operator training. This standard, ANSI/ANS-3.5, was first issued in 1981 and was reissued in 1985. A method for performing simulator qualification with best estimate analytical data has been proposed in EPRI NP-4243, Analytic Simulator Qualification Methodology. The idea presented there is to choose a set of transients which drive the simulator into all the system conditions (dynamic states) likely to be encountered during operator training. The key observable parameters for each state are compared to analyses performed with the best estimate analytical model. The closeness of the comparison determines the fidelity of the simulator. The approach described in EPRI NP-4243 has been adapted for evaluating RETRAN's capability for use in simulator qualification. RETRAN analyses which compare the RETRAN results to plant or test facility data are evaluated with respect to the simulator test matrix documented in EPRI NP-4243.

  17. Gravitational waveforms for neutron star binaries from binary black hole simulations

    Science.gov (United States)

    Barkett, Kevin; Scheel, Mark; Haas, Roland; Ott, Christian; Bernuzzi, Sebastiano; Brown, Duncan; Szilagyi, Bela; Kaplan, Jeffrey; Lippuner, Jonas; Muhlberger, Curran; Foucart, Francois; Duez, Matthew

    2016-03-01

    Gravitational waves from binary neutron star (BNS) and black-hole/neutron star (BHNS) inspirals are primary sources for detection by the Advanced Laser Interferometer Gravitational-Wave Observatory. The tidal forces acting on the neutron stars induce changes in the phase evolution of the gravitational waveform, and these changes can be used to constrain the nuclear equation of state. Current methods of generating BNS and BHNS waveforms rely on either computationally challenging full 3D hydrodynamical simulations or approximate analytic solutions. We introduce a new method for computing inspiral waveforms for BNS/BHNS systems by adding the post-Newtonian (PN) tidal effects to full numerical simulations of binary black holes (BBHs), effectively replacing the non-tidal terms in the PN expansion with BBH results. Comparing a waveform generated with this method against a full hydrodynamical simulation of a BNS inspiral yields a phase difference of < 1 radian over ~ 15 orbits. The numerical phase accuracy required of BNS simulations to measure the accuracy of the method we present here is estimated as a function of the tidal deformability parameter λ.

  18. Computer simulation of two-phase flow in nuclear reactors

    International Nuclear Information System (INIS)

    Wulff, W.

    1993-01-01

    Two-phase flow models dominate the economic resources required for the development and use of computer codes that serve to analyze thermohydraulic transients in nuclear power plants. An attempt is made to reduce the effort of analyzing reactor transients by combining purpose-oriented modelling with advanced computing techniques. Six principles are presented on mathematical modeling and the selection of numerical methods, along with suggestions on programming and machine selection, all aimed at reducing the cost of analysis. Computer simulation is contrasted with traditional computer calculation. The advantages of run-time interactive access operation in a simulation environment are demonstrated. It is explained that the drift-flux model is better suited than the two-fluid model for the analysis of two-phase flow in nuclear reactors, because of the latter's closure problems. The advantage of analytical over numerical integration is demonstrated. Modeling and programming techniques are presented which minimize the number of needed arithmetical and logical operations and thereby increase the simulation speed, while decreasing the cost. (orig.)

  19. Applicability of the Analytical Solution to N-Person Social Dilemma Games

    Directory of Open Access Journals (Sweden)

    Ugo Merlone

    2018-05-01

    Full Text Available The purpose of this study is to present an analysis of the applicability of an analytical solution to the N−person social dilemma game. Such a solution was developed earlier for Pavlovian agents in a cellular automaton environment with linear payoff functions and has also been verified using agent-based simulation. However, no discussion has been offered of the applicability of this result to all Prisoners' Dilemma game scenarios or to other N−person social dilemma games such as Chicken or Stag Hunt. In this paper it is shown that the analytical solution works in all social games where the linear payoff functions are such that each agent's cooperating probability fluctuates around the analytical solution without cooperating or defecting with certainty. The social game regions where this determination holds are explored by varying the payoff function parameters. It is found, by both simulation and a special method, that the analytical solution applies best in Chicken when the payoff parameter S is slightly negative, and then the analytical solution slowly degrades as S becomes more negative. It turns out that the analytical solution is only a good estimate for Prisoners' Dilemma games and again becomes worse as S becomes more negative. A sensitivity analysis is performed to determine the impact of different initial cooperating probabilities, learning factors, and neighborhood sizes.
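
    The sketch below is a minimal agent-based version of the setting described above: Pavlovian agents with a Bush-Mosteller-style win-stay/lose-shift update playing an N-person game with linear payoff functions (cooperators earn S + (R - S)x and defectors P + (T - P)x, where x is the cooperating fraction). The update rule, aspiration level, and payoff parameters are illustrative assumptions and are not necessarily those of the cited analysis.

      import numpy as np

      rng = np.random.default_rng(4)

      def simulate(n_agents=100, steps=2000, learn=0.05, aspiration=0.5,
                   R=1.0, S=-0.2, T=1.4, P=0.0):
          p = np.full(n_agents, 0.5)                     # cooperation probabilities
          coop_levels = []
          for _ in range(steps):
              coop = rng.random(n_agents) < p
              x = coop.mean()                            # fraction of cooperators
              payoff = np.where(coop, S + (R - S) * x, P + (T - P) * x)
              s = np.tanh(payoff - aspiration)           # satisfaction in (-1, 1)
              q = np.where(coop, p, 1.0 - p)             # prob. of the action just taken
              q = np.where(s >= 0, q + learn * s * (1.0 - q), q + learn * s * q)
              p = np.clip(np.where(coop, q, 1.0 - q), 0.01, 0.99)
              coop_levels.append(x)
          return np.mean(coop_levels[-200:])

      print(f"long-run cooperation level ~ {simulate():.2f}")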

  20. Design and simulation of advanced charge recovery piezoactuator drivers

    International Nuclear Information System (INIS)

    Biancuzzi, G; Lemke, T; Woias, P; Goldschmidtboeing, F; Ruthmann, O; Schrag, H J; Vodermayer, B; Schmid, T

    2010-01-01

    The German Artificial Sphincter System project aims at the development of an implantable sphincter prosthesis driven by a piezoelectrically actuated micropump. The system has been designed to be fully implantable, i.e. the power supply is provided by a rechargeable lithium polymer battery. In order to provide sufficient battery duration and to limit battery dimensions, special effort has to be made to minimize power consumption of the whole system and, in particular, of the piezoactuator driver circuitry. Inductive charge recovery can be used to recover part of the charge stored within the actuator. We are going to present a simplified inductor-based circuit capable of voltage inversion across the actuator without the need of an additional negative voltage source. The dimension of the inductors required for such a concept is nevertheless significant. We therefore present a novel alternative concept, called direct switching, where the equivalent capacitance of the actuator is charged directly by a step-up converter and discharged by a step-down converter. We achieved superior performance compared to a simple inductor-based driver with the advantage of using small-size chip inductors. As a term of comparison, the performance of the aforementioned drivers is compared to a conventional driver that does not implement any charge recovery technique. With our design we have been able to achieve more than 50% reduction in power consumption compared to the simplest conventional driver. The new direct switching driver performs 15% better than an inductor-based driver. A novel, whole-system SPICE simulation is presented, where both the driving circuit and the piezoactuator are modeled making use of advanced nonlinear models. Such a simulation is a precious tool to design and optimize piezoactuator drivers

  1. NASA Advanced Supercomputing Facility Expansion

    Science.gov (United States)

    Thigpen, William W.

    2017-01-01

    The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.

  2. Three lessons for genetic toxicology from baseball analytics.

    Science.gov (United States)

    Dertinger, Stephen D

    2017-07-01

    In many respects the evolution of baseball statistics mirrors advances made in the field of genetic toxicology. From its inception, baseball and statistics have been inextricably linked. Generations of players and fans have used a number of relatively simple measurements to describe team and individual player's current performance, as well as for historical record-keeping purposes. Over the years, baseball analytics has progressed in several important ways. Early advances were based on deriving more meaningful metrics from simpler forerunners. Now, technological innovations are delivering much deeper insights. Videography, radar, and other advances that include automatic player recognition capabilities provide the means to measure more complex and useful factors. Fielders' reaction times, efficiency of the route taken to reach a batted ball, and pitch-framing effectiveness come to mind. With the current availability of complex measurements from multiple data streams, multifactorial analyses occurring via machine learning algorithms have become necessary to make sense of the terabytes of data that are now being captured in every Major League Baseball game. Collectively, these advances have transformed baseball statistics from being largely descriptive in nature to serving data-driven, predictive roles. Whereas genetic toxicology has charted a somewhat parallel course, a case can be made that greater utilization of baseball's mindset and strategies would serve our scientific field well. This paper describes three useful lessons for genetic toxicology, courtesy of the field of baseball analytics: seek objective knowledge; incorporate multiple data streams; and embrace machine learning. Environ. Mol. Mutagen. 58:390-397, 2017. © 2017 Wiley Periodicals, Inc.

  3. Biosensors: Future Analytical Tools

    Directory of Open Access Journals (Sweden)

    Vikas

    2007-02-01

    Full Text Available Biosensors offer considerable promise for obtaining analytical information in a faster, simpler and cheaper manner compared to conventional assays. The biosensing approach is rapidly advancing, and applications ranging from metabolite, biological/chemical warfare agent, food pathogen and adulterant detection to genetic screening and programmed drug delivery have been demonstrated. Innovative efforts coupling micromachining and nanofabrication may lead to even more powerful devices that would accelerate the realization of large-scale and routine screening. With a gradual increase in commercialization, a wide range of new biosensors is thus expected to reach the market in the coming years.

  4. SINGLE PHASE ANALYTICAL MODELS FOR TERRY TURBINE NOZZLE

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Haihua; Zhang, Hongbin; Zou, Ling; O' Brien, James

    2016-11-01

    All BWR RCIC (Reactor Core Isolation Cooling) systems and PWR AFW (Auxiliary Feed Water) systems use a Terry turbine, which is composed of a wheel with turbine buckets and several groups of fixed nozzles and reversing chambers inside the turbine casing. The inlet steam is accelerated through the turbine nozzle and impacts on the wheel buckets, generating work to drive the RCIC pump. As part of the effort to understand the unexpected “self-regulating” mode of the RCIC systems in the Fukushima accidents and to extend the BWR RCIC and PWR AFW operational range and flexibility, mechanistic models for the Terry turbine, based on Sandia National Laboratories’ original work, have been developed and implemented in the RELAP-7 code to simulate the RCIC system. RELAP-7 is a new reactor system code currently under development with funding support from the U.S. Department of Energy. The RELAP-7 code is a fully implicit code, and the preconditioned Jacobian-free Newton-Krylov (JFNK) method is used to solve the discretized nonlinear system. This paper presents a set of analytical models for simulating the flow through the Terry turbine nozzles when the inlet fluid is pure steam. The implementation of the models into RELAP-7 is briefly discussed. In the Sandia model, the turbine bucket inlet velocity is provided by a reduced-order model obtained from a large number of CFD simulations. In this work, we propose an alternative method, using an under-expanded jet model to obtain the velocity and thermodynamic conditions at the turbine bucket inlet. The models include both the adiabatic expansion process inside the nozzle and the free expansion process out of the nozzle to reach the ambient pressure. The combined models are able to predict the steam mass flow rate and the supersonic velocity at the Terry turbine bucket entrance, which are the necessary input conditions for the Terry turbine rotor model. The nozzle analytical models were validated with experimental data and
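
    The following sketch gives a flavor of this kind of nozzle estimate using the simplest possible closure: ideal-gas isentropic relations with an assumed gamma of about 1.3 for steam, which is a rough simplification of the adiabatic-plus-free-expansion treatment described in the record. Inlet conditions and gas properties are illustrative, not values from the paper.

      import numpy as np

      def nozzle_exit_estimate(p0=7.0e6, t0=559.0, p_amb=1.0e5, gamma=1.3, r_gas=461.5):
          # Ideal-gas isentropic estimate: exit velocity for expansion to ambient
          # pressure, and choked mass flux per unit throat area. Steam treated as an
          # ideal gas (gamma ~1.3, R ~461.5 J/kg/K) -- a rough simplification.
          cp = gamma * r_gas / (gamma - 1.0)
          v_exit = np.sqrt(2.0 * cp * t0 * (1.0 - (p_amb / p0) ** ((gamma - 1.0) / gamma)))
          rho0 = p0 / (r_gas * t0)
          flux = np.sqrt(gamma * rho0 * p0 * (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (gamma - 1.0)))
          return v_exit, flux

      v, g = nozzle_exit_estimate()
      print(f"exit velocity ~ {v:.0f} m/s, choked mass flux ~ {g:.0f} kg/(s*m^2)")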

  5. Simulation of an advanced small aperture track system

    Science.gov (United States)

    Williams, Tommy J.; Crockett, Gregg A.; Brunson, Richard L.; Beatty, Brad; Zahirniak, Daniel R.; Deuto, Bernard G.

    2001-08-01

    Simulation development for EO systems has progressed to new levels with the advent of COTS software tools such as Matlab/Simulink. These tools allow rapid reuse of simulation library routines. We have applied these tools to newly emerging Acquisition, Tracking and Pointing (ATP) systems using many routines developed through a legacy of High Energy Laser programs such as AirBorne Laser, Space Based Laser, Tactical High Energy Laser, and the Air Force Research Laboratory projects associated with the Starfire Optical Range. The simulation architecture allows ease in testing various track algorithms under simulated scenes with the ability to rapidly vary system hardware parameters such as the track sensor and track loop control systems. The atmospheric turbulence environment and associated optical distortion are simulated to high fidelity through the application of an atmospheric phase screen model to produce scintillation of the laser illuminator uplink. The particular ATP system simulated is a small transportable system for tracking satellites in a daytime environment; it projects a low power laser and receives the laser return from retro-reflector equipped satellites. The primary application of the ATP system (and therefore the simulation) is the determination of the illuminator beam profile, jitter, and scintillation of the low power laser at the satellite. The ATP system will serve as a test bed for satellite tracking against a high background during daytime. Of particular interest in this simulation is the ability to emulate the hardware mode logic within the simulation to test and refine system states and mode change decisions. Additionally, the simulation allows data from the hardware system tests to be imported into Matlab and to thereby drive the simulation or to be easily compared to simulation results.

  6. Advances in Assays and Analytical Approaches for Botulinum Toxin Detection

    Energy Technology Data Exchange (ETDEWEB)

    Grate, Jay W.; Ozanich, Richard M.; Warner, Marvin G.; Bruckner-Lea, Cindy J.; Marks, James D.

    2010-08-04

    Methods to detect botulinum toxin, the most poisonous substance known, are reviewed. Current assays are being developed with two main objectives in mind: 1) to obtain sufficiently low detection limits to replace the mouse bioassay with an in vitro assay, and 2) to develop rapid assays for screening purposes that are as sensitive as possible while requiring an hour or less to process the sample and obtain the result. This review emphasizes the diverse analytical approaches and devices that have been developed over the last decade, while also briefly reviewing representative older immunoassays to provide background and context.

  7. -Omic and Electronic Health Records Big Data Analytics for Precision Medicine

    Science.gov (United States)

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.

    2017-01-01

    Objective Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion Big data analytics is able to address –omic and EHR data challenges for paradigm shift towards precision medicine. Significance Big data analytics makes sense of –omic and EHR data to improve healthcare outcome. It has long lasting societal impact. PMID:27740470

  8. Interactive visualization to advance earthquake simulation

    Science.gov (United States)

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, to evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  9. Facilitating Multiple Intelligences Through Multimodal Learning Analytics

    Directory of Open Access Journals (Sweden)

    Ayesha PERVEEN

    2018-01-01

    Full Text Available This paper develops a theoretical framework for employing learning analytics in online education to trace multiple learning variations of online students, treating them as learners with multiple intelligences in the sense of Howard Gardner's 1983 theory. The study first emphasizes the need for online education systems to facilitate students' multiple intelligences and then suggests a framework based on an advanced form of learning analytics, i.e., multimodal learning analytics, for tracing and facilitating multiple intelligences while students are engaged in online ubiquitous learning. As multimodal learning analytics is still an evolving area, it poses many challenges for technologists, educationists and organizational managers alike. Learning analytics bring machines and humans together; educationists with expertise in learning theories can therefore help technologists devise the latest technological methods for multimodal learning analytics, and organizational managers can implement them for the improvement of online education. A careful instructional design, based on a deep understanding of students' learning abilities, is thus required to develop teaching plans and technological possibilities for monitoring students' learning paths. In this way learning analytics can help design adaptive instruction based on a quick analysis of the gathered data. Based on that analysis, academicians can critically reflect on the quick or delayed implementation of the existing instructional design in light of students' cognitive abilities, or on single- versus double-loop learning design. The researcher concludes that online education is multimodal in nature and has the capacity to endorse multiliteracies, and that multiple intelligences can therefore be tracked and facilitated through multimodal learning analytics in an online mode. However, online teachers' training both in technological implementations and

  10. An advanced constitutive model in the sheet metal forming simulation: the Teodosiu microstructural model and the Cazacu Barlat yield criterion

    International Nuclear Information System (INIS)

    Alves, J.L.; Oliveira, M.C.; Menezes, L.F.

    2004-01-01

    Two constitutive models used to describe the plastic behavior of sheet metals in the numerical simulation of sheet metal forming processes are studied: a recently proposed advanced constitutive model based on the Teodosiu microstructural model and the Cazacu Barlat yield criterion is compared with a more classical one, based on the Swift law and the Hill 1948 yield criterion. These constitutive models are implemented into DD3IMP, a finite element home code specifically developed to simulate sheet metal forming processes; it is a 3-D elastoplastic finite element code with an updated Lagrangian formulation and a fully implicit time integration scheme, accounting for large elastoplastic strains and rotations. Solid finite elements and parametric surfaces are used to model the blank sheet and tool surfaces, respectively. Some details of the numerical implementation of the constitutive models are given. Finally, the theory is illustrated with the numerical simulation of the deep drawing of a cylindrical cup. The results show that the proposed advanced constitutive model predicts more accurately the final shape (mean height and ear profile) of the formed part, as one can conclude from the comparison with the experimental results.
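
    For reference, the classical pair mentioned above is compact enough to state directly: the Swift law gives the flow stress as a power of accumulated plastic strain, and the Hill 1948 criterion is a quadratic equivalent stress. The sketch below is a minimal illustration of both; the parameter values are placeholders, not the material data used in the record.

```python
import numpy as np

def swift_flow_stress(eps_p, C=500.0e6, eps0=5.0e-4, n=0.26):
    """Swift hardening law: flow stress [Pa] vs. equivalent plastic strain.
    C, eps0 and n are placeholder material constants."""
    return C * (eps0 + eps_p) ** n

def hill48_equivalent_stress(sig, F=0.5, G=0.5, H=0.5, L=1.5, M=1.5, N=1.5):
    """Hill (1948) quadratic equivalent stress for sig = [sxx, syy, szz, syz, szx, sxy].
    With F=G=H=0.5 and L=M=N=1.5 it reduces to von Mises."""
    sxx, syy, szz, syz, szx, sxy = sig
    q = (F * (syy - szz) ** 2 + G * (szz - sxx) ** 2 + H * (sxx - syy) ** 2
         + 2.0 * L * syz ** 2 + 2.0 * M * szx ** 2 + 2.0 * N * sxy ** 2)
    return np.sqrt(q)

# Example: has a uniaxial stress state reached yield at eps_p = 0.05?
sig = np.array([600.0e6, 0.0, 0.0, 0.0, 0.0, 0.0])
print(hill48_equivalent_stress(sig) >= swift_flow_stress(0.05))
```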

  11. Gear Mesh Loss-of-Lubrication Experiments and Analytical Simulation

    Science.gov (United States)

    Handschuh, Robert F.; Polly, Joseph; Morales, Wilfredo

    2011-01-01

    An experimental program to determine the loss-of-lubrication (LOL) characteristics of spur gears in an aerospace simulation test facility has been completed. Tests were conducted using two different emergency lubricant types: (1) an oil mist system (two different misted lubricants) and (2) a grease injection system (two different grease types). Tests were conducted using a NASA Glenn test facility normally used for contact fatigue testing. Tests were run at rotational speeds up to 10,000 rpm using two different gear designs and two different gear materials. For the tests conducted using the air-oil misting system, a minimum lubricant injection rate was determined that permits the gear mesh to operate without failure for at least 1 hr. The tests allowed an elevated steady-state temperature to be established. A basic 2-D heat transfer simulation has been developed to investigate the temperatures of a simulated gear as a function of frictional behavior. The friction (heat generation source) between the meshing surfaces is related to the position in the meshing cycle, the load applied, and the amount of lubricant in the contact. Experimental conditions will be compared to those from the 2-D simulation.
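
    As a point of reference for the kind of 2-D thermal calculation described, the sketch below advances a simple explicit finite-difference conduction model with a frictional heat flux applied along one edge standing in for the gear mesh contact. It is illustrative only; the material properties, flux magnitude and boundary treatment are assumptions, not values from the NASA study.

```python
import numpy as np

nx, ny = 60, 40
dx = dy = 1.0e-3                        # grid spacing [m]
k, rho, cp = 45.0, 7850.0, 470.0        # steel-like conductivity, density, heat capacity
alpha = k / (rho * cp)
dt = 0.2 * min(dx, dy) ** 2 / alpha     # time step chosen for explicit stability
q_flux = 2.0e5                          # frictional heat flux on the contact edge [W/m^2]
T_amb = 300.0

T = np.full((ny, nx), T_amb)
for step in range(20000):
    Tn = T.copy()
    # interior update (explicit FTCS Laplacian)
    Tn[1:-1, 1:-1] = T[1:-1, 1:-1] + alpha * dt * (
        (T[2:, 1:-1] + T[:-2, 1:-1] - 2 * T[1:-1, 1:-1]) / dy ** 2
        + (T[1:-1, 2:] + T[1:-1, :-2] - 2 * T[1:-1, 1:-1]) / dx ** 2)
    # heated (contact) edge: conduction into the body plus the imposed flux
    Tn[0, 1:-1] = T[0, 1:-1] + alpha * dt * (
        2.0 * (T[1, 1:-1] - T[0, 1:-1]) / dy ** 2
        + (T[0, 2:] + T[0, :-2] - 2 * T[0, 1:-1]) / dx ** 2) \
        + q_flux * dt / (rho * cp * dy)
    # remaining edges held at ambient temperature
    Tn[-1, :] = T_amb
    Tn[:, 0] = T_amb
    Tn[:, -1] = T_amb
    T = Tn
print(round(float(T.max()) - T_amb, 1))   # peak temperature rise [K]
```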

  12. Co-Simulation of an Inverter Fed Permanent Magnet Synchronous Machine

    Directory of Open Access Journals (Sweden)

    Kiss Gergely Máté

    2014-10-01

    Full Text Available Co-simulation is a method which makes it possible to study the electric machine and its drive at once, as one system. By taking into account the actual inverter voltage waveforms in a finite element model, instead of using only the fundamental, we are able to study the electrical machine's behavior in a more realistic scenario. The recent increase in the use of variable speed drives justifies research on such simulation techniques. In this paper we present the co-simulation of an inverter fed permanent magnet synchronous machine. The modelling method employs an analytical variable speed drive model and a finite element electrical machine model. By linking the analytical variable speed drive model with the finite element model, the combined simulation model enables the investigation of the electrical machine during actual operation. The methods are coupled via their results: the output of the finite element model serves as an input to the analytical model, and the output of the analytical model provides the input of the finite element model for a further simulation, thus enabling the finite element simulation of an inverter fed machine. The speed and torque characteristics obtained from the analytical model and the finite element model show good agreement. The experience with the co-simulation technique encourages further research and effort to improve the method.
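
    The data-exchange pattern described above can be shown with a toy loop: one model supplies the inverter voltage, the other returns current and torque, and a mechanical equation closes the loop. Both models in this Python sketch are deliberately simple stand-ins (first-order placeholders, not an FE solver), so only the coupling structure should be read from it.

```python
import numpy as np

dt, t_end = 1.0e-4, 0.2
J, b, T_load = 0.01, 0.002, 1.0            # inertia, friction, load torque (placeholders)
k_t, k_e, R, L = 0.8, 0.8, 1.2, 5.0e-3     # toy machine constants (placeholders)

def drive_model(speed_ref, speed, v_dc=300.0):
    """Analytical drive stand-in: a limited proportional voltage command."""
    return float(np.clip(20.0 * (speed_ref - speed), -v_dc, v_dc))

def machine_model(v, i, speed):
    """Stand-in for the finite element machine model: returns (di/dt, torque)."""
    di = (v - R * i - k_e * speed) / L
    return di, k_t * i

speed, i = 0.0, 0.0
for t in np.arange(0.0, t_end, dt):
    v = drive_model(speed_ref=150.0, speed=speed)    # drive model -> machine model
    di, torque = machine_model(v, i, speed)          # machine model -> drive model (next step)
    i += di * dt
    speed += (torque - T_load - b * speed) / J * dt  # mechanical equation closes the loop
print(round(speed, 1))                               # speed reached by the toy system
```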

  13. Use of analytical aids for accident management

    International Nuclear Information System (INIS)

    Ward, L.W.

    1991-01-01

    The use of analytical aids by utility technical support teams can enhance the staff's ability to manage accidents. Since instrumentation is exposed to environments beyond design-basis conditions, instruments may provide ambiguous information or may even fail. While it is most likely that many instruments will remain operable, their ability to provide unambiguous information needed for the management of beyond-design-basis events and severe accidents is questionable. Furthermore, given these limitations in instrumentation, the need to ascertain and confirm current plant status and to forecast future behavior in order to effectively manage accidents at nuclear facilities requires a computational capability to simulate the thermal and hydraulic behavior in the primary, secondary, and containment systems. With the need to extend the current preventive approach in accident management to include mitigative actions, analytical aids could be used to further enhance the current capabilities at nuclear facilities. This need for computational or analytical aids is supported by a review of the candidate accident management strategies discussed in NUREG/CR-5474. Based on that review, two major analytical aids are considered necessary to support the implementation and monitoring of many of the strategies in the document: (1) an analytical aid to predict reactor coolant and secondary system behavior under LOCA conditions, and (2) an analytical aid to predict the containment pressure and temperature response with a steam, air, and noncondensable gas mixture present.

  14. Simulation of Heating with the Waves of Ion Cyclotron Range of Frequencies in Experimental Advanced Superconducting Tokamak

    International Nuclear Information System (INIS)

    Yang Cheng; Zhu Sizheng; Zhang Xinjun

    2010-01-01

    Simulations of heating scenarios in the Experimental Advanced Superconducting Tokamak (EAST) were performed using the full wave code TORIC. The locations of the resonance layers for these heating schemes are predicted, and simulations were carried out for the different schemes of the EAST ICRF experiments, for example ion heating (at both the fundamental and harmonic frequencies) or electron heating (by direct fast waves or by mode conversion waves), on-axis or off-axis heating, and high-field-side (HFS) or low-field-side (LFS) launching. For on-axis minority ion heating of ³He in D(³He) plasma, the impacts of both density and temperature on the heating were discussed within the EAST parameter ranges.
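
    The resonance layer locations mentioned above follow from matching the launched frequency to a harmonic of the local ion cyclotron frequency, with the toroidal field falling off roughly as B(R) ≈ B0·R0/R on the midplane. The sketch below illustrates this estimate; the device and wave parameters are placeholders, not the EAST values used in the record.

```python
import numpy as np

e = 1.602176634e-19        # elementary charge [C]
m_p = 1.67262192369e-27    # proton mass [kg]

def resonance_radius(f_rf, B0, R0, Z, A, harmonic=1):
    """Major radius [m] where harmonic * f_ci(R) matches the launched frequency f_rf,
    assuming B(R) = B0 * R0 / R on the midplane."""
    f_ci_axis = Z * e * B0 / (2.0 * np.pi * A * m_p)   # cyclotron frequency at R = R0
    return harmonic * f_ci_axis * R0 / f_rf

# Example: 3He minority (Z=2, A=3) at 34 MHz in a B0 = 2.5 T, R0 = 1.85 m device
print(round(resonance_radius(34.0e6, B0=2.5, R0=1.85, Z=2, A=3), 2))
```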

  15. Analytical solution of population balance equation involving ...

    Indian Academy of Sciences (India)

    This paper presents an effective analytical simulation to solve population balance equation (PBE), involving particulate aggregation and breakage, by making use ...

  16. Simulation of strain localization in polycrystals

    International Nuclear Information System (INIS)

    Deryugin, Ye.Ye.; Payuk, V.A.; Lasko, G.V.

    2002-01-01

    A simulation of the evolution of plastic deformation in polycrystals under external loading is presented. Strain localization in the polycrystal is simulated analytically following an unconventional approach. The model is based on the new relaxation element method; its emphasis is the combination of discrete methods with a continuum approach. This makes it possible to represent local sites of plastic deformation analytically in a continuous medium and to calculate their respective non-uniform stress fields.

  17. Advances in Biosensing Methods

    Directory of Open Access Journals (Sweden)

    Reema Taneja

    2007-02-01

    Full Text Available A fractal analysis is presented for the binding and dissociation (if applicable) kinetics of analyte-receptor reactions occurring on biosensor surfaces. Applications of the biosensors have appeared in the recent literature, and the examples provided give the reader a perspective of the advances in biosensors that are being used to detect analytes of interest. This should also stimulate interest in applying biosensors to other areas of application. The fractal analysis is limited to the evaluation of the rate constants for binding and dissociation (if applicable) for the analyte-receptor reactions occurring on biosensor surfaces. The fractal dimension provides a quantitative measure of the degree of heterogeneity on the biosensor surface. Predictive relations are presented that relate the binding coefficient with the degree of heterogeneity or the fractal dimension on the biosensor surface.

  18. Simulations of Failure via Three-Dimensional Cracking in Fuel Cladding for Advanced Nuclear Fuels

    International Nuclear Information System (INIS)

    Lu, Hongbing; Bukkapatnam, Satish; Harimkar, Sandip; Singh, Raman; Bardenhagen, Scott

    2014-01-01

    Enhancing performance of fuel cladding and duct alloys is a key means of increasing fuel burnup. This project will address the failure of fuel cladding via three-dimensional cracking models. Researchers will develop a simulation code for the failure of the fuel cladding and validate the code through experiments. The objective is to develop an algorithm to determine the failure of fuel cladding in the form of three-dimensional cracking due to prolonged exposure under varying conditions of pressure, temperature, chemical environment, and irradiation. This project encompasses the following tasks: 1. Simulate 3D crack initiation and growth under instantaneous and/or fatigue loads using a new variant of the material point method (MPM); 2. Simulate debonding of the materials in the crack path using cohesive elements, considering normal and shear traction separation laws; 3. Determine the crack propagation path, considering damage of the materials incorporated in the cohesive elements to allow the energy release rate to be minimized; 4. Simulate the three-dimensional fatigue crack growth as a function of loading histories; 5. Verify the simulation code by comparing results to theoretical and numerical studies available in the literature; 6. Conduct experiments to observe the crack path and surface profile in unused fuel cladding and validate against simulation results; and 7. Expand the adaptive mesh refinement infrastructure parallel processing environment to allow adaptive mesh refinement at the 3D crack fronts and adaptive mesh merging in the wake of cracks. Fuel cladding is made of materials such as stainless steels and ferritic steels with added alloying elements, which increase stability and durability under irradiation. As fuel cladding is subjected to water, chemicals, fission gas, pressure, high temperatures, and irradiation while in service, understanding performance is essential. In the fast fuel used in advanced burner reactors, simulations of the nuclear

  19. Simulations of Failure via Three-Dimensional Cracking in Fuel Cladding for Advanced Nuclear Fuels

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Hongbing [Univ. of Texas, Austin, TX (United States); Bukkapatnam, Satish; Harimkar, Sandip; Singh, Raman; Bardenhagen, Scott

    2014-01-09

    Enhancing performance of fuel cladding and duct alloys is a key means of increasing fuel burnup. This project will address the failure of fuel cladding via three-dimensional cracking models. Researchers will develop a simulation code for the failure of the fuel cladding and validate the code through experiments. The objective is to develop an algorithm to determine the failure of fuel cladding in the form of three-dimensional cracking due to prolonged exposure under varying conditions of pressure, temperature, chemical environment, and irradiation. This project encompasses the following tasks: 1. Simulate 3D crack initiation and growth under instantaneous and/or fatigue loads using a new variant of the material point method (MPM); 2. Simulate debonding of the materials in the crack path using cohesive elements, considering normal and shear traction separation laws; 3. Determine the crack propagation path, considering damage of the materials incorporated in the cohesive elements to allow the energy release rate to be minimized; 4. Simulate the three-dimensional fatigue crack growth as a function of loading histories; 5. Verify the simulation code by comparing results to theoretical and numerical studies available in the literature; 6. Conduct experiments to observe the crack path and surface profile in unused fuel cladding and validate against simulation results; and 7. Expand the adaptive mesh refinement infrastructure parallel processing environment to allow adaptive mesh refinement at the 3D crack fronts and adaptive mesh merging in the wake of cracks. Fuel cladding is made of materials such as stainless steels and ferritic steels with added alloying elements, which increase stability and durability under irradiation. As fuel cladding is subjected to water, chemicals, fission gas, pressure, high temperatures, and irradiation while in service, understanding performance is essential. In the fast fuel used in advanced burner reactors, simulations of the nuclear

  20. EarthServer: Cross-Disciplinary Earth Science Through Data Cube Analytics

    Science.gov (United States)

    Baumann, P.; Rossi, A. P.

    2016-12-01

    The unprecedented increase in imagery, in-situ measurements, and simulation data produced by Earth (and planetary) science observation missions bears a rich, yet not leveraged potential for gaining insight by integrating such diverse datasets and transforming scientific questions into actual queries to the data, formulated in a standardized way. The intercontinental EarthServer [1] initiative is demonstrating new directions for flexible, scalable Earth Science services based on innovative NoSQL technology. Researchers from Europe, the US and Australia have teamed up to rigorously implement the concept of the datacube. Such a datacube may have spatial and temporal dimensions (such as a satellite image time series) and may unite an unlimited number of scenes. Independently of whatever efficient data structuring a server network may perform internally, users (scientists, planners, decision makers) will always see just a few datacubes they can slice and dice. EarthServer has established client [2] and server technology for such spatio-temporal datacubes. The underlying scalable array engine, rasdaman [3,4], enables direct interaction, including 3-D visualization, common EO data processing, and general analytics. Services rely exclusively on the open OGC "Big Geo Data" standards suite, the Web Coverage Service (WCS). Conversely, EarthServer has shaped and advanced WCS based on the experience gained. The first phase of EarthServer advanced scalable array database technology into 150+ TB services. Currently, Petabyte datacubes are being built for ad-hoc and cross-disciplinary querying, e.g. using climate, Earth observation and ocean data. We will present the EarthServer approach, its impact on OGC / ISO / INSPIRE standardization, and its platform technology, rasdaman. References: [1] Baumann, et al. (2015) DOI: 10.1080/17538947.2014.1003106 [2] Hogan, P., (2011) NASA World Wind, Proceedings of the 2nd International Conference on Computing for Geospatial Research

  1. Simulating Galaxies and Active Galactic Nuclei in the LSST Image Simulation Effort

    NARCIS (Netherlands)

    Pizagno II, Jim; Ahmad, Z.; Bankert, J.; Bard, D.; Connolly, A.; Chang, C.; Gibson, R. R.; Gilmore, K.; Grace, E.; Hannel, M.; Jernigan, J. G.; Jones, L.; Kahn, S. M.; Krughoff, S. K.; Lorenz, S.; Marshall, S.; Shmakova, S. M.; Sylvestri, N.; Todd, N.; Young, M.

    We present an extragalactic source catalog, which includes galaxies and Active Galactic Nuclei, that is used for the Large Synoptic Survey Telescope (LSST) image simulation effort. The galaxies are taken from the De Lucia et al. (2006) semi-analytic modeling (SAM) of the Millennium Simulation. The LSST

  2. Microbial Enhanced Oil Recovery - Advanced Reservoir Simulation

    DEFF Research Database (Denmark)

    Nielsen, Sidsel Marie

    the water phase. The biofilm formation implies that the concentration of bacteria near the inlet increases. In combination with surfactant production, the biofilm results in a higher surfactant concentration in the initial part of the reservoir. The oil that is initially bypassed in connection...... simulator. In the streamline simulator, the effect of gravity is introduced using an operator splitting technique. The gravity effect stabilizes oil displacement causing markedly improvement of the oil recovery, when the oil density becomes relatively low. The general characteristics found for MEOR in one......-dimensional simulations are also demonstrated both in two and three dimensions. Overall, this MEOR process conducted in a heterogeneous reservoir also produces more oil compared to waterflooding, when the simulations are run in multiple dimensions. The work presented in this thesis has resulted in two publications so far....

  3. Bridging Numerical and Analytical Models of Transient Travel Time Distributions: Challenges and Opportunities

    Science.gov (United States)

    Danesh Yazdi, M.; Klaus, J.; Condon, L. E.; Maxwell, R. M.

    2017-12-01

    Recent advancements in analytical solutions to quantify water and solute time-variant travel time distributions (TTDs) and the related StorAge Selection (SAS) functions synthesize catchment complexity into a simplified, lumped representation. While these analytical approaches are easy and efficient in application, they require high frequency hydrochemical data for parameter estimation. Alternatively, integrated hydrologic models coupled to Lagrangian particle-tracking approaches can directly simulate age under different catchment geometries and complexities, at a greater computational expense. Here, we compare and contrast the two approaches by exploring the influence of the spatial distribution of subsurface heterogeneity, interactions between distinct flow domains, diversity of flow pathways, and recharge rate on the shape of TTDs and the related SAS functions. To this end, we use a parallel three-dimensional variably saturated groundwater model, ParFlow, to solve for the velocity fields in the subsurface. A particle-tracking model, SLIM, is then implemented to determine the age distributions at every real time and domain location, facilitating a direct characterization of the SAS functions, as opposed to analytical approaches that require calibration of such functions. Steady-state results reveal that the assumption of a random age sampling scheme might only hold in the saturated region of homogeneous catchments, resulting in an exponential TTD. This assumption is, however, violated when the vadose zone is included, as the underlying SAS function gives a higher preference to older ages. The dynamical variability of the true SAS functions is also shown to be largely masked by the smooth analytical SAS functions. As the variability of subsurface spatial heterogeneity increases, the shape of the TTD approaches a power-law distribution, with a broader spread of shorter and longer travel times. We further found that larger (smaller) magnitude of effective
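
    The comparison sketched above boils down to building an empirical TTD from particle travel times and checking it against the exponential TTD implied by a random (well-mixed) age sampling assumption. The Python fragment below illustrates that check on synthetic ages standing in for particle-tracking output; it is not the workflow or data of the study.

```python
import numpy as np

rng = np.random.default_rng(0)
ages = rng.gamma(shape=0.7, scale=3.0, size=20_000)   # synthetic particle travel times [yr]

t = np.linspace(0.0, 30.0, 301)
ecdf = np.searchsorted(np.sort(ages), t, side="right") / ages.size   # empirical TTD (CDF)
exp_cdf = 1.0 - np.exp(-t / ages.mean())              # exponential TTD with the same mean age

# Largest discrepancy between the simulated TTD and the well-mixed assumption
print(round(float(np.max(np.abs(ecdf - exp_cdf))), 3))
```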

  4. Development of CFD software for the simulation of thermal hydraulics in advanced nuclear reactors. Final report

    International Nuclear Information System (INIS)

    Bachar, Abdelaziz; Haslinger, Wolfgang; Scheuerer, Georg; Theodoridis, Georgios

    2015-01-01

    The objectives of the project were: improvement of the simulation accuracy for nuclear reactor thermo-hydraulics by coupling system codes with three-dimensional CFD software; extension of the CFD software to predict thermo-hydraulics in advanced reactor concepts; and validation of the CFD software by simulating different UPTF TRAM-C test cases and developing best practice guidelines. The CFD module was based on the ANSYS CFD software and the system code ATHLET of GRS. All three objectives were met: The coupled ATHLET-ANSYS CFD software is in use at GRS and TU Muenchen. Besides the test cases described in the report, it has been used for other applications, for instance the TALL-3D experiment of KTH Stockholm. The CFD software was extended with material properties for liquid metals, and validated using existing data. Several new concepts were tested when applying the CFD software to the UPTF test cases: simulations with Conjugate Heat Transfer (CHT) were performed for the first time. This led to better agreement between predictions and data and reduced uncertainties when applying temperature boundary conditions. The meshes for the CHT simulation were also used for a coupled fluid-structure-thermal analysis, which was another novelty. The multi-physics analysis showed plausible results for the mechanical and thermal stresses. The workflow developed as part of the current project can be directly used for industrial nuclear reactor simulations. Finally, simulations of two-phase flows with and without interfacial mass transfer were performed. These showed good agreement with data. However, a persistent problem for the simulation of multi-phase flows is the long simulation time, which makes industrial application difficult.

  5. Advanced modeling and simulation of integrated gasification combined cycle power plants with CO2-capture

    International Nuclear Information System (INIS)

    Rieger, Mathias

    2014-01-01

    The objective of this thesis is to provide an extensive description of the correlations in some of the most crucial sub-processes of hard-coal-fired IGCC plants with carbon capture (CC-IGCC). For this purpose, process simulation models are developed for four industrial gasification processes, the CO-shift cycle, the acid gas removal unit, the sulfur recovery process, the gas turbine, the water/steam cycle and the air separation unit (ASU). Process simulations clarify the influence of certain boundary conditions on plant operation, performance and economics. Based on that, a comparative benchmark of CC-IGCC concepts is conducted. Furthermore, the influence of integration between the gas turbine and the ASU is analyzed in detail. The generated findings are used to develop an advanced plant configuration with improved economics. Nevertheless, IGCC power plants with carbon capture are not found to be an economically efficient power generation technology under present-day boundary conditions.

  6. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    Energy Technology Data Exchange (ETDEWEB)

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and "smart" wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow
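
    The "well index" referred to in item (2) is conventionally computed with a Peaceman-type formula relating the well-block pressure to the wellbore pressure. The sketch below shows the standard textbook version for a vertical well in an anisotropic rectangular grid block; it is offered as background, with placeholder numbers rather than values from the project.

```python
import numpy as np

def peaceman_well_index(kx, ky, dx, dy, h, rw, skin=0.0):
    """Peaceman well index for a vertical well (permeabilities in m^2, lengths in m)."""
    # pressure-equivalent block radius for an anisotropic block
    ro = (0.28 * np.sqrt(np.sqrt(ky / kx) * dx ** 2 + np.sqrt(kx / ky) * dy ** 2)
          / ((ky / kx) ** 0.25 + (kx / ky) ** 0.25))
    return 2.0 * np.pi * np.sqrt(kx * ky) * h / (np.log(ro / rw) + skin)

WI = peaceman_well_index(kx=100e-15, ky=25e-15, dx=50.0, dy=50.0, h=10.0, rw=0.1)
# Inflow from the well block: q = (WI / mu) * (p_block - p_well)
print(f"{WI:.3e}")
```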

  7. A combined approach of simulation and analytic hierarchy process in assessing production facility layouts

    Science.gov (United States)

    Ramli, Razamin; Cheng, Kok-Min

    2014-07-01

    One of the important areas of concern in achieving a competitive level of productivity in a manufacturing system is the layout design and material transportation (conveyor) system. However, changes in customers' requirements have triggered the need to design alternative manufacturing layouts for the existing production floor. Hence, this paper discusses effective alternatives for the process layout, specifically the conveyor system layout. Two alternative designs for the conveyor system were proposed, with the aims of increasing production output and minimizing space allocation. The first proposed layout design includes the installation of a conveyor oven in the particular manufacturing room based on priority, and the second omits the conveyor oven. Simulation was employed to design the new facility layouts, and simulation experiments were then conducted to understand the performance of each conveyor layout design based on operational characteristics, including the predicted output of each layout. Utilizing the Analytic Hierarchy Process (AHP), the new and improved layout designs were assessed before the final selection was made. As a comparison, the existing conveyor system layout was included in the assessment. The relevant criteria in this layout design problem were identified as (i) space usage of each design, (ii) operator utilization rates, (iii) return on investment (ROI) of the layout, and (iv) output of the layout. In the final stage of the AHP analysis, the overall priority of each alternative layout was obtained and the layout with the highest priority value was selected for final use by the management. Such efficient planning and design of the facility layout in a particular manufacturing setting can minimize material handling cost, overall production time and investment in equipment, while optimizing the utilization of space.
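
    The core AHP step referred to above reduces to deriving criterion weights from a pairwise comparison matrix via its principal eigenvector and checking consistency. The sketch below illustrates this with an invented comparison matrix over the four criteria listed in the record; the judgments and the quoted random-index values are assumptions for illustration, not data from the study.

```python
import numpy as np

# Pairwise comparisons (Saaty 1-9 scale) over: (i) space usage, (ii) operator
# utilization, (iii) ROI, (iv) output. Values are invented for illustration.
A = np.array([
    [1.0, 3.0, 1/2, 1/4],
    [1/3, 1.0, 1/3, 1/5],
    [2.0, 3.0, 1.0, 1/2],
    [4.0, 5.0, 2.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # normalized priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # commonly quoted random-index values
print(weights.round(3), "CR =", round(ci / ri, 3))   # CR < 0.1 usually deemed acceptable
```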

  8. Virtual Environments for Advanced Trainers and Simulators

    NARCIS (Netherlands)

    Jense, G.J.; Kuijper, F.

    1993-01-01

    Virtual environment technology is expected to make a big impact on future training and simulation systems. Direct stimulation of the human senses (sight, hearing, touch) and new paradigms for user input will improve the realism of simulations and thereby the effectiveness of training systems.

  9. The analytic renormalization group

    Directory of Open Access Journals (Sweden)

    Frank Ferrari

    2016-08-01

    Full Text Available Finite temperature Euclidean two-point functions in quantum mechanics or quantum field theory are characterized by a discrete set of Fourier coefficients G_k, k ∈ Z, associated with the Matsubara frequencies ν_k = 2πk/β. We show that analyticity implies that the coefficients G_k must satisfy an infinite number of model-independent linear equations that we write down explicitly. In particular, we construct "Analytic Renormalization Group" linear maps A_μ which, for any choice of cut-off μ, allow one to express the low energy Fourier coefficients for |ν_k| < μ (with the possible exception of the zero mode G_0), together with the real-time correlators and spectral functions, in terms of the high energy Fourier coefficients for |ν_k| ≥ μ. Using a simple numerical algorithm, we show that the exact universal linear constraints on G_k can be used to systematically improve any approximate data set obtained, for example, from Monte Carlo simulations. Our results are illustrated on several explicit examples.

  10. Modeling of Coaxial Slot Waveguides Using Analytical and Numerical Approaches: Revisited

    Directory of Open Access Journals (Sweden)

    Kok Yeow You

    2012-01-01

    Full Text Available A review of analytical and numerical methods for coaxial slot waveguides is presented. The theories, background, and physical principles related to frequency-domain electromagnetic equations for coaxial waveguides are reassessed. Comparisons are made of the accuracies of various types of admittance and impedance equations and numerical simulations, and the fringing field at the aperture sensor, which is represented by a lumped capacitance circuit, is evaluated. The accuracy and limitations of the analytical equations are explained in detail. The reasons for the replacement of analytical methods by numerical methods are outlined.

  11. Big Data Analytics for Industrial Process Control

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schioler, Henrik; Kulahci, Murat

    2017-01-01

    Today, in modern factories, each step in manufacturing produces a wealth of valuable and highly precise information. This provides a great opportunity for understanding the hidden statistical dependencies in the process. Systematic analysis and the utilization of advanced analytical methods can lead to more informed decisions. In this article we discuss some of the challenges related to big data analysis in manufacturing and relevant solutions to some of these challenges.

  12. Development of an MMS/PC based real time simulation of the B and W NSS plant for advanced control system design

    International Nuclear Information System (INIS)

    Bartells, P.S.; Brownell, R.B.

    1990-01-01

    The development of this personal-computer-based simulation of the Babcock and Wilcox nuclear steam system (NSS) was prompted in part by the need for a real-time analysis tool to be used in evaluating advanced control concepts for the NSS. NSS control is currently accomplished via conventional analog systems that are becoming increasingly obsolete. With the widespread use of digital microprocessor-based control systems for fossil power and other applications, the B and W Owners Group Advanced Control System Task Force is developing a next-generation control system for upgrading existing B and W power plants. To take advantage of the digital control technology, it is desirable to have a flexible, cost-effective, and portable control analysis tool available that can simulate various postulated control strategies and algorithms and couple these with simulated plant responses in real time to determine overall effectiveness. To develop the desired capability, B and W has incorporated the simulation methodology of the Modular Modeling System (MMS) and the knowledge gained during development of a similar Department of Energy-funded project. The MMS-based NSS model was developed and then modified to increase execution speed, ported to an IBM Personal System/2 (Model 80), and interfaced with user-friendly graphics. The user can develop alternative control strategies and readily interface them with the NSS model for real-time display and evaluation. The paper addresses the key considerations and programming techniques used to accomplish the resulting simulation.

  13. A review of electrochemiluminescence (ECL) in and for microfluidic analytical devices.

    Science.gov (United States)

    Kirschbaum, Stefanie E K; Baeumner, Antje J

    2015-05-01

    The concept and realization of microfluidic total analysis systems (microTAS) have revolutionized the analytical process by integrating the whole breadth of analytical techniques into miniaturized systems. Paramount for efficient and competitive microTAS are integrated detection strategies, which lead to low limits of detection while reducing the sample volume. The concept of electrochemiluminescence (ECL) has been intriguing ever since its introduction based on Ru(bpy)₃²⁺ by Tokel and Bard [1] (J Am Chem Soc 1853:2862-2863, 1972), especially because of its immense sensitivity, nonexistent auto-luminescent background signal, and simplicity in experimental design. Therefore, integrating ECL detection into microTAS is a logical consequence to achieve simple, yet highly sensitive, sensors. However, published microanalytical devices employing ECL detection focus in general on traditional ECL chemistry and have yet to take advantage of advances made in standard bench-top ECL strategies. This review therefore focuses on the most recent advancements in microfluidic ECL approaches, but also evaluates the potential impact of bench-top ECL research progress that would further improve performance and lower the limits of detection of microanalytical ECL systems, ensuring their desirability as a detection principle for microTAS applications.

  14. An advanced simulator for orthopedic surgical training.

    Science.gov (United States)

    Cecil, J; Gupta, Avinash; Pirela-Cruz, Miguel

    2018-02-01

    The purpose of creating the virtual reality (VR) simulator is to facilitate and supplement the training opportunities provided to orthopedic residents. The use of VR simulators has increased rapidly in the field of medical surgery for training purposes. This paper discusses the creation of a virtual surgical environment (VSE) for training residents in an orthopedic surgical process called less invasive stabilization system (LISS) surgery, which is used to address fractures of the femur. The overall methodology included first obtaining an understanding of the LISS plating process through interactions with expert orthopedic surgeons and developing information-centric models. The information-centric models provided a structured basis to design and build the simulator. Subsequently, the haptic-based simulator was built. Finally, learning assessments were conducted in a medical school. The results from the learning assessments confirm the effectiveness of the VSE for teaching medical residents and students. The scope of the assessment was to ensure (1) the correctness and (2) the usefulness of the VSE. Out of 37 residents/students who participated in the test, 32 showed improvements in their understanding of the LISS plating surgical process. A majority of participants were satisfied with the use of teaching Avatars and haptic technology. A paired t-test was conducted on the assessment data and showed that the improvement was statistically significant. This paper demonstrates the usefulness of adopting an information-centric modeling approach in the design and development of the simulator. The assessment results underscore the potential of using VR-based simulators in medical education, especially in orthopedic surgery.
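
    For readers unfamiliar with the paired t-test mentioned in the assessment, the fragment below shows the standard calculation on invented pre/post scores; the numbers are placeholders and are not the study's data.

```python
from scipy import stats

pre  = [55, 62, 48, 70, 66, 59, 61, 53, 68, 64]   # scores before VSE training (invented)
post = [63, 70, 55, 78, 71, 66, 69, 60, 75, 73]   # scores after VSE training (invented)

t_stat, p_value = stats.ttest_rel(post, pre)      # paired comparison of the same subjects
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")     # p < 0.05 -> statistically significant change
```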

  15. Do provisions to advance chemical facility safety also advance chemical facility security? - An analysis of possible synergies

    OpenAIRE

    Hedlund, Frank Huess

    2012-01-01

    The European Commission has launched a study on the applicability of existing chemical industry safety provisions to enhancing the security of chemical facilities, covering the situation in 18 EU Member States. This paper reports some preliminary analytical findings regarding the extent to which existing provisions, put in place to advance safety objectives, could be expected, through synergy effects, to advance security objectives as well. The paper provides a conceptual definition of...

  16. Advanced scientific computational methods and their applications to nuclear technologies. (4) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (4)

    International Nuclear Information System (INIS)

    Sekimura, Naoto; Okita, Taira

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies has therefore been prepared in serial form. This is the fourth issue, giving an overview of scientific computational methods and introducing continuum simulation methods and their applications. Simulation methods for physical radiation effects on materials are reviewed, covering approaches such as the binary collision approximation, molecular dynamics, the kinetic Monte Carlo method, the reaction rate method and dislocation dynamics. (T. Tanaka)
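
    Of the approaches listed, the kinetic Monte Carlo method is compact enough to illustrate directly: events are selected in proportion to their rates and time advances by exponentially distributed increments (the residence-time algorithm). The Python sketch below uses an invented two-event defect system purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
k_hop, k_rec = 1.0e3, 5.0        # per-defect rates for migration and recombination [1/s]
n_defects, t, hops = 1000, 0.0, 0

while n_defects > 0 and t < 1.0:
    total_rate = n_defects * (k_hop + k_rec)        # total rate of all possible events
    t += rng.exponential(1.0 / total_rate)          # residence-time increment
    if rng.random() < k_rec / (k_hop + k_rec):      # choose an event by its relative rate
        n_defects -= 1                              # recombination removes a defect
    else:
        hops += 1                                   # migration (hop) event
print(round(t, 4), n_defects, hops)
```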

  17. Development of analytical techniques for safeguards environmental samples at JAEA

    International Nuclear Information System (INIS)

    Sakurai, Satoshi; Magara, Masaaki; Usuda, Shigekazu; Watanabe, Kazuo; Esaka, Fumitaka; Hirayama, Fumio; Lee, Chi-Gyu; Yasuda, Kenichiro; Inagawa, Jun; Suzuki, Daisuke; Iguchi, Kazunari; Kokubu, Yoko S.; Miyamoto, Yutaka; Ohzu, Akira

    2007-01-01

    JAEA has been developing, under the auspices of the Ministry of Education, Culture, Sports, Science and Technology of Japan, analytical techniques for ultra-trace amounts of nuclear materials in environmental samples in order to contribute to the strengthened safeguards system. Essential techniques for bulk analysis, particle analysis and screening of environmental swipe samples have been established as ultra-trace analytical methods for uranium and plutonium. In January 2003, JAEA was qualified, including its quality control system, as a member of the IAEA network of analytical laboratories for environmental samples. Since 2004, JAEA has conducted the analysis of domestic and IAEA samples, through which JAEA's analytical capability has been verified and improved. In parallel, advanced techniques have been developed in order to expand the applicability to samples of various elemental compositions and impurities and to improve analytical accuracy and efficiency. This paper summarizes the course of technical development in environmental sample analysis at JAEA and refers to recent trends in research and development in this field. (author)

  18. Analytic models for the evolution of semilocal string networks

    International Nuclear Information System (INIS)

    Nunes, A. S.; Martins, C. J. A. P.; Avgoustidis, A.; Urrestilla, J.

    2011-01-01

    We revisit previously developed analytic models for defect evolution and adapt them appropriately for the study of semilocal string networks. We thus confirm the expectation (based on numerical simulations) that linear scaling evolution is the attractor solution for a broad range of model parameters. We discuss in detail the evolution of individual semilocal segments, focusing on the phenomenology of segment growth, and also provide a preliminary comparison with existing numerical simulations.

  19. Characterization of Analytical Reference Glass-1 (ARG-1)

    International Nuclear Information System (INIS)

    Smith, G.L.

    1993-12-01

    High-level radioactive waste may be immobilized in borosilicate glass at the West Valley Demonstration Project, West Valley, New York, the Defense Waste Processing Facility (DWPF), Aiken, South Carolina, and the Hanford Waste Vitrification Project (HWVP), Richland, Washington. The vitrified waste form will be stored in stainless steel canisters before its eventual transfer to a geologic repository for long-term disposal. Waste Acceptance Product Specifications (WAPS) (DOE 1993), Section 1.1.2, requires that the waste form producers report the measured chemical composition of the vitrified waste in their production records before disposal. Chemical analysis of glass waste forms is receiving increased attention due to qualification requirements of vitrified waste forms. The Pacific Northwest Laboratory (PNL) has been supporting the glass producers' analytical laboratories by a continuing program of multilaboratory analytical testing using interlaboratory "round robin" methods. At the PNL Materials Characterization Center Analytical Round Robin 4 workshop "Analysis of Nuclear Waste Glass and Related Materials," January 16-17, 1990, Pleasanton, California, the meeting attendees decided that simulated nuclear waste analytical reference glasses were needed for use as analytical standards. Use of common standard analytical reference materials would allow the glass producers' analytical laboratories to calibrate procedures and instrumentation, to control laboratory performance and conduct self-appraisals, and to help qualify their various waste forms.

  20. An analytical solution for improved HIFU SAR estimation

    International Nuclear Information System (INIS)

    Dillon, C R; Vyas, U; Christensen, D A; Roemer, R B; Payne, A

    2012-01-01

    Accurate determination of the specific absorption rates (SARs) present during high intensity focused ultrasound (HIFU) experiments and treatments provides a solid physical basis for scientific comparison of results among HIFU studies and is necessary to validate and improve SAR predictive software, which will improve patient treatment planning, control and evaluation. This study develops and tests an analytical solution that significantly improves the accuracy of SAR values obtained from HIFU temperature data. SAR estimates are obtained by fitting the analytical temperature solution for a one-dimensional radial Gaussian heating pattern to the temperature versus time data following a step in applied power and evaluating the initial slope of the analytical solution. The analytical method is evaluated in multiple parametric simulations for which it consistently (except at high perfusions) yields maximum errors of less than 10% at the center of the focal zone compared with errors up to 90% and 55% for the commonly used linear method and an exponential method, respectively. For high perfusion, an extension of the analytical method estimates SAR with less than 10% error. The analytical method is validated experimentally by showing that the temperature elevations predicted using the analytical method's SAR values determined for the entire 3D focal region agree well with the experimental temperature elevations in a HIFU-heated tissue-mimicking phantom. (paper)
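
    As background to the SAR estimation problem described above, the simplest estimate uses the fact that immediately after a power step, before conduction and perfusion act, SAR ≈ c·dT/dt. The sketch below applies that slope-based estimate to a synthetic temperature trace; the record's analytical method goes further by fitting a radial-Gaussian solution to reduce the bias this simple approach suffers from. All numbers are placeholders.

```python
import numpy as np

c = 3600.0                                    # tissue specific heat [J/(kg K)] (placeholder)
t = np.linspace(0.0, 2.0, 41)                 # time after the power step [s]
rng = np.random.default_rng(3)
T_rise = 1.5 * (1.0 - np.exp(-t / 4.0)) + rng.normal(0.0, 0.01, t.size)  # synthetic data [K]

early = t <= 0.5                              # use only the earliest samples
slope = np.polyfit(t[early], T_rise[early], 1)[0]    # dT/dt from a linear fit [K/s]
print(f"SAR ~ {c * slope:.0f} W/kg")
```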

  1. Development of integrated analytical data management system

    International Nuclear Information System (INIS)

    Onishi, Koichi; Wachi, Isamu; Hiroki, Toshio

    1986-01-01

    The Analysis Subsection of the Technical Service Section, Tokai Reprocessing Plant, Tokai Works, is engaged in the analysis activities required for the management of processes and measurements in the plant. It has recently become desirable to increase the reliability of analytical data and to perform analyses more rapidly in order to cope with the increasing number of analysis tasks. To this end, on-line data processing has been promoted and advanced analytical equipment has been introduced in order to enhance automation. In the present study, an integrated analytical data management system is developed which serves to improve the reliability of analytical data as well as to provide rapid retrieval and automatic compilation of these data. Fabrication of a basic model of the system has been nearly completed and test operation has already started. In selecting the hardware to be used, examinations were made of the ease of system extension, the Japanese language processing function for improving the man-machine interface, the large-capacity auxiliary memory system, and the database processing function. The existing analysis work was reviewed in establishing the basic design of the system. According to this basic design, the system can perform such tasks as the analysis of application slips received from clients as well as the recording, sending, filing and retrieval of analysis results. (Nogami, K.)

  2. Numerical simulation of flow field in the China advanced research reactor flow-guide tank

    International Nuclear Information System (INIS)

    Xu Changjiang

    2002-01-01

    The flow-guide tank in the China Advanced Research Reactor (CARR) acts as the reactor inlet coolant distributor and plays an important role in reducing the flow-induced vibration of the internal components of the reactor core. Numerical simulations of the flow field in the flow-guide tank under different conceptual design configurations were carried out using PHOENICS 3.2. It is seen that the inlet coolant is well distributed circumferentially into the flow-guide tank by the inlet buffer plate and the flow distributor barrel. The maximum cross-flow velocity within the flow-guide tank is reduced significantly, and a reduction of the flow-induced vibration of the reactor internals is expected.

  3. Advancement of CMOS Doping Technology in an External Development Framework

    Science.gov (United States)

    Jain, Amitabh; Chambers, James J.; Shaw, Judy B.

    2011-01-01

    The consumer appetite for a rich multimedia experience drives technology development for mobile hand-held devices and the infrastructure to support them. Enhancements in functionality, speed, and user experience are derived from advancements in CMOS technology. The technical challenges in developing each successive CMOS technology node to support these enhancements have become increasingly difficult. These trends have motivated the CMOS business towards a collaborative approach based on strategic partnerships. This paper describes our model and experience of CMOS development, based on multi-dimensional industrial and academic partnerships. We provide to our process equipment, materials, and simulation partners, as well as to our silicon foundry partners, the detailed requirements for future integrated circuit products. This is done very early in the development cycle to ensure that these requirements can be met. In order to determine these fundamental requirements, we rely on a strategy that requires strong interaction between process and device simulation, physical and chemical analytical methods, and research at academic institutions. This learning is shared with each project partner to address integration and manufacturing issues encountered during CMOS technology development from its inception through product ramp. We utilize TI's core strengths in physical analysis, unit processes and integration, yield ramp, reliability, and product engineering to support this technological development. Finally, this paper presents examples of the advancement of CMOS doping technology for the 28 nm node and beyond through this development model.

  4. Advanced calculus a transition to analysis

    CERN Document Server

    Dence, Thomas P

    2010-01-01

    Designed for a one-semester advanced calculus course, Advanced Calculus explores the theory of calculus and highlights the connections between calculus and real analysis -- providing a mathematically sophisticated introduction to functional analytical concepts. The text is interesting to read and includes many illustrative worked-out examples and instructive exercises, and precise historical notes to aid in further exploration of calculus. Ancillary list: * Companion website, Ebook- http://www.elsevierdirect.com/product.jsp?isbn=9780123749550 * Student Solutions Manual- To come * Instructor

  5. Optical coherence tomography: Monte Carlo simulation and improvement by optical amplification

    DEFF Research Database (Denmark)

    Tycho, Andreas

    2002-01-01

    An advanced novel Monte Carlo simulation model of the detection process of an optical coherence tomography (OCT) system is presented. For the first time it is shown analytically that the applicability of the incoherent Monte Carlo approach to model the heterodyne detection process of an OCT system ... is firmly justified. This is obtained by calculating the heterodyne mixing of the reference and sample beams in a plane conjugate to the discontinuity in the sample probed by the system. Using this approach, a novel expression for the OCT signal is derived, which only depends upon the intensity ... flexibility of Monte Carlo simulations, this new model is demonstrated to be excellent as a numerical phantom, i.e., as a substitute for otherwise difficult experiments. Finally, a new model of the signal-to-noise ratio (SNR) of an OCT system with optical amplification of the light reflected from the sample

  6. Blending technology in teaching advanced health assessment in a family nurse practitioner program: using personal digital assistants in a simulation laboratory.

    Science.gov (United States)

    Elliott, Lydia; DeCristofaro, Claire; Carpenter, Alesia

    2012-09-01

    This article describes the development and implementation of integrated use of personal handheld devices (personal digital assistants, PDAs) and high-fidelity simulation in an advanced health assessment course in a graduate family nurse practitioner (NP) program. A teaching tool was developed that can be utilized as a template for clinical case scenarios blending these separate technologies. Review of the evidence-based literature, including peer-reviewed articles and reviews. Blending the technologies of high-fidelity simulation and handheld devices (PDAs) provided a positive learning experience for graduate NP students in a teaching laboratory setting. Combining both technologies in clinical case scenarios offered a more real-world learning experience, with a focus on point-of-care service and integration of interview and physical assessment skills with existing standards of care and external clinical resources. Faculty modeling and advance training with PDA technology was crucial to success. Faculty developed a general template tool and systems-based clinical scenarios integrating PDA and high-fidelity simulation. Faculty observations, the general template tool, and one scenario example are included in this article. ©2012 The Author(s) Journal compilation ©2012 American Academy of Nurse Practitioners.

  7. Let's Not Forget: Learning Analytics Are about Learning

    Science.gov (United States)

    Gaševic, Dragan; Dawson, Shane; Siemens, George

    2015-01-01

    The analysis of data collected from the interaction of users with educational and information technology has attracted much attention as a promising approach for advancing our understanding of the learning process. This promise motivated the emergence of the new research field, learning analytics, and its closely related discipline, educational…

  8. Process simulation of heavy water plants - a powerful analytical tool

    International Nuclear Information System (INIS)

    Miller, A.I.

    1978-10-01

    The commercially conscious designs of the Canadian GS (Girdler-Sulphide) plants have proved sensitive to process conditions. That, combined with the large scale of our units, has meant that computer simulation of their behaviour has been a natural and profitable development. Atomic Energy of Canada Limited has developed a family of steady-state simulations to describe all of the Canadian plants. Modelling of plant conditions has demonstrated that the simulation description is very precise, and it has become an integral part of the industry's assessments of both plant operation and decisions on capital expenditures. The simulation technique has also found extensive use in the detailed design of both the rehabilitated Glace Bay and the new La Prade plants. It has opened new insights into plant design and uncovered a radical and significant flowsheet change for future designs, as well as many less dramatic but still valuable changes. (author)

  9. A Simplified Analytical Technique for High Frequency Characterization of Resonant Tunneling Diode

    Directory of Open Access Journals (Sweden)

    DESSOUKI, A. A. S.

    2014-11-01

    Full Text Available This paper proposes a simplified analytical technique for high frequency characterization of the resonant tunneling diode (RTD). An equivalent circuit of the RTD that consists of a parallel combination of a conductance, G(V, f), and a capacitance, C(V, f), is formulated. The proposed approach uses the measured DC current versus voltage characteristic of the RTD to extract the equivalent-circuit element parameters over the entire bias range. Using the proposed analytical technique, the frequency response - including the high frequency range - of many characteristic aspects of the RTD is investigated. Also, the maximum oscillation frequency of the RTD is calculated. The results obtained have been compared with those concluded and reported in the literature. The reported results in the literature were obtained either through simulation of the RTD at high frequency using a computationally complicated quantum simulator or through difficult RF measurements. A similar pattern of results and a highly concordant conclusion are obtained. The proposed analytical technique is simple, correct, and appropriate for investigating the behavior of the RTD at high frequency. In addition, the proposed technique can be easily incorporated into a SPICE program to simulate circuits containing RTDs.
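
    As a rough illustration of the kind of calculation described here, the sketch below extracts a differential conductance from a measured DC I-V sweep and evaluates the commonly quoted resistive cut-off (maximum oscillation) frequency. The toy I-V curve, the constant capacitance, and the series resistance are illustrative assumptions, not the paper's extracted equivalent-circuit parameters.

        import numpy as np

        def differential_conductance(v, i):
            """Numerical dI/dV from a measured DC I-V sweep (V in volts, I in amperes)."""
            return np.gradient(i, v)

        def resistive_cutoff_frequency(g_ndr, c_dev, r_series):
            """Commonly quoted RTD resistive cut-off frequency.

            g_ndr    : magnitude of the negative differential conductance (S)
            c_dev    : device capacitance (F), assumed bias-independent here
            r_series : parasitic series resistance (ohm)
            """
            radicand = g_ndr / r_series - g_ndr**2
            if radicand <= 0.0:
                return 0.0  # no oscillation possible for these parameters
            return np.sqrt(radicand) / (2.0 * np.pi * c_dev)

        # Hypothetical measured I-V data and parasitics (illustrative values only).
        v = np.linspace(0.0, 1.0, 201)
        i = 1e-3 * (v * np.exp(-6.0 * v) + 0.05 * v**3)   # toy curve with an NDR region
        g = differential_conductance(v, i)
        g_ndr = abs(g.min())                              # strongest NDR point
        f_max = resistive_cutoff_frequency(g_ndr, c_dev=2e-15, r_series=10.0)
        print(f"|G_NDR| = {g_ndr:.3e} S, f_max = {f_max / 1e9:.1f} GHz")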

  10. Recent Advances in Bioprinting and Applications for Biosensing

    Directory of Open Access Journals (Sweden)

    Andrew D. Dias

    2014-04-01

    Full Text Available Future biosensing applications will require high performance, including real-time monitoring of physiological events, incorporation of biosensors into feedback-based devices, detection of toxins, and advanced diagnostics. Such functionality will necessitate biosensors with increased sensitivity, specificity, and throughput, as well as the ability to simultaneously detect multiple analytes. While these demands have yet to be fully realized, recent advances in biofabrication may allow sensors to achieve the high spatial sensitivity required, and bring us closer to achieving devices with these capabilities. To this end, we review recent advances in biofabrication techniques that may enable cutting-edge biosensors. In particular, we focus on bioprinting techniques (e.g., microcontact printing, inkjet printing, and laser direct-write that may prove pivotal to biosensor fabrication and scaling. Recent biosensors have employed these fabrication techniques with success, and further development may enable higher performance, including multiplexing multiple analytes or cell types within a single biosensor. We also review recent advances in 3D bioprinting, and explore their potential to create biosensors with live cells encapsulated in 3D microenvironments. Such advances in biofabrication will expand biosensor utility and availability, with impact realized in many interdisciplinary fields, as well as in the clinic.

  11. Composite Cure Process Modeling and Simulations using COMPRO(Registered Trademark) and Validation of Residual Strains using Fiber Optics Sensors

    Science.gov (United States)

    Sreekantamurthy, Thammaiah; Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.

    2016-01-01

    Residual strains and warping deformations induced by the composite cure process present significant challenges in the manufacturing of advanced composite structures. As a part of the Manufacturing Process and Simulation initiative of the NASA Advanced Composite Project (ACP), research is being conducted on the composite cure process by developing an understanding of the fundamental mechanisms by which the process-induced factors influence the residual responses. In this regard, analytical studies have been conducted on the cure process modeling of composite structural parts with varied physical, thermal, and resin flow process characteristics. The cure process simulation results were analyzed to interpret the cure response predictions based on the underlying physics incorporated into the modeling tool. In the cure-kinetic analysis, the model predictions of the degree of cure, resin viscosity and modulus were interpreted with reference to the temperature distribution in the composite panel part and tool setup during autoclave or hot-press curing cycles. In the fiber-bed compaction simulation, the pore pressure and resin flow velocity in the porous media models, and the compaction strain responses under applied pressure, were studied to interpret the fiber volume fraction distribution predictions. In the structural simulation, the effects of temperature on the resin and ply modulus, and of thermal coefficient changes during curing on the predicted mechanical strains and chemical cure shrinkage strains, were studied to understand the residual strain and stress response predictions. In addition to computational analysis, experimental studies were conducted to measure strains during the curing of laminated panels by means of optical fiber Bragg grating sensors (FBGs) embedded in the resin-impregnated panels. The residual strain measurements from laboratory tests were then compared with the analytical model predictions. The paper describes the cure process
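
    The cure-kinetic analysis mentioned in this record is often built on an autocatalytic (Kamal-type) rate law. The sketch below integrates such a model through a hypothetical ramp-and-hold cycle; the rate constants, exponents, and temperature profile are illustrative assumptions, not the COMPRO material model or an ACP resin system.

        import numpy as np
        from scipy.integrate import solve_ivp

        R_GAS = 8.314  # J/(mol*K)

        def cure_rate(t, alpha, A, Ea, m, n, T_of_t):
            """Autocatalytic cure kinetics: d(alpha)/dt = A*exp(-Ea/RT)*alpha^m*(1-alpha)^n."""
            T = T_of_t(t)
            a = float(np.clip(alpha[0], 1e-6, 1.0))  # keep alpha^m finite near zero
            return [A * np.exp(-Ea / (R_GAS * T)) * a**m * (1.0 - a)**n]

        def temperature(t):
            """Hypothetical cure cycle: 2 K/min ramp from 300 K, then hold at 450 K."""
            return min(300.0 + (2.0 / 60.0) * t, 450.0)

        # Illustrative kinetic parameters (not a specific resin system).
        sol = solve_ivp(cure_rate, (0.0, 3 * 3600.0), y0=[1e-3],
                        args=(5.0e4, 6.0e4, 0.5, 1.5, temperature), max_step=30.0)
        print(f"Degree of cure after 3 h: {sol.y[0, -1]:.2f}")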

  12. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. The objective of this study was to report the features of IBMWA and to discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix
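
    Since sensitivity, specificity, odds ratios, confidence intervals, and a confusion matrix are noted as missing from the IBMWA predictive output, a user exporting predictions would typically compute them separately. A minimal NumPy sketch, using hypothetical labels and predictions rather than the study's validated datasets, might look like this:

        import numpy as np

        def binary_diagnostics(y_true, y_pred):
            """Confusion matrix, sensitivity, specificity, and odds ratio with 95% Wald CI."""
            y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
            tp = int(np.sum((y_true == 1) & (y_pred == 1)))
            fn = int(np.sum((y_true == 1) & (y_pred == 0)))
            tn = int(np.sum((y_true == 0) & (y_pred == 0)))
            fp = int(np.sum((y_true == 0) & (y_pred == 1)))

            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)

            # Odds ratio with a 0.5 continuity correction and a 95% Wald interval.
            a, b, c, d = tp + 0.5, fp + 0.5, fn + 0.5, tn + 0.5
            odds_ratio = (a * d) / (b * c)
            se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
            ci = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)

            return {"confusion": [[tn, fp], [fn, tp]],
                    "sensitivity": sensitivity, "specificity": specificity,
                    "odds_ratio": odds_ratio, "or_95ci": ci.tolist()}

        # Hypothetical exported labels and model predictions (illustrative only).
        y_true = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 1, 0]
        y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0]
        print(binary_diagnostics(y_true, y_pred))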

  13. A novel method for energy harvesting simulation based on scenario generation

    Science.gov (United States)

    Wang, Zhe; Li, Taoshen; Xiao, Nan; Ye, Jin; Wu, Min

    2018-06-01

    The energy harvesting network (EHN) is a new form of computer network. It converts ambient energy into usable electric energy and supplies that electrical energy as a primary or secondary power source to the communication devices. However, most EHN studies use an analytical probability distribution function to describe the energy harvesting process, which cannot accurately capture the actual harvesting conditions. In this paper, we propose an EHN simulation method based on scenario generation. First, instead of setting a probability distribution in advance, it uses optimal scenario reduction to generate representative single-period scenarios from historical data of the harvested energy. Second, it uses a homogeneous simulated annealing algorithm to generate optimal daily energy harvesting scenario sequences, giving a more accurate simulation of the random characteristics of the energy harvesting network. Taking actual wind power data as an example, the accuracy and stability of the method are then verified by comparison with the real data. Finally, we present an instance of optimizing network throughput, whose optimal solution and data analysis indicate the feasibility and effectiveness of the proposed method for energy harvesting simulation.
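
    As a rough illustration of the single-period scenario-generation step (not the paper's optimal reduction or its simulated-annealing sequencing), a simple backward-reduction heuristic can collapse historical harvest samples into a few representative scenarios with probabilities. The gamma-distributed history below is a stand-in for real measurements.

        import numpy as np

        def backward_reduce(samples, n_keep):
            """Greedy backward scenario reduction for equally weighted 1-D samples.

            Repeatedly deletes the scenario whose removal cost (probability times the
            distance to its nearest remaining neighbour) is smallest, transferring its
            probability to that neighbour, until n_keep scenarios remain.
            """
            values = np.asarray(samples, dtype=float)
            probs = np.full(len(values), 1.0 / len(values))
            keep = list(range(len(values)))
            while len(keep) > n_keep:
                sub = values[keep]
                dist = np.abs(sub[:, None] - sub[None, :])
                np.fill_diagonal(dist, np.inf)
                nearest = dist.argmin(axis=1)                      # within-keep indices
                cost = probs[keep] * dist[np.arange(len(keep)), nearest]
                i = int(cost.argmin())                             # cheapest to delete
                j = int(nearest[i])                                # neighbour absorbs its mass
                probs[keep[j]] += probs[keep[i]]
                keep.pop(i)
            return values[keep], probs[keep]

        # Hypothetical historical harvested-energy samples for one period (joules).
        history = np.random.default_rng(0).gamma(shape=2.0, scale=5.0, size=200)
        scenarios, weights = backward_reduce(history, n_keep=5)
        print(np.round(scenarios, 2), np.round(weights, 3))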

  14. Design and Test of Advanced Thermal Simulators for an Alkali Metal-Cooled Reactor Simulator

    Science.gov (United States)

    Garber, Anne E.; Dickens, Ricky E.

    2011-01-01

    The Early Flight Fission Test Facility (EFF-TF) at NASA Marshall Space Flight Center (MSFC) has as one of its primary missions the development and testing of fission reactor simulators for space applications. A key component in these simulated reactors is the thermal simulator, designed to closely mimic the form and function of a nuclear fuel pin using electric heating. Continuing effort has been made to design simple, robust, inexpensive thermal simulators that closely match the steady-state and transient performance of a nuclear fuel pin. A series of these simulators have been designed, developed, fabricated and tested individually and in a number of simulated reactor systems at the EFF-TF. The purpose of the thermal simulators developed under the Fission Surface Power (FSP) task is to ensure that non-nuclear testing can be performed at sufficiently high fidelity to allow a cost-effective qualification and acceptance strategy to be used. Prototype thermal simulator design is founded on the baseline Fission Surface Power reactor design. Recent efforts have been focused on the design, fabrication and test of a prototype thermal simulator appropriate for use in the Technology Demonstration Unit (TDU). While designing the thermal simulators described in this paper, efforts were made to improve the axial power profile matching of the thermal simulators. Simultaneously, a search was conducted for graphite materials with higher resistivities than had been employed in the past. The combination of these two efforts resulted in the creation of thermal simulators with power capacities of 2300-3300 W per unit. Six of these elements were installed in a simulated core and tested in the alkali metal-cooled Fission Surface Power Primary Test Circuit (FSP-PTC) at a variety of liquid metal flow rates and temperatures. This paper documents the design of the thermal simulators, the test program, and the test results.
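
    For a back-of-the-envelope feel for the element sizing discussed here, a resistive heater's dissipation follows from its resistivity and geometry. The resistivity, dimensions, and drive current below are hypothetical placeholders, not the EFF-TF design values.

        import math

        def rod_resistance(resistivity, length, diameter):
            """DC resistance of a solid cylindrical heater element: R = rho*L/A."""
            area = math.pi * (diameter / 2.0) ** 2
            return resistivity * length / area

        # Hypothetical graphite-like element (illustrative values only).
        rho = 1.2e-5    # ohm*m, order of magnitude for a resistive graphite grade
        L = 0.5         # m, heated length
        d = 0.01        # m, diameter
        I = 200.0       # A, drive current

        R = rod_resistance(rho, L, d)
        P = I**2 * R    # Joule heating
        print(f"R = {R * 1e3:.1f} mOhm, P = {P:.0f} W")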

  15. Integrating Business Analytics in the Marketing Curriculum: Eight Recommendations

    Science.gov (United States)

    LeClair, Dan

    2018-01-01

    Advances in technology and marketing practice have left little doubt that analytics must be integrated into the marketing curriculum, the question for many educators now is how to best to do so. While the response for each school will depend on its mission and context, as well as its strategies and resources, there already is much that can be…

  16. A three-dimensional (3D) analytical model for subthreshold characteristics of uniformly doped FinFET

    Science.gov (United States)

    Tripathi, Shweta; Narendar, Vadthiya

    2015-07-01

    In this paper, a three-dimensional (3D) analytical model for the subthreshold characteristics of a doped FinFET is presented. The separation of variables technique is used to solve the 3D Poisson's equation analytically with appropriate boundary conditions so as to obtain the expression for the channel potential. The potential distribution function thus obtained is employed to derive the subthreshold current and subthreshold slope models. The channel potential characteristics have been studied as a function of various device parameters such as gate length, gate oxide thickness and channel doping. The proposed analytical model results have been validated by comparison with simulation data obtained from the 3D device simulator ATLAS™ from Silvaco.
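
    For context, models of this kind typically connect the analytically obtained channel potential to the subthreshold metrics through the standard relations below; these are generic textbook forms, not the paper's specific closed-form expressions.

        % Generic subthreshold relations built on the channel potential minimum
        % (illustrative textbook forms, not the paper's exact model).
        \begin{align}
          I_{\mathrm{sub}} &\propto
            \exp\!\left(\frac{q\,\psi_{\min}}{kT}\right)
            \left[1 - \exp\!\left(-\frac{q\,V_{DS}}{kT}\right)\right], \\
          SS &= \ln(10)\,\frac{kT}{q}
                \left(\frac{\partial \psi_{\min}}{\partial V_{GS}}\right)^{-1},
        \end{align}

    where $\psi_{\min}$ is the minimum of the channel potential along the source-to-drain direction obtained from the 3D Poisson solution.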

  17. Scalable and Power Efficient Data Analytics for Hybrid Exascale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Choudhary, Alok [Northwestern Univ., Evanston, IL (United States); Samatova, Nagiza [North Carolina State Univ., Raleigh, NC (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Liao, Wei-keng [Northwestern Univ., Evanston, IL (United States)

    2015-03-19

    This project developed a generic and optimized set of core data analytics functions. These functions organically consolidate a broad constellation of high-performance analytical pipelines. As the architectures of emerging HPC systems become inherently heterogeneous, there is a need to design algorithms for data analysis kernels accelerated on hybrid multi-node, multi-core HPC architectures comprised of a mix of CPUs, GPUs, and SSDs. Furthermore, the power-aware trend drives the advances in our performance-energy tradeoff analysis framework, which enables our data analysis kernel algorithms and software to be parameterized so that users can choose the right power-performance optimizations.
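
    From a user's perspective, the parameterization described here amounts to picking an operating point from measured (runtime, energy) pairs. A toy selection rule, assuming such per-configuration measurements are available (all names and numbers below are hypothetical), could be:

        def pick_configuration(configs, time_budget_s):
            """Choose the lowest-energy configuration that still meets a runtime budget.

            configs: iterable of (name, runtime_s, energy_j) tuples.
            """
            feasible = [c for c in configs if c[1] <= time_budget_s]
            if not feasible:
                raise ValueError("no configuration meets the runtime budget")
            return min(feasible, key=lambda c: c[2])

        # Hypothetical measured operating points for one analysis kernel.
        candidates = [
            ("cpu-16-threads",  42.0, 9.5e3),
            ("cpu-32-threads",  28.0, 1.1e4),
            ("gpu-offload",     11.0, 7.2e3),
            ("gpu+ssd-staging", 13.0, 6.4e3),
        ]
        print(pick_configuration(candidates, time_budget_s=30.0))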

  18. CFD simulations of moderator flow inside Calandria of the Passive Moderator Cooling System of an advanced reactor

    Energy Technology Data Exchange (ETDEWEB)

    Pal, Eshita [Homi Bhabha National Institute, Anushaktinagar, Mumbai 400 094 (India); Kumar, Mukesh [Reactor Engineering Division, Bhabha Atomic Research Center, Trombay, Mumbai 400 085 (India); Joshi, Jyeshtharaj B., E-mail: jbjoshi@gmail.com [Homi Bhabha National Institute, Anushaktinagar, Mumbai 400 094 (India); Department of Chemical Engineering, Institute of Chemical Technology, Matunga, Mumbai 400019 India (India); Nayak, Arun K. [Reactor Engineering Division, Bhabha Atomic Research Center, Trombay, Mumbai 400 085 (India); Vijayan, Pallippattu K., E-mail: vijayanp@barc.gov.in [Reactor Engineering Division, Bhabha Atomic Research Center, Trombay, Mumbai 400 085 (India)

    2015-10-15

    Highlights: • CFD simulations in the Calandria of an advanced reactor under natural circulation. • Under natural convection, the majority of the flow recirculates within the Calandria. • Maximum temperature is located at the top and center of the fuel channel matrix. • During SBO, temperature inside the Calandria is stratified. - Abstract: Passive systems are being examined for future Advanced Nuclear Reactor designs. One such concept is the Passive Moderator Cooling System (PMCS), which is designed to remove heat from the moderator in the Calandria vessel passively in case of an extended Station Black Out condition. The heated heavy-water moderator (due to heat transferred from the Main Heat Transport System (MHTS) and thermalization of neutrons and gamma from radioactive decay of fuel) rises upward due to buoyancy, gets cooled down in a heat exchanger and returns to the Calandria, completing a natural circulation loop. The natural circulation should provide sufficient cooling to prevent the increase of moderator temperature and pressure beyond safe limits. In an earlier study, a full-scale 1D transient simulation was performed for the reactor including the MHTS and the PMCS, in the event of a station blackout scenario (Kumar et al., 2013). The results indicate that the systems remain within the safe limits for 7 days. However, the flow inside a geometry like the Calandria is quite complex due to its large size and the inner complexity of the dense fuel channel matrix, which was simplified as a 1D pipe flow in the aforesaid analysis. In the current work, CFD simulations are performed to study the temperature distributions and flow distribution of the moderator inside the Calandria vessel using a three-dimensional CFD code, OpenFoam 2.2.0. First, a set of steady-state simulations was carried out for a band of inlet mass flow rates, which gives the minimum mass flow rate required for removing the maximum heat load, by virtue of prediction of hot spots inside the Calandria
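
    The minimum flow rate referred to at the end of this record follows, to first order, from a steady-state energy balance on the moderator. The heat load and temperature rise below are hypothetical placeholders, not the reactor's design numbers.

        def required_mass_flow(heat_load_w, cp_j_per_kg_k, delta_t_k):
            """Steady-state energy balance: m_dot = Q / (cp * dT)."""
            return heat_load_w / (cp_j_per_kg_k * delta_t_k)

        # Illustrative numbers only (not the plant's actual heat load or temperature rise).
        Q = 2.0e6      # W, assumed heat load deposited in the moderator
        cp = 4200.0    # J/(kg*K), roughly that of (heavy) water
        dT = 20.0      # K, assumed moderator temperature rise across the vessel

        print(f"Minimum moderator flow = {required_mass_flow(Q, cp, dT):.1f} kg/s")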

  19. A general analytical expression for the three-dimensional Franck-Condon integral and simulation of the photodetachment spectrum of the PO2- anion

    Science.gov (United States)

    Liang, Jun; Cui, Fang; Wang, Ru; Huang, Wei; Cui, Zhifeng

    2013-04-01

    Calculations of Franck-Condon factors are crucial for interpreting vibronic spectra of molecules and studying nonradiative processes. We have straightforwardly derived a more general analytical expression for the calculation of the three-dimensional Franck-Condon overlap integrals on the basis of the harmonic oscillator approximation, including mode mixing effects. This new analytical expression was applied to study the photoelectron spectra of PO2-. The theoretical spectrum obtained by employing CCSD(T) values is in excellent agreement with the observed one. An 'irregular spacing' observed in the experimental photoelectron spectrum of PO2- is interpreted as arising from a hot-band sequence of the bending vibration ω2 and combination bands of the stretching vibration ω1 and the bending vibration ω2. In addition, the equilibrium geometry parameters, r(O-P) = 1.495 ± 0.005 Å and ∠(O-P-O) = 119.5 ± 0.5°, of the X̃A1 state of PO2-, are derived by employing an iterative Franck-Condon analysis procedure in the spectral simulation.
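
    For reference, the quantities being generalized are the familiar Franck-Condon factors between harmonic vibrational states, with mode mixing entering through the Duschinsky relation. The definitions below are standard forms, not the closed-form three-dimensional expression derived in the paper.

        % Standard definitions underlying multidimensional Franck-Condon factors.
        \begin{align}
          \mathrm{FCF}_{v'v''} &=
            \left| \langle \chi_{v'}(\mathbf{Q}') \,|\, \chi_{v''}(\mathbf{Q}'') \rangle \right|^{2}, \\
          \mathbf{Q}' &= \mathbf{J}\,\mathbf{Q}'' + \mathbf{K},
        \end{align}

    where $\mathbf{J}$ is the Duschinsky rotation matrix and $\mathbf{K}$ is the normal-coordinate displacement between the two electronic states.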

  20. Advanced simulation capability for environmental management - current status and future applications

    Energy Technology Data Exchange (ETDEWEB)

    Freshley, Mark; Scheibe, Timothy [Pacific Northwest National Laboratory, Richland, Washington (United States); Robinson, Bruce; Moulton, J. David; Dixon, Paul [Los Alamos National Laboratory, Los Alamos, New Mexico (United States); Marble, Justin; Gerdes, Kurt [U.S. Department of Energy, Office of Environmental Management, Washington DC (United States); Stockton, Tom [Neptune and Company, Inc, Los Alamos, New Mexico (United States); Seitz, Roger [Savannah River National Laboratory, Aiken, South Carolina (United States); Black, Paul [Neptune and Company, Inc, Lakewood, Colorado (United States)

    2013-07-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management (EM), Office of Soil and Groundwater (EM-12), is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach that is currently aimed at understanding and predicting contaminant fate and transport in natural and engineered systems. ASCEM is a modular and open-source high-performance computing tool. It will be used to facilitate integrated approaches to modeling and site characterization, and provide robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of capabilities, with current emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Tool-sets and the High-Performance Computing (HPC) multi-process simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, tool-sets for interaction with the Platform, and verification and model confidence testing. The integration of the Platform and HPC capabilities was tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities in 2012. The current maturity of the ASCEM computational and analysis capabilities has afforded the opportunity for collaborative efforts to develop decision analysis tools to support and optimize radioactive waste disposal. Recent advances in computerized decision analysis frameworks provide the perfect opportunity to bring this capability into ASCEM. This will allow radioactive waste