WorldWideScience

Sample records for advanced mechanistic code

  1. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  2. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  3. Advanced REACH Tool (ART): Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  4. Huffman coding in advanced audio coding standard

    Science.gov (United States)

    Brzuchalski, Grzegorz

    2012-05-01

    This article presents several hardware architectures for the Advanced Audio Coding (AAC) Huffman noiseless encoder, their optimisations and a working implementation. Much attention has been paid to minimising the demand for hardware resources, especially memory size. The aim of the design was to produce as short a binary stream as possible within this standard. The Huffman encoder, together with the whole audio-video system, has been implemented in FPGA devices.
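
    As context for the noiseless coding stage described above, the sketch below builds a generic Huffman code for a block of quantized values in Python. It is only an illustration of the principle: the real AAC encoder selects among predefined spectral codebooks rather than building a tree per block, and the symbol values here are made up.

      # Illustrative sketch only: a generic Huffman code builder, not the AAC
      # codebook scheme (AAC selects among predefined Huffman tables per
      # scale-factor band). Symbol values below are arbitrary.
      import heapq
      from collections import Counter

      def build_huffman_code(symbols):
          """Return a dict mapping each symbol to its prefix-free bitstring."""
          freq = Counter(symbols)
          # Heap entries: (weight, tie_breaker, {symbol: code_so_far})
          heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
          heapq.heapify(heap)
          tie = len(heap)
          while len(heap) > 1:
              w1, _, t1 = heapq.heappop(heap)
              w2, _, t2 = heapq.heappop(heap)
              merged = {s: "0" + c for s, c in t1.items()}        # left subtree
              merged.update({s: "1" + c for s, c in t2.items()})  # right subtree
              heapq.heappush(heap, (w1 + w2, tie, merged))
              tie += 1
          return heap[0][2]

      # Encode a short run of quantized spectral values and report the bit count.
      data = [0, 0, 1, -1, 0, 2, 0, 0, -1, 0]
      table = build_huffman_code(data)
      bitstream = "".join(table[s] for s in data)
      print(table, bitstream, len(bitstream), "bits")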

  5. Mechanistic modelling of gaseous fission product behaviour in UO2 fuel by Rtop code

    International Nuclear Information System (INIS)

    Kanukova, V.D.; Khoruzhii, O.V.; Kourtchatov, S.Y.; Likhanskii, V.V.; Matveew, L.V.

    2002-01-01

    The current status of mechanistic modelling by the RTOP code of fission product behaviour in polycrystalline UO2 fuel is described. An outline of the code and the implemented physical models is presented. The general approach to code validation is discussed and exemplified by the results of validating the models of fuel oxidation and grain growth. The different models of intragranular and intergranular gas bubble behaviour have been tested and the sensitivity of the code within the framework of these models has been analysed. An analysis of available models of the resolution of grain face bubbles is also presented. The capabilities of the RTOP code are illustrated by modelling the behaviour of WWER fuel over the course of a comparative WWER-PWR experiment performed at Halden and by comparison with the Yanagisawa experiments. (author)
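
    The abstract does not reproduce RTOP's equations. As general background, intragranular gas release models of this kind are commonly checked against the classical Booth equivalent-sphere diffusion solution; the sketch below evaluates the familiar Booth approximations in Python. The grain radius and diffusion coefficient are placeholders, not RTOP inputs.

      # Hedged illustration: Booth equivalent-sphere estimate of intragranular
      # fission gas release, a common reference solution when validating
      # mechanistic fuel codes. All numerical values are placeholders.
      import math

      def booth_release_fraction(D, a, t):
          """Fractional release from a grain of radius a [m] after time t [s];
          D is the gas atom diffusion coefficient [m^2/s]."""
          Dp = D * t / a ** 2                      # dimensionless Booth time D't
          if Dp < 0.1:                             # short-time approximation
              return 6.0 * math.sqrt(Dp / math.pi) - 3.0 * Dp
          # first-term long-time asymptote
          return 1.0 - (6.0 / math.pi ** 2) * math.exp(-math.pi ** 2 * Dp)

      # ~1 year at a placeholder diffusion coefficient and a 5 micron grain radius
      print(booth_release_fraction(D=1e-21, a=5e-6, t=3.15e7))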

  6. Advanced hardware design for error correcting codes

    CERN Document Server

    Coussy, Philippe

    2015-01-01

    This book provides thorough coverage of error correcting techniques. It includes essential basic concepts and the latest advances on key topics in design, implementation, and optimization of hardware/software systems for error correction. The book’s chapters are written by internationally recognized experts in this field. Topics include evolution of error correction techniques, industrial user needs, architectures, and design approaches for the most advanced error correcting codes (Polar Codes, Non-Binary LDPC, Product Codes, etc.). This book provides access to recent results, and is suitable for graduate students and researchers of mathematics, computer science, and engineering. • Examines how to optimize the architecture of hardware design for error correcting codes; • Presents error correction codes from theory to optimized architecture for the current and the next generation standards; • Provides coverage of industrial user needs for advanced error correcting techniques.

  7. Advanced Code for Photocathode Design

    Energy Technology Data Exchange (ETDEWEB)

    Ives, Robert Lawrence [Calabazas Creek Research, Inc., San Mateo, CA (United States); Jensen, Kevin [Naval Research Lab. (NRL), Washington, DC (United States); Montgomery, Eric [Univ. of Maryland, College Park, MD (United States); Bui, Thuc [Calabazas Creek Research, Inc., San Mateo, CA (United States)

    2015-12-15

    The Phase I activity demonstrated that PhotoQE could be upgraded and modified to allow input using a graphical user interface. Specific calls to platform-dependent (e.g. IMSL) functions were removed, and Fortran77 components were rewritten for Fortran95 compliance. The subroutines, specifically the common block structures and shared data parameters, were reworked to allow the GUI to update material parameter data, and the system was targeted for desktop personal computer operation. The new structure overcomes the previous rigid and unmodifiable library structure by implementing new materials library data sets and moving the library values to external files. Material data may originate from published literature or experimental measurements. Further optimization and restructuring would allow custom and specific emission models for beam codes that rely on parameterized photoemission algorithms. These would be based on simplified and parametric representations updated and extended from previous versions (e.g., Modified Fowler-Dubridge, Modified Three-Step, etc.).

  8. Zebra: An advanced PWR lattice code

    International Nuclear Information System (INIS)

    Cao, L.; Wu, H.; Zheng, Y.

    2012-01-01

    This paper presents an overview of ZEBRA, an advanced PWR lattice code developed at the NECP laboratory of Xi'an Jiaotong University. The multi-group cross-section library is generated from the ENDF/B-VII library by NJOY and the 361-group SHEM structure is employed. The resonance calculation module is based on the subgroup method. The transport solver is the self-developed Auto-MOC code, based on the Method of Characteristics and customization of the AutoCAD software. The whole code is organized in a modular software structure. Numerical results obtained during validation of the code demonstrate that it has good precision and high efficiency. (authors)

  9. Zebra: An advanced PWR lattice code

    Energy Technology Data Exchange (ETDEWEB)

    Cao, L.; Wu, H.; Zheng, Y. [School of Nuclear Science and Technology, Xi' an Jiaotong Univ., No. 28, Xianning West Road, Xi' an, ShannXi, 710049 (China)

    2012-07-01

    This paper presents an overview of ZEBRA, an advanced PWR lattice code developed at the NECP laboratory of Xi'an Jiaotong University. The multi-group cross-section library is generated from the ENDF/B-VII library by NJOY and the 361-group SHEM structure is employed. The resonance calculation module is based on the subgroup method. The transport solver is the self-developed Auto-MOC code, based on the Method of Characteristics and customization of the AutoCAD software. The whole code is organized in a modular software structure. Numerical results obtained during validation of the code demonstrate that it has good precision and high efficiency. (authors)

  10. Advanced thermionic reactor systems design code

    International Nuclear Information System (INIS)

    Lewis, B.R.; Pawlowski, R.A.; Greek, K.J.; Klein, A.C.

    1991-01-01

    An overall systems design code is under development to model an advanced in-core thermionic nuclear reactor system for space applications at power levels of 10 to 50 kWe. The design code is written in an object-oriented programming environment that allows the use of a series of design modules, each of which is responsible for the determination of specific system parameters. The code modules include a neutronics and core criticality module, a core thermal hydraulics module, a thermionic fuel element performance module, a radiation shielding module, a module for waste heat transfer and rejection, and modules for power conditioning and control. The neutronics and core criticality module determines critical core size, core lifetime, and shutdown margins using the criticality calculation capability of the Monte Carlo Neutron and Photon Transport Code System (MCNP). The remaining modules utilize results of the MCNP analysis along with FORTRAN programming to predict the overall system performance
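
    The following Python sketch mirrors the modular structure described above: each design module fills in its own parameters and a driver chains them for one design point. The class names, attribute names, and numbers are invented placeholders for illustration; in the actual code the neutronics step comes from MCNP and the remaining modules are FORTRAN.

      # Architecture sketch only: invented module and parameter names, placeholder
      # physics. It illustrates the chained-module pattern, not the real code.
      from dataclasses import dataclass, field

      @dataclass
      class DesignPoint:
          power_kwe: float                         # requested electrical power
          parameters: dict = field(default_factory=dict)

      class ThermionicFuelElementModule:
          def run(self, dp: DesignPoint) -> None:
              # crude placeholder: assume ~10% thermionic conversion efficiency
              dp.parameters["thermal_power_kwt"] = dp.power_kwe / 0.10

      class NeutronicsModule:
          def run(self, dp: DesignPoint) -> None:
              # stands in for an MCNP criticality calculation
              kwt = dp.parameters["thermal_power_kwt"]
              dp.parameters["core_radius_m"] = 0.25 + 0.0002 * kwt
              dp.parameters["k_eff"] = 1.02

      class ThermalHydraulicsModule:
          def run(self, dp: DesignPoint) -> None:
              kwt = dp.parameters["thermal_power_kwt"]
              dp.parameters["coolant_outlet_K"] = 900.0 + 0.2 * kwt

      def run_design_point(power_kwe: float) -> DesignPoint:
          dp = DesignPoint(power_kwe=power_kwe)
          for module in (ThermionicFuelElementModule(), NeutronicsModule(),
                         ThermalHydraulicsModule()):
              module.run(dp)                       # each module adds its parameters
          return dp

      print(run_design_point(25.0).parameters)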

  11. Zebrafish Models of Human Leukemia: Technological Advances and Mechanistic Insights.

    Science.gov (United States)

    Harrison, Nicholas R; Laroche, Fabrice J F; Gutierrez, Alejandro; Feng, Hui

    2016-01-01

    Insights concerning leukemic pathophysiology have been acquired in various animal models and further efforts to understand the mechanisms underlying leukemic treatment resistance and disease relapse promise to improve therapeutic strategies. The zebrafish (Danio rerio) is a vertebrate organism with a conserved hematopoietic program and unique experimental strengths suiting it for the investigation of human leukemia. Recent technological advances in zebrafish research including efficient transgenesis, precise genome editing, and straightforward transplantation techniques have led to the generation of a number of leukemia models. The transparency of the zebrafish when coupled with improved lineage-tracing and imaging techniques has revealed exquisite details of leukemic initiation, progression, and regression. With these advantages, the zebrafish represents a unique experimental system for leukemic research and additionally, advances in zebrafish-based high-throughput drug screening promise to hasten the discovery of novel leukemia therapeutics. To date, investigators have accumulated knowledge of the genetic underpinnings critical to leukemic transformation and treatment resistance and without doubt, zebrafish are rapidly expanding our understanding of disease mechanisms and helping to shape therapeutic strategies for improved outcomes in leukemic patients.

  12. AECL's advanced code program

    Energy Technology Data Exchange (ETDEWEB)

    McGee, G.; Ball, J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    2012-07-01

    This paper discusses the advanced code project at AECL. The current suite of Analytical, Scientific and Design (ASD) computer codes in use by the Canadian nuclear power industry was mostly developed 20 or more years ago. It is increasingly difficult to develop and maintain, consists of many independent tools, and makes integrated analysis difficult, time consuming and error-prone. The objectives of this project are to demonstrate that nuclear facility systems, structures and components meet their design objectives in terms of function, cost, and safety; that the nuclear facility meets licensing requirements in terms of the consequences of off-normal events (dose to the public and workers, impact on the environment); and that the nuclear facility meets operational requirements with respect to on-power fuelling and outage management.

  13. Numerical analysis of reflood simulation based on a mechanistic, best-estimate, approach by KREWET code

    International Nuclear Information System (INIS)

    Chun, Moon-Hyun; Jeong, Eun-Soo

    1983-01-01

    A new computer code entitled KREWET has been developed in an effort to improve the accuracy and applicability of the existing reflood heat transfer simulation computer code. Sample calculations for temperature histories and heat transfer coefficient are made using the KREWET code and the results are compared with the predictions of REFLUX, QUEN1D, and the PWR-FLECHT data for various conditions. These show favourable agreement in terms of clad temperature versus time. For high flooding rates (5-15 cm/sec) and high pressure (∼413 kPa), reflood conditions are reasonably well predicted by the KREWET code as well as by the other codes. For low flooding rates (less than ∼4 cm/sec) and low pressure (∼138 kPa), predictions show considerable error in evaluating the rewet position versus time. This observation is common to all the codes examined in the present work.

  14. Numerical analysis for reflood simulation based on a mechanistic, best-estimate, approach by KREWET code

    International Nuclear Information System (INIS)

    Chun, M.-H.; Jeong, E.-S.

    1983-01-01

    A new computer code entitled KREWET has been developed in an effort to improve the accuracy and applicability of the existing reflood heat transfer simulation computer code. Sample calculations for temperature histories and heat transfer coefficient are made using the KREWET code and the results are compared with the predictions of REFLUX, QUEN1D, and the PWR-FLECHT data for various conditions. These show favorable agreement in terms of clad temperature versus time. For high flooding rates (5-15 cm/sec) and high pressure (approx. 413 kPa), reflood conditions are reasonably well predicted by the KREWET code as well as by the other codes. For low flooding rates (less than approx. 4 cm/sec) and low pressure (approx. 138 kPa), predictions show considerable error in evaluating the rewet position versus time. This observation is common to all the codes examined in the present work.

  15. Advanced Imaging Optics Utilizing Wavefront Coding.

    Energy Technology Data Exchange (ETDEWEB)

    Scrymgeour, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Boye, Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Adelsberger, Kathleen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    Image processing offers a potential to simplify an optical system by shifting some of the imaging burden from lenses to the more cost effective electronics. Wavefront coding using a cubic phase plate combined with image processing can extend the system's depth of focus, reducing many of the focus-related aberrations as well as material related chromatic aberrations. However, the optimal design process and physical limitations of wavefront coding systems with respect to first-order optical parameters and noise are not well documented. We examined image quality of simulated and experimental wavefront coded images before and after reconstruction in the presence of noise. Challenges in the implementation of cubic phase in an optical system are discussed. In particular, we found that limitations must be placed on system noise, aperture, field of view and bandwidth to develop a robust wavefront coded system.
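
    A small numerical experiment helps make the trade-off concrete. The Python sketch below, with arbitrary parameters, compares the point spread function of a clear circular pupil with one carrying a cubic phase mask at two defocus values: the coded PSF is broader but changes far less with defocus, which is what the subsequent reconstruction step exploits.

      # Minimal sketch (arbitrary units and mask strength): PSF of a clear pupil
      # versus a cubic-phase pupil at two defocus settings.
      import numpy as np

      N = 256
      x = np.linspace(-1, 1, N)
      X, Y = np.meshgrid(x, x)
      pupil = (X ** 2 + Y ** 2 <= 1.0).astype(float)

      def psf(defocus_waves, alpha=0.0):
          """Incoherent PSF for a given defocus (waves) and cubic-phase strength alpha."""
          phase = 2 * np.pi * (defocus_waves * (X ** 2 + Y ** 2)
                               + alpha * (X ** 3 + Y ** 3))
          field = pupil * np.exp(1j * phase)
          h = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
          return h / h.sum()

      for d in (0.0, 2.0):                         # in focus vs. strongly defocused
          plain, coded = psf(d), psf(d, alpha=20.0)
          print(f"defocus={d}: peak plain={plain.max():.2e}, peak coded={coded.max():.2e}")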

  16. Iterative Systems Biology for Medicine – time for advancing from network signature to mechanistic equations

    KAUST Repository

    Gomez-Cabrero, David

    2017-05-09

    The rise and growth of Systems Biology following the sequencing of the human genome has been astounding. Early on, an iterative wet-dry methodology was formulated which turned out to be a successful approach to deciphering biological complexity. This type of analysis effectively identified and associated molecular network signatures operative in biological processes across different systems. Yet it has proven difficult to distinguish between causes and consequences, making it challenging to attack medical questions where we require precise causative drug targets and disease mechanisms beyond a web of associated markers. Here we review principal advances with regard to identification of the structure, dynamics, control, and design of biological systems, following the structure of the visionary 2002 review by Dr. Kitano. We find that the underlying challenge of identifying the governing mechanistic system equations enabling precision medicine remains open, rendering clinical translation of systems biology arduous. However, stunning advances in raw computational power and the generation of high-precision, multi-faceted biological data, combined with powerful algorithms, hold promise to set the stage for data-driven identification of equations embodying a fundamental understanding of living systems during health and disease.

  17. Recent advances in multiview distributed video coding

    Science.gov (United States)

    Dufaux, Frederic; Ouaret, Mourad; Ebrahimi, Touradj

    2007-04-01

    We consider dense networks of surveillance cameras capturing overlapped images of the same scene from different viewing directions, such a scenario being referred to as multi-view. Data compression is paramount in such a system due to the large amount of captured data. In this paper, we propose a Multi-view Distributed Video Coding approach. It allows for low complexity / low power consumption at the encoder side, and the exploitation of inter-view correlation without communications among the cameras. We introduce a combination of temporal intra-view side information and homography inter-view side information. Simulation results show both the improvement of the side information, as well as a significant gain in terms of coding efficiency.
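
    To illustrate the idea of combining the two kinds of side information, the Python sketch below fuses a temporally interpolated candidate with a homography-warped inter-view candidate using a simple block-wise selection rule (pick whichever candidate better matches the previously decoded frame). This rule and all test data are illustrative assumptions, not the fusion method of the paper.

      # Hedged sketch: block-wise fusion of two side-information candidates for a
      # Wyner-Ziv frame. The selection criterion and test data are made up.
      import numpy as np

      def fuse_side_information(temporal_si, interview_si, prev_decoded, block=8):
          fused = np.empty_like(temporal_si)
          h, w = temporal_si.shape
          for r in range(0, h, block):
              for c in range(0, w, block):
                  sl = (slice(r, r + block), slice(c, c + block))
                  err_t = np.mean((temporal_si[sl] - prev_decoded[sl]) ** 2)
                  err_v = np.mean((interview_si[sl] - prev_decoded[sl]) ** 2)
                  fused[sl] = temporal_si[sl] if err_t <= err_v else interview_si[sl]
          return fused

      rng = np.random.default_rng(0)
      truth = rng.integers(0, 256, (64, 64)).astype(float)
      fused = fuse_side_information(truth + rng.normal(0, 5, truth.shape),
                                    truth + rng.normal(0, 15, truth.shape),
                                    truth)
      print("fused side-information MSE:", float(np.mean((fused - truth) ** 2)))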

  18. A strategy of implementation of the improved constitutive equations for the advanced subchannel code

    International Nuclear Information System (INIS)

    Shirai, Hiroshi; Hotta, Akitoshi; Ninokata, Hisashi

    2004-01-01

    To develop the advanced subchannel analysis code, the dominant factors that influence the boiling transition process must be taken into account in mechanistic constitutive equations based on the flow geometries and the fluid properties. These dominant factors are (1) gas-liquid redistribution by cross flow, (2) liquid film dryout, (3) two-phase flow regime transition, (4) droplet deposition, and (5) spacer-droplet interaction. First, the strategy for developing constitutive equations for the five dominant factors is set out, based on an experimental database built with the latest measurement techniques and computational fluid dynamics methods. The shortcomings of the present constitutive equations and a plan for their improvement are then identified. Finally, the layered structure of the two-phase/three-field subchannel code incorporating the new constitutive equations is designed. (author)

  19. Foundational development of an advanced nuclear reactor integrated safety code

    International Nuclear Information System (INIS)

    Clarno, Kevin; Lorber, Alfred Abraham; Pryor, Richard J.; Spotz, William F.; Schmidt, Rodney Cannon; Belcourt, Kenneth; Hooper, Russell Warren; Humphries, Larry LaRon

    2010-01-01

    This report describes the activities and results of a Sandia LDRD project whose objective was to develop and demonstrate foundational aspects of a next-generation nuclear reactor safety code that leverages advanced computational technology. The project scope was directed towards the systems-level modeling and simulation of an advanced, sodium cooled fast reactor, but the approach developed has a more general applicability. The major accomplishments of the LDRD are centered around the following two activities. (1) The development and testing of LIME, a Lightweight Integrating Multi-physics Environment for coupling codes that is designed to enable both 'legacy' and 'new' physics codes to be combined and strongly coupled using advanced nonlinear solution methods. (2) The development and initial demonstration of BRISC, a prototype next-generation nuclear reactor integrated safety code. BRISC leverages LIME to tightly couple the physics models in several different codes (written in a variety of languages) into one integrated package for simulating accident scenarios in a liquid sodium cooled 'burner' nuclear reactor. Other activities and accomplishments of the LDRD include (a) further development, application and demonstration of the 'non-linear elimination' strategy to enable physics codes that do not provide residuals to be incorporated into LIME, (b) significant extensions of the RIO CFD code capabilities, (c) complex 3D solid modeling and meshing of major fast reactor components and regions, and (d) an approach for multi-physics coupling across non-conformal mesh interfaces.

  20. Foundational development of an advanced nuclear reactor integrated safety code.

    Energy Technology Data Exchange (ETDEWEB)

    Clarno, Kevin (Oak Ridge National Laboratory, Oak Ridge, TN); Lorber, Alfred Abraham; Pryor, Richard J.; Spotz, William F.; Schmidt, Rodney Cannon; Belcourt, Kenneth (Ktech Corporation, Albuquerque, NM); Hooper, Russell Warren; Humphries, Larry LaRon

    2010-02-01

    This report describes the activities and results of a Sandia LDRD project whose objective was to develop and demonstrate foundational aspects of a next-generation nuclear reactor safety code that leverages advanced computational technology. The project scope was directed towards the systems-level modeling and simulation of an advanced, sodium cooled fast reactor, but the approach developed has a more general applicability. The major accomplishments of the LDRD are centered around the following two activities. (1) The development and testing of LIME, a Lightweight Integrating Multi-physics Environment for coupling codes that is designed to enable both 'legacy' and 'new' physics codes to be combined and strongly coupled using advanced nonlinear solution methods. (2) The development and initial demonstration of BRISC, a prototype next-generation nuclear reactor integrated safety code. BRISC leverages LIME to tightly couple the physics models in several different codes (written in a variety of languages) into one integrated package for simulating accident scenarios in a liquid sodium cooled 'burner' nuclear reactor. Other activities and accomplishments of the LDRD include (a) further development, application and demonstration of the 'non-linear elimination' strategy to enable physics codes that do not provide residuals to be incorporated into LIME, (b) significant extensions of the RIO CFD code capabilities, (c) complex 3D solid modeling and meshing of major fast reactor components and regions, and (d) an approach for multi-physics coupling across non-conformal mesh interfaces.

  1. Recent advances in mathematical modeling of developmental abnormalities using mechanistic information.

    Science.gov (United States)

    Kavlock, R J

    1997-01-01

    During the last several years, significant changes in the risk assessment process for developmental toxicity of environmental contaminants have begun to emerge. The first of these changes is the development and beginning use of statistically based dose-response models [the benchmark dose (BMD) approach] that better utilize data derived from existing testing approaches. Accompanying this change is the greater emphasis placed on understanding and using mechanistic information to yield more accurate, reliable, and less uncertain risk assessments. The next stage in the evolution of risk assessment will be the use of biologically based dose-response (BBDR) models that begin to build into the statistically based models factors related to the underlying kinetic, biochemical, and/or physiologic processes perturbed by a toxicant. Such models are now emerging from several research laboratories. The introduction of quantitative models and the incorporation of biologic information into them has pointed to the need for even more sophisticated modifications for which we offer the term embryologically based dose-response (EBDR) models. Because these models would be based upon the understanding of normal morphogenesis, they represent a quantum leap in our thinking, but their complexity presents daunting challenges both to the developmental biologist and the developmental toxicologist. Implementation of these models will require extensive communication between developmental toxicologists, molecular embryologists, and biomathematicians. The remarkable progress in the understanding of mammalian embryonic development at the molecular level that has occurred over the last decade combined with advances in computing power and computational models should eventually enable these as yet hypothetical models to be brought into use.
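
    As a concrete (and purely illustrative) example of the benchmark dose idea, the Python sketch below fits a simple log-logistic dose-response model to made-up quantal data and solves for the dose producing 10% extra risk. The model form, data, and starting values are assumptions for illustration; they do not represent EPA's BMDS software or any specific model discussed in the review.

      # Toy benchmark-dose calculation with fabricated quantal data; the model
      # form and all numbers are illustrative assumptions only.
      import numpy as np
      from scipy.optimize import curve_fit, brentq

      doses = np.array([0.0, 10.0, 30.0, 100.0, 300.0])
      affected = np.array([2, 3, 8, 18, 27])
      total = np.array([30, 30, 30, 30, 30])
      observed = affected / total

      def log_logistic(d, background, slope, ed50):
          d_safe = np.maximum(d, 1e-9)             # avoid division by zero at d = 0
          return background + (1 - background) / (1 + (ed50 / d_safe) ** slope)

      (bg, slope, ed50), _ = curve_fit(log_logistic, doses, observed,
                                       p0=[0.05, 1.0, 100.0],
                                       bounds=([0.0, 0.1, 1.0], [0.5, 10.0, 1000.0]))

      def extra_risk(d):
          p = log_logistic(d, bg, slope, ed50)
          return (p - bg) / (1 - bg)

      bmd10 = brentq(lambda d: extra_risk(d) - 0.10, 1e-3, 1e4)   # 10% extra risk
      print(f"background={bg:.3f}, slope={slope:.2f}, ED50={ed50:.1f}, BMD10={bmd10:.1f}")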

  2. Advanced codes and methods supporting improved fuel cycle economics - 5493

    International Nuclear Information System (INIS)

    Curca-Tivig, F.; Maupin, K.; Thareau, S.

    2015-01-01

    AREVA's code development program was practically completed in 2014. The basic codes supporting a new generation of advanced methods are the following. GALILEO is a state-of-the-art fuel rod performance code for PWR and BWR applications; development is complete and implementation has started in France and the U.S.A. ARCADIA-1 is a state-of-the-art neutronics/thermal-hydraulics/thermal-mechanics code system for PWR applications; development is complete and implementation has started in Europe and in the U.S.A. The system thermal-hydraulic codes S-RELAP5 and CATHARE-2 are not really new but are still state-of-the-art in the domain. S-RELAP5 was completely restructured and re-coded so that its life cycle is extended by further decades. CATHARE-2 will be replaced in the future by the new CATHARE-3. The new AREVA codes and methods are largely based on first-principles modeling with an extremely broad international verification and validation data base. This enables AREVA and its customers to access more predictable licensing processes in a fast-evolving regulatory environment (new safety criteria, requests for enlarged qualification databases, statistical applications, uncertainty propagation...). In this context, the advanced codes and methods and the associated verification and validation represent the key to avoiding penalties on products, on operational limits, or on the methodologies themselves.

  3. Recent advances in neutral particle transport methods and codes

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1996-01-01

    An overview of ORNL's three-dimensional neutral particle transport code, TORT, is presented. Special features of the code that make it invaluable for large applications are summarized for the prospective user. Advanced capabilities currently under development and installation in the production release of TORT are discussed; they include multitasking on Cray platforms running the UNICOS operating system, the Adjacent-cell Preconditioning acceleration scheme, and graphics codes for displaying computed quantities such as the flux. Further developments of TORT and its companion codes to enhance their present capabilities and expand their range of applications are also discussed. Speculation on the next generation of neutral particle transport codes at ORNL, especially regarding unstructured grids and high-order spatial approximations, is also offered.

  4. Recent advances in the Poisson/superfish codes

    International Nuclear Information System (INIS)

    Ryne, R.; Barts, T.; Chan, K.C.D.; Cooper, R.; Deaven, H.; Merson, J.; Rodenz, G.

    1992-01-01

    We report on advances in the POISSON/SUPERFISH family of codes used in the design and analysis of magnets and rf cavities. The codes include preprocessors for mesh generation and postprocessors for graphical display of output and calculation of auxiliary quantities. Release 3 became available in January 1992; it contains many code corrections and physics enhancements, and it also includes support for PostScript, DISSPLA, GKS and PLOT10 graphical output. Release 4 will be available in September 1992; it is free of all bit packing, making the codes more portable and able to treat very large numbers of mesh points. Release 4 includes the preprocessor FRONT and a new menu-driven graphical postprocessor that runs on workstations under X-Windows and that is capable of producing arrow plots. We will present examples that illustrate the new capabilities of the codes. (author). 6 refs., 3 figs

  5. ANDREA: Advanced nodal diffusion code for reactor analysis

    International Nuclear Information System (INIS)

    Belac, J.; Josek, R.; Klecka, L.; Stary, V.; Vocka, R.

    2005-01-01

    A new macro code is being developed at NRI which will allow coupling of the advanced thermal-hydraulics model with neutronics calculations as well as efficient use in core loading pattern optimization process. This paper describes the current stage of the macro code development. The core simulator is based on the nodal expansion method, Helios lattice code is used for few group libraries preparation. Standard features such as pin wise power reconstruction and feedback iterations on critical control rod position, boron concentration and reactor power are implemented. A special attention is paid to the system and code modularity in order to enable flexible and easy implementation of new features in future. Precision of the methods used in the macro code has been verified on available benchmarks. Testing against Temelin PWR operational data is under way (Authors)

  6. Evaluation of Advanced Models for PAFS Condensation Heat Transfer in SPACE Code

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Byoung-Uhn; Kim, Seok; Park, Yu-Sun; Kang, Kyung Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Ahn, Tae-Hwan; Yun, Byong-Jo [Pusan National University, Busan (Korea, Republic of)

    2015-10-15

    The PAFS (Passive Auxiliary Feedwater System) is operated by natural circulation to remove the core decay heat through the PCHX (Passive Condensation Heat Exchanger), which is composed of nearly horizontal tubes. For validation of the cooling and operational performance of the PAFS, the PASCAL (PAFS Condensing Heat Removal Assessment Loop) facility was constructed and the condensation heat transfer and natural convection phenomena in the PAFS were experimentally investigated at KAERI (Korea Atomic Energy Research Institute). From the PASCAL experimental results, it was found that the conventional system analysis code underestimated the condensation heat transfer. In this study, advanced condensation heat transfer models that can treat the heat transfer mechanisms of the different flow regimes in the nearly horizontal heat exchanger tube were analyzed. The models were implemented in a thermal hydraulic safety analysis code, SPACE (Safety and Performance Analysis Code for Nuclear Power Plant), and evaluated against the PASCAL experimental data. With the aim of enhancing the prediction capability for the condensation phenomenon inside the PCHX tube of the PAFS, the advanced condensation heat transfer models were implemented into the wall condensation model of the SPACE code, and the PASCAL experimental results were utilized to validate them. Calculation results showed that the improved condensation heat transfer model enhanced the prediction capability of the SPACE code. This confirms that mechanistic modeling of film condensation in the steam phase and of convection in the condensate liquid contributed to enhancing the prediction capability of the wall condensation model of the SPACE code and to reducing conservatism in the prediction of condensation heat transfer.
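
    As background on the kind of regime-specific closure referred to above, the Python sketch below evaluates a textbook Chato-type correlation for laminar, stratified film condensation inside a nearly horizontal tube. It is not the wall condensation model of the SPACE code, and the water/steam property values are rough placeholders.

      # Background sketch only: Chato-type film condensation correlation for a
      # nearly horizontal tube, with placeholder saturated-water properties.
      def chato_condensation_htc(rho_l, rho_v, k_l, mu_l, cp_l, h_fg, d_tube, dT):
          """Mean condensation heat transfer coefficient [W/m^2-K], stratified regime."""
          h_fg_prime = h_fg + 0.375 * cp_l * dT            # corrected latent heat
          num = 9.81 * rho_l * (rho_l - rho_v) * k_l ** 3 * h_fg_prime
          den = mu_l * dT * d_tube
          return 0.555 * (num / den) ** 0.25

      htc = chato_condensation_htc(rho_l=917.0, rho_v=1.1, k_l=0.68, mu_l=1.8e-4,
                                   cp_l=4340.0, h_fg=2.07e6, d_tube=0.045, dT=20.0)
      print(f"estimated condensation HTC ~ {htc:.0f} W/m^2-K")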

  7. Computer code qualification program for the Advanced CANDU Reactor

    International Nuclear Information System (INIS)

    Popov, N.K.; Wren, D.J.; Snell, V.G.; White, A.J.; Boczar, P.G.

    2003-01-01

    Atomic Energy of Canada Ltd (AECL) has developed and implemented a Software Quality Assurance program (SQA) to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. This paper provides an overview of the computer programs used in Advanced CANDU Reactor (ACR) safety analysis, and assessment of their applicability in the safety analyses of the ACR design. An outline of the incremental validation program, and an overview of the experimental program in support of the code validation are also presented. An outline of the SQA program used to qualify these computer codes is also briefly presented. To provide context to the differences in the SQA with respect to current CANDUs, the paper also provides an overview of the ACR design features that have an impact on the computer code qualification. (author)

  8. On-line application of the PANTHER advanced nodal code

    International Nuclear Information System (INIS)

    Hutt, P.K.; Knight, M.P.

    1992-01-01

    Over the last few years, Nuclear Electric has developed an integrated core performance code package for both light water reactors (LWRs) and advanced gas-cooled reactors (AGRs) that can perform a comprehensive range of calculations for fuel cycle design, safety analysis, and on-line operational support for such plants. The package consists of the following codes: WIMS for lattice physics, PANTHER whole reactor nodal flux and AGR thermal hydraulics, VIPRE for LWR thermal hydraulics, and ENIGMA for fuel performance. These codes are integrated within a UNIX-based interactive system called the Reactor Physics Workbench (RPW), which provides an interactive graphic user interface and quality assurance records/data management. The RPW can also control calculational sequences and data flows. The package has been designed to run both off-line and on-line accessing plant data through the RPW

  9. FAST: An advanced code system for fast reactor transient analysis

    International Nuclear Information System (INIS)

    Mikityuk, Konstantin; Pelloni, Sandro; Coddington, Paul; Bubelis, Evaldas; Chawla, Rakesh

    2005-01-01

    One of the main goals of the FAST project at PSI is to establish a unique analytical code capability for the core and safety analysis of advanced critical (and sub-critical) fast-spectrum systems for a wide range of different coolants. Both static and transient core physics, as well as the behaviour and safety of the power plant as a whole, are studied. The paper discusses the structure of the code system, including the organisation of the interfaces and data exchange. Examples of validation and application of the individual programs, as well as of the complete code system, are provided using studies carried out within the context of designs for experimental accelerator-driven, fast-spectrum systems

  10. ADVANCED ELECTRIC AND MAGNETIC MATERIAL MODELS FOR FDTD ELECTROMAGNETIC CODES

    Energy Technology Data Exchange (ETDEWEB)

    Poole, B R; Nelson, S D; Langdon, S

    2005-05-05

    The modeling of dielectric and magnetic materials in the time domain is required for pulse power applications, pulsed induction accelerators, and advanced transmission lines. For example, most induction accelerator modules require the use of magnetic materials to provide adequate Volt-sec during the acceleration pulse. These models require hysteresis and saturation to simulate the saturation wavefront in a multipulse environment. In high voltage transmission line applications such as shock or soliton lines the dielectric is operating in a highly nonlinear regime, which requires nonlinear models. Simple 1-D models are developed for fast parameterization of transmission line structures. In the case of nonlinear dielectrics, a simple analytic model describing the permittivity in terms of electric field is used in a 3-D finite difference time domain code (FDTD). In the case of magnetic materials, both rate independent and rate dependent Hodgdon magnetic material models have been implemented into 3-D FDTD codes and 1-D codes.
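
    The abstract mentions "a simple analytic model describing the permittivity in terms of electric field" without stating its form. The Python sketch below shows, in one dimension, how such a field-dependent permittivity can be folded into an FDTD update; the saturating form eps(E) = eps0*er0/(1 + (E/E_sat)^2) and all numbers are assumptions for illustration, not the model of the cited work.

      # Toy 1-D FDTD loop with an assumed field-dependent permittivity; grid,
      # source and material numbers are arbitrary placeholders.
      import numpy as np

      c0, eps0, mu0 = 3e8, 8.854e-12, 4e-7 * np.pi
      nx, nt, dx = 400, 800, 1e-3
      dt = dx / (2 * c0)                           # Courant-stable time step
      er0, e_sat = 4.0, 5e4                        # low-field permittivity, saturation field

      E = np.zeros(nx)
      H = np.zeros(nx - 1)
      for n in range(nt):
          H += dt / (mu0 * dx) * (E[1:] - E[:-1])
          eps = eps0 * er0 / (1.0 + (E[1:-1] / e_sat) ** 2)   # nonlinear permittivity
          E[1:-1] += dt / (eps * dx) * (H[1:] - H[:-1])
          E[50] += 1e5 * np.exp(-((n - 60) / 20.0) ** 2)      # soft Gaussian source
      print("peak |E| on the grid:", float(np.abs(E).max()), "V/m")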

  11. ADVANCED ELECTRIC AND MAGNETIC MATERIAL MODELS FOR FDTD ELECTROMAGNETIC CODES

    International Nuclear Information System (INIS)

    Poole, B R; Nelson, S D; Langdon, S

    2005-01-01

    The modeling of dielectric and magnetic materials in the time domain is required for pulse power applications, pulsed induction accelerators, and advanced transmission lines. For example, most induction accelerator modules require the use of magnetic materials to provide adequate Volt-sec during the acceleration pulse. These models require hysteresis and saturation to simulate the saturation wavefront in a multipulse environment. In high voltage transmission line applications such as shock or soliton lines the dielectric is operating in a highly nonlinear regime, which requires nonlinear models. Simple 1-D models are developed for fast parameterization of transmission line structures. In the case of nonlinear dielectrics, a simple analytic model describing the permittivity in terms of electric field is used in a 3-D finite difference time domain code (FDTD). In the case of magnetic materials, both rate independent and rate dependent Hodgdon magnetic material models have been implemented into 3-D FDTD codes and 1-D codes.

  12. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, Peter Andrew

    2011-12-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This Report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  13. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC)

    International Nuclear Information System (INIS)

    Schultz, Peter Andrew

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M and S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V and V) is required throughout the system to establish evidence-based metrics for the level of confidence in M and S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This Report outlines the nature of the V and V challenge at the subcontinuum scale, an approach to incorporate V and V concepts into subcontinuum scale modeling and simulation (M and S), and a plan to incrementally incorporate effective V and V into subcontinuum scale M and S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  14. Code qualification of structural materials for AFCI advanced recycling reactors.

    Energy Technology Data Exchange (ETDEWEB)

    Natesan, K.; Li, M.; Majumdar, S.; Nanstad, R.K.; Sham, T.-L. (Nuclear Engineering Division); (ORNL)

    2012-05-31

    This report summarizes the further findings from the assessments of current status and future needs in code qualification and licensing of reference structural materials and new advanced alloys for advanced recycling reactors (ARRs) in support of Advanced Fuel Cycle Initiative (AFCI). The work is a combined effort between Argonne National Laboratory (ANL) and Oak Ridge National Laboratory (ORNL) with ANL as the technical lead, as part of Advanced Structural Materials Program for AFCI Reactor Campaign. The report is the second deliverable in FY08 (M505011401) under the work package 'Advanced Materials Code Qualification'. The overall objective of the Advanced Materials Code Qualification project is to evaluate key requirements for the ASME Code qualification and the Nuclear Regulatory Commission (NRC) approval of structural materials in support of the design and licensing of the ARR. Advanced materials are a critical element in the development of sodium reactor technologies. Enhanced materials performance not only improves safety margins and provides design flexibility, but also is essential for the economics of future advanced sodium reactors. Code qualification and licensing of advanced materials are prominent needs for developing and implementing advanced sodium reactor technologies. Nuclear structural component design in the U.S. must comply with the ASME Boiler and Pressure Vessel Code Section III (Rules for Construction of Nuclear Facility Components) and the NRC grants the operational license. As the ARR will operate at higher temperatures than the current light water reactors (LWRs), the design of elevated-temperature components must comply with ASME Subsection NH (Class 1 Components in Elevated Temperature Service). However, the NRC has not approved the use of Subsection NH for reactor components, and this puts additional burdens on materials qualification of the ARR. In the past licensing review for the Clinch River Breeder Reactor Project (CRBRP

  15. Advanced Electric and Magnetic Material Models for FDTD Electromagnetic Codes

    CERN Document Server

    Poole, Brian R; Nelson, Scott D

    2005-01-01

    The modeling of dielectric and magnetic materials in the time domain is required for pulse power applications, pulsed induction accelerators, and advanced transmission lines. For example, most induction accelerator modules require the use of magnetic materials to provide adequate Volt-sec during the acceleration pulse. These models require hysteresis and saturation to simulate the saturation wavefront in a multipulse environment. In high voltage transmission line applications such as shock or soliton lines the dielectric is operating in a highly nonlinear regime, which requires nonlinear models. Simple 1-D models are developed for fast parameterization of transmission line structures. In the case of nonlinear dielectrics, a simple analytic model describing the permittivity in terms of electric field is used in a 3-D finite difference time domain code (FDTD). In the case of magnetic materials, both rate independent and rate dependent Hodgdon magnetic material models have been implemented into 3-D FDTD codes an...

  16. Beam Optics Analysis - An Advanced 3D Trajectory Code

    International Nuclear Information System (INIS)

    Ives, R. Lawrence; Bui, Thuc; Vogler, William; Neilson, Jeff; Read, Mike; Shephard, Mark; Bauer, Andrew; Datta, Dibyendu; Beal, Mark

    2006-01-01

    Calabazas Creek Research, Inc. has completed initial development of an advanced, 3D program for modeling electron trajectories in electromagnetic fields. The code is being used to design complex guns and collectors. Beam Optics Analysis (BOA) is a fully relativistic, charged particle code using adaptive, finite element meshing. Geometrical input is imported from CAD programs generating ACIS-formatted files. Parametric data is inputted using an intuitive, graphical user interface (GUI), which also provides control of convergence, accuracy, and post processing. The program includes a magnetic field solver, and magnetic information can be imported from Maxwell 2D/3D and other programs. The program supports thermionic emission and injected beams. Secondary electron emission is also supported, including multiple generations. Work on field emission is in progress as well as implementation of computer optimization of both the geometry and operating parameters. The principal features of the program and its capabilities are presented.

  17. Benchmarking of the PHOENIX-P/ANC [Advanced Nodal Code] advanced nuclear design system

    International Nuclear Information System (INIS)

    Nguyen, T.Q.; Liu, Y.S.; Durston, C.; Casadei, A.L.

    1988-01-01

    At Westinghouse, an advanced neutronic methods program was designed to improve the quality of the predictions, enhance flexibility in designing advanced fuel and related products, and improve design lead time. Extensive benchmarking data is presented to demonstrate the accuracy of the Advanced Nodal Code (ANC) and the PHOENIX-P advanced lattice code. Qualification data to demonstrate the accuracy of ANC include comparison of key physics parameters against a fine-mesh diffusion theory code, TORTISE. Benchmarking data to demonstrate the validity of the PHOENIX-P methodologies include comparison of physics predictions against critical experiments, isotopics measurements and measured power distributions from spatial criticals. The accuracy of the PHOENIX-P/ANC Advanced Design System is demonstrated by comparing predictions of hot zero power physics parameters and hot full power core follow against measured data from operating reactors. The excellent performance of this system for a broad range of comparisons establishes the basis for implementation of these tools for core design, licensing and operational follow of PWR [pressurized water reactor] cores at Westinghouse

  18. Static benchmarking of the NESTLE advanced nodal code

    International Nuclear Information System (INIS)

    Mosteller, R.D.

    1997-01-01

    Results from the NESTLE advanced nodal code are presented for multidimensional numerical benchmarks representing four different types of reactors, and predictions from NESTLE are compared with measured data from pressurized water reactors (PWRs). The numerical benchmarks include cases representative of PWRs, boiling water reactors (BWRs), CANDU heavy water reactors (HWRs), and high-temperature gas-cooled reactors (HTGRs). The measured PWR data include critical soluble boron concentrations and isothermal temperature coefficients of reactivity. The results demonstrate that NESTLE correctly solves the multigroup diffusion equations for both Cartesian and hexagonal geometries, that it reliably calculates k-eff and reactivity coefficients for PWRs, and that, subsequent to the incorporation of additional thermal-hydraulic models, it will be able to perform accurate calculations for the corresponding parameters in BWRs, HWRs, and HTGRs as well.

  19. Recent Advances in the Mechanistic Studies of Alkylaromatic Conversions over Zeolite Catalysts

    International Nuclear Information System (INIS)

    Min, Hyung-Ki; Hong, Suk Bong

    2013-01-01

    The transformation of alkylaromatic hydrocarbons using zeolite catalysts plays a big part in the current petrochemical industry. Here we review recent advances in the understanding of the reaction mechanisms of various alkylaromatic conversions with respect to the structural and physicochemical properties of the zeolite catalysts employed. Indeed, the shape-selective nature of zeolite catalysts determines the type of reaction intermediates and hence the prevailing reaction mechanism together with the product distribution. The prospect of zeolite catalysis in the development of more efficient petrochemical processes is also described.

  20. Role of Glyoxalase 1 (Glo1) and methylglyoxal (MG) in behavior: recent advances and mechanistic insights

    Directory of Open Access Journals (Sweden)

    Margaret G Distler

    2012-11-01

    Glyoxalase 1 (GLO1) is a ubiquitous cellular enzyme that participates in the detoxification of methylglyoxal (MG), a cytotoxic byproduct of glycolysis that induces protein modification (advanced glycation end-products, AGEs), oxidative stress, and apoptosis. The concentration of MG is elevated under high-glucose conditions, such as diabetes. As such, GLO1 and MG have been implicated in the pathogenesis of diabetic complications. Recently, findings have linked GLO1 to numerous behavioral phenotypes, including psychiatric diseases (anxiety, depression, schizophrenia, and autism) and pain. This review highlights GLO1’s association with behavioral phenotypes, describes recent discoveries that have elucidated the underlying mechanisms, and identifies opportunities for future research.

  1. A comparison of two nodal codes: Advanced nodal code (ANC) and analytic function expansion nodal (AFEN) code

    International Nuclear Information System (INIS)

    Chung, S.K.; Hah, C.J.; Lee, H.C.; Kim, Y.H.; Cho, N.Z.

    1996-01-01

    Modern nodal methods usually employ the transverse integration technique in order to reduce a multi-dimensional diffusion equation to one-dimensional diffusion equations. The use of the transverse integration technique requires two major approximations: a transverse leakage approximation and a one-dimensional flux approximation. Both the transverse leakage and the one-dimensional flux are approximated by polynomials. ANC (Advanced Nodal Code), developed by Westinghouse, employs a modern nodal expansion method for the flux calculation, the equivalence theory for homogenization error reduction and a group theory for pin power recovery. Unlike conventional modern nodal methods, the AFEN (Analytic Function Expansion Nodal) method expands homogeneous flux distributions within a node into non-separable analytic basis functions, thereby eliminating the two major approximations of the modern nodal methods. A comparison study of AFEN with ANC has been performed to see the applicability of AFEN to commercial PWRs and to different types of reactors such as MOX-fueled reactors. The qualification comparison results demonstrate that the AFEN methodology is accurate enough to apply to commercial PWR analysis. The results show that AFEN provides very accurate results (core multiplication factor and assembly power distribution) for cores that exhibit strong flux gradients, as in a MOX-loaded core. (author)

  2. Toward a Mechanistic Source Term in Advanced Reactors: Characterization of Radionuclide Transport and Retention in a Sodium Cooled Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Brunett, Acacia J.; Bucknor, Matthew; Grabaskas, David

    2016-04-17

    A vital component of the U.S. reactor licensing process is an integrated safety analysis in which a source term representing the release of radionuclides during normal operation and accident sequences is analyzed. Historically, source term analyses have utilized bounding, deterministic assumptions regarding radionuclide release. However, advancements in technical capabilities and the knowledge state have enabled the development of more realistic and best-estimate retention and release models such that a mechanistic source term assessment can be expected to be a required component of future licensing of advanced reactors. Recently, as part of a Regulatory Technology Development Plan effort for sodium cooled fast reactors (SFRs), Argonne National Laboratory has investigated the current state of knowledge of potential source terms in an SFR via an extensive review of previous domestic experiments, accidents, and operation. As part of this work, the significant sources and transport processes of radionuclides in an SFR have been identified and characterized. This effort examines all stages of release and source term evolution, beginning with release from the fuel pin and ending with retention in containment. Radionuclide sources considered in this effort include releases originating both in-vessel (e.g. in-core fuel, primary sodium, cover gas cleanup system, etc.) and ex-vessel (e.g. spent fuel storage, handling, and movement). Releases resulting from a primary sodium fire are also considered as a potential source. For each release group, dominant transport phenomena are identified and qualitatively discussed. The key product of this effort was the development of concise, inclusive diagrams that illustrate the release and retention mechanisms at a high level, where unique schematics have been developed for in-vessel, ex-vessel and sodium fire releases. This review effort has also found that despite the substantial range of phenomena affecting radionuclide release, the

  3. Development of boiling transition analysis code TCAPE-INS/B based on mechanistic methods for BWR fuel bundles. Models and validations with boiling transition experimental data

    International Nuclear Information System (INIS)

    Ishida, Naoyuki; Utsuno, Hideaki; Kasahara, Fumio

    2003-01-01

    The Boiling Transition (BT) analysis code TCAPE-INS/B, based on mechanistic methods coupled with subchannel analysis, has been developed for evaluating the integrity of Boiling Water Reactor (BWR) fuel rod bundles under abnormal operations. The objective of the development is to evaluate BT without using the empirical BT and rewetting correlations that current analysis methods require for each bundle design. TCAPE-INS/B consists mainly of a drift-flux model, a film flow model, a cross-flow model, a thermal conductivity model and heat transfer correlations. These models were validated systematically against experimental data, and the accuracy of the predicted steady-state Critical Heat Flux (CHF) and of the transient fuel rod surface temperature after the occurrence of BT was evaluated in these validations. Calculations for single-tube and bundle experiments were carried out to validate the models incorporated in the code. The results showed that the steady-state CHF was predicted within about 6% average error. In the transient calculations, the BT timing and the fuel rod surface temperature gradient agreed well with experimental results, but rewetting was predicted late, so the modeling of post-BT heat transfer phenomena is being revised. (author)
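
    The core of such a mechanistic BT method is the liquid film mass balance: BT is declared where the film flow rate on the heated surface falls to zero. The Python sketch below integrates a heavily simplified one-dimensional film balance with constant placeholder deposition and entrainment rates; it illustrates the bookkeeping only and is not the TCAPE-INS/B film flow model.

      # Simplified film-flow dryout illustration; deposition/entrainment closures
      # and all numbers are placeholders, not the models of the cited code.
      def film_dryout_position(w_film_inlet, q_wall, h_fg, perimeter, length,
                               deposition=0.02, entrainment=0.01, nz=1000):
          """Return the axial position [m] where the film dries out, or None."""
          dz = length / nz
          w_f = w_film_inlet                       # liquid film mass flow rate [kg/s]
          for i in range(nz):
              evaporation = q_wall * perimeter * dz / h_fg
              w_f += (deposition - entrainment) * dz - evaporation
              if w_f <= 0.0:
                  return (i + 1) * dz              # boiling transition location
          return None

      z_bt = film_dryout_position(w_film_inlet=0.02, q_wall=8e5, h_fg=1.5e6,
                                  perimeter=0.03, length=3.7)
      print("predicted dryout elevation [m]:", z_bt)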

  4. Telemetry advances in data compression and channel coding

    Science.gov (United States)

    Miller, Warner H.; Morakis, James C.; Yeh, Pen-Shu

    1990-01-01

    Addressed in this paper is the dependence of telecommunication channel, forward error correcting coding and source data compression coding on integrated circuit technology. Emphasis is placed on real time high speed Reed Solomon (RS) decoding using full custom VLSI technology. Performance curves of NASA's standard channel coder and a proposed standard lossless data compression coder are presented.
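
    By way of illustration only, the following minimal Python sketch encodes a frame with a textbook Reed-Solomon code over GF(2^8). It is not the NASA/CCSDS flight implementation discussed above; the primitive polynomial 0x11d and the 32 parity symbols (giving an RS(255, 223) code with t = 16 correctable symbol errors) are common textbook choices, and the standard hardware representation differs.

        # Minimal textbook Reed-Solomon encoder over GF(2^8) (illustrative only).
        PRIM = 0x11d  # common primitive polynomial; the CCSDS standard uses a different representation
        EXP = [0] * 512
        LOG = [0] * 256
        x = 1
        for i in range(255):
            EXP[i] = x
            LOG[x] = i
            x <<= 1
            if x & 0x100:
                x ^= PRIM
        for i in range(255, 512):
            EXP[i] = EXP[i - 255]

        def gf_mul(a, b):
            """Multiply two field elements using the log/antilog tables."""
            if a == 0 or b == 0:
                return 0
            return EXP[LOG[a] + LOG[b]]

        def poly_mul(p, q):
            """Multiply two polynomials with coefficients in GF(2^8)."""
            out = [0] * (len(p) + len(q) - 1)
            for i, a in enumerate(p):
                for j, b in enumerate(q):
                    out[i + j] ^= gf_mul(a, b)
            return out

        def rs_encode(msg, nsym=32):
            """Append nsym parity symbols (RS(255, 223) when len(msg) = 223 and nsym = 32)."""
            gen = [1]
            for i in range(nsym):
                gen = poly_mul(gen, [1, EXP[i]])      # roots at alpha^0 .. alpha^(nsym-1)
            rem = list(msg) + [0] * nsym
            for i in range(len(msg)):                 # polynomial division by the generator
                coef = rem[i]
                if coef:
                    for j in range(1, len(gen)):
                        rem[i + j] ^= gf_mul(gen[j], coef)
            return list(msg) + rem[len(msg):]

        codeword = rs_encode(list(b"telemetry frame"), nsym=32)
        # A matching decoder can correct up to nsym // 2 = 16 corrupted symbols per codeword.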

  5. The H.264/MPEG4 advanced video coding

    Science.gov (United States)

    Gromek, Artur

    2009-06-01

    H.264/MPEG4-AVC is the newest video coding standard recommended by the International Telecommunication Union - Telecommunication Standardization Sector (ITU-T) and the ISO/IEC Moving Picture Experts Group (MPEG). H.264/MPEG4-AVC has become the leading standard for generic audiovisual services since its deployment for digital television. It is now commonly used in a wide range of video applications, such as mobile services, videoconferencing, IPTV, HDTV, video storage and many more. In this article, the author briefly describes the technology applied in the H.264/MPEG4-AVC video coding standard, approaches to real-time implementation, and directions of future development.

  6. Advanced thermohydraulic simulation code for transients in LMFBRs (SSC-L code)

    Energy Technology Data Exchange (ETDEWEB)

    Agrawal, A.K.

    1978-02-01

    Physical models for various processes that are encountered in preaccident and transient simulation of thermohydraulic transients in the entire liquid metal fast breeder reactor (LMFBR) plant are described in this report. A computer code, SSC-L, was written as a part of the Super System Code (SSC) development project for the ''loop''-type designs of LMFBRs. This code has the self-starting capability, i.e., preaccident or steady-state calculations are performed internally. These results then serve as the starting point for the transient simulation.

  7. Advanced thermohydraulic simulation code for transients in LMFBRs (SSC-L code)

    International Nuclear Information System (INIS)

    Agrawal, A.K.

    1978-02-01

    Physical models for various processes that are encountered in preaccident and transient simulation of thermohydraulic transients in the entire liquid metal fast breeder reactor (LMFBR) plant are described in this report. A computer code, SSC-L, was written as a part of the Super System Code (SSC) development project for the ''loop''-type designs of LMFBRs. This code has the self-starting capability, i.e., preaccident or steady-state calculations are performed internally. These results then serve as the starting point for the transient simulation

  8. Application of the DART Code for the Assessment of Advanced Fuel Behavior

    International Nuclear Information System (INIS)

    Rest, J.; Totev, T.

    2007-01-01

    The Dispersion Analysis Research Tool (DART) code is a dispersion fuel analysis code that contains mechanistically-based fuel and reaction-product swelling models, a one dimensional heat transfer analysis, and mechanical deformation models. DART has been used to simulate the irradiation behavior of uranium oxide, uranium silicide, and uranium molybdenum aluminum dispersion fuels, as well as their monolithic counterparts. The thermal-mechanical DART code has been validated against RERTR tests performed in the ATR for irradiation data on interaction thickness, fuel, matrix, and reaction product volume fractions, and plate thickness changes. The DART fission gas behavior model has been validated against UO2 fission gas release data as well as measured fission gas bubble size distributions. Here DART is utilized to analyze various aspects of the observed bubble growth in U-Mo/Al interaction product. (authors)

  9. New Advances in Photoionisation Codes: How and what for?

    International Nuclear Information System (INIS)

    Ercolano, Barbara

    2005-01-01

    The study of photoionised gas in planetary nebulae (PNe) has played a major role in achieving, over the years, a better understanding of a number of physical processes pertinent to a broader range of fields than PNe studies alone, spanning from atomic physics to stellar evolution theories. Whilst empirical techniques are routinely employed for the analysis of the emission line spectra of these objects, the accurate interpretation of the observational data often requires the solution of a set of coupled equations, via the application of a photoionisation/plasma code. A number of large-scale codes have been developed since the late sixties, using various analytical or statistical techniques for the transfer of continuum radiation, mainly under the assumption of spherical symmetry and a few in 3D. These codes have proved to be powerful and in many cases essential tools, but a clear idea of the underlying physical processes and assumptions is necessary in order to avoid reaching misleading conclusions. The development of the codes over the years has been driven by the observational constraints available, but also limited by the available computer power. Modern codes are faster and more flexible, with the ultimate goal being a description of the observations that relies on the smallest possible number of parameters. In this light, recent developments have focused on the inclusion of newly available atomic data, a realistic treatment of dust grains mixed into the ionised and photon dominated regions (PDRs), and the expansion of some codes to PDRs with the inclusion of chemical reaction networks. Furthermore, the last few years have seen the development of fully 3D photoionisation codes based on the Monte Carlo method. A brief review of the field of photoionisation today is given here, with emphasis on recent developments, including the expansion of the models to the 3D domain. Attention is given to the identification

  10. Utilization of MCNP code in the research and design for China advanced research reactor

    International Nuclear Information System (INIS)

    Shen Feng

    2006-01-01

    MCNP, an internationally used neutronics code, is employed for nuclear research and design of the China Advanced Research Reactor (CARR). MCNP is an important neutronics code in the research and design of CARR, since many calculation tasks can be undertaken with it: many nuclear parameters of the reactor core, design and optimization studies for various reactor utilizations, and verification of other nuclear calculation codes are carried out with the help of MCNP. (author)

  11. Use of advanced simulations in fuel performance codes

    International Nuclear Information System (INIS)

    Van Uffelen, P.

    2015-01-01

    The simulation of cylindrical fuel rod behaviour in a reactor or in a spent fuel storage pool requires a fuel performance code. Such a tool solves the equations for the heat transfer, the stresses and strains in fuel and cladding, the evolution of several isotopes and the behaviour of various fission products in the fuel rod. The main equations, along with their limitations, are briefly described. The current approaches adopted for overcoming these limitations and the perspectives are also outlined. (author)
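
    As an illustration of the first of those equation sets (standard textbook form, not taken from the paper itself), the steady-state radial heat conduction equation solved by most fuel performance codes for a cylindrical pellet can be written in LaTeX as:

        \frac{1}{r}\,\frac{d}{dr}\!\left( k(T)\, r\, \frac{dT}{dr} \right) + q''' = 0

    where k(T) is the temperature-dependent fuel thermal conductivity and q''' is the volumetric heat generation rate; transient codes retain the \rho c_p\,\partial T/\partial t storage term as well.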

  12. Film grain noise modeling in advanced video coding

    Science.gov (United States)

    Oh, Byung Tae; Kuo, C.-C. Jay; Sun, Shijun; Lei, Shawmin

    2007-01-01

    A new technique for film grain noise extraction, modeling and synthesis is proposed and applied to the coding of high definition video in this work. The film grain noise is viewed as a part of artistic presentation by people in the movie industry. On one hand, since the film grain noise can boost the natural appearance of pictures in high definition video, it should be preserved in high-fidelity video processing systems. On the other hand, video coding with film grain noise is expensive. It is desirable to extract film grain noise from the input video as a pre-processing step at the encoder and re-synthesize the film grain noise and add it back to the decoded video as a post-processing step at the decoder. Under this framework, the coding gain of the denoised video is higher while the quality of the final reconstructed video can still be well preserved. Following this idea, we present a method to remove film grain noise from image/video without distorting its original content. Besides, we describe a parametric model containing a small set of parameters to represent the extracted film grain noise. The proposed model generates the film grain noise that is close to the real one in terms of power spectral density and cross-channel spectral correlation. Experimental results are shown to demonstrate the efficiency of the proposed scheme.
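
    A minimal sketch of the encoder-side extraction and decoder-side re-synthesis idea described above, assuming Python with NumPy and SciPy; the Gaussian-blur denoiser and the simple variance-plus-correlation grain model are illustrative stand-ins, not the parametric model proposed in the paper.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def extract_grain(frame, sigma=1.5):
            """Encoder side: split a frame into denoised content and a residual grain layer."""
            denoised = gaussian_filter(frame, sigma)
            return denoised, frame - denoised

        def fit_grain_params(grain):
            """Fit a crude parametric model: grain variance and horizontal correlation."""
            var = grain.var()
            rho = np.corrcoef(grain[:, :-1].ravel(), grain[:, 1:].ravel())[0, 1]
            return var, rho

        def synthesize_grain(shape, var, rho, seed=0):
            """Decoder side: regenerate grain with matching variance and correlation."""
            rng = np.random.default_rng(seed)
            noise = rng.standard_normal(shape)
            grain = np.empty(shape)
            grain[:, 0] = noise[:, 0]
            for j in range(1, shape[1]):          # AR(1)-style column recursion keeps unit variance
                grain[:, j] = rho * grain[:, j - 1] + np.sqrt(1 - rho ** 2) * noise[:, j]
            return grain * np.sqrt(var)

        frame = np.random.default_rng(1).random((64, 64))   # stand-in for a decoded luma block
        denoised, grain = extract_grain(frame)              # transmit 'denoised' plus (var, rho)
        var, rho = fit_grain_params(grain)
        reconstructed = denoised + synthesize_grain(frame.shape, var, rho)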

  13. The integrated code system CASCADE-3D for advanced core design and safety analysis

    International Nuclear Information System (INIS)

    Neufert, A.; Van de Velde, A.

    1999-01-01

    The new program system CASCADE-3D (Core Analysis and Safety Codes for Advanced Design Evaluation) links several of Siemens' advanced code packages for in-core fuel management and accident analysis: SAV95, PANBOX/COBRA and RELAP5. By using CASCADE-3D, the potential of modern fuel assemblies and in-core fuel management strategies can be utilized much more effectively, because safety margins that had previously been eroded by conservative methods are now predicted more accurately. With this innovative code system, customers can take full advantage of recent progress in fuel assembly design and in-core fuel management. (author)

  14. APOLLO-2: An advanced transport code for LWRs

    International Nuclear Information System (INIS)

    Mathonniere, G.

    1995-01-01

    APOLLO-2 is a fully modular code in which each module corresponds to a specific task: access to the cross-section libraries, creation of isotopes, media or mixtures, geometry definition, self-shielding calculations, computation of multigroup collision probabilities, flux solution, depletion calculations, transport-transport or transport-diffusion equivalence, SN calculations, etc. Modules communicate exclusively through ''objects'' containing structured data; these objects are identified and handled by user-given names. Among the major improvements offered by APOLLO-2 is the modelling of self-shielding: it is now possible to treat, with a precision checked against Monte Carlo calculations, a fuel rod divided into several concentric rings. The total production of plutonium is therefore much better estimated than before, and its radial distribution can also be predicted with good accuracy. Thanks to the versatility of the code, reference calculations and routine ones can be compared easily because only one parameter is changed at a time; for example, the self-shielding approximations can be modified while the libraries and the flux solver remain exactly the same. Other interesting features have been introduced in APOLLO-2: the new JEF-2 isotopes are available in 99- and 172-energy-group libraries, the surface leakage model improves the calculation of control rod efficiency, the flux-current method allows faster calculations, automatic convergence checking with fully automatic corrections is available during depletion calculations, and heterogeneous diffusion coefficients are used for voiding analysis. 17 refs, 1 tab

  15. Micromagnetic Code Development of Advanced Magnetic Structures Final Report CRADA No. TC-1561-98

    Energy Technology Data Exchange (ETDEWEB)

    Cerjan, Charles J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shi, Xizeng [Read-Rite Corporation, Fremont, CA (United States)

    2017-11-09

    The specific goals of this project were to further develop the previously written micromagnetic code DADIMAG (DOE code release number 980017) and to validate the code. The resulting code was expected to be more realistic and useful for simulations of magnetic structures of specific interest to Read-Rite programs. We also planned to further develop the code for use in internal LLNL programs. This project complemented LLNL CRADA TC-840-94 between LLNL and Read-Rite, which allowed for simulations of the advanced magnetic head development completed under that CRADA. TC-1561-98 was effective concurrently with LLNL non-exclusive copyright license TL-1552-98 to Read-Rite for the DADIMAG Version 2 executable code.

  16. ASPECT: An advanced specified-profile evaluation code for tokamaks

    International Nuclear Information System (INIS)

    Stotler, D.P.; Reiersen, W.T.; Bateman, G.

    1993-03-01

    A specified-profile, global analysis code has been developed to evaluate the performance of fusion reactor designs. Both steady-state and time-dependent calculations are carried out; the results of the former can be used in defining the parameters of the latter, if desired. In the steady-state analysis, the performance is computed at a density and temperature chosen to be consistent with input limits (e.g., density and beta) of several varieties. The calculation can be made at either the intersection of the two limits or at the point of optimum performance as the density and temperature are varied along the limiting boundaries. Two measures of performance are available for this purpose: the ignition margin or the confinement level required to achieve a prescribed ignition margin. The time-dependent calculation can be configured to yield either the evolution of plasma energy as a function of time or, via an iteration scheme, the amount of auxiliary power required to achieve a desired final plasma energy

  17. Development of an advanced code system for fast-reactor transient analysis

    International Nuclear Information System (INIS)

    Konstantin Mikityuk; Sandro Pelloni; Paul Coddington

    2005-01-01

    FAST (Fast-spectrum Advanced Systems for power production and resource management) is a recently approved PSI activity in the area of fast-spectrum core and safety analysis, with emphasis on generic developments and Generation IV systems. Within the FAST project we will study static and transient core physics, reactor system behaviour and safety, and related international experiments. The main current goal of the project is to develop a unique analytical and code capability for core and safety analysis of critical (and sub-critical) fast-spectrum systems, with an initial emphasis on gas-cooled fast reactors. The structure of the code system is shown in Fig. 1. The main components of the FAST code system are: 1) the ERANOS code for preparation of basic cross-sections and their partial derivatives; 2) the PARCS transient nodal-method multi-group neutron diffusion code for simulation of spatial (3D) neutron kinetics in hexagonal and square geometries; 3) the TRAC/AAA code for system thermal hydraulics; 4) the FRED transient model for fuel thermal-mechanical behaviour; and 5) the PVM system as an interface between the separate parts of the code system. The paper presents the structure of the code system (Fig. 1), the organization of interfaces and data exchanges between its main parts, and examples of verification and application of the separate codes and of the system as a whole. (authors)
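
    Purely as an illustration of how separate neutronics and thermal-hydraulics modules can be coupled by exchanging data each time step (the FAST system itself links ERANOS/PARCS/TRAC/FRED via PVM, which is not reproduced here), a minimal operator-splitting sketch in Python with assumed point-kinetics and lumped fuel-temperature models:

        # Assumed illustrative parameters (one delayed-neutron group, lumped fuel thermics).
        BETA, LAMBDA, GEN_TIME = 0.0035, 0.08, 5.0e-7   # delayed fraction, decay constant (1/s), generation time (s)
        ALPHA_D = -1.0e-5        # fuel temperature (Doppler) feedback coefficient (dk/k per K)
        HEAT_CAP, HTC = 5.0e6, 2.0e5   # lumped heat capacity (J/K) and heat removal coefficient (W/K)
        T_COOL = 700.0           # coolant temperature (K)

        def neutronics_step(p, c, rho, dt):
            """Point kinetics, semi-implicit in the prompt term."""
            c += dt * (BETA / GEN_TIME * p - LAMBDA * c)
            p = (p + dt * LAMBDA * c) / (1.0 - dt * (rho - BETA) / GEN_TIME)
            return p, c

        def thermal_step(t_fuel, power, dt):
            """Lumped fuel heat balance, explicit Euler."""
            return t_fuel + dt * (power - HTC * (t_fuel - T_COOL)) / HEAT_CAP

        # Operator-splitting coupling loop: each module advances one step, then data are exchanged.
        p, c = 1.0e8, 1.0e8 * BETA / (GEN_TIME * LAMBDA)   # initial power (W) and equilibrium precursors
        t_fuel, t_ref = 900.0, 900.0
        rho_ext = 0.001                                    # external reactivity insertion (dk/k)
        dt = 1.0e-3
        for _ in range(int(2.0 / dt)):                     # 2 s transient, 1 ms steps
            rho = rho_ext + ALPHA_D * (t_fuel - t_ref)     # feedback passed from the thermal module
            p, c = neutronics_step(p, c, rho, dt)          # neutronics module
            t_fuel = thermal_step(t_fuel, p, dt)           # thermal-hydraulics module
        print(f"power = {p:.3e} W, fuel temperature = {t_fuel:.1f} K")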

  18. Application and Analysis of Performance of DQPSK Advanced Modulation Format in Spectral Amplitude Coding OCDMA

    Directory of Open Access Journals (Sweden)

    Abdul Latif Memon

    2014-04-01

    Full Text Available SAC (Spectral Amplitude Coding) is a technique of OCDMA (Optical Code Division Multiple Access) to encode and decode data bits by utilizing spectral components of the broadband source. Usually the OOK (On-Off Keying) modulation format is used in this encoding scheme. To make the SAC OCDMA network spectrally efficient, the advanced modulation format DQPSK (Differential Quaternary Phase Shift Keying) is applied, simulated and analyzed. An m-sequence code is encoded in the simulated setup. Performance for various lengths of the m-sequence code is also analyzed and displayed in pictorial form. The results of the simulation are evaluated with the help of electrical constellation diagrams, eye diagrams and bit error rate graphs. All the graphs indicate better transmission quality for the advanced DQPSK modulation format used in the SAC OCDMA network as compared with OOK
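
    As a small aside on the m-sequence codes mentioned above (illustrative only; the generator polynomial and sequence length are assumptions, not taken from the paper), a maximal-length sequence can be generated in Python with a linear-feedback shift register and its two-valued autocorrelation checked numerically:

        import numpy as np

        def m_sequence(taps, nbits):
            """Generate a maximal-length (m-)sequence from LFSR feedback taps (1-indexed)."""
            state = [1] * nbits                    # any non-zero seed works
            seq = []
            for _ in range(2 ** nbits - 1):
                seq.append(state[-1])
                fb = 0
                for t in taps:
                    fb ^= state[t - 1]
                state = [fb] + state[:-1]
            return np.array(seq)

        # Degree-5 LFSR with taps (5, 3) gives a length-31 m-sequence (a common textbook choice).
        seq = m_sequence(taps=(5, 3), nbits=5)
        bipolar = 1 - 2 * seq                      # map {0, 1} -> {+1, -1}
        # Periodic autocorrelation: N at zero shift, -1 elsewhere (the m-sequence property).
        corr = [int(np.dot(bipolar, np.roll(bipolar, k))) for k in range(len(seq))]
        print(corr[:5])                            # [31, -1, -1, -1, -1]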

  19. Application and analysis of performance of DQPSK advanced modulation format in spectral amplitude coding OCDMA

    International Nuclear Information System (INIS)

    Memon, A.

    2015-01-01

    SAC (Spectral Amplitude Coding) is a technique of OCDMA (Optical Code Division Multiple Access) to encode and decode data bits by utilizing spectral components of the broadband source. Usually the OOK (On-Off Keying) modulation format is used in this encoding scheme. To make the SAC OCDMA network spectrally efficient, the advanced modulation format DQPSK (Differential Quaternary Phase Shift Keying) is applied, simulated and analyzed. An m-sequence code is encoded in the simulated setup. Performance for various lengths of the m-sequence code is also analyzed and displayed in pictorial form. The results of the simulation are evaluated with the help of electrical constellation diagrams, eye diagrams and bit error rate graphs. All the graphs indicate better transmission quality for the advanced DQPSK modulation format used in the SAC OCDMA network as compared with OOK. (author)

  20. THEHYCO-3DT: Thermal hydrodynamic code for the 3 dimensional transient calculation of advanced LMFBR core

    Energy Technology Data Exchange (ETDEWEB)

    Vitruk, S.G.; Korsun, A.S. [Moscow Engineering Physics Institute (Russian Federation)]; Ushakov, P.A. [Institute of Physics and Power Engineering, Obninsk (Russian Federation)]; and others

    1995-09-01

    The multilevel mathematical model of neutron and thermal hydrodynamic processes in a passive-safety core without assembly duct walls, and the corresponding computer code SKETCH, consisting of the thermal hydrodynamic module THEHYCO-3DT and a neutronic one, are described. A new, effective discretization technique for the energy, momentum and mass conservation equations is applied in hexagonal-z geometry. The adequacy and applicability of the model are presented. The results of the calculations show that the model and the computer code could be used in the conceptual design of advanced reactors.

  1. THEHYCO-3DT: Thermal hydrodynamic code for the 3 dimensional transient calculation of advanced LMFBR core

    International Nuclear Information System (INIS)

    Vitruk, S.G.; Korsun, A.S.; Ushakov, P.A.

    1995-01-01

    The multilevel mathematical model of neutron and thermal hydrodynamic processes in a passive-safety core without assembly duct walls, and the corresponding computer code SKETCH, consisting of the thermal hydrodynamic module THEHYCO-3DT and a neutronic one, are described. A new, effective discretization technique for the energy, momentum and mass conservation equations is applied in hexagonal-z geometry. The adequacy and applicability of the model are presented. The results of the calculations show that the model and the computer code could be used in the conceptual design of advanced reactors.

  2. Advanced thermal-hydraulic and neutronic codes: current and future applications. Summary and conclusions

    International Nuclear Information System (INIS)

    2001-05-01

    An OECD Workshop on Advanced Thermal-Hydraulic and Neutronic Codes Applications was held from 10 to 13 April 2000, in Barcelona, Spain, sponsored by the Committee on the Safety of Nuclear Installations (CSNI) of the OECD Nuclear Energy Agency (NEA). It was organised in collaboration with the Spanish Nuclear Safety Council (CSN) and hosted by CSN and the Polytechnic University of Catalonia (UPC) in collaboration with the Spanish Electricity Association (UNESA). The objectives of the Workshop were to review the developments since the previous CSNI Workshop held in Annapolis [NEA/CSNI/ R(97)4; NUREG/CP-0159], to analyse the present status of maturity and remnant needs of thermal-hydraulic (TH) and neutronic system codes and methods, and finally to evaluate the role of these tools in the evolving regulatory environment. The Technical Sessions and Discussion Sessions covered the following topics: - Regulatory requirements for Best-Estimate (BE) code assessment; - Application of TH and neutronic codes for current safety issues; - Uncertainty analysis; - Needs for integral plant transient and accident analysis; - Simulators and fast running codes; - Advances in next generation TH and neutronic codes; - Future trends in physical modeling; - Long term plans for development of advanced codes. The focus of the Workshop was on system codes. An incursion was made, however, in the new field of applying Computational Fluid Dynamic (CFD) codes to nuclear safety analysis. As a general conclusion, the Barcelona Workshop can be considered representative of the progress towards the targets marked at Annapolis almost four years ago. The Annapolis Workshop had identified areas where further development and specific improvements were needed, among them: multi-field models, transport of interfacial area, 2D and 3D thermal-hydraulics, 3-D neutronics consistent with level of details of thermal-hydraulics. Recommendations issued at Annapolis included: developing small pilot/test codes for

  3. Proceedings of the workshop on advanced thermal-hydraulic and neutronic codes: current and future applications

    International Nuclear Information System (INIS)

    2001-01-01

    An OECD Workshop on Advanced Thermal-Hydraulic and Neutronic Codes Applications was held from 10 to 13 April 2000, in Barcelona, Spain, sponsored by the Committee on the Safety of Nuclear Installations (CSNI) of the OECD Nuclear Energy Agency (NEA). It was organised in collaboration with the Spanish Nuclear Safety Council (CSN) and hosted by CSN and the Polytechnic University of Catalonia (UPC) in collaboration with the Spanish Electricity Association (UNESA). The objectives of the Workshop were to review the developments since the previous CSNI Workshop held in Annapolis [NEA/CSNI/ R(97)4; NUREG/CP-0159], to analyse the present status of maturity and remnant needs of thermal-hydraulic (TH) and neutronic system codes and methods, and finally to evaluate the role of these tools in the evolving regulatory environment. The Technical Sessions and Discussion Sessions covered the following topics: - Regulatory requirements for Best-Estimate (BE) code assessment; - Application of TH and neutronic codes for current safety issues; - Uncertainty analysis; - Needs for integral plant transient and accident analysis; - Simulators and fast running codes; - Advances in next generation TH and neutronic codes; - Future trends in physical modeling; - Long term plans for development of advanced codes. The focus of the Workshop was on system codes. An incursion was made, however, in the new field of applying Computational Fluid Dynamic (CFD) codes to nuclear safety analysis. As a general conclusion, the Barcelona Workshop can be considered representative of the progress towards the targets marked at Annapolis almost four years ago. The Annapolis Workshop had identified areas where further development and specific improvements were needed, among them: multi-field models, transport of interfacial area, 2D and 3D thermal-hydraulics, 3-D neutronics consistent with level of details of thermal-hydraulics. Recommendations issued at Annapolis included: developing small pilot/test codes for

  4. Development and Application of Subchannel Analysis Code Technology for Advanced Reactor Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae Hyun; Seo, K. W

    2006-01-15

    A study has been performed for the development and assessment of a subchannel analysis code intended for the analysis of advanced reactor conditions with various configurations of the reactor core and several kinds of reactor coolant fluids. The subchannel analysis code was developed on the basis of the MATRA code being developed at KAERI. A GUI (Graphic User Interface) system was adopted in order to reduce input errors and to enhance user convenience. The subchannel code was supplemented in its property calculation modules to include various fluids such as heavy liquid metal, gas, refrigerant, and supercritical water. The subchannel code was applied to calculate the local thermal hydraulic conditions inside the non-square test bundles employed for the analysis of CHF. The applicability of the subchannel code was evaluated for a high temperature gas cooled reactor condition and for supercritical pressure conditions with water and Freon. A subchannel analysis has been conducted for the European ADS (Accelerator-Driven Subcritical System) with Pb-Bi coolant through international cooperation between KAERI and FZK, Germany. In addition, the prediction capability of the subchannel code was evaluated against subchannel void distribution data by participating in an international code benchmark program organized by OECD/NRC.

  5. Development and Application of Subchannel Analysis Code Technology for Advanced Reactor Systems

    International Nuclear Information System (INIS)

    Hwang, Dae Hyun; Seo, K. W.

    2006-01-01

    A study has been performed for the development and assessment of a subchannel analysis code intended for the analysis of advanced reactor conditions with various configurations of the reactor core and several kinds of reactor coolant fluids. The subchannel analysis code was developed on the basis of the MATRA code being developed at KAERI. A GUI (Graphic User Interface) system was adopted in order to reduce input errors and to enhance user convenience. The subchannel code was supplemented in its property calculation modules to include various fluids such as heavy liquid metal, gas, refrigerant, and supercritical water. The subchannel code was applied to calculate the local thermal hydraulic conditions inside the non-square test bundles employed for the analysis of CHF. The applicability of the subchannel code was evaluated for a high temperature gas cooled reactor condition and for supercritical pressure conditions with water and Freon. A subchannel analysis has been conducted for the European ADS (Accelerator-Driven Subcritical System) with Pb-Bi coolant through international cooperation between KAERI and FZK, Germany. In addition, the prediction capability of the subchannel code was evaluated against subchannel void distribution data by participating in an international code benchmark program organized by OECD/NRC.

  6. Processes of code status transitions in hospitalized patients with advanced cancer.

    Science.gov (United States)

    El-Jawahri, Areej; Lau-Min, Kelsey; Nipp, Ryan D; Greer, Joseph A; Traeger, Lara N; Moran, Samantha M; D'Arpino, Sara M; Hochberg, Ephraim P; Jackson, Vicki A; Cashavelly, Barbara J; Martinson, Holly S; Ryan, David P; Temel, Jennifer S

    2017-12-15

    Although hospitalized patients with advanced cancer have a low chance of surviving cardiopulmonary resuscitation (CPR), the processes by which they change their code status from full code to do not resuscitate (DNR) are unknown. We conducted a mixed-methods study on a prospective cohort of hospitalized patients with advanced cancer. Two physicians used a consensus-driven medical record review to characterize processes that led to code status order transitions from full code to DNR. In total, 1047 hospitalizations were reviewed among 728 patients. Admitting clinicians did not address code status in 53% of hospitalizations, resulting in code status orders of "presumed full." In total, 275 patients (26.3%) transitioned from full code to DNR, and 48.7% (134 of 275 patients) of those had an order of "presumed full" at admission; however, upon further clarification, the patients expressed that they had wished to be DNR before the hospitalization. We identified 3 additional processes leading to order transitions from full code to DNR: acute clinical deterioration (15.3%), discontinuation of cancer-directed therapy (17.1%), and education about the potential harms/futility of CPR (15.3%). Compared with discontinuation of therapy and education, transitions because of acute clinical deterioration were associated with less patient involvement (P = .002) and a shorter time to death. Nearly half of the code status transitions among hospitalized patients with advanced cancer arose from full code orders in patients who had a preference for DNR before hospitalization. Transitions due to acute clinical deterioration were associated with less patient engagement and a higher likelihood of inpatient death. Cancer 2017;123:4895-902. © 2017 American Cancer Society.

  7. Validation of physics and thermalhydraulic computer codes for advanced Candu reactor applications

    International Nuclear Information System (INIS)

    Wren, D.J.; Popov, N.; Snell, V.G.

    2004-01-01

    Atomic Energy of Canada Ltd. (AECL) is developing an Advanced Candu Reactor (ACR) that is an evolutionary advancement of the currently operating Candu 6 reactors. The ACR is being designed to produce electrical power for a capital cost and at a unit-energy cost significantly less than that of the current reactor designs. The ACR retains the modular Candu concept of horizontal fuel channels surrounded by a heavy water moderator. However, ACR uses slightly enriched uranium fuel compared to the natural uranium used in Candu 6. This achieves the twin goals of improved economics (via large reductions in the heavy water moderator volume and replacement of the heavy water coolant with light water coolant) and improved safety. AECL has developed and implemented a software quality assurance program to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. Since the basic design of the ACR is equivalent to that of the Candu 6, most of the key phenomena associated with the safety analyses of ACR are common, and the Candu industry standard tool-set of safety analysis codes can be applied to the analysis of the ACR. A systematic assessment of computer code applicability addressing the unique features of the ACR design was performed covering the important aspects of the computer code structure, models, constitutive correlations, and validation database. Arising from this assessment, limited additional requirements for code modifications and extensions to the validation databases have been identified. This paper provides an outline of the AECL software quality assurance program process for the validation of computer codes used to perform physics and thermal-hydraulics safety analyses of the ACR. It describes the additional validation work that has been identified for these codes and the planned, and ongoing, experimental programs to extend the code validation as required to address specific ACR design

  8. Channel coding/decoding alternatives for compressed TV data on advanced planetary missions.

    Science.gov (United States)

    Rice, R. F.

    1972-01-01

    The compatibility of channel coding/decoding schemes with a specific TV compressor developed for advanced planetary missions is considered. Under certain conditions, it is shown that compressed data can be transmitted at approximately the same rate as uncompressed data without any loss in quality. Thus, the full gains of data compression can be achieved in real-time transmission.

  9. Functions of Code-Switching among Iranian Advanced and Elementary Teachers and Students

    Science.gov (United States)

    Momenian, Mohammad; Samar, Reza Ghafar

    2011-01-01

    This paper reports on the findings of a study carried out on advanced and elementary teachers' and students' functions and patterns of code-switching in Iranian English classrooms. This concept has not been examined as thoroughly in L2 (second language) classroom contexts as it has in outdoor natural contexts. Therefore, besides reporting on the…

  10. Grammar Coding in the "Oxford Advanced Learner's Dictionary of Current English."

    Science.gov (United States)

    Wekker, Herman

    1992-01-01

    Focuses on the revised system of grammar coding for verbs in the fourth edition of the "Oxford Advanced Learner's Dictionary of Current English" (OALD4), comparing it with two other similar dictionaries. The OALD4 is shown to fare better on many criteria than the comparable dictionaries. (16 references) (VWL)

  11. The 419 codes as business unusual: the advance fee fraud online ...

    African Journals Online (AJOL)

    The 419 codes as business unusual: the advance fee fraud online discourse. A Adogame. Abstract. No Abstract. International Journal of Humanistic Studies Vol. 5 2006: pp. 54-72.

  12. Development of Advanced Suite of Deterministic Codes for VHTR Physics Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog; Cho, J. Y.; Lee, K. H. (and others)

    2007-07-15

    An advanced suite of deterministic codes for VHTR physics analysis has been developed for detailed analysis of current and advanced reactor designs as part of a US-ROK collaborative I-NERI project. The code suite includes the conventional two-step procedure, in which few-group constants are generated by a transport lattice calculation and the reactor physics analysis is performed by a three-dimensional diffusion calculation, as well as a whole-core transport code that can model local heterogeneities directly at the core level. Particular modeling issues in the physics analysis of gas-cooled VHTRs were resolved, including the double heterogeneity of the coated fuel particles, neutron streaming in the coolant channels, a strong core-reflector interaction, and large spectrum shifts due to changes of the surrounding environment, temperature and burnup. The geometry handling capability of the DeCART code was extended to deal with the hexagonal fuel elements of the VHTR core. The developed code suite was validated and verified by comparing the computational results with those of Monte Carlo calculations for the benchmark problems.
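
    For reference, the flux-weighted condensation used in such two-step schemes to collapse fine-group cross sections into the few-group constants fed to the diffusion solver can be written (standard notation, not specific to the DeCART suite) as:

        \Sigma_{x,G} = \frac{\displaystyle\int_{E \in G} \Sigma_x(E)\,\phi(E)\,dE}{\displaystyle\int_{E \in G} \phi(E)\,dE}

    where \phi(E) is the flux spectrum from the lattice transport calculation and G denotes a broad energy group.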

  13. Strategies for developing subchannel capability in an advanced system thermalhydraulic code: a literature review

    International Nuclear Information System (INIS)

    Cheng, J.; Rao, Y.F.

    2015-01-01

    In the framework of developing next generation safety analysis tools, Canadian Nuclear Laboratories (CNL) has planned to incorporate subchannel analysis capability into its advanced system thermalhydraulic code CATHENA 4. This paper provides a literature review and an assessment of current subchannel codes. It also evaluates three code-development methods: (i) static coupling of CATHENA 4 with the subchannel code ASSERT-PV, (ii) dynamic coupling of the two codes, and (iii) fully implicit implementation for a new, standalone CATHENA 4 version with subchannel capability. Results of the review and assessment suggest that the current ASSERT-PV modules can be used as the base for the fully implicit implementation of subchannel capability in CATHENA 4, and that this option may be the most cost-effective in the long run, resulting in savings in user application and maintenance costs. In addition, improved versatility of the tool could be accomplished by the addition of new features that could be added as part of its development. The new features would improve the capabilities of the existing subchannel code in handling low, reverse, and stagnant flows often encountered in system thermalhydraulic analysis. Therefore, the method of fully implicit implementation is preliminarily recommended for further exploration. A feasibility study will be performed in an attempt to extend the present work into a preliminary development plan. (author)

  14. Association of code status discussion with invasive procedures among advanced-stage cancer and noncancer patients

    Directory of Open Access Journals (Sweden)

    Sasaki A

    2017-07-01

    Full Text Available Akinori Sasaki,1 Eiji Hiraoka,1 Yosuke Homma,2 Osamu Takahashi,3 Yasuhiro Norisue,4 Koji Kawai,5 Shigeki Fujitani4 1Department of Internal Medicine, 2Department of Emergency Medicine, Tokyo Bay Urayasu Ichikawa Medical Center, Urayasu City, Chiba, 3Department of Internal Medicine, St. Luke’s International Hospital, Chuo-ku, Tokyo, 4Department of Critical Care Medicine, Tokyo Bay Urayasu Ichikawa Medical Center, Urayasu City, Chiba, 5Department of Gastroenterology, Ito Municipal Hospital, Ito City, Shizuoka, Japan Background: Code status discussion is associated with a decrease in invasive procedures among terminally ill cancer patients. We investigated the association between code status discussion on admission and incidence of invasive procedures, cardiopulmonary resuscitation (CPR), and opioid use among inpatients with advanced stages of cancer and noncancer diseases. Methods: We performed a retrospective cohort study in a single center, Ito Municipal Hospital, Japan. Participants were patients who were admitted to the Department of Internal Medicine between October 1, 2013 and August 30, 2015, with advanced-stage cancer and noncancer. We collected demographic data and inquired the presence or absence of code status discussion within 24 hours of admission and whether invasive procedures, including central venous catheter placement, intubation with mechanical ventilation, and CPR for cardiac arrest, and opioid treatment were performed. We investigated the factors associated with CPR events by using multivariate logistic regression analysis. Results: Among the total 232 patients, code status was discussed with 115 patients on admission, of which 114 (99.1%) patients had do-not-resuscitate (DNR) orders. The code status was not discussed with the remaining 117 patients on admission, of which 69 (59%) patients had subsequent code status discussion with resultant DNR orders. Code status discussion on admission decreased the incidence of central venous

  15. Algorithm for advanced canonical coding of planar chemical structures that considers stereochemical and symmetric information.

    Science.gov (United States)

    Koichi, Shungo; Iwata, Satoru; Uno, Takeaki; Koshino, Hiroyuki; Satoh, Hiroko

    2007-01-01

    We describe a rigorous and fast algorithm for advanced canonical coding of planar chemical structures based on the algorithm of Faulon et al. (J. Chem. Inf. Comput. Sci. 2004, 44, 427-436). Our algorithm works well even for highly symmetric structures; moreover, an advantage of our algorithm includes providing a rigorous canonical numbering of atoms with a consideration of stereochemistry and recognizing symmetric moieties. The planar structural line notation with the canonical numbering is also fit for use with stereochemical line notation. These capabilities are usable for general purposes in chemical structural coding and are particularly essential for detecting equivalent atoms in NMR studies. This algorithm was implemented on a 13C NMR chemical shift prediction system CAST/CNMR. Applications of the algorithm to several organic compounds demonstrate the practical efficiency of the rigorous coding.
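
    The following minimal sketch (assuming Python with the RDKit toolkit installed) illustrates canonical atom ranking and the detection of symmetry-equivalent atoms in general; it uses RDKit's own canonicalization, not the CAST/CNMR algorithm described above, and the example molecule is arbitrary.

        from rdkit import Chem

        # para-xylene: the two methyl carbons and pairs of ring CH carbons are symmetry-equivalent.
        mol = Chem.MolFromSmiles("Cc1ccc(C)cc1")

        # Canonical ranks with ties broken give a rigorous canonical numbering of the atoms.
        canonical_ranks = list(Chem.CanonicalRankAtoms(mol, breakTies=True))

        # Without tie-breaking, equal ranks identify symmetry-equivalent atoms
        # (useful for spotting equivalent carbons in NMR work).
        symmetry_classes = list(Chem.CanonicalRankAtoms(mol, breakTies=False))

        print("canonical numbering:", canonical_ranks)
        print("symmetry classes   :", symmetry_classes)
        print("canonical SMILES   :", Chem.MolToSmiles(mol))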

  16. Validation of the COBRA code for dry out power calculation in CANDU type advanced fuels

    International Nuclear Information System (INIS)

    Daverio, Hernando J.

    2003-01-01

    Stern Laboratories performed full-scale CHF testing of the CANFLEX bundle at AECL's request. This experiment is modeled with the COBRA IV HW code to verify its capability for dryout power calculation. Good results were obtained: errors below 10% with respect to all measured data, and about 1% within the range of standard operating conditions of CANDU reactors. The calculations were repeated for the CNEA advanced fuel CARA, which showed the same performance as the CANFLEX fuel. (author)

  17. Mechanistic Insight into the Degradation of Nitrosamines via Aqueous-Phase UV Photolysis or a UV-Based Advanced Oxidation Process: Quantum Mechanical Calculations

    Directory of Open Access Journals (Sweden)

    Daisuke Minakata

    2018-02-01

    Full Text Available Nitrosamines are a group of carcinogenic chemicals that are present in aquatic environments that result from byproducts of industrial processes and disinfection products. As indirect and direct potable reuse increase, the presence of trace nitrosamines presents challenges to water infrastructures that incorporate effluent from wastewater treatment. Ultraviolet (UV) photolysis or UV-based advanced oxidation processes that produce highly reactive hydroxyl radicals are promising technologies to remove nitrosamines from water. However, complex reaction mechanisms involving radicals limit our understandings of the elementary reaction pathways embedded in the overall reactions identified experimentally. In this study, we perform quantum mechanical calculations to identify the hydroxyl radical-induced initial elementary reactions with N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine, and N-nitrosomethylbutylamine. We also investigate the UV-induced NDMA degradation mechanisms. Our calculations reveal that the alkyl side chains of nitrosamine affect the reaction mechanism of hydroxyl radicals with each nitrosamine investigated in this study. Nitrosamines with one- or two-carbon alkyl chains caused the delocalization of the electron density, leading to slower subsequent degradation. Additionally, three major initial elementary reactions and the subsequent radical-involved reaction pathways are identified in the UV-induced NDMA degradation process. This study provides mechanistic insight into the elementary reaction pathways, and a future study will combine these results with the kinetic information to predict the time-dependent concentration profiles of nitrosamines and their transformation products.

  18. Mechanistic Insight into the Degradation of Nitrosamines via Aqueous-Phase UV Photolysis or a UV-Based Advanced Oxidation Process: Quantum Mechanical Calculations.

    Science.gov (United States)

    Minakata, Daisuke; Coscarelli, Erica

    2018-02-28

    Nitrosamines are a group of carcinogenic chemicals that are present in aquatic environments that result from byproducts of industrial processes and disinfection products. As indirect and direct potable reuse increase, the presence of trace nitrosamines presents challenges to water infrastructures that incorporate effluent from wastewater treatment. Ultraviolet (UV) photolysis or UV-based advanced oxidation processes that produce highly reactive hydroxyl radicals are promising technologies to remove nitrosamines from water. However, complex reaction mechanisms involving radicals limit our understandings of the elementary reaction pathways embedded in the overall reactions identified experimentally. In this study, we perform quantum mechanical calculations to identify the hydroxyl radical-induced initial elementary reactions with N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine, and N-nitrosomethylbutylamine. We also investigate the UV-induced NDMA degradation mechanisms. Our calculations reveal that the alkyl side chains of nitrosamine affect the reaction mechanism of hydroxyl radicals with each nitrosamine investigated in this study. Nitrosamines with one- or two-carbon alkyl chains caused the delocalization of the electron density, leading to slower subsequent degradation. Additionally, three major initial elementary reactions and the subsequent radical-involved reaction pathways are identified in the UV-induced NDMA degradation process. This study provides mechanistic insight into the elementary reaction pathways, and a future study will combine these results with the kinetic information to predict the time-dependent concentration profiles of nitrosamines and their transformation products.

  19. Advanced Design of Dumbbell-shaped Genetic Minimal Vectors Improves Non-coding and Coding RNA Expression.

    Science.gov (United States)

    Jiang, Xiaoou; Yu, Han; Teo, Cui Rong; Tan, Genim Siu Xian; Goh, Sok Chin; Patel, Parasvi; Chua, Yiqiang Kevin; Hameed, Nasirah Banu Sahul; Bertoletti, Antonio; Patzel, Volker

    2016-09-01

    Dumbbell-shaped DNA minimal vectors lacking nontherapeutic genes and bacterial sequences are considered a stable, safe alternative to viral, nonviral, and naked plasmid-based gene-transfer systems. We investigated novel molecular features of dumbbell vectors aiming to reduce vector size and to improve the expression of noncoding or coding RNA. We minimized small hairpin RNA (shRNA) or microRNA (miRNA) expressing dumbbell vectors in size down to 130 bp generating the smallest genetic expression vectors reported. This was achieved by using a minimal H1 promoter with integrated transcriptional terminator transcribing the RNA hairpin structure around the dumbbell loop. Such vectors were generated with high conversion yields using a novel protocol. Minimized shRNA-expressing dumbbells showed accelerated kinetics of delivery and transcription leading to enhanced gene silencing in human tissue culture cells. In primary human T cells, minimized miRNA-expressing dumbbells revealed higher stability and triggered stronger target gene suppression as compared with plasmids and miRNA mimics. Dumbbell-driven gene expression was enhanced up to 56- or 160-fold by implementation of an intron and the SV40 enhancer compared with control dumbbells or plasmids. Advanced dumbbell vectors may represent one option to close the gap between durable expression that is achievable with integrating viral vectors and short-term effects triggered by naked RNA.

  20. ESCADRE and ICARE code systems

    International Nuclear Information System (INIS)

    Reocreux, M.; Gauvain, J.

    1992-01-01

    The French severe accident code development programme follows two parallel approaches: the first deals with ''integral codes'', which are designed to give immediate engineering answers; the second follows a more mechanistic way in order to provide a capability for detailed analysis of experiments, to obtain a better understanding of the scaling problem and to reach better confidence in plant calculations. In the first approach, a complete system has been developed and is being used for practical cases: the ESCADRE system. In the second approach, a set of codes dealing first with the primary circuit is being developed: a mechanistic core degradation code, ICARE, has been issued and is being coupled with the advanced thermalhydraulic code CATHARE. Fission product codes have also been coupled to CATHARE. The ''integral'' ESCADRE system and the mechanistic ICARE and associated codes are described. Their main characteristics are reviewed and the status of their development and assessment given. Future studies are finally discussed. 36 refs, 4 figs, 1 tab

  1. DIGITAL SIGNATURE USING QR CODE WITH THE ADVANCED ENCRYPTION STANDARD METHOD

    Directory of Open Access Journals (Sweden)

    Abdul Gani Putra Suratma

    2017-04-01

    Full Text Available A digital signature is a mathematical scheme that uniquely identifies a sender and proves the authenticity of the owner of a digital message or document, so that an authentic (valid) digital signature gives the recipient sufficient reason to believe that the message or document received comes from a known sender. Advances in technology make it possible to use digital signatures for mathematical verification, so that information obtained by one party from another can be checked to confirm its authenticity. A digital signature is an authentication mechanism that allows the creator of a message to attach a code that acts as his or her signature. The aim of this research is to apply QR (Quick Response) codes together with the AES (Advanced Encryption Standard) algorithm as a digital signature, so that the resulting scheme can serve to authenticate a manager's signature and to verify legitimate goods-release documents. In this study, the accuracy of QR code classification using a naive Bayes classifier was 90%, with a positive precision of 80% and a negative precision of 100%.
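
    As a rough illustration of the general idea (a document digest encrypted with AES and carried in a QR code), assuming Python with the third-party packages cryptography and qrcode (with Pillow) installed; this is a hedged sketch of one possible construction, not the scheme from the paper, and an AES-based tag only proves authenticity to parties who share the secret key.

        import base64, hashlib, os
        import qrcode
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        def make_signature_qr(document_bytes, key, out_path="signature_qr.png"):
            """Hash the document, encrypt the digest with AES-GCM, and store it in a QR code."""
            digest = hashlib.sha256(document_bytes).digest()
            nonce = os.urandom(12)
            token = nonce + AESGCM(key).encrypt(nonce, digest, None)
            qrcode.make(base64.b64encode(token).decode()).save(out_path)
            return token

        def verify_signature(document_bytes, token, key):
            """Decrypt the digest from the QR payload and compare it with the document's hash."""
            nonce, ciphertext = token[:12], token[12:]
            recovered = AESGCM(key).decrypt(nonce, ciphertext, None)
            return recovered == hashlib.sha256(document_bytes).digest()

        key = AESGCM.generate_key(bit_length=128)               # shared secret between issuer and verifier
        doc = b"goods release form #123, approved"
        token = make_signature_qr(doc, key)
        print(verify_signature(doc, token, key))                # True
        print(verify_signature(doc + b" tampered", token, key)) # False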

  2. Basic research and industrialization of CANDU advanced fuel - A research for the improvement of RFSP code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Hyo; Jang, Chang Sun; Han, Tae Young [Seoul National University, Seoul (Korea)

    2000-03-01

    The objective of this project is to improve the RFSP code by adopting a three-dimensional, two-neutron-energy-group model and an accelerated iterative solution scheme (FDM3D) for the two-group diffusion equations. The major contents of this research are the derivation of the finite difference form of the three-dimensional two-group diffusion equation, the application of accelerated iterative solution schemes to the finite difference diffusion equation, and the validation of the improved RFSP code (FDM3D) through benchmark tests. We have shown that the SOR/Chebyshev two-parameter method and the BICG-STAB/Wielandt method are both more effective than the existing RFSP scheme in terms of computing speed, with the SOR/Chebyshev two-parameter method showing better efficiency than the BICG-STAB/Wielandt method. Because the efficiency of the latter depends on the right choice of pre-conditioner, however, more studies are needed to improve and validate it. We have incorporated the new, efficient method into the existing RFSP so that the resulting RFSP becomes much faster and more accurate. RFSP currently uses the POWDERPUFS code as its main lattice code, which matches the neutron energy group model of RFSP. Because of this, the full advantage of the improved RFSP cannot be realized without adopting the lattice code WIMS-AECL, which can generate exact two-group constants. Therefore, we suggest developing a new CANDU design and analysis code that incorporates WIMS-AECL into FDM3D. 16 refs., 10 figs., 23 tabs. (Author)
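
    To make the acceleration comparison above concrete in a generic setting (a one-dimensional, one-group diffusion model problem, not the actual RFSP/FDM3D implementation; mesh size, cross sections and tolerances are assumptions), a short Python sketch comparing plain SOR iteration with SciPy's BiCGSTAB solver:

        import numpy as np
        from scipy.sparse import diags, csr_matrix
        from scipy.sparse.linalg import bicgstab

        # Finite-difference 1D diffusion: -D u'' + Sigma_a u = S on a uniform mesh (assumed data).
        n, h, D, sig_a = 200, 0.5, 1.0, 0.02
        main = 2.0 * D / h**2 + sig_a
        off = -D / h**2
        A = csr_matrix(diags([off, main, off], offsets=[-1, 0, 1], shape=(n, n)))
        b = np.ones(n)                           # uniform source

        def sor(A, b, omega=1.8, tol=1e-8, max_iter=20000):
            """Plain successive over-relaxation on the dense form (fine for a small model problem)."""
            M = A.toarray()
            x = np.zeros_like(b)
            for it in range(max_iter):
                for i in range(len(b)):
                    sigma = M[i, :] @ x - M[i, i] * x[i]
                    x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / M[i, i]
                if np.linalg.norm(M @ x - b) < tol * np.linalg.norm(b):
                    return x, it + 1
            return x, max_iter

        x_sor, iters = sor(A, b)
        x_bicg, info = bicgstab(A, b, atol=1e-10)
        print("SOR iterations:", iters, " BiCGSTAB converged:", info == 0)
        print("max difference between solutions:", np.abs(x_sor - x_bicg).max())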

  3. Development of an advanced fluid-dynamic analysis code: α-flow

    International Nuclear Information System (INIS)

    Akiyama, Mamoru

    1990-01-01

    A project for the development of a large-scale, three-dimensional fluid-dynamic analysis code, α-FLOW, which takes advantage of recent advances in supercomputers and workstations, has been in progress. This project, called the α-Project, has been promoted by the Association for Large Scale Fluid Dynamics Analysis Code, comprising private companies and research institutions such as universities. The development period for α-FLOW is four years, from March 1989 to March 1992. To date, the major portions of the basic design and program preparation have been completed and the project is in the stage of testing each module. In this paper, the present status of the α-Project, its design policy and an outline of α-FLOW are described. (author)

  4. CHF predictor derived from a 3D thermal-hydraulic code and an advanced statistical method

    International Nuclear Information System (INIS)

    Banner, D.; Aubry, S.

    2004-01-01

    A rod bundle CHF predictor has been determined by using a 3D code (THYC) to compute local thermal-hydraulic conditions at the boiling crisis location. These local parameters have been correlated to the critical heat flux by using an advanced statistical method based on spline functions. The main characteristics of the predictor are presented in conjunction with a detailed analysis of predictions (P/M ratio) in order to prove that the usual safety methodology can be applied with such a predictor. A thermal-hydraulic design criterion is obtained (1.13) and the predictor is compared with the WRB-1 correlation. (author)
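
    In the same spirit, but purely illustrative (synthetic data, a single local parameter and SciPy's smoothing spline stand in for the multi-parameter THYC-based correlation described above), a Python sketch of correlating critical heat flux to a local condition with a spline and examining the P/M ratio:

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        rng = np.random.default_rng(42)

        # Synthetic "measured" CHF (MW/m^2) versus local thermodynamic quality (assumed data).
        quality = np.sort(rng.uniform(0.05, 0.45, 200))
        chf_measured = 4.0 - 5.5 * quality + 0.15 * rng.standard_normal(quality.size)

        # Fit a smoothing spline: the predictor P as a function of the local condition.
        predictor = UnivariateSpline(quality, chf_measured, k=3, s=len(quality) * 0.02)

        # P/M ratio statistics, the usual basis for deriving a design criterion.
        pm = predictor(quality) / chf_measured
        mean_pm, std_pm = pm.mean(), pm.std(ddof=1)
        design_limit = mean_pm + 1.645 * std_pm      # illustrative one-sided 95% limit, normal assumption
        print(f"mean P/M = {mean_pm:.3f}, std = {std_pm:.3f}, 95% limit = {design_limit:.3f}")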

  5. Requirements on mechanistic NPP models used in CSS for diagnostics and predictions

    International Nuclear Information System (INIS)

    Juslin, K.

    1996-01-01

    Mechanistic models have been used for several years, with good experience, for operator support in electric power dispatching centres. Some models of limited scope have already been in use at nuclear power plants. It is considered that advanced mechanistic models, in combination with present computer technology, could also be used in Computerized Support Systems (CSS) for the assistance of Nuclear Power Plant (NPP) operators. Requirements with respect to accuracy, validity range, speed, flexibility and level of detail of the models used for such purposes are discussed. Quality Assurance, Verification and Validation efforts are considered. A long-term commitment in the field of mechanistic modelling and real-time simulation is considered the key to successful implementations. The Advanced PROcess Simulation (APROS) code system and simulation environment developed at the Technical Research Centre of Finland (VTT) is intended also for CSS applications in NPP control rooms. (author). 4 refs.

  6. CSAU (code scaling, applicability and uncertainty), a tool to prioritize advanced reactor research

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1990-01-01

    Best Estimate computer codes have been accepted by the US Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, providing their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. At the process level, the method is generic to any application which relies on best estimate computer code simulations to determine safe operating margins. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. Applied early, during the period when alternate designs are being evaluated, the methodology can identify the relative importance of the sources of uncertainty in the knowledge of each plant behavior and, thereby, help prioritize the research needed to bring the new designs to fruition. This paper describes the CSAU methodology, at the generic process level, and provides the general principles whereby it may be applied to evaluations of advanced reactor designs. 9 refs., 1 fig., 1 tab

  7. Integration of CFD codes and advanced combustion models for quantitative burnout determination

    Energy Technology Data Exchange (ETDEWEB)

    Javier Pallares; Inmaculada Arauzo; Alan Williams [University of Zaragoza, Zaragoza (Spain). Centre of Research for Energy Resources and Consumption (CIRCE)

    2007-10-15

    CFD codes and advanced kinetics combustion models are extensively used to predict coal burnout in large utility boilers. Modelling approaches based on CFD codes can accurately solve the fluid dynamics equations involved in the problem, but this is usually achieved by including simple combustion models. On the other hand, advanced kinetics combustion models can give a detailed description of the coal combustion behaviour by using a simplified description of the flow field, usually obtained from a zone-method approach. Both approximations describe general trends in coal burnout correctly, but fail to predict quantitative values. In this paper a new methodology which takes advantage of both approximations is described. First, CFD solutions were obtained for the combustion conditions in the furnace of the Lamarmora power plant (ASM Brescia, Italy) for a number of different conditions and for three coals. These furnace conditions were then used as inputs to a more detailed chemical combustion model to predict coal burnout. In this model, devolatilization was modelled using a commercial macromolecular network pyrolysis model (FG-DVC). For char oxidation, an intrinsic reactivity approach including thermal annealing, ash inhibition and maceral effects was used. Results from the simulations were compared against plant experimental values, showing a reasonable agreement in trends and quantitative values. 28 refs., 4 figs., 4 tabs.
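
    As a toy illustration of the kinetic post-processing step (a single first-order Arrhenius char-oxidation rate integrated over an assumed particle temperature history; the real model described above uses FG-DVC devolatilization and an intrinsic-reactivity char model, which are not reproduced here):

        import numpy as np
        from scipy.integrate import solve_ivp

        # Assumed illustrative kinetics and a simple temperature history along the particle path.
        A, E, R = 2.0e5, 1.3e5, 8.314          # pre-exponential (1/s), activation energy (J/mol), gas constant

        def particle_temperature(t):
            """Crude furnace temperature history (K): heat-up, peak, then cool-down."""
            return 1200.0 + 600.0 * np.exp(-((t - 1.0) / 0.6) ** 2)

        def dburnout_dt(t, X):
            """First-order char conversion: dX/dt = k(T) * (1 - X)."""
            k = A * np.exp(-E / (R * particle_temperature(t)))
            return k * (1.0 - X)

        sol = solve_ivp(dburnout_dt, t_span=(0.0, 3.0), y0=[0.0], max_step=0.01)
        burnout = sol.y[0, -1]
        print(f"predicted char burnout after 3 s residence time: {100 * burnout:.1f}%")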

  8. Advancing Kohlberg through Codes: Using Professional Codes To Reach the Moral Reasoning Objective in Undergraduate Ethics Courses.

    Science.gov (United States)

    Whitehouse, Ginny; Ingram, Michael T.

    The development of moral reasoning as a key course objective in undergraduate communication ethics classes can be accomplished by the critical and deliberate introduction of professional codes of ethics and the internalization of values found in those codes. Notably, "fostering moral reasoning skills" and "surveying current ethical…

  9. Evaluation of Advanced Thermohydraulic System Codes for Design and Safety Analysis of Integral Type Reactors

    International Nuclear Information System (INIS)

    2014-02-01

    The integral pressurized water reactor (PWR) concept, which incorporates the nuclear steam supply systems within the reactor vessel, is one of the innovative reactor types with high potential for near term deployment. An International Collaborative Standard Problem (ICSP) on Integral PWR Design, Natural Circulation Flow Stability and Thermohydraulic Coupling of Primary System and Containment during Accidents was established in 2010. Oregon State University, which made available the use of its experimental facility built to demonstrate the feasibility of the Multi-application Small Light Water Reactor (MASLWR) design, and sixteen institutes from seven Member States participated in this ICSP. The objective of the ICSP is to assess computer codes for reactor system design and safety analysis. This objective is achieved through the production of experimental data and computer code simulation of experiments. A loss of feedwater transient with subsequent automatic depressurization system blowdown and long term cooling was selected as the reference event since many different modes of natural circulation phenomena, including the coupling of the primary system, high pressure containment and cooling pool, are expected to occur during this transient. A power maneuvering transient was also tested to examine the stability of natural circulation under single- and two-phase conditions. The ICSP was conducted in three phases: pre-test (with designed initial and boundary conditions established before the experiment was conducted), blind (with real initial and boundary conditions after the experiment was conducted) and open simulation (after the observation of real experimental data). The most advanced thermohydraulic system analysis codes, such as TRACE, RELAP5 and MARS, have been assessed against experiments conducted at the MASLWR test facility. The ICSP has provided all participants with the opportunity to evaluate the strengths and weaknesses of their system codes in the transient

  10. Preface: Research advances in vadose zone hydrology through simulations with the TOUGH codes

    International Nuclear Information System (INIS)

    Finsterle, Stefan; Oldenburg, Curtis M.

    2004-01-01

    Numerical simulators are playing an increasingly important role in advancing our fundamental understanding of hydrological systems. They are indispensable tools for managing groundwater resources, analyzing proposed and actual remediation activities at contaminated sites, optimizing recovery of oil, gas, and geothermal energy, evaluating subsurface structures and mining activities, designing monitoring systems, assessing the long-term impacts of chemical and nuclear waste disposal, and devising improved irrigation and drainage practices in agricultural areas, among many other applications. The complexity of subsurface hydrology in the vadose zone calls for sophisticated modeling codes capable of handling the strong nonlinearities involved, the interactions of coupled physical, chemical and biological processes, and the multiscale heterogeneities inherent in such systems. The papers in this special section of "Vadose Zone Journal" are illustrative of the enormous potential of such numerical simulators as applied to the vadose zone. The papers describe recent developments and applications of one particular set of codes, the TOUGH family of codes, as applied to nonisothermal flow and transport in heterogeneous porous and fractured media (http://www-esd.lbl.gov/TOUGH2). The contributions were selected from presentations given at the TOUGH Symposium 2003, which brought together developers and users of the TOUGH codes at the Lawrence Berkeley National Laboratory (LBNL) in Berkeley, California, for three days of information exchange in May 2003 (http://www-esd.lbl.gov/TOUGHsymposium). The papers presented at the symposium covered a wide range of topics, including geothermal reservoir engineering, fracture flow and vadose zone hydrology, nuclear waste disposal, mining engineering, reactive chemical transport, environmental remediation, and gas transport. This Special Section of "Vadose Zone Journal" contains revised and expanded versions of selected papers from the

  11. Development of steady thermal-hydraulic analysis code for China advanced research reactor

    International Nuclear Information System (INIS)

    Tian Wenxi; Qiu Suizheng; Guo Yun; Su Guanghui; Jia Dounan; Liu Tiancai; Zhang Jianwei

    2006-01-01

    A multi-channel model steady-state thermal-hydraulic analysis code was developed for the China Advanced Research Reactor (CARR). By simulating the whole reactor core, the detailed flow distribution in the core was obtained. The result shows that the structure size plays the most important role in flow distribution and that the influence of core power can be neglected under single-phase flow. The temperature field of the fuel element under an unsymmetrical cooling condition was also obtained, which is necessary for further studies of the fuel element, such as stress analysis. At the same time, considering the hot channel effect, including engineering and nuclear factors, a hot channel calculation was carried out, and it was shown that all thermal-hydraulic parameters comply with the Safety Regulation of CARR. (authors)

  12. Development of a steady thermal-hydraulic analysis code for the China Advanced Research Reactor

    Institute of Scientific and Technical Information of China (English)

    TIAN Wenxi; QIU Suizheng; GUO Yun; SU Guanghui; JIA Dounan; LIU Tiancai; ZHANG Jianwei

    2007-01-01

    A multi-channel model steady-state thermal-hydraulic analysis code was developed for the China Advanced Research Reactor (CARR). By simulating the whole reactor core, the detailed mass flow distribution in the core was obtained. The result shows that structure size plays the most important role in mass flow distribution, and that the influence of core power can be neglected under single-phase flow. The temperature field of the fuel element under an unsymmetrical cooling condition was also obtained, which is necessary for further studies of the fuel element, such as stress analysis. At the same time, considering the hot channel effect including engineering and nuclear factors, calculations of the mean and hot channels were carried out, and it was shown that all thermal-hydraulic parameters satisfy the "Safety design regulation of CARR".

  13. Discrete rod burnup analysis capability in the Westinghouse advanced nodal code

    International Nuclear Information System (INIS)

    Buechel, R.J.; Fetterman, R.J.; Petrunyak, M.A.

    1992-01-01

    Core design analysis in the last several years has evolved toward the adoption of nodal-based methods to replace traditional fine-mesh models as the standard neutronic tool for first core and reload design applications throughout the nuclear industry. The accuracy, speed, and reduction in computation requirements associated with the nodal methods have made three-dimensional modeling the preferred approach to obtain the most realistic core model. These methods incorporate detailed rod power reconstruction as well. Certain design applications such as confirmation of fuel rod design limits and fuel reconstitution considerations, for example, require knowledge of the rodwise burnup distribution to avoid unnecessary conservatism in design analyses. The Westinghouse Advanced Nodal Code (ANC) incorporates the capability to generate the intra-assembly pin burnup distribution using an efficient algorithm

  14. Development of a computer code for dynamic analysis of the primary circuit of advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Rocha, Jussie Soares da; Lira, Carlos A.B.O.; Magalhaes, Mardson A. de Sa, E-mail: cabol@ufpe.b [Universidade Federal de Pernambuco (DEN/UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear

    2011-07-01

    Currently, advanced reactors are being developed, seeking enhanced safety, better performance and low environmental impacts. Reactor designs must follow several steps and numerous tests before a conceptual project can be certified. In this sense, computational tools become indispensable in the preparation of such projects. Thus, this study aimed at the development of a computational tool for thermal-hydraulic analysis by coupling two computer codes to evaluate the influence of transients caused by pressure variations and flow surges in the region of the primary circuit of the IRIS reactor between the core and the pressurizer. For the simulation, an 'insurge' situation was used, characterized by the entry of water into the pressurizer due to the expansion of the coolant in the primary circuit. This expansion was represented by a pressure disturbance in step form, through the SIMULINK 'step' block, thus initiating the transient. The results showed that the dynamic tool, obtained through the coupling of the codes, generated very satisfactory responses within model limitations, preserving the most important phenomena in the process. (author)
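
    The coupling described above drives the pressurizer with a step-shaped pressure disturbance. The sketch below is only a stand-in for that idea: a first-order lag response to a step input, integrated with explicit Euler. The time constant, gain and step size are hypothetical and unrelated to the IRIS model or the authors' coupled codes.

```python
import numpy as np

# Illustrative first-order response of pressurizer level to a step pressure
# disturbance in the primary loop; all parameters are assumed, not IRIS data.
dt, t_end = 0.1, 60.0             # s
tau = 8.0                         # s, assumed lag time constant
gain = 0.15                       # m of level per bar of disturbance (assumed)
step_time, step_size = 5.0, 2.0   # 2 bar step applied at t = 5 s

t = np.arange(0.0, t_end, dt)
disturbance = np.where(t >= step_time, step_size, 0.0)

level = np.zeros_like(t)
for i in range(1, len(t)):
    # explicit Euler step of dL/dt = (gain * dP - L) / tau
    level[i] = level[i - 1] + dt * (gain * disturbance[i] - level[i - 1]) / tau

print(f"level rise after 30 s: {level[int(30 / dt)]:.3f} m")
```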

  15. Development of a computer code for dynamic analysis of the primary circuit of advanced reactors

    International Nuclear Information System (INIS)

    Rocha, Jussie Soares da; Lira, Carlos A.B.O.; Magalhaes, Mardson A. de Sa

    2011-01-01

    Currently, advanced reactors are being developed, seeking enhanced safety, better performance and low environmental impacts. Reactor designs must follow several steps and numerous tests before a conceptual project can be certified. In this sense, computational tools become indispensable in the preparation of such projects. Thus, this study aimed at the development of a computational tool for thermal-hydraulic analysis by coupling two computer codes to evaluate the influence of transients caused by pressure variations and flow surges in the region of the primary circuit of the IRIS reactor between the core and the pressurizer. For the simulation, an 'insurge' situation was used, characterized by the entry of water into the pressurizer due to the expansion of the coolant in the primary circuit. This expansion was represented by a pressure disturbance in step form, through the SIMULINK 'step' block, thus initiating the transient. The results showed that the dynamic tool, obtained through the coupling of the codes, generated very satisfactory responses within model limitations, preserving the most important phenomena in the process. (author)

  16. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development

    International Nuclear Information System (INIS)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-01-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  17. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  18. Final Report for Project 13-4791: New Mechanistic Models of Creep-Fatigue Crack Growth Interactions for Advanced High Temperature Reactor Components

    Energy Technology Data Exchange (ETDEWEB)

    Kruzic, Jamie J [Oregon State Univ., Corvallis, OR (United States); Siegmund, Thomas [Purdue Univ., West Lafayette, IN (United States); Tomar, Vikas [Purdue Univ., West Lafayette, IN (United States)

    2018-03-20

    This project developed and validated a novel, multi-scale, mechanism-based model to quantitatively predict creep-fatigue crack growth and failure for Ni-based Alloy 617 at 800°C. Alloy 617 is a target material for intermediate heat exchangers in Generation IV very high temperature reactor designs, and it is envisioned that this model will aid in the design of safe, long lasting nuclear power plants. The technical effectiveness of the model was shown by demonstrating that experimentally observed crack growth rates can be predicted under both steady state and overload crack growth conditions. Feasibility was considered by incorporating our model into a commercially available finite element method code, ABAQUS, that is commonly used by design engineers. While the focus of the project was specifically on an alloy targeted for Generation IV nuclear reactors, the benefits to the public are expected to be wide reaching. Indeed, creep-fatigue failure is a design consideration for a wide range of high temperature mechanical systems that rely on Ni-based alloys, including industrial gas power turbines, advanced ultra-super critical steam turbines, and aerospace turbine engines. It is envisioned that this new model can be adapted to a wide range of engineering applications.

  19. Advances in the development of interaction between the codes MCNPX and ANSYS Fluent and their fusion applications

    International Nuclear Information System (INIS)

    Colomer, C.; Salellas, J.; Ahmed, R.; Fabbrio, M.; Aleman, A.

    2012-01-01

    The advances achieved in the project for the development of a coupling code between MCNPX and ANSYS Fluent are presented. Following the workflow of the project, the most appropriate remeshing algorithms between the two codes are studied. In addition, the selection and implementation of methods to internally verify the correct transmission of the variables involved between both meshes are explained. Finally, the selection of cases for verification and validation of the interaction between both codes in each of the possible fields of application is presented.

  20. Tri-Lab Co-Design Milestone: In-Depth Performance Portability Analysis of Improved Integrated Codes on Advanced Architecture.

    Energy Technology Data Exchange (ETDEWEB)

    Hoekstra, Robert J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hammond, Simon David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Richards, David [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bergen, Ben [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-01

    This milestone is a tri-lab deliverable supporting ongoing Co-Design efforts impacting applications in the Integrated Codes (IC) program element and the Advanced Technology Development and Mitigation (ATDM) program element. In FY14, the tri-labs looked at porting proxy applications to technologies of interest for ATS procurements. In FY15, a milestone was completed evaluating proxy applications in multiple programming models, and in FY16, a milestone was completed focusing on the migration of lessons learned back into production code development. This year, the co-design milestone focuses on feeding the knowledge gained and/or code revisions back into production applications.

  1. Recent advances in coding theory for near error-free communications

    Science.gov (United States)

    Cheung, K.-M.; Deutsch, L. J.; Dolinar, S. J.; Mceliece, R. J.; Pollara, F.; Shahshahani, M.; Swanson, L.

    1991-01-01

    Channel and source coding theories are discussed. The following subject areas are covered: large constraint length convolutional codes (the Galileo code); decoder design (the big Viterbi decoder); Voyager's and Galileo's data compression scheme; current research in data compression for images; neural networks for soft decoding; neural networks for source decoding; finite-state codes; and fractals for data compression.
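
    To make the decoding side of this survey concrete, here is a minimal hard-decision Viterbi decoder for a toy rate-1/2, constraint-length-3 convolutional code with generators (7, 5) in octal. It is a didactic stand-in for the large constraint-length codes and the "big Viterbi decoder" mentioned above, not their implementation; the encoder, message and single-bit channel error are all hypothetical.

```python
# Toy rate-1/2, constraint-length-3 convolutional code with generators (7, 5).
G = [0b111, 0b101]          # generator polynomials
K = 3                       # constraint length
N_STATES = 1 << (K - 1)

def encode(bits):
    """Encode an information bit sequence; two output bits per input bit."""
    state, out = 0, []
    for b in bits:
        reg = (b << (K - 1)) | state              # shift register contents
        out.extend(bin(reg & g).count("1") & 1 for g in G)
        state = reg >> 1
    return out

def viterbi_decode(received):
    """Hard-decision Viterbi decoding by minimizing accumulated Hamming distance."""
    INF = float("inf")
    metrics = [0.0] + [INF] * (N_STATES - 1)      # encoder starts in state 0
    paths = [[] for _ in range(N_STATES)]
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metrics = [INF] * N_STATES
        new_paths = [None] * N_STATES
        for state in range(N_STATES):
            if metrics[state] == INF:
                continue
            for b in (0, 1):                      # hypothesize the input bit
                reg = (b << (K - 1)) | state
                expected = [bin(reg & g).count("1") & 1 for g in G]
                branch = sum(e != x for e, x in zip(expected, r))
                nxt = reg >> 1
                m = metrics[state] + branch
                if m < new_metrics[nxt]:          # keep the survivor path
                    new_metrics[nxt] = m
                    new_paths[nxt] = paths[state] + [b]
        metrics, paths = new_metrics, new_paths
    best = min(range(N_STATES), key=lambda s: metrics[s])
    return paths[best]

msg = [1, 0, 1, 1, 0, 0, 1, 0]
coded = encode(msg)
coded[3] ^= 1                                     # flip one channel bit (noise)
print(viterbi_decode(coded) == msg)               # True: the single error is corrected
```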

  2. Application of advanced validation concepts to oxide fuel performance codes: LIFE-4 fast-reactor and FRAPCON thermal-reactor fuel performance codes

    Energy Technology Data Exchange (ETDEWEB)

    Unal, C., E-mail: cu@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Williams, B.J. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Yacout, A. [Argonne National Laboratory, 9700 S. Cass Avenue, Lemont, IL 60439 (United States); Higdon, D.M. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States)

    2013-10-15

    Highlights: ► The application of advanced validation techniques (sensitivity, calibration and prediction) to nuclear performance codes FRAPCON and LIFE-4 is the focus of the paper. ► A sensitivity ranking methodology narrows down the number of selected modeling parameters from 61 to 24 for FRAPCON and from 69 to 35 for LIFE-4. ► Fuel creep, fuel thermal conductivity, fission gas transport/release, crack/boundary, and fuel gap conductivity models of LIFE-4 are identified for improvements. ► FRAPCON sensitivity results indicated the importance of the fuel thermal conduction and the fission gas release models. -- Abstract: Evolving nuclear energy programs expect to use enhanced modeling and simulation (M and S) capabilities, using multiscale, multiphysics modeling approaches, to reduce both cost and time from the design through the licensing phases. Interest in the development of the multiscale, multiphysics approach has increased in the last decade because of the need for predictive tools for complex interacting processes as a means of eliminating the limited use of empirically based model development. Complex interacting processes cannot be predicted by analyzing each individual component in isolation. In most cases, the mathematical models of complex processes and their boundary conditions are nonlinear. As a result, the solutions of these mathematical models often require high-performance computing capabilities and resources. The use of multiscale, multiphysics (MS/MP) models in conjunction with high-performance computational software and hardware introduces challenges in validating these predictive tools—traditional methodologies will have to be modified to address these challenges. The advanced MS/MP codes for nuclear fuels and reactors are being developed within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program of the US Department of Energy (DOE) – Nuclear Energy (NE). This paper does not directly address challenges in calibration
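
    One common way to realize the kind of sensitivity ranking described above (though not necessarily the authors' method) is to sample the inputs, evaluate a surrogate of the code response, and screen parameters by a simple correlation measure. Everything in the sketch below, including the parameter roles, the surrogate and the cutoff, is a hypothetical illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy surrogate standing in for a fuel performance code output (e.g., fission
# gas release); the parameter assignments and coefficients are invented.
def surrogate(x):
    # columns 0-2 loosely play the role of thermal conductivity, creep rate,
    # and gas diffusivity; the remaining columns are near-inert "noise" inputs
    return 2.0 * x[:, 0] - 0.5 * x[:, 1] + 0.1 * x[:, 2] + 0.01 * x[:, 3:].sum(axis=1)

n_samples, n_params = 500, 10
X = rng.normal(0.0, 1.0, (n_samples, n_params))   # standardized input samples
y = surrogate(X)

# Rank parameters by the absolute correlation between each input and the
# output, and retain only those above a cutoff (one simple screening choice).
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_params)])
ranking = np.argsort(corr)[::-1]
selected = [j for j in ranking if corr[j] > 0.1]
print("ranked parameters:", ranking.tolist())
print("retained after screening:", selected)
```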

  3. REVA Advanced Fuel Design and Codes and Methods - Increasing Reliability, Operating Margin and Efficiency in Operation

    Energy Technology Data Exchange (ETDEWEB)

    Frichet, A.; Mollard, P.; Gentet, G.; Lippert, H. J.; Curva-Tivig, F.; Cole, S.; Garner, N.

    2014-07-01

    For three decades, AREVA has been incrementally implementing upgrades in BWR and PWR fuel designs and codes and methods, leading to ever-greater fuel efficiency and easier licensing. For PWRs, AREVA is implementing upgraded versions of its HTP™ and AFA 3G technologies called HTP™-I and AFA3G-I. These fuel assemblies feature improved robustness and dimensional stability through the ultimate optimization of their hold-down system, the use of Q12, the AREVA advanced quaternary alloy for guide tubes, the increase in their wall thickness and the stiffening of the spacer to guide tube connection. But an even bigger step forward has been achieved as AREVA has successfully developed and introduced to the market the GAIA product, which maintains the resistance to grid-to-rod fretting (GTRF) of the HTP™ product while providing additional thermal-hydraulic margin and high resistance to fuel assembly bow. (Author)

  4. Performance evaluations of advanced massively parallel platforms based on gyrokinetic toroidal five-dimensional Eulerian code GT5D

    International Nuclear Information System (INIS)

    Idomura, Yasuhiro; Jolliet, Sebastien

    2010-01-01

    A gyrokinetic toroidal five-dimensional Eulerian code, GT5D, is ported to six advanced massively parallel platforms and comprehensive benchmark tests are performed. A parallelisation technique based on physical properties of the gyrokinetic equation is presented. By extending the parallelisation technique with a hybrid parallel model, the scalability of the code is improved on platforms with multi-core processors. In the benchmark tests, good scalability is confirmed up to several thousand cores on all platforms, and a maximum sustained performance of ∼18.6 Tflops is achieved using 16384 cores of BX900. (author)

  5. Advancing methods for reliably assessing motivational interviewing fidelity using the motivational interviewing skills code.

    Science.gov (United States)

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C

    2015-02-01

    The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.
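
    For readers unfamiliar with utterance-level reliability, the sketch below computes Cohen's kappa for two raters' categorical codes on the same utterances and contrasts it with session-level tallies, which often look more agreeable than the utterance-by-utterance labels. The code labels and data are invented, and the paper's actual estimators may differ.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical codes on the same utterances."""
    assert len(r1) == len(r2)
    n = len(r1)
    cats = sorted(set(r1) | set(r2))
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    p_exp = sum((c1[c] / n) * (c2[c] / n) for c in cats)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical utterance-level MISC-style codes from two raters for one session.
rater1 = ["reflection", "question", "question", "MI-adherent", "reflection", "other"]
rater2 = ["reflection", "question", "other",    "MI-adherent", "question",  "other"]

print(f"utterance-level kappa: {cohens_kappa(rater1, rater2):.2f}")

# Session-level tallies (counts per code) can agree better than the
# utterance-by-utterance labels, which is the bias discussed in the abstract.
print("session tallies:", dict(Counter(rater1)), "vs", dict(Counter(rater2)))
```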

  6. Ideas for Advancing Code Sharing: A Different Kind of Hack Day

    Science.gov (United States)

    Teuben, P.; Allen, A.; Berriman, B.; DuPrie, K.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Wallin, J. F.

    2014-05-01

    How do we as a community encourage the reuse of software for telescope operations, data processing, and ? How can we support making codes used in research available for others to examine? Continuing the discussion from last year's Bring out your codes! BoF session, participants separated into groups to brainstorm ideas to mitigate factors which inhibit code sharing and nurture those which encourage code sharing. The BoF concluded with the sharing of ideas that arose from the brainstorming sessions and a brief summary by the moderator.

  7. Development of essential system technologies for advanced reactor - Development of natural circulation analysis code for integral reactor

    Energy Technology Data Exchange (ETDEWEB)

    Park, Goon Cherl; Park, Ik Gyu; Kim, Jae Hak; Lee, Sang Min; Kim, Tae Wan [Seoul National University, Seoul (Korea)

    1999-04-01

    The objective of this study is to understand the natural circulation characteristics of integral type reactors and to develop a natural circulation analysis code for integral type reactors. This study is focused on the asymmetric 3-dimensional flow during natural circulation, such as 1/4 steam generator section isolation and the inclination of the reactor systems. Natural circulation experiments were performed using small-scale facilities of the integral reactor SMART (System-Integrated Modular Advanced ReacTor). The CFX4 code was used to investigate the flow patterns and thermal mixing phenomena in the upper pressure header and downcomer. Differences between normal operation of all steam generators and the 1/4 section isolation conditions were observed, and the results were used as the data for RETRAN-03/INT code validation. The RETRAN-03 code was modified for the development of a natural circulation analysis code for integral type reactors, which was named RETRAN-03/INT. 3-dimensional analysis models for asymmetric flow in integral type reactors were developed using vector momentum equations in RETRAN-03. Analysis results using RETRAN-03/INT were compared with experimental and CFX4 analysis results and showed good agreement. The natural circulation characteristics obtained in this study will provide important and fundamental design features for future small and medium integral reactors. (author). 29 refs., 75 figs., 18 tabs.

  8. A Mode Propagation Database Suitable for Code Validation Utilizing the NASA Glenn Advanced Noise Control Fan and Artificial Sources

    Science.gov (United States)

    Sutliff, Daniel L.

    2014-01-01

    The NASA Glenn Research Center's Advanced Noise Control Fan (ANCF) was developed in the early 1990s to provide a convenient test bed to measure and understand fan-generated acoustics, duct propagation, and radiation to the farfield. A series of tests were performed primarily for the use of code validation and tool validation. Rotating Rake mode measurements were acquired for parametric sets of: (i) mode blockage, (ii) liner insertion loss, (iii) short ducts, and (iv) mode reflection.

  9. Best estimate LB LOCA approach based on advanced thermal-hydraulic codes

    International Nuclear Information System (INIS)

    Sauvage, J.Y.; Gandrille, J.L.; Gaurrand, M.; Rochwerger, D.; Thibaudeau, J.; Viloteau, E.

    2004-01-01

    Improvements achieved in thermal-hydraulics with the development of Best Estimate computer codes have led a number of Safety Authorities to advocate realistic analyses instead of conservative calculations. The potential of a Best Estimate approach for the analysis of LOCAs urged FRAMATOME to enter early into the development, with CEA and EDF, of the 2nd generation code CATHARE, and then of an LBLOCA BE methodology with BWNT following the Code Scaling, Applicability and Uncertainty (CSAU) approach. CATHARE and TRAC are the basic tools for LOCA studies which will be performed by FRAMATOME according to either a deterministic better estimate (dbe) methodology or a Statistical Best Estimate (SBE) methodology. (author)

  10. Validation of thermal hydraulic computer codes for advanced light water reactor

    International Nuclear Information System (INIS)

    Macek, J.

    2001-01-01

    The Czech Republic operates 4 WWER-440 units, and two WWER-1000 units are being finalised (one of them is undergoing commissioning). The Thermal-hydraulics Department of the Nuclear Research Institute Rez performs accident analyses for these plants using a number of computer codes. To model the primary and secondary circuit behaviour, the system codes ATHLET, CATHARE, RELAP and TRAC are applied. The containment and pressure-suppression system are modelled with the RALOC and MELCOR codes, the reactor power calculations (point and space neutron kinetics) are made with DYN3D and NESTLE, and CFD codes (FLUENT, TRIO) are used for some specific problems. An integral part of the current Czech project 'New Energy Sources' is the selection of a new nuclear source. Within this and the preceding projects financed by the Czech Ministry of Industry and Trade and the EU PHARE, the Department carries out and has carried out the systematic validation of thermal-hydraulic and reactor physics computer codes applying data obtained on several experimental facilities as well as real operational data. The paper provides concise information on these activities of the NRI and its Thermal-hydraulics Department. A detailed example of system code validation and the consequent utilisation of the results for real NPP purposes is included. (author)

  11. Advancements in reactor physics modelling methodology of Monte Carlo Burnup Code MCB dedicated to higher simulation fidelity of HTR cores

    International Nuclear Information System (INIS)

    Cetnar, Jerzy

    2014-01-01

    The recent development of MCB, the Monte Carlo Continuous Energy Burn-up code, is directed towards an advanced description of modern reactors, including the double heterogeneity structures that exist in HTRs. In this, we exploit the advantages of the MCB methodology in an integrated approach, where physics, neutronics, burnup, reprocessing, non-stationary process modelling (control rod operation) and refined spatial modelling are carried out in a single flow. This approach allows for the implementation of advanced statistical options such as analysis of error propagation, perturbation in the time domain, and sensitivity and source convergence analyses. It includes statistical analysis of the burnup process, emitted particle collection, thermal-hydraulic coupling, automatic power profile calculations, advanced procedures of burnup step normalization and enhanced post-processing capabilities. (author)

  12. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term - Trial Calculation

    International Nuclear Information System (INIS)

    Grabaskas, David

    2016-01-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  13. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Bucknor, Matthew [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Jerden, James [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Brunett, Acacia J. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Denman, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Engineering Division; Clark, Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Engineering Division; Denning, Richard S. [Consultant, Columbus, OH (United States)

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  14. Development and Implementation of CFD-Informed Models for the Advanced Subchannel Code CTF

    Science.gov (United States)

    Blyth, Taylor S.

    The research described in this PhD thesis contributes to the development of efficient methods for the utilization of high-fidelity models and codes to inform low-fidelity models and codes in the area of nuclear reactor core thermal-hydraulics. The objective is to increase the accuracy of predictions of quantities of interest using high-fidelity CFD models while preserving the efficiency of low-fidelity subchannel core calculations. An original methodology named Physics-based Approach for High-to-Low Model Information has been further developed and tested. The overall physical phenomena and corresponding localized effects, which are introduced by the presence of spacer grids in light water reactor (LWR) cores, are dissected into four basic building processes, and the corresponding models are informed using high-fidelity CFD codes. These models are a spacer grid-directed cross-flow model, a grid-enhanced turbulent mixing model, a heat transfer enhancement model, and a spacer grid pressure loss model. The localized CFD models are developed and tested using the CFD code STAR-CCM+, and the corresponding global model development and testing in subchannel formulation is performed in the thermal-hydraulic subchannel code CTF. The improved CTF simulations utilize data files derived from CFD STAR-CCM+ simulation results covering the spacer grid design desired for inclusion in the CTF calculation. The current implementation of these models is examined and possibilities for improvement and further development are suggested. The validation experimental database is extended by including the OECD/NRC PSBT benchmark data. The outcome is an enhanced accuracy of CTF predictions while preserving the computational efficiency of a low-fidelity subchannel code.
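
    Of the four models listed, the spacer grid pressure loss model has the most familiar closed form, ΔP = K ρ v²/2, with the loss coefficient K informed by CFD in the approach described above. The sketch below shows only that generic form with made-up numbers; it is not the CTF implementation, and the loss coefficient is hypothetical.

```python
def spacer_grid_pressure_loss(rho, velocity, loss_coefficient):
    """Classic form of a grid pressure loss: dP = K * rho * v**2 / 2 (Pa).

    In a CFD-informed approach, the loss coefficient K would be tabulated from
    STAR-CCM+ results for a specific grid design; the value used below is
    purely illustrative.
    """
    return loss_coefficient * rho * velocity**2 / 2.0

# Example: single-phase water at roughly PWR-like conditions (assumed values).
rho = 720.0        # kg/m^3
v = 5.0            # m/s axial velocity in the subchannel
k_grid = 0.9       # dimensionless loss coefficient (hypothetical)
print(f"grid pressure drop: {spacer_grid_pressure_loss(rho, v, k_grid):.0f} Pa")
```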

  15. Development and Implementation of CFD-Informed Models for the Advanced Subchannel Code CTF

    Energy Technology Data Exchange (ETDEWEB)

    Blyth, Taylor S. [Pennsylvania State Univ., University Park, PA (United States); Avramova, Maria [North Carolina State Univ., Raleigh, NC (United States)

    2017-04-01

    The research described in this PhD thesis contributes to the development of efficient methods for the utilization of high-fidelity models and codes to inform low-fidelity models and codes in the area of nuclear reactor core thermal-hydraulics. The objective is to increase the accuracy of predictions of quantities of interest using high-fidelity CFD models while preserving the efficiency of low-fidelity subchannel core calculations. An original methodology named Physics-based Approach for High-to-Low Model Information has been further developed and tested. The overall physical phenomena and corresponding localized effects, which are introduced by the presence of spacer grids in light water reactor (LWR) cores, are dissected into four basic building processes, and the corresponding models are informed using high-fidelity CFD codes. These models are a spacer grid-directed cross-flow model, a grid-enhanced turbulent mixing model, a heat transfer enhancement model, and a spacer grid pressure loss model. The localized CFD models are developed and tested using the CFD code STAR-CCM+, and the corresponding global model development and testing in subchannel formulation is performed in the thermal-hydraulic subchannel code CTF. The improved CTF simulations utilize data files derived from CFD STAR-CCM+ simulation results covering the spacer grid design desired for inclusion in the CTF calculation. The current implementation of these models is examined and possibilities for improvement and further development are suggested. The validation experimental database is extended by including the OECD/NRC PSBT benchmark data. The outcome is an enhanced accuracy of CTF predictions while preserving the computational efficiency of a low-fidelity subchannel code.

  16. Advanced Wall Boiling Model with Wide Range Applicability for the Subcooled Boiling Flow and its Application into the CFD Code

    International Nuclear Information System (INIS)

    Yun, B. J.; Song, C. H.; Splawski, A.; Lo, S.

    2010-01-01

    Subcooled boiling is one of the crucial phenomena for the design, operation and safety analysis of a nuclear power plant. It occurs due to the thermal non-equilibrium state in the two-phase heat transfer system. Many complicated phenomena such as bubble generation, bubble departure, bubble growth and bubble condensation are created by this thermal non-equilibrium condition in the subcooled boiling flow. However, it has been revealed that most of the existing best estimate safety analysis codes have a weakness in the prediction of subcooled boiling phenomena in which multi-dimensional flow behavior is dominant. In recent years, many investigators have been trying to apply CFD (Computational Fluid Dynamics) codes for an accurate prediction of the subcooled boiling flow. In the CFD codes, the evaporation heat flux from the heated wall is one of the key parameters to be modeled for an accurate prediction of the subcooled boiling flow. The evaporation heat flux for the CFD codes is typically expressed as q''_e = (π D_d³/6) ρ_g h_fg f N'', where D_d, f and N'' are the bubble departure size, bubble departure frequency and active nucleation site density, respectively. In most of the commercial CFD codes, the Tolubinsky bubble departure size model, the Kurul and Podowski active nucleation site density model and the Ceumem-Lindenstjerna bubble departure frequency model are adopted as the basic wall boiling model. However, these models do not consider the dependency on flow, pressure and fluid type. In this paper, an advanced wall boiling model is proposed in order to improve the subcooled boiling model for the CFD codes
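
    The evaporation heat flux expression quoted in the abstract can be evaluated directly once the wall-boiling parameters are known. The sketch below implements exactly that formula; the numerical values are illustrative assumptions for subcooled water at roughly 45 bar, not data taken from the paper.

```python
import math

def evaporation_heat_flux(d_departure, rho_gas, h_fg, frequency, site_density):
    """Evaporation component of the wall heat flux partitioning,
    q''_e = (pi * D_d**3 / 6) * rho_g * h_fg * f * N'', as quoted in the abstract.
    Units: m, kg/m^3, J/kg, 1/s, sites/m^2 -> W/m^2.
    """
    bubble_volume = math.pi * d_departure**3 / 6.0
    return bubble_volume * rho_gas * h_fg * frequency * site_density

# Illustrative numbers only (assumed, not from the paper).
q_e = evaporation_heat_flux(
    d_departure=0.6e-3,   # bubble departure diameter, m (assumed)
    rho_gas=22.0,         # vapor density, kg/m^3 (assumed)
    h_fg=1.68e6,          # latent heat, J/kg (assumed)
    frequency=80.0,       # departure frequency, 1/s (assumed)
    site_density=5.0e5,   # active nucleation sites per m^2 (assumed)
)
print(f"evaporation heat flux: {q_e / 1000:.1f} kW/m^2")
```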

  17. A Tough Call : Mitigating Advanced Code-Reuse Attacks at the Binary Level

    NARCIS (Netherlands)

    Veen, Victor Van Der; Goktas, Enes; Contag, Moritz; Pawoloski, Andre; Chen, Xi; Rawat, Sanjay; Bos, Herbert; Holz, Thorsten; Athanasopoulos, Ilias; Giuffrida, Cristiano

    2016-01-01

    Current binary-level Control-Flow Integrity (CFI) techniques are weak in determining the set of valid targets for indirect control flow transfers on the forward edge. In particular, the lack of source code forces existing techniques to resort to a conservative address-taken policy that

  18. Implementation of advanced finite element technology in structural analysis computer codes

    International Nuclear Information System (INIS)

    Kohli, T.D.; Wiley, J.W.; Koss, P.W.

    1975-01-01

    Advances in finite element technology over the last several years have been rapid and have largely outstripped the ability of general purpose programs in the public domain to assimilate them. As a result, it has become the burden of the structural analyst to incorporate these advances himself. This paper discusses the implementation and extension of specific technological advances in Bechtel structural analysis programs. In general these advances belong in two categories: (1) the finite elements themselves and (2) equation solution algorithms. Improvements in the finite elements involve increased accuracy of the elements and extension of their applicability to various specialized modelling situations. Improvements in solution algorithms have been almost exclusively aimed at expanding problem solving capacity. (Auth.)

  19. Advanced Technology and Mitigation (ATDM) SPARC Re-Entry Code Fiscal Year 2017 Progress and Accomplishments for ECP.

    Energy Technology Data Exchange (ETDEWEB)

    Crozier, Paul [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Howard, Micah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Freno, Brian Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bova, Steven W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carnes, Brian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    The SPARC (Sandia Parallel Aerodynamics and Reentry Code) will provide nuclear weapon qualification evidence for the random vibration and thermal environments created by re-entry of a warhead into the earth’s atmosphere. SPARC incorporates the innovative approaches of ATDM projects on several fronts including: effective harnessing of heterogeneous compute nodes using Kokkos, exascale-ready parallel scalability through asynchronous multi-tasking, uncertainty quantification through Sacado integration, implementation of state-of-the-art reentry physics and multiscale models, use of advanced verification and validation methods, and enabling of improved workflows for users. SPARC is being developed primarily for the Department of Energy nuclear weapon program, with additional development and use of the code being supported by the Department of Defense for conventional weapons programs.

  20. International benchmark study of advanced thermal hydraulic safety analysis codes against measurements on IEA-R1 research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Hainoun, A., E-mail: pscientific2@aec.org.sy [Atomic Energy Commission of Syria (AECS), Nuclear Engineering Department, P.O. Box 6091, Damascus (Syrian Arab Republic); Doval, A. [Nuclear Engineering Department, Av. Cmdt. Luis Piedrabuena 4950, C.P. 8400 S.C de Bariloche, Rio Negro (Argentina); Umbehaun, P. [Centro de Engenharia Nuclear – CEN, IPEN-CNEN/SP, Av. Lineu Prestes 2242-Cidade Universitaria, CEP-05508-000 São Paulo, SP (Brazil); Chatzidakis, S. [School of Nuclear Engineering, Purdue University, West Lafayette, IN 47907 (United States); Ghazi, N. [Atomic Energy Commission of Syria (AECS), Nuclear Engineering Department, P.O. Box 6091, Damascus (Syrian Arab Republic); Park, S. [Research Reactor Design and Engineering Division, Basic Science Project Operation Dept., Korea Atomic Energy Research Institute (Korea, Republic of); Mladin, M. [Institute for Nuclear Research, Campului Street No. 1, P.O. Box 78, 115400 Mioveni, Arges (Romania); Shokr, A. [Division of Nuclear Installation Safety, Research Reactor Safety Section, International Atomic Energy Agency, A-1400 Vienna (Austria)

    2014-12-15

    Highlights: • A set of advanced system thermal hydraulic codes is benchmarked against the IFA of IEA-R1. • Comparative safety analysis of the IEA-R1 reactor during LOFA by 7 working teams. • This work covers both experimental and calculational effort and presents new findings on the TH of RRs that have not been reported before. • LOFA result discrepancies range from 7% to 20%; coolant and peak clad temperatures are predicted conservatively. - Abstract: In the framework of the IAEA Coordinated Research Project on “Innovative methods in research reactor analysis: Benchmark against experimental data on neutronics and thermal hydraulic computational methods and tools for operation and safety analysis of research reactors” the Brazilian research reactor IEA-R1 has been selected as the reference facility to perform benchmark calculations for a set of thermal hydraulic codes being widely used by international teams in the field of research reactor (RR) deterministic safety analysis. The goal of the conducted benchmark is to demonstrate the application of innovative reactor analysis tools in the research reactor community, validation of the applied codes and application of the validated codes to perform comprehensive safety analysis of RRs. The IEA-R1 is equipped with an Instrumented Fuel Assembly (IFA) which provided measurements for normal operation and a loss of flow transient. The measurements comprised coolant and cladding temperatures, reactor power and flow rate. Temperatures are measured at three different radial and axial positions of the IFA, summing up to 12 measuring points in addition to the coolant inlet and outlet temperatures. The considered benchmark deals with the loss of reactor flow and the subsequent flow reversal from downward forced to upward natural circulation and therefore presents phenomena relevant for RR safety analysis. The benchmark calculations were performed independently by the participating teams using different thermal hydraulic and safety

  1. Advances in Monte-Carlo code TRIPOLI-4®'s treatment of the electromagnetic cascade

    Science.gov (United States)

    Mancusi, Davide; Bonin, Alice; Hugot, François-Xavier; Malouch, Fadhel

    2018-01-01

    TRIPOLI-4® is a Monte-Carlo particle-transport code developed at CEA-Saclay (France) that is employed in the domains of nuclear-reactor physics, criticality-safety, shielding/radiation protection and nuclear instrumentation. The goal of this paper is to report on current developments, validation and verification made in TRIPOLI-4 in the electron/positron/photon sector. The new capabilities and improvements concern refinements to the electron transport algorithm, the introduction of a charge-deposition score, the new thick-target bremsstrahlung option, the upgrade of the bremsstrahlung model and the improvement of electron angular straggling at low energy. The importance of each of the developments above is illustrated by comparisons with calculations performed with other codes and with experimental data.

  2. IAEA programme to support development and validation of advanced design and safety analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Choi, J., E-mail: J.H.Choi@iaea.org [International Atomic Energy Agency, Vienna (Austria)

    2013-07-01

    The International Atomic Energy Agency (IAEA) has organized many international collaboration programs to support the development and validation of design and safety analysis computer codes for nuclear power plants. These programs are normally implemented within the framework of a Coordinated Research Project (CRP) or an International Collaborative Standard Problem (ICSP). This paper introduces CRPs and ICSPs currently being organized or recently completed by the IAEA for this purpose. (author)

  3. Joint ICTP-IAEA advanced workshop on model codes for spallation reactions

    International Nuclear Information System (INIS)

    Filges, D.; Leray, S.; Yariv, Y.; Mengoni, A.; Stanculescu, A.; Mank, G.

    2008-08-01

    The International Atomic Energy Agency (IAEA) and the Abdus Salam International Centre for Theoretical Physics (ICTP) organised an expert meeting at the ICTP from 4 to 8 February 2008 to discuss model codes for spallation reactions. These nuclear reactions play an important role in a wide domain of applications ranging from neutron sources for condensed matter and material studies, transmutation of nuclear waste and rare isotope production to astrophysics, simulation of detector set-ups in nuclear and particle physics experiments, and radiation protection near accelerators or in space. The simulation tools developed for these domains use nuclear model codes to compute the production yields and characteristics of all the particles and nuclei generated in these reactions. These codes are generally Monte-Carlo implementations of Intra-Nuclear Cascade (INC) or Quantum Molecular Dynamics (QMD) models, followed by de-excitation (principally evaporation/fission) models. Experts have discussed in depth the physics contained within the different models in order to understand their strengths and weaknesses. Such codes need to be validated against experimental data in order to determine their accuracy and reliability with respect to all forms of application. Agreement was reached during the course of the workshop to organise an international benchmark of the different models developed by different groups around the world. The specifications of the benchmark, including the set of selected experimental data to be compared to the models, were also defined during the workshop. The benchmark will be organised under the auspices of the IAEA in 2008, and the first results will be discussed at the next Accelerator Applications Conference (AccApp'09) to be held in Vienna in May 2009. (author)

  4. Analysis of L test series of ACE (Advanced Containment Experiments) project with modified corcon UW code

    International Nuclear Information System (INIS)

    Laguna Velasco, H.

    1994-01-01

    A series of experimental tests (the so-called L, or large-scale, tests) has been performed under the sponsorship of many research institutions around the world and managed by the Electric Power Research Institute in the U.S.A. The goal of these tests is to analyze the phenomena of core-concrete interaction under the same conditions as a severe accident in a light water nuclear reactor. The results of these tests provide experimental data about thermohydraulic phenomena and aerosol and fission product release. These results are used to improve the many codes that have already been developed to simulate core-concrete interaction during a severe accident; in this case, the CORCON.UW code is an improved version of CORCON MOD 2 developed at the University of Wisconsin. The scope of this work is to show results obtained with the improved CORCON.UW. The improvements consist of adding data on BaSiO3, Ba2SiO4, BaZrO3, SrSiO4 and SrZrO3, appending Kutateladze's heat transfer correlation, and finally making the solution of the energy equation system more efficient through the use of a better algorithm. The results obtained by this improved code for the downward power and the H2, H2O, CO and CO2 release agree with the experimental results, and it also saved 40% of the CPU consumption during execution, owing to the improved energy equation system. The conclusions are that the added thermodynamic data in CORCON.UW produce results that compare well with the experimental results, and that updating the heat transfer correlations and the algorithm yields a versatile code and reliable results. (Author)

  5. Advanced local dose rate calculations with the Monte Carlo code MCNP for plutonium nitrate storage containers

    International Nuclear Information System (INIS)

    Quade, U.

    1994-01-01

    Neutron and gamma dose rate calculations were performed for the storage containers filled with plutonium nitrate at the MOX fabrication facility of Siemens. For the particle transport calculations, the Monte Carlo code MCNP 4.2 was used. The calculated results were compared with experimental dose rate measurements. It can be stated that the choice of the code system was appropriate, since all aspects of the many facets of the problem were well reproduced in the calculations. The position dependency as well as the influence of the shieldings, the reflections and the mutual influences of the sources were well described by the calculations for both the gamma and the neutron dose rates. However, good agreement with the experimental results on the gamma dose rates could only be reached when the lead shielding of the detector was integrated into the geometry modelling of the calculations. For a few cases of thick shieldings and soft gamma ray sources, the statistics of the calculational results were not sufficient. In such cases, more elaborate variance reduction methods must be applied in future calculations. Thus the MCNP code in connection with NGSRC has been proven to be an effective tool for the solution of this type of problem. (orig./HP) [de

  6. Verification of the CONPAS (CONtainment Performance Analysis System) code package

    International Nuclear Information System (INIS)

    Kim, See Darl; Ahn, Kwang Il; Song, Yong Man; Choi, Young; Park, Soo Yong; Kim, Dong Ha; Jin, Young Ho.

    1997-09-01

    CONPAS is a computer code package that integrates the numerical, graphical, and results-oriented aspects of Level 2 probabilistic safety assessment (PSA) for nuclear power plants automatically, under a PC Windows environment. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor, (2) Computer, (3) Text Editor, and (4) Mechanistic Code Plotter. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty analysis, importance analysis, sensitivity analysis and data interpretation; reporting aspects including tabulation and graphics; and a user-friendly interface. The computational performance of CONPAS has been verified through a Level 2 PSA of a reference plant. The results of the CONPAS code were compared with an existing Level 2 PSA code (NUCAP+), and the comparison shows that CONPAS is appropriate for Level 2 PSA. (author). 9 refs., 8 tabs., 14 figs

  7. Advanced methodology to simulate boiling water reactor transient using coupled thermal-hydraulic/neutron-kinetic codes

    Energy Technology Data Exchange (ETDEWEB)

    Hartmann, Christoph Oliver

    2016-06-13

    Coupled thermal-hydraulic/neutron-kinetic (TH/NK) simulations of Boiling Water Reactor transients require well validated and accurate simulation tools. The generation of cross-section (XS) libraries, depending on the individual thermal-hydraulic state parameters, is of paramount importance for coupled simulations. Problem-dependent XS-sets for 3D core simulations are generated mainly by well validated, fast-running, user-friendly commercial lattice codes such as CASMO and HELIOS. In this dissertation a computational route, based on the lattice code SCALE6/TRITON, the cross-section interface GenPMAXS, the best-estimate thermal-hydraulic system code TRACE and the core simulator PARCS, for best-estimate simulations of Boiling Water Reactor (BWR) transients has been developed and validated. The computational route has been supplemented by a subsequent uncertainty and sensitivity study based on Monte Carlo sampling and propagation of the uncertainties of input parameters to the output (SUSA code). The analysis of a single BWR fuel assembly depletion problem with PARCS using SCALE/TRITON cross-sections has shown good agreement with the results obtained with CASMO cross-section sets. However, to compensate for the deficiencies of the interface program GenPMAXS, PYTHON scripts had to be developed to incorporate missing data, such as the yields of iodine, xenon and promethium, into the cross-section data sets (PMAXS format) generated by GenPMAXS from the SCALE/TRITON output. The results of the depletion analysis of a full BWR core with PARCS have indicated the importance of considering history effects and of adequate modeling of the reflector region and the control rods, as the PARCS simulations for depleted fuel and all control rods inserted (ARI) differ significantly at the fuel assembly top and bottom. Systematic investigations with the coupled codes TRACE/PARCS have been performed to analyse the core behaviour at different thermal conditions using nuclear data (XS

  8. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    Energy Technology Data Exchange (ETDEWEB)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.; Dewers, Thomas A.; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Wang, Yifeng; Schultz, Peter Andrew

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  9. Recent advances in extracellular vesicles enriched with non-coding RNAs related to cancers

    Directory of Open Access Journals (Sweden)

    Song Yang

    2018-03-01

    As membrane-bound structures that can be shed by a parental cell, fuse with other cells after shedding, and then release their contents, extracellular vesicles (EVs) are considered an indispensable part of the intercellular communication system. EV contents can include all kinds of bioactive molecules, including non-coding RNAs (ncRNAs), a large and complex group of RNAs with various subtypes that function to regulate biological events but classically do not code for proteins. In this review we cover recently published work that has validated the underlying molecular mechanisms regulating the biogenesis and signaling of EV-associated ncRNAs, particularly their systemic bio-effects related to the various stages of cancer progression, and the clinical potential of ncRNA-carrying EVs as diagnostic biomarkers and as a drug-delivery system being engineered for better loading and targeting capacity. Our views on the future direction of basic research and applications of EVs containing ncRNAs are also shared.

  10. ASSERT-PV 3.2: Advanced subchannel thermalhydraulics code for CANDU fuel bundles

    International Nuclear Information System (INIS)

    Rao, Y.F.; Cheng, Z.; Waddington, G.M.; Nava-Dominguez, A.

    2014-01-01

    Highlights: • Introduction to a new version of the Canadian subchannel code, ASSERT-PV 3.2. • Enhanced models for flow-distribution, CHF and post-dryout heat transfer prediction. • Model changes focused on unique features of horizontal CANDU bundles. • Detailed description of model changes for all major thermalhydraulics models. • Discussion on rationale and limitation of the model changes. - Abstract: Atomic Energy of Canada Limited (AECL) has developed the subchannel thermalhydraulics code ASSERT-PV for the Canadian nuclear industry. The most recent release version, ASSERT-PV 3.2, has enhanced phenomenon models for improved predictions of flow distribution, dryout power and CHF location, and post-dryout (PDO) sheath temperature in horizontal CANDU fuel bundles. The focus of the improvements is mainly on modeling considerations for the unique features of CANDU bundles, such as horizontal flows, small pitch-to-diameter ratios, high mass fluxes, and mixed and irregular subchannel geometries, compared to PWR/BWR fuel assemblies. This paper provides a general introduction to ASSERT-PV 3.2, and describes the model changes or additions in the new version to improve predictions of flow distribution, dryout power and CHF location, and PDO sheath temperatures in CANDU fuel bundles

  11. The MICHELLE 2D/3D ES PIC Code Advances and Applications

    CERN Document Server

    Petillo, John; De Ford, John F; Dionne, Norman J; Eppley, Kenneth; Held, Ben; Levush, Baruch; Nelson, Eric M; Panagos, Dimitrios; Zhai, Xiaoling

    2005-01-01

    MICHELLE is a new 2D/3D steady-state and time-domain particle-in-cell (PIC) code that employs electrostatic and now magnetostatic finite-element field solvers. The code has been used to design and analyze a wide variety of devices that includes multistage depressed collectors, gridded guns, multibeam guns, annular-beam guns, sheet-beam guns, beam-transport sections, and ion thrusters. The latest additions to the MICHELLE/Voyager tool are as follows: 1) a prototype 3D self-magnetic field solver using the curl-curl finite-element formulation for the magnetic vector potential, employing edge basis functions and accumulating current with MICHELLE's new unstructured grid particle tracker, 2) the electrostatic field solver now accommodates dielectric media, 3) periodic boundary conditions are now functional on all grids, not just structured grids, 4) the addition of a global optimization module to the user interface where electrical parameters (such as electrode voltages) can be optimized, and 5) adaptive mesh ref...

  12. ASSERT-PV 3.2: Advanced subchannel thermalhydraulics code for CANDU fuel bundles

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Y.F., E-mail: raoy@aecl.ca; Cheng, Z., E-mail: chengz@aecl.ca; Waddington, G.M., E-mail: waddingg@aecl.ca; Nava-Dominguez, A., E-mail: navadoma@aecl.ca

    2014-08-15

    Highlights: • Introduction to a new version of the Canadian subchannel code, ASSERT-PV 3.2. • Enhanced models for flow-distribution, CHF and post-dryout heat transfer prediction. • Model changes focused on unique features of horizontal CANDU bundles. • Detailed description of model changes for all major thermalhydraulics models. • Discussion on rationale and limitation of the model changes. - Abstract: Atomic Energy of Canada Limited (AECL) has developed the subchannel thermalhydraulics code ASSERT-PV for the Canadian nuclear industry. The most recent release version, ASSERT-PV 3.2, has enhanced phenomenon models for improved predictions of flow distribution, dryout power and CHF location, and post-dryout (PDO) sheath temperature in horizontal CANDU fuel bundles. The focus of the improvements is mainly on modeling considerations for the unique features of CANDU bundles, such as horizontal flows, small pitch-to-diameter ratios, high mass fluxes, and mixed and irregular subchannel geometries, compared to PWR/BWR fuel assemblies. This paper provides a general introduction to ASSERT-PV 3.2, and describes the model changes or additions in the new version to improve predictions of flow distribution, dryout power and CHF location, and PDO sheath temperatures in CANDU fuel bundles.

  13. Performance and Complexity Co-evaluation of the Advanced Video Coding Standard for Cost-Effective Multimedia Communications

    Directory of Open Access Journals (Sweden)

    Saponara Sergio

    2004-01-01

    The advanced video codec (AVC) standard, recently defined by a joint video team (JVT) of ITU-T and ISO/IEC, is introduced in this paper together with its performance and complexity co-evaluation. While the basic framework is similar to the motion-compensated hybrid scheme of previous video coding standards, additional tools improve the compression efficiency at the expense of an increased implementation cost. As a first step to bridge the gap between the algorithmic design of a complex multimedia system and its cost-effective realization, a high-level co-evaluation approach is proposed and applied to a real-life AVC design. An exhaustive analysis of the codec compression efficiency versus complexity (memory and computational costs) project space is carried out at the early algorithmic design phase. If all new coding features are used, the improved AVC compression efficiency (up to 50% compared to current video coding technology) comes with a complexity increase of a factor of 2 for the decoder and of more than one order of magnitude for the encoder. This represents a challenge for resource-constrained multimedia systems such as wireless devices or high-volume consumer electronics. The analysis also highlights important properties of the AVC framework allowing for complexity reduction at the high system level: when combining the new coding features, the implementation complexity accumulates, while the global compression efficiency saturates. Thus, a proper use of the AVC tools maintains the same performance as the most complex configuration while considerably reducing complexity. The reported results provide inputs to assist the profile definition in the standard, highlight the AVC bottlenecks, and select optimal trade-offs between algorithmic performance and complexity.

  14. An advanced frequency-domain code for boiling water reactor (BWR) stability analysis and design

    International Nuclear Information System (INIS)

    Behrooz, A.

    2008-01-01

    Two-phase flow instability is of interest for the design and operation of many industrial systems such as boiling water reactors (BWRs), chemical reactors, and steam generators. In the case of BWRs, the flow instabilities are coupled to the power instabilities via neutronic-thermal-hydraulic feedback. Since these instabilities also produce local pressure oscillations, coolant flashing plays a very important role at low pressure. Many frequency-domain codes have been used for two-phase flow stability analysis of thermal-hydraulic industrial systems, with particular emphasis on BWRs. Some ignored the effect of the local pressure or the effect of 3D power oscillations, and many were not able to deal with the coupled neutronics-thermal-hydraulics problem considering the entire core and all its fuel assemblies. The new frequency-domain tool uses the best available nuclear, thermal-hydraulic, algebraic and control theory methods for simulating BWRs and analyzing their stability in either off-line or on-line fashion. The novel code takes all necessary information from plant files via an interface, solves and integrates, for all reactor fuel assemblies divided into a number of segments, the thermal-hydraulic non-homogeneous non-equilibrium coupled linear differential equations, and solves the 3D, two-energy-group diffusion equations for the entire core (with spatial expansion of the neutron fluxes in Legendre polynomials). It is important to note that the neutronics equations written in terms of flux harmonics for a discretized system (nodal-modal equations) generate a set of large sparse matrices. The eigenvalue problem associated with the discretized core statics equations is solved by an implementation of the implicitly restarted Arnoldi method (IRAM) with an implicit shifted QR mechanism. The results of the steady state are then used for the calculation of the local transfer functions and system transfer matrices. The latter are large, dense, complex matrices (their size
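
    The record states that the eigenvalue problem of the discretized core statics equations is solved with the implicitly restarted Arnoldi method (IRAM). Purely as an illustration of that numerical technique (not of the code itself), the ARPACK implementation of IRAM exposed by SciPy can extract the dominant eigenvalue of a generalized problem of the form M·phi = (1/k)·F·phi; the matrices below are small stand-ins, not reactor data.

```python
# Sketch: dominant eigenvalue of a diffusion-like generalized problem
# F phi = k M phi, solved with ARPACK's implicitly restarted Arnoldi method
# (scipy.sparse.linalg.eigs). Matrices here are illustrative stand-ins only.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200                                            # toy 1-D mesh size
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
M = sp.diags([off, main, off], [-1, 0, 1], format="csc")   # losses (leakage + removal)
F = sp.identity(n, format="csc") * 1.5                     # production term (toy)

# Apply A = M^{-1} F matrix-free and ask IRAM for the largest-magnitude eigenvalue.
solve_M = spla.factorized(M)
A = spla.LinearOperator((n, n), matvec=lambda x: solve_M(F @ x))
kvals, modes = spla.eigs(A, k=1, which="LM")
print("dominant eigenvalue of M^-1 F (toy problem):", kvals[0].real)
```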

  15. Advanced Neutron Source Dynamic Model (ANSDM) code description and user guide

    International Nuclear Information System (INIS)

    March-Leuba, J.

    1995-08-01

    A mathematical model is designed that simulates the dynamic behavior of the Advanced Neutron Source (ANS) reactor. Its main objective is to model important characteristics of the ANS systems as they are being designed, updated, and employed; its primary design goal, to aid in the development of safety and control features. During the simulations the model is also found to aid in making design decisions for thermal-hydraulic systems. Model components, empirical correlations, and model parameters are discussed; sample procedures are also given. Modifications are cited, and significant development and application efforts are noted focusing on examination of instrumentation required during and after accidents to ensure adequate monitoring during transient conditions

  16. Nonlinear dynamics of laser systems with elements of a chaos: Advanced computational code

    Science.gov (United States)

    Buyadzhi, V. V.; Glushkov, A. V.; Khetselius, O. Yu; Kuznetsova, A. A.; Buyadzhi, A. A.; Prepelitsa, G. P.; Ternovsky, V. B.

    2017-10-01

    A general, uniform chaos-geometric computational approach to the analysis, modelling and prediction of the non-linear dynamics of quantum and laser systems (laser and quantum generator systems, etc.) with elements of deterministic chaos is briefly presented. The approach is based on advanced generalized techniques such as wavelet analysis, multi-fractal formalism, the mutual information approach, correlation integral analysis, the false nearest neighbour algorithm, Lyapunov exponent analysis, the surrogate data method, and prediction models. Numerical data are presented for the first time on the topological and dynamical invariants (in particular, the correlation, embedding and Kaplan-Yorke dimensions, the Lyapunov exponents, the Kolmogorov entropy and other parameters) of the dynamics of a laser system (a semiconductor GaAs/GaAlAs laser with retarded feedback) in chaotic and hyperchaotic regimes.

  17. Development of a Code for the Long Term Radiological Safety Assessment of Radioactive Wastes from Advanced Nuclear Fuel Cycle Facilities in Republic of Korea

    International Nuclear Information System (INIS)

    Hwang, Yong Soo

    2010-01-01

    For the purpose of evaluating annual individual doses from a potential repository for radioactive wastes arising from the operation of the prospective advanced nuclear fuel cycle facilities in Korea, a new safety assessment code based on GoldSim has been developed. It was designed to compare the environmental impacts of various fuel cycle options such as direct disposal and wet and dry recycling. The code, which is based on compartment theory, can be applied to assess both normal and what-if scenarios
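
    Since the record describes a compartment-theory code built on GoldSim, a minimal sketch of the underlying idea is given below: two compartments exchanging a radionuclide through first-order transfer while it decays. All transfer rates, the inventory and the nuclide half-life are hypothetical placeholders; the actual safety assessment code is far more detailed.

```python
# Minimal compartment-model sketch: near field -> far field -> biosphere sink,
# with first-order transfer rates and radioactive decay (all values hypothetical).
import numpy as np
from scipy.integrate import solve_ivp

lam = np.log(2) / 3.0e5          # decay constant [1/yr] (hypothetical nuclide)
k12, k2out = 1.0e-4, 5.0e-5      # inter-compartment transfer rates [1/yr] (hypothetical)

def rhs(t, y):
    n1, n2 = y                   # inventories in compartments 1 and 2 [mol]
    dn1 = -(lam + k12) * n1
    dn2 = k12 * n1 - (lam + k2out) * n2
    return [dn1, dn2]

sol = solve_ivp(rhs, (0.0, 1.0e6), y0=[1.0, 0.0], rtol=1e-8, atol=1e-12)
release_rate = k2out * sol.y[1, -1]          # flux to the biosphere at 1 Myr
print(f"Release rate at 1 Myr: {release_rate:.3e} mol/yr")
```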

  18. Measurement and modeling of advanced coal conversion processes

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G.; Smoot, L.D.; Brewster, B.S. (Advanced Fuel Research, Inc., East Hartford, CT (United States) Brigham Young Univ., Provo, UT (United States))

    1991-01-01

    The overall objective of this program is the development of predictive capability for the design, scale-up, simulation, control and feedstock evaluation in advanced coal conversion devices. This program will merge significant advances made in measuring and quantitatively describing the mechanisms of coal conversion behavior into comprehensive computer codes for mechanistic modeling of entrained-bed gasification. Additional capabilities in predicting pollutant formation will be implemented and the technology will be expanded to fixed-bed reactors.

  19. Probable mode prediction for H.264 advanced video coding P slices using removable SKIP mode distortion estimation

    Science.gov (United States)

    You, Jongmin; Jeong, Jechang

    2010-02-01

    The H.264/AVC (advanced video coding) standard is used in a wide variety of applications, including digital broadcasting and mobile applications, because of its high compression efficiency. The variable block mode scheme in H.264/AVC contributes much to its high compression efficiency but causes a selection problem. In general, rate-distortion optimization (RDO) is the optimal mode selection strategy, but it is computationally intensive. For this reason, the H.264/AVC encoder requires a fast mode selection algorithm for use in applications that require low-power and real-time processing. A probable mode prediction algorithm for the H.264/AVC encoder is proposed. To reduce the computational complexity of RDO, the proposed method selects probable modes among all allowed block modes using removable SKIP mode distortion estimation. The removable SKIP mode distortion is used to estimate whether or not a further divided block mode is appropriate for a macroblock. It is calculated using a no-motion reference block with a few computations. The proposed method then reduces complexity by performing the RDO process only for the probable modes. Experimental results show that the proposed algorithm can reduce encoding time by an average of 55.22% without significant visual quality degradation or bit rate increase.
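
    A toy sketch of the quantity the abstract describes, a SKIP-mode distortion estimated from the co-located (no-motion) block in the reference frame via a sum of absolute differences (SAD), is shown below. The threshold test is an illustrative assumption, not the authors' actual probable-mode decision rule.

```python
# Sketch: SKIP-mode distortion estimated as the SAD between a 16x16 macroblock
# and its co-located (no-motion) block in the reference frame. The threshold
# test is illustrative; the paper's probable-mode decision logic is richer.
import numpy as np

def skip_mode_distortion(cur_frame, ref_frame, mb_x, mb_y, mb_size=16):
    cur = cur_frame[mb_y:mb_y + mb_size, mb_x:mb_x + mb_size].astype(np.int32)
    ref = ref_frame[mb_y:mb_y + mb_size, mb_x:mb_x + mb_size].astype(np.int32)
    return int(np.abs(cur - ref).sum())      # SAD, no motion search needed

rng = np.random.default_rng(0)
cur = rng.integers(0, 256, (64, 64), dtype=np.uint8)
ref = np.clip(cur.astype(np.int16) + rng.integers(-3, 4, (64, 64)), 0, 255).astype(np.uint8)

sad = skip_mode_distortion(cur, ref, mb_x=16, mb_y=32)
THRESH = 512                                 # hypothetical decision threshold
print("Likely SKIP/large-partition mode" if sad < THRESH else "Try finer partitions")
```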

  20. Assessing uncertainty in mechanistic models

    Science.gov (United States)

    Edwin J. Green; David W. MacFarlane; Harry T. Valentine

    2000-01-01

    Concern over potential global change has led to increased interest in the use of mechanistic models for predicting forest growth. The rationale for this interest is that empirical models may be of limited usefulness if environmental conditions change. Intuitively, we expect that mechanistic models, grounded as far as possible in an understanding of the biology of tree...

  1. Challenge problem and milestones for : Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC).

    Energy Technology Data Exchange (ETDEWEB)

    Freeze, Geoffrey A.; Wang, Yifeng; Howard, Robert; McNeish, Jerry A.; Schultz, Peter Andrew; Arguello, Jose Guadalupe, Jr.

    2010-09-01

    This report describes the specification of a challenge problem and associated challenge milestones for the Waste Integrated Performance and Safety Codes (IPSC) supporting the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The NEAMS challenge problems are designed to demonstrate proof of concept and progress towards IPSC goals. The goal of the Waste IPSC is to develop an integrated suite of modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. To demonstrate proof of concept and progress towards these goals and requirements, a Waste IPSC challenge problem is specified that includes coupled thermal-hydrologic-chemical-mechanical (THCM) processes that describe (1) the degradation of a borosilicate glass waste form and the corresponding mobilization of radionuclides (i.e., the processes that produce the radionuclide source term), (2) the associated near-field physical and chemical environment for waste emplacement within a salt formation, and (3) radionuclide transport in the near field (i.e., through the engineered components - waste form, waste package, and backfill - and the immediately adjacent salt). The initial details of a set of challenge milestones that collectively comprise the full challenge problem are also specified.

  2. Challenge problem and milestones for: Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC)

    International Nuclear Information System (INIS)

    Freeze, Geoffrey A.; Wang, Yifeng; Howard, Robert; McNeish, Jerry A.; Schultz, Peter Andrew; Arguello, Jose Guadalupe Jr.

    2010-01-01

    This report describes the specification of a challenge problem and associated challenge milestones for the Waste Integrated Performance and Safety Codes (IPSC) supporting the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The NEAMS challenge problems are designed to demonstrate proof of concept and progress towards IPSC goals. The goal of the Waste IPSC is to develop an integrated suite of modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. To demonstrate proof of concept and progress towards these goals and requirements, a Waste IPSC challenge problem is specified that includes coupled thermal-hydrologic-chemical-mechanical (THCM) processes that describe (1) the degradation of a borosilicate glass waste form and the corresponding mobilization of radionuclides (i.e., the processes that produce the radionuclide source term), (2) the associated near-field physical and chemical environment for waste emplacement within a salt formation, and (3) radionuclide transport in the near field (i.e., through the engineered components - waste form, waste package, and backfill - and the immediately adjacent salt). The initial details of a set of challenge milestones that collectively comprise the full challenge problem are also specified.

  3. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ''XSOR''. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms

  4. Adaption, validation and application of advanced codes with 3-dimensional neutron kinetics for accident analysis calculations - STC with Bulgaria

    International Nuclear Information System (INIS)

    Grundmann, U.; Kliem, S.; Mittag, S.; Rohde, U.; Seidel, A.; Panayotov, D.; Ilieva, B.

    2001-08-01

    In the frame of a project on scientific-technical co-operation funded by BMBF/BMWi, the program code DYN3D and the coupled code ATHLET-DYN3D have been transferred to the Institute for Nuclear Research and Nuclear Energy (INRNE) Sofia. The coupled code represents an implementation of the 3D core model DYN3D, developed by FZR, into the GRS thermal-hydraulics code system ATHLET. For the purpose of validating these codes, a measurement database on a start-up experiment performed at unit 6 of Kozloduy NPP (VVER-1000/V-320) has been generated. The results of the validation calculations were compared with measured values from the database. A simplified model for estimating cross-flow mixing between fuel assemblies has been implemented into the program code DYN3D by Bulgarian experts. Using this cross-flow model, transient processes with asymmetrical boundary conditions can be analysed more realistically. The validation of the implemented model was performed by means of comparison calculations between the modified DYN3D code and the thermal-hydraulics code COBRA-4I, and also on the basis of the measurement data collected from Kozloduy NPP. (orig.) [de

  5. Advanced GF(32) nonbinary LDPC coded modulation with non-uniform 9-QAM outperforming star 8-QAM.

    Science.gov (United States)

    Liu, Tao; Lin, Changyu; Djordjevic, Ivan B

    2016-06-27

    In this paper, we first describe a 9-symbol non-uniform signaling scheme based on Huffman coding, in which different symbols are transmitted with different probabilities. By using the Huffman procedure, a prefix code is designed to approach the optimal performance. Then, we introduce an algorithm to determine the optimal signal constellation sets for our proposed non-uniform scheme with the criterion of maximizing the constellation figure of merit (CFM). The proposed non-uniform polarization-multiplexed 9-QAM signaling scheme has the same spectral efficiency as conventional 8-QAM. Additionally, we propose a specially designed GF(32) nonbinary quasi-cyclic LDPC code for the coded modulation system based on the 9-QAM non-uniform scheme. Further, we study the efficiency of our proposed non-uniform 9-QAM combined with nonbinary LDPC coding, and demonstrate by Monte Carlo simulation that the proposed GF(32) nonbinary LDPC coded 9-QAM scheme outperforms nonbinary LDPC coded uniform 8-QAM by at least 0.8 dB.
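
    The scheme relies on the Huffman procedure to build a prefix code that maps bits to the nine constellation symbols with non-uniform probabilities. A generic Huffman construction over nine symbols is sketched below; the probabilities are placeholders rather than the paper's optimized distribution.

```python
# Sketch: Huffman prefix-code construction for a 9-symbol source, as used in
# probabilistic (non-uniform) signaling. Symbol probabilities are placeholders.
import heapq
from itertools import count

def huffman(probabilities):
    """Return {symbol: codeword} for a dict of symbol probabilities."""
    tie = count()                           # tie-breaker so heapq never compares dicts
    heap = [(p, next(tie), {s: ""}) for s, p in probabilities.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tie), merged))
    return heap[0][2]

# Placeholder 9-symbol distribution (sums to 1); shorter codewords end up on
# the more probable, lower-energy constellation points.
probs = {f"s{i}": p for i, p in enumerate(
    [0.25, 0.15, 0.15, 0.10, 0.10, 0.10, 0.05, 0.05, 0.05])}
code = huffman(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(code, f"average length = {avg_len:.2f} bits/symbol")
```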

  6. Measurement and modeling of advanced coal conversion processes. Annual report, October 1990--September 1991

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G.; Smoot, L.D.; Brewster, B.S. [Advanced Fuel Research, Inc., East Hartford, CT (United States)]|[Brigham Young Univ., Provo, UT (United States)

    1991-12-31

    The overall objective of this program is the development of predictive capability for the design, scale-up, simulation, control and feedstock evaluation in advanced coal conversion devices. This program will merge significant advances made in measuring and quantitatively describing the mechanisms of coal conversion behavior into comprehensive computer codes for mechanistic modeling of entrained-bed gasification. Additional capabilities in predicting pollutant formation will be implemented and the technology will be expanded to fixed-bed reactors.

  7. Development of LIFE4-CN: a combined code for steady-state and transient analyses of advanced LMFBR fuels

    International Nuclear Information System (INIS)

    Liu, Y.Y.; Zawadzki, S.; Billone, M.C.; Nayak, U.P.; Roth, T.

    1979-01-01

    The methodology used to develop the LMFBR carbide/nitride fuels code, LIFE4-CN, is described in detail along with some subtleties encountered in code development. Fuel primary and steady-state thermal creep have been used as an example to illustrate the need for physical modeling and the need to recognize the importance of the materials characteristics. A self-consistent strategy for LIFE4-CN verification against irradiation data has been outlined with emphasis on the establishment of the gross uncertainty bands. These gross uncertainty bands can be used as an objective measure to gauge the overall success of the code predictions. Preliminary code predictions for sample steady-state and transient cases are given

  8. Advanced burnup calculation code system in a subcritical state with continuous-energy Monte Carlo code for fusion-fission hybrid reactor

    International Nuclear Information System (INIS)

    Matsunaka, Masayuki; Ohta, Masayuki; Miyamaru, Hiroyuki; Murata, Isao

    2009-01-01

    The fusion-fission (FF) hybrid reactor is a promising energy source that is thought to act as a bridge between the existing fission reactor and the genuine fusion reactor in the future. A burnup calculation system aiming at precise burnup calculations of a subcritical system was developed for the detailed design of the FF hybrid reactor; the system consists of the MCNP, ORIGEN, and post-processing codes. In the present study, the calculation system was substantially modified to improve both the calculation accuracy and the calculation speed. The reaction rate estimation can be carried out accurately with the present system, which uses track-length (TL) data in the continuous-energy treatment. As for the speed-up of the reaction rate calculation, a new TL data bunching scheme was developed so that only the necessary TL data are used as long as the accuracy of the point-wise nuclear data is conserved. An example analysis with the present system for our proposed FF hybrid reactor is described, showing that the computation time could indeed be reduced while maintaining the same accuracy as before. (author)
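
    The system couples Monte Carlo reaction-rate estimates with a depletion solver. The depletion step itself amounts to integrating the Bateman equations dN/dt = A·N over a burnup interval; a toy two-nuclide sketch using a matrix exponential is given below. The chain, cross-section, flux and decay constant are hypothetical, and ORIGEN's solver and libraries are far more elaborate.

```python
# Sketch of a single depletion step: solve dN/dt = A N over a time step with a
# matrix exponential. Two-nuclide toy chain (capture parent -> decaying daughter);
# cross-section, flux and decay constant are hypothetical, not ORIGEN data.
import numpy as np
from scipy.linalg import expm

phi = 3.0e14 * 1.0e-24            # flux [1/(barn*s)] so sigma*phi has units 1/s
sigma_c_parent = 2.7              # capture cross-section [barn] (hypothetical)
lam_daughter = 1.0e-6             # daughter decay constant [1/s] (hypothetical)

A = np.array([[-sigma_c_parent * phi,          0.0],
              [ sigma_c_parent * phi, -lam_daughter]])

N0 = np.array([1.0e24, 0.0])      # initial atom densities [atoms/cm^3]
dt = 30 * 24 * 3600.0             # 30-day burnup step [s]
N1 = expm(A * dt) @ N0            # nuclide vector at the end of the step
print("parent, daughter after 30 days:", N1)
```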

  9. Fuel swelling importance in PCI mechanistic modelling

    International Nuclear Information System (INIS)

    Arimescu, V.I.

    2005-01-01

    Under certain conditions, fuel pellet swelling is the most important factor in determining the intensity of the pellet-to-cladding mechanical interaction (PCMI). This is especially true during power ramps, which lead to a temperature increase to a higher terminal plateau that is maintained for hours. The time-dependent gaseous swelling is proportional to temperature and is also enhanced by the increased gas atom migration to the grain boundary during the power ramp. On the other hand, gaseous swelling is inhibited by a compressive hydrostatic stress in the pellet. Therefore, PCMI is the net result of combining gaseous swelling and pellet thermal expansion with the opposing feedback from the cladding mechanical reaction. The coupling of the thermal and mechanical processes, mentioned above, with various feedback loops is best simulated by a mechanistic fuel code. This paper discusses a mechanistic swelling model that is coupled with a fission gas release model as well as a mechanical model of the fuel pellet. The role of fuel swelling is demonstrated for typical power ramps at different burn-ups. Also, fuel swelling plays a significant role in avoiding the thermal instability for larger gap fuel rods, by limiting the potentially exponentially increasing gap due to the positive feedback loop effect of increasing fission gas release and the associated over-pressure inside the cladding. (author)

  10. TITAN: an advanced three-dimensional neutronics/thermal-hydraulics code for light water reactor safety analysis

    International Nuclear Information System (INIS)

    Griggs, D.P.; Kazimi, M.S.; Henry, A.F.

    1982-01-01

    The initial development of TITAN, a three-dimensional coupled neutronics/thermal-hydraulics code for LWR safety analysis, has been completed. The transient neutronics code QUANDRY has been joined to the two-fluid thermal-hydraulics code THERMIT with the appropriate feedback mechanisms modeled. A detailed steady-state and transient coupling scheme based on the tandem technique was implemented in accordance with the important structural and operational characteristics of QUANDRY and THERMIT. A two channel sample problem formed the basis for steady-state and transient analyses performed with TITAN. TITAN steady-state results were compared with those obtained with MEKIN and showed good agreement. Null transients, simulated turbine trip transients, and a rod withdrawal transient were analyzed with TITAN and reasonable results were obtained
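
    TITAN couples the neutronics and thermal-hydraulics solutions through a tandem scheme with feedback. The skeleton below illustrates only the generic idea of such a coupling, a fixed-point (Picard) iteration between a neutronics solve and a thermal-hydraulics solve; the two solver functions, their feedback coefficients and the convergence test are placeholders, not TITAN's actual models or interfaces.

```python
# Generic tandem (fixed-point) coupling skeleton between a neutronics solver and
# a thermal-hydraulics solver, iterated to convergence. Both solver functions
# and the tolerance are placeholders, not the interfaces of QUANDRY or THERMIT.

def solve_neutronics(fuel_temp, cool_density):        # placeholder feedback model
    return 1.0e2 * (1.0 + 0.001 * (900.0 - fuel_temp) + 0.5 * (cool_density - 0.7))

def solve_thermal_hydraulics(power):                  # placeholder physics
    fuel_temp = 600.0 + 3.0 * power                   # [K]
    cool_density = 0.75 - 1.0e-4 * power              # [g/cm^3]
    return fuel_temp, cool_density

def coupled_step(power, tol=1e-6, max_iter=50):
    for it in range(max_iter):
        fuel_temp, cool_density = solve_thermal_hydraulics(power)
        new_power = solve_neutronics(fuel_temp, cool_density)
        if abs(new_power - power) < tol * abs(power):
            return new_power, fuel_temp, cool_density, it + 1
        power = new_power
    raise RuntimeError("tandem iteration did not converge")

print(coupled_step(power=100.0))
```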

  11. Experimental and Thermalhydraulic Code Assessment of the Transient Behavior of the Passive Condenser System in an Advanced Boiling Water Reactor

    Energy Technology Data Exchange (ETDEWEB)

    S.T. Revankar; W. Zhou; Gavin Henderson

    2008-07-08

    The main goal of the project was to study analytically and experimentally the condensation heat transfer for a passive condenser system such as that of the GE Economic Simplified Boiling Water Reactor (ESBWR). The effects of noncondensable gas in the condenser tube and of the reduction of the secondary pool water level on the condensation heat transfer coefficient were the main focus of this research. The objectives of this research were to: (1) obtain experimental data on the local and tube-averaged condensation heat transfer rates for the PCCS with noncondensables and with changes in the secondary pool water level, (2) assess the RELAP5 and TRACE computer codes against the experimental data, and (3) develop a mathematical model and heat transfer correlation for the condensation phenomena for system code application. The project involves experimentation, theoretical model development and verification, and thermal-hydraulic code assessment.

  12. Experimental and Thermalhydraulic Code Assessment of the Transient Behavior of the Passive Condenser System in an Advanced Boiling Water Reactor

    International Nuclear Information System (INIS)

    S.T. Revankar; W. Zhou; Gavin Henderson

    2008-01-01

    The main goal of the project was to study analytically and experimentally the condensation heat transfer for a passive condenser system such as that of the GE Economic Simplified Boiling Water Reactor (ESBWR). The effects of noncondensable gas in the condenser tube and of the reduction of the secondary pool water level on the condensation heat transfer coefficient were the main focus of this research. The objectives of this research were to: (1) obtain experimental data on the local and tube-averaged condensation heat transfer rates for the PCCS with noncondensables and with changes in the secondary pool water level, (2) assess the RELAP5 and TRACE computer codes against the experimental data, and (3) develop a mathematical model and heat transfer correlation for the condensation phenomena for system code application. The project involves experimentation, theoretical model development and verification, and thermal-hydraulic code assessment

  13. Validation of Advanced Computer Codes for VVER Technology: LB-LOCA Transient in PSB-VVER Facility

    Directory of Open Access Journals (Sweden)

    A. Del Nevo

    2012-01-01

    The OECD/NEA PSB-VVER project provided unique and useful experimental data for code validation from the PSB-VVER test facility. This facility represents a scaled-down layout of the Russian-designed pressurized water reactor, namely the VVER-1000. Five experiments were executed, dealing with loss of coolant scenarios (small, intermediate, and large break loss of coolant accidents), a primary-to-secondary leak, and a parametric study (a natural circulation test) aimed at characterizing the VVER system at reduced mass inventory conditions. The comparative analysis presented in the paper regards the large break loss of coolant accident experiment. Four participants from three different institutions were involved in the benchmark and applied their own models and set-ups for four different thermal-hydraulic system codes. The benchmark demonstrated the performance of such codes in predicting phenomena relevant to safety on the basis of fixed criteria.

  14. Generative mechanistic explanation building in undergraduate molecular and cellular biology

    Science.gov (United States)

    Southard, Katelyn M.; Espindola, Melissa R.; Zaepfel, Samantha D.; Bolger, Molly S.

    2017-09-01

    When conducting scientific research, experts in molecular and cellular biology (MCB) use specific reasoning strategies to construct mechanistic explanations for the underlying causal features of molecular phenomena. We explored how undergraduate students applied this scientific practice in MCB. Drawing from studies of explanation building among scientists, we created and applied a theoretical framework to explore the strategies students use to construct explanations for 'novel' biological phenomena. Specifically, we explored how students navigated the multi-level nature of complex biological systems using generative mechanistic reasoning. Interviews were conducted with introductory and upper-division biology students at a large public university in the United States. Results of qualitative coding revealed key features of students' explanation building. Students used modular thinking to consider the functional subdivisions of the system, which they 'filled in' to varying degrees with mechanistic elements. They also hypothesised the involvement of mechanistic entities and instantiated abstract schema to adapt their explanations to unfamiliar biological contexts. Finally, we explored the flexible thinking that students used to hypothesise the impact of mutations on multi-leveled biological systems. Results revealed a number of ways that students drew mechanistic connections between molecules, functional modules (sets of molecules with an emergent function), cells, tissues, organisms and populations.

  15. Assessment of United States industry structural codes and standards for application to advanced nuclear power reactors: Appendices. Volume 2

    International Nuclear Information System (INIS)

    Adams, T.M.; Stevenson, J.D.

    1995-10-01

    Throughout its history, the USNRC has remained committed to the use of industry consensus standards for the design, construction, and licensing of commercial nuclear power facilities. The existing industry standards are based on the current class of light water reactors and as such may not adequately address design and construction features of the next generation of Advanced Light Water Reactors and other types of Advanced Reactors. As part of its on-going commitment to industry standards, the USNRC commissioned this study to evaluate US industry structural standards for application to Advanced Light Water Reactors and Advanced Reactors. The initial review effort included (1) the review and study of the relevant reactor design basis documentation for eight Advanced Light Water Reactor and Advanced Reactor designs, (2) the review of the USNRC's design requirements for advanced reactors, (3) the review of the latest revisions of the relevant industry consensus structural standards, and (4) the identification of the need for changes to these standards. The results of these studies were used to develop recommended changes to industry consensus structural standards which will be used in the construction of Advanced Light Water Reactors and Advanced Reactors. Over seventy sets of proposed standard changes were recommended and the need for the development of four new structural standards was identified. In addition to the recommended standard changes, several other sets of information and data were extracted for use by the USNRC in other on-going programs. This information included (1) detailed observations on the response of structures and distribution system supports to the recent Northridge, California (1994) and Kobe, Japan (1995) earthquakes, (2) comparison of versions of certain standards cited in the standard review plan to the most current versions, and (3) comparison of the seismic and wind design basis for all the subject reactor designs

  16. A 3-D CFD approach to the mechanistic prediction of forced convective critical heat flux at low quality

    International Nuclear Information System (INIS)

    Jean-Marie Le Corre; Cristina H Amon; Shi-Chune Yao

    2005-01-01

    Full text of publication follows: The prediction of the Critical Heat Flux (CHF) in a heat-flux-controlled boiling heat exchanger is important to assess the maximal thermal capability of the system. In the case of a nuclear reactor, a CHF margin gain (using an improved mixing vane grid design, for instance) can allow power up-rates and enhanced operating flexibility. In general, current nuclear core design procedures use a quasi-1D approach to model the coolant thermal-hydraulic conditions within the fuel bundles, coupled with fully empirical CHF prediction methods. In addition, several CHF mechanistic models have been developed in the past and coupled with 1D and quasi-1D thermal-hydraulic codes. These mechanistic models have demonstrated reasonable CHF prediction characteristics and, more remarkably, correct parametric trends over a wide range of fluid conditions. However, since the phenomena leading to CHF are localized near the heater, models are needed to relate local quantities of interest to area-averaged quantities. As a consequence, large CHF prediction uncertainties may be introduced and 3D fluid characteristics (such as swirling flow) cannot be accounted for properly. Therefore, a fully mechanistic approach to CHF prediction is, in general, not possible using the current approach. The development of CHF-enhanced fuel assembly designs requires the use of more advanced 3D coolant property computations coupled with CHF mechanistic modeling. In the present work, the commercial CFD code CFX-5 is used to compute 3D coolant conditions in a vertical heated tube with upward flow. Several low-quality CHF mechanistic models available in the literature are coupled with the CFD code by developing adequate models relating local coolant properties to the local parameters of interest for CHF prediction. The prediction performance of these models is assessed using CHF databases available in the open literature and the 1995 CHF look-up table. Since CFD can reasonably capture 3D fluid

  17. Advances in Monte-Carlo code TRIPOLI-4®’s treatment of the electromagnetic cascade

    Directory of Open Access Journals (Sweden)

    Mancusi Davide

    2018-01-01

    TRIPOLI-4® is a Monte-Carlo particle-transport code developed at CEA-Saclay (France) that is employed in the domains of nuclear-reactor physics, criticality-safety, shielding/radiation protection and nuclear instrumentation. The goal of this paper is to report on current developments, validation and verification made in TRIPOLI-4 in the electron/positron/photon sector. The new capabilities and improvements concern refinements to the electron transport algorithm, the introduction of a charge-deposition score, the new thick-target bremsstrahlung option, the upgrade of the bremsstrahlung model and the improvement of electron angular straggling at low energy. The importance of each of the developments above is illustrated by comparisons with calculations performed with other codes and with experimental data.

  18. Development of an advanced PFM code for the integrity evaluation of nuclear piping system under combined aging mechanisms

    International Nuclear Information System (INIS)

    Datta, Debashis

    2010-02-01

    A nuclear piping system is composed of several straight pipes and elbows joined by welding. These weld sections are usually the parts most susceptible to failure under various degradation mechanisms. Whereas a specific location of a reactor piping system might fail by a combination of different aging mechanisms, e.g. fatigue and/or stress corrosion cracking, the majority of piping probabilistic fracture mechanics (PFM) codes can only consider a single aging mechanism at a time. Therefore, a probabilistic fracture mechanics computer code capable of considering multiple aging mechanisms was developed for an accurate failure analysis of each specific component of a nuclear piping section. A newly proposed crack-morphology-based probabilistic leak flow rate module is introduced in this code to treat fatigue and SCC type cracks separately. Improved models, e.g. stressor models, an elbow failure model, SIF models, a local seismic occurrence probability model, and performance-based crack detection models, are also included in this code. Recent probabilistic fatigue (S-N) and SCC crack initiation (S-T) models and subsequent crack growth rate models are coded. An integrated probabilistic risk assessment and probabilistic fracture mechanics methodology is proposed. A complete flow chart of the combined aging mechanism model is presented. The combined aging mechanism based module can significantly reduce simulation effort and time. Two NUREG benchmark problems, a reactor pressure vessel outlet nozzle section and a surge line elbow located just below the pressurizer, are reinvestigated with this code. The results showed that the contribution of pre-existing cracks, in addition to initiating cracks, can significantly increase the overall failure probability. The Inconel weld location of the reactor pressure vessel outlet nozzle section was the weakest point in terms of relative through-wall leak failure probability, on the order of 10^-2 at the 40-year plant life. Considering

  19. Replacing the IRAF/PyRAF Code-base at STScI: The Advanced Camera for Surveys (ACS)

    Science.gov (United States)

    Lucas, Ray A.; Desjardins, Tyler D.; STScI ACS (Advanced Camera for Surveys) Team

    2018-06-01

    IRAF/PyRAF are no longer viable on the latest hardware often used by HST observers, therefore STScI no longer actively supports IRAF or PyRAF for most purposes. STScI instrument teams are in the process of converting all of our data processing and analysis code from IRAF/PyRAF to Python, including our calibration reference file pipelines and data reduction software. This is exemplified by our latest ACS Data Handbook, version 9.0, which was recently published in February 2018. Examples of IRAF and PyRAF commands have now been replaced by code blocks in Python, with references linked to documentation on how to download and install the latest Python software via Conda and AstroConda. With the temporary exception of the ACS slitless spectroscopy tool aXe, all ACS-related software is now independent of IRAF/PyRAF. A concerted effort has been made across STScI divisions to help the astronomical community transition from IRAF/PyRAF to Python, with tools such as Python Jupyter notebooks being made to give users workable examples. In addition to our code changes, the new ACS data handbook discusses the latest developments in charge transfer efficiency (CTE) correction, bias de-striping, and updates to the creation and format of calibration reference files among other topics.
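
    As an illustration of the kind of conversion described (not code from the ACS team or the Data Handbook), an IRAF imstatistics-style summary can be reproduced with astropy and numpy; the FITS file name below is a placeholder.

```python
# Illustration of an IRAF/PyRAF -> Python conversion: reproducing an
# imstatistics-style summary with astropy + numpy. "example_flt.fits" is a
# placeholder file name, not an actual ACS product referenced by the handbook.
import numpy as np
from astropy.io import fits

with fits.open("example_flt.fits") as hdul:
    sci = hdul["SCI", 1].data.astype(float)      # first SCI extension

clipped = sci[np.abs(sci - np.median(sci)) < 5 * np.std(sci)]   # crude 5-sigma clip
print(f"npix={clipped.size}  mean={clipped.mean():.3f}  "
      f"stddev={clipped.std():.3f}  min={clipped.min():.3f}  max={clipped.max():.3f}")
```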

  20. TITAN: an advanced three-dimensional coupled neutronic/thermal-hydraulics code for light water nuclear reactor core analysis

    International Nuclear Information System (INIS)

    Griggs, D.P.; Kazimi, M.S.; Henry, A.F.

    1984-06-01

    The three-dimensional nodal neutronics code QUANDRY and the three-dimensional two-fluid thermal-hydraulics code THERMIT are combined into TITAN. Steady-state and transient coupling methodologies based upon a tandem structure were devised and implemented. Additional models for nuclear feedback, equilibrium xenon and direct moderator heating were added. TITAN was tested using a boiling water two channel problem and the coupling methodologies were shown to be effective. Simulated turbine trip transients and several control rod withdrawal transients were analyzed with good results. Sensitivity studies indicated that the time-step size can affect transient results significantly. TITAN was also applied to a quarter core PWR problem based on a real reactor geometry. The steady-state results were compared to a solution produced by MEKIN-B and poor agreement between the horizontal power shapes was found. Calculations with various mesh spacings showed that the mesh spacings in the MEKIN-B analysis were too large to produce accurate results with a finite difference method. The TITAN results were shown to be reasonable. A pair of control rod ejection accidents were also analyzed with TITAN. A comparison of the TITAN PWR control rod ejection results with results from coupled point kinetics/thermal-hydraulics analyses showed that the point kinetics method used (adiabatic method for control rod reactivities, steady-state flux shape for core-averaged reactivity feedback) underpredicted the power excursion in one case and overpredicted it in the other. It was therefore concluded that point kinetics methods should be used with caution and that three-dimensional codes like TITAN are superior for analyzing PWR control rod ejection transients

  1. Analysis of the three dimensional core kinetics NESTLE code coupling with the advanced thermo-hydraulic code systems, RELAP5/SCDAPSIM and its application to the Laguna Verde Central reactor

    International Nuclear Information System (INIS)

    Salazar C, J.H.; Nunez C, A.; Chavez M, C.

    2004-01-01

    The objective of the present work is to propose a methodology for coupling the RELAP5/SCDAPSIM and NESTLE codes. The development of this coupling will be carried out within a doctoral program in Energy Engineering (nuclear option) of the Faculty of Engineering of the UNAM, together with the National Commission of Nuclear Safety and Safeguards (CNSNS). The general purpose of this type of development is to have tools, implemented through multiple programs or codes, such that both the three-dimensional kinetics of the core and the dynamics of the reactor (thermal-hydraulics) can be simulated. In the past, because of limitations in calculating the complete response of both systems, the models were developed separately, putting much emphasis on one while neglecting the other. These methodologies, called best estimate, will allow the nuclear industry to evaluate nuclear power plant designs with a higher degree of detail (for modifications to existing plants or for new concepts in advanced reactor designs), besides analysing events (transients and accidents), among other applications. The coupled system was applied to design studies and investigations of the Laguna Verde Nuclear Power Plant (CNLV). (Author)

  2. Computational code in atomic and nuclear quantum optics: Advanced computing multiphoton resonance parameters for atoms in a strong laser field

    Science.gov (United States)

    Glushkov, A. V.; Gurskaya, M. Yu; Ignatenko, A. V.; Smirnov, A. V.; Serga, I. N.; Svinarenko, A. A.; Ternovsky, E. V.

    2017-10-01

    The consistent relativistic energy approach to finite Fermi systems (atoms and nuclei) in a strong realistic laser field is presented and applied to computing multiphoton resonance parameters in some atoms and nuclei. The approach is based on the Gell-Mann and Low S-matrix formalism, the multiphoton resonance line moments technique and the advanced Ivanov-Ivanova algorithm for calculating the Green's function of the Dirac equation. Data for the multiphoton resonance width and shift for the Cs atom and the 57Fe nucleus as functions of the laser intensity are listed.

  3. Cellular Automata as an Example for Advanced Beginners’ Level Coding Exercises in a MOOC on Test Driven Development

    Directory of Open Access Journals (Sweden)

    Thomas Staubitz

    2017-06-01

    Programming tasks are an important part of teaching computer programming, as they help students develop essential programming skills and techniques through practice. The design of educational problems plays a crucial role in the extent to which experiential knowledge is imparted to the learner, both in terms of quality and quantity. Badly designed tasks have been known to put students off practicing programming. Hence, there is a need for carefully designed problems. Cellular Automata programming lends itself as a very suitable candidate for programming practice problems. In this paper, we describe how various types of problems can be designed using concepts from Cellular Automata and discuss the features which make them good practice problems with regard to instructional pedagogy. We also present a case study on a Cellular Automata programming exercise used in a MOOC on Test Driven Development using JUnit, and discuss the automated evaluation of code submissions and the feedback on the reception of this exercise by participants in the course. Finally, we suggest two ideas to facilitate an easier approach to creating such programming exercises.
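
    For context, a cellular-automaton kernel of the kind such an exercise might ask for, a single update step of an elementary (Wolfram rule-number) automaton together with a tiny test in the spirit of test-driven development, is sketched below in Python. This is a generic illustration, not the actual MOOC exercise, which was built around JUnit.

```python
# Generic illustration of a cellular-automaton exercise with a TDD-style test:
# one update step of an elementary CA (Wolfram rule number), periodic boundary.
# Not the actual exercise from the MOOC, which was based on JUnit.

def step(cells, rule):
    """Apply one step of the elementary CA 'rule' (0-255) with wraparound."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def test_rule_90_single_seed():
    # Rule 90 turns a single live cell into its two neighbours (XOR of neighbours).
    assert step([0, 0, 1, 0, 0], rule=90) == [0, 1, 0, 1, 0]

test_rule_90_single_seed()
print(step([0, 0, 1, 0, 0], rule=90))
```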

  4. High burnup models in computer code fair

    Energy Technology Data Exchange (ETDEWEB)

    Dutta, B K; Swami Prasad, P; Kushwaha, H S; Mahajan, S C; Kakodar, A [Bhabha Atomic Research Centre, Bombay (India)

    1997-08-01

    An advanced fuel analysis code, FAIR, has been developed for analyzing the behavior of fuel rods of water cooled reactors under severe power transients and high burnups. The code is capable of analyzing fuel pins with both collapsible cladding, as in PHWRs, and free-standing cladding, as in LWRs. The main emphasis in the development of this code is on evaluating fuel performance at extended burnups and modelling the fuel rods for advanced fuel cycles. For this purpose, a number of suitable models have been incorporated in FAIR. For modelling fission gas release, three different models are implemented, namely a physically based mechanistic model, the standard ANS 5.4 model and the Halden model. Similarly, the pellet thermal conductivity can be modelled by the MATPRO equation, the SIMFUEL relation or the Halden equation. The flux distribution across the pellet is modelled using the RADAR model. For modelling pellet clad mechanical interaction (PCMI)/stress corrosion cracking (SCC) induced failure of the sheath, the necessary routines are provided in FAIR. The validation of the code FAIR is based on the analysis of fuel rods of the EPRI project 'Light water reactor fuel rod modelling code evaluation' and also on the analytical simulation of the threshold power ramp criteria of fuel rods of pressurized heavy water reactors. In the present work, a study is carried out by analysing three CRP-FUMEX rods to show the effect of various combinations of fission gas release models and pellet conductivity models on the fuel analysis parameters. The satisfactory performance of FAIR may be concluded from these case studies. (author). 12 refs, 5 figs.
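
    For orientation only, the diffusion-based approach underlying gas-release models of this kind can be illustrated with the classical Booth equivalent-sphere release fraction for gas initially present in the fuel (post-irradiation annealing form). This is a textbook-style sketch, not FAIR's ANS 5.4 or mechanistic implementation, and the reduced diffusion coefficient used is a placeholder.

```python
# Illustrative Booth equivalent-sphere release fraction for fission gas initially
# present in an idealized sphere (post-irradiation annealing form). A textbook-style
# sketch of the diffusion-based approach, not FAIR's ANS 5.4 implementation; the
# reduced diffusion coefficient below is a placeholder.
import math

def booth_release_fraction(D_prime, t):
    """Fractional release for reduced diffusion coefficient D' [1/s] and time t [s]."""
    tau = D_prime * t
    if tau < 0.1:                                   # short-time approximation
        return 6.0 * math.sqrt(tau / math.pi) - 3.0 * tau
    # Exact series solution, truncated once terms become negligible.
    return 1.0 - (6.0 / math.pi**2) * sum(
        math.exp(-n**2 * math.pi**2 * tau) / n**2 for n in range(1, 200))

D_prime = 1.0e-11                                   # placeholder, D/a^2 [1/s]
for years in (0.5, 2.0, 5.0):
    t = years * 3.155e7
    print(f"{years:>4} y: f = {booth_release_fraction(D_prime, t):.3f}")
```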

  5. High burnup models in computer code FAIR

    International Nuclear Information System (INIS)

    Dutta, B.K.; Swami Prasad, P.; Kushwaha, H.S.; Mahajan, S.C.; Kakodar, A.

    1997-01-01

    An advanced fuel analysis code FAIR has been developed for analyzing the behavior of fuel rods of water cooled reactors under severe power transients and high burnups. The code is capable of analyzing fuel pins with both collapsible clad, as in PHWRs, and free-standing clad, as in LWRs. The main emphasis in the development of this code is on evaluating fuel performance at extended burnups and modelling fuel rods for advanced fuel cycles. For this purpose, a number of suitable models have been incorporated in FAIR. For modelling fission gas release, three different models are implemented, namely a physically based mechanistic model, the standard ANS 5.4 model and the Halden model. Similarly, the pellet thermal conductivity can be modelled by the MATPRO equation, the SIMFUEL relation or the Halden equation. The flux distribution across the pellet is modelled using the RADAR model. For modelling pellet clad interaction (PCMI)/stress corrosion cracking (SCC) induced failure of the sheath, the necessary routines are provided in FAIR. The validation of the code FAIR is based on the analysis of fuel rods of the EPRI project "Light water reactor fuel rod modelling code evaluation" and also on the analytical simulation of the threshold power ramp criteria of fuel rods of pressurized heavy water reactors. In the present work, a study is carried out by analysing three CRP-FUMEX rods to show the effect of various combinations of fission gas release models and pellet conductivity models on the fuel analysis parameters. The satisfactory performance of FAIR may be concluded from these case studies. (author). 12 refs, 5 figs

  6. Summary Report for ASC L2 Milestone #4782: Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes

    Energy Technology Data Exchange (ETDEWEB)

    Neely, J. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hornung, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Black, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Robinson, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-29

    This document serves as a detailed companion to the powerpoint slides presented as part of the ASC L2 milestone review for Integrated Codes milestone #4782 titled “Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes”, due on 9/30/2014, and presented for formal program review on 9/12/2014. The program review committee is represented by Mike Zika (A Program Project Lead for Kull), Brian Pudliner (B Program Project Lead for Ares), Scott Futral (DEG Group Lead in LC), and Mike Glass (Sierra Project Lead at Sandia). This document, along with the presentation materials, and a letter of completion signed by the review committee will act as proof of completion for this milestone.

  7. Modeling constituent redistribution in U–Pu–Zr metallic fuel using the advanced fuel performance code BISON

    International Nuclear Information System (INIS)

    Galloway, J.; Unal, C.; Carlson, N.; Porter, D.; Hayes, S.

    2015-01-01

    Highlights: • An improved constituent distribution formulation in metallic nuclear fuels. • The new algorithm is implemented into the advanced fuel performance framework BISON. • Experimental Breeder Reactor-II data, T179, DP16, T459 are reanalyzed. • Phase-dependent diffusion coefficients are improved. • The most influential phase is gamma, followed by alpha and then beta. - Abstract: An improved, robust formulation for constituent distribution in metallic nuclear fuels is developed and implemented into the advanced fuel performance framework BISON. The coupled thermal-diffusion equations are solved simultaneously to reanalyze the constituent redistribution in post-irradiation data from fuel tests performed in Experimental Breeder Reactor-II (EBR-II). Deficiencies observed in previously published formulations and numerical implementations are also addressed. The present model corrects an inconsistency between the enthalpies of solution and the solubility limit curves of the phase diagram, while also adding an artificial diffusion term in the 2-phase regime that stabilizes the standard Galerkin finite element (FE) method used by BISON. An additional improvement is in the formulation of the zirconium flux as it relates to the Soret term. With these new modifications, phase-dependent diffusion coefficients are re-evaluated and compared with the previously recommended values. The model validation included testing against experimental data from fuel pins T179, DP16 and T459, irradiated in EBR-II. A series of viable material properties for U–Pu–Zr based materials was determined through a sensitivity study, which resulted in three cases with differing parameters that showed strong agreement with one set of experimental data, rod T179. Subsequently, a full-scale simulation of T179 was performed to reduce uncertainties, particularly relating to the temperature boundary condition for the fuel. In addition a new thermal conductivity model combining all
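
    For background, a generic thermotransport (Soret) flux of the kind referred to above can be written as follows, where c is the constituent concentration, D its diffusivity, Q* the heat of transport, R the gas constant and T the temperature. This is the textbook form only, not BISON's exact discretised formulation.

    ```latex
    % Generic Soret (thermotransport) flux and the associated conservation law;
    % textbook form only, not the formulation implemented in BISON.
    \[
      \mathbf{J} = -D\,\nabla c \;-\; \frac{D\,Q^{*}\,c}{R\,T^{2}}\,\nabla T,
      \qquad
      \frac{\partial c}{\partial t} = -\,\nabla\!\cdot\mathbf{J}.
    \]
    ```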

  8. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendant of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids.

  9. FASTDART - A fast, accurate and friendly version of DART code

    International Nuclear Information System (INIS)

    Rest, Jeffrey; Taboada, Horacio

    2000-01-01

    A new, enhanced, visual version of the DART code is presented. DART is a mechanistic-model-based code developed for the performance calculation and assessment of aluminum dispersion fuel. The major features of this new version are a new, time-saving calculation routine able to run on a PC, a friendly visual input interface and a plotting facility. This version, available for silicide and U-Mo fuels, adds faster execution and visual interfaces to the classical accuracy of the DART models for fuel performance prediction. It is part of a collaboration agreement between ANL and CNEA in the area of Low Enriched Uranium Advanced Fuels, carried out under the Implementation Arrangement for Technical Exchange and Cooperation in the Area of Peaceful Uses of Nuclear Energy. (author)

  10. Advances in the development of interaction between the codes MCNPX and ANSYS Fluent and their fusion applications; Avances en el desarrollo de la interaccion entre los codigos MCNPX y ANSYS Fluente y sus aplicaciones para fusion

    Energy Technology Data Exchange (ETDEWEB)

    Colomer, C.; Salellas, J.; Ahmed, R.; Fabbrio, M.; Aleman, A.

    2012-07-01

    The advances made in the project for the development of a code coupling MCNPX and ANSYS Fluent are presented. Following the workflow of the project, the most appropriate remeshing algorithms for exchanging data between the two codes are studied. In addition, the selection and implementation of methods to internally verify the correct transmission of the variables involved between the two meshes are explained. Finally, the selection of cases for verification and validation of the interaction between the two codes in each of the possible fields of application is presented.

  11. Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flow distribution in parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flowrate conditions, constant or varying in time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code, and has as its complement FLID, a one-channel, two-dimensional code. (authors)

  12. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  13. Advanced Best-Estimate Methodologies for Thermal-Hydraulics Stability Analyses with TRACG code and Improvements on Operating Boiling Water Reactors

    International Nuclear Information System (INIS)

    Vedovi, J.; Trueba, M.; Ibarra, L; Espino, M.; Hoang, H.

    2016-01-01

    In recent years GE Hitachi has introduced two advanced methodologies to address thermal-hydraulic instabilities in Boiling Water Reactors (BWRs): the “Detect and Suppress Solution - Confirmation Density (DSS-CD)” and the “GEH Simplified Stability Solution (GS3).” These two methodologies are based on Best-Estimate Plus Uncertainty (BEPU) analyses and provide significant improvements in safety, plant maneuvering and fuel economics with respect to existing solutions. The DSS-CD and GS3 solutions have recently been approved by the United States Nuclear Regulatory Commission. This paper describes the main characteristics of these two stability methodologies and shares the experience of their recent implementation in operating BWRs. The BEPU approach provided a much deeper understanding of the parameters affecting instabilities in operating BWRs and allowed for better calculation of plant setpoints by improving plant maneuvering restrictions and reducing manual operator actions. The DSS-CD and GS3 methodologies are both based on safety analyses performed with the best-estimate system code TRACG. The assessment of uncertainty is performed following the Code Scaling, Applicability and Uncertainty (CSAU) methodology documented in NUREG/CR-5249. The two solutions have already been implemented in a combined 18 BWR units, with 7 more units in the process of transitioning. The main results demonstrate a significant decrease (>0.1) in the stability-based Operating Limit Minimum Critical Power Ratio (OLMCPR), which possibly results in significant fuel savings, and an increase in allowable stability plant setpoints that address instability events such as the one that occurred at the Fermi 2 plant in 2015 and can help prevent unnecessary scrams. The paper also describes the advantages of reduced plant maneuvering as a result of transitioning to these solutions; in particular, the history of a BWR/6 transition to DSS-CD is discussed.

  14. Mechanistic Indicators of Childhood Asthma (MICA): piloting ...

    Science.gov (United States)

    Background: Modern methods in molecular biology and advanced computational tools show promise in elucidating complex interactions that occur between genes and environmental factors in diseases such as asthma; however, appropriately designed studies are critical for these methods to reach their full potential. Objective: We used a case-control study to investigate whether genomic data (blood gene expression), viewed together with a spectrum of exposure effects and susceptibility markers (blood, urine and nail), can provide a mechanistic explanation for the increased susceptibility of asthmatics to ambient air pollutants. Methods: We studied 205 non-asthmatic and asthmatic children (9-12 years of age) who participated in a clinical study in Detroit, Michigan. The study combines a traditional epidemiological design with an integrative approach to investigate the environmental exposure of children to indoor-outdoor air. The study includes measurements of internal dose (metals, allergen-specific IgE, PAH and VOC metabolites) and clinical measures of health outcome (immunological, cardiovascular and respiratory). Results: Expected immunological indications of asthma have been obtained. In addition, initial results from our analyses point to the complex nature of childhood health and risk factors linked to metabolic syndrome (obesity, blood pressure and dyslipidemia). For example, 31% and 34% of the asthmatic MICA subjects were either overweight (BMI > 25) o

  15. Rational and Mechanistic Perspectives on Reinforcement Learning

    Science.gov (United States)

    Chater, Nick

    2009-01-01

    This special issue describes important recent developments in applying reinforcement learning models to capture neural and cognitive function. But reinforcement learning, as a theoretical framework, can apply at two very different levels of description: "mechanistic" and "rational." Reinforcement learning is often viewed in mechanistic terms--as…

  16. Full Scope Modeling and Analysis on the Secondary Circuit of Chinese Large-Capacity Advanced PWR Based on RELAP5 Code

    Directory of Open Access Journals (Sweden)

    Dao-gang Lu

    2015-01-01

    Full Text Available The Chinese large-capacity advanced PWR under construction in China is a new and indispensable reactor type in the development of the NPP field. Alongside NPP construction, accident sequence prediction and operator training are in progress. Since some possible events in the secondary circuit, such as a feedwater pump trip, may lead to a severe accident in the NPP, training simulators and engineering simulators of CI are necessary. Moreover, with an increasing proportion of nuclear power in China, NPPs will participate in regulating the peak load of the power network, which requires accurate calculation and control of the secondary circuit. In order to achieve real-time and full-scope simulation of power change transients and accident scenarios, the RELAP5/MOD 3.4 code has been adopted to model the secondary circuit because of its high calculation accuracy. This paper describes the model of the steady state and of a turbine load transient from 100% to 40% for the secondary circuit using RELAP5 and provides a reasonable equivalent method to solve the calculation divergence problem caused by dramatic two-phase condition changes while guaranteeing the heat transfer efficiency. The validation of the parameters shows that all the errors between the calculated values and the design values are reasonable and acceptable.

  17. Overexpression of long non-coding RNA colon cancer-associated transcript 2 is associated with advanced tumor progression and poor prognosis in patients with colorectal cancer.

    Science.gov (United States)

    Zhang, Junling; Jiang, Yong; Zhu, Jing; Wu, Tao; Ma, Ju; Du, Chuang; Chen, Shanwen; Li, Tengyu; Han, Jinsheng; Wang, Xin

    2017-12-01

    The aim of the present study was to explore the clinicopathological and prognostic significance of long non-coding RNA (lncRNA) colon cancer-associated transcript 2 (CCAT2) expression in human colorectal cancer (CRC). Expression levels of lncRNA CCAT2 in CRC, adjacent non-tumor and healthy colon mucosa tissues were detected by quantitative polymerase chain reaction. The disease-free survival and overall survival rates were evaluated using the Kaplan-Meier method, and multivariate analysis was performed using Cox proportional hazard analysis. The expression level of lncRNA CCAT2 in CRC tissues was increased significantly compared with adjacent normal tissues or non-cancerous tissues. CCAT2 expression was observed to be progressively increased between tumor-node-metastasis (TNM) stages I and IV. A high level of CCAT2 expression was revealed to be associated with poor cell differentiation, deeper tumor infiltration, lymph node metastasis, distant metastasis, vascular invasion and advanced TNM stage. Compared with patients with low levels of CCAT2 expression, patients with high levels of CCAT2 expression had shorter disease-free survival and overall survival times. Multivariate analyses indicated that high CCAT2 expression was an independent poor prognostic factor. Therefore, increased lncRNA CCAT2 expression may be a potential diagnostic biomarker for CRC, and an independent predictor of prognosis in patients with CRC.

  18. Advanced LOCA code uncertainty assessment

    International Nuclear Information System (INIS)

    Wickett, A.J.; Neill, A.P.

    1990-11-01

    This report describes a pilot study that identified, quantified and combined uncertainties for the LOBI BL-02 3% small break test. A 'dials' version of TRAC-PF1/MOD1, called TRAC-F, was used. (author)

  19. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.
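
    As a small worked example of the linear block codes mentioned in the blurb, the sketch below encodes four data bits with the classical Hamming(7,4) code and corrects a single flipped bit from the syndrome; it is a generic illustration, not an excerpt from the book.

    ```python
    def hamming74_encode(d):
        """Encode 4 data bits [d1, d2, d3, d4] as the 7-bit word p1 p2 d1 p3 d2 d3 d4."""
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4  # even parity over positions 1, 3, 5, 7
        p2 = d1 ^ d3 ^ d4  # even parity over positions 2, 3, 6, 7
        p3 = d2 ^ d3 ^ d4  # even parity over positions 4, 5, 6, 7
        return [p1, p2, d1, p3, d2, d3, d4]

    def hamming74_correct(c):
        """Locate and flip a single bit error; return the corrected codeword."""
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        pos = s1 + 2 * s2 + 4 * s3  # 0 means no single-bit error detected
        if pos:
            c = c.copy()
            c[pos - 1] ^= 1
        return c

    codeword = hamming74_encode([1, 0, 1, 1])
    corrupted = codeword.copy()
    corrupted[4] ^= 1                     # flip one bit in transmission
    assert hamming74_correct(corrupted) == codeword
    ```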

  20. Progress on DART code optimization

    International Nuclear Information System (INIS)

    Taboada, Horacio; Solis, Diego; Rest, Jeffrey

    1999-01-01

    This work reports the progress made in the design and development of a new optimized version of the DART code (DART-P), a mechanistic computer model for the performance calculation and assessment of aluminum dispersion fuel. It is part of a collaboration agreement between CNEA and ANL in the area of Low Enriched Uranium Advanced Fuels, carried out under the Implementation Arrangement for Technical Exchange and Cooperation in the Area of Peaceful Uses of Nuclear Energy, signed on October 16, 1997 between the US DOE and the National Atomic Energy Commission of the Argentine Republic. DART optimization is a biannual program; it has been operative since February 8, 1999 and has the following goals: 1. Design and develop a new DART calculation kernel for implementation within a parallel processing architecture. 2. Design and develop new user-friendly I/O routines to be resident on a Personal Computer (PC)/Workstation (WS) platform. 2.1. The new input interface will be designed and developed by means of a visual interface, able to guide the user in the construction of the problem to be analyzed with the aid of a new database (described in item 3, below). The new input interface will include input data checks in order to avoid corrupted input data. 2.2. The new output interface will be designed and developed by means of graphical tools, able to translate numeric data output into 'on line' graphic information. 3. Design and develop a new irradiated materials database, to be resident on the PC/WS platform, so as to facilitate the analysis of the behavior of different fuel and meat compositions with DART-P. Currently, a different version of DART is used for oxide, silicide, and advanced alloy fuels. 4. Develop rigorous general inspection algorithms in order to provide valuable DART-P benchmarks. 5. Design and develop new models, such as superplasticity, elastoplastic feedback, improved models for the calculation of fuel deformation and the evolution of the fuel microstructure for

  1. Post-prandial reflux suppression by a raft-forming alginate (Gaviscon Advance) compared to a simple antacid documented by magnetic resonance imaging and pH-impedance monitoring: mechanistic assessment in healthy volunteers and randomised, controlled, double-blind study in reflux patients.

    Science.gov (United States)

    Sweis, R; Kaufman, E; Anggiansah, A; Wong, T; Dettmar, P; Fried, M; Schwizer, W; Avvari, R K; Pal, A; Fox, M

    2013-06-01

    Alginates form a raft above the gastric contents, which may suppress gastro-oesophageal reflux; however, inconsistent effects have been reported in mechanistic and clinical studies. To visualise reflux suppression by an alginate-antacid [Gaviscon Advance (GA), Reckitt Benckiser, UK] compared with a nonraft-forming antacid using magnetic resonance imaging (MRI), and to determine the feasibility of pH-impedance monitoring for assessment of reflux suppression by alginates. Two studies were performed: (i) GA and antacid (Alucol, Wander Ltd, Switzerland) were visualised by MRI in the stomach of 12 healthy volunteers over 30 min after a meal, with reflux events documented by manometry. (ii) A randomised, controlled, double-blind cross-over trial of post-prandial reflux suppression documented by pH-impedance in 20 patients randomised to GA or antacid (Milk of Magnesia; Boots, UK) after two meals taken 24 h apart. MRI visualised a 'mass' of GA forming at the oesophago-gastric junction (OGJ); the simple antacid sank to the distal stomach. The number of post-prandial common cavity reflux events was lower with GA than with antacid [median 2 (0-5) vs. 5 (1-11); P < 0.035]. Distal reflux events and acid exposure measured by pH-impedance were similar after GA and antacid. There was a trend to reduced proximal reflux events with GA compared with antacid [10.5 (8.9) vs. 13.9 (8.3); P = 0.070]. Gaviscon Advance forms a 'mass' close to the OGJ and significantly suppresses reflux compared with a nonraft-forming antacid. Standard pH-impedance monitoring is suitable for clinical studies of GA in gastro-oesophageal reflux disease patients where proximal reflux is the primary outcome. © 2013 Blackwell Publishing Ltd.

  2. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are implemented in a standard computer code. This is illustrated using the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can easily be adapted to any other lattice code. The description of the code assumes basic knowledge of the reactor lattice, at the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code, the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)

  3. Evaluation of mechanistic DNB models using HCLWR CHF data

    International Nuclear Information System (INIS)

    Iwamura, Takamichi; Watanabe, Hironori; Okubo, Tsutomu; Araya, Fumimasa; Murao, Yoshio.

    1992-03-01

    The onset of departure from nucleate boiling (DNB) in light water reactors (LWRs) has generally been predicted with empirical correlations. Since these correlations have little physical basis and contain adjustable empirical constants determined by best fitting of test data, the applicable geometries and flow conditions are limited to the original experimental ranges. In order to obtain a more universal prediction method, several mechanistic DNB models based on physical approaches have been proposed in recent years. However, the predictive capabilities of mechanistic DNB models have not been verified successfully, especially for advanced LWR design purposes. In this report, typical mechanistic DNB models are reviewed and compared with critical heat flux (CHF) data for a high conversion light water reactor (HCLWR). The experiments were performed using a triangular 7-rod array with a non-uniform axial heat flux distribution. The test pressure was 16 MPa, mass velocities ranged from 800 to 3100 kg/(m²·s) and exit qualities from -0.07 to 0.19. The evaluated models are: 1) Weisman-Pei, 2) Chang-Lee, 3) Lee-Mudawwar, 4) Lin-Lee-Pei, and 5) Katto. The first two models are based on a near-wall bubble crowding model and the other three on a sublayer dryout model. The comparison with experimental data indicated that the Weisman-Pei model agreed relatively well with the CHF data. The effects of the empirical constants in each model on the CHF calculation were clarified by sensitivity studies. It was also found that the magnitudes of physical quantities obtained in the course of the calculation were significantly different for each model. Therefore, microscopic observation of the onset of DNB on the heated surface is essential to clarify the DNB mechanism and establish a general mechanistic DNB model based on physical phenomena. (author)

  4. Smart time-pulse coding photoconverters as basic components 2D-array logic devices for advanced neural networks and optical computers

    Science.gov (United States)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Michalnichenko, Nikolay N.

    2004-04-01

    The article deals with a concept for building arithmetic-logic devices (ALD) with a 2D structure and optical 2D-array inputs-outputs as advanced, high-productivity, parallel basic operational training modules for realizing the basic operations of continuous, neuro-fuzzy, multilevel, threshold and other logics and vector-matrix, vector-tensor procedures in neural networks. The concept consists in the use of a time-pulse coding (TPC) architecture and 2D-array smart optoelectronic pulse-width (or pulse-phase) modulators (PWM or PPM) for the transformation of input pictures. The input grayscale image is transformed into a group of corresponding short optical pulses or time positions of an optical two-level signal swing. We consider optoelectronic implementations of universal (quasi-universal) picture elements of two-valued ALD, multi-valued ALD, analog-to-digital converters and multilevel threshold discriminators, and we show that 2D-array time-pulse photoconverters are the base elements for these devices. We show simulation results for the time-pulse photoconverters as base components. The considered devices have the following technical parameters: input optical signal power of 200 nW to 200 μW (for a photodiode responsivity of 0.5 A/W), conversion time from tens of microseconds to a millisecond, supply voltage of 1.5-15 V, power consumption from tens of microwatts to a milliwatt, and conversion nonlinearity of less than 1%. One cell consists of 2-3 photodiodes and about ten CMOS transistors. This simplicity of the cells allows their integration in arrays of 32x32, 64x64 elements and more.
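
    A minimal numerical sketch of the time-pulse (pulse-width) coding idea described above: a grayscale intensity is mapped to a pulse duration within a fixed conversion frame and recovered by measuring that duration. The frame length here is an assumed value for illustration, not a parameter taken from the article.

    ```python
    FRAME_US = 100.0  # assumed conversion frame length in microseconds

    def intensity_to_pulse_width(intensity, frame_us=FRAME_US):
        """Map an 8-bit intensity (0-255) to a pulse width in microseconds."""
        return (intensity / 255.0) * frame_us

    def pulse_width_to_intensity(width_us, frame_us=FRAME_US):
        """Recover the 8-bit intensity from a measured pulse width."""
        return round((width_us / frame_us) * 255.0)

    # Round-trip check for a few grey levels.
    for level in (0, 64, 200, 255):
        assert pulse_width_to_intensity(intensity_to_pulse_width(level)) == level
    ```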

  5. Coding for urologic office procedures.

    Science.gov (United States)

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. Mechanistic species distribution modelling as a link between physiology and conservation.

    Science.gov (United States)

    Evans, Tyler G; Diamond, Sarah E; Kelly, Morgan W

    2015-01-01

    Climate change conservation planning relies heavily on correlative species distribution models that estimate future areas of occupancy based on environmental conditions encountered in present-day ranges. The approach benefits from rapid assessment of vulnerability over a large number of organisms, but can have poor predictive power when transposed to novel environments and reveals little in the way of causal mechanisms that define changes in species distribution or abundance. Having conservation planning rely largely on this single approach also increases the risk of policy failure. Mechanistic models that are parameterized with physiological information are expected to be more robust when extrapolating distributions to future environmental conditions and can identify physiological processes that set range boundaries. Implementation of mechanistic species distribution models requires knowledge of how environmental change influences physiological performance, and because this information is currently restricted to a comparatively small number of well-studied organisms, use of mechanistic modelling in the context of climate change conservation is limited. In this review, we propose that the need to develop mechanistic models that incorporate physiological data presents an opportunity for physiologists to contribute more directly to climate change conservation and advance the field of conservation physiology. We begin by describing the prevalence of species distribution modelling in climate change conservation, highlighting the benefits and drawbacks of both mechanistic and correlative approaches. Next, we emphasize the need to expand mechanistic models and discuss potential metrics of physiological performance suitable for integration into mechanistic models. We conclude by summarizing other factors, such as the need to consider demography, limiting broader application of mechanistic models in climate change conservation. Ideally, modellers, physiologists and

  7. Mechanistic assessment of hillslope transpiration controls of diel subsurface flow: a steady-state irrigation approach

    Science.gov (United States)

    H.R. Barnard; C.B. Graham; W.J. van Verseveld; J.R. Brooks; B.J. Bond; J.J. McDonnell

    2010-01-01

    Mechanistic assessment of how transpiration influences subsurface flow is necessary to advance understanding of catchment hydrology. We conducted a 24-day, steady-state irrigation experiment to quantify the relationships among soil moisture, transpiration and hillslope subsurface flow. Our objectives were to: (1) examine the time lag between maximum transpiration and...

  8. Contributions to the validation of advanced codes for accident analysis calculations with 3-dimensional neutron kinetics. STC with the Ukraine. Final report

    International Nuclear Information System (INIS)

    Grundmann, U.; Kliem, S.; Rohde, U.; Khalimonchuk, V.; Kuchin, A.; Seidel, A.

    2000-10-01

    Within the framework of a project of scientific-technical cooperation funded by BMBF/BMWi, the coupled code ATHLET-DYN3D has been transferred to the Scientific and Technical Centre on Nuclear and Radiation Safety, Kiev (Ukraine). This program code is an implementation of the 3D core model DYN3D, developed by FZR, into the GRS thermohydraulics code system ATHLET. For the purpose of validating this coupled code, a measurement database has been generated, in which suitable experimental data for operational transients from NPPs are collected. The data collection and documentation were performed in accordance with a directive on requirements for measurement data for code validation, which was elaborated within the project. The validation calculations have been performed for two selected transients, and the results of these calculations were compared with measurement values from the database. The code DYN3D was also extended with a subroutine for calculating reactivity coefficients. Using this modification of DYN3D, investigations of reactivity contributions during different operational processes can be performed. (orig.)

  9. RAMONA-4B a computer code with three-dimensional neutron kinetics for BWR and SBWR system transient - models and correlations

    Energy Technology Data Exchange (ETDEWEB)

    Rohatgi, U.S.; Cheng, H.S.; Khan, H.J.; Mallen, A.N.; Neymotin, L.Y.

    1998-03-01

    This document describes the major modifications and improvements made to the modeling of the RAMONA-3B/MOD0 code since 1981, when the code description and assessment report was completed. The new version of the code is RAMONA-4B. RAMONA-4B is a systems transient code for application to different versions of Boiling Water Reactors (BWR) such as the current BWR, the Advanced Boiling Water Reactor (ABWR), and the Simplified Boiling Water Reactor (SBWR). This code uses a three-dimensional neutron kinetics model coupled with a multichannel, non-equilibrium, drift-flux, two-phase flow formulation of the thermal hydraulics of the reactor vessel. The code is designed to analyze a wide spectrum of BWR core and system transients and instability issues. Chapter 1 is an overview of the code's capabilities and limitations; Chapter 2 discusses the neutron kinetics modeling and the implementation of reactivity edits. Chapter 3 is an overview of the heat conduction calculations. Chapter 4 presents modifications to the thermal-hydraulics model of the vessel, recirculation loop, steam separators, boron transport, and SBWR-specific components. Chapter 5 describes modeling of the plant control and safety systems. Chapter 6 presents the modeling of the balance of plant (BOP). Chapter 7 describes the mechanistic containment model in the code. The content of this report is complementary to the RAMONA-3B code description and assessment document. 53 refs., 81 figs., 13 tabs.
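
    For reference, the drift-flux formulation mentioned above is commonly written in the generic Zuber-Findlay form below, where u_g is the vapor velocity, j the mixture volumetric flux, C_0 the distribution parameter and V_gj the drift velocity; the specific closure relations used in RAMONA-4B are documented in the report and are not reproduced here.

    ```latex
    % Generic Zuber-Findlay drift-flux relation (not RAMONA-4B's specific closures).
    \[
      u_{g} = C_{0}\, j + V_{gj}
    \]
    ```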

  10. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
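
    Coding partitions generalize unique decipherability (UD); as background, the sketch below implements the classical Sardinas-Patterson test for deciding whether a finite code is UD. It is standard textbook material, not the canonical-partition algorithm of the paper.

    ```python
    def is_uniquely_decipherable(code):
        """Sardinas-Patterson test: True iff the finite code is uniquely decipherable."""
        code = set(code)

        def residuals(a_set, b_set):
            # Non-empty suffixes w such that a + w = b for some a in a_set, b in b_set.
            return {b[len(a):] for a in a_set for b in b_set if b != a and b.startswith(a)}

        seen = set()
        current = residuals(code, code)   # dangling suffixes S_1
        while current and not (current & code):
            key = frozenset(current)
            if key in seen:               # suffix sets cycle without hitting a codeword
                return True
            seen.add(key)
            current = residuals(current, code) | residuals(code, current)
        return not current                # empty suffix set => UD; codeword found => not UD

    assert is_uniquely_decipherable(["0", "10", "11"])      # a prefix code is UD
    assert not is_uniquely_decipherable(["0", "01", "10"])  # "010" has two factorizations
    ```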

  11. Testing mechanistic models of growth in insects.

    Science.gov (United States)

    Maino, James L; Kearney, Michael R

    2015-11-22

    Insects are typified by their small size, large numbers, impressive reproductive output and rapid growth. However, insect growth is not simply rapid; rather, insects follow a qualitatively distinct trajectory from many other animals. Here we present a mechanistic growth model for insects and show that increasing specific assimilation during the growth phase can explain the near-exponential growth trajectory of insects. The presented model is tested against growth data on 50 insects, and compared against other mechanistic growth models. Unlike the other mechanistic models, our growth model predicts energy reserves per biomass to increase with age, which implies a higher production efficiency and energy density of biomass in later instars. These predictions are tested against data compiled from the literature, whereby it is confirmed that insects increase their production efficiency (by 24 percentage points) and energy density (by 4 J mg⁻¹) between hatching and the attainment of full size. The model suggests that insects achieve greater production efficiencies and enhanced growth rates by increasing specific assimilation and increasing energy reserves per biomass, which are less costly to maintain than structural biomass. Our findings illustrate how the explanatory and predictive power of mechanistic growth models comes from their grounding in underlying biological processes. © 2015 The Author(s).

  12. Mechanistic model for microbial growth on hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Mallee, F M; Blanch, H W

    1977-12-01

    Based on available information describing the transport and consumption of insoluble alkanes, a mechanistic model is proposed for microbial growth on hydrocarbons. The model describes the atypical growth kinetics observed, and has implications in the design of large scale equipment for single cell protein (SCP) manufacture from hydrocarbons. The model presents a framework for comparison of the previously published experimental kinetic data.

  13. Mechanistic Indicators of Childhood Asthma (MICA) Study

    Science.gov (United States)

    The Mechanistic Indicators of Childhood Asthma (MICA) Study has been designed to incorporate state-of-the-art technologies to examine the physiological and environmental factors that interact to increase the risk of asthmatic responses. MICA is primarily a clinically-based obser...

  14. Mechanistic aspects of ionic reactions in flames

    DEFF Research Database (Denmark)

    Egsgaard, H.; Carlsen, L.

    1993-01-01

    Some fundamentals of the ion chemistry of flames are summarized. Mechanistic aspects of ionic reactions in flames have been studied using a VG PlasmaQuad, the ICP-system being substituted by a simple quartz burner. Simple hydrocarbon flames as well as sulfur-containing flames have been investigated...

  15. Assessment of the GOTHIC code for prediction of hydrogen flame propagation in small scale experiments

    International Nuclear Information System (INIS)

    Lee, Jin-Yong (E-mail: jinyong1@fnctech.com); Lee, Jung-Jae; Park, Goon-Cherl (E-mail: parkgc@snu.ac.kr)

    2006-01-01

    With rising concerns regarding the time- and space-dependent hydrogen behavior in severe accidents, calculations of local hydrogen combustion in compartments have been attempted using CFD codes such as GOTHIC. In particular, space-resolved hydrogen combustion analysis is essential to address certain safety issues, such as the survivability of safety components, and to determine proper positions for hydrogen control devices, e.g. recombiners or igniters. The GOTHIC 6.1b code contains many advanced features associated with the hydrogen burn models that enhance its calculation capability. In this study, we performed premixed hydrogen/air combustion experiments in an upright, rectangular combustion chamber of dimensions 1 m x 0.024 m x 1 m. The GOTHIC 6.1b code was used to simulate the hydrogen/air combustion experiments, and its prediction capability was assessed by comparing the experimental results with multidimensional calculation results. In particular, the prediction capability of the GOTHIC 6.1b code for local hydrogen flame propagation phenomena was examined. For some cases, comparisons are also presented for lumped-parameter modeling of hydrogen combustion. Based on these parametric simulations, we provide some guidance for local hydrogen combustion analysis using the GOTHIC 6.1b code. From the analysis results, it is concluded that the modeling parameters of the GOTHIC 6.1b code should be modified when applying the mechanistic burn model to hydrogen flame propagation analysis in small geometries.

  16. Mechanistic and Economical Characteristics of Asphalt Rubber Mixtures

    Directory of Open Access Journals (Sweden)

    Mena I. Souliman

    2016-01-01

    Full Text Available Load-associated fatigue cracking is one of the major distress types occurring in flexible pavement systems. The flexural beam fatigue laboratory test has been used for several decades and is considered an integral part of the new Superpave advanced characterization procedure. One of the most significant solutions to prolong the fatigue life of an asphaltic mixture is to utilize flexible materials such as rubber. A laboratory testing program was performed on conventional and Asphalt Rubber (AR) gap-graded mixtures to investigate the impact of added rubber on the mechanical, mechanistic, and economic attributes of asphaltic mixtures. Strain-controlled fatigue tests were conducted according to American Association of State Highway and Transportation Officials (AASHTO) procedures. The results from the beam fatigue tests indicated that the AR gap-graded mixtures would have a much longer fatigue life than the reference (conventional) mixtures. In addition, a mechanistic analysis using the 3D-Move software, coupled with a cost analysis based on the fatigue performance of the two mixtures, was performed. Overall, the analysis showed that AR-modified asphalt mixtures exhibited a significantly lower pavement cost per 1000 cycles of fatigue life per mile compared to the conventional HMA mixture.

  17. An Applied Study of Implementation of the Advanced Decommissioning Costing Methodology for Intermediate Storage Facility for Spent Fuel in Studsvik, Sweden with special emphasis to the application of the Omega code

    Energy Technology Data Exchange (ETDEWEB)

    Kristofova, Kristina; Vasko, Marek; Daniska, Vladimir; Ondra, Frantisek; Bezak, Peter [DECOM Slovakia, spol. s.r.o., J. Bottu 2, SK-917 01 Trnava (Slovakia); Lindskog, Staffan [Swedish Nuclear Power Inspectorate, Stockholm (Sweden)

    2007-01-15

    The presented study is focused on an analysis of decommissioning costs for the Intermediate Storage Facility for Spent Fuel (FA) facility in Studsvik prepared by SVAFO and a proposal of the advanced decommissioning costing methodology application. Therefore, this applied study concentrates particularly in the following areas: 1. Analysis of FA facility cost estimates prepared by SVAFO including description of FA facility in Studsvik, summarised input data, applied cost estimates methodology and summarised results from SVAFO study. 2. Discussion of results of the SVAFO analysis, proposals for enhanced cost estimating methodology and upgraded structure of inputs/outputs for decommissioning study for FA facility. 3. Review of costing methodologies with the special emphasis on the advanced costing methodology and cost calculation code OMEGA. 4. Discussion on implementation of the advanced costing methodology for FA facility in Studsvik together with: - identification of areas of implementation; - analyses of local decommissioning infrastructure; - adaptation of the data for the calculation database; - inventory database; and - implementation of the style of work with the computer code OMEGA.

  18. An Applied Study of Implementation of the Advanced Decommissioning Costing Methodology for Intermediate Storage Facility for Spent Fuel in Studsvik, Sweden with special emphasis to the application of the Omega code

    International Nuclear Information System (INIS)

    Kristofova, Kristina; Vasko, Marek; Daniska, Vladimir; Ondra, Frantisek; Bezak, Peter; Lindskog, Staffan

    2007-01-01

    The presented study is focused on an analysis of decommissioning costs for the Intermediate Storage Facility for Spent Fuel (FA) facility in Studsvik prepared by SVAFO and a proposal of the advanced decommissioning costing methodology application. Therefore, this applied study concentrates particularly in the following areas: 1. Analysis of FA facility cost estimates prepared by SVAFO including description of FA facility in Studsvik, summarised input data, applied cost estimates methodology and summarised results from SVAFO study. 2. Discussion of results of the SVAFO analysis, proposals for enhanced cost estimating methodology and upgraded structure of inputs/outputs for decommissioning study for FA facility. 3. Review of costing methodologies with the special emphasis on the advanced costing methodology and cost calculation code OMEGA. 4. Discussion on implementation of the advanced costing methodology for FA facility in Studsvik together with: - identification of areas of implementation; - analyses of local decommissioning infrastructure; - adaptation of the data for the calculation database; - inventory database; and - implementation of the style of work with the computer code OMEGA

  19. Advanced Design of a Novel Stellarator Using the Free Boundary VMEC Magnetic Equilibrium Code. Final Technical Report for period March 1, 1999 - February 28, 2002

    International Nuclear Information System (INIS)

    Knowlton, S. F.

    2005-01-01

    This report describes the goals and accomplishments of a 3-year EPSCoR Laboratory Partnership award to design an advanced stellarator device for magnetic confinement of toroidal plasmas for fusion research

  20. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.

  1. Toward a Rational and Mechanistic Account of Mental Effort.

    Science.gov (United States)

    Shenhav, Amitai; Musslick, Sebastian; Lieder, Falk; Kool, Wouter; Griffiths, Thomas L; Cohen, Jonathan D; Botvinick, Matthew M

    2017-07-25

    In spite of its familiar phenomenology, the mechanistic basis for mental effort remains poorly understood. Although most researchers agree that mental effort is aversive and stems from limitations in our capacity to exercise cognitive control, it is unclear what gives rise to those limitations and why they result in an experience of control as costly. The presence of these control costs also raises further questions regarding how best to allocate mental effort to minimize those costs and maximize the attendant benefits. This review explores recent advances in computational modeling and empirical research aimed at addressing these questions at the level of psychological process and neural mechanism, examining both the limitations to mental effort exertion and how we manage those limited cognitive resources. We conclude by identifying remaining challenges for theoretical accounts of mental effort as well as possible applications of the available findings to understanding the causes of and potential solutions for apparent failures to exert the mental effort required of us.

  2. Mechanistic systems modeling to guide drug discovery and development.

    Science.gov (United States)

    Schmidt, Brian J; Papin, Jason A; Musante, Cynthia J

    2013-02-01

    A crucial question that must be addressed in the drug development process is whether the proposed therapeutic target will yield the desired effect in the clinical population. Pharmaceutical and biotechnology companies place a large investment on research and development, long before confirmatory data are available from human trials. Basic science has greatly expanded the computable knowledge of disease processes, both through the generation of large omics data sets and a compendium of studies assessing cellular and systemic responses to physiologic and pathophysiologic stimuli. Given inherent uncertainties in drug development, mechanistic systems models can better inform target selection and the decision process for advancing compounds through preclinical and clinical research. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. MELMRK 2.0: A description of computer models and results of code testing

    International Nuclear Information System (INIS)

    Wittman, R.S.; Denny, V.; Mertol, A.

    1992-01-01

    An advanced version of the MELMRK computer code has been developed that provides detailed models for conservation of mass, momentum, and thermal energy within relocating streams of molten metallics during meltdown of Savannah River Site (SRS) reactor assemblies. In addition to a mechanistic treatment of transport phenomena within a relocating stream, MELMRK 2.0 retains the MOD1 capability for real-time coupling of the in-depth thermal response of participating assembly heat structure and, further, augments this capability with models for self-heating of relocating melt owing to steam oxidation of metallics and fission product decay power. As was the case for MELMRK 1.0, the MOD2 version offers state-of-the-art numerics for solving coupled sets of nonlinear differential equations. Principal features include application of multi-dimensional Newton-Raphson techniques to accelerate convergence behavior and direct matrix inversion to advance primitive variables from one iterate to the next. Additionally, MELMRK 2.0 provides logical event flags for managing the broad range of code options available for treating such features as (1) coexisting flow regimes, (2) dynamic transitions between flow regimes, and (3) linkages between heatup and relocation code modules. The purpose of this report is to provide a detailed description of the MELMRK 2.0 computer models for melt relocation. Also included are illustrative results for code testing, as well as an integrated calculation for meltdown of a Mark 31a assembly
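
    The multidimensional Newton-Raphson iteration mentioned above can be sketched generically as follows; this is a minimal illustration of the numerical technique on a toy two-equation system, not MELMRK's actual solver or melt-relocation equations.

    ```python
    import numpy as np

    def newton_raphson(residual, jacobian, x0, tol=1e-10, max_iter=50):
        """Solve residual(x) = 0 by repeated linearisation and direct linear solves."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            r = residual(x)
            if np.linalg.norm(r) < tol:
                return x
            x = x + np.linalg.solve(jacobian(x), -r)  # Newton update step
        raise RuntimeError("Newton-Raphson failed to converge")

    # Toy 2x2 nonlinear system (illustrative only): x0^2 + x1^2 = 4, x0*x1 = 1.
    res = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 4.0, x[0] * x[1] - 1.0])
    jac = lambda x: np.array([[2 * x[0], 2 * x[1]], [x[1], x[0]]])
    print(newton_raphson(res, jac, [2.0, 0.1]))
    ```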

  4. Conceptual models for waste tank mechanistic analysis

    International Nuclear Information System (INIS)

    Allemann, R.T.; Antoniak, Z.I.; Eyler, L.L.; Liljegren, L.M.; Roberts, J.S.

    1992-02-01

    Pacific Northwest Laboratory (PNL) is conducting a study for Westinghouse Hanford Company (Westinghouse Hanford), a contractor for the US Department of Energy (DOE). The purpose of the work is to study possible mechanisms and fluid dynamics contributing to the periodic release of gases from double-shell waste storage tanks at the Hanford Site in Richland, Washington. This interim report emphasizing the modeling work follows two other interim reports, Mechanistic Analysis of Double-Shell Tank Gas Release Progress Report -- November 1990 and Collection and Analysis of Existing Data for Waste Tank Mechanistic Analysis Progress Report -- December 1990, that emphasized data correlation and mechanisms. The approach in this study has been to assemble and compile data that are pertinent to the mechanisms, analyze the data, evaluate physical properties and parameters, evaluate hypothetical mechanisms, and develop mathematical models of mechanisms

  5. Toward mechanistic classification of enzyme functions.

    Science.gov (United States)

    Almonacid, Daniel E; Babbitt, Patricia C

    2011-06-01

    Classification of enzyme function should be quantitative, computationally accessible, and informed by sequences and structures to enable use of genomic information for functional inference and other applications. Large-scale studies have established that divergently evolved enzymes share conserved elements of structure and common mechanistic steps and that convergently evolved enzymes often converge to similar mechanisms too, suggesting that reaction mechanisms could be used to develop finer-grained functional descriptions than provided by the Enzyme Commission (EC) system currently in use. Here we describe how evolution informs these structure-function mappings and review the databases that store mechanisms of enzyme reactions along with recent developments to measure ligand and mechanistic similarities. Together, these provide a foundation for new classifications of enzyme function. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Supporting Mechanistic Reasoning in Domain-Specific Contexts

    Science.gov (United States)

    Weinberg, Paul J.

    2017-01-01

    Mechanistic reasoning is an epistemic practice central within science, technology, engineering, and mathematics disciplines. Although there has been some work on mechanistic reasoning in the research literature and standards documents, much of this work targets domain-general characterizations of mechanistic reasoning; this study provides…

  7. The Filiation by Assisted Human Reproductions Techniques in the Argentinian Civil and Commercial Code. An Advance that Allows to Harmonize the Rule with the Reality

    Directory of Open Access Journals (Sweden)

    Adriana Noemí Krasnow

    2017-07-01

    Full Text Available This article describes the contributions and changes that the Argentinian Civil and Commercial Code introduces in filiation. The focus of attention shifts to assisted human reproduction techniques in relation to informed consent as an expression of the will to procreate. Moreover, it opens a space for studying two procedures on which the norm remained silent: gestational surrogacy and post mortem fertilization.

  8. A graph model for opportunistic network coding

    KAUST Repository

    Sorour, Sameh; Aboutoraby, Neda; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2015-01-01

    © 2015 IEEE. Recent advancements in graph-based analysis and solutions of instantly decodable network coding (IDNC) trigger the interest to extend them to more complicated opportunistic network coding (ONC) scenarios, with limited increase

  9. Specification of advanced safety modeling requirements (Rev. 0)

    International Nuclear Information System (INIS)

    Fanning, T. H.; Tautges, T. J.

    2008-01-01

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represents an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models will

  10. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  11. Fuel performance analysis code 'FAIR'

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1994-01-01

    For modelling nuclear reactor fuel rod behaviour of water cooled reactors under severe power maneuvering and high burnups, a mechanistic fuel performance analysis code FAIR has been developed. The code incorporates a finite element based thermomechanical module, a physically based fission gas release module and relevant models for fuel related phenomena, such as pellet cracking, densification and swelling, radial flux redistribution across the pellet due to the build up of plutonium near the pellet surface, and pellet clad mechanical interaction/stress corrosion cracking (PCMI/SCC) failure of the sheath. The code follows the established principles of fuel rod analysis programmes, such as coupling of the thermal and mechanical solutions along with the fission gas release calculations, analysing different axial segments of the fuel rod simultaneously, and providing means for performing local analyses such as clad ridging analysis. The modular nature of the code offers flexibility in making modifications to the code easily for modelling MOX fuels and thorium based fuels. For performing analysis of fuel rods subjected to very long power histories within a reasonable amount of time, the code has been parallelised and is commissioned on the ANUPAM parallel processing system developed at Bhabha Atomic Research Centre (BARC). (author). 37 refs

  12. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Litsyn algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.
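
    As a toy illustration of the "local code" idea only (this is neither the GLDPC construction nor the MAP component decoder from the paper), the sketch below uses the parity-check matrix of the Hamming(7,4) code to syndrome-decode a single bit error; in a GLDPC code many such small component codes jointly constrain one long codeword.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column j is the binary
# representation of j+1, so the syndrome directly indexes the error position.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def syndrome_decode(r):
    """Correct a single bit error in a received 7-bit word r."""
    s = H.dot(r) % 2
    pos = s[0] * 4 + s[1] * 2 + s[2]      # syndrome read as a binary number 0..7
    if pos:                                # nonzero syndrome -> flip that bit
        r = r.copy()
        r[pos - 1] ^= 1
    return r

codeword = np.array([1, 0, 1, 1, 0, 1, 0])   # a valid codeword (H @ c = 0 mod 2)
received = codeword.copy()
received[4] ^= 1                              # inject one bit error
print(np.array_equal(syndrome_decode(received), codeword))  # True
```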

  13. Mechanistic studies of carbon monoxide reduction

    Energy Technology Data Exchange (ETDEWEB)

    Geoffroy, G.L.

    1990-06-12

    The progress made during the current grant period (1 January 1988--1 April 1990) in three different areas of research is summarized. The research areas are: (1) oxidatively-induced double carbonylation reactions to form {alpha}-ketoacyl complexes and studies of the reactivity of the resulting compounds, (2) mechanistic studies of the carbonylation of nitroaromatics to form isocyanates, carbamates, and ureas, and (3) studies of the formation and reactivity of unusual metallacycles and alkylidene ligands supported on binuclear iron carbonyl fragments. 18 refs., 5 figs., 1 tab.

  14. Mechanistic facility safety and source term analysis

    International Nuclear Information System (INIS)

    PLYS, M.G.

    1999-01-01

    A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility rooms, flow path geometry, and heat conductors, user-defined non-ideal vapor and aerosol species, pressure- and density-driven gas flows, aerosol transport and deposition, and structure to accommodate facility-specific source terms. Example applications are presented here

  15. Melanie Klein's metapsychology: phenomenological and mechanistic perspective.

    Science.gov (United States)

    Mackay, N

    1981-01-01

    Freud's metapsychology is the subject of an important debate. This is over whether psychoanalysis is best construed as a science of the natural science type or as a special human science. The same debate applies to Melanie Klein's work. In Klein's metapsychology are two different and incompatible models of explanation. One is taken over from Freud's structural theory and appears to be similarly mechanistic. The other is clinically based and phenomenological. These two are discussed with special reference to the concepts of "phantasy" and "internal object".

  16. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  17. Appropriateness of mechanistic and non-mechanistic models for the application of ultrafiltration to mixed waste

    International Nuclear Information System (INIS)

    Foust, Henry; Ghosehajra, Malay

    2007-01-01

    This study asks two questions: (1) How appropriate is the use of a basic filtration equation to the application of ultrafiltration of mixed waste, and (2) How appropriate are non-parametric models for permeate rates (volumes)? To answer these questions, mechanistic and non-mechanistic approaches are developed for permeate rates and volumes associated with an ultrafiltration/mixed waste system in dia-filtration mode. The mechanistic approach is based on a filtration equation which states that t/V vs. V is a linear relationship. The coefficients associated with this linear regression are composed of physical/chemical parameters of the system and are based on the mass balance equation associated with the membrane and the developing cake layer. For several sets of data, a high correlation is shown that supports the assertion that t/V vs. V is a linear relationship. It is also shown that non-mechanistic approaches, i.e., the use of regression models, are not appropriate. One model considered is Q(p) = a*ln(Cb) + b. Regression models are inappropriate because the scale-up from a bench-scale (pilot-scale) study to full scale for permeate rates (volumes) is not simply the ratio of the two membrane surface areas. (authors)
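
    A minimal sketch of the mechanistic fit described above: regress t/V against V and read off the slope and intercept, which in the underlying filtration equation lump the physical/chemical parameters of the membrane and cake layer. The data values here are invented for illustration only.

```python
import numpy as np

# Hypothetical dia-filtration data: elapsed time t (s) and cumulative permeate volume V (L)
t = np.array([60.0, 180.0, 360.0, 600.0, 900.0, 1260.0])
V = np.array([1.0, 2.6, 4.4, 6.1, 7.6, 8.9])

# Classical cake-filtration form: t/V = a*V + b, i.e. t/V is linear in V
y = t / V
slope, intercept = np.polyfit(V, y, 1)

# Goodness of fit supports (or refutes) the mechanistic model for a given data set
r = np.corrcoef(V, y)[0, 1]
print(f"slope a = {slope:.3f}, intercept b = {intercept:.3f}, r^2 = {r**2:.4f}")
```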

  18. Development of design technology on thermal-hydraulic performance in tight-lattice rod bundle. 4. Large paralleled simulation by the advanced two-fluid model code

    International Nuclear Information System (INIS)

    Misawa, Takeharu; Yoshida, Hiroyuki; Akimoto, Hajime

    2008-01-01

    The Japan Atomic Energy Agency (JAEA) has been developing the Innovative Water Reactor for Flexible Fuel Cycle (FLWR). For the thermal design of the FLWR, it is necessary to develop an analytical method to predict its boiling transition. JAEA has been developing the three-dimensional two-fluid model analysis code ACE-3D, which adopts a boundary-fitted coordinate system to simulate flow in complex-shaped channels. In this paper, as part of the development of ACE-3D for application to rod bundle analysis, the introduction of parallelization to ACE-3D and assessments of ACE-3D are shown. In the analysis of a large-scale domain such as a rod bundle, even the two-fluid model requires a computational cost and memory that exceed the capacity of a single CPU. Therefore, parallelization was introduced into ACE-3D to divide the data for a large-scale domain among a large number of CPUs, and it is confirmed that the analysis of a large-scale domain such as a rod bundle can be performed in parallel while maintaining parallel performance even when a large number of CPUs is used. ACE-3D adopts two-phase flow models, some of which are dependent upon channel geometry. Therefore, analyses of domains that simulate an individual subchannel and a 37-rod bundle were performed and compared with experiments. It is confirmed that the results obtained by both analyses with ACE-3D agree qualitatively with past experimental results. (author)

  19. Mechanistic approach to the sodium leakage and fire analysis

    International Nuclear Information System (INIS)

    Yamaguchi, Akira; Muramatsu, Toshiharu; Ohira, Hiroaki; Ida, Masao

    1997-04-01

    In December 1995, a thermocouple well was broken and liquid sodium leaked out of the intermediate heat transport system of the prototype fast breeder reactor Monju. In the initiating process of the incident, liquid sodium flowed out through the hollow thermocouple well, nipple and connector. As a result, liquid sodium, following ignition and combustion, dropped from the connector and collided with the duct and grating placed below. The collision may cause fragmentation and scattering of the sodium droplets, which finally piled up on the floor. This report deals with the development of computer programs for these phenomena based on a mechanistic approach. Numerical analyses are also made of a fundamental sodium leakage and combustion phenomenon, a sodium combustion experiment, and the Monju incident conditions. The contents of this report are listed below: (1) Analysis of the chemical reaction process based on the molecular orbital method, (2) Thermalhydraulic analysis of sodium combustion experiment II performed in 1996 at the O-arai Engineering Center, PNC, (3) Thermalhydraulic analysis of room A-446 of the Monju reactor when the sodium leakage took place, (4) Direct numerical simulation of a sodium droplet, (5) Sodium leakage and scattering analysis using a three-dimensional particle method, (6) Multi-dimensional combustion analysis and a multi-point approximation combustion analysis code. Subsequent to the development work, the programs are to be applied to the safety analysis of the Fast Breeder Reactor. (author)

  20. Causation at Different Levels: Tracking the Commitments of Mechanistic Explanations

    DEFF Research Database (Denmark)

    Fazekas, Peter; Kertész, Gergely

    2011-01-01

    connections transparent. These general commitments get confronted with two claims made by certain proponents of the mechanistic approach: William Bechtel often argues that within the mechanistic framework it is possible to balance between reducing higher levels and maintaining their autonomy at the same time...... their autonomy at the same time than standard reductive accounts are, and that what mechanistic explanations are able to do at best is showing that downward causation does not exist....

  1. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions which use repeaters to compensate for the loss in signal strength on transmission links also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. Hence, from a transmission point of view, digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques and that is often used interchangeably with speech coding is the term voice coding. This term is more generic in the sense that the
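
    As a concrete illustration of the simplest kind of waveform coding mentioned above, the sketch below applies mu-law companding of the sort used in classic 8-bit telephony PCM; it is generic and illustrative, not a description of any specific system discussed in the record.

```python
import numpy as np

MU = 255.0  # mu-law parameter used in 8-bit PCM telephony

def mu_law_encode(x):
    """Compress a signal in [-1, 1] so quiet samples keep more resolution."""
    return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

def mu_law_decode(y):
    """Inverse expansion back to the (approximate) original waveform."""
    return np.sign(y) * np.expm1(np.abs(y) * np.log1p(MU)) / MU

# Quantize the companded value to 8 bits, as a digital channel would carry it
x = 0.01 * np.sin(np.linspace(0, 2 * np.pi, 16))        # a quiet test tone
codes = np.round((mu_law_encode(x) + 1) / 2 * 255)       # 8-bit code words
x_hat = mu_law_decode(codes / 255 * 2 - 1)               # reconstruction
print(np.max(np.abs(x - x_hat)))                         # small residual error
```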

  2. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  3. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is formed by the mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  4. The general theory of convolutional codes

    Science.gov (United States)

    Mceliece, R. J.; Stanley, R. P.

    1993-01-01

    This article presents a self-contained introduction to the algebraic theory of convolutional codes. This introduction is partly a tutorial, but at the same time contains a number of new results which will prove useful for designers of advanced telecommunication systems. Among the new concepts introduced here are the Hilbert series for a convolutional code and the class of compact codes.
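
    A minimal sketch of the kind of object the algebraic theory describes: a rate-1/2 binary convolutional encoder with constraint length 3 and the standard generators 7 and 5 in octal. This is a generic textbook example, not code taken from the article.

```python
def conv_encode(bits, g1=0b111, g2=0b101, k=3):
    """Rate-1/2 convolutional encoder: each input bit yields two output bits,
    computed as mod-2 inner products of the shift register with g1 and g2."""
    state = 0
    out = []
    for b in bits + [0] * (k - 1):          # append zeros to flush the register
        state = ((state << 1) | b) & ((1 << k) - 1)
        out.append(bin(state & g1).count("1") % 2)
        out.append(bin(state & g2).count("1") % 2)
    return out

# Encoding the input 1 0 1 1 gives the familiar sequence 11 10 00 01 01 11
print(conv_encode([1, 0, 1, 1]))
```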

  5. Development of computer code in PNC, 3

    International Nuclear Information System (INIS)

    Ohtaki, Akira; Ohira, Hiroaki

    1990-01-01

    Super-COPD, a code composed of integrated calculation modules, has been developed by improving COPD in order to evaluate various kinds of LMFBR plant dynamics. The code incorporates all the models of COPD, together with their advanced versions, in a modular structure. The code makes it possible to simulate the system dynamics of LMFBR plants of any configuration and set of components. (author)

  6. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    a computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other, is a mix of a computer programming syntax and human language. In this sense queer code can...... be understood as both an object and subject of study that intervenes in the world’s ‘becoming' and how material bodies are produced via human and nonhuman practices. Through mixing the natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  7. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, the governing equations and formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault, their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a User's manual, describing simulation procedures, input data preparation, output and example test cases

  8. Mechanistic Systems Modeling to Improve Understanding and Prediction of Cardiotoxicity Caused by Targeted Cancer Therapeutics

    Directory of Open Access Journals (Sweden)

    Jaehee V. Shim

    2017-09-01

    Full Text Available Tyrosine kinase inhibitors (TKIs are highly potent cancer therapeutics that have been linked with serious cardiotoxicity, including left ventricular dysfunction, heart failure, and QT prolongation. TKI-induced cardiotoxicity is thought to result from interference with tyrosine kinase activity in cardiomyocytes, where these signaling pathways help to control critical processes such as survival signaling, energy homeostasis, and excitation–contraction coupling. However, mechanistic understanding is limited at present due to the complexities of tyrosine kinase signaling, and the wide range of targets inhibited by TKIs. Here, we review the use of TKIs in cancer and the cardiotoxicities that have been reported, discuss potential mechanisms underlying cardiotoxicity, and describe recent progress in achieving a more systematic understanding of cardiotoxicity via the use of mechanistic models. In particular, we argue that future advances are likely to be enabled by studies that combine large-scale experimental measurements with Quantitative Systems Pharmacology (QSP models describing biological mechanisms and dynamics. As such approaches have proven extremely valuable for understanding and predicting other drug toxicities, it is likely that QSP modeling can be successfully applied to cardiotoxicity induced by TKIs. We conclude by discussing a potential strategy for integrating genome-wide expression measurements with models, illustrate initial advances in applying this approach to cardiotoxicity, and describe challenges that must be overcome to truly develop a mechanistic and systematic understanding of cardiotoxicity caused by TKIs.

  9. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  10. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was launched in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator for the research and development environment Digitalisering i Skolen (DiS), Institut for Skole og Læring at Professionshøjskolen Metropol; and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design, design thinking and design pedagogy, at Forskningslab: It og Læringsdesign (ILD-LAB), Institut for kommunikation og psykologi, Aalborg University in Copenhagen. We followed, evaluated and documented the Coding Class project in the period from November 2016 to May 2017...

  11. Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  12. ANIMAL code

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1979-01-01

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and presents its physical model. The temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm. The functions of the algorithm's FORTRAN subroutines and variables are outlined

  13. Network Coding

    Indian Academy of Sciences (India)

    K V Rashmi, Nihar B Shah and P Vijay Kumar. General Article, Resonance – Journal of Science Education, Volume 15, Issue 7, July 2010, pp 604-621. Permanent link: https://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621

  14. Expander Codes

    Indian Academy of Sciences (India)

    Priti Shankar (Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India). Expander Codes - The Sipser-Spielman Construction. General Article, Resonance – Journal of Science Education, Volume 10, Issue 1.

  15. Analysis of the three dimensional core kinetics NESTLE code coupling with the advanced thermo-hydraulic code systems, RELAP5/SCDAPSIM and its application to the Laguna Verde Central reactor; Analisis para el acoplamiento del codigo NESTLE para la cinetica tridimensional del nucleo al codigo avanzado de sistemas termo-hidraulicos, RELAP5/SCDAPSIM y su aplicacion al reactor de la CNLV

    Energy Technology Data Exchange (ETDEWEB)

    Salazar C, J H; Nunez C, A [CNSNS, Dr. Jose Ma. Barragan No. 779, Col. Narvarte, 03020 Mexico D.F. (Mexico); Chavez M, C [UNAM, Facultad de Ingenieria, DEPFI Campus Morelos (Mexico)

    2004-07-01

    The objective of this work is to propose a methodology for coupling the RELAP5/SCDAPSIM and NESTLE codes. The development of this coupling will be carried out within a doctoral program in Energy Engineering (nuclear option) of the Faculty of Engineering of the UNAM, together with the National Commission of Nuclear Safety and Safeguards (CNSNS). The general purpose of this type of development is to have tools, built from multiple programs or codes, with which both the three-dimensional kinetics of the core and the thermal-hydraulic dynamics of the reactor can be simulated. In the past, because of limitations in calculating the complete response of both systems, the models were developed separately, putting much emphasis on one while neglecting the other. These so-called best-estimate methodologies will allow the nuclear industry to evaluate, with a higher level of detail, nuclear power plant designs (modifications to existing plants or new concepts for advanced reactor designs), as well as to analyze transient and accident events, among other applications. The coupled system was applied to design and research studies of the Laguna Verde Nuclear Power Plant (CNLV). (Author)
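
    The kind of coupling proposed above can be sketched schematically as an explicit, time-staggered exchange between a neutron-kinetics solver and a thermal-hydraulics solver: each step, the kinetics module supplies power and the thermal-hydraulics module returns a fuel temperature that closes the loop through Doppler feedback. The toy point-kinetics and single-node heat balance below are placeholders, not NESTLE or RELAP5/SCDAPSIM models, and every numerical value is invented.

```python
# Schematic explicit coupling loop between a kinetics stub and a TH stub.
dt, n_steps = 1.0e-3, 2000
LAMBDA, beta, lam = 1.0e-4, 0.0065, 0.08   # generation time, delayed fraction, decay const.
alpha_doppler = -2.0e-5                    # reactivity per K of fuel temperature rise
rho_ext = 5.0e-4                           # small external reactivity insertion
heat_capacity, cooling, T_coolant = 5.0, 0.01, 560.0

power = 1.0
T_fuel = T_coolant + power / cooling       # steady-state fuel temperature
T_ref = T_fuel
precursors = beta * power / (lam * LAMBDA) # steady-state precursor concentration

for _ in range(n_steps):
    # -- neutron kinetics step (uses the latest fuel temperature) --
    rho = rho_ext + alpha_doppler * (T_fuel - T_ref)
    d_power = ((rho - beta) / LAMBDA * power + lam * precursors) * dt
    precursors += (beta / LAMBDA * power - lam * precursors) * dt
    power += d_power
    # -- thermal-hydraulics step (uses the latest power) --
    T_fuel += (power - cooling * (T_fuel - T_coolant)) / heat_capacity * dt

print(f"relative power = {power:.4f}, fuel temperature = {T_fuel:.2f} K")
```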

  16. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  17. Mechanistic insight into neurotoxicity induced by developmental insults

    International Nuclear Information System (INIS)

    Tamm, Christoffer; Ceccatelli, Sandra

    2017-01-01

    Epidemiological and/or experimental studies have shown that unfavorable prenatal environmental factors, such as stress or exposure to certain neurotoxic environmental contaminants, may have adverse consequences for neurodevelopment. Alterations in neurogenesis can have harmful effects not only for the developing nervous system, but also for the adult brain where neurogenesis is believed to play a role in learning, memory, and even in depression. Many recent advances in the understanding of the complex process of nervous system development can be integrated into the field of neurotoxicology. In the past 15 years we have been using cultured neural stem or progenitor cells to investigate the effects of neurotoxic stimuli on cell survival, proliferation and differentiation, with special focus on heritable effects. This is an overview of the work performed by our group in the attempt to elucidate the mechanisms of developmental neurotoxicity and possibly provide relevant information for the understanding of the etiopathogenesis of complex brain disorders. - Highlights: • The developing nervous system is highly sensitive to toxic insults. • Neural stem cells are relevant models for mechanistic studies as well as for identifying heritable effects due to epigenetic changes. • Depending on the dose, the outcome of exposure to neurotoxicants ranges from altered proliferation and differentiation to cell death. • The elucidation of neurotoxicity mechanisms is relevant for understanding the etiopathogenesis of developmental and adult nervous system disorders.

  18. Enzymatic Halogenation and Dehalogenation Reactions: Pervasive and Mechanistically Diverse.

    Science.gov (United States)

    Agarwal, Vinayak; Miles, Zachary D; Winter, Jaclyn M; Eustáquio, Alessandra S; El Gamal, Abrahim A; Moore, Bradley S

    2017-04-26

    Naturally produced halogenated compounds are ubiquitous across all domains of life where they perform a multitude of biological functions and adopt a diversity of chemical structures. Accordingly, a diverse collection of enzyme catalysts to install and remove halogens from organic scaffolds has evolved in nature. Accounting for the different chemical properties of the four halogen atoms (fluorine, chlorine, bromine, and iodine) and the diversity and chemical reactivity of their organic substrates, enzymes performing biosynthetic and degradative halogenation chemistry utilize numerous mechanistic strategies involving oxidation, reduction, and substitution. Biosynthetic halogenation reactions range from simple aromatic substitutions to stereoselective C-H functionalizations on remote carbon centers and can initiate the formation of simple to complex ring structures. Dehalogenating enzymes, on the other hand, are best known for removing halogen atoms from man-made organohalogens, yet also function naturally, albeit rarely, in metabolic pathways. This review details the scope and mechanism of nature's halogenation and dehalogenation enzymatic strategies, highlights gaps in our understanding, and posits where new advances in the field might arise in the near future.

  19. Mechanistic curiosity will not kill the Bayesian cat

    NARCIS (Netherlands)

    Borsboom, D.; Wagenmakers, E.-J.; Romeijn, J.-W.

    2011-01-01

    Jones & Love (J&L) suggest that Bayesian approaches to the explanation of human behavior should be constrained by mechanistic theories. We argue that their proposal misconstrues the relation between process models, such as the Bayesian model, and mechanisms. While mechanistic theories can answer

  20. Mechanistic curiosity will not kill the Bayesian cat

    NARCIS (Netherlands)

    Borsboom, Denny; Wagenmakers, Eric-Jan; Romeijn, Jan-Willem

    Jones & Love (J&L) suggest that Bayesian approaches to the explanation of human behavior should be constrained by mechanistic theories. We argue that their proposal misconstrues the relation between process models, such as the Bayesian model, and mechanisms. While mechanistic theories can answer

  1. "Ratio via Machina": Three Standards of Mechanistic Explanation in Sociology

    Science.gov (United States)

    Aviles, Natalie B.; Reed, Isaac Ariail

    2017-01-01

    Recently, sociologists have expended much effort in attempts to define social mechanisms. We intervene in these debates by proposing that sociologists in fact have a choice to make between three standards of what constitutes a good mechanistic explanation: substantial, formal, and metaphorical mechanistic explanation. All three standards are…

  2. Multi codes and multi-scale analysis for void fraction prediction in hot channel for VVER-1000/V392

    International Nuclear Information System (INIS)

    Hoang Minh Giang; Hoang Tan Hung; Nguyen Huu Tiep

    2015-01-01

    Recently, an approach using multiple codes and multi-scale analysis has been widely applied to study core thermal-hydraulic behavior such as void fraction prediction. Better results are achieved by using multiple codes or coupled codes such as PARCS and RELAP5. The advantage of multi-scale analysis is zooming in on the part of the simulated domain of interest for detailed investigation. Therefore, in this study, the combination of the MCNP5, RELAP5 and CTF codes, as well as multi-scale analysis based on RELAP5 and CTF, is applied to investigate the void fraction in the hot channel of the VVER-1000/V392 reactor. Since the VVER-1000/V392 is a typical advanced reactor that can be considered the basis for the later development of the VVER-1200, understanding core behavior under transient conditions is necessary in order to investigate VVER technology. It is shown that the near-wall boiling term Γ_w in RELAP5, based on the Lahey mechanistic method, may not predict the void fraction as accurately as the smaller-scale code CTF. (author)

  3. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising such high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes play an important role in exploiting the capabilities of the parallel processing system. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. Codes in the first category, such as those used for harmonic analysis and mechanistic fuel performance codes, do not require parallelisation of their individual modules. The second category of codes, such as conventional FEM codes, requires parallelisation of individual modules. In this category, parallelisation of the equation solution module poses major difficulties. Different solution schemes such as the domain decomposition method (DDM), a parallel active column solver and the substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to these two categories respectively, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab
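
    Of the solution schemes listed above, substructuring is the easiest to show in a few lines: each substructure's interior unknowns are condensed onto its boundary (a Schur complement), the small interface problem is solved, and the interior results can then be recovered independently, which is what makes the scheme attractive on parallel machines. The 1D spring-chain matrices below are a toy example and are not taken from FAIR or TABS.

```python
import numpy as np

def condense(K_sub, f_sub, interior, boundary):
    """Static condensation: eliminate the interior DOFs of one substructure,
    returning its Schur complement and condensed load on the boundary DOFs."""
    Kii = K_sub[np.ix_(interior, interior)]
    Kib = K_sub[np.ix_(interior, boundary)]
    Kbi = K_sub[np.ix_(boundary, interior)]
    Kbb = K_sub[np.ix_(boundary, boundary)]
    S = Kbb - Kbi @ np.linalg.solve(Kii, Kib)            # Schur complement
    g = f_sub[boundary] - Kbi @ np.linalg.solve(Kii, f_sub[interior])
    return S, g

def chain_stiffness(n_springs):
    """Stiffness matrix of a chain of unit springs with n_springs + 1 DOFs."""
    n = n_springs + 1
    K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    K[0, 0] = K[-1, -1] = 1.0
    return K

# Two substructures of a 5-DOF spring chain sharing the interface DOF (global DOF 2).
# Substructure A: global DOFs 0,1,2 with DOF 0 fixed; substructure B: global DOFs 2,3,4.
K_A = chain_stiffness(2)[1:, 1:]     # drop the fixed DOF -> local DOFs are global 1, 2
f_A = np.zeros(2)
K_B = chain_stiffness(2)             # local DOFs are global 2, 3, 4
f_B = np.array([0.0, 0.0, 1.0])      # unit load applied at global DOF 4

# Each substructure is condensed onto the interface DOF (these steps can run in parallel).
S_A, g_A = condense(K_A, f_A, interior=[0], boundary=[1])
S_B, g_B = condense(K_B, f_B, interior=[1, 2], boundary=[0])

u_iface = np.linalg.solve(S_A + S_B, g_A + g_B)   # small interface system
print("interface displacement:", u_iface[0])       # expect 2.0 for this chain
```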

  4. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)
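
    The flavour of such a code can be conveyed with a one-group, one-dimensional slab criticality calculation solved by power iteration; this is a deliberate simplification of PANDA's two-group model, and the cross-section values are invented for illustration.

```python
import numpy as np

# One-group, 1D slab: -D*phi'' + Sig_a*phi = (1/k) * nu*Sig_f * phi, phi = 0 at both edges.
D, sig_a, nu_sig_f = 1.2, 0.030, 0.032    # cm, 1/cm, 1/cm (illustrative values)
width, n = 120.0, 200                      # slab width (cm) and number of interior nodes
h = width / (n + 1)

# Finite-difference loss operator (diffusion + absorption) on the interior nodes
A = (D / h**2) * (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) + sig_a * np.eye(n)

phi, k = np.ones(n), 1.0
for _ in range(200):                       # power iteration on the fission source
    phi_new = np.linalg.solve(A, nu_sig_f * phi / k)
    k *= phi_new.sum() / phi.sum()         # update eigenvalue from the source ratio
    phi = phi_new / np.max(phi_new)        # normalize the flux shape

print(f"k_eff (numerical) = {k:.5f}")
print(f"k_eff (analytic)  = {nu_sig_f / (sig_a + D * (np.pi / width)**2):.5f}")
```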

  5. Recent developments in the CONTAIN-LMR code

    International Nuclear Information System (INIS)

    Murata, K.K.

    1990-01-01

    Through an international collaborative effort, a special version of the CONTAIN code is being developed for integrated mechanistic analysis of the conditions in liquid metal reactor (LMR) containments during severe accidents. The capabilities of the most recent code version, CONTAIN LMR/1B-Mod.1, are discussed. These include new models for the treatment of two condensables, sodium condensation on aerosols, chemical reactions, hygroscopic aerosols, and concrete outgassing. This code version also incorporates all of the previously released LMR model enhancements. The results of an integral demonstration calculation of a severe core-melt accident scenario are given to illustrate the features of this code version. 11 refs., 7 figs., 1 tab

  6. Chirality-Controlled Growth of Single-Wall Carbon Nanotubes Using Vapor Phase Epitaxy: Mechanistic Understanding and Scalable Production

    Science.gov (United States)

    2016-09-15

    AFRL-AFOSR-VA-TR-2016-0319. Final report, Jun 2014 - Jun 2016. Distribution A: Approved for public release; distribution is unlimited. In this report, we present our efforts in establishing a novel and effective approach for chirality

  7. CANAL code

    International Nuclear Information System (INIS)

    Gara, P.; Martin, E.

    1983-01-01

    The CANAL code presented here optimizes a realistic iron free extraction channel which has to provide a given transversal magnetic field law in the median plane: the current bars may be curved, have finite lengths and cooling ducts and move in a restricted transversal area; terminal connectors may be added, images of the bars in pole pieces may be included. A special option optimizes a real set of circular coils [fr

  8. Mechanistic Basis of Cocrystal Dissolution Advantage.

    Science.gov (United States)

    Cao, Fengjuan; Amidon, Gordon L; Rodríguez-Hornedo, Naír; Amidon, Gregory E

    2018-01-01

    Current interest in cocrystal development resides in the advantages that the cocrystal may have in solubility and dissolution compared with the parent drug. This work provides a mechanistic analysis and comparison of the dissolution behavior of carbamazepine (CBZ) and its 2 cocrystals, carbamazepine-saccharin (CBZ-SAC) and carbamazepine-salicylic acid (CBZ-SLC) under the influence of pH and micellar solubilization. A simple mathematical equation is derived based on the mass transport analyses to describe the dissolution advantage of cocrystals. The dissolution advantage is the ratio of the cocrystal flux to drug flux and is defined as the solubility advantage (cocrystal to drug solubility ratio) times the diffusivity advantage (cocrystal to drug diffusivity ratio). In this work, the effective diffusivity of CBZ in the presence of surfactant was determined to be different and less than those of the cocrystals. The higher effective diffusivity of drug from the dissolved cocrystals, the diffusivity advantage, can impart a dissolution advantage to cocrystals with lower solubility than the parent drug while still maintaining thermodynamic stability. Dissolution conditions where cocrystals can display both thermodynamic stability and a dissolution advantage can be obtained from the mass transport models, and this information is useful for both cocrystal selection and formulation development. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
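
    The central relationship stated above can be written down directly: the flux (dissolution) advantage is the solubility advantage multiplied by the diffusivity advantage. The sketch below only restates that ratio; the numerical values are hypothetical and are not measurements from the paper.

```python
def dissolution_advantage(S_cocrystal, S_drug, D_cocrystal, D_drug):
    """Flux ratio J_cocrystal / J_drug = (solubility advantage) * (diffusivity advantage)."""
    return (S_cocrystal / S_drug) * (D_cocrystal / D_drug)

# Hypothetical values (arbitrary solubility units and cm^2/s):
adv = dissolution_advantage(S_cocrystal=0.8, S_drug=1.0, D_cocrystal=7.5e-6, D_drug=5.0e-6)
print(f"dissolution advantage = {adv:.2f}")  # > 1 despite the lower cocrystal solubility
```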

  9. Mechanistic movement models to understand epidemic spread.

    Science.gov (United States)

    Fofana, Abdou Moutalab; Hurford, Amy

    2017-05-05

    An overlooked aspect of disease ecology is considering how and why animals come into contact with one another, resulting in disease transmission. Mathematical models of disease spread frequently assume mass-action transmission, justified by stating that susceptible and infectious hosts mix readily, and forego any detailed description of host movement. Numerous recent studies have recorded, analysed and modelled animal movement. These movement models describe how animals move with respect to resources, conspecifics and previous movement directions and have been used to understand the conditions for the occurrence and the spread of infectious diseases when hosts perform a given type of movement. Here, we summarize the effect of the different types of movement on the threshold conditions for disease spread. We identify gaps in the literature and suggest several promising directions for future research. The mechanistic inclusion of movement in epidemic models may be beneficial for the following two reasons. Firstly, the estimation of the transmission coefficient in an epidemic model is possible because animal movement data can be used to estimate the rate of contacts between conspecifics. Secondly, unsuccessful transmission events, where a susceptible host contacts an infectious host but does not become infected, can be quantified. Following an outbreak, this enables disease ecologists to identify 'near misses' and to explore possible alternative epidemic outcomes given shifts in ecological or immunological parameters. This article is part of the themed issue 'Opening the black box: re-examining the ecology and evolution of parasite transmission'. © 2017 The Author(s).
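
    One way to make the link in the passage concrete: if movement or contact data yield a per-capita contact rate and a per-contact transmission probability, the mass-action transmission coefficient follows as their product, and can be dropped into a standard SIR model. Every numerical value below is hypothetical.

```python
# Transmission coefficient from (hypothetical) movement-derived quantities:
# a per-capita contact rate and a per-contact transmission probability.
contacts_per_day = 4.0     # contacts per individual per day, e.g. estimated from movement data
p_transmission = 0.1       # probability of transmission per contact
N = 1000.0                 # population size
beta = contacts_per_day * p_transmission / N   # mass-action transmission coefficient
gamma = 0.2                                    # recovery rate (1/day)

# Simple forward-Euler integration of the SIR equations
S, I, R = N - 1.0, 1.0, 0.0
dt, days = 0.05, 150
for _ in range(int(days / dt)):
    new_infections = beta * S * I * dt
    new_recoveries = gamma * I * dt
    S, I, R = S - new_infections, I + new_infections - new_recoveries, R + new_recoveries

R0 = contacts_per_day * p_transmission / gamma
print(f"R0 = {R0:.1f}, final epidemic size = {R / N:.1%}")
```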

  10. Refining the accuracy of validated target identification through coding variant fine-mapping in type 2 diabetes

    DEFF Research Database (Denmark)

    Mahajan, Anubha

    2018-01-01

    , compelling evidence for coding variant causality was obtained for only 16 signals. At 13 others, the associated coding variants clearly represent 'false leads' with potential to generate erroneous mechanistic inference. Coding variant associations offer a direct route to biological insight for complex...

  11. Qualification of the core model DYN3D coupled with the code ATHLET as an advanced tool for the accident analysis of VVER type reactors. Pt. 2. Final report

    International Nuclear Information System (INIS)

    Grundmann, U.; Kliem, S.; Rohde, U.

    2002-10-01

    differences in the thermohydraulics were assumed in the difficult modelling of the vertical once-through steam generator with steam superheating. Sensitivity analyses which considered the influence of the nodalisation and the impact of the coolant mixing model were performed for the DYN3D-ATHLET solution of the OECD benchmark. The solution of the benchmarks essentially contributed to the qualification of the code complex DYN3D-ATHLET as an advanced tool for the accident analysis for both VVER type reactors and Western PWRs. (orig.) [de

  12. SARNET, a success story. Survey of major achievements on severe accidents and of knowledge capitalization within the ASTEC code

    International Nuclear Information System (INIS)

    Albiol, T.; Van Dorsselaere, J.P.; Reinke, N.

    2013-01-01

    51 organizations from Europe and Canada cooperated within SARNET (Severe Accident Research Network of Excellence), joining their research capacities in order to resolve the most important pending issues for enhancing, with regard to Severe Accidents (SA), the safety of existing and future Nuclear Power Plants (NPPs). SARNET defines common research programmes and develops common computer codes and methodologies for safety assessment. The ASTEC integral code, jointly developed by IRSN (France) and GRS (Germany) for Light Water Reactor (LWR) source term SA evaluation, Probabilistic Safety Assessment (PSA) level-2 studies and SA management evaluation, is the main integrating component of SARNET. The scientific knowledge generated in the Corium, Source Term and Containment topics has been integrated into the code through improved or new physical models. ASTEC now constitutes the reference European SA integral code. During the four and a half years of SARNET, 30 partners have assessed the successive versions of the ASTEC V1 code through validation. More than 60 scientists have been trained in the use of the code. Validation tasks on about 65 experiments were performed to cover all physical phenomena occurring in a severe accident: circuit thermal-hydraulics, core degradation, fission product (FP) release and transport, Molten-Corium-Concrete-Interaction (MCCI), and, in the containment, thermal-hydraulics as well as aerosol, iodine and hydrogen behaviour. The overall status of validation can be considered as good, with results often close to those of mechanistic codes. Some reach the limits of present knowledge, for instance on MCCI, and, as in most codes, an adequate model for reflooding of a degraded core is still missing. IRSN and GRS are currently preparing the new series of ASTEC V2 versions that will account for most of the needs for evolution expressed by the SARNET partners. The first version, V2.0, planned for March 09, will be applicable to the EPR and will include the ICARE2

  13. Speech coding code- excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides scientific understanding of the most central techniques used in speech coding, both for advanced students and for professionals with a background in speech, audio and/or digital signal processing. It provides a clear connection between the whys, hows and whats, thus enabling a clear view of the necessity, purpose and solutions provided by various tools, as well as their strengths and weaknesses in each respect. Equivalently, this book sheds light on the following perspectives for each technology presented. Objective: What do we want to achieve, and especially why is this goal important? Resource - Information: What information is available and how can it be useful? Resource - Platform: What kind of platforms are we working with and what are their capabilities and restrictions? This includes computational, memory and acoustic properties and the transmission capacity of the devices used. The book goes on to address Solutions: Which solutions have been proposed, and how can they be used to reach the stated goals and ...
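
    The core of linear prediction, which code-excited linear prediction builds on, fits in a few lines: estimate coefficients that predict each sample from its recent past by solving the autocorrelation normal equations. This is a generic sketch with a synthetic test frame, not code from the book.

```python
import numpy as np

def lpc(signal, order=8):
    """Linear-prediction coefficients via the autocorrelation method:
    solve the Yule-Walker normal equations R a = r."""
    x = np.asarray(signal, dtype=float)
    r = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:])     # predictor: x[n] ~ sum_k a[k] * x[n-1-k]

# Synthetic "voiced" frame: a decaying resonance plus a little noise
rng = np.random.default_rng(0)
n = np.arange(400)
frame = np.sin(2 * np.pi * 0.07 * n) * np.exp(-0.004 * n) + 0.01 * rng.standard_normal(400)

a = lpc(frame, order=4)
pred = np.array([np.dot(a, frame[i - 1::-1][:4]) for i in range(4, len(frame))])
residual = frame[4:] - pred
print("prediction gain (dB):", 10 * np.log10(np.var(frame[4:]) / np.var(residual)))
```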

  14. Identifying mechanistic similarities in drug responses

    KAUST Repository

    Zhao, C.

    2012-05-15

    Motivation: In early drug development, it would be beneficial to be able to identify those dynamic patterns of gene response that indicate that drugs targeting a particular gene will be likely or not to elicit the desired response. One approach would be to quantitate the degree of similarity between the responses that cells show when exposed to drugs, so that consistencies in the regulation of cellular response processes that produce success or failure can be more readily identified. Results: We track drug response using fluorescent proteins as transcription activity reporters. Our basic assumption is that drugs inducing very similar alterations in transcriptional regulation will produce similar temporal trajectories on many of the reporter proteins and hence be identified as having similarities in their mechanisms of action (MOA). The main body of this work is devoted to characterizing similarity in temporal trajectories/signals. To do so, we must first identify the key points that determine mechanistic similarity between two drug responses. Directly comparing points on the two signals is unrealistic, as it cannot handle delays and speed variations on the time axis. Hence, to capture the similarities between reporter responses, we develop an alignment algorithm that is robust to noise and time delays and is able to find all the contiguous parts of signals centered about a core alignment (reflecting a core mechanism in drug response). Applying the proposed algorithm to a range of real drug experiments shows that the result agrees well with prior drug MOA knowledge. © The Author 2012. Published by Oxford University Press. All rights reserved.
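
    A generic dynamic-time-warping distance captures the basic idea of comparing reporter time courses while tolerating delays and speed variations on the time axis. It is not the authors' alignment algorithm (which additionally finds contiguous locally aligned segments around a core alignment), just the simplest related construction, applied here to synthetic trajectories.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D response trajectories."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

t = np.linspace(0, 10, 60)
reporter_drug_A = 1.0 / (1.0 + np.exp(-(t - 4.0)))    # induction starting around t = 4
reporter_drug_B = 1.0 / (1.0 + np.exp(-(t - 5.5)))    # same response, delayed in time
reporter_drug_C = np.exp(-0.4 * t)                     # qualitatively different (decay)

print("A vs B:", round(dtw_distance(reporter_drug_A, reporter_drug_B), 3))
print("A vs C:", round(dtw_distance(reporter_drug_A, reporter_drug_C), 3))  # much larger
```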

  15. Mechanistic modeling for mammography screening risks

    International Nuclear Information System (INIS)

    Bijwaard, Harmen

    2008-01-01

    Full text: Western populations show a very high incidence of breast cancer and in many countries mammography screening programs have been set up for the early detection of these cancers. Through these programs large numbers of women (in the Netherlands, 700.000 per year) are exposed to low but not insignificant X-ray doses. ICRP based risk estimates indicate that the number of breast cancer casualties due to mammography screening can be as high as 50 in the Netherlands per year. The number of lives saved is estimated to be much higher, but for an accurate calculation of the benefits of screening a better estimate of these risks is indispensable. Here it is attempted to better quantify the radiological risks of mammography screening through the application of a biologically based model for breast tumor induction by X-rays. The model is applied to data obtained from the National Institutes of Health in the U.S. These concern epidemiological data of female TB patients who received high X-ray breast doses in the period 1930-1950 through frequent fluoroscopy of their lungs. The mechanistic model that is used to describe the increased breast cancer incidence is based on an earlier study by Moolgavkar et al. (1980), in which the natural background incidence of breast cancer was modeled. The model allows for a more sophisticated extrapolation of risks to the low dose X-ray exposures that are common in mammography screening and to the higher ages that are usually involved. Furthermore, it allows for risk transfer to other (non-western) populations. The results have implications for decisions on the frequency of screening, the number of mammograms taken at each screening, minimum and maximum ages for screening and the transfer to digital equipment. (author)

  16. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  17. Preliminary investigation study of code of developed country for developing Korean fuel cycle code

    International Nuclear Information System (INIS)

    Jeong, Chang Joon; Ko, Won Il; Lee, Ho Hee; Cho, Dong Keun; Park, Chang Je

    2012-01-01

    In order to develop a Korean fuel cycle code, analyses have been performed with the fuel cycle codes that are used in advanced countries. Also, recommendations were proposed for future development. The fuel cycle codes are as follows: VISTA, which has been developed by the IAEA; DANESS, developed by ANL and LISTO; and VISION, developed by INL for the Advanced Fuel Cycle Initiative (AFCI) system analysis. Recommendations were proposed for the software, program scheme, material flow model, isotope decay model, environmental impact analysis model, and economics analysis model. These items will be used for the development of the Korean nuclear fuel cycle code in the future

  18. Mechanistic effect modeling for ecological risk assessment: where to go from here?

    Science.gov (United States)

    Grimm, Volker; Martin, Benjamin T

    2013-07-01

    Mechanistic effect models (MEMs) consider the mechanisms of how chemicals affect individuals and ecological systems such as populations and communities. There is an increasing awareness that MEMs have high potential to make risk assessment of chemicals more ecologically relevant than current standard practice. Here we discuss what kinds of MEMs are needed to improve scientific and regulatory aspects of risk assessment. To make valid predictions for a wide range of environmental conditions, MEMs need to include a sufficient amount of emergence, for example, population dynamics emerging from what individual organisms do. We present 1 example where the life cycle of individuals is described using Dynamic Energy Budget theory. The resulting individual-based population model is thus parameterized at the individual level but correctly predicts multiple patterns at the population level. This is the case for both control and treated populations. We conclude that the state-of-the-art in mechanistic effect modeling has reached a level where MEMs are robust and predictive enough to be used in regulatory risk assessment. Mechanistic effect models will thus be used to advance the scientific basis of current standard practice and will, if their development follows Good Modeling Practice, be included in a standardized way in future regulatory risk assessments. Copyright © 2013 SETAC.

  19. A metabonomic approach for mechanistic exploration of pre-clinical toxicology.

    Science.gov (United States)

    Coen, Muireann

    2010-12-30

    Metabonomics involves the application of advanced analytical tools to profile the diverse metabolic complement of a given biofluid or tissue. Subsequent statistical modelling of the complex multivariate spectral profiles enables discrimination between phenotypes of interest and identifies panels of discriminatory metabolites that represent candidate biomarkers. This review article presents an overview of recent developments in the field of metabonomics with a focus on application to pre-clinical toxicology studies. Recent research investigations carried out as part of the international COMET 2 consortium project on the hepatotoxic action of the aminosugar, galactosamine (galN) are presented. The application of advanced, high-field NMR spectroscopy is demonstrated, together with complementary application of a targeted mass spectrometry platform coupled with ultra-performance liquid chromatography. Much novel mechanistic information has been gleaned on both the mechanism of galN hepatotoxicity in multiple biofluids and tissues, and on the protection afforded by co-administration of glycine and uridine. The simultaneous identification of both the metabolic fate of galN and its associated endogenous consequences in spectral profiles is demonstrated. Furthermore, metabonomic assessment of inter-animal variability in response to galN presents enhanced mechanistic insight on variable response phenotypes and is relevant to understanding wider aspects of individual variability in drug response. This exemplar highlights the analytical and statistical tools commonly applied in metabonomic studies and notably, the approach is applicable to the study of any toxin/drug or intervention of interest. The metabonomic approach holds considerable promise and potential to significantly advance our understanding of the mechanistic bases for adverse drug reactions. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  20. Modern Cryptanalysis Techniques for Advanced Code Breaking

    CERN Document Server

    Swenson, Christopher

    2008-01-01

    As an instructor at the University of Tulsa, Christopher Swenson could find no relevant text for teaching modern cryptanalysis, so he wrote his own. This is the first book that brings the study of cryptanalysis into the 21st century. Swenson provides a foundation in traditional cryptanalysis, examines ciphers based on number theory, explores block ciphers, and teaches the basis of all modern cryptanalysis: linear and differential cryptanalysis. This time-honored weapon of warfare has become a key piece of artillery in the battle for information security.

  1. Advanced Chemical Propulsion Study

    Science.gov (United States)

    Woodcock, Gordon; Byers, Dave; Alexander, Leslie A.; Krebsbach, Al

    2004-01-01

    A study was performed of advanced chemical propulsion technology application to space science (Code S) missions. The purpose was to begin the process of selecting chemical propulsion technology advancement activities that would provide greatest benefits to Code S missions. Several missions were selected from Code S planning data, and a range of advanced chemical propulsion options was analyzed to assess capabilities and benefits with respect to these missions. Selected beneficial applications were found for higher-performing bipropellants, gelled propellants, and cryogenic propellants. Technology advancement recommendations included cryocoolers and small turbopump engines for cryogenic propellants; space-storable propellants such as LOX-hydrazine; and advanced monopropellants. It was noted that fluorine-bearing oxidizers offer performance gains over more benign oxidizers. Potential benefits were observed for gelled propellants that could be allowed to freeze, then thawed for use.

  2. Overview of the South African mechanistic pavement design analysis method

    CSIR Research Space (South Africa)

    Theyse, HL

    1996-01-01

    Full Text Available A historical overview of the South African mechanistic pavement design method, from its development in the early 1970s to the present, is presented. Material characterization, structural analysis, and pavement life prediction are discussed...

  3. Mechanistic kinetic models of enzymatic cellulose hydrolysis-A review.

    Science.gov (United States)

    Jeoh, Tina; Cardona, Maria J; Karuna, Nardrapee; Mudinoor, Akshata R; Nill, Jennifer

    2017-07-01

    Bioconversion of lignocellulose forms the basis for renewable, advanced biofuels, and bioproducts. Mechanisms of hydrolysis of cellulose by cellulases have been actively studied for nearly 70 years with significant gains in understanding of the cellulolytic enzymes. Yet, a full mechanistic understanding of the hydrolysis reaction has been elusive. We present a review to highlight new insights gained since the most recent comprehensive review of cellulose hydrolysis kinetic models by Bansal et al. (2009) Biotechnol Adv 27:833-848. Recent models have taken a two-pronged approach to tackle the challenge of modeling the complex heterogeneous reaction: an enzyme-centric modeling approach centered on the molecularity of the cellulase-cellulose interactions to examine rate-limiting elementary steps, and a substrate-centric modeling approach aimed at capturing the limiting property of the insoluble cellulose substrate. Collectively, modeling results suggest that at the molecular scale, how rapidly cellulases can bind productively (complexation) and release from cellulose (decomplexation) is limiting, while the overall hydrolysis rate is largely insensitive to the catalytic rate constant. The surface area of the insoluble substrate and the degrees of polymerization of the cellulose molecules in the reaction both limit initial hydrolysis rates only. Neither enzyme-centric models nor substrate-centric models can consistently capture hydrolysis time course at extended reaction times. Thus, questions of the true reaction limiting factors at extended reaction times and the role of complexation and decomplexation in rate limitation remain unresolved. Biotechnol. Bioeng. 2017;114: 1369-1385. © 2017 Wiley Periodicals, Inc.
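
    To make the enzyme-centric picture above concrete, the following is a minimal, hypothetical rate-equation sketch (not one of the reviewed models): free cellulase complexes with accessible surface sites, decomplexes, and converts bound sites to soluble sugar, so the simulated time course is governed mainly by the complexation and decomplexation constants rather than the catalytic constant. All parameter values are placeholders; Python with SciPy is assumed.

      from scipy.integrate import solve_ivp

      def cellulase_odes(t, y, k_on, k_off, k_cat):
          # y = [E, S, ES, P]: free enzyme, accessible surface sites,
          # productively complexed enzyme, soluble sugar product.
          E, S, ES, P = y
          complexation = k_on * E * S
          decomplexation = k_off * ES
          hydrolysis = k_cat * ES        # bound site converted to product, enzyme released
          return [-complexation + decomplexation + hydrolysis,   # dE/dt
                  -complexation + decomplexation,                # dS/dt
                  complexation - decomplexation - hydrolysis,    # dES/dt
                  hydrolysis]                                    # dP/dt

      # Placeholder parameters and initial state (arbitrary units).
      sol = solve_ivp(cellulase_odes, (0.0, 100.0), [1.0, 10.0, 0.0, 0.0],
                      args=(0.5, 0.05, 2.0))
      print(sol.y[3, -1])   # soluble sugar released by the end of the run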

  4. Mechanistic Links Between PARP, NAD, and Brain Inflammation After TBI

    Science.gov (United States)

    2015-10-01

    Award Number: W81XWH-13-2-0091. Title: Mechanistic Links Between PARP, NAD, and Brain Inflammation After TBI. Reporting period: 25 Sep 2014 - 24 Sep 2015. The project assesses the efficacy of veliparib and NAD as agents for suppressing inflammation and improving outcomes after traumatic brain injury. The animal models include

  5. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2012-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  6. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2011-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  7. Trellis and turbo coding iterative and graph-based error control coding

    CERN Document Server

    Schlegel, Christian B

    2015-01-01

    This new edition has been extensively revised to reflect the progress in error control coding over the past few years. Over 60% of the material has been completely reworked, and 30% of the material is original. It covers convolutional, turbo, low-density parity-check (LDPC), and polar codes in a unified framework, includes advanced research-related developments such as spatial coupling, and focuses on algorithmic and implementation aspects of error control coding.

  8. A novel method of generating and remembering international morse codes

    Digital Repository Service at National Institute of Oceanography (India)

    Charyulu, R.J.K.

    Although untethered communications have advanced, the S.O.S. International Morse Code will remain a rescue and emergency tool when all other modes fail. The details of the method and the actual codes have been enumerated...

  9. Multimedia signal coding and transmission

    CERN Document Server

    Ohm, Jens-Rainer

    2015-01-01

    This textbook covers the theoretical background of one- and multidimensional signal processing, statistical analysis and modelling, coding and information theory with regard to the principles and design of image, video and audio compression systems. The theoretical concepts are augmented by practical examples of algorithms for multimedia signal coding technology, and related transmission aspects. On this basis, principles behind multimedia coding standards, including most recent developments like High Efficiency Video Coding, can be well understood. Furthermore, potential advances in future development are pointed out. Numerous figures and examples help to illustrate the concepts covered. The book was developed on the basis of a graduate-level university course, and most chapters are supplemented by exercises. The book is also a self-contained introduction both for researchers and developers of multimedia compression systems in industry.

  10. NESTLE: A nodal kinetics code

    International Nuclear Information System (INIS)

    Al-Chalabi, R.M.; Turinsky, P.J.; Faure, F.-X.; Sarsour, H.N.; Engrand, P.R.

    1993-01-01

    The NESTLE nodal kinetics code has been developed for utilization as a stand-alone code for steady-state and transient reactor neutronic analysis and for incorporation into system transient codes, such as TRAC and RELAP. The latter is desirable to increase the simulation fidelity over that obtained from currently employed zero- and one-dimensional neutronic models and now feasible due to advances in computer performance and efficiency of nodal methods. As a stand-alone code, requirements are that it operate on a range of computing platforms from memory-limited personal computers (PCs) to supercomputers with vector processors. This paper summarizes the features of NESTLE that reflect the utilization and requirements just noted

  11. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one that were simultaneously displayed on the screen. According to the first number of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by this same program, and incorporation of this program into another data processing program was possible. This program had the merits of simple operation, accurate and detailed coding, and easy adjustment for another program. Therefore, this program can be used for automation of routine work in the department of radiology
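
    The two-step lookup described above can be illustrated with a small, entirely hypothetical Python sketch: an organ code is selected first, its first digit picks the pathology dictionary, and the two parts are joined as 'organ.pathology' (the record's example result is '131.3661'). The table contents and names below are invented for illustration and are not the actual ACR dictionary files.

      # Hypothetical ACR-style lookup tables (contents invented for illustration).
      ORGAN_CODES = {"example organ": "131"}
      PATHOLOGY_FILES = {"1": {"example pathology": "3661"}}   # keyed by first digit of the organ code

      def acr_code(organ_name, pathology_name):
          organ = ORGAN_CODES[organ_name]
          pathology_file = PATHOLOGY_FILES[organ[0]]   # first digit selects the pathology file
          return f"{organ}.{pathology_file[pathology_name]}"

      print(acr_code("example organ", "example pathology"))   # -> 131.3661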

  12. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  13. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
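
    For readers unfamiliar with the underlying construction, the sketch below shows static Shannon coding: with symbols sorted by decreasing probability, each symbol receives the first ceil(-log2 p) bits of the binary expansion of the cumulative probability of the more-probable symbols. The dynamic, length-restricted, alphabetic, and unequal-letter-cost variants discussed in the record are not reproduced here; Python is assumed.

      import math

      def shannon_code(probs):
          # Static Shannon coding: sort by decreasing probability, give each symbol
          # the first ceil(-log2 p) bits of the binary expansion of the cumulative
          # probability of all more-probable symbols.
          items = sorted(probs.items(), key=lambda kv: -kv[1])
          code, cum = {}, 0.0
          for sym, p in items:
              length = math.ceil(-math.log2(p))
              frac, bits = cum, []
              for _ in range(length):
                  frac *= 2
                  bit = int(frac)
                  bits.append(str(bit))
                  frac -= bit
              code[sym] = "".join(bits)
              cum += p
          return code

      print(shannon_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
      # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}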

  14. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual

  15. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for code theory than codes over classical finite fields.

  16. The ZPIC educational code suite

    Science.gov (United States)

    Calado, R.; Pardal, M.; Ninhos, P.; Helm, A.; Mori, W. B.; Decyk, V. K.; Vieira, J.; Silva, L. O.; Fonseca, R. A.

    2017-10-01

    Particle-in-Cell (PIC) codes are used in almost all areas of plasma physics, such as fusion energy research, plasma accelerators, space physics, ion propulsion, and plasma processing, and many other areas. In this work, we present the ZPIC educational code suite, a new initiative to foster training in plasma physics using computer simulations. Leveraging on our expertise and experience from the development and use of the OSIRIS PIC code, we have developed a suite of 1D/2D fully relativistic electromagnetic PIC codes, as well as 1D electrostatic. These codes are self-contained and require only a standard laptop/desktop computer with a C compiler to be run. The output files are written in a new file format called ZDF that can be easily read using the supplied routines in a number of languages, such as Python, and IDL. The code suite also includes a number of example problems that can be used to illustrate several textbook and advanced plasma mechanisms, including instructions for parameter space exploration. We also invite contributions to this repository of test problems that will be made freely available to the community provided the input files comply with the format defined by the ZPIC team. The code suite is freely available and hosted on GitHub at https://github.com/zambzamb/zpic. Work partially supported by PICKSC.

  17. Analysis of selected Halden overpressure tests using the FALCON code

    Energy Technology Data Exchange (ETDEWEB)

    Khvostov, G., E-mail: grigori.khvostov@psi.ch [Paul Scherrer Institut, CH 5232 Villigen PSI (Switzerland); Wiesenack, W. [Institute for Energy Technology – OECD Halden Reactor Project, P.O. Box 173, N-1751 Halden (Norway)

    2016-12-15

    Highlights: • We analyse four Halden overpressure tests. • We determine a critical overpressure value for lift-off in a BWR fuel sample. • We show the role of bonding in over-pressurized rod behaviour. • We analytically quantify the degree of bonding via its impact on cladding elongation. • We hypothesize on an effect of circumferential cracks on thermal fuel response to overpressure. • We estimate a thermal effect of circumferential cracks based on interpretation of the data. - Abstract: Four Halden overpressure (lift-off) tests using samples with uranium dioxide fuel pre-irradiated in power reactors to a burnup of 60 MWd/kgU are analyzed. The FALCON code, coupled to a mechanistic model, GRSW-A, for fission gas release and gaseous-bubble swelling, is used for the calculation. The advanced version of the FALCON code is shown to be applicable to best-estimate predictive analysis of overpressure tests using rods with weak or no pellet-cladding bonding, as well as scoping analysis of tests with fuels where stronger pellet-cladding bonding occurs. Significant effects of bonding and fuel cracking/relocation on the thermal and mechanical behaviour of highly over-pressurized rods are shown. The effect of bonding is particularly pronounced in the tests with the PWR samples. The present findings are basically consistent with an earlier analysis based on a direct interpretation of the experimental data. Additionally, in this paper, the specific effects are quantified based on the comparison of the data with the results of calculation. It is concluded that the identified effects are largely beyond the current traditional fuel-rod licensing analysis methods.

  18. Incorporation of lysosomal sequestration in the mechanistic model for prediction of tissue distribution of basic drugs.

    Science.gov (United States)

    Assmus, Frauke; Houston, J Brian; Galetin, Aleksandra

    2017-11-15

    cell types. Despite this extensive lysosomal sequestration in the individual cell types, the maximal change in the overall predicted tissue Kpu was modest. Model input parameters, in particular lysosomal pH and the fraction of the cellular volume occupied by the lysosomes, only partially explained discrepancies between observed and predicted Kpu data in the lung. Improved understanding of the system properties, e.g., cell/organelle composition, is required to support further development of mechanistic equations for the prediction of drug tissue distribution. Application of this revised mechanistic model is recommended for prediction of Kpu in lysosome-rich tissue to facilitate the advancement of physiologically-based prediction of volume of distribution and drug exposure in the tissues. Copyright © 2017 Elsevier B.V. All rights reserved.
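
    A common building block in this kind of model is pH-partitioning (ion trapping) of a monoprotic base between the acidic lysosome and the cytosol. The short sketch below shows only that generic relationship, with placeholder pH and pKa values, and is not the authors' full tissue-composition model.

      def lysosome_to_cytosol_ratio(pKa, pH_lysosome=5.0, pH_cytosol=7.2):
          # Ion-trapping ratio for a monoprotic base, assuming only the neutral
          # species permeates the lysosomal membrane and reaches equilibrium
          # (Henderson-Hasselbalch ionisation on each side of the membrane).
          ionised_plus_neutral = lambda pH: 1.0 + 10.0 ** (pKa - pH)
          return ionised_plus_neutral(pH_lysosome) / ionised_plus_neutral(pH_cytosol)

      print(lysosome_to_cytosol_ratio(pKa=9.0))   # a basic drug accumulates strongly in this picture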

  19. Modeling and validation of a mechanistic tool (MEFISTO) for the prediction of critical power in BWR fuel assemblies

    International Nuclear Information System (INIS)

    Adamsson, Carl; Le Corre, Jean-Marie

    2011-01-01

    Highlights: → The MEFISTO code efficiently and accurately predicts the dryout event in a BWR fuel bundle, using a mechanistic model. → A hybrid approach between a fast and robust sub-channel analysis and a three-field two-phase analysis is adopted. → MEFISTO modeling approach, calibration, CPU usage, sensitivity, trend analysis and performance evaluation are presented. → The calibration parameters and process were carefully selected to preserve the mechanistic nature of the code. → The code dryout prediction performance is near the level of fuel-specific empirical dryout correlations. - Abstract: Westinghouse is currently developing the MEFISTO code with the main goal to achieve fast, robust, practical and reliable prediction of steady-state dryout Critical Power in Boiling Water Reactor (BWR) fuel bundle based on a mechanistic approach. A computationally efficient simulation scheme was used to achieve this goal, where the code resolves all relevant field (drop, steam and multi-film) mass balance equations, within the annular flow region, at the sub-channel level while relying on a fast and robust two-phase (liquid/steam) sub-channel solution to provide the cross-flow information. The MEFISTO code can hence provide highly detailed solution of the multi-film flow in BWR fuel bundle while enhancing flexibility and reducing the computer time by an order of magnitude as compared to a standard three-field sub-channel analysis approach. Models for the numerical computation of the one-dimensional field flowrate distributions in an open channel (e.g. a sub-channel), including the numerical treatment of field cross-flows, part-length rods, spacers grids and post-dryout conditions are presented in this paper. The MEFISTO code is then applied to dryout prediction in BWR fuel bundle using VIPRE-W as a fast and robust two-phase sub-channel driver code. The dryout power is numerically predicted by iterating on the bundle power so that the minimum film flowrate in the

  20. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represent an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models

  1. Predicting interactions from mechanistic information: Can omic data validate theories?

    International Nuclear Information System (INIS)

    Borgert, Christopher J.

    2007-01-01

    To address the most pressing and relevant issues for improving mixture risk assessment, researchers must first recognize that risk assessment is driven by both regulatory requirements and scientific research, and that regulatory concerns may expand beyond the purely scientific interests of researchers. Concepts of 'mode of action' and 'mechanism of action' are used in particular ways within the regulatory arena, depending on the specific assessment goals. The data requirements for delineating a mode of action and predicting interactive toxicity in mixtures are not well defined from a scientific standpoint due largely to inherent difficulties in testing certain underlying assumptions. Understanding the regulatory perspective on mechanistic concepts will be important for designing experiments that can be interpreted clearly and applied in risk assessments without undue reliance on extrapolation and assumption. In like fashion, regulators and risk assessors can be better equipped to apply mechanistic data if the concepts underlying mechanistic research and the limitations that must be placed on interpretation of mechanistic data are understood. This will be critically important for applying new technologies to risk assessment, such as functional genomics, proteomics, and metabolomics. It will be essential not only for risk assessors to become conversant with the language and concepts of mechanistic research, including new omic technologies, but also, for researchers to become more intimately familiar with the challenges and needs of risk assessment

  2. Explanation and inference: Mechanistic and functional explanations guide property generalization

    Directory of Open Access Journals (Sweden)

    Tania eLombrozo

    2014-09-01

    Full Text Available The ability to generalize from the known to the unknown is central to learning and inference. Two experiments explore the relationship between how a property is explained and how that property is generalized to novel species and artifacts. The experiments contrast the consequences of explaining a property mechanistically, by appeal to parts and processes, with the consequences of explaining the property functionally, by appeal to functions and goals. The findings suggest that properties that are explained functionally are more likely to be generalized on the basis of shared functions, with a weaker relationship between mechanistic explanations and generalization on the basis of shared parts and processes. The influence of explanation type on generalization holds even though all participants are provided with the same mechanistic and functional information, and whether an explanation type is freely generated (Experiment 1), experimentally provided (Experiment 2), or experimentally induced (Experiment 2). The experiments also demonstrate that explanations and generalizations of a particular type (mechanistic or functional) can be experimentally induced by providing sample explanations of that type, with a comparable effect when the sample explanations come from the same domain or from a different domain. These results suggest that explanations serve as a guide to generalization, and contribute to a growing body of work supporting the value of distinguishing mechanistic and functional explanations.

  3. Explanation and inference: mechanistic and functional explanations guide property generalization.

    Science.gov (United States)

    Lombrozo, Tania; Gwynne, Nicholas Z

    2014-01-01

    The ability to generalize from the known to the unknown is central to learning and inference. Two experiments explore the relationship between how a property is explained and how that property is generalized to novel species and artifacts. The experiments contrast the consequences of explaining a property mechanistically, by appeal to parts and processes, with the consequences of explaining the property functionally, by appeal to functions and goals. The findings suggest that properties that are explained functionally are more likely to be generalized on the basis of shared functions, with a weaker relationship between mechanistic explanations and generalization on the basis of shared parts and processes. The influence of explanation type on generalization holds even though all participants are provided with the same mechanistic and functional information, and whether an explanation type is freely generated (Experiment 1), experimentally provided (Experiment 2), or experimentally induced (Experiment 2). The experiments also demonstrate that explanations and generalizations of a particular type (mechanistic or functional) can be experimentally induced by providing sample explanations of that type, with a comparable effect when the sample explanations come from the same domain or from a different domain. These results suggest that explanations serve as a guide to generalization, and contribute to a growing body of work supporting the value of distinguishing mechanistic and functional explanations.

  4. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...

  5. Managing mechanistic and organic structure in health care organizations.

    Science.gov (United States)

    Olden, Peter C

    2012-01-01

    Managers at all levels in a health care organization must organize work to achieve the organization's mission and goals. This requires managers to decide the organization structure, which involves dividing the work among jobs and departments and then coordinating them all toward the common purpose. Organization structure, which is reflected in an organization chart, may range on a continuum from very mechanistic to very organic. Managers must decide how mechanistic versus how organic to make the entire organization and each of its departments. To do this, managers should carefully consider 5 factors for the organization and for each individual department: external environment, goals, work production, size, and culture. Some factors may push toward more mechanistic structure, whereas others may push in the opposite direction toward more organic structure. Practical advice can help managers at all levels design appropriate structure for their departments and organization.

  6. Why did Jacques Monod make the choice of mechanistic determinism?

    Science.gov (United States)

    Loison, Laurent

    2015-06-01

    The development of molecular biology placed in the foreground a mechanistic and deterministic conception of the functioning of macromolecules. In this article, I show that this conception was neither obvious, nor necessary. Taking Jacques Monod as a case study, I detail the way he gradually came loose from a statistical understanding of determinism to finally support a mechanistic understanding. The reasons of the choice made by Monod at the beginning of the 1950s can be understood only in the light of the general theoretical schema supported by the concept of mechanistic determinism. This schema articulates three fundamental notions for Monod, namely that of the rigidity of the sequence of the genetic program, that of the intrinsic stability of macromolecules (DNA and proteins), and that of the specificity of molecular interactions. Copyright © 2015 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  7. Cognitive science as an interface between rational and mechanistic explanation.

    Science.gov (United States)

    Chater, Nick

    2014-04-01

    Cognitive science views thought as computation; and computation, by its very nature, can be understood in both rational and mechanistic terms. In rational terms, a computation solves some information processing problem (e.g., mapping sensory information into a description of the external world; parsing a sentence; selecting among a set of possible actions). In mechanistic terms, a computation corresponds to a causal chain of events in a physical device (in an engineering context, a silicon chip; in a biological context, the nervous system). The discipline is thus at the interface between two very different styles of explanation: as the papers in the current special issue well illustrate, it explores the interplay of rational and mechanistic forces. Copyright © 2014 Cognitive Science Society, Inc.

  8. Coding for effective denial management.

    Science.gov (United States)

    Miller, Jackie; Lineberry, Joe

    2004-01-01

    Nearly everyone will agree that accurate and consistent coding of diagnoses and procedures is the cornerstone for operating a compliant practice. The CPT or HCPCS procedure code tells the payor what service was performed and also (in most cases) determines the amount of payment. The ICD-9-CM diagnosis code, on the other hand, tells the payor why the service was performed. If the diagnosis code does not meet the payor's criteria for medical necessity, all payment for the service will be denied. Implementation of an effective denial management program can help "stop the bleeding." Denial management is a comprehensive process that works in two ways. First, it evaluates the cause of denials and takes steps to prevent them. Second, denial management creates specific procedures for refiling or appealing claims that are initially denied. Accurate, consistent and compliant coding is key to both of these functions. The process of proactively managing claim denials also reveals a practice's administrative strengths and weaknesses, enabling radiology business managers to streamline processes, eliminate duplicated efforts and shift a larger proportion of the staff's focus from paperwork to servicing patients--all of which are sure to enhance operations and improve practice management and office morale. Accurate coding requires a program of ongoing training and education in both CPT and ICD-9-CM coding. Radiology business managers must make education a top priority for their coding staff. Front office staff, technologists and radiologists should also be familiar with the types of information needed for accurate coding. A good staff training program will also cover the proper use of Advance Beneficiary Notices (ABNs). Registration and coding staff should understand how to determine whether the patient's clinical history meets criteria for Medicare coverage, and how to administer an ABN if the exam is likely to be denied. Staff should also understand the restrictions on use of

  9. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: • We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. • We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. • We find and classify all 2D homological stabilizer codes. • We find optimal codes among the homological stabilizer codes.
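
    As a concrete reminder of how such codes are "defined solely by the graphs they reside on", the sketch below lists the qubit (edge) indices entering each star (vertex, X-type) and plaquette (face, Z-type) stabilizer generator of Kitaev's toric code on an L x L periodic square lattice. This is the standard textbook construction, not code from the paper; Python is assumed.

      def toric_code_stabilizers(L):
          # Qubits live on the edges of an L x L periodic square lattice:
          # horizontal edge at (x, y) -> index x + L*y, vertical edge -> L*L + x + L*y.
          h = lambda x, y: (x % L) + L * (y % L)
          v = lambda x, y: L * L + (x % L) + L * (y % L)
          stars, plaquettes = [], []
          for y in range(L):
              for x in range(L):
                  stars.append([h(x, y), h(x - 1, y), v(x, y), v(x, y - 1)])       # X on edges meeting at vertex (x, y)
                  plaquettes.append([h(x, y), h(x, y + 1), v(x, y), v(x + 1, y)])  # Z on edges around face (x, y)
          return stars, plaquettes

      stars, plaquettes = toric_code_stabilizers(3)
      print(len(stars), len(plaquettes))   # 9 and 9 generators on 18 qubits (one of each type is redundant)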

  10. Nutraceuticals against Neurodegeneration: A Mechanistic Insight.

    Science.gov (United States)

    Dadhania, Vivekkumar P; Trivedi, Priyanka P; Vikram, Ajit; Tripathi, Durga Nand

    2016-01-01

    The mechanisms underlying neurodegenerative disorders are complex and multifactorial; however, accumulating evidences suggest few common shared pathways. These common pathways include mitochondrial dysfunction, intracellular Ca2+ overload, oxidative stress and inflammation. Often multiple pathways co-exist, and therefore limit the benefits of therapeutic interventions. Nutraceuticals have recently gained importance owing to their multifaceted effects. These food-based approaches are believed to target multiple pathways in a slow but more physiological manner without causing severe adverse effects. Available information strongly supports the notion that apart from preventing the onset of neuronal damage, nutraceuticals can potentially attenuate the continued progression of neuronal destruction. In this article, we i) review the common pathways involved in the pathogenesis of the toxicants-induced neurotoxicity and neurodegenerative disorders with special emphasis on Alzheimer's disease (AD), Parkinson's disease (PD), Huntington's disease (HD), Multiple sclerosis (MS) and Amyotrophic lateral sclerosis (ALS), and ii) summarize current research advancements on the effects of nutraceuticals against these detrimental pathways.

  11. Diagnostic Coding for Epilepsy.

    Science.gov (United States)

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  12. Coding of Neuroinfectious Diseases.

    Science.gov (United States)

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  13. Profiling the biological activity of oxide nanomaterials with mechanistic models

    NARCIS (Netherlands)

    Burello, E.

    2013-01-01

    In this study we present three mechanistic models for profiling the potential biological and toxicological effects of oxide nanomaterials. The models attempt to describe the reactivity, protein adsorption and membrane adhesion processes of a large range of oxide materials and are based on properties

  14. Bridging Mechanistic and Phenomenological Models of Complex Biological Systems.

    Science.gov (United States)

    Transtrum, Mark K; Qiu, Peng

    2016-05-01

    The inherent complexity of biological systems gives rise to complicated mechanistic models with a large number of parameters. On the other hand, the collective behavior of these systems can often be characterized by a relatively small number of phenomenological parameters. We use the Manifold Boundary Approximation Method (MBAM) as a tool for deriving simple phenomenological models from complicated mechanistic models. The resulting models are not black boxes, but remain expressed in terms of the microscopic parameters. In this way, we explicitly connect the macroscopic and microscopic descriptions, characterize the equivalence class of distinct systems exhibiting the same range of collective behavior, and identify the combinations of components that function as tunable control knobs for the behavior. We demonstrate the procedure for adaptation behavior exhibited by the EGFR pathway. From a 48 parameter mechanistic model, the system can be effectively described by a single adaptation parameter τ characterizing the ratio of time scales for the initial response and recovery time of the system which can in turn be expressed as a combination of microscopic reaction rates, Michaelis-Menten constants, and biochemical concentrations. The situation is not unlike modeling in physics in which microscopically complex processes can often be renormalized into simple phenomenological models with only a few effective parameters. The proposed method additionally provides a mechanistic explanation for non-universal features of the behavior.

  15. Descriptive and mechanistic models of crop–weed competition

    NARCIS (Netherlands)

    Bastiaans, L.; Storkey, J.

    2017-01-01

    Crop-weed competitive relations are an important element of agroecosystems. Quantifying and understanding them helps to design appropriate weed management at operational, tactical and strategic level. This chapter presents and discusses simple descriptive and more mechanistic models for crop-weed

  16. A mechanistic model on methane oxidation in the rice rhizosphere

    NARCIS (Netherlands)

    Bodegom, van P.M.; Leffelaar, P.A.; Goudriaan, J.

    2001-01-01

    A mechanistic model is presented on the processes leading to methane oxidation in rice rhizosphere. The model is driven by oxygen release from a rice root into anaerobic rice soil. Oxygen is consumed by heterotrophic and methanotrophic respiration, described by double Monod kinetics, and by iron
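
    The double Monod kinetics referred to above couples the two substrates multiplicatively, so oxidation only proceeds where both methane and oxygen are available. A minimal sketch with placeholder rate parameters (not the paper's calibrated values) is:

      def methane_oxidation_rate(ch4, o2, v_max=1.0, k_ch4=0.01, k_o2=0.02):
          # Double Monod rate law: Michaelis-Menten-type saturation in both
          # methane and oxygen (concentrations and constants in consistent units).
          return v_max * (ch4 / (k_ch4 + ch4)) * (o2 / (k_o2 + o2))

      print(methane_oxidation_rate(ch4=0.05, o2=0.005))   # here the rate is co-limited mainly by oxygen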

  17. Precision and accuracy of mechanistic-empirical pavement design

    CSIR Research Space (South Africa)

    Theyse, HL

    2006-09-01

    Full Text Available are discussed in general. The effects of variability and error on the design accuracy and design risk are lastly illustrated by means of a simple mechanistic-empirical design problem, showing that the engineering models alone determine the accuracy...

  18. The mechanistic bases of the power-time relationship

    DEFF Research Database (Denmark)

    Vanhatalo, Anni; Black, Matthew I; DiMenna, Fred J

    2016-01-01

    .025) and inversely correlated with muscle type IIx fibre proportion (r = -0.76, P = 0.01). There was no relationship between W' (19.4 ± 6.3 kJ) and muscle fibre type. These data indicate a mechanistic link between the bioenergetic characteristics of different muscle fibre types and the power-duration relationship...

  19. Mathematical Description and Mechanistic Reasoning: A Pathway toward STEM Integration

    Science.gov (United States)

    Weinberg, Paul J.

    2017-01-01

    Because reasoning about mechanism is critical to disciplined inquiry in science, technology, engineering, and mathematics (STEM) domains, this study focuses on ways to support the development of this form of reasoning. This study attends to how mechanistic reasoning is constituted through mathematical description. This study draws upon Smith's…

  20. Generative Mechanistic Explanation Building in Undergraduate Molecular and Cellular Biology

    Science.gov (United States)

    Southard, Katelyn M.; Espindola, Melissa R.; Zaepfel, Samantha D.; Bolger, Molly S.

    2017-01-01

    When conducting scientific research, experts in molecular and cellular biology (MCB) use specific reasoning strategies to construct mechanistic explanations for the underlying causal features of molecular phenomena. We explored how undergraduate students applied this scientific practice in MCB. Drawing from studies of explanation building among…

  1. Does Mechanistic Thinking Improve Student Success in Organic Chemistry?

    Science.gov (United States)

    Grove, Nathaniel P.; Cooper, Melanie M.; Cox, Elizabeth L.

    2012-01-01

    The use of the curved-arrow notation to depict electron flow during mechanistic processes is one of the most important representational conventions in the organic chemistry curriculum. Our previous research documented a disturbing trend: when asked to predict the products of a series of reactions, many students do not spontaneously engage in…

  2. The Accurate Particle Tracer Code

    OpenAIRE

    Wang, Yulei; Liu, Jian; Qin, Hong; Yu, Zhi

    2016-01-01

    The Accurate Particle Tracer (APT) code is designed for large-scale particle simulations on dynamical systems. Based on a large variety of advanced geometric algorithms, APT possesses long-term numerical accuracy and stability, which are critical for solving multi-scale and non-linear problems. Under the well-designed integrated and modularized framework, APT serves as a universal platform for researchers from different fields, such as plasma physics, accelerator physics, space science, fusio...

  3. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...

  4. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  5. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  6. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.

  7. Advanced REACH Tool : Development and application of the substance emission potential modifying factor

    NARCIS (Netherlands)

    Tongeren, M. van; Fransman, W.; Spankie, S.; Tischer, M.; Brouwer, D.; Schinkel, J.; Cherrie, J.W.; Tielemans, E.

    2011-01-01

    The Advanced REACH Tool (ART) is an exposure assessment tool that combines mechanistically modelled inhalation exposure estimates with available exposure data using a Bayesian approach. The mechanistic model is based on nine independent principal modifying factors (MF). One of these MF is the

  8. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code; this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  9. SITE-94. Adaptation of mechanistic sorption models for performance assessment calculations

    International Nuclear Information System (INIS)

    Arthur, R.C.

    1996-10-01

    Sorption is considered in most predictive models of radionuclide transport in geologic systems. Most models simulate the effects of sorption in terms of empirical parameters, which however can be criticized because the data are only strictly valid under the experimental conditions at which they were measured. An alternative is to adopt a more mechanistic modeling framework based on recent advances in understanding the electrical properties of oxide mineral-water interfaces. It has recently been proposed that these 'surface-complexation' models may be directly applicable to natural systems. A possible approach for adapting mechanistic sorption models for use in performance assessments, using this 'surface-film' concept, is described in this report. Surface-acidity parameters in the Generalized Two-Layer surface complexation model are combined with surface-complexation constants for Np(V) sorption on hydrous ferric oxide to derive an analytical model enabling direct calculation of corresponding intrinsic distribution coefficients as a function of pH and Ca2+, Cl-, and HCO3- concentrations. The surface film concept is then used to calculate whole-rock distribution coefficients for Np(V) sorption by altered granitic rocks coexisting with a hypothetical, oxidized Aespoe groundwater. The calculated results suggest that the distribution coefficients for Np adsorption on these rocks could range from 10 to 100 ml/g. Independent estimates of Kd for Np sorption in similar systems, based on an extensive review of experimental data, are consistent, though slightly conservative, with respect to the calculated values. 31 refs
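
    The pH dependence that such surface-complexation treatments produce can be sketched, in heavily simplified form, with a single non-electrostatic site and a single exchange reaction (>SOH + NpO2+ <-> >SONpO2 + H+). Electrostatic terms, carbonate complexation and Ca2+ competition, all of which the report's model includes, are deliberately omitted, and the log K, site density and solid concentration below are placeholders rather than values from the report.

      def np_kd_ml_per_g(pH, log_k=-3.0, site_conc_mol_per_L=1.0e-4, solids_g_per_L=10.0):
          # Trace-level Np(V): the sorbed/dissolved ratio per litre of solution is
          # K * [>SOH] / [H+]; dividing by the solid concentration converts the
          # dimensionless ratio into a distribution coefficient in mL/g.
          h = 10.0 ** (-pH)
          sorbed_over_dissolved = (10.0 ** log_k) * site_conc_mol_per_L / h
          return sorbed_over_dissolved * 1000.0 / solids_g_per_L

      for pH in (6.0, 7.0, 8.0):
          print(pH, round(np_kd_ml_per_g(pH), 1))   # Kd rises with pH in this toy picture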

  10. Performance of code 'FAIR' in IAEA CRP on FUMEX

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Kakodkar, A.

    1996-01-01

    A modern fuel performance analysis code, FAIR, has been developed for analysing high burnup fuel pins of water/heavy water cooled reactors. The code employs the finite element method for modelling the thermo-mechanical behaviour of fuel pins and mechanistic models for modelling various physical and chemical phenomena affecting the behaviour of nuclear reactor fuel pins. High burnup effects such as pellet thermal conductivity degradation, enhanced fission gas release and radial flux redistribution are incorporated in the code FAIR. The code FAIR is capable of performing statistical analysis of fuel pins using the Monte Carlo technique. The code is implemented on the BARC parallel processing system ANUPAM. The code has recently participated in an International Atomic Energy Agency (IAEA) coordinated research program (CRP) on fuel modelling at extended burnups (FUMEX). Nineteen agencies from different countries participated in this exercise. In this CRP, spread over a period of three years, a number of high burnup fuel pins irradiated at the Halden reactor were analysed. The first phase of the CRP is a blind code comparison exercise, where the computed results are compared with experimental results. The second phase consists of modifications to the code based on the experimental results of the first phase and statistical analysis of fuel pins. The performance of the code FAIR in this CRP has been very good. The present report highlights the main features of the code FAIR and its performance in the IAEA CRP on FUMEX. 14 refs., 5 tabs., ills

  11. A new release of the S3M code

    International Nuclear Information System (INIS)

    Pavlovic, M.; Bokor, J.; Regodic, M.; Sagatova, A.

    2015-01-01

    This paper presents a new release of the code that contains some additional routines and advanced features for displaying the results. Special attention is paid to the processing of the SRIM range file, which was not included in the previous release of the code. Examples of distributions provided by the S3M code for implanted ions in thyroid and iron are presented. (authors)

  12. Bird Migration Under Climate Change - A Mechanistic Approach Using Remote Sensing

    Science.gov (United States)

    Smith, James A.; Blattner, Tim; Messmer, Peter

    2010-01-01

    The broad-scale reductions and shifts that may be expected under climate change in the availability and quality of stopover habitat for long-distance migrants are an area of increasing concern for conservation biologists. Researchers generally have taken two broad approaches to the modeling of migration behaviour to understand the impact of these changes on migratory bird populations. These include models based on causal processes and their response to environmental stimulation, "mechanistic models", or models that primarily are based on observed animal distribution patterns and the correlation of these patterns with environmental variables, i.e. "data driven" models. Investigators have applied the latter technique to forecast changes in migration patterns with changes in the environment, for example, as might be expected under climate change, by forecasting how the underlying environmental data layers upon which the relationships are built will change over time. The learned geostatistical correlations are then applied to the modified data layers. However, this is problematic. Even if the projections of how the underlying data layers will change are correct, it is not evident that the statistical relationships will remain the same, i.e., that the animal organism will not adapt its behaviour to the changing conditions. Mechanistic models that explicitly take into account the physical, biological, and behavioural responses of an organism as well as the underlying changes in the landscape offer an alternative to address these shortcomings. The availability of satellite remote sensing observations at multiple spatial and temporal scales, coupled with advances in climate modeling and information technologies, enables the application of the mechanistic models to predict how continental bird migration patterns may change in response to environmental change. In earlier work, we simulated the effects of wetland loss and inter-annual variability on the fitness of

  13. Experimental investigation and mechanistic modelling of dilute bubbly bulk boiling

    Energy Technology Data Exchange (ETDEWEB)

    Kutnjak, Josip

    2013-06-27

    During evaporation the geometric shape of the vapour is not described by thermodynamics. In bubbly flows the bubble shape is considered spherical at small diameters, changing into various shapes upon growth. The heat and mass transfer happens at the interfacial area. The forces acting on the bubbles depend on the bubble diameter and shape. In this work the prediction of the bubble diameter and/or bubble number density in bulk boiling was considered outside the vicinity of the heat input area. Thus the boiling effects that happened inside the nearly saturated bulk were under investigation. This situation is relevant for nuclear safety analysis concerning a stagnant coolant in the spent fuel pool. In this research project a new experimental set-up was built for this investigation. The experimental set-up consists of an instrumented, partly transparent, high and slender boiling container for visual observation. The direct visual observation of the boiling phenomena is necessary for the identification of basic mechanisms, which should be incorporated in the simulation model. The boiling process has been recorded by means of video images and subsequently evaluated by digital image processing methods, generating data on the characteristics of the boiling process for model development and validation. Mechanistic modelling is based on the derivation of relevant mechanisms concluded from observation, in line with physical knowledge. In this context two mechanisms were identified: the growth/shrink mechanism (GSM) of the vapour bubbles and sudden increases of the bubble number density. The GSM was implemented into the CFD code ANSYS-CFX using the CFX Expression Language (CEL) by calculating the internal bubble pressure with the Young-Laplace equation. This way a hysteresis is realised, as smaller bubbles have an increased internal pressure. The sudden increases of the bubble number density are explainable by liquid super
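
    The Young-Laplace step mentioned above is simply the capillary-pressure correction for a spherical bubble. A minimal sketch follows; the surface tension value is a rough placeholder for water near saturation, not a value taken from the thesis.

      def bubble_internal_pressure(p_liquid_pa, radius_m, surface_tension_n_per_m=0.059):
          # Young-Laplace: a spherical bubble's internal pressure exceeds the
          # surrounding liquid pressure by 2*sigma/r, so smaller bubbles sit at
          # a higher internal pressure (the hysteresis exploited in the GSM).
          return p_liquid_pa + 2.0 * surface_tension_n_per_m / radius_m

      print(bubble_internal_pressure(1.0e5, 1.0e-4))   # a 0.1 mm bubble at 1 bar ambient pressure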

  14. Experimental investigation and mechanistic modelling of dilute bubbly bulk boiling

    International Nuclear Information System (INIS)

    Kutnjak, Josip

    2013-01-01

    During evaporation the geometric shape of the vapour is not described by thermodynamics. In bubbly flows the bubble shape is considered spherical at small diameters and changes into various shapes upon growth. The heat and mass transfer happens at the interfacial area. The forces acting on the bubbles depend on the bubble diameter and shape. In this work the prediction of the bubble diameter and/or bubble number density in bulk boiling was considered outside the vicinity of the heat input area; thus the boiling effects that occur inside the nearly saturated bulk were under investigation. This situation is relevant for nuclear safety analysis concerning a stagnant coolant in the spent fuel pool. In this research project a new experimental set-up was built for the investigation. It consists of an instrumented, partly transparent, tall and slender boiling container for visual observation. The direct visual observation of the boiling phenomena is necessary for the identification of basic mechanisms, which should be incorporated in the simulation model. The boiling process was recorded by means of video images and subsequently evaluated by digital image processing methods, and the resulting data concerning the characteristics of the boiling process were used for model development and validation. Mechanistic modelling is based on the derivation of relevant mechanisms concluded from observation, in line with physical knowledge. In this context two mechanisms were identified: the growth/shrink mechanism (GSM) of the vapour bubbles and sudden increases of the bubble number density. The GSM was implemented into the CFD code ANSYS-CFX using the CFX Expression Language (CEL) by calculating the internal bubble pressure from the Young-Laplace equation. In this way a hysteresis is realised, as smaller bubbles have an increased internal pressure. The sudden increases of the bubble number density are explainable by liquid super

  15. Qualification of the nuclear reactor core model DYN3D coupled to the thermohydraulic system code ATHLET, applied as an advanced tool for accident analysis of VVER-type reactors. Final report

    International Nuclear Information System (INIS)

    Grundmann, U.; Kliem, S.; Krepper, E.; Mittag, S.; Rohde, U.; Schaefer, F.; Seidel, A.

    1998-03-01

    The nuclear reactor core model DYN3D with 3D neutron kinetics has been coupled to the thermohydraulic system code ATHLET. In the report, activities on the qualification of the coupled code complex ATHLET-DYN3D as a validated tool for the accident analysis of Russian VVER-type reactors are described. These include: - contributions to the validation of the single codes ATHLET and DYN3D by the analysis of experiments on natural circulation behaviour in thermohydraulic test facilities and the solution of benchmark tasks on reactivity-initiated transients, - the acquisition and evaluation of measurement data on transients in nuclear power plants, and the validation of ATHLET-DYN3D by calculating an accident with delayed scram and a pump trip in VVER plants, - the complementary improvement of the code DYN3D by extension of the neutron-physical data base, implementation of an improved coolant mixing model, and consideration of decay heat release and xenon transients, - the analysis of steam leak scenarios for VVER-440 type reactors with failure of different safety systems, investigating different model options. The analyses showed that, with realistic modelling of coolant mixing in the downcomer and the lower plenum, recriticality of the scrammed reactor due to overcooling can be reached. The application of the code complex ATHLET-DYN3D in the Czech Republic, Bulgaria and Ukraine has been started. Future work comprises the verification of ATHLET-DYN3D with a DYN3D version for the square fuel element geometry of western PWRs. (orig.) [de

  16. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress made by the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts showing the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP) and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  17. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided for 64-bit Mac, Linux and Windows.

  18. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Shannon limit of the channel. Among the earliest discovered codes that approach the Shannon limit were the low density parity check (LDPC) codes. The term low density arises from the property of the parity check matrix defining the code. We will now define this matrix and the role that it plays in decoding. 2. Linear Codes.
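
    To make the role of the parity check matrix concrete, a small illustrative example (not taken from the article) is given below: a sparse binary matrix H defines the code as the set of vectors x satisfying Hx = 0 over GF(2), and decoding exploits the sparsity of H.

        # Illustrative only: a tiny sparse parity-check matrix H over GF(2) and a
        # codeword membership test. Practical LDPC codes use far larger, still
        # sparse, matrices.
        import numpy as np

        H = np.array([[1, 1, 0, 1, 0, 0],
                      [0, 1, 1, 0, 1, 0],
                      [1, 0, 1, 0, 0, 1]])       # each row is one parity check

        def is_codeword(x):
            """x belongs to the code iff every parity check is satisfied modulo 2."""
            return bool(np.all(H.dot(x) % 2 == 0))

        print(is_codeword(np.array([0, 0, 0, 0, 0, 0])))   # True: the all-zero word
        print(is_codeword(np.array([1, 1, 0, 0, 1, 1])))   # True: satisfies all checks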

  19. Manually operated coded switch

    International Nuclear Information System (INIS)

    Barnette, J.H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried, and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  20. Severe accident analysis code Sampson for impact project

    International Nuclear Information System (INIS)

    Hiroshi, Ujita; Takashi, Ikeda; Masanori, Naitoh

    2001-01-01

    Four years of the IMPACT project Phase 1 (1994-1997) were completed with financial sponsorship from the Japanese government's Ministry of Economy, Trade and Industry. At the end of the phase, demonstration simulations with combinations of up to 11 analysis modules developed for severe accident analysis in the SAMPSON code were performed, and the physical models in the code were verified. The SAMPSON prototype was validated by TMI-2 and Phebus-FP test analyses. Many of the empirical correlations and conventional models have been replaced by mechanistic models during Phase 2 (1998-2000). New models for accident management evaluation have also been developed. (author)

  1. Benchmark calculation of subchannel analysis codes

    International Nuclear Information System (INIS)

    1996-02-01

    In order to evaluate the analysis capabilities of various subchannel codes used in the thermal-hydraulic design of light water reactors, benchmark calculations were performed. The selected benchmark problems and the major findings obtained by the calculations were as follows: (1) For single-phase flow mixing experiments between two channels, the calculated water temperature distributions along the flow direction agreed with the experimental results when the turbulent mixing coefficients were tuned properly. However, the effect of gap width observed in the experiments could not be predicted by the subchannel codes. (2) For two-phase flow mixing experiments between two channels, in high water flow rate cases the calculated distributions of air and water flows in each channel agreed well with the experimental results; in low water flow cases, on the other hand, the air mixing rates were underestimated. (3) For two-phase flow mixing experiments among multiple channels, the calculated mass velocities at the channel exit under steady-state conditions agreed with the experimental values within about 10%. However, the predictive errors of the exit qualities were as high as 30%. (4) For the critical heat flux (CHF) experiments, two different results were obtained: one code indicated that the CHFs calculated using the KfK or EPRI correlations agreed well with the experimental results, while another code suggested that the CHFs were well predicted by the WSC-2 correlation or the Weisman-Pei mechanistic model. (5) For the droplet entrainment and deposition experiments, the predictive capability was significantly increased by improving the correlations. On the other hand, a remarkable discrepancy between the codes was observed: one code underestimated the droplet flow rate and overestimated the liquid film flow rate in high quality cases, while another code overestimated the droplet flow rate and underestimated the liquid film flow rate in low quality cases. (J.P.N.)

  2. A Mechanistic Source Term Calculation for a Metal Fuel Sodium Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2017-06-26

    A mechanistic source term (MST) calculation attempts to realistically assess the transport and release of radionuclides from a reactor system to the environment during a specific accident sequence. The U.S. Nuclear Regulatory Commission (NRC) has repeatedly stated its expectation that advanced reactor vendors will utilize an MST during the U.S. reactor licensing process. As part of a project to examine possible impediments to sodium fast reactor (SFR) licensing in the U.S., an analysis was conducted regarding the current capabilities to perform an MST for a metal fuel SFR. The purpose of the project was to identify and prioritize any gaps in current computational tools, and the associated database, for the accurate assessment of an MST. The results of the study demonstrate that an SFR MST is possible with current tools and data, but several gaps exist that may lead to unacceptable levels of uncertainty, depending on the goals of the MST analysis.

  3. Improving Predictive Modeling in Pediatric Drug Development: Pharmacokinetics, Pharmacodynamics, and Mechanistic Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Slikker, William; Young, John F.; Corley, Rick A.; Dorman, David C.; Conolly, Rory B.; Knudsen, Thomas; Erstad, Brian L.; Luecke, Richard H.; Faustman, Elaine M.; Timchalk, Chuck; Mattison, Donald R.

    2005-07-26

    A workshop was conducted on November 18-19, 2004, to address the issue of improving predictive models for drug delivery to developing humans. Although considerable progress has been made for adult humans, large gaps remain for predicting pharmacokinetic/pharmacodynamic (PK/PD) outcome in children because most adult models have not been tested during development. The goals of the meeting included a description of when, during development, infants/children become adultlike in handling drugs. The issue of incorporating the most recent advances into the predictive models was also addressed: both the use of imaging approaches and genomic information were considered. Disease state, as exemplified by obesity, was addressed as a modifier of drug pharmacokinetics and pharmacodynamics during development. Issues addressed in this workshop should be considered in the development of new predictive and mechanistic models of drug kinetics and dynamics in the developing human.

  4. Coding in Muscle Disease.

    Science.gov (United States)

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  5. International Accreditation of ASME Codes and Standards

    International Nuclear Information System (INIS)

    Green, Mervin R.

    1989-01-01

    ASME established a Boiler Code Committee to develop rules for the design, fabrication and inspection of boilers. This year we recognize 75 years of that Code and will publish a history of those 75 years. The first Code and subsequent editions provided for a Code Symbol Stamp, or mark, which could be affixed by a manufacturer to a newly constructed product to certify that the manufacturer had designed, fabricated and inspected it in accordance with Code requirements. The purpose of the ASME Mark is to identify those boilers that meet ASME Boiler and Pressure Vessel Code requirements. Through thousands of updates over the years, the Code has been revised to reflect technological advances and changing safety needs. Its scope has been broadened from boilers to include pressure vessels, nuclear components and systems. Proposed revisions to the Code are published for public review and comment four times per year, and revisions and interpretations are published annually; it is a living and constantly evolving Code. You and your organizations are a vital part of the feedback system that keeps the Code alive. Because of this dynamic Code, we no longer have columns in newspapers listing boiler explosions. Nevertheless, it has been argued recently that ASME should go further in internationalizing its Code. Specifically, representatives of several countries have suggested that ASME delegate to them responsibility for Code implementation within their national boundaries. The question is thus posed: has the time come to franchise responsibility for administration of ASME's Code accreditation programs to foreign entities or, perhaps, 'institutes'? And if so, how should this be accomplished?

  6. Turbo coding, turbo equalisation and space-time coding for transmission over fading channels

    CERN Document Server

    Hanzo, L; Yeap, B

    2002-01-01

    Against the backdrop of the emerging 3G wireless personal communications standards and broadband access network standard proposals, this volume covers a range of coding and transmission aspects for transmission over fading wireless channels. It presents the most important classic channel coding issues and also the exciting advances of the last decade, such as turbo coding, turbo equalisation and space-time coding. It endeavours to be the first book with explicit emphasis on channel coding for transmission over wireless channels. Divided into 4 parts: Part 1 - explains the necessary background for novices. It aims to be both an easy reading text book and a deep research monograph. Part 2 - provides detailed coverage of turbo conventional and turbo block coding considering the known decoding algorithms and their performance over Gaussian as well as narrowband and wideband fading channels. Part 3 - comprehensively discusses both space-time block and space-time trellis coding for the first time in literature. Par...

  7. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  8. A Life-Cycle Risk-Informed Systems Structured Nuclear Code

    International Nuclear Information System (INIS)

    Hill, Ralph S. III

    2002-01-01

    Current American Society of Mechanical Engineers (ASME) nuclear codes and standards rely primarily on deterministic and mechanistic approaches to design. The design code is a separate volume from the code for inservice inspections and both are separate from the standards for operations and maintenance. The ASME code for inservice inspections and code for nuclear plant operations and maintenance have adopted risk-informed methodologies for inservice inspection, preventive maintenance, and repair and replacement decisions. The American Institute of Steel Construction and the American Concrete Institute have incorporated risk-informed probabilistic methodologies into their design codes. It is proposed that the ASME nuclear code should undergo a planned evolution that integrates the various nuclear codes and standards and adopts a risk-informed approach across a facility life-cycle - encompassing design, construction, operation, maintenance and closure. (author)

  9. Mechanistic Fermentation Models for Process Design, Monitoring, and Control

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads Orla

    2017-01-01

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding.

  10. Real depletion in nodal diffusion codes

    International Nuclear Information System (INIS)

    Petkov, P.T.

    2002-01-01

    The fuel depletion is described by more than one hundred fuel isotopes in the advanced lattice codes like HELIOS, but only a few fuel isotopes are accounted for even in the advanced steady-state diffusion codes. The general assumption that the number densities of the majority of the fuel isotopes depend only on the fuel burnup is seriously in error if high burnup is considered. The real depletion conditions in the reactor core differ from the asymptotic ones at the stage of lattice depletion calculations. This study reveals which fuel isotopes should be explicitly accounted for in the diffusion codes in order to predict adequately the real depletion effects in the core. A somewhat strange conclusion is that if the real number densities of the main fissionable isotopes are not explicitly accounted for in the diffusion code, then Sm-149 should not be accounted for either, because the net error in k-inf is smaller (Authors)

  11. Specialists without spirit: limitations of the mechanistic biomedical model.

    Science.gov (United States)

    Hewa, S; Hetherington, R W

    1995-06-01

    This paper examines the origin and development of the mechanistic model of the human body and health in terms of Max Weber's theory of rationalization. It is argued that the development of Western scientific medicine is part of the broad process of rationalization that began in sixteenth-century Europe as a result of the Reformation. The development of the mechanistic view of the human body in Western medicine is consistent with the ideas of calculability, predictability, and control - the major tenets of the process of rationalization as described by Weber. In recent years, however, the limitations of the mechanistic model have been the topic of many discussions. George Engel, a leading advocate of general systems theory, is one of the main proponents of a new medical model which includes general quality of life, a clean environment, and psychological or spiritual stability. The paper concludes with consideration of the potential of Engel's proposed new model in the context of the current state of rationalization in modern industrialized society.

  12. Numerical simulation in steam injection process by a mechanistic approach

    Energy Technology Data Exchange (ETDEWEB)

    De Souza, J.C.Jr.; Campos, W.; Lopes, D.; Moura, L.S.S. [Petrobras, Rio de Janeiro (Brazil)

    2008-10-15

    Steam injection is a common thermal recovery method used in very viscous oil reservoirs. The method involves the injection of heat to reduce viscosity and mobilize oil. A steam generation and injection system consists primarily of a steam source, distribution lines, injection wells and a discarding tank. In order to optimize injection and improve the oil recovery factor, one must determine the parameters of steam flow such as pressure, temperature and steam quality. This study focused on developing a unified mathematical model, by means of a mechanistic approach, for two-phase steam flow in pipelines and wells. The hydrodynamic and heat transfer mechanistic model was implemented in a computer simulator to model the parameters of steam injection while avoiding the use of empirical correlations. A marching algorithm was used to determine the distribution of pressure and temperature along the pipelines and wellbores. The mathematical model for steam flow in injection systems developed by a mechanistic approach (VapMec) performed well when the simulated values of pressures and temperatures were compared with the values measured during field tests. The newly developed VapMec model was incorporated in the LinVap-3 simulator, which constitutes an engineering support tool for steam injection wells operated by Petrobras. 23 refs., 7 tabs., 6 figs.
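
    A schematic sketch of the kind of marching algorithm mentioned above is given below; the pressure-gradient closure and the saturation-temperature correlation are crude placeholders, not the VapMec hydrodynamic and heat-transfer model.

        # Schematic marching algorithm for steady two-phase steam flow along a
        # pipe: step from inlet to outlet, update the pressure with a placeholder
        # gradient closure, and take the temperature as the saturation value at
        # the local pressure. Only the algorithm structure is illustrated here.

        def pressure_gradient(p, quality):
            # Placeholder closure: a mechanistic model would evaluate friction,
            # gravitational and acceleration terms from the local flow pattern.
            return -150.0 * (1.0 + 2.0 * quality)      # Pa per metre (illustrative)

        def saturation_temperature(p):
            # Crude placeholder correlation for water, for illustration only.
            return 372.0 * (p / 1.0e5) ** 0.07         # K

        def march(p_inlet, quality, length, n_steps=100):
            dz, p = length / n_steps, p_inlet
            profile = [(0.0, p, saturation_temperature(p))]
            for i in range(1, n_steps + 1):
                p += pressure_gradient(p, quality) * dz    # explicit marching step
                profile.append((i * dz, p, saturation_temperature(p)))
            return profile

        for z, p, t in march(p_inlet=2.0e6, quality=0.8, length=500.0)[::25]:
            print("z = %6.1f m   p = %.3f MPa   T = %.1f K" % (z, p / 1e6, t))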

  13. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  14. Code Development and Analysis Program: developmental checkout of the BEACON/MOD2A code

    International Nuclear Information System (INIS)

    Ramsthaler, J.A.; Lime, J.F.; Sahota, M.S.

    1978-12-01

    A best-estimate transient containment code, BEACON, is being developed by EG and G Idaho, Inc. for the Nuclear Regulatory Commission's reactor safety research program. This is an advanced, two-dimensional fluid flow code designed to predict temperatures and pressures in a dry PWR containment during a hypothetical loss-of-coolant accident. The most recent version of the code, MOD2A, is presently in the final stages of production prior to being released to the National Energy Software Center. As part of the final code checkout, seven sample problems were selected to be run with BEACON/MOD2A

  15. The HELIOS-2 lattice physics code

    International Nuclear Information System (INIS)

    Wemple, C.A.; Gheorghiu, H-N.M.; Stamm'ler, R.J.J.; Villarino, E.A.

    2008-01-01

    Major advances have been made in the HELIOS code, resulting in the impending release of a new version, HELIOS-2. The new code includes a method of characteristics (MOC) transport solver to supplement the existing collision probabilities (CP) solver. A 177-group, ENDF/B-VII nuclear data library has been developed for inclusion with the new code package. Computational tests have been performed to verify the performance of the MOC solver against the CP solver, and validation testing against computational and measured benchmarks is underway. Results to date of the verification and validation testing are presented, demonstrating the excellent performance of the new transport solver and nuclear data library. (Author)

  16. Study on natural circulation flow under reactor cavity flooding condition in advanced PWRs

    International Nuclear Information System (INIS)

    Tao Jun; Yang Jiang; Cao Jianhua; Lu Xianghui; Guo Dingqing

    2015-01-01

    Cavity flooding is an important severe accident management measure for the in-vessel retention of a degraded core by external reactor vessel cooling in advanced PWRs. A code simulation study of the natural circulation flow in the gap between the reactor vessel wall and the insulation material under cavity flooding conditions was performed using the detailed mechanistic thermal-hydraulic code package RELAP5. Simulation of an experiment carried out to study the natural circulation flow for APR1400 shows that the code is applicable for analyzing the circulation flow under this condition. The analysis results show that the heat removal capacity of the natural circulation flow in AP1000 is sufficient to prevent thermal failure of the reactor vessel under the bounding heat load. Several conclusions can be drawn from the sensitivity analysis: a larger coolant inlet area induces a larger natural circulation flow rate, and the outlet should be large enough, and should not be submerged by the cavity water, so that the steam-water mixture can be vented. In the implementation of cavity flooding, the flooding water level should be high enough to provide sufficient natural circulation driving force. (authors)

  17. Utility of QR codes in biological collections.

    Science.gov (United States)

    Diazgranados, Mauricio; Funk, Vicki A

    2013-01-01

    The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers' electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections.
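
    As a minimal illustration of the proposal, a scannable label for a voucher can be generated in a few lines of Python, assuming the third-party qrcode package (with Pillow) is installed; the specimen URI below is hypothetical.

        # Illustrative only: encode a specimen's catalogue URI in a QR code so
        # that scanning the printed label resolves to the collection record.
        # The URI is a made-up example; requires the 'qrcode' package and Pillow.
        import qrcode

        specimen_uri = "https://example.org/collections/herbarium/US-1234567"  # hypothetical
        img = qrcode.make(specimen_uri)       # returns an image of the QR symbol
        img.save("US-1234567_label.png")      # print this image on the specimen label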

  18. Utility of QR codes in biological collections

    Directory of Open Access Journals (Sweden)

    Mauricio Diazgranados

    2013-07-01

    Full Text Available The popularity of QR codes for encoding information such as URIs has increased exponentially in step with the technological advances and availability of smartphones, digital tablets, and other electronic devices. We propose using QR codes on specimens in biological collections to facilitate linking vouchers’ electronic information with their associated collections. QR codes can efficiently provide such links for connecting collections, photographs, maps, ecosystem notes, citations, and even GenBank sequences. QR codes have numerous advantages over barcodes, including their small size, superior security mechanisms, increased complexity and quantity of information, and low implementation cost. The scope of this paper is to initiate an academic discussion about using QR codes on specimens in biological collections.

  19. Esophageal function testing: Billing and coding update.

    Science.gov (United States)

    Khan, A; Massey, B; Rao, S; Pandolfino, J

    2018-01-01

    Esophageal function testing is being increasingly utilized in diagnosis and management of esophageal disorders. There have been several recent technological advances in the field to allow practitioners the ability to more accurately assess and treat such conditions, but there has been a relative lack of education in the literature regarding the associated Common Procedural Terminology (CPT) codes and methods of reimbursement. This review, commissioned and supported by the American Neurogastroenterology and Motility Society Council, aims to summarize each of the CPT codes for esophageal function testing and show the trends of associated reimbursement, as well as recommend coding methods in a practical context. We also aim to encourage many of these codes to be reviewed on a gastrointestinal (GI) societal level, by providing evidence of both discrepancies in coding definitions and inadequate reimbursement in this new era of esophageal function testing. © 2017 John Wiley & Sons Ltd.

  20. Coding, cryptography and combinatorics

    CERN Document Server

    Niederreiter, Harald; Xing, Chaoping

    2004-01-01

    It has long been recognized that there are fascinating connections between coding theory, cryptology, and combinatorics. Therefore it seemed desirable to us to organize a conference that brings together experts from these three areas for a fruitful exchange of ideas. We decided on a venue in the Huang Shan (Yellow Mountain) region, one of the most scenic areas of China, so as to provide the additional inducement of an attractive location. The conference was planned for June 2003 with the official title Workshop on Coding, Cryptography and Combinatorics (CCC 2003). Those who are familiar with events in East Asia in the first half of 2003 can guess what happened in the end, namely the conference had to be cancelled in the interest of the health of the participants. The SARS epidemic posed too serious a threat. At the time of the cancellation, the organization of the conference was at an advanced stage: all invited speakers had been selected and all abstracts of contributed talks had been screened by the p...

  1. (U) Ristra Next Generation Code Report

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Daniel, David John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-22

    LANL’s Weapons Physics management (ADX) and ASC program office have defined a strategy for exascale-class application codes that follows two supportive, and mutually risk-mitigating paths: evolution for established codes (with a strong pedigree within the user community) based upon existing programming paradigms (MPI+X); and Ristra (formerly known as NGC), a high-risk/high-reward push for a next-generation multi-physics, multi-scale simulation toolkit based on emerging advanced programming systems (with an initial focus on data-flow task-based models exemplified by Legion [5]). Development along these paths is supported by the ATDM, IC, and CSSE elements of the ASC program, with the resulting codes forming a common ecosystem, and with algorithm and code exchange between them anticipated. Furthermore, solution of some of the more challenging problems of the future will require a federation of codes working together, using established-pedigree codes in partnership with new capabilities as they come on line. The role of Ristra as the high-risk/high-reward path for LANL’s codes is fully consistent with its role in the Advanced Technology Development and Mitigation (ATDM) sub-program of ASC (see Appendix C), in particular its emphasis on evolving ASC capabilities through novel programming models and data management technologies.

  2. Numerical simulation in steam injection wellbores by mechanistic approach; Simulacao numerica do escoamento de vapor em pocos por uma abordagem mecanicista

    Energy Technology Data Exchange (ETDEWEB)

    Souza Junior, J.C. de; Campos, W.; Lopes, D.; Moura, L.S.S. [PETROBRAS, Rio de Janeiro, RJ (Brazil); Thomas, A. Clecio F. [Universidade Estadual do Ceara (UECE), CE (Brazil)

    2008-07-01

    This work addresses the development of a hydrodynamic and heat transfer mechanistic model for steam flow in injection wellbores. The problem of two-phase steam flow in wellbores has previously been solved using available empirical correlations from the petroleum industry (Lopes, 1986) and the nuclear industry (Moura, 1991). The good performance achieved by the mechanistic models developed by Ansari (1994), Hasan (1995), Gomez (2000) and Kaya (2001) supports the importance of the mechanistic approach for the steam flow problem in injection wellbores. In this study, the methodology used to solve the problem consists in applying a numerical method to the governing equations of steam flow and a marching algorithm to determine the distribution of pressure and temperature along the wellbore. A computer code has been formulated to obtain numerical results, which allows a comparative study against the main models found in the literature. Finally, when compared to available field data, the mechanistic model for downward vertical steam flow in wellbores gave better results than the empirical correlations. (author)

  3. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork, Pupufu (Lin, 2009), to understand various real-time approaches to social media platforms (MSN, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms' interfaces. These are important for understanding the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  4. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  5. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

    This user's manual contains all the necessary information concerning the use of the SEVERO code. This computer code is related to the statistics of extremes: extreme winds, extreme precipitation and flooding hazard risk analysis. (A.C.A.S.)

  6. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to th...

  7. FERRET data analysis code

    International Nuclear Information System (INIS)

    Schmittroth, F.

    1979-09-01

    A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples
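
    The abstract characterizes FERRET as a very general least-squares code for combining measurements and calculations with proper uncertainty treatment. The sketch below shows a generic generalized least-squares update of that kind; the prior values, sensitivities, and covariances are invented placeholders, not FERRET input or its internal algorithm.

        # Generic generalized least-squares adjustment: combine prior estimates x0
        # (covariance C) with measurements y = A x + noise (covariance V).
        import numpy as np

        x0 = np.array([1.0, 2.0])                   # prior parameter estimates
        C  = np.diag([0.04, 0.09])                  # prior covariance
        A  = np.array([[1.0, 0.0],
                       [1.0, 1.0]])                 # sensitivities of the measurements
        y  = np.array([1.1, 3.3])                   # measured values
        V  = np.diag([0.01, 0.01])                  # measurement covariance

        S = A @ C @ A.T + V                         # covariance of the predicted measurements
        K = C @ A.T @ np.linalg.inv(S)              # gain matrix
        x_post = x0 + K @ (y - A @ x0)              # adjusted estimates
        C_post = C - K @ A @ C                      # reduced posterior covariance

        print("posterior estimates:", x_post)
        print("posterior covariance:\n", C_post)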

  8. Stylize Aesthetic QR Code

    OpenAIRE

    Xu, Mingliang; Su, Hao; Li, Yafei; Li, Xi; Liao, Jing; Niu, Jianwei; Lv, Pei; Zhou, Bing

    2018-01-01

    With the continued proliferation of smart mobile devices, Quick Response (QR) code has become one of the most-used types of two-dimensional code in the world. Aiming at beautifying the appearance of QR codes, existing works have developed a series of techniques to make the QR code more visual-pleasant. However, these works still leave much to be desired, such as visual diversity, aesthetic quality, flexibility, universal property, and robustness. To address these issues, in this paper, we pro...

  9. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    Quick Response codes open the possibility of conveying data in a unique way, yet insufficient prevention and protection might lead to QR codes being exploited on behalf of attackers. This thesis starts by presenting a general introduction to the background and stating two problems regarding QR code security, which is followed by comprehensive research on both the QR code itself and related issues. From the research, a solution taking advantage of cloud and cryptography together with an implementation come af...

  10. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  11. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  12. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    textabstractTwo key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  13. Gap Conductance model Validation in the TASS/SMR-S code using MARS code

    International Nuclear Information System (INIS)

    Ahn, Sang Jun; Yang, Soo Hyung; Chung, Young Jong; Lee, Won Jae

    2010-01-01

    Korea Atomic Energy Research Institute (KAERI) has been developing the TASS/SMR-S (Transient and Setpoint Simulation/Small and Medium Reactor) code, a thermal hydraulic code for the safety analysis of the advanced integral reactor. Appropriate work to validate the applicability of the thermal hydraulic models within the code is required. Among the models, the gap conductance model, which describes the thermal conductance of the gap between fuel and cladding, was validated through comparison with the MARS code. The validation of the gap conductance model was performed by evaluating the variation of the gap temperature and gap width as they changed with the power fraction. In this paper, a brief description of the gap conductance model in the TASS/SMR-S code is presented. In addition, the results calculated to validate the gap conductance model are demonstrated by comparison with the results of the MARS code for the test case.
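
    For orientation only, a simplified textbook-style gap conductance relation (conduction across the gas-filled gap plus a linearized radiation term) is sketched below; it is not the TASS/SMR-S or MARS formulation, and all values are illustrative.

        # Simplified, textbook-style gap conductance between fuel and cladding.
        # NOT the TASS/SMR-S or MARS model; values are placeholders.
        SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

        def gap_conductance(k_gas, gap_width, jump_dist, t_fuel, t_clad, emissivity=0.8):
            h_gas = k_gas / (gap_width + jump_dist)                       # conduction across the gas
            h_rad = emissivity * SIGMA * (t_fuel**2 + t_clad**2) * (t_fuel + t_clad)
            return h_gas + h_rad                                          # W/(m^2 K)

        # Example: helium-filled gap of 50 micrometres at typical surface temperatures.
        print(gap_conductance(k_gas=0.3, gap_width=50e-6, jump_dist=10e-6,
                              t_fuel=900.0, t_clad=620.0))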

  14. Further development of the computer code ATHLET-CD

    International Nuclear Information System (INIS)

    Weber, Sebastian; Austregesilo, Henrique; Bals, Christine; Band, Sebastian; Hollands, Thorsten; Koellein, Carsten; Lovasz, Liviusz; Pandazis, Peter; Schubert, Johann-Dietrich; Sonnenkalb, Martin

    2016-10-01

    In the framework of the reactor safety research program sponsored by the German Federal Ministry for Economic Affairs and Energy (BMWi), the computer code system ATHLET/ATHLET-CD has been further developed as an analysis tool for the simulation of accidents in nuclear power plants with pressurized and boiling water reactors as well as for the evaluation of accident management procedures. The main objective was to provide a mechanistic analysis tool for best-estimate calculations of transients, accidents, and severe accidents with core degradation in light water reactors. With the continued development, the capability of the code system has been largely improved, allowing best-estimate calculations of design basis and beyond-design-basis accidents, and the simulation of advanced core degradation with enhanced model scope in a reasonable calculation time. ATHLET comprises, inter alia, a six-equation model, models for the simulation of non-condensable gases and tracking of boron concentration, as well as additional component and process models for the complete system simulation. Among numerous model improvements, the code application has been extended to supercritical pressures. The mechanistic description of the dynamic development of flow regimes on the basis of a transport equation for the interfacial area has been further developed. This ATHLET version is completely integrated in ATHLET-CD. ATHLET-CD further comprises dedicated models for the simulation of fuel and control assembly degradation for both pressurized and boiling water reactors, debris beds with melting in the core region, as well as fission product and aerosol release and transport in the cooling system, including the decay of nuclide inventories and chemical reactions in the gas phase. The continued development also concerned the modelling of absorber material release, of melting, melt relocation and freezing, and the interaction with the wall of the reactor pressure vessel. The following models were newly

  15. Iterative Systems Biology for Medicine – time for advancing from network signature to mechanistic equations

    KAUST Repository

    Gomez-Cabrero, David; Tegner, Jesper

    2017-01-01

    The rise and growth of Systems Biology following the sequencing of the human genome has been astounding. Early on, an iterative wet-dry methodology was formulated which turned out as a successful approach in deciphering biological complexity

  16. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  17. A comparative analysis of reactor lower head debris cooling models employed in the existing severe accident analysis codes

    International Nuclear Information System (INIS)

    Ahn, K.I.; Kim, D.H.; Kim, S.B.; Kim, H.D.

    1998-08-01

    MELCOR and MAAP4 are representative severe accident analysis codes which have been developed for the integral analysis of the phenomenological reactor lower head corium cooling behavior. The main objective of the present study is to identify the merits and disadvantages of each relevant model through a comparative analysis of the lower plenum corium cooling models employed in these two codes. The final results will be utilized for the development of LILAC phenomenological models and for the continuous improvement of the existing MELCOR reactor lower head models, which are currently being performed at KAERI. For these purposes, nine reference models featuring the lower head corium behavior were first selected based on the existing experimental evidence and related models. The main features of the selected models were then critically analyzed, and finally the merits and disadvantages of each corresponding model were summarized from the viewpoint of realistic corium behavior and reasonable modeling. Based on this evidence, potential improvements for developing more advanced models are summarized and presented. The present study has focused on the qualitative comparison of the models, so a more detailed quantitative analysis is strongly required to reach final conclusions on their merits and disadvantages. In addition, in order to compensate for the limitations of the current models, further studies are required that closely relate detailed mechanistic models of molten material movement and heat transfer based on phase change in a porous medium to the existing simple models. (author). 36 refs

  18. ASTEC V2 severe accident integral code main features, current V2.0 modelling status, perspectives

    International Nuclear Information System (INIS)

    Chatelard, P.; Reinke, N.; Arndt, S.; Belon, S.; Cantrel, L.; Carenini, L.; Chevalier-Jabet, K.; Cousin, F.; Eckel, J.; Jacq, F.; Marchetto, C.; Mun, C.; Piar, L.

    2014-01-01

    The severe accident integral code ASTEC, jointly developed for almost 20 years by IRSN and GRS, simulates the behaviour of a whole nuclear power plant under severe accident conditions, including severe accident management by engineering systems and procedures. Since 2004, the ASTEC code has progressively become the reference European severe accident integral code, in particular through the intensification of research activities carried out in the frame of the SARNET European network of excellence. The first version of the new series, ASTEC V2, was released in 2009 to about 30 organizations worldwide, and in particular to SARNET partners. With respect to the previous V1 series, this new V2 series includes advanced core degradation models (issued from the ICARE2 IRSN mechanistic code) and the extensions necessary to be applicable to Gen. III reactor designs, notably a description of the core catcher component to simulate severe accident transients in the EPR reactor. Besides these two key evolutions, most of the other physical modules have also been improved, and ASTEC V2 is now coupled to the SUNSET statistical tool to facilitate uncertainty and sensitivity analyses. The ASTEC models are today at the state of the art (in particular the fission product models with respect to source term evaluation), except for quenching of a severely damaged core. Beyond the need to develop an adequate model for the reflooding of a degraded core, the main other medium-term objectives are to further progress on the on-going extension of the scope of application to BWR and CANDU reactors, to spent fuel pool accidents, as well as to accidents in both the ITER fusion facility and Gen. IV reactors (in priority sodium-cooled fast reactors), while making ASTEC evolve towards a severe accident simulator constitutes the main long-term objective. This paper presents the status of the ASTEC V2 versions, focussing on the description of the V2.0 models for water-cooled nuclear plants

  19. Mechanistic species distribution modeling reveals a niche shift during invasion.

    Science.gov (United States)

    Chapman, Daniel S; Scalone, Romain; Štefanić, Edita; Bullock, James M

    2017-06-01

    Niche shifts of nonnative plants can occur when they colonize novel climatic conditions. However, the mechanistic basis for niche shifts during invasion is poorly understood and has rarely been captured within species distribution models. We quantified the consequence of between-population variation in phenology for invasion of common ragweed (Ambrosia artemisiifolia L.) across Europe. Ragweed is of serious concern because of its harmful effects as a crop weed and because of its impact on public health as a major aeroallergen. We developed a forward mechanistic species distribution model based on responses of ragweed development rates to temperature and photoperiod. The model was parameterized and validated from the literature and by reanalyzing data from a reciprocal common garden experiment in which native and invasive populations were grown within and beyond the current invaded range. It could therefore accommodate between-population variation in the physiological requirements for flowering, and predict the potentially invaded ranges of individual populations. Northern-origin populations that were established outside the generally accepted climate envelope of the species had lower thermal requirements for bud development, suggesting local adaptation of phenology had occurred during the invasion. The model predicts that this will extend the potentially invaded range northward and increase the average suitability across Europe by 90% in the current climate and 20% in the future climate. Therefore, trait variation observed at the population scale can trigger a climatic niche shift at the biogeographic scale. For ragweed, earlier flowering phenology in established northern populations could allow the species to spread beyond its current invasive range, substantially increasing its risk to agriculture and public health. Mechanistic species distribution models offer the possibility to represent niche shifts by varying the traits and niche responses of individual
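
    A minimal sketch of the kind of forward thermal-time/photoperiod development model the abstract describes is given below; the base temperature, thermal requirement, and photoperiod threshold are illustrative values, not the fitted population parameters of the study.

        # Minimal forward phenology sketch: accumulate thermal time above a base
        # temperature and allow flowering only once a short-day photoperiod cue is
        # met. Parameter values are illustrative, not those fitted in the study.

        def day_of_flowering(daily_mean_temp, daily_photoperiod,
                             t_base=5.0, thermal_requirement=700.0, max_photoperiod=14.5):
            thermal_time = 0.0
            for day, (temp, photoperiod) in enumerate(zip(daily_mean_temp, daily_photoperiod), 1):
                thermal_time += max(temp - t_base, 0.0)        # degree-days above the base
                if thermal_time >= thermal_requirement and photoperiod <= max_photoperiod:
                    return day                                 # both requirements met
            return None                                        # not met within the season

        # Toy season: warming then cooling temperatures, days shortening over time.
        temps = [12 + 10 * (1 - abs(d - 120) / 120) for d in range(240)]
        photo = [16 - 6 * d / 240 for d in range(240)]
        print(day_of_flowering(temps, photo))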

  20. Soil pH controls the environmental availability of phosphorus: Experimental and mechanistic modelling approaches

    International Nuclear Information System (INIS)

    Devau, Nicolas; Cadre, Edith Le; Hinsinger, Philippe; Jaillard, Benoit; Gerard, Frederic

    2009-01-01

    Inorganic P is the least mobile major nutrient in most soils and is frequently the prime limiting factor for plant growth in terrestrial ecosystems. In this study, the extraction of soil inorganic P with CaCl₂ (P-CaCl₂) and geochemical modelling were combined in order to unravel the processes controlling the environmentally available P (EAP) of a soil over a range of pH values (pH ∼ 4-10). Mechanistic descriptions of the adsorption of cations and anions by the soil constituents were used (1-pK Triple Plane, ion-exchange and NICA-Donnan models). These models are implemented in the geochemical code Visual MINTEQ. An additive approach was used for their application to the surface horizon of a Cambisol. The geochemical code accurately reproduced the concentration of extracted P at the different soil pH values (R² = 0.9, RMSE = 0.03 mg kg⁻¹). Model parameters were either directly found in the literature or estimated by fitting published experimental results in single mineral systems. The strong agreement between measurements and modelling results demonstrated that adsorption processes exerted a major control on the EAP of the soil over a large range of pH values. An influence of the precipitation of P-containing minerals is discounted based on thermodynamic calculations. Modelling results indicated that the variations in P-CaCl₂ with soil pH were controlled by the deprotonation/protonation of the surface hydroxyl groups, the distribution of P surface complexes, and the adsorption of Ca and Cl from the electrolyte background. Iron oxides and gibbsite were found to be the major P-adsorbing soil constituents at acidic and alkaline pH, whereas P was mainly adsorbed by clay minerals at intermediate pH values. This study demonstrates the efficacy of geochemical modelling for understanding soil processes, and the applicability of mechanistic adsorption models to a 'real' soil, with its mineralogical complexity and the additional contribution of soil organic matter.
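
    For reference, the single surface protonation step underlying the 1-pK description mentioned above can be written as follows; this is the generic textbook form of the model, with the Boltzmann factor accounting for the surface potential, not a parameterization taken from the study.

        \equiv\!\mathrm{SOH}^{1/2-} + \mathrm{H}^{+} \;\rightleftharpoons\; \equiv\!\mathrm{SOH_{2}}^{1/2+},
        \qquad
        K_{\mathrm{H}} \;=\; \frac{[\equiv\!\mathrm{SOH_{2}}^{1/2+}]}{[\equiv\!\mathrm{SOH}^{1/2-}]\,[\mathrm{H}^{+}]}\,
        \exp\!\left(\frac{F\psi_{0}}{RT}\right)

    In the 1-pK approach a single protonation constant thus fixes the pH dependence of the proton surface charge, while the "Triple Plane" part of the model refers to the electrostatic description of the mineral-water interface.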

  1. Soil pH controls the environmental availability of phosphorus: Experimental and mechanistic modelling approaches

    Energy Technology Data Exchange (ETDEWEB)

    Devau, Nicolas [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Cadre, Edith Le [Supagro, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Hinsinger, Philippe; Jaillard, Benoit [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Gerard, Frederic, E-mail: gerard@supagro.inra.fr [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France)

    2009-11-15

    Inorganic P is the least mobile major nutrient in most soils and is frequently the prime limiting factor for plant growth in terrestrial ecosystems. In this study, the extraction of soil inorganic P with CaCl₂ (P-CaCl₂) and geochemical modelling were combined in order to unravel the processes controlling the environmentally available P (EAP) of a soil over a range of pH values (pH ∼ 4-10). Mechanistic descriptions of the adsorption of cations and anions by the soil constituents were used (1-pK Triple Plane, ion-exchange and NICA-Donnan models). These models are implemented in the geochemical code Visual MINTEQ. An additive approach was used for their application to the surface horizon of a Cambisol. The geochemical code accurately reproduced the concentration of extracted P at the different soil pH values (R² = 0.9, RMSE = 0.03 mg kg⁻¹). Model parameters were either directly found in the literature or estimated by fitting published experimental results in single mineral systems. The strong agreement between measurements and modelling results demonstrated that adsorption processes exerted a major control on the EAP of the soil over a large range of pH values. An influence of the precipitation of P-containing minerals is discounted based on thermodynamic calculations. Modelling results indicated that the variations in P-CaCl₂ with soil pH were controlled by the deprotonation/protonation of the surface hydroxyl groups, the distribution of P surface complexes, and the adsorption of Ca and Cl from the electrolyte background. Iron oxides and gibbsite were found to be the major P-adsorbing soil constituents at acidic and alkaline pH, whereas P was mainly adsorbed by clay minerals at intermediate pH values. This study demonstrates the efficacy of geochemical modelling for understanding soil processes, and the applicability of mechanistic adsorption models to a 'real' soil, with its mineralogical complexity and the additional

  2. LASSIM-A network inference toolbox for genome-wide mechanistic modeling.

    Directory of Open Access Journals (Sweden)

    Rasmus Magnusson

    2017-06-01

    Full Text Available Recent technological advancements have made time-resolved, quantitative, multi-omics data available for many model systems, which could be integrated for systems pharmacokinetic use. Here, we present large-scale simulation modeling (LASSIM), which is a novel mathematical tool for performing large-scale inference using mechanistically defined ordinary differential equations (ODEs) for gene regulatory networks (GRNs). LASSIM integrates structural knowledge about regulatory interactions and non-linear equations with multiple steady state and dynamic response expression datasets. The rationale behind LASSIM is that biological GRNs can be simplified using a limited subset of core genes that are assumed to regulate all other gene transcription events in the network. The LASSIM method is implemented as a general-purpose toolbox using the PyGMO Python package to make the most of multicore computers and high performance clusters, and is available at https://gitlab.com/Gustafsson-lab/lassim. As a method, LASSIM works in two steps, where it first infers a non-linear ODE system of the pre-specified core gene expression. Second, LASSIM in parallel optimizes the parameters that model the regulation of peripheral genes by core system genes. We showed the usefulness of this method by applying LASSIM to infer a large-scale non-linear model of naïve Th2 cell differentiation, made possible by integrating Th2-specific bindings and time series together with six public and six novel siRNA-mediated knock-down experiments. ChIP-seq showed significant overlap for all tested transcription factors. Next, we performed novel time-series measurements of total T-cells during differentiation towards Th2 and verified that our LASSIM model could monitor those data significantly better than comparable models that used the same Th2 bindings. In summary, the LASSIM toolbox opens the door to a new type of model-based data analysis that combines the strengths of reliable mechanistic models
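
    As a much-reduced illustration of the modelling idea (core genes regulating each other through non-linear ODE terms, with parameters fitted to expression time series), a two-gene sketch is given below; it is not the LASSIM implementation, which optimizes far larger systems with PyGMO, and the network, data, and bounds are invented for the example.

        # Illustrative two-gene ODE core model and a least-squares fit of its
        # parameters to (invented) expression time-series data. NOT the LASSIM
        # toolbox; it only sketches the modelling idea.
        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import least_squares

        def core_odes(x, t, p):
            a1, a2, k, n, d1, d2 = p
            g1, g2 = x
            dg1 = a1 / (1.0 + (max(g2, 0.0) / k) ** n) - d1 * g1   # g2 represses g1
            dg2 = a2 * g1 / (k + g1) - d2 * g2                     # g1 activates g2
            return [dg1, dg2]

        t_obs = np.linspace(0.0, 10.0, 11)
        data = np.array([[0.1, 0.0], [0.8, 0.2], [1.2, 0.5], [1.3, 0.8], [1.2, 1.0],
                         [1.1, 1.1], [1.0, 1.2], [1.0, 1.2], [1.0, 1.2], [1.0, 1.2],
                         [1.0, 1.2]])                              # invented observations

        def residuals(p):
            sim = odeint(core_odes, [0.1, 0.0], t_obs, args=(p,))
            return (sim - data).ravel()

        fit = least_squares(residuals, x0=[2.0, 2.0, 1.0, 2.0, 1.0, 1.0],
                            bounds=(1e-3, 10.0))
        print("fitted parameters:", fit.x)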

  3. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  4. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams are sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages is improved over messages transmitted by conventional coding. Coding scheme is compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme is automatically translated to word-processor form.

  5. Development of Improved Mechanistic Deterioration Models for Flexible Pavements

    DEFF Research Database (Denmark)

    Ullidtz, Per; Ertman, Hans Larsen

    1998-01-01

    The paper describes a pilot study in Denmark with the main objective of developing improved mechanistic deterioration models for flexible pavements, based on an accelerated full scale test on an instrumented pavement in the Danish Road Testing Machine. The study was the first in the "International Pavement Subgrade Performance Study" sponsored by the Federal Highway Administration (FHWA), USA. The paper describes in detail the data analysis and the resulting models for rutting, roughness, and a model for the plastic strain in the subgrade. The reader will get an understanding of the work needed...
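
    The abstract does not reproduce the fitted models themselves, so the sketch below only illustrates the generic power-law form commonly used for permanent (plastic) deformation in mechanistic-empirical pavement design; the coefficients A, alpha and beta are hypothetical, not values from the Danish study.

        # Generic power-law rutting model of the kind used in mechanistic-empirical
        # pavement design (not the specific models from the Danish study);
        # the coefficients A, alpha and beta below are hypothetical.
        def plastic_strain(n_loads, stress_kpa, ref_stress_kpa=100.0,
                           A=5e-4, alpha=0.25, beta=1.5):
            """Permanent (plastic) strain in the subgrade after n load repetitions."""
            return A * (n_loads ** alpha) * (stress_kpa / ref_stress_kpa) ** beta

        for n in (1e3, 1e5, 1e7):
            print(f"N = {n:.0e}: plastic strain = {plastic_strain(n, 80.0):.4e}")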

  6. Mechanistic study of aerosol dry deposition on vegetated canopies

    International Nuclear Information System (INIS)

    Petroff, A.

    2005-04-01

    The dry deposition of aerosols onto vegetated canopies is modelled through a mechanistic approach. The interaction between aerosols and vegetation is first formulated by using a set of parameters, which are defined at the local scale of one surface. The overall deposition is then deduced at the canopy scale through an up-scaling procedure based on the statistical distribution of these parameters. This model takes into account the canopy structural and morphological properties, and the main characteristics of the turbulent flow. The deposition mechanisms considered are Brownian diffusion, interception, and inertial and turbulent impaction. The model is applied first to coniferous branches and then to entire canopies of different roughness, such as grass, crop fields and forest. (author)
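
    A heavily simplified sketch of how single-obstacle collection efficiencies for Brownian diffusion, interception and inertial impaction are often combined into a deposition velocity is given below. The expressions are generic textbook-style forms, not the model developed in this thesis, and the canopy and flow parameters are illustrative only.

        # Hedged sketch of combining single-obstacle collection efficiencies into a
        # deposition velocity. Simplified textbook-style forms, not the thesis model;
        # all canopy and flow parameters are illustrative.
        import math

        def deposition_velocity(d_p, u_star=0.4, d_c=1e-3, temp=293.0):
            """Very simplified deposition velocity (m/s) for particle diameter d_p (m)."""
            k_b = 1.38e-23                                  # Boltzmann constant (J/K)
            mu = 1.8e-5                                     # air dynamic viscosity (Pa s)
            rho_p = 1000.0                                  # particle density (kg/m3)
            nu = 1.5e-5                                     # air kinematic viscosity (m2/s)
            D = k_b * temp / (3 * math.pi * mu * d_p)       # Brownian diffusivity
            sc = nu / D                                     # Schmidt number
            e_brownian = sc ** (-2.0 / 3.0)                 # diffusion efficiency
            e_intercept = 0.5 * (d_p / d_c) ** 2            # interception on obstacle of size d_c
            tau = rho_p * d_p ** 2 / (18 * mu)              # particle relaxation time
            stokes = tau * u_star / d_c                     # Stokes number
            e_impact = (stokes / (stokes + 0.8)) ** 2       # inertial impaction efficiency
            return u_star * (e_brownian + e_intercept + e_impact)

        for d in (1e-8, 1e-7, 1e-6, 1e-5):
            print(f"d_p = {d:.0e} m -> v_d ~ {deposition_velocity(d):.2e} m/s")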

  7. Mechanistic modeling of CHF in forced-convection subcooled boiling

    International Nuclear Information System (INIS)

    Podowski, M.Z.; Alajbegovic, A.; Kurul, N.; Drew, D.A.; Lahey, R.T. Jr.

    1997-05-01

    Because of the complexity of phenomena governing boiling heat transfer, the approach to solve practical problems has traditionally been based on experimental correlations rather than mechanistic models. The recent progress in computational fluid dynamics (CFD), combined with improved experimental techniques in two-phase flow and heat transfer, makes the use of rigorous physically-based models a realistic alternative to the current simplistic phenomenological approach. The objective of this paper is to present a new CFD model for critical heat flux (CHF) in low quality (in particular, in subcooled boiling) forced-convection flows in heated channels

  8. Mechanistic CHF modeling for natural circulation applications in SMR

    Energy Technology Data Exchange (ETDEWEB)

    Luitjens, Jeffrey [Department of Nuclear Engineering and Radiation Health Physics, Oregon State University, 3451 SW Jefferson Way, Corvallis, OR 97331 (United States); Wu, Qiao, E-mail: qiao.wu@oregonstate.edu [Department of Nuclear Engineering and Radiation Health Physics, Oregon State University, 3451 SW Jefferson Way, Corvallis, OR 97331 (United States); Greenwood, Scott; Corradini, Michael [Department of Engineering Physics, University of Wisconsin, 1415 Engineering Drive, Madison, WI 53706 (United States)

    2016-12-15

    A mechanistic critical heat flux correlation has been developed for a wide range of operating conditions which include low mass fluxes of 540–890 kg/m{sup 2}-s, high pressures of 12–13 MPa, and critical heat fluxes of 835–1100 kW/m{sup 2}. Eleven experimental data points have been collected over these conditions to inform the development of the model using bundle geometry. Errors of within 15% have been obtained with the proposed model for predicting the critical heat flux value, location, and critical pin power for a non-uniform heat flux applied to a 2 × 2 bundle configuration.

  9. Mechanistic modelling of the drying behaviour of single pharmaceutical granules

    DEFF Research Database (Denmark)

    Thérèse F.C. Mortier, Séverine; Beer, Thomas De; Gernaey, Krist

    2012-01-01

    The trend to move towards continuous production processes in pharmaceutical applications enhances the necessity to develop mechanistic models to understand and control these processes. This work focuses on the drying behaviour of a single wet granule before tabletting, using a six...... phase (submodel 2), the water inside the granule evaporates. The second submodel contains an empirical power coefficient, b. A sensitivity analysis was performed to study the influence of parameters on the moisture content of single pharmaceutical granules, which clearly points towards the importance...
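
    A toy single-granule drying curve with an empirical power coefficient b, loosely inspired by the submodel structure mentioned above, is sketched below; the rate constant, exponent and equilibrium moisture content are invented, and the real model's phases are not reproduced.

        # Toy sketch of a single-granule drying curve with an empirical power
        # coefficient B; illustrative only, not the paper's model or parameters.
        from scipy.integrate import solve_ivp

        K_DRY = 0.05   # hypothetical drying rate constant (1/s)
        B = 1.3        # hypothetical empirical power coefficient
        X_EQ = 0.01    # hypothetical equilibrium moisture content (kg water / kg dry)

        def drying_rate(t, x):
            # Moisture content decreases and slows as it approaches equilibrium
            return [-K_DRY * max(x[0] - X_EQ, 0.0) ** B]

        sol = solve_ivp(drying_rate, (0.0, 300.0), [0.30], t_eval=range(0, 301, 60))
        for t, x in zip(sol.t, sol.y[0]):
            print(f"t = {t:5.0f} s  moisture = {x:.3f}")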

  10. Ionizing radiation induced cataracts: Recent biological and mechanistic developments and perspectives for future research.

    Science.gov (United States)

    Ainsbury, Elizabeth A; Barnard, Stephen; Bright, Scott; Dalke, Claudia; Jarrin, Miguel; Kunze, Sarah; Tanner, Rick; Dynlacht, Joseph R; Quinlan, Roy A; Graw, Jochen; Kadhim, Munira; Hamada, Nobuyuki

    The lens of the eye has long been considered as a radiosensitive tissue, but recent research has suggested that the radiosensitivity is even greater than previously thought. The 2012 recommendation of the International Commission on Radiological Protection (ICRP) to substantially reduce the annual occupational equivalent dose limit for the ocular lens has now been adopted in the European Union and is under consideration around the rest of the world. However, ICRP clearly states that the recommendations are chiefly based on epidemiological evidence because there are a very small number of studies that provide explicit biological, mechanistic evidence at doses <2Gy. This paper aims to present a review of recently published information on the biological and mechanistic aspects of cataracts induced by exposure to ionizing radiation (IR). The data were compiled by assessing the pertinent literature in several distinct areas which contribute to the understanding of IR induced cataracts, information regarding lens biology and general processes of cataractogenesis. Results from cellular and tissue level studies and animal models, and relevant human studies, were examined. The main focus was the biological effects of low linear energy transfer IR, but dosimetry issues and a number of other confounding factors were also considered. The results of this review clearly highlight a number of gaps in current knowledge. Overall, while there have been a number of recent advances in understanding, it remains unknown exactly how IR exposure contributes to opacification. A fuller understanding of how exposure to relatively low doses of IR promotes induction and/or progression of IR-induced cataracts will have important implications for prevention and treatment of this disease, as well as for the field of radiation protection. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.

  11. NAGRADATA. Code key. Geology

    International Nuclear Information System (INIS)

    Mueller, W.H.; Schneider, B.; Staeuble, J.

    1984-01-01

    This reference manual provides users of the NAGRADATA system with comprehensive keys to the coding/decoding of geological and technical information to be stored in or retrieved from the databank. Emphasis has been placed on input data coding. When data is retrieved, the translation into plain language of stored coded information is done automatically by computer. Three keys each list the complete set of currently defined codes for the NAGRADATA system, namely codes with appropriate definitions, arranged: 1. according to subject matter (thematically), 2. with the codes listed alphabetically, and 3. with the definitions listed alphabetically. Additional explanation is provided for the proper application of the codes and the logic behind the creation of new codes to be used within the NAGRADATA system. NAGRADATA makes use of codes instead of plain language for data storage; this offers the following advantages: speed of data processing, mainly data retrieval, economy of storage memory requirements, and the standardisation of terminology. The nature of this thesaurus-like 'key to codes' makes it impossible to either establish a final form or to cover the entire spectrum of requirements. Therefore, this first issue of codes for NAGRADATA must be considered to represent the current state of progress of a living system, and future editions will be issued in a loose-leaf ring book system which can be updated by an organised (updating) service. (author)

  12. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
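
    The same coupling pattern (write an input file, run the external code, read its outputs back) can be illustrated with a standalone sketch. The Python fragment below is not the GoldSim DLL itself; the executable name and the key = value file formats are placeholders.

        # Illustration of the coupling pattern the DLL implements: write an input
        # file, run the external code, read its outputs back. Standalone sketch,
        # not the GoldSim DLL; executable name and file formats are placeholders.
        import subprocess
        from pathlib import Path

        def run_external_code(inputs, exe="external_code.exe",
                              in_file="case.inp", out_file="case.out"):
            # 1. Create the input file expected by the external application
            Path(in_file).write_text("\n".join(f"{k} = {v}" for k, v in inputs.items()))
            # 2. Run the external code and wait for it to finish
            subprocess.run([exe, in_file], check=True)
            # 3. Read the outputs written by the external application
            outputs = {}
            for line in Path(out_file).read_text().splitlines():
                key, _, value = line.partition("=")
                if value:
                    outputs[key.strip()] = float(value)
            return outputs

        # Example (requires the placeholder executable to exist):
        # results = run_external_code({"flow_rate": 1.5, "temperature": 300.0})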

  13. Chemical kinetic mechanistic models to investigate cancer biology and impact cancer medicine

    International Nuclear Information System (INIS)

    Stites, Edward C

    2013-01-01

    Traditional experimental biology has provided a mechanistic understanding of cancer in which the malignancy develops through the acquisition of mutations that disrupt cellular processes. Several drugs developed to target such mutations have now demonstrated clinical value. These advances are unequivocal testaments to the value of traditional cellular and molecular biology. However, several features of cancer may limit the pace of progress that can be made with established experimental approaches alone. The mutated genes (and resultant mutant proteins) function within large biochemical networks. Biochemical networks typically have a large number of component molecules and are characterized by a large number of quantitative properties. Responses to a stimulus or perturbation are typically nonlinear and can display qualitative changes that depend upon the specific values of variable system properties. Features such as these can complicate the interpretation of experimental data and the formulation of logical hypotheses that drive further research. Mathematical models based upon the molecular reactions that define these networks combined with computational studies have the potential to deal with these obstacles and to enable currently available information to be more completely utilized. Many of the pressing problems in cancer biology and cancer medicine may benefit from a mathematical treatment. As work in this area advances, one can envision a future where such models may meaningfully contribute to the clinical management of cancer patients. (paper)
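
    The kind of nonlinear, parameter-dependent behaviour described above can be illustrated with a toy covalent-modification cycle of the Goldbeter-Koshland type; the rate constants below are invented, and the sketch does not model any specific oncogenic pathway.

        # Toy covalent-modification cycle with invented rate constants, showing a
        # nonlinear, switch-like dose response; not a model of any real pathway.
        from scipy.integrate import solve_ivp

        KM = 0.05       # small Michaelis constants give an ultrasensitive response
        K_DEACT = 1.0   # deactivation rate constant

        def cycle(t, y, stimulus):
            a = y[0]                 # fraction of protein in the active form
            i = 1.0 - a              # fraction in the inactive form
            activation = stimulus * i / (KM + i)
            deactivation = K_DEACT * a / (KM + a)
            return [activation - deactivation]

        def steady_state(stimulus):
            sol = solve_ivp(cycle, (0.0, 200.0), [0.0], args=(stimulus,))
            return sol.y[0, -1]

        for s in (0.5, 0.9, 1.0, 1.1, 1.5):
            print(f"stimulus {s:.1f} -> active fraction {steady_state(s):.2f}")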

  14. Probabilistic fuel rod analyses using the TRANSURANUS code

    Energy Technology Data Exchange (ETDEWEB)

    Lassmann, K; O'Carroll, C; Laar, J Van De [CEC Joint Research Centre, Karlsruhe (Germany)

    1997-08-01

    After more than 25 years of fuel rod modelling research, the basic concepts are well established and the limitations of the specific approaches are known. However, the widely used mechanistic approach leads in many cases to discrepancies between theoretical predictions and experimental evidence indicating that models are not exact and that some of the physical processes encountered are of stochastic nature. To better understand uncertainties and their consequences, the mechanistic approach must therefore be augmented by statistical analyses. In the present paper the basic probabilistic methods are briefly discussed. Two such probabilistic approaches are included in the fuel rod performance code TRANSURANUS: the Monte Carlo method and the Numerical Noise Analysis. These two techniques are compared and their capabilities are demonstrated. (author). 12 refs, 4 figs, 2 tabs.
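
    The Monte Carlo idea mentioned above is sketched below, with a made-up algebraic response standing in for the actual TRANSURANUS fuel performance calculation: uncertain inputs are sampled from assumed distributions and output statistics are collected over many runs.

        # Sketch of Monte Carlo uncertainty propagation. A made-up algebraic toy
        # model stands in for the actual fuel performance calculation; the input
        # distributions are invented.
        import numpy as np

        rng = np.random.default_rng(seed=1)
        n_samples = 10_000

        # Hypothetical uncertain inputs (means and standard deviations are invented)
        gap_size = rng.normal(80e-6, 10e-6, n_samples)      # m
        conductivity = rng.normal(3.0, 0.3, n_samples)      # W/m-K
        linear_power = rng.normal(25e3, 2e3, n_samples)     # W/m

        def toy_centre_temperature(q, k, gap):
            """Stand-in response function, NOT a real fuel temperature model."""
            return 600.0 + q / (4 * np.pi * k) + 2.0e6 * gap

        t_centre = toy_centre_temperature(linear_power, conductivity, gap_size)
        print(f"mean = {t_centre.mean():.0f} K, std = {t_centre.std():.0f} K, "
              f"95th percentile = {np.percentile(t_centre, 95):.0f} K")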

  15. Modeling of fission product release in integral codes

    International Nuclear Information System (INIS)

    Obaidurrahman, K.; Raman, Rupak K.; Gaikwad, Avinash J.

    2014-01-01

    The Great Tohoku earthquake and tsunami that struck the Fukushima-Daiichi nuclear power station on March 11, 2011 intensified the need for detailed nuclear safety research, and with this objective all streams associated with severe accident phenomenology are being revisited thoroughly. The present paper gives an overview of the state-of-the-art FP release models in use, the important phenomena considered in semi-mechanistic models, and the knowledge gaps in present FP release modelling. The capability of the FP release module ELSA of the ASTEC integral code to appropriately predict FP release under several diversified core-degradation conditions is also demonstrated. The use of semi-mechanistic fission product release models at AERB in source-term estimation is briefly described. (author)
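
    Many semi-mechanistic fission product release models reduce, at their core, to an empirical release-rate law of the general form k(T) = A exp(BT), as in CORSOR-type correlations. The sketch below uses that form with purely illustrative coefficients; they are not ELSA/ASTEC or CORSOR values for any particular element.

        # Sketch of a CORSOR-style empirical release-rate law k(T) = A*exp(B*T).
        # The coefficients are illustrative only, not values for any real element.
        import math

        A = 1.0e-6   # illustrative pre-exponential factor (fraction per minute)
        B = 5.0e-3   # illustrative temperature coefficient (1/K)

        def released_fraction(temperature_k, minutes):
            """Fraction of the initial inventory released after a hold at temperature."""
            k = A * math.exp(B * temperature_k)   # fractional release rate (1/min)
            return 1.0 - math.exp(-k * minutes)

        for temp in (1500.0, 2000.0, 2500.0):
            print(f"T = {temp:.0f} K: release after 30 min = "
                  f"{released_fraction(temp, 30.0):.3f}")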

  16. QR CODES IN EDUCATION AND COMMUNICATION

    Directory of Open Access Journals (Sweden)

    Gurhan DURAK

    2016-04-01

    Full Text Available Technological advances have brought applications of innovations to education. Conventional education increasingly flourishes with new technologies accompanied by more learner-active environments. In this continuum, there are learners who prefer self-learning. Traditional learning materials are giving way to attractive, motivating and technologically enhanced learning materials. The QR (Quick Response Codes are one of these innovations. The aim of this study is to redesign a lesson unit supported with QR Codes and to get the learner views about the redesigned material. For this purpose, the redesigned lesson unit was delivered to 15 learners in Balıkesir University in the academic year of 2013-2014. The learners were asked to study the material. Learners who had smart phones and Internet access were chosen for the study. To provide sectional diversity, three groups were created, with learners from the Faculty of Education, the Faculty of Science and Literature, and the Faculty of Engineering. In the semi-structured interviews that were held, the learners were asked about their prior knowledge of QR Codes, QR Codes' contribution to learning, difficulties with using QR Codes, and design issues. Descriptive data analysis was used in the study. The findings were interpreted on the basis of the Theory of Diffusion of Innovations and the Theory of Uses and Gratifications. After the research, the themes found were awareness of QR Codes, types of QR Codes and applications, contributions to learning, and proliferation of QR Codes. Generally, the learners participating in the study reported that they were aware of QR Codes; that they could use the QR Codes; and that using QR Codes in education was useful. They also expressed that such features as visual elements, attractiveness and direct routing had positive impact on learning. In addition, they generally mentioned that they did not have any difficulty using QR Codes; that they liked the design; and that the content should

  17. Trichloroethylene: Mechanistic, epidemiologic and other supporting evidence of carcinogenic hazard.

    Science.gov (United States)

    Rusyn, Ivan; Chiu, Weihsueh A; Lash, Lawrence H; Kromhout, Hans; Hansen, Johnni; Guyton, Kathryn Z

    2014-01-01

    The chlorinated solvent trichloroethylene (TCE) is a ubiquitous environmental pollutant. The carcinogenic hazard of TCE was the subject of a 2012 evaluation by a Working Group of the International Agency for Research on Cancer (IARC). Information on exposures, relevant data from epidemiologic studies, bioassays in experimental animals, and toxicity and mechanism of action studies was used to conclude that TCE is carcinogenic to humans (Group 1). This article summarizes the key evidence forming the scientific bases for the IARC classification. Exposure to TCE from environmental sources (including hazardous waste sites and contaminated water) is common throughout the world. While workplace use of TCE has been declining, occupational exposures remain of concern, especially in developing countries. The strongest human evidence is from studies of occupational TCE exposure and kidney cancer. Positive, although less consistent, associations were reported for liver cancer and non-Hodgkin lymphoma. TCE is carcinogenic at multiple sites in multiple species and strains of experimental animals. The mechanistic evidence includes extensive data on the toxicokinetics and genotoxicity of TCE and its metabolites. Together, available evidence provided a cohesive database supporting the human cancer hazard of TCE, particularly in the kidney. For other target sites of carcinogenicity, mechanistic and other data were found to be more limited. Important sources of susceptibility to TCE toxicity and carcinogenicity were also reviewed by the Working Group. In all, consideration of the multiple evidence streams presented herein informed the IARC conclusions regarding the carcinogenicity of TCE. © 2013.

  18. The coefficient of restitution of pressurized balls: a mechanistic model

    Science.gov (United States)

    Georgallas, Alex; Landry, Gaëtan

    2016-01-01

    Pressurized, inflated balls used in professional sports are regulated so that their behaviour upon impact can be anticipated and allow the game to have its distinctive character. However, the dynamics governing the impacts of such balls, even on stationary hard surfaces, can be extremely complex. The energy transformations, which arise from the compression of the gas within the ball and from the shear forces associated with the deformation of the wall, are examined in this paper. We develop a simple mechanistic model of the dependence of the coefficient of restitution, e, upon both the gauge pressure, P_G, of the gas and the shear modulus, G, of the wall. The model is validated using the results from a simple series of experiments using three different sports balls. The fits to the data are extremely good for P_G > 25 kPa and consistent values are obtained for the value of G for the wall material. As far as the authors can tell, this simple, mechanistic model of the pressure dependence of the coefficient of restitution is the first in the literature. Keywords: coefficient of restitution, dynamics, inflated balls, pressure, impact model.

  19. New web-based applications for mechanistic case diagramming

    Directory of Open Access Journals (Sweden)

    Fred R. Dee

    2014-07-01

    Full Text Available The goal of mechanistic case diagramming (MCD) is to provide students with a more in-depth understanding of cause and effect relationships and basic mechanistic pathways in medicine. This will enable them to better explain how observed clinical findings develop from preceding pathogenic and pathophysiological events. The pedagogic function of MCD is in relating risk factors, disease entities and morphology, signs and symptoms, and test and procedure findings in a specific case scenario with etiologic pathogenic and pathophysiological sequences within a flow diagram. In this paper, we describe the addition of automation and predetermined lists to further develop the original concept of MCD as described by Engelberg in 1992 and Guerrero in 2001. We demonstrate that with these modifications, MCD is effective and efficient in small group case-based teaching for second-year medical students (ratings of ~3.4 on a 4.0 scale). There was also a significant correlation with other measures of competency, with a 'true' score correlation of 0.54. A traditional calculation of reliability showed promising results (α = 0.47) within a low-stakes, ungraded environment. Further, we have demonstrated MCD's potential for use in independent learning and TBL. Future studies are needed to evaluate MCD's potential for use in medium-stakes assessment or self-paced independent learning and assessment. MCD may be especially relevant in returning students to the application of basic medical science mechanisms in the clinical years.

  20. European Validation of the Integral Code ASTEC (EVITA)

    International Nuclear Information System (INIS)

    Allelein, H.-J.; Neu, K.; Dorsselaere, J.P. Van

    2005-01-01

    The main objective of the European Validation of the Integral Code ASTEC (EVITA) project is to distribute the severe accident integral code ASTEC to European partners in order to apply the validation strategy issued from the VASA project (4th EC FWP). Partners evaluate the code capability through validation on reference experiments and plant applications accounting for severe accident management measures, and compare results with reference codes. The basis version V0 of ASTEC (Accident Source Term Evaluation Code) - commonly developed and basically validated by GRS and IRSN - was made available in late 2000 for the EVITA partners on their individual platforms. Users' training was performed by IRSN and GRS. The portability of the code on different computers was checked and found to be correct. A 'hot line' assistance service, continuously available to EVITA code users, was set up. The current version, V1, was released to the EVITA partners at the end of June 2002. It allows the front-end phase to be simulated by two new modules: - for reactor coolant system two-phase simplified thermal hydraulics (5-equation approach) during both the front-end and core degradation phases; - for core degradation, based on the structure and main models of ICARE2 (IRSN), the reference mechanistic code for core degradation, and on other simplified models. Next priorities are clearly identified: code consolidation in order to increase the robustness, extension of all plant applications beyond vessel lower head failure and coupling with fission product modules, and continuous improvement of users' tools. As EVITA has very successfully made the first step towards providing end-users (such as utilities, vendors and licensing authorities) with a well validated European integral code for the simulation of severe accidents in NPPs, the EVITA partners strongly recommend continuing the validation, benchmarking and application of ASTEC. This work will continue in the Severe Accident Research Network (SARNET) in the 6th Framework Programme

  1. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes....

  2. Steam explosion simulation code JASMINE v.3 user's guide

    International Nuclear Information System (INIS)

    Moriyama, Kiyofumi; Maruyama, Yu; Nakamura, Hideo

    2008-07-01

    A steam explosion occurs when hot liquid contacts with cold volatile liquid. In this phenomenon, fine fragmentation of the hot liquid causes extremely rapid heat transfer from the hot liquid to the cold volatile liquid, and explosive vaporization, bringing shock waves and destructive forces. The steam explosion due to the contact of the molten core material and coolant water during severe accidents of light water reactors has been regarded as a potential threat to the integrity of the containment vessel. We developed a mechanistic steam explosion simulation code, JASMINE, that is applicable to plant scale assessment of the steam explosion loads. This document, as a manual for users of JASMINE code, describes the models, numerical solution methods, and also some verification and example calculations, as well as practical instructions for input preparation and usage of the code. (author)

  3. A codificação moral da medicina: avanços e desafios na formação dos médicos [Producing a moral code for medicine: advances and challenges in medical training]

    Directory of Open Access Journals (Sweden)

    Roberto Luiz d'Avila

    2010-12-01

    Full Text Available The aim of this article is to discuss how the moral code for medicine is produced and to demonstrate the need for technical training to be expanded to include humanitarian features. To this end, it first addresses medicine as a science and as an art founded on moral principles. Then, it reviews the historical context of the production of the moral codes for medicine, with a specific focus on Brazil. Finally, it discusses the prevailing medical code, pointing out the importance of not regarding it as a strictly punitive tool, but as a set of guidelines for the promotion of the well-being of patients and the improvement of society in general. It concludes by indicating the need for medical schools to go beyond technical training and provide ongoing preparation for dealing with humanitarian issues, thereby developing the sense of morality among the students and future doctors.

  4. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach focused on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  5. A new code for predicting the thermo-mechanical and irradiation behavior of metallic fuels in sodium fast reactors

    Energy Technology Data Exchange (ETDEWEB)

    Karahan, Aydin, E-mail: karahan@mit.ed [Center for Advanced Nuclear Energy Systems, Nuclear Science and Engineering Department, Massachusetts Institute of Technology (United States); Buongiorno, Jacopo [Center for Advanced Nuclear Energy Systems, Nuclear Science and Engineering Department, Massachusetts Institute of Technology (United States)

    2010-01-31

    An engineering code to predict the irradiation behavior of U-Zr and U-Pu-Zr metallic alloy fuel pins and UO{sub 2}-PuO{sub 2} mixed oxide fuel pins in sodium-cooled fast reactors was developed. The code was named Fuel Engineering and Structural analysis Tool (FEAST). FEAST has several modules working in coupled form with an explicit numerical algorithm. These modules describe fission gas release and fuel swelling, fuel chemistry and restructuring, temperature distribution, fuel-clad chemical interaction, and fuel and clad mechanical analysis including transient creep-fracture for the clad. Given the fuel pin geometry, composition and irradiation history, FEAST can analyze fuel and clad thermo-mechanical behavior at both steady-state and design-basis (non-disruptive) transient scenarios. FEAST was written in FORTRAN-90 and has a simple input file similar to that of the LWR fuel code FRAPCON. The metal-fuel version is called FEAST-METAL, and is described in this paper. The oxide-fuel version, FEAST-OXIDE is described in a companion paper. With respect to the old Argonne National Laboratory code LIFE-METAL and other same-generation codes, FEAST-METAL emphasizes more mechanistic, less empirical models, whenever available. Specifically, fission gas release and swelling are modeled with the GRSIS algorithm, which is based on detailed tracking of fission gas bubbles within the metal fuel. Migration of the fuel constituents is modeled by means of thermo-transport theory. Fuel-clad chemical interaction models based on precipitation kinetics were developed for steady-state operation and transients. Finally, a transient intergranular creep-fracture model for the clad, which tracks the nucleation and growth of the cavities at the grain boundaries, was developed for and implemented in the code. Reducing the empiricism in the constitutive models should make it more acceptable to extrapolate FEAST-METAL to new fuel compositions and higher burnup, as envisioned in advanced sodium

  6. A new code for predicting the thermo-mechanical and irradiation behavior of metallic fuels in sodium fast reactors

    International Nuclear Information System (INIS)

    Karahan, Aydin; Buongiorno, Jacopo

    2010-01-01

    An engineering code to predict the irradiation behavior of U-Zr and U-Pu-Zr metallic alloy fuel pins and UO 2 -PuO 2 mixed oxide fuel pins in sodium-cooled fast reactors was developed. The code was named Fuel Engineering and Structural analysis Tool (FEAST). FEAST has several modules working in coupled form with an explicit numerical algorithm. These modules describe fission gas release and fuel swelling, fuel chemistry and restructuring, temperature distribution, fuel-clad chemical interaction, and fuel and clad mechanical analysis including transient creep-fracture for the clad. Given the fuel pin geometry, composition and irradiation history, FEAST can analyze fuel and clad thermo-mechanical behavior at both steady-state and design-basis (non-disruptive) transient scenarios. FEAST was written in FORTRAN-90 and has a simple input file similar to that of the LWR fuel code FRAPCON. The metal-fuel version is called FEAST-METAL, and is described in this paper. The oxide-fuel version, FEAST-OXIDE is described in a companion paper. With respect to the old Argonne National Laboratory code LIFE-METAL and other same-generation codes, FEAST-METAL emphasizes more mechanistic, less empirical models, whenever available. Specifically, fission gas release and swelling are modeled with the GRSIS algorithm, which is based on detailed tracking of fission gas bubbles within the metal fuel. Migration of the fuel constituents is modeled by means of thermo-transport theory. Fuel-clad chemical interaction models based on precipitation kinetics were developed for steady-state operation and transients. Finally, a transient intergranular creep-fracture model for the clad, which tracks the nucleation and growth of the cavities at the grain boundaries, was developed for and implemented in the code. Reducing the empiricism in the constitutive models should make it more acceptable to extrapolate FEAST-METAL to new fuel compositions and higher burnup, as envisioned in advanced sodium reactors

  7. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated...... code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code...... avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  8. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  9. Theory of epigenetic coding.

    Science.gov (United States)

    Elder, D

    1984-06-07

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code; the combinatorial to coding identity of units, the non-combinatorial to coding production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.

  10. DISP1 code

    International Nuclear Information System (INIS)

    Vokac, P.

    1999-12-01

    DISP1 code is a simple tool for assessment of the dispersion of the fission product cloud escaping from a nuclear power plant after an accident. The code makes it possible to tentatively check the feasibility of calculations by more complex PSA3 codes and/or codes for real-time dispersion calculations. The number of input parameters is reasonably low and the user interface is simple enough to allow a rapid processing of sensitivity analyses. All input data entered through the user interface are stored in the text format. Implementation of dispersion model corrections taken from the ARCON96 code enables the DISP1 code to be employed for assessment of the radiation hazard within the NPP area, in the control room for instance. (P.A.)
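
    A simple dispersion assessment of this kind can be illustrated with a straight-line Gaussian plume calculation. The sketch below is not the DISP1 or ARCON96 model; the sigma power laws are rough illustrative fits for a single neutral stability class, and the source term, wind speed and release height are placeholders.

        # Sketch of a ground-level Gaussian plume concentration calculation, of the
        # general kind simple dispersion tools implement. NOT the DISP1/ARCON96
        # model; the sigma power laws and inputs below are illustrative.
        import math

        def ground_level_concentration(q_bq_s, u_m_s, x_m, y_m, h_m):
            """Ground-level air concentration (Bq/m3) at downwind distance x, offset y."""
            sigma_y = 0.08 * x_m ** 0.9      # illustrative horizontal dispersion (m)
            sigma_z = 0.06 * x_m ** 0.85     # illustrative vertical dispersion (m)
            return (q_bq_s / (math.pi * u_m_s * sigma_y * sigma_z)
                    * math.exp(-0.5 * (y_m / sigma_y) ** 2)
                    * math.exp(-0.5 * (h_m / sigma_z) ** 2))

        for x in (100.0, 500.0, 1000.0, 5000.0):
            chi = ground_level_concentration(q_bq_s=1e9, u_m_s=2.0,
                                             x_m=x, y_m=0.0, h_m=30.0)
            print(f"x = {x:6.0f} m: concentration ~ {chi:.2e} Bq/m3")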

  11. Phonological coding during reading.

    Science.gov (United States)

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  12. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)]

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  13. Mechanistic model for Sr and Ba release from severely damaged fuel

    International Nuclear Information System (INIS)

    Rest, J.; Cronenberg, A.W.

    1985-11-01

    Among radionuclides associated with fission product release during severe accidents, the primary ones with health consequences are the volatile species of I, Te, and Cs, and the next most important are Sr, Ba, and Ru. Considerable progress has been made in the mechanistic understanding of I, Cs, Te, and noble gas release; however, no capability presently exists for estimating the release of Sr, Ba, and Ru. This paper presents a description of the primary physical/chemical models recently incorporated into the FASTGRASS-VFP (volatile fission product) code for the estimation of Sr and Ba release. FASTGRASS-VFP release predictions are compared with two data sets: (1) data from out-of-reactor induction-heating experiments on declad low-burnup (1000 and 4000 MWd/t) pellets, and (2) data from the more recent in-reactor PBF Severe Fuel Damage Tests, in which one-meter-long, trace-irradiated (89 MWd/t) and normally irradiated (approx.35,000 MWd/t) fuel rods were tested under accident conditions. 10 refs

  14. Energy Efficiency Program Administrators and Building Energy Codes

    Science.gov (United States)

    Explore how energy efficiency program administrators have helped advance building energy codes at federal, state, and local levels—using technical, institutional, financial, and other resources—and discusses potential next steps.

  15. MELCOR code modeling for APR1400

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Young; Park, S. Y.; Kim, D. H.; Ahn, K. I.; Song, Y. M.; Kim, S. D.; Park, J. H

    2001-11-01

    The severe accident phenomena of nuclear power plants have large uncertainties. For the retention of containment integrity and the improvement of nuclear reactor safety against severe accidents, it is essential to understand severe accident phenomena and to be able to assess the accident progression accurately using a computer code. Furthermore, it is important to attain a capability for developing techniques and assessment tools for an advanced nuclear reactor design as well as for severe accident prevention and mitigation. The objective of this report is to establish technical bases for an application of the MELCOR code to the Korean Next Generation Reactor (APR1400) by modeling the plant and analyzing the plant steady state. This report presents the data and the input preparation for the MELCOR code as well as steady-state assessment results using the MELCOR code.

  16. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described
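
    The random-walk idea behind Monte Carlo transport can be shown with a toy one-dimensional slab problem. The sketch below carries none of MORSE's multigroup physics; the cross section, scattering probability and slab thickness are invented.

        # Toy Monte Carlo estimate of transmission through a 1-D slab, illustrating
        # the random-walk idea behind Monte Carlo transport. None of MORSE's
        # multigroup physics is included; all data below are invented.
        import math
        import random

        random.seed(0)
        SIGMA_T = 0.2        # total macroscopic cross section (1/cm), invented
        SCATTER_PROB = 0.6   # probability that a collision is a scatter, invented
        THICKNESS = 10.0     # slab thickness (cm)

        def transmitted(n_histories=100_000):
            count = 0
            for _ in range(n_histories):
                x, mu = 0.0, 1.0                     # start at surface, moving forward
                while True:
                    # Sample the distance to the next collision and move the particle
                    x += mu * (-math.log(random.random()) / SIGMA_T)
                    if x >= THICKNESS:
                        count += 1                   # escaped through the far face
                        break
                    if x < 0.0 or random.random() > SCATTER_PROB:
                        break                        # leaked back out or absorbed
                    mu = 2.0 * random.random() - 1.0 # isotropic scatter: new direction
            return count / n_histories

        print(f"estimated transmission = {transmitted():.4f}")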

  17. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown
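
    Creating a QR code programmatically takes only a few lines, for example with the third-party Python package qrcode (installed via pip install qrcode[pil]); the URL and output file name below are placeholders.

        # Minimal example of generating a QR code that points to a URL, using the
        # third-party "qrcode" package; the URL and file name are placeholders.
        import qrcode

        img = qrcode.make("https://example.com/landing-page")
        img.save("campaign_qr.png")
        print("QR code written to campaign_qr.png")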

  18. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  19. Efficient Coding of Information: Huffman Coding

    Indian Academy of Sciences (India)

    ... to a stream of equally-likely symbols so as to recover the original stream in the event of errors. ... The source-coding problem is one of finding a mapping from U to a ... probability that the random variable X takes the value x, written as ...
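
    The construction the article discusses can be sketched in a few lines: build the Huffman tree by repeatedly merging the two least probable subtrees. This is a generic textbook implementation, not code from the article; the example probabilities are arbitrary.

        # Minimal Huffman code construction from symbol probabilities using a heap;
        # a generic textbook implementation with arbitrary example probabilities.
        import heapq
        from itertools import count

        def huffman_code(probabilities):
            """Return a dict symbol -> binary codeword for the given probabilities."""
            tie = count()  # tie-breaker so heapq never compares the code dictionaries
            heap = [(p, next(tie), {symbol: ""}) for symbol, p in probabilities.items()]
            heapq.heapify(heap)
            while len(heap) > 1:
                p1, _, codes1 = heapq.heappop(heap)   # two least probable subtrees
                p2, _, codes2 = heapq.heappop(heap)
                merged = {s: "0" + c for s, c in codes1.items()}
                merged.update({s: "1" + c for s, c in codes2.items()})
                heapq.heappush(heap, (p1 + p2, next(tie), merged))
            return heap[0][2]

        code = huffman_code({"a": 0.45, "b": 0.25, "c": 0.15, "d": 0.10, "e": 0.05})
        for symbol, word in sorted(code.items()):
            print(symbol, word)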

  20. NR-code: Nonlinear reconstruction code

    Science.gov (United States)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  1. Oxide fuel pin transient performance analysis and design with the TEMECH code

    International Nuclear Information System (INIS)

    Bard, F.E.; Dutt, S.P.; Hinman, C.A.; Hunter, C.W.; Pitner, A.L.

    1986-01-01

    The TEMECH code is a fast-running, thermal-mechanical-hydraulic, analytical program used to evaluate the transient performance of LMR oxide fuel pins. The code calculates pin deformation and failure probability due to fuel-cladding differential thermal expansion, expansion of fuel upon melting, and fission gas pressurization. The mechanistic fuel model in the code accounts for fuel cracking, crack closure, porosity decrease, and the temperature dependence of fuel creep through the course of the transient. Modeling emphasis has been placed on results obtained from Fuel Cladding Transient Test (FCTT) testing, Transient Fuel Deformation (TFD) tests and TREAT integral fuel pin experiments

  2. Code package {open_quotes}SVECHA{close_quotes}: Modeling of core degradation phenomena at severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Veshchunov, M.S.; Kisselev, A.E.; Palagin, A.V. [Nuclear Safety Institute, Moscow (Russian Federation)] [and others]

    1995-09-01

    The code package SVECHA for the modeling of in-vessel core degradation (CD) phenomena in severe accidents is being developed in the Nuclear Safety Institute, Russian Academy of Science (NSI RAS). The code package presents a detailed mechanistic description of the phenomenology of severe accidents in a reactor core. The modules of the package were developed and validated on separate effect test data. These modules were then successfully implemented in the ICARE2 code and validated against a wide range of integral tests. Validation results have shown good agreement with separate effect tests data and with the integral tests CORA-W1/W2, CORA-13, PHEBUS-B9+.

  3. Application of mechanistic models to fermentation and biocatalysis for next-generation processes

    DEFF Research Database (Denmark)

    Gernaey, Krist; Eliasson Lantz, Anna; Tufvesson, Pär

    2010-01-01

    Mechanistic models are based on deterministic principles, and recently, interest in them has grown substantially. Herein we present an overview of mechanistic models and their applications in biotechnology, including future perspectives. Model utility is highlighted with respect to selection of variables required for measurement, control and process design. In the near future, mechanistic models with a higher degree of detail will play key roles in the development of efficient next-generation fermentation and biocatalytic processes. Moreover, mechanistic models will be used increasingly......
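
    As a generic example of the kind of mechanistic process model discussed in this overview (not a model from the paper itself), a classic Monod batch fermentation model is sketched below with illustrative parameter values.

        # Classic Monod-type batch fermentation model as a generic example of a
        # mechanistic process model; the parameter values are illustrative only.
        from scipy.integrate import solve_ivp

        MU_MAX = 0.4   # maximum specific growth rate (1/h)
        K_S = 0.5      # substrate affinity constant (g/L)
        Y_XS = 0.5     # biomass yield on substrate (g biomass / g substrate)

        def batch(t, y):
            biomass, substrate = y
            s = max(substrate, 0.0)                  # guard against numerical undershoot
            mu = MU_MAX * s / (K_S + s)              # Monod specific growth rate
            return [mu * biomass, -mu * biomass / Y_XS]

        sol = solve_ivp(batch, (0.0, 24.0), [0.1, 20.0], t_eval=range(0, 25, 4))
        for t, x, s in zip(sol.t, sol.y[0], sol.y[1]):
            print(f"t = {t:4.1f} h  X = {x:5.2f} g/L  S = {s:5.2f} g/L")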

  4. Future trends in image coding

    Science.gov (United States)

    Habibi, Ali

    1993-01-01

    The objective of this article is to present a discussion on the future of image data compression in the next two decades. It is virtually impossible to predict with any degree of certainty the breakthroughs in theory and developments, the milestones in advancement of technology and the success of the upcoming commercial products in the market place which will be the main factors in establishing the future stage to image coding. What we propose to do, instead, is look back at the progress in image coding during the last two decades and assess the state of the art in image coding today. Then, by observing the trends in developments of theory, software, and hardware coupled with the future needs for use and dissemination of imagery data and the constraints on the bandwidth and capacity of various networks, predict the future state of image coding. What seems to be certain today is the growing need for bandwidth compression. The television is using a technology which is half a century old and is ready to be replaced by high definition television with an extremely high digital bandwidth. Smart telephones coupled with personal computers and TV monitors accommodating both printed and video data will be common in homes and businesses within the next decade. Efficient and compact digital processing modules using developing technologies will make bandwidth compressed imagery the cheap and preferred alternative in satellite and on-board applications. In view of the above needs, we expect increased activities in development of theory, software, special purpose chips and hardware for image bandwidth compression in the next two decades. The following sections summarize the future trends in these areas.

  5. MARS CODE MANUAL VOLUME III - Programmer's Manual

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Hwang, Moon Kyu; Jeong, Jae Jun; Kim, Kyung Doo; Bae, Sung Won; Lee, Young Jin; Lee, Won Jae

    2010-02-01

    The Korea Atomic Energy Research Institute (KAERI) conceived and started the development of the MARS code with the main objective of producing a state-of-the-art realistic thermal hydraulic systems analysis code with multi-dimensional analysis capability. MARS achieves this objective by very tightly integrating the one-dimensional RELAP5/MOD3 code with the multi-dimensional COBRA-TF code. The method of integration of the two codes is based on dynamic link library techniques, and the system pressure equation matrices of both codes are implicitly integrated and solved simultaneously. In addition, the Equation-Of-State (EOS) for light water was unified by replacing the EOS of COBRA-TF with that of RELAP5. This programmer's manual provides complete overall information on the code structure and the input/output functions of MARS. In addition, brief descriptions of each subroutine and of the major variables used in MARS are also included in this report, so that this report should be very useful for code maintenance. The overall structure of the manual is modeled on that of the RELAP5 manual and as such its layout is very similar to that of RELAP5. This similarity to the RELAP5 input is intentional, as this input scheme allows minimum modification between the inputs of RELAP5 and MARS3.1. The MARS3.1 development team would like to express its appreciation to the RELAP5 Development Team and the USNRC for making this manual possible

  6. Study of experimental validation for combustion analysis of GOTHIC code

    International Nuclear Information System (INIS)

    Lee, J. Y.; Yang, S. Y.; Park, K. C.; Jeong, S. H.

    2001-01-01

    In this study, lumped-parameter and subdivided GOTHIC6 code analyses of the premixed hydrogen combustion experiment at Seoul National University are presented and compared with the experimental results. The experimental facility has a 16367 cc free volume and a rectangular shape. The tests were performed with a unit equivalence ratio of hydrogen and air and with various locations of the igniter. Using the lumped and mechanistic combustion models in the GOTHIC6 code, the experiments were simulated under the same conditions. In the comparison between the experimental and calculated results, the GOTHIC6 prediction of the combustion response does not compare well with the experimental results. In terms of combustion time, the lumped combustion model of the GOTHIC6 code does not simulate the physical phenomena of combustion appropriately. In the case of the mechanistic combustion model, the combustion time is predicted well, but the calculated induction time is remarkably longer than the experimental one. Also, the laminar combustion model of GOTHIC6 is deficient in simulating combustion phenomena unless the user-defined value is controlled appropriately. Moreover, pressure is not a proper variable for characterizing the three-dimensional effects of combustion

  7. Short-Term Memory Coding in Children With Intellectual Disabilities

    OpenAIRE

    Henry, L.; Conners, F.

    2008-01-01

    To examine visual and verbal coding strategies, I asked children with intellectual disabilities and peers matched for MA and CA to perform picture memory span tasks with phonologically similar, visually similar, long, or nonsimilar named items. The CA group showed effects consistent with advanced verbal memory coding (phonological similarity and word length effects). Neither the intellectual disabilities nor MA groups showed evidence for memory coding strategies. However, children in these gr...

  8. Short-Term Memory Coding in Children with Intellectual Disabilities

    Science.gov (United States)

    Henry, Lucy

    2008-01-01

    To examine visual and verbal coding strategies, I asked children with intellectual disabilities and peers matched for MA and CA to perform picture memory span tasks with phonologically similar, visually similar, long, or nonsimilar named items. The CA group showed effects consistent with advanced verbal memory coding (phonological similarity and…

  9. From patterns to emerging processes in mechanistic urban ecology.

    Science.gov (United States)

    Shochat, Eyal; Warren, Paige S; Faeth, Stanley H; McIntyre, Nancy E; Hope, Diane

    2006-04-01

    Rapid urbanization has become an area of crucial concern in conservation owing to the radical changes in habitat structure and loss of species engendered by urban and suburban development. Here, we draw on recent mechanistic ecological studies to argue that, in addition to altered habitat structure, three major processes contribute to the patterns of reduced species diversity and elevated abundance of many species in urban environments. These activities, in turn, lead to changes in animal behavior, morphology and genetics, as well as in selection pressures on animals and plants. Thus, the key to understanding urban patterns is to balance studying processes at the individual level with an integrated examination of environmental forces at the ecosystem scale.

  10. Mechanistic model for void distribution in flashing flow

    International Nuclear Information System (INIS)

    Riznic, J.; Ishii, M.; Afgan, N.

    1987-01-01

    A problem of discharging of an initially subcooled liquid from a high pressure condition into a low pressure environment is quite important in several industrial systems such as nuclear reactors and chemical reactors. A new model for the flashing process is proposed here based on the wall nucleation theory, bubble growth model and drift-flux bubble transport model. In order to calculate the bubble number density, the bubble number transport equation with a distributed source from the wall nucleation sites is used. The model predictions in terms of the void fraction are compared to Moby Dick and BNL experimental data. It shows that satisfactory agreements could be obtained from the present model without any floating parameter to be adjusted with data. This result indicates that, at least for the experimental conditions considered here, the mechanistic prediction of the flashing phenomenon is possible based on the present wall nucleation based model. 43 refs., 4 figs
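
    The transport side of such a model is often closed with the standard drift-flux relation alpha = j_g / (C0 j + V_gj). The sketch below shows only that closure, not the wall-nucleation and bubble-growth models of the paper; the C0 and V_gj values are typical bubbly-flow choices, not fitted ones.

        # Sketch of the standard drift-flux relation relating void fraction to the
        # gas and liquid volumetric fluxes, alpha = j_g / (C0*j + V_gj). Only the
        # transport closure is shown; C0 and V_gj are typical bubbly-flow values.
        def void_fraction(j_g, j_f, c0=1.2, v_gj=0.25):
            """Void fraction from superficial gas velocity j_g and liquid velocity j_f (m/s)."""
            j = j_g + j_f                      # total volumetric flux
            return j_g / (c0 * j + v_gj)

        for jg in (0.05, 0.1, 0.2, 0.5):
            print(f"j_g = {jg:.2f} m/s -> void fraction = {void_fraction(jg, 1.0):.3f}")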

  11. Mechanistic failure mode investigation and resolution of parvovirus retentive filters.

    Science.gov (United States)

    LaCasse, Daniel; Lute, Scott; Fiadeiro, Marcus; Basha, Jonida; Stork, Matthew; Brorson, Kurt; Godavarti, Ranga; Gallo, Chris

    2016-07-08

    Virus retentive filters are a key product safety measure for biopharmaceuticals. A simplistic perception is that they function solely based on a size-based particle removal mechanism of mechanical sieving and retention of particles based on their hydrodynamic size. Recent observations have revealed a more nuanced picture, indicating that changes in viral particle retention can result from process pressure and/or flow interruptions. In this study, a mechanistic investigation was performed to help identify a potential mechanism leading to the reported reduced particle retention in small virus filters. Permeate flow rate or permeate driving force were varied and analyzed for their impact on particle retention in three commercially available small virus retentive filters. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:959-970, 2016. © 2016 American Institute of Chemical Engineers.

  12. A mechanistic model for the evolution of multicellularity

    Science.gov (United States)

    Amado, André; Batista, Carlos; Campos, Paulo R. A.

    2018-02-01

    Through a mechanistic approach we investigate the formation of aggregates of variable sizes, accounting for mechanisms of aggregation, dissociation, death and reproduction. In our model, cells can produce two metabolites, but the simultaneous production of both metabolites is costly in terms of fitness. Thus, the formation of larger groups can favor the evolution of aggregates toward a configuration where division of labor arises. It is assumed that the states of the cells in a group are those that maximize organismal fitness. In the model, groups can grow linearly, forming a chain, or compactly, keeping a roughly spherical shape. Starting from a population of single-celled organisms, we observe the formation of groups of variable sizes, usually much larger than two-cell aggregates. Natural selection can favor the formation of large groups, which allows the system to achieve new and larger fitness maxima.
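
    As a minimal sketch only, and not the authors' model, the group-size dynamics described above (aggregation, dissociation, death and reproduction acting on a population of cell groups) can be prototyped as a stochastic simulation; every rate and the size-dependent reproduction bonus below are invented placeholders.

      import random

      # Illustrative per-step probabilities (placeholders, not the paper's parameters).
      P_AGGREGATE = 0.05   # two groups merge
      P_DISSOCIATE = 0.02  # one cell leaves its group and founds a new one
      P_DEATH = 0.01       # one cell dies
      P_REPRODUCE = 0.10   # one cell divides; larger groups assumed slightly fitter

      def step(groups):
          """Advance a list of group sizes by one stochastic update."""
          new_groups = []
          for size in groups:
              # Reproduction: the size bonus stands in for division-of-labor benefits.
              if random.random() < P_REPRODUCE * (1.0 + 0.1 * min(size, 10)):
                  size += 1
              if size > 1 and random.random() < P_DISSOCIATE:
                  size -= 1
                  new_groups.append(1)
              if random.random() < P_DEATH:
                  size -= 1
              if size > 0:
                  new_groups.append(size)
          # Aggregation: occasionally merge two randomly chosen groups.
          if len(new_groups) > 1 and random.random() < P_AGGREGATE:
              a, b = random.sample(range(len(new_groups)), 2)
              new_groups[a] += new_groups[b]
              del new_groups[b]
          return new_groups

      groups = [1] * 100                     # start from single-celled organisms
      for _ in range(2000):
          groups = step(groups)
          if not groups:                     # extinction guard for extreme parameters
              break
      print("groups:", len(groups), "mean size:", sum(groups) / max(len(groups), 1))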

  13. Mechanistic modelling of genetic and epigenetic events in radiation carcinogenesis

    International Nuclear Information System (INIS)

    Andreev, S. G.; Eidelman, Y. A.; Salnikov, I. V.; Khvostunov, I. K.

    2006-01-01

    Methodological problems arise when radiation carcinogenesis modelling incorporates mechanistic data from radiobiology and cancer biology. The results of biophysical modelling of different endpoints [DNA DSB induction, repair, chromosome aberrations (CA) and cell proliferation] are presented and applied to the analysis of RBE-LET relationships for radiation-induced neoplastic transformation (RINT) of C3H/10T1/2 cells in culture. Predicted values for some endpoints correlate well with the data. It is concluded that slowly repaired DSB clusters, as well as some kinds of CA, may be initiating events for RINT. As an alternative interpretation, it is possible that DNA damage can induce RINT indirectly via an epigenetic process. A hypothetical epigenetic pathway for RINT is discussed. (authors)

  14. Synthetic and mechanistic aspects of titanium-mediated carbonyl olefinations

    Energy Technology Data Exchange (ETDEWEB)

    Petasis, N.A.; Staszewski, J.P.; Hu, Yong-Han; Lu, Shao-Po [Univ. of Southern California, Los Angeles, CA (United States)]

    1995-12-31

    A new method for the olefination of carbonyl compounds with dimethyl titanocene and other related bishydrocarbyl titanocene derivatives has recently been developed in the authors' laboratories. This process is experimentally convenient and works with various types of carbonyl compounds, including aldehydes, ketones, esters, lactones, carbonates, anhydrides, amides, imides, lactams, thioesters, selenoesters, and acylsilanes. More recent studies have focused on the scope and utility of this reaction, including mechanistic studies and synthetic applications. In addition to varying the reaction conditions, the authors have examined several mixed titanocene derivatives and have found ways to carry out this type of olefination at room temperature, such as the use of tris(trimethylsilyl) titanacyclobutene. The authors have also employed this reaction in the modification of carbohydrates and cyclobutenediones. The olefination was also followed up with subsequent transformations to produce carbocycles and heterocycles, including tetrahydrofurans and tetrahydropyrans.

  15. Behavioural Procedural Models – a multipurpose mechanistic account

    Directory of Open Access Journals (Sweden)

    Leonardo Ivarola

    2012-05-01

    Full Text Available In this paper we outline an epistemological defence of what we call Behavioural Procedural Models (BPMs), which represent the processes of individual decisions that lead to relevant economic patterns as psychologically (rather than rationally) driven. Their general structure, and the way in which they may be incorporated into a multipurpose view of models, where the representational and interventionist goals are combined, is shown. It is argued that BPMs may provide “mechanistic-based explanations” in the sense defended by Hedström and Ylikoski (2010), which involve invariant regularities in Woodward’s sense. Such mechanisms provide a causal sort of explanation of anomalous economic patterns, which allows for extra-market intervention and manipulability in order to correct and improve some key individual decisions. This capability sets the basis for so-called libertarian paternalism (Sunstein and Thaler 2003).

  16. DART code optimization works

    International Nuclear Information System (INIS)

    Taboada, Horacio; Solis, Diego

    1999-01-01

    The DART (Dispersion Analysis Research Tool) calculation and assessment program is a thermomechanical computer model developed by Dr. J. Rest of Argonne National Laboratory, USA. This program is the only mechanistic model available to assure the performance of low-enriched oxide-based dispersion fuels and dispersions of silicides and uranium intermetallics in an aluminum matrix for research reactors. The program predicts fission-product-induced swelling (especially from gases), fuel behavior during the closing of fabrication porosity, macroscopic changes in the diameter of rods or the width of plates and tubes produced by fuel deformation, degradation of the thermal conductivity of the fuel dispersion owing to irradiation, and fuel restructuring caused by the Al-fuel reaction, amorphization and recrystallization. (author)

  17. Toxic neuropathies: Mechanistic insights based on a chemical perspective.

    Science.gov (United States)

    LoPachin, Richard M; Gavin, Terrence

    2015-06-02

    2,5-Hexanedione (HD) and acrylamide (ACR) are considered to be prototypical among chemical toxicants that cause central-peripheral axonopathies characterized by distal axon swelling and degeneration. Because the demise of distal regions was assumed to be causally related to the onset of neurotoxicity, substantial effort was devoted to deciphering the respective mechanisms. Continued research, however, revealed that expression of the presumed hallmark morphological features was dependent upon the daily rate of toxicant exposure. Indeed, many studies reported that the corresponding axonopathic changes were late-developing effects that occurred independently of behavioral and/or functional neurotoxicity. This suggested that the toxic axonopathy classification might be based on epiphenomena related to dose-rate. Therefore, the goal of this mini-review is to discuss how quantitative morphometric analyses and the establishment of dose-dependent relationships helped distinguish primary, mechanistically relevant toxicant effects from non-specific consequences. Perhaps more importantly, we will discuss how knowledge of neurotoxicant chemical nature can guide molecular-level research toward a better, more rational understanding of mechanism. Our discussion will focus on HD, the neurotoxic γ-diketone metabolite of the industrial solvents n-hexane and methyl-n-butyl ketone. Early investigations suggested that HD caused giant neurofilamentous axonal swellings and eventual degeneration in the CNS and PNS. However, as our review will point out, this interpretation underwent several iterations as the understanding of γ-diketone chemistry improved and more quantitative experimental approaches were implemented. The chemical concepts and design strategies discussed in this mini-review are broadly applicable to mechanistic studies of other chemicals (e.g., n-propyl bromide, methyl methacrylate) that cause toxic neuropathies. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  18. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
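
    The record describes machine-generated annotations (such as loop invariants) that a verification condition generator turns into first-order proof obligations discharged by E-SETHEO. The fragment below is only a loose, runtime-checked analogue of that idea, not the AUTOBAYES mechanism: a hypothetical "synthesized" routine carrying its precondition, loop invariant and postcondition as explicit, checkable annotations.

      def sum_of_squares(xs):
          """Hypothetical synthesized routine carrying its own annotations."""
          # Precondition (annotation): all inputs are numbers.
          assert all(isinstance(x, (int, float)) for x in xs)
          total = 0.0
          for i, x in enumerate(xs):
              total += x * x
              # Loop invariant (annotation): total equals the partial sum so far.
              assert abs(total - sum(v * v for v in xs[:i + 1])) < 1e-9
          # Postcondition (annotation): the result is non-negative.
          assert total >= 0.0
          return total

      print(sum_of_squares([1.0, 2.0, 3.0]))   # 14.0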

  19. Code of Ethics

    Science.gov (United States)

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  20. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

    Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low-weight codewords, which gives a further advantage in the error floor region.
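
    As a small illustration of the interleaving step the abstract proposes, the sketch below implements a generic row-column block interleaver and its inverse; it is not necessarily the interleaver used in the paper.

      def block_interleave(bits, rows, cols):
          """Write bits row by row into a rows x cols array, read them column by column."""
          assert len(bits) == rows * cols
          return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

      def block_deinterleave(bits, rows, cols):
          """Inverse mapping: write column by column, read row by row."""
          assert len(bits) == rows * cols
          out = [0] * (rows * cols)
          k = 0
          for c in range(cols):
              for r in range(rows):
                  out[r * cols + c] = bits[k]
                  k += 1
          return out

      data = list(range(12))                  # stands in for a block of coded bits
      assert block_deinterleave(block_interleave(data, 3, 4), 3, 4) == data

    Spreading adjacent coded bits apart in this way is the standard purpose of an interleaver; as the abstract notes, here it also reduces the number of low-weight codewords in the combined code.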

  1. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, Current Dental Terminology (CDT) codes are most commonly used by dentists to submit claims, whereas Current Procedural Terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD-9-CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD-9-CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  2. Error Correcting Codes

    Indian Academy of Sciences (India)

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  3. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Lene; Pries-Heje, Jan; Dalgaard, Bente

    2013-01-01

    is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  4. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  5. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 3. Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article Volume 2 Issue 3 March ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  6. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  7. Temporal Coding of Volumetric Imagery

    Science.gov (United States)

    Llull, Patrick Ryan

    'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively. The CACTI camera's ability to embed video volumes into images leads to exploration
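
    A minimal sketch of the coded-exposure idea described above: per-pixel temporal codes modulate each frame before all frames are integrated into a single snapshot, embedding time into one (x, y) measurement. The code pattern and array sizes are arbitrary placeholders, not CACTI's actual masks.

      import numpy as np

      rng = np.random.default_rng(0)
      T, H, W = 8, 64, 64                          # frames, height, width (placeholders)
      video = rng.random((T, H, W))                # the (x, y, t) volume to be sensed
      codes = rng.integers(0, 2, size=(T, H, W))   # binary per-pixel exposure codes

      # Forward model: each frame is modulated by its code, then the frames are
      # summed into a single 2-D measurement.
      measurement = (codes * video).sum(axis=0)
      print(measurement.shape)                     # (64, 64)

    Recovering the T frames from this single snapshot is then posed as a compressive-sensing inverse problem.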

  8. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to the incorrect estimation of the consequences of accidents and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. ''agreement is within 10%''. A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)
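
    One simple quantitative figure of merit of the kind argued for above (illustrative only, not the author's method) is a normalised root-mean-square deviation per code, which immediately induces a ranking.

      import numpy as np

      def nrmsd(pred, meas):
          """Normalised RMS deviation between a code prediction and measurements."""
          pred, meas = np.asarray(pred, float), np.asarray(meas, float)
          return np.sqrt(np.mean((pred - meas) ** 2)) / (meas.max() - meas.min())

      measured = [1.0, 1.2, 1.5, 1.9, 2.4]                 # hypothetical experiment
      predictions = {"code_A": [1.1, 1.2, 1.6, 1.8, 2.5],  # hypothetical code outputs
                     "code_B": [0.9, 1.4, 1.4, 2.1, 2.2]}

      for code in sorted(predictions, key=lambda c: nrmsd(predictions[c], measured)):
          print(code, round(nrmsd(predictions[code], measured), 3))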

  9. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex, and cannot be fully verified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  10. Improving the International Agency for Research on Cancer's consideration of mechanistic evidence

    International Nuclear Information System (INIS)

    Goodman, Julie; Lynch, Heather

    2017-01-01

    Background: The International Agency for Research on Cancer (IARC) recently developed a framework for evaluating mechanistic evidence that includes a list of 10 key characteristics of carcinogens. This framework is useful for identifying and organizing large bodies of literature on carcinogenic mechanisms, but it lacks sufficient guidance for conducting evaluations that fully integrate mechanistic evidence into hazard assessments. Objectives: We summarize the framework, and suggest approaches to strengthen the evaluation of mechanistic evidence using this framework. Discussion: While the framework is useful for organizing mechanistic evidence, its lack of guidance for implementation limits its utility for understanding human carcinogenic potential. Specifically, it does not include explicit guidance for evaluating the biological significance of mechanistic endpoints, inter- and intra-individual variability, or study quality and relevance. It also does not explicitly address how mechanistic evidence should be integrated with other realms of evidence. Because mechanistic evidence is critical to understanding human cancer hazards, we recommend that IARC develop transparent and systematic guidelines for the use of this framework so that mechanistic evidence will be evaluated and integrated in a robust manner, and concurrently with other realms of evidence, to reach a final human cancer hazard conclusion. Conclusions: IARC does not currently provide a standardized approach to evaluating mechanistic evidence. Incorporating the recommendations discussed here will make IARC analyses of mechanistic evidence more transparent, and lead to assessments of cancer hazards that reflect the weight of the scientific evidence and allow for scientifically defensible decision-making. - Highlights: • IARC has a revised framework for evaluating literature on carcinogenic mechanisms. • The framework is based on 10 key characteristics of carcinogens. • IARC should develop transparent

  11. Improving the International Agency for Research on Cancer's consideration of mechanistic evidence

    Energy Technology Data Exchange (ETDEWEB)

    Goodman, Julie, E-mail: jgoodman@gradientcorp.com; Lynch, Heather

    2017-03-15

    Background: The International Agency for Research on Cancer (IARC) recently developed a framework for evaluating mechanistic evidence that includes a list of 10 key characteristics of carcinogens. This framework is useful for identifying and organizing large bodies of literature on carcinogenic mechanisms, but it lacks sufficient guidance for conducting evaluations that fully integrate mechanistic evidence into hazard assessments. Objectives: We summarize the framework, and suggest approaches to strengthen the evaluation of mechanistic evidence using this framework. Discussion: While the framework is useful for organizing mechanistic evidence, its lack of guidance for implementation limits its utility for understanding human carcinogenic potential. Specifically, it does not include explicit guidance for evaluating the biological significance of mechanistic endpoints, inter- and intra-individual variability, or study quality and relevance. It also does not explicitly address how mechanistic evidence should be integrated with other realms of evidence. Because mechanistic evidence is critical to understanding human cancer hazards, we recommend that IARC develop transparent and systematic guidelines for the use of this framework so that mechanistic evidence will be evaluated and integrated in a robust manner, and concurrently with other realms of evidence, to reach a final human cancer hazard conclusion. Conclusions: IARC does not currently provide a standardized approach to evaluating mechanistic evidence. Incorporating the recommendations discussed here will make IARC analyses of mechanistic evidence more transparent, and lead to assessments of cancer hazards that reflect the weight of the scientific evidence and allow for scientifically defensible decision-making. - Highlights: • IARC has a revised framework for evaluating literature on carcinogenic mechanisms. • The framework is based on 10 key characteristics of carcinogens. • IARC should develop transparent

  12. Optics code development at Los Alamos

    International Nuclear Information System (INIS)

    Mottershead, C.T.; Lysenko, W.P.

    1988-01-01

    This paper is an overview of part of the beam optics code development effort in the Accelerator Technology Division at Los Alamos National Laboratory. The aim of this effort is to improve our capability to design advanced beam optics systems. The work reported is being carried out by a collaboration of permanent staff members, visiting consultants, and student research assistants. The main components of the effort are: building a new framework of common supporting utilities and software tools to facilitate further development; research and development on basic computational techniques in classical mechanics and electrodynamics; and evaluation and comparison of existing beam optics codes, and support for their continuing development.

  13. Digital color acquisition, perception, coding and rendering

    CERN Document Server

    Fernandez-Maloigne, Christine; Macaire, Ludovic

    2013-01-01

    In this book the authors identify the basic concepts and recent advances in the acquisition, perception, coding and rendering of color. The fundamental aspects related to the science of colorimetry in relation to physiology (the human visual system) are addressed, as are constancy and color appearance. It also addresses the more technical aspects related to sensors and the color management screen. Particular attention is paid to the notion of color rendering in computer graphics. Beyond color, the authors also look at coding, compression, protection and quality of color images and videos.

  14. Optics code development at Los Alamos

    International Nuclear Information System (INIS)

    Mottershead, C.T.; Lysenko, W.P.

    1988-01-01

    This paper is an overview of part of the beam optics code development effort in the Accelerator Technology Division at Los Alamos National Laboratory. The aim of this effort is to improve our capability to design advanced beam optics systems. The work reported is being carried out by a collaboration of permanent staff members, visiting consultants, and student research assistants. The main components of the effort are: building a new framework of common supporting utilities and software tools to facilitate further development; research and development on basic computational techniques in classical mechanics and electrodynamics; and evaluation and comparison of existing beam optics codes, and support for their continuing development. 17 refs

  15. IFR code for secondary particle dynamics

    International Nuclear Information System (INIS)

    Teague, M.R.; Yu, S.S.

    1985-01-01

    A numerical simulation has been constructed to obtain a detailed, quantitative estimate of the electromagnetic fields and currents existing in the Advanced Test Accelerator under conditions of laser guiding. The code treats the secondary electrons by particle simulation and the beam dynamics by a time-dependent envelope model. The simulation gives a fully relativistic description of secondary electrons moving in self-consistent electromagnetic fields. The calculations are made using coordinates t, x, y, z for the electrons and t, ct-z, r for the axisymmetric electromagnetic fields and currents. Code results, showing in particular current enhancement effects, will be given

  16. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  17. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  18. Mechanistic Framework for Establishment, Maintenance, and Alteration of Cell Polarity in Plants

    Directory of Open Access Journals (Sweden)

    Pankaj Dhonukshe

    2012-01-01

    Full Text Available Cell polarity establishment, maintenance, and alteration are central to the developmental and response programs of nearly all organisms and are often implicated in abnormalities ranging from patterning defects to cancer. By residing at distinct plasma membrane domains, polar cargoes mark the identities of those domains and execute localized functions. Polar cargoes are recruited to the specialized membrane domains by directional secretion and/or directional endocytic recycling. In plants, auxin efflux carrier PIN proteins display polar localizations in various cell types and play major roles in directional cell-to-cell transport of the signaling molecule auxin, which is vital for plant patterning and response programs. Recent advanced microscopy studies applied to single cells in intact plants reveal subcellular PIN dynamics. They uncover the PIN polarity generation mechanism and identify important roles of AGC kinases in polar PIN localization. AGC kinase family members PINOID, WAG1, and WAG2, belonging to the AGC-3 subclass, predominantly influence the polar localization of PINs. The emerging mechanism of AGC-3 kinase action suggests that the kinases phosphorylate PINs mainly at the plasma membrane, after initial symmetric PIN secretion, for eventual PIN internalization and PIN sorting into distinct ARF-GEF-regulated polar recycling pathways. Thus, phosphorylation status directs PIN translocation to different cell sides. Based on these findings, a mechanistic framework emerges that suggests the existence of cell-side-specific recycling pathways in plants and implicates AGC-3 kinases in differential PIN recruitment among them for eventual PIN polarity establishment, maintenance, and alteration.

  19. The challenge of making ozone risk assessment for forest trees more mechanistic

    International Nuclear Information System (INIS)

    Matyssek, R.; Sandermann, H.; Wieser, G.; Booker, F.; Cieslik, S.; Musselman, R.; Ernst, D.

    2008-01-01

    Upcoming decades will experience increasing atmospheric CO2 and likely enhanced O3 exposure, which represents a risk for the carbon sink strength of forests, so that the need for cause-effect related O3 risk assessment increases. Although assessment will gain in reliability on an O3 uptake basis, risk is co-determined by the effective dose, i.e. the plant's sensitivity per unit O3 uptake. Recent progress in research on the molecular and metabolic control of the effective O3 dose is reported, along with advances in empirically assessing O3 uptake at the whole-tree and stand level. Knowledge on both O3 uptake and effective dose (measures of stress avoidance and tolerance, respectively) needs to be understood mechanistically and linked as a pre-requisite before practical use of process-based O3 risk assessment can be implemented. To this end, perspectives are derived for validating and promoting new O3 flux-based modelling tools. - Clarifying and linking mechanisms of O3 uptake and effective dose are research challenges highlighted in view of recent progress and perspectives towards cause-effect based risk assessment.

  20. Mechanistic insight into oxide-promoted palladium catalysts for the electro-oxidation of ethanol.

    Science.gov (United States)

    Martinez, Ulises; Serov, Alexey; Padilla, Monica; Atanassov, Plamen

    2014-08-01

    Recent advancements in the development of alternatives to proton exchange membrane fuel cells utilizing less-expensive catalysts and renewable liquid fuels, such as alcohols, have been observed for alkaline fuel cell systems. Alcohol fuels present the advantage of not facing the challenge of storage and transportation encountered with hydrogen fuel. Oxidation of alcohols has been improved by the promotion of alloyed or secondary phases. Nevertheless, currently, there is no experimental understanding of the difference between an intrinsic and a synergistic promotion effect in high-pH environments. This report shows evidence of different types of promotion effects on palladium electrocatalysts obtained from the presence of an oxide phase for the oxidation of ethanol. The correlation of mechanistic in situ IR spectroscopic studies with electrochemical voltammetry studies on two similar electrocatalytic systems allows the role of either an alloyed or a secondary phase on the mechanism of oxidation of ethanol to be elucidated. Evidence is presented for the difference between an intrinsic effect obtained from an alloyed system and a synergistic effect produced by the presence of an oxide phase. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Blinded prospective evaluation of computer-based mechanistic schizophrenia disease model for predicting drug response.

    Directory of Open Access Journals (Sweden)

    Hugo Geerts

    Full Text Available The tremendous advances in understanding the neurobiological circuits involved in schizophrenia have not translated into more effective treatments. An alternative strategy is to use a recently published 'Quantitative Systems Pharmacology' computer-based mechanistic disease model of cortical/subcortical and striatal circuits based upon preclinical physiology, human pathology and pharmacology. The physiology of 27 relevant dopamine, serotonin, acetylcholine, norepinephrine, gamma-aminobutyric acid (GABA) and glutamate-mediated targets is calibrated using retrospective clinical data on 24 different antipsychotics. The model was challenged to predict quantitatively, in a blinded fashion, the clinical outcome of two experimental antipsychotic drugs: JNJ37822681, a highly selective low-affinity dopamine D(2) antagonist, and ocaperidone, a very high affinity dopamine D(2) antagonist, using only pharmacology and human positron emission tomography (PET) imaging data. The model correctly predicted the lower performance of JNJ37822681 on the positive and negative syndrome scale (PANSS) total score and the higher extra-pyramidal symptom (EPS) liability compared to olanzapine, and the relative performance of ocaperidone against olanzapine, but did not predict the absolute PANSS total score outcome and EPS liability for ocaperidone, possibly due to placebo responses and EPS assessment methods. Because of its virtual nature, this modeling approach can support central nervous system research and development by accounting for unique human drug properties, such as human metabolites, exposure, genotypes and off-target effects, and can be a helpful tool for drug discovery and development.

  2. Proceedings of the international workshop on mechanistic understanding of radionuclide migration in compacted/intact systems

    International Nuclear Information System (INIS)

    Tachi, Yukio; Yui, Mikazu

    2010-03-01

    The international workshop on mechanistic understanding of radionuclide migration in compacted / intact systems was held at ENTRY, JAEA, Tokai on 21st - 23rd January, 2009. This workshop was hosted by the Japan Atomic Energy Agency (JAEA) as part of the project on mechanistic model/database development for radionuclide sorption and diffusion behavior in compacted / intact systems. The overall goal of the project is to develop the mechanistic model / database for a consistent understanding and prediction of migration parameters and their uncertainties for performance assessment of geological disposal of radioactive waste. The objective of the workshop was to integrate the state of the art of mechanistic sorption and diffusion modelling in compacted / intact systems, especially in bentonite / clay systems, and to discuss JAEA's mechanistic approaches and future challenges, especially the following discussion points: 1) What are the status and difficulties of mechanistic model/database development? 2) What are the status and difficulties of applying mechanistic models to compacted/intact systems? 3) What are the status and difficulties of obtaining evidence for mechanistic models? 4) What are the status and difficulties of standardizing experimental methodology for batch sorption and diffusion? 5) What are the uncertainties of transport parameters in radionuclide migration analysis due to a lack of understanding/experimental methodologies, and how do we derive them? This report includes the workshop program, an overview and materials of each presentation, and a summary of the discussions. (author)

  3. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  4. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated; each instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...

  5. Transport theory and codes

    International Nuclear Information System (INIS)

    Clancy, B.E.

    1986-01-01

    This chapter begins with a neutron transport equation which includes the one dimensional plane geometry problems, the one dimensional spherical geometry problems, and numerical solutions. The section on the ANISN code and its look-alikes covers problems which can be solved; eigenvalue problems; outer iteration loop; inner iteration loop; and finite difference solution procedures. The input and output data for ANISN is also discussed. Two dimensional problems such as the DOT code are given. Finally, an overview of the Monte-Carlo methods and codes are elaborated on

  6. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
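
    The record describes stabilized linear inverse theory applied iteratively. A common way to stabilise such a linear inversion step is Tikhonov (damped least-squares) regularisation, sketched below with a random matrix standing in for the gravity forward kernel; none of the numbers correspond to the actual code.

      import numpy as np

      def stabilized_inverse_step(G, d, damping=1e-2):
          """Damped least-squares solve of G m = d (one linearised inversion step)."""
          n = G.shape[1]
          return np.linalg.solve(G.T @ G + damping * np.eye(n), G.T @ d)

      rng = np.random.default_rng(1)
      G = rng.normal(size=(40, 20))                 # placeholder forward kernel
      m_true = rng.normal(size=20)                  # placeholder density/topography model
      d = G @ m_true + 0.01 * rng.normal(size=40)   # noisy Bouguer-like data

      m_est = stabilized_inverse_step(G, d)
      print("data misfit:", np.linalg.norm(G @ m_est - d))

    In the program described above, the TREND/INVERT pair iterates steps of this kind until the solution converges, with AVERAGE supplying the covariance analysis.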

  7. MARS code manual volume I: code structure, system models, and solution methods

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Kim, Kyung Doo; Bae, Sung Won; Jeong, Jae Jun; Lee, Seung Wook; Hwang, Moon Kyu; Yoon, Churl

    2010-02-01

    The Korea Atomic Energy Research Institute (KAERI) conceived and started the development of the MARS code with the main objective of producing a state-of-the-art realistic thermal hydraulic systems analysis code with multi-dimensional analysis capability. MARS achieves this objective by very tightly integrating the one-dimensional RELAP5/MOD3 code with the multi-dimensional COBRA-TF code. The method of integration of the two codes is based on dynamic link library techniques, and the system pressure equation matrices of both codes are implicitly integrated and solved simultaneously. In addition, the Equation-Of-State (EOS) for light water was unified by replacing the EOS of COBRA-TF with that of RELAP5. This theory manual provides a complete overview of the code structure and major functions of MARS, including code architecture, the hydrodynamic model, heat structures, the trip / control system and the point reactor kinetics model. Therefore, this report should be very useful for code users. The overall structure of the manual is modeled on that of the RELAP5 manual, and as such the layout is very similar to that of RELAP5. This similarity to RELAP5 input is intentional, as the shared input scheme allows minimal modification between the inputs of RELAP5 and MARS3.1. The MARS3.1 development team would like to express its appreciation to the RELAP5 Development Team and the USNRC for making this manual possible.

  8. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.
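
    Goal (1) above includes estimating the block-code parameter n from data. As a much-simplified stand-in for that goal (a classical rank-deficiency scan, not the evolutionary computing framework the authors propose), candidate block lengths can be tested for GF(2) rank deficiency, which signals an underlying linear (n, k) structure.

      import numpy as np

      def gf2_rank(rows):
          """Rank over GF(2) of a 0/1 matrix given as a 2-D integer array."""
          m = rows.copy() % 2
          rank = 0
          for col in range(m.shape[1]):
              pivot = next((r for r in range(rank, m.shape[0]) if m[r, col]), None)
              if pivot is None:
                  continue
              m[[rank, pivot]] = m[[pivot, rank]]
              for r in range(m.shape[0]):
                  if r != rank and m[r, col]:
                      m[r] ^= m[rank]
              rank += 1
          return rank

      def scan_block_length(bits, candidates):
          """Report rank deficiency (n - rank) for each candidate block length n."""
          for n in candidates:
              blocks = np.array([bits[i:i + n] for i in range(0, len(bits) - n + 1, n)])
              print(f"n={n}: rank deficiency {n - gf2_rank(blocks)}")

      # Synthetic stream of (7,4) Hamming codewords: the deficiency shows up at n=7.
      G = np.array([[1, 0, 0, 0, 1, 1, 0],
                    [0, 1, 0, 0, 1, 0, 1],
                    [0, 0, 1, 0, 0, 1, 1],
                    [0, 0, 0, 1, 1, 1, 1]])
      msgs = np.random.default_rng(0).integers(0, 2, size=(60, 4))
      bits = ((msgs @ G) % 2).ravel()
      scan_block_length(bits, candidates=[5, 6, 7, 8])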

  9. The EGS5 Code System

    Energy Technology Data Exchange (ETDEWEB)

    Hirayama, Hideo; Namito, Yoshihito; /KEK, Tsukuba; Bielajew, Alex F.; Wilderman, Scott J.; U., Michigan; Nelson, Walter R.; /SLAC

    2005-12-20

    In the nineteen years since EGS4 was released, it has been used in a wide variety of applications, particularly in medical physics, radiation measurement studies, and industrial development. Every new user and every new application bring new challenges for Monte Carlo code designers, and code refinements and bug fixes eventually result in a code that becomes difficult to maintain. Several of the code modifications represented significant advances in electron and photon transport physics, and required a more substantial invocation than code patching. Moreover, the arcane MORTRAN3[48] computer language of EGS4, was highest on the complaint list of the users of EGS4. The size of the EGS4 user base is difficult to measure, as there never existed a formal user registration process. However, some idea of the numbers may be gleaned from the number of EGS4 manuals that were produced and distributed at SLAC: almost three thousand. Consequently, the EGS5 project was undertaken. It was decided to employ the FORTRAN 77 compiler, yet include as much as possible, the structural beauty and power of MORTRAN3. This report consists of four chapters and several appendices. Chapter 1 is an introduction to EGS5 and to this report in general. We suggest that you read it. Chapter 2 is a major update of similar chapters in the old EGS4 report[126] (SLAC-265) and the old EGS3 report[61] (SLAC-210), in which all the details of the old physics (i.e., models which were carried over from EGS4) and the new physics are gathered together. The descriptions of the new physics are extensive, and not for the faint of heart. Detailed knowledge of the contents of Chapter 2 is not essential in order to use EGS, but sophisticated users should be aware of its contents. In particular, details of the restrictions on the range of applicability of EGS are dispersed throughout the chapter. First-time users of EGS should skip Chapter 2 and come back to it later if necessary. With the release of the EGS4 version

  10. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables.Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees.With the proliferation of open source, understanding these issues is increasingly the res

  11. Code Team Training: Demonstrating Adherence to AHA Guidelines During Pediatric Code Blue Activations.

    Science.gov (United States)

    Stewart, Claire; Shoemaker, Jamie; Keller-Smith, Rachel; Edmunds, Katherine; Davis, Andrew; Tegtmeyer, Ken

    2017-10-16

    Pediatric code blue activations are infrequent events with a high mortality rate despite the best effort of code teams. The best method for training these code teams is debatable; however, it is clear that training is needed to assure adherence to American Heart Association (AHA) Resuscitation Guidelines and to prevent the decay that invariably occurs after Pediatric Advanced Life Support training. The objectives of this project were to train a multidisciplinary, multidepartmental code team and to measure this team's adherence to AHA guidelines during code simulation. Multidisciplinary code team training sessions were held using high-fidelity, in situ simulation. Sessions were held several times per month. Each session was filmed and reviewed for adherence to 5 AHA guidelines: chest compression rate, ventilation rate, chest compression fraction, use of a backboard, and use of a team leader. After the first study period, modifications were made to the code team including implementation of just-in-time training and alteration of the compression team. Thirty-eight sessions were completed, with 31 eligible for video analysis. During the first study period, 1 session adhered to all AHA guidelines. During the second study period, after alteration of the code team and implementation of just-in-time training, no sessions adhered to all AHA guidelines; however, there was an improvement in percentage of sessions adhering to ventilation rate and chest compression rate and an improvement in median ventilation rate. We present a method for training a large code team drawn from multiple hospital departments and a method of assessing code team performance. Despite subjective improvement in code team positioning, communication, and role completion and some improvement in ventilation rate and chest compression rate, we failed to consistently demonstrate improvement in adherence to all guidelines.

  12. SIMULATE-3 K coupled code applications

    Energy Technology Data Exchange (ETDEWEB)

    Joensson, Christian [Studsvik Scandpower AB, Vaesteraas (Sweden)]; Grandi, Gerardo; Judd, Jerry [Studsvik Scandpower Inc., Idaho Falls, ID (United States)]

    2017-07-15

    This paper describes the coupled code system TRACE/SIMULATE-3 K/VIPRE and the application of this code system to the OECD PWR Main Steam Line Break. A short description is given of the application of the coupled system to analyze DNBR and of the flexibility the system offers the user. This includes the possibility of comparing and evaluating the results with the TRACE/SIMULATE-3K (S3K) coupled code and the S3K standalone code (core calculation), as well as of performing single-channel calculations with S3K and VIPRE. These are the typical separate-effect analyses required for advanced calculations in order to develop methodologies to be used for safety analyses in general. The models and methods of the code systems are presented. The outline follows the analysis approach, starting with the coupled code system and the reactor and core model calculation (TRACE/S3K). This is followed by a more detailed core evaluation (S3K standalone) and finally a very detailed thermal-hydraulic investigation of the hot pin condition (VIPRE).

  13. Modeling Bird Migration under Climate Change: A Mechanistic Approach

    Science.gov (United States)

    Smith, James A.

    2009-01-01

    How will migrating birds respond to changes in the environment under climate change? What are the implications for migratory success under the various accelerated climate change scenarios as forecast by the Intergovernmental Panel on Climate Change? How will reductions or increased variability in the number or quality of wetland stop-over sites affect migratory bird species? The answers to these questions have important ramifications for conservation biology and wildlife management. Here, we describe the use of continental-scale simulation modeling to explore how spatio-temporal changes along migratory flyways affect en-route migration success. We use an individually based, biophysical, mechanistic bird migration model to simulate the movement of shorebirds in North America as a tool to study how such factors as drought and wetland loss may impact migratory success and modify migration patterns. Our model is driven by remote sensing and climate data and incorporates important landscape variables. The energy budget components of the model include resting, foraging, and flight, but presently predation is ignored. Results/Conclusions: We illustrate our model by studying the spring migration of sandpipers through the Great Plains to their Arctic breeding grounds. Why many species of shorebirds have shown significant declines remains a puzzle. Shorebirds are sensitive to stop-over quality and spacing because of their need for frequent refueling stops and their opportunistic feeding patterns. We predict bird "hydrographs", that is, stop-over frequency with latitude, that are in agreement with the literature. Mean stop-over durations predicted from our model for nominal cases are also consistent with the limited but available data. For the shorebird species simulated, our model predicts that shorebirds exhibit significant plasticity and are able to shift their migration patterns in response to changing drought conditions. However, the question remains as to whether this

  14. On the antibacterial effects of manuka honey: mechanistic insights

    Directory of Open Access Journals (Sweden)

    Roberts AEL

    2015-10-01

    Full Text Available Aled Edward Lloyd Roberts,* Helen Louise Brown,* Rowena Eleri Jenkins Department of Biomedical Sciences, Cardiff Metropolitan University, Cardiff, Wales, UK *These authors contributed equally to this work Abstract: Antimicrobial resistance (AMR) is an increasing clinical problem precipitated by the inappropriate use of antibiotics in the later parts of the 20th Century. This problem, coupled with the lack of novel therapeutics in the development pipeline, means AMR is reaching crisis point, with an expected annual death rate of ten million people worldwide by 2050. To reduce, and potentially remedy, this problem, many researchers are looking into natural compounds with antimicrobial and/or antivirulence activity. Manuka honey is an ancient antimicrobial remedy with a good track record against a wide range of nosocomial pathogens that have increased AMR. Its inhibitory effects are the result of its constituent components, some of which (such as methylglyoxal and leptosperin) are known to bestow varying degrees of antimicrobial efficacy on manuka honey. Despite growing in vitro evidence of its antimicrobial efficacy, the in vivo use of manuka honey (especially in a clinical environment) has been unexpectedly slow, partly due to the lack of mechanistic data. The mechanism by which manuka honey achieves its inhibitory efficacy has recently been identified against Staphylococcus aureus and Pseudomonas aeruginosa, with these two contrasting organisms being inhibited through different mechanisms. Manuka honey inhibits S. aureus by interfering with the cell division process, whereas P. aeruginosa cells lyse in its presence due to the reduction of a key structural protein. In addition to these inhibitory effects, manuka honey is known to reduce virulence, motility, and biofilm formation. With this

  15. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase the number of dimensions seen by the network using a linear mapping. Receivers can trade off computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof.
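
    As an illustration of only the "low field size operations" aspect mentioned above, and not of the full Fulcrum outer/inner code construction, the sketch below encodes packets with random GF(2) coefficients and decodes by Gauss-Jordan elimination using nothing but XOR; the sizes and the number of transmitted combinations are arbitrary.

      import numpy as np

      rng = np.random.default_rng(2)
      K, BYTES, N = 4, 8, 6                         # source packets, packet size, coded packets
      source = rng.integers(0, 256, size=(K, BYTES), dtype=np.uint8)

      # Encode: each coded packet is an XOR of a random subset of the source packets.
      coeffs = rng.integers(0, 2, size=(N, K), dtype=np.uint8)
      coded = np.array([np.bitwise_xor.reduce(source[c.astype(bool)], axis=0)
                        if c.any() else np.zeros(BYTES, np.uint8) for c in coeffs])

      # Decode: Gauss-Jordan over GF(2) on the augmented system [coeffs | coded].
      A, P = coeffs.copy(), coded.copy()
      row = 0
      for col in range(K):
          pivot = next((r for r in range(row, N) if A[r, col]), None)
          if pivot is None:
              continue
          A[[row, pivot]], P[[row, pivot]] = A[[pivot, row]], P[[pivot, row]]
          for r in range(N):
              if r != row and A[r, col]:
                  A[r] ^= A[row]
                  P[r] ^= P[row]
          row += 1

      recovered = row == K and np.array_equal(P[:K], source)
      print("rank:", row, "recovered all source packets:", recovered)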

  16. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed; Ghanem, Bernard; Wonka, Peter

    2018-01-01

    coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements

  17. SASSYS LMFBR systems code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time

  18. OCA Code Enforcement

    Data.gov (United States)

    Montgomery County of Maryland — The Office of the County Attorney (OCA) processes Code Violation Citations issued by County agencies. The citations can be viewed by issued department, issued date...

  19. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)

  20. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
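
    As a rough illustration of the levelization check described in this record, the sketch below uses Python's standard graphlib module to verify that a dependency map is acyclic and to group packages into levels. The package names and dependencies are purely hypothetical.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical package -> dependencies map (names are illustrative only)
deps = {
    "io_utils":   set(),
    "mesh":       {"io_utils"},
    "physics":    {"mesh", "io_utils"},
    "driver_app": {"physics", "mesh"},
}

ts = TopologicalSorter(deps)
ts.prepare()                      # raises CycleError if the graph is not a DAG
levels = []
while ts.is_active():
    ready = list(ts.get_ready())  # packages whose dependencies are all satisfied
    levels.append(ready)
    ts.done(*ready)

for i, group in enumerate(levels):
    print(f"level {i}: {group}")
```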

  1. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs
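
    A minimal sketch of the kind of two-parameter cost scan this record describes: evaluate a cost over a grid of the two design choices (injector voltage rise time and core aspect ratio) and pick the minimum. The cost function below is entirely hypothetical and stands in for the code's detailed component costing.

```python
import numpy as np

def system_cost(rise_time_ns, core_aspect_ratio):
    """Hypothetical stand-in cost model; the real code derives costs from
    machined parts, raw materials and electrical component counts."""
    pulsed_power = 50.0 / rise_time_ns            # faster rise -> costlier pulsed power
    core_material = 2.0 * core_aspect_ratio**1.5  # larger cores -> more ferrite
    return pulsed_power + core_material

rise_times = np.linspace(10, 100, 46)             # ns
aspect_ratios = np.linspace(1.0, 4.0, 31)
R, A = np.meshgrid(rise_times, aspect_ratios, indexing="ij")
cost = system_cost(R, A)

i, j = np.unravel_index(np.argmin(cost), cost.shape)
print(f"minimum cost {cost[i, j]:.2f} at rise time {R[i, j]:.0f} ns, "
      f"aspect ratio {A[i, j]:.2f}")
```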

  2. VT ZIP Code Areas

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  3. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  4. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    2001-01-01

    The description of reactor lattice codes is carried out on the example of the WIMSD-5B code. The WIMS code in its various versions is the most recognised lattice code. It is used in all parts of the world for calculations of research and power reactors. The version WIMSD-5B is distributed free of charge by NEA Data Bank. The description of its main features given in the present lecture follows the aspects defined previously for lattice calculations in the lecture on Reactor Lattice Transport Calculations. The spatial models are described, and the approach to the energy treatment is given. Finally the specific algorithm applied in fuel depletion calculations is outlined. (author)

  5. Critical Care Coding for Neurologists.

    Science.gov (United States)

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  6. Lattice Index Coding

    OpenAIRE

    Natarajan, Lakshmi; Hong, Yi; Viterbo, Emanuele

    2014-01-01

    The index coding problem involves a sender with K messages to be transmitted across a broadcast channel, and a set of receivers each of which demands a subset of the K messages while having prior knowledge of a different subset as side information. We consider the specific case of noisy index coding where the broadcast channel is Gaussian and every receiver demands all the messages from the source. Instances of this communication problem arise in wireless relay networks, sensor networks, and ...

  7. Cracking the Gender Codes

    DEFF Research Database (Denmark)

    Rennison, Betina Wolfgang

    2016-01-01

    extensive work to raise the proportion of women. This has helped slightly, but women remain underrepresented at the corporate top. Why is this so? What can be done to solve it? This article presents five different types of answers relating to five discursive codes: nature, talent, business, exclusion...... in leadership management, we must become more aware and take advantage of this complexity. We must crack the codes in order to crack the curve....

  8. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  9. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16 group Hansen-Roach cross sections and P1 scattering, was one of the first multigroup Monte Carlo codes and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218 group set is distributed with the code) and has a general PN scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k-eff, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes
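
    To illustrate the Monte Carlo criticality idea in miniature (not KENO-V's actual multigroup, 3-D algorithm), the sketch below estimates k-infinity for a homogeneous medium by following analog neutron histories. The one-group cross sections are invented for the example.

```python
import random

def k_infinity(nu=2.43, sigma_f=0.05, sigma_c=0.03, sigma_s=0.30, histories=100_000):
    """Analog Monte Carlo estimate of k-infinity for a homogeneous medium
    (toy one-group sketch; a real criticality code handles geometry and many groups)."""
    sigma_t = sigma_f + sigma_c + sigma_s
    neutrons_produced = 0.0
    for _ in range(histories):
        while True:
            xi = random.random() * sigma_t
            if xi < sigma_f:                 # fission: score nu new neutrons, end history
                neutrons_produced += nu
                break
            elif xi < sigma_f + sigma_c:     # capture: history ends with no offspring
                break
            # otherwise scattering: keep following the same neutron
    return neutrons_produced / histories

print(f"k-inf ~ {k_infinity():.3f}")   # analytic value: nu*sigma_f/(sigma_f+sigma_c) = 1.519
```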

  10. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiography, like any other testing technique, requires standards. These standards are widely used and the methods for applying them are well regulated, so radiographic testing is only practised in accordance with documented regulations. Such regulations and guidelines are documented in codes, standards and specifications. In Malaysia, a level-one or basic radiographer may carry out radiography work based on instructions given by a level-two or level-three radiographer. These instructions are produced following the guidelines set out in the relevant documents, and the level-two radiographer must follow the specifications given in the standard when writing them. It is therefore clear that radiography is a type of work in which everything must follow the rules. For codes, radiography follows the code of the American Society of Mechanical Engineers (ASME), and the only code currently in force in Malaysia is the rule published by the Atomic Energy Licensing Board (AELB), known as the Practical Code for Radiation Protection in Industrial Radiography. With this code in place, all radiography work must follow the regulated rules and standards.

  11. Development of Regulatory Audit Core Safety Code : COREDAX

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Chae Yong; Jo, Jong Chull; Roh, Byung Hwan [Korea Institute of Nuclear Safety, Taejon (Korea, Republic of); Lee, Jae Jun; Cho, Nam Zin [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2005-07-01

    Korea Institute of Nuclear Safety (KINS) has developed a core neutronics simulator, the COREDAX code, for verifying the core safety of the SMART-P reactor, with technical support from the Korea Advanced Institute of Science and Technology (KAIST). The COREDAX code is intended for regulatory audit calculations of 3-dimensional core neutronics. It solves the steady-state and time-dependent multi-group neutron diffusion equation in hexagonal as well as rectangular geometry by the analytic function expansion nodal (AFEN) method. The AFEN method was developed at KAIST, and its excellent accuracy has been verified internationally. The COREDAX code was originally programmed based on the AFEN method. The accuracy of the code with the AFEN method was excellent for hexagonal 2-dimensional problems, but improvement was needed for hexagonal-z 3-dimensional problems. Hence, several solution routines of the AFEN method were improved, and the advanced AFEN method was created; the COREDAX code is based on this advanced AFEN method. The initial version of the COREDAX code completes a basic framework, performing eigenvalue calculations and kinetics calculations with thermal-hydraulic feedback, for audit calculations of steady-state core design and reactivity-induced accidents of the SMART-P reactor. This study describes the COREDAX code for hexagonal geometry.

  12. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), coding tree contributes to excellent compression performance. However, coding tree brings extremely high computational complexity. Innovative works for improving coding tree to further reduce encoding time are stated in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  13. Development of the advanced CANDU technology

    International Nuclear Information System (INIS)

    Suk, Soo Dong; Min, Byung Joo; Na, Y. H.; Lee, S. Y.; Choi, J. H.; Lee, B. C.; Kim, S. N.; Jo, C. H.; Paik, J. S.; On, M. R.; Park, H. S.; Kim, S. R.

    1997-07-01

    The purpose of this study is to develop the advanced design technology to improve safety, operability and economy and to develop an advanced safety evaluation system. More realistic and reasonable methodology and modeling was employed to improve safety margin in containment analysis. Various efforts have been made to verify the CATHENA code, which is the major safety analysis code for the CANDU PHWR system. A fully computerized prototype ECCS was developed. The feasibility study and conceptual design of the distributed digital control system have been performed as well. The core characteristics of advanced fuel cycle, fuel management and power upgrade have been studied to determine the advanced core. (author). 77 refs., 51 tabs., 108 figs

  14. Development of the advanced CANDU technology

    Energy Technology Data Exchange (ETDEWEB)

    Suk, Soo Dong; Min, Byung Joo [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Na, Y H; Lee, S Y; Choi, J H; Lee, B C; Kim, S N; Jo, C H; Paik, J S; On, M R; Park, H S; Kim, S R [Korea Electric Power Co., Taejon (Korea, Republic of)

    1997-07-01

    The purpose of this study is to develop the advanced design technology to improve safety, operability and economy and to develop an advanced safety evaluation system. More realistic and reasonable methodology and modeling was employed to improve safety margin in containment analysis. Various efforts have been made to verify the CATHENA code, which is the major safety analysis code for the CANDU PHWR system. A fully computerized prototype ECCS was developed. The feasibility study and conceptual design of the distributed digital control system have been performed as well. The core characteristics of advanced fuel cycle, fuel management and power upgrade have been studied to determine the advanced core. (author). 77 refs., 51 tabs., 108 figs.

  15. Relationship between various pressure vessel and piping codes

    International Nuclear Information System (INIS)

    Canonico, D.A.

    1976-01-01

    Section VIII of the ASME Code provides stress allowable values for material specifications that are provided in Section II Parts A and B. Since the adoption of the ASME Code over 60 years ago the incidence of failure has been greatly reduced. The Codes are currently based on strength criteria and advancements in the technology of fracture toughness and fracture mechanics should permit an even greater degree of reliability and safety. This lecture discusses the various Sections of the Code. It describes the basis for the establishment of design stress allowables and promotes the idea of the use of fracture mechanics

  16. Computer code development plant for SMART design

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H.

    1999-03-01

    In accordance with the localization plan for nuclear reactor design pursued since the mid-1980s, various computer codes have been transferred into the Korean nuclear industry through technical transfer programs from the major worldwide pressurized water reactor suppliers or through international code development programs. These computer codes have been successfully utilized in reactor and reload core design work, and as a result, design-related technologies have been satisfactorily accumulated. However, the development of native codes to substitute for some important computer codes, whose usage is limited by the original technology owners, has been carried out rather poorly. Thus, it is of highest priority to secure native techniques for the computer code package and analysis methodology in order to establish the capability required for the independent design of our own reactor model. Moreover, unlike large-capacity loop-type commercial reactors, the SMART (SYSTEM-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as a self-controlled gas pressurizer, helical steam generator, and passive residual heat removal system. Considering these peculiar design characteristics of SMART, part of the design can be performed with the computer codes used for loop-type commercial reactor design. However, most of those computer codes are not directly applicable to the design of an integral reactor such as SMART, so they should be modified to deal with SMART's peculiar design characteristics. In addition to these modification efforts, various codes should be developed in several design areas. Furthermore, the reliability of modified or newly developed codes should be verified through benchmarking or tests against the target design. Thus, it is necessary to proceed with the design according to the

  17. Computer code development plant for SMART design

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H

    1999-03-01

    In accordance with the localization plan for nuclear reactor design pursued since the mid-1980s, various computer codes have been transferred into the Korean nuclear industry through technical transfer programs from the major worldwide pressurized water reactor suppliers or through international code development programs. These computer codes have been successfully utilized in reactor and reload core design work, and as a result, design-related technologies have been satisfactorily accumulated. However, the development of native codes to substitute for some important computer codes, whose usage is limited by the original technology owners, has been carried out rather poorly. Thus, it is of highest priority to secure native techniques for the computer code package and analysis methodology in order to establish the capability required for the independent design of our own reactor model. Moreover, unlike large-capacity loop-type commercial reactors, the SMART (SYSTEM-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as a self-controlled gas pressurizer, helical steam generator, and passive residual heat removal system. Considering these peculiar design characteristics of SMART, part of the design can be performed with the computer codes used for loop-type commercial reactor design. However, most of those computer codes are not directly applicable to the design of an integral reactor such as SMART, so they should be modified to deal with SMART's peculiar design characteristics. In addition to these modification efforts, various codes should be developed in several design areas. Furthermore, the reliability of modified or newly developed codes should be verified through benchmarking or tests against the target design. Thus, it is necessary to proceed with the design according to the

  18. Mechanistic modeling of aberrant energy metabolism in human disease

    Directory of Open Access Journals (Sweden)

    Vineet eSangar

    2012-10-01

    Full Text Available Dysfunction in energy metabolism—including in pathways localized to the mitochondria—has been implicated in the pathogenesis of a wide array of disorders, ranging from cancer to neurodegenerative diseases to type II diabetes. The inherent complexities of energy and mitochondrial metabolism present a significant obstacle in the effort to understand the role that these molecular processes play in the development of disease. To help unravel these complexities, systems biology methods have been applied to develop an array of computational metabolic models, ranging from mitochondria-specific processes to genome-scale cellular networks. These constraint-based models can efficiently simulate aspects of normal and aberrant metabolism in various genetic and environmental conditions. Development of these models leverages—and also provides a powerful means to integrate and interpret—information from a wide range of sources including genomics, proteomics, metabolomics, and enzyme kinetics. Here, we review a variety of mechanistic modeling studies that explore metabolic functions, deficiency disorders, and aberrant biochemical pathways in mitochondria and related regions in the cell.
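
    The constraint-based models mentioned in this record are, at their core, linear programs over a stoichiometric matrix. The sketch below shows flux balance analysis on a deliberately tiny, invented three-reaction network; it is only an illustration of the approach, not any of the reviewed genome-scale models.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix (rows: metabolites A, B; columns: reactions)
#   R1: -> A (uptake)    R2: A -> B    R3: B -> biomass
S = np.array([[1, -1,  0],
              [0,  1, -1]], dtype=float)

bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake flux limited to 10 units
c = [0, 0, -1]                             # maximize v3 (linprog minimizes, so negate)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)            # expected: [10, 10, 10]
```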

  19. A mechanistic approach to postirradiation spoilage kinetics of fish

    International Nuclear Information System (INIS)

    Tukenmez, I.

    2004-01-01

    Full text: In order to simulate postirradiation spoilage of fish, the mechanistic aspects of the growth of surviving microorganisms during chill storage and their product formation in irradiated fish were analyzed. Anchovy (Engraulis encrasicholus) samples, both unirradiated and irradiated at 1, 2 and 3 kGy doses of gamma radiation, were stored at +2 °C for 21 days. Total bacterial counts (TBC) and trimethylamine (TMA) analyses of the samples were done periodically during storage. Based on the proposed spoilage mechanism, kinetic model equations were derived. By using experimental data of TBC and TMA in the developed model, the postirradiation spoilage parameters, including growth rate constant, initial and maximum attainable TBC, lag time and TMA yield, were evaluated and the microbial spoilage of fish was simulated for postirradiation storage. The shelf life of irradiated fish was estimated from the spoilage kinetics. Dose effects on the kinetic parameters were analyzed. It is suggested that the kinetic evaluation method developed in this study may be used for quality assessment, shelf life determination and dose optimization for radiation preservation of fish.
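
    The record's own kinetic model is not reproduced here; as a stand-in, the sketch below uses the modified Gompertz curve common in predictive microbiology to show how growth-rate, lag-time and maximum-count parameters translate into a shelf-life estimate. All parameter values and the spoilage threshold are invented for illustration.

```python
import numpy as np

def gompertz_log_count(t, logN0, A, mu_max, lag):
    """Modified Gompertz growth curve (Zwietering form), here expressed in
    log10 units: A is the total log10 increase, mu_max the maximum rate
    (log10 CFU/g per day), lag the lag time (days)."""
    y = A * np.exp(-np.exp((mu_max * np.e / A) * (lag - t) + 1.0))
    return logN0 + y

# Hypothetical parameters for irradiated fish stored at +2 degrees C
t = np.linspace(0, 21, 211)                       # days
logN = gompertz_log_count(t, logN0=2.0, A=6.0, mu_max=0.9, lag=4.0)

spoilage_threshold = 7.0                          # log10 CFU/g, an assumed rejection level
shelf_life = t[np.argmax(logN >= spoilage_threshold)]
print(f"estimated shelf life ~ {shelf_life:.1f} days")
```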

  20. Ancient Chinese medicine and mechanistic evidence of acupuncture physiology.

    Science.gov (United States)

    Yang, Edward S; Li, Pei-Wen; Nilius, Bernd; Li, Geng

    2011-11-01

    Acupuncture has been widely used in China for three millennia as an art of healing. Yet, its physiology is not yet understood. The current interest in acupuncture started in 1971. Soon afterward, extensive research led to the concept of neural signaling with possible involvement of opioid peptides, glutamate, adenosine and identifying responsive parts in the central nervous system. In the last decade scientists began investigating the subject with anatomical and molecular imaging. It was found that mechanical movements of the needle, ignored in the past, appear to be central to the method and intracellular calcium ions may play a pivotal role. In this review, we trace the technique of clinical treatment from the first written record about 2,200 years ago to the modern time. The ancient texts have been used to introduce the concepts of yin, yang, qi, de qi, and meridians, the traditional foundation of acupuncture. We explore the sequence of the physiological process, from the turning of the needle, the mechanical wave activation of calcium ion channels to beta-endorphin secretion. By using modern terminology to re-interpret the ancient texts, we have found that the 2nd century B.C. physiologists were meticulous investigators and their explanation fits well with the mechanistic model derived from magnetic resonance imaging (MRI) and confocal microscopy. In conclusion, the ancient model appears to have withstood the test of time surprisingly well, confirming the popular axiom that the old wine is better than the new.

  1. Mechanistic Perspectives of Maslinic Acid in Targeting Inflammation

    Directory of Open Access Journals (Sweden)

    Wei Hsum Yap

    2015-01-01

    Full Text Available Chronic inflammation drives the development of various pathological diseases such as rheumatoid arthritis, atherosclerosis, multiple sclerosis, and cancer. The arachidonic acid pathway represents one of the major mechanisms for inflammation. Prostaglandins (PGs) are lipid products generated from arachidonic acid by the action of cyclooxygenase (COX) enzymes and their activity is blocked by nonsteroidal anti-inflammatory drugs (NSAIDs). The use of natural compounds in regulation of COX activity/prostaglandins production is receiving increasing attention. In the Mediterranean diet, olive oil and table olives contain significant dietary sources of maslinic acid. Maslinic acid is arising as a safe and novel natural pentacyclic triterpene which has protective effects against chronic inflammatory diseases in various in vivo and in vitro experimental models. Understanding the anti-inflammatory mechanism of maslinic acid is crucial for its development as a potential dietary nutraceutical. This review focuses on the mechanistic action of maslinic acid in regulating the inflammation pathways through modulation of the arachidonic acid metabolism including the nuclear factor-kappa B (NF-κB)/COX-2 expression, upstream protein kinase signaling, and phospholipase A2 enzyme activity. Further investigations may provide insight into the mechanism of maslinic acid in regulating the molecular targets and their associated pathways in response to specific inflammatory stimuli.

  2. Diffusion theory in biology: a relic of mechanistic materialism.

    Science.gov (United States)

    Agutter, P S; Malone, P C; Wheatley, D N

    2000-01-01

    Diffusion theory explains in physical terms how materials move through a medium, e.g. water or a biological fluid. There are strong and widely acknowledged grounds for doubting the applicability of this theory in biology, although it continues to be accepted almost uncritically and taught as a basis of both biology and medicine. Our principal aim is to explore how this situation arose and has been allowed to continue seemingly unchallenged for more than 150 years. The main shortcomings of diffusion theory will be briefly reviewed to show that the entrenchment of this theory in the corpus of biological knowledge needs to be explained, especially as there are equally valid historical grounds for presuming that bulk fluid movement powered by the energy of cell metabolism plays a prominent role in the transport of molecules in the living body. First, the theory's evolution, notably from its origins in connection with the mechanistic materialist philosophy of mid nineteenth century physiology, is discussed. Following this, the entrenchment of the theory in twentieth century biology is analyzed in relation to three situations: the mechanism of oxygen transport between air and mammalian tissues; the structure and function of cell membranes; and the nature of the intermediary metabolism, with its implicit presumptions about the intracellular organization and the movement of molecules within it. In our final section, we consider several historically based alternatives to diffusion theory, all of which have their precursors in nineteenth and twentieth century philosophy of science.
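
    One quantitative reason for the doubts raised in this record is the quadratic scaling of diffusion time with distance. The short calculation below, using the standard estimate t ~ x^2/(2D) with a diffusivity typical of a small solute in water, shows that diffusion alone is fast over micrometres but takes many hours over centimetres; the specific diffusivity value is an illustrative assumption.

```python
# Characteristic 1-D diffusion time t ~ x^2 / (2 D),
# with D ~ 1e-9 m^2/s typical of a small solute in water.
D = 1e-9  # m^2/s

for x in (1e-6, 10e-6, 1e-3, 1e-2):          # 1 um, 10 um, 1 mm, 1 cm
    t = x**2 / (2 * D)
    print(f"x = {x*1e6:8.0f} um  ->  t ~ {t:10.3g} s")
# 1 um: 5e-4 s; 10 um: 0.05 s; 1 mm: 500 s; 1 cm: 5e4 s (about 14 hours)
```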

  3. A mechanistic compartmental model for total antibody uptake in tumors.

    Science.gov (United States)

    Thurber, Greg M; Dane Wittrup, K

    2012-12-07

    Antibodies are under development to treat a variety of cancers, such as lymphomas, colon, and breast cancer. A major limitation to greater efficacy for this class of drugs is poor distribution in vivo. Localization of antibodies occurs slowly, often in insufficient therapeutic amounts, and distributes heterogeneously throughout the tumor. While the microdistribution around individual vessels is important for many therapies, the total amount of antibody localized in the tumor is paramount for many applications such as imaging, determining the therapeutic index with antibody drug conjugates, and dosing in radioimmunotherapy. With imaging and pretargeted therapeutic strategies, the time course of uptake is critical in determining when to take an image or deliver a secondary reagent. We present here a simple mechanistic model of antibody uptake and retention that captures the major rates that determine the time course of antibody concentration within a tumor including dose, affinity, plasma clearance, target expression, internalization, permeability, and vascularization. Since many of the parameters are known or can be estimated in vitro, this model can approximate the time course of antibody concentration in tumors to aid in experimental design, data interpretation, and strategies to improve localization. Copyright © 2012 Elsevier Ltd. All rights reserved.
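
    The full model described in this record is not reproduced here; the sketch below is a deliberately reduced two-compartment caricature (plasma clearance plus permeability-limited tumor uptake, efflux and internalization) meant only to show how such rate constants combine into a tumor concentration time course. All rate constants and the dosing value are invented.

```python
import numpy as np
from scipy.integrate import solve_ivp

def two_compartment(t, y, k_clear, k_in, k_out, k_int):
    """Toy plasma/tumor model (not the authors' formulation): plasma antibody
    clears exponentially; tumor antibody enters via vascular permeability,
    leaves by efflux, and is lost to internalization."""
    C_plasma, C_tumor = y
    dCp = -k_clear * C_plasma
    dCt = k_in * C_plasma - (k_out + k_int) * C_tumor
    return [dCp, dCt]

params = (0.05, 0.02, 0.01, 0.005)            # k_clear, k_in, k_out, k_int in 1/h (illustrative)
sol = solve_ivp(two_compartment, (0, 168), [1.0, 0.0], args=params, dense_output=True)

t = np.linspace(0, 168, 8)                    # one week, sampled daily
print(np.round(sol.sol(t)[1], 4))             # tumor concentration time course
```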

  4. Mechanistic Modeling of Water Replenishment Rate of Zeer Refrigerator

    Directory of Open Access Journals (Sweden)

    B. N. Nwankwojike

    2017-06-01

    Full Text Available A model for predicting the water replenishment rate of the zeer pot refrigerator was developed in this study using a mechanistic modeling approach and evaluated at Obowo, Imo State, Nigeria using six fruits (tomatoes, guava, okra, banana, orange and avocado pear). The developed model confirmed the zeer pot water replenishment rate as a function of ambient temperature, relative humidity, wind speed, thermal conductivity of the pot materials and sand, density of air and water vapor, permeability coefficient of clay and heat transfer coefficient of water into air, circumferential length, height of pot, geometrical profile of the pot, heat load of the food preserved, heat flow into the device and the gradient at which the pot is placed above ground level. Compared to the conventional approach of water replenishment, performance analysis results revealed 44% to 58% water economy when the zeer pot's water was replenished based on the model's prediction, while there was no significant difference in the shelf-life of the fruits preserved with both replenishment methods. Application of the developed water replenishment model facilitates optimal water usage in this system, thereby reducing the operational cost of the zeer pot refrigerator.

  5. Polymerization kinetics of wheat gluten upon thermosetting. A mechanistic model.

    Science.gov (United States)

    Domenek, Sandra; Morel, Marie-Hélène; Bonicel, Joëlle; Guilbert, Stéphane

    2002-10-09

    Size exclusion high-performance liquid chromatography analysis was carried out on wheat gluten-glycerol blends subjected to different heat treatments. The elution profiles were analyzed in order to follow the solubility loss of protein fractions with specific molecular size. Owing to the known biochemical changes involved during the heat denaturation of gluten, a mechanistic mathematical model was developed, which divided the protein denaturation into two distinct reaction steps: (i) reversible change in protein conformation and (ii) protein precipitation through disulfide bonding between initially SDS-soluble and SDS-insoluble reaction partners. Activation energies of gluten unfolding, refolding, and precipitation were calculated with the Arrhenius law to 53.9 kJ·mol⁻¹, 29.5 kJ·mol⁻¹, and 172 kJ·mol⁻¹, respectively. The rate of protein solubility loss decreased as the cross-linking reaction proceeded, which may be attributed to the formation of a three-dimensional network progressively hindering the reaction. The enhanced susceptibility to aggregation of large molecules was assigned to a risen reaction probability due to their higher number of cysteine residues and to the increased percentage of unfolded and thereby activated proteins as complete protein refolding seemed to be an anticooperative process.
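
    Using the activation energies reported in this record, the Arrhenius relation indicates how strongly each step accelerates with temperature. The sketch below compares rates at 70 °C and 90 °C, two temperatures chosen only for illustration within a plausible thermosetting range.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def rate_ratio(Ea_kJ, T1_C, T2_C):
    """Arrhenius ratio k(T2)/k(T1) for activation energy Ea in kJ/mol."""
    T1, T2 = T1_C + 273.15, T2_C + 273.15
    return np.exp(-(Ea_kJ * 1e3 / R) * (1.0 / T2 - 1.0 / T1))

# Relative speed-up of each step between 70 C and 90 C, using the reported Ea values
for name, Ea in [("unfolding", 53.9), ("refolding", 29.5), ("precipitation", 172.0)]:
    print(f"{name:13s}: x{rate_ratio(Ea, 70, 90):5.1f}")
```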

  6. Mechanistic studies of ethylene biosynthesis in higher plants

    International Nuclear Information System (INIS)

    McGeehan, G.M.

    1986-01-01

    Ethylene is a plant hormone that elicits a wide variety of responses in plant tissue. Among these responses are the hastening of abscission, ripening and senescence. In 1979 it was discovered that 1-aminocyclopropane-1-carboxylic acid (ACC) is the immediate biosynthetic precursor to ethylene. Given the obvious economic significance of ethylene production, the authors concentrated their studies on the conversion of ACC to ethylene. They delved into mechanistic aspects of ACC oxidation and they studied potential inhibitors of ethylene forming enzyme (EFE). They synthesized various analogs of ACC and found that EFE shows good stereodiscrimination among alkyl substituted ACC analogs, with the 1R, 2S stereoisomer being processed nine times faster than the 1S, 2R isomer in the MeACC series. They also synthesized 2-cyclopropyl ACC, which is a good competitive inhibitor of EFE. This compound also causes time dependent loss of EFE activity, leading us to believe it is an irreversible inhibitor of ethylene formation. The synthesis of these analogs has also allowed them to develop a spectroscopic technique to assign the relative stereochemistry of alkyl groups. 13C NMR allows them to assign the alkyl stereochemistry based upon gamma-shielding effects on the carbonyl resonance. Lastly, they measured kinetic isotope effects on the oxidation of ACC in vivo and in vitro and found that ACC is oxidized by a rate-determining 1-electron removal from nitrogen in close accord with mechanisms for the oxidation of other alkyl amines

  7. Refined pipe theory for mechanistic modeling of wood development.

    Science.gov (United States)

    Deckmyn, Gaby; Evans, Sam P; Randle, Tim J

    2006-06-01

    We present a mechanistic model of wood tissue development in response to changes in competition, management and climate. The model is based on a refinement of the pipe theory, where the constant ratio between sapwood and leaf area (pipe theory) is replaced by a ratio between pipe conductivity and leaf area. Simulated pipe conductivity changes with age, stand density and climate in response to changes in allocation or pipe radius, or both. The central equation of the model, which calculates the ratio of carbon (C) allocated to leaves and pipes, can be parameterized to describe the contrasting stem conductivity behavior of different tree species: from constant stem conductivity (functional homeostasis hypothesis) to height-related reduction in stem conductivity with age (hydraulic limitation hypothesis). The model simulates the daily growth of pipes (vessels or tracheids), fibers and parenchyma as well as vessel size and simulates the wood density profile and the earlywood to latewood ratio from these data. Initial runs indicate the model yields realistic seasonal changes in pipe radius (decreasing pipe radius from spring to autumn) and wood density, as well as realistic differences associated with the competitive status of trees (denser wood in suppressed trees).

  8. Mechanistic understanding of monosaccharide-air flow battery electrochemistry

    Science.gov (United States)

    Scott, Daniel M.; Tsang, Tsz Ho; Chetty, Leticia; Aloi, Sekotilani; Liaw, Bor Yann

    Recently, an inexpensive monosaccharide-air flow battery configuration has been demonstrated to utilize a strong base and a mediator redox dye to harness electrical power from the partial oxidation of glucose. Here the mechanistic understanding of glucose oxidation in this unique glucose-air power source is further explored by acid-base titration experiments, 13C NMR, and comparison of results from chemically different redox mediators (indigo carmine vs. methyl viologen) and sugars (fructose vs. glucose) via studies using electrochemical techniques. Titration results indicate that gluconic acid is the main product of the cell reaction, as supported by evidence in the 13C NMR spectra. Using indigo carmine as the mediator dye and fructose as the energy source, an abiotic cell configuration generates a power density of 1.66 mW cm⁻², which is greater than that produced from glucose under similar conditions (ca. 1.28 mW cm⁻²). A faster transition from fructose into the ene-diol intermediate than from glucose likely contributed to this difference in power density.

  9. Four Mechanistic Models of Peer Influence on Adolescent Cannabis Use.

    Science.gov (United States)

    Caouette, Justin D; Feldstein Ewing, Sarah W

    2017-06-01

    Most adolescents begin exploring cannabis in peer contexts, but the neural mechanisms that underlie peer influence on adolescent cannabis use are still unknown. This theoretical overview elucidates the intersecting roles of neural function and peer factors in cannabis use in adolescents. Novel paradigms using functional magnetic resonance imaging (fMRI) in adolescents have identified distinct neural mechanisms of risk decision-making and incentive processing in peer contexts, centered on reward-motivation and affect regulatory neural networks; these findings inform a theoretical model of peer-driven cannabis use decisions in adolescents. We propose four "mechanistic profiles" of social facilitation of cannabis use in adolescents: (1) peer influence as the primary driver of use; (2) cannabis exploration as the primary driver, which may be enhanced in peer contexts; (3) social anxiety; and (4) negative peer experiences. Identification of "neural targets" involved in motivating cannabis use may inform clinicians about which treatment strategies work best in adolescents with cannabis use problems, and via which social and neurocognitive processes.

  10. Mechanistic modeling of insecticide risks to breeding birds in ...

    Science.gov (United States)

    Insecticide usage in the United States is ubiquitous in urban, suburban, and rural environments. In evaluating data for an insecticide registration application and for registration review, scientists at the United States Environmental Protection Agency (USEPA) assess the fate of the insecticide and the risk the insecticide poses to the environment and non-target wildlife. At the present time, current USEPA risk assessments do not include population-level endpoints. In this paper, we present a new mechanistic model, which allows risk assessors to estimate the effects of insecticide exposure on the survival and seasonal productivity of birds known to use agricultural fields during their breeding season. The new model was created from two existing USEPA avian risk assessment models, the Terrestrial Investigation Model (TIM v.3.0) and the Markov Chain Nest Productivity model (MCnest). The integrated TIM/MCnest model has been applied to assess the relative risk of 12 insecticides used to control corn pests on a suite of 31 avian species known to use cornfields in midwestern agroecosystems. The 12 insecticides that were assessed in this study are all used to treat major pests of corn (corn root worm borer, cutworm, and armyworm). After running the integrated TIM/MCnest model, we found extensive differences in risk to birds among insecticides, with chlorpyrifos and malathion (organophosphates) generally posing the greatest risk, and bifenthrin and λ-cyhalothrin (

  11. A mechanistic approach to the generation of sorption databases

    International Nuclear Information System (INIS)

    Bradbury, M.H.; Baeyens, B.

    1992-01-01

    Sorption of radionuclides in the near and far fields of an underground nuclear waste repository is one of the most important processes retarding their release to the environment. In the vast majority of cases sorption data have been presented in terms of empirical parameters such as distribution coefficients and isotherm equations. A consequence of this empirical methodology is that the sorption data are only strictly valid under the experimental conditions at which they were measured. Implicit in this approach is the need to generate large amounts of data and fitting parameters necessary for an empirical description of sorption under all realistically conceivable conditions which may arise in space and time along the migration pathway to Man. An alternative approach to the problem is to try to understand, and develop model descriptions of, underlying retention mechanisms and to identify those systems parameters which essentially determine the extent of sorption. The aim of this work is to see to what extent currently existing mechanistic models, together with their associated data, can be applied to predict sorption data from laboratory experiments on natural systems. This paper describes the current status of this work which is very much in an early stage of development. An example is given whereby model predictions are compared with laboratory results for the sorption of Np at trace concentrations under oxidizing conditions on a series of minerals relevant to granite formations. 31 refs., 11 figs., 5 tabs
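
    As a reminder of how the empirical distribution coefficients discussed in this record enter transport calculations, the snippet below evaluates the standard linear-sorption retardation factor; the Kd, bulk density and porosity values are illustrative only.

```python
def retardation_factor(Kd_mL_per_g, bulk_density_g_per_mL, porosity):
    """Classic linear-sorption retardation factor R = 1 + (rho_b / n) * Kd,
    shown only to illustrate how an empirical Kd enters transport models."""
    return 1.0 + (bulk_density_g_per_mL / porosity) * Kd_mL_per_g

# Illustrative values for a granitic fracture infill
print(retardation_factor(Kd_mL_per_g=5.0, bulk_density_g_per_mL=2.0, porosity=0.1))
# -> 101.0  (the radionuclide front moves ~100x slower than the groundwater)
```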

  12. MECHANISTIC STUDY OF COLCHICINE’s ELECTROCHEMICAL OXIDATION

    International Nuclear Information System (INIS)

    Bodoki, Ede; Chira, Ruxandra; Zaharia, Valentin; Săndulescu, Robert

    2015-01-01

    Colchicine, as one of the most ancient drugs of human kind, is still in the focal point of the current research due to its multimodal mechanism of action. The elucidation of colchicine's still unknown redox properties may play an important role in deciphering its beneficial and harmful implications over the human body. Therefore, a systematic mechanistic study of colchicine's oxidation has been undertaken by electrochemistry coupled to mass spectrometry using two different types of electrolytic cells, in order to clarify the existing inconsistencies with respect to this topic. At around 1 V vs. Pd/H2, initiated by a one-electron transfer, the oxidation of colchicine sets off leading to a cation radical, whose further oxidation may evolve on several different pathways. The main product of the anodic electrochemical reaction, regardless of the carrier solution's pH, is a 7-hydroxy derivative of colchicine. At more anodic potentials (above 1.4 V vs. Pd/H2) compounds arising from epoxidation and/or multiple hydroxylation occur. No di- or tridemethylated quinone structures, as previously suggested in the literature for the electrolytic oxidation of colchicine, have been detected in the mass spectra.

  13. Rapid Discrimination Among Putative Mechanistic Models of Biochemical Systems.

    Science.gov (United States)

    Lomnitz, Jason G; Savageau, Michael A

    2016-08-31

    An overarching goal in molecular biology is to gain an understanding of the mechanistic basis underlying biochemical systems. Success is critical if we are to predict effectively the outcome of drug treatments and the development of abnormal phenotypes. However, data from most experimental studies is typically noisy and sparse. This allows multiple potential mechanisms to account for experimental observations, and often devising experiments to test each is not feasible. Here, we introduce a novel strategy that discriminates among putative models based on their repertoire of qualitatively distinct phenotypes, without relying on knowledge of specific values for rate constants and binding constants. As an illustration, we apply this strategy to two synthetic gene circuits exhibiting anomalous behaviors. Our results show that the conventional models, based on their well-characterized components, cannot account for the experimental observations. We examine a total of 40 alternative hypotheses and show that only 5 have the potential to reproduce the experimental data, and one can do so with biologically relevant parameter values.

  14. Multiscale mechanistic modeling in pharmaceutical research and development.

    Science.gov (United States)

    Kuepfer, Lars; Lippert, Jörg; Eissing, Thomas

    2012-01-01

    Discontinuation of drug development projects due to lack of efficacy or adverse events is one of the main cost drivers in pharmaceutical research and development (R&D). Investments have to be written off and contribute to the total costs of a successful drug candidate receiving marketing authorization and allowing a return on investment. A vital risk for pharmaceutical innovator companies is late stage clinical failure since costs for individual clinical trials may exceed the one billion Euro threshold. To guide investment decisions and to safeguard maximum medical benefit and safety for patients recruited in clinical trials, it is therefore essential to understand the clinical consequences of all information and data generated. The complexity of the physiological and pathophysiological processes and the sheer amount of information available exceed the mental capacity of any human being and prevent a prediction of the success in clinical development. A rigorous integration of knowledge, assumptions, and experimental data into computational models promises a significant improvement in the rationalization of decision making in the pharmaceutical industry. We here give an overview of the current status of modeling and simulation in pharmaceutical R&D and outline the perspectives of more recent developments in mechanistic modeling. Specific modeling approaches for different biological scales ranging from intracellular processes to whole organism physiology are introduced and an example for integrative multiscale modeling of therapeutic efficiency in clinical oncology trials is showcased.

  15. Confinement effects and mechanistic aspects for montmorillonite nanopores.

    Science.gov (United States)

    Li, Xiong; Zhu, Chang; Jia, Zengqiang; Yang, Gang

    2018-08-01

    Owing to the ubiquity, critical importance and special properties, confined microenvironments have recently triggered overwhelming interest. In this work, all-atom molecular dynamics simulations have been conducted to address the confinement effects and ion-specific effects for electrolyte solutions within montmorillonite nanopores, where the pore widths vary with a wide range. The adsorption number, structure, dynamics and stability of inner- and outer-sphere metal ions are affected by the change of pore widths (confinement effects), while the extents are significantly dependent on the type of adsorbed species. The type of adsorbed species is, however, not altered by the magnitude of confinement effects, and confinement effects are similar for different electrolyte concentrations. Ion-specific effects are pronounced for all magnitudes of confinement effects (from non- to strong confined conditions), and Hofmeister sequences of outer-sphere species are closely associated with the magnitude of confinement effects while those of inner-sphere species remain consistent. In addition, mechanistic aspects of confinement have been posed using the electrical double layer theories, and the results can be generalized to other confined systems that are ubiquitous in biology, chemistry, geology and nanotechnology. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. Analytical techniques for mechanistic characterization of EUV photoresists

    Science.gov (United States)

    Grzeskowiak, Steven; Narasimhan, Amrit; Murphy, Michael; Ackerman, Christian; Kaminsky, Jake; Brainard, Robert L.; Denbeaux, Greg

    2017-03-01

    Extreme ultraviolet (EUV, 13.5 nm) lithography is the prospective technology for high volume manufacturing by the microelectronics industry. Significant strides towards achieving adequate EUV source power and availability have been made recently, but a limited rate of improvement in photoresist performance still delays the implementation of EUV. Many fundamental questions remain to be answered about the exposure mechanisms of even the relatively well understood chemically amplified EUV photoresists. Moreover, several groups around the world are developing revolutionary metal-based resists whose EUV exposure mechanisms are even less understood. Here, we describe several evaluation techniques to help elucidate mechanistic details of EUV exposure mechanisms of chemically amplified and metal-based resists. EUV absorption coefficients are determined experimentally by measuring the transmission through a resist coated on a silicon nitride membrane. Photochemistry can be evaluated by monitoring small outgassing reaction products to provide insight into photoacid generator or metal-based resist reactivity. Spectroscopic techniques such as thin-film Fourier transform infrared (FTIR) spectroscopy can measure the chemical state of a photoresist system pre- and post-EUV exposure. Additionally, electrolysis can be used to study the interaction between photoresist components and low energy electrons. Collectively, these techniques improve our current understanding of photomechanisms for several EUV photoresist systems, which is needed to develop new, better performing materials needed for high volume manufacturing.
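
    The transmission measurement described in this record yields the absorption coefficient through the Beer-Lambert relation. The sketch below shows the arithmetic with hypothetical transmission and film-thickness values, and assumes the bare-membrane contribution has already been divided out.

```python
import numpy as np

def absorption_coefficient(T, thickness_nm):
    """Beer-Lambert estimate alpha = -ln(T) / t from a transmission measurement
    through a resist film (membrane transmission assumed already removed)."""
    return -np.log(T) / (thickness_nm * 1e-7)   # nm -> cm, so alpha is in 1/cm

T_measured = 0.62        # hypothetical transmitted fraction at 13.5 nm
t_film = 50.0            # hypothetical film thickness in nm
alpha = absorption_coefficient(T_measured, t_film)
print(f"alpha ~ {alpha:.3g} 1/cm  (absorbance per um ~ {alpha * 1e-4:.2f})")
```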

  17. Mechanistic Features of Nanodiamonds in the Lapping of Magnetic Heads

    Directory of Open Access Journals (Sweden)

    Xionghua Jiang

    2014-01-01

    Full Text Available Nanodiamonds, which are the main components of slurry in the precision lapping process of magnetic heads, play an important role in surface quality. This paper studies the mechanistic features of nanodiamond embedment into a Sn plate in the lapping process. This is the first study to develop mathematical models for nanodiamond embedment. Such models can predict the optimum parameters for particle embedment. From the modeling calculations, the embedded pressure satisfies p0 = (3/2)·W/(πa²) and the indentation depth satisfies δ = k1·√(P/HV). Calculation results reveal that the largest embedded pressure is 731.48 GPa and the critical indentation depth δ is 7 nm. Atomic force microscopy (AFM), scanning electron microscopy (SEM), and Auger electron spectroscopy (AES) were used to carry out surface quality detection and analysis of the disk head. Both the formation of black spots on the surface and the removal rate have an important correlation with the size of nanodiamonds. The results demonstrate that an improved removal rate (21 nm·min⁻¹) can be obtained with 100 nm diamonds embedded in the plate.

  18. Mechanistic features of nanodiamonds in the lapping of magnetic heads.

    Science.gov (United States)

    Jiang, Xionghua; Chen, Zhenxing; Wolfram, Joy; Yang, Zhizhou

    2014-01-01

    Nanodiamonds, which are the main components of slurry in the precision lapping process of magnetic heads, play an important role in surface quality. This paper studies the mechanistic features of nanodiamond embedment into a Sn plate in the lapping process. This is the first study to develop mathematical models for nanodiamond embedment. Such models can predict the optimum parameters for particle embedment. From the modeling calculations, the embedded pressure satisfies p0 = (3/2)·W/(πa²) and the indentation depth satisfies δ = k1·√(P/HV). Calculation results reveal that the largest embedded pressure is 731.48 GPa and the critical indentation depth δ is 7 nm. Atomic force microscopy (AFM), scanning electron microscopy (SEM), and Auger electron spectroscopy (AES) were used to carry out surface quality detection and analysis of the disk head. Both the formation of black spots on the surface and the removal rate have an important correlation with the size of nanodiamonds. The results demonstrate that an improved removal rate (21 nm·min⁻¹) can be obtained with 100 nm diamonds embedded in the plate.

  19. Advanced Ceramics

    International Nuclear Information System (INIS)

    1989-01-01

    The First Florida-Brazil Seminar on Materials and the Second State Meeting on new materials in Rio de Janeiro State show the specific technical contributions in the advanced ceramics sector. The other main topics discussed for the development of the country are the advanced ceramics programs, the market, the national technical-scientific capability, advanced ceramics patents, etc. (C.G.C.) [pt

  20. Breathing (and Coding?) a Bit Easier: Changes to International Classification of Disease Coding for Pulmonary Hypertension.

    Science.gov (United States)

    Mathai, Stephen C; Mathew, Sherin

    2018-04-20

    The International Classification of Disease (ICD) coding system is broadly utilized by healthcare providers, hospitals, healthcare payers, and governments to track health trends and statistics at the global, national, and local levels and to provide a reimbursement framework for medical care based upon diagnosis and severity of illness. The current iteration of the ICD system, ICD-10, was implemented in 2015. While many changes to the prior ICD-9 system were included in the ICD-10 system, the newer revision failed to adequately reflect advances in the clinical classification of certain diseases such as pulmonary hypertension (PH). Recently, a proposal to modify the ICD-10 codes for PH was considered and ultimately adopted for inclusion as updates to the ICD-10 coding system. While these revisions better reflect the current clinical classification of PH, in the future, further changes should be considered to improve the accuracy and ease of coding for all forms of PH. Copyright © 2018. Published by Elsevier Inc.