WorldWideScience

Sample records for tracing code reproducing

  1. The MIMIC Code Repository: enabling reproducibility in critical care research.

    Science.gov (United States)

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
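    The repository pattern described above, loading records into a relational structure and deriving clinical concepts with one shared query, can be sketched in miniature. The table, columns, and toy "severity" concept below are illustrative stand-ins, not the real MIMIC-III schema or the repository's actual code:

```python
import sqlite3

# Miniature of the MIMIC Code Repository pattern: load data into a relational
# structure, then derive a clinical concept with one shared, versioned query so
# every study computes it identically. Table and column names are hypothetical.

def build_db() -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE vitals (stay_id INTEGER, heart_rate REAL, sbp REAL)")
    conn.executemany(
        "INSERT INTO vitals VALUES (?, ?, ?)",
        [(1, 95, 110), (1, 130, 85), (2, 70, 120)],
    )
    return conn

# A toy "severity" concept: worst (highest) heart rate per ICU stay.
SEVERITY_SQL = """
SELECT stay_id, MAX(heart_rate) AS worst_hr
FROM vitals GROUP BY stay_id ORDER BY stay_id
"""

def worst_heart_rate(conn):
    return conn.execute(SEVERITY_SQL).fetchall()

if __name__ == "__main__":
    conn = build_db()
    print(worst_heart_rate(conn))  # [(1, 130.0), (2, 70.0)]
```

    Because the concept lives in one shared SQL string rather than in each study's private code, two analyses that both call it are guaranteed to agree on its definition.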

  2. Safety related investigations of the VVER-1000 reactor type by the coupled code system TRACE/PARCS

    International Nuclear Information System (INIS)

    Jaeger, Wadim; Lischke, Wolfgang; Sanchez Espinoza, Victor Hugo

    2007-01-01

    This study was performed at the Institute of Reactor Safety at the Research Center Karlsruhe. It is embedded in the ongoing investigations of the international code application and maintenance program (CAMP) for the qualification and validation of system codes like TRACE [1] and PARCS [2]. The predestined reactor type for the validation of these two codes was the Russian-designed VVER-1000, because the OECD/NEA VVER-1000 Coolant Transient Benchmark Phase 2 [3] includes detailed information on the Bulgarian nuclear power plant (NPP) Kozloduy unit 6. The post-test investigations of a coolant mixing experiment have shown that the predicted parameters (coolant temperature, pressure drop, etc.) are in good agreement with the measured data. The coolant mixing pattern, especially in the downcomer, was also reproduced quite well by TRACE. The coupled code system TRACE/PARCS, which was applied to a postulated main steam line break (MSLB), provides good results compared to the reference values and to those of other participants in the benchmark. It can be pointed out that the developed three-dimensional nodalisation of the reactor pressure vessel (RPV) is appropriate for the description of transients in which the thermal-hydraulics and the neutronics are strongly linked. (author)

  3. The Alba ray tracing code: ART

    Science.gov (United States)

    Nicolas, Josep; Barla, Alessandro; Juanhuix, Jordi

    2013-09-01

    The Alba ray tracing code (ART) is a suite of Matlab functions and tools for the ray tracing simulation of x-ray beamlines. The code is structured in different layers, which allow its usage as part of optimization routines as well as an easy control from a graphical user interface. Additional tools for slope error handling and for grating efficiency calculations are also included. Generic characteristics of ART include the accumulation of rays to improve statistics without memory limitations, and still providing normalized values of flux and resolution in physically meaningful units.
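    The accumulation idea mentioned above, improving statistics without memory limitations, can be illustrated with a toy tracer that keeps only running per-bin counts and discards each ray after scoring it. The "beamline" here is just a Gaussian stand-in for an optical model, and all parameters are invented:

```python
import random

# Batch ray tracing with accumulation: statistics improve with more rays while
# memory stays constant, because only per-bin sums are kept. The optical model
# (a Gaussian spread of the source) is a placeholder, not ART's.

N_BINS, HALF_WIDTH = 32, 4.0  # detector histogram extent, arbitrary units

def trace_batch(n_rays, bins, rng):
    for _ in range(n_rays):
        x = rng.gauss(0.0, 1.0)                        # ray position at detector
        i = int((x + HALF_WIDTH) / (2 * HALF_WIDTH) * N_BINS)
        if 0 <= i < N_BINS:
            bins[i] += 1                               # score, then discard the ray
    return bins

def run(n_batches, batch_size, seed=0):
    rng = random.Random(seed)
    bins = [0] * N_BINS
    for _ in range(n_batches):
        trace_batch(batch_size, bins, rng)
    total = sum(bins)
    return [b / total for b in bins]                   # normalized flux profile

if __name__ == "__main__":
    profile = run(n_batches=100, batch_size=1000)
    print(profile.index(max(profile)))                 # peak bin near the center
```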

  4. Particle tracing code for multispecies gas

    International Nuclear Information System (INIS)

    Eaton, R.R.; Fox, R.L.; Vandevender, W.H.

    1979-06-01

    Details are presented of the development of a computer code designed to calculate the flow of a multispecies gas mixture using particle tracing techniques. The technique eliminates the need for a full simulation by utilizing local time-averaged velocity distribution functions to obtain the dynamic properties of probable collision partners. This concept reduces the statistical scatter experienced in conventional Monte Carlo simulations. The technique is applicable to flow problems involving gas mixtures with disparate masses and trace constituents in the Knudsen number (Kn) range from 1.0 to less than 0.01. The resulting code has previously been used to analyze several aerodynamic isotope enrichment devices.

  5. Perspective on the audit calculation for SFR using TRACE code

    Energy Technology Data Exchange (ETDEWEB)

    Shin, An Dong; Choi, Yong Won; Bang, Young Suk; Bae, Moo Hoon; Huh, Byung Gil; Seol, Kwang One [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2012-10-15

    The Korean Sodium Cooled Fast Reactor (SFR) is being developed by KAERI, and the prototype SFR will be the first SFR submitted for licensing. KINS recently started research programs to prepare for the licensing of new design concepts. Safety analysis for a given reactor is based on computational estimation with conservatism and/or modeling uncertainty. For the audit calculation for the sodium cooled fast reactor (SFR), the TRACE code is considered as one of the analytical tools, since TRACE already includes sodium-related properties and models and has been applied abroad to liquid-metal coolant systems. The applicability of the TRACE code to the SFR is checked before the actual audit calculation. In this study, the steady-state conditions of the Demonstration Fast Reactor (DFR) 600 are simulated to identify areas where the TRACE modeling needs improvement.

  6. Comparative study of boron transport models in NRC Thermal-Hydraulic Code Trace

    Energy Technology Data Exchange (ETDEWEB)

    Olmo-Juan, Nicolás; Barrachina, Teresa; Miró, Rafael; Verdú, Gumersindo; Pereira, Claubia, E-mail: nioljua@iqn.upv.es, E-mail: tbarrachina@iqn.upv.es, E-mail: rmiro@iqn.upv.es, E-mail: gverdu@iqn.upv.es, E-mail: claubia@nuclear.ufmg.br [Institute for Industrial, Radiophysical and Environmental Safety (ISIRYM). Universitat Politècnica de València (Spain); Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2017-07-01

    Recently, interest in the study of various types of transients involving changes in the boron concentration inside the reactor has led to increased interest in developing and studying new models and tools that allow a correct study of boron transport. Accordingly, a significant variety of boron transport models and spatial difference schemes are available in thermal-hydraulic codes such as TRACE. In this work, the results obtained using the different boron transport models implemented in the NRC thermal-hydraulic code TRACE are compared. To do this, a set of models has been created using the different options and configurations that could influence boron transport. These models reproduce a simple event of filling or emptying the boron concentration in a long pipe. Moreover, with the aim of comparing the differences obtained when one-dimensional or three-dimensional components are chosen, many different cases have been modeled using only pipe components or a mix of pipe and vessel components. In addition, the influence of the void fraction on boron transport has been studied and compared under conditions close to a commercial BWR model. A final collection of the different cases and boron transport models is compared, both among themselves and against the corresponding analytical solution provided by the Burgers equation. From this comparison, important conclusions are drawn that will be the basis for adequately modeling boron transport in TRACE. (author)
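    The kind of numerical experiment the abstract describes, a boron front filling a long pipe compared against an analytical transported step, can be sketched with a first-order upwind scheme. The scheme choice, velocities, and mesh below are illustrative, not TRACE's actual boron transport models:

```python
# 1D advection of a boron concentration front along a pipe, first-order upwind.
# Mass is conserved, but the scheme smears the sharp front over several cells
# (numerical diffusion), which is exactly the kind of behavior such comparisons
# against the exact transported step are designed to expose.

def upwind_advect(c, u, dx, dt, steps, c_in):
    """Advect concentration c with velocity u; c_in is the inlet concentration."""
    lam = u * dt / dx               # Courant number, must be <= 1 for stability
    assert lam <= 1.0
    for _ in range(steps):
        new = c[:]
        new[0] = c[0] - lam * (c[0] - c_in)
        for i in range(1, len(c)):
            new[i] = c[i] - lam * (c[i] - c[i - 1])
        c = new
    return c

def exact_step(n, dx, u, t, c_in):
    """Exact solution: the inlet step has simply travelled a distance u*t."""
    front = u * t
    return [c_in if (i + 0.5) * dx < front else 0.0 for i in range(n)]

if __name__ == "__main__":
    n, dx, u, dt, steps = 100, 0.1, 1.0, 0.05, 60   # Courant number 0.5
    c = upwind_advect([0.0] * n, u, dx, dt, steps, c_in=1.0)
    e = exact_step(n, dx, u, steps * dt, c_in=1.0)
    # conserved mass vs. the smeared front near the exact discontinuity
    print(sum(c) * dx, c[28], e[28])
```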

  7. Documentation for TRACE: an interactive beam-transport code

    International Nuclear Information System (INIS)

    Crandall, K.R.; Rusthoi, D.P.

    1985-01-01

    TRACE is an interactive, first-order, beam-dynamics computer program. TRACE includes space-charge forces and mathematical models for a number of beamline elements not commonly found in beam-transport codes, such as permanent-magnet quadrupoles, rf quadrupoles, rf gaps, accelerator columns, and accelerator tanks. TRACE provides an immediate graphic display of calculated results, has a powerful and easy-to-use command procedure, includes eight different types of beam-matching or -fitting capabilities, and contains its own internal HELP package. This report describes the models and equations used for each of the transport elements, the fitting procedures, and the space-charge/emittance calculations, and provides detailed instructions for using the code.
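    First-order beam dynamics of the kind this TRACE implements reduces, in its simplest form, to 2x2 transfer matrices acting on a (position, angle) ray vector. The sketch below shows only a drift and a thin-lens quadrupole in one plane; the element values are arbitrary, and none of TRACE's space-charge or fitting machinery is represented:

```python
# First-order ("matrix") beam transport in one transverse plane: each element
# is a 2x2 matrix acting on the ray vector (x, x'), and a beamline is the
# product of its element matrices, rightmost element acting first.

def drift(length):
    return [[1.0, length], [0.0, 1.0]]

def thin_quad(focal_length):
    # Thin-lens approximation of a focusing quadrupole in this plane.
    return [[1.0, 0.0], [-1.0 / focal_length, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(m, ray):
    x, xp = ray
    return (m[0][0] * x + m[0][1] * xp, m[1][0] * x + m[1][1] * xp)

if __name__ == "__main__":
    # A parallel ray through a thin lens (f = 2 m) followed by a 2 m drift
    # must cross the axis at the focal point, i.e. end at x = 0.
    line = matmul(drift(2.0), thin_quad(2.0))
    print(apply(line, (0.01, 0.0)))  # (0.0, -0.005)
```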

  8. SolTrace: A Ray-Tracing Code for Complex Solar Optical Systems

    Energy Technology Data Exchange (ETDEWEB)

    Wendelin, Tim [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lewandowski, Allan [Allan Lewandowski Solar Consulting LLC, Evergreen, CO (United States)

    2013-10-01

    SolTrace is an optical simulation tool designed to model optical systems used in concentrating solar power (CSP) applications. The code was first written in early 2003, but has seen significant modifications and changes since its inception, including conversion from a Pascal-based software development platform to C++. SolTrace is unique in that it can model virtually any optical system utilizing the sun as the source. It has been made available for free and as such is in use worldwide by industry, universities, and research laboratories. The fundamental design of the code is discussed, including enhancements and improvements over the earlier version. Comparisons are made with other optical modeling tools, both non-commercial and commercial in nature. Finally, modeled results are shown for some typical CSP systems and, in one case, compared to measured optical data.

  9. Evaluation of ATLAS 100% DVI Line Break Using TRACE Code

    International Nuclear Information System (INIS)

    Huh, Byung Gil; Bang, Young Seok; Cheong, Ae Ju; Woo, Sweng Woong

    2011-01-01

    ATLAS (Advanced Thermal-Hydraulic Test Loop for Accident Simulation) is an integral effect test facility at KAERI. Its installation was completed in 2005 to simulate accidents for the OPR1000 and the APR1400. Since then, several tests for LBLOCA and DVI line break have been performed successfully to resolve the safety issues of the APR1400. In particular, a DVI line break is considered a separate spectrum among the SBLOCAs in the APR1400, because the DVI line is directly connected to the reactor vessel and the thermal-hydraulic behaviors are expected to differ from those for cold leg injection. However, there are not enough experimental data for the DVI line break. Therefore, the integral effect data for the DVI line break of ATLAS are very useful and available for the improvement and validation of safety codes. For the DVI line break in ATLAS, several analyses using the MARS and RELAP codes were performed in the ATLAS DSP (Domestic Standard Problem) meetings. However, the TRACE code had not yet been used to simulate a DVI line break. TRACE has been developed as the unified code for reactor thermal-hydraulic analyses at the USNRC. In this study, the 100% DVI line break in ATLAS was evaluated with the TRACE code. The objectives of this study are to identify the prediction capability of the TRACE code for the major thermal-hydraulic phenomena of a DVI line break in ATLAS.

  10. Papa, a Particle Tracing Code in Pascal

    NARCIS (Netherlands)

    Haselhoff, E.H.; Haselhoff, Eltjo H.; Ernst, G.J.

    1990-01-01

    During the design of a 10 μm high-gain FEL oscillator (TEUFEL Project) we developed a new particle-tracing code to perform simulations of thermionic- and photo-cathode electron injectors/accelerators. The program allows predictions of current, energy and beam emittance in a user-specified linac.

  11. Evolvix BEST Names for semantic reproducibility across code2brain interfaces.

    Science.gov (United States)

    Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2017-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  12. Simulation of the KAERI PASCAL Test with MARS-KS and TRACE Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Won; Cheong, Aeju; Shin, Andong; Cho, Min Ki [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    In order to validate the operational performance of the PAFS, KAERI has performed an experimental investigation using the PASCAL (PAFS Condensing heat removal Assessment Loop) facility. In this study, we simulated the KAERI PASCAL SS-540-P1 test with the MARS-KS V1.4 and TRACE V5.0 p4 codes to assess the code predictability for the condensation heat transfer inside the passive auxiliary feedwater system. The calculated results for the heat flux, the inner wall surface temperature of the condensing tube, the fluid temperature, and the steam mass flow rate are compared with the experimental data. The results show that MARS-KS generally under-predicts the heat fluxes, while TRACE over-predicts the heat flux at the tube inlet region and under-predicts it at the tube outlet region. The TRACE prediction shows about 3% more steam condensation than the MARS-KS prediction.

  13. Simulation of the turbine trip transient with the code Trace

    International Nuclear Information System (INIS)

    Mejia S, D. M.; Filio L, C.

    2014-10-01

    In this paper, the results are shown of the simulation of the turbine trip transient that occurred in Unit 1 of the Laguna Verde nuclear power plant (NPP-LV), carried out with the model of this unit for the best-estimate code Trace. The results obtained with the code Trace are compared with those obtained from the Process Information Integral System (PIIS) of the NPP-LV. The reactor pressure, the level behavior in the downcomer, the steam flow, and the flow rate through the recirculation circuits are compared. The results of the simulation for the operating power of 2027 MWt show agreement with the PIIS data. (Author)

  14. A comparison of the reproducibility of manual tracing and on-screen digitization for cephalometric profile variables

    NARCIS (Netherlands)

    Dvortsin, D. P.; Sandham, John; Pruim, G. J.; Dijkstra, P. U.

    2008-01-01

    The aim of this investigation was to analyse and compare the reproducibility of manual cephalometric tracings with on-screen digitization using a soft tissue analysis. A random sample of 20 lateral cephalometric radiographs, in the natural head posture, was selected. On-screen digitization using

  15. Feasibility Study of Coupling the CASMO-4/TABLES-3/SIMULATE-3 Code System to TRACE/PARCS

    International Nuclear Information System (INIS)

    Demaziere, Christophe; Staalek, Mathias

    2004-12-01

    This report investigates the feasibility of coupling the Studsvik Scandpower CASMO-4/TABLES-3/SIMULATE-3 codes to the US NRC TRACE/PARCS codes. The data required by TRACE/PARCS are actually those necessary to run its neutronic module PARCS. Such data are the macroscopic nuclear cross-sections, some microscopic nuclear cross-sections important for the xenon and samarium poisoning effects, the Assembly Discontinuity Factors, and the kinetic parameters. All these data can be retrieved from the Studsvik Scandpower codes. The data functionalization is explained in detail for both systems of codes, and the possibility of coupling each of these codes to TRACE/PARCS is discussed. Due to confidentiality restrictions on the use of the CASMO-4 files and to an improper format of the TABLES-3 output files, it is demonstrated that TRACE/PARCS can only be coupled to SIMULATE-3. Specifically dedicated SIMULATE-3 input decks make it easy to edit the neutronic data at specific operating statepoints. Although the data functionalization differs between the two systems of codes, such a procedure permits reconstructing a set of data directly compatible with PARCS.

  16. Assessment of TRACE code against CHF experiments

    International Nuclear Information System (INIS)

    Audrius Jasiulevicius; Rafael Macian-Juan; Paul Coddington

    2005-01-01

    Full text of publication follows: This paper reports on the validation of the USNRC 'consolidated' code TRACE with data obtained during Critical Heat Flux (CHF) experiments in single channels, both round tubes and annuli. CHF is one of the key reactor safety parameters, because it determines the conditions for the onset of transition boiling in the core rod bundles, leading to the low heat transfer rates characteristic of the post-CHF heat transfer regime. In the context of the participation of PSI in the International Programme for uncertainty analysis BEMUSE, we have carried out extensive work for the validation of some important TRACE models. The present work is aimed at assessing the range of validity of the CHF correlations and post-CHF heat transfer models currently included in TRACE. The heat transfer experiments selected for the assessment were performed at the Royal Institute of Technology (RIT) in Stockholm, Sweden, and at the Atomic Energy Establishment in Winfrith, UK. The experimental investigations of CHF and post-CHF heat transfer at RIT, for flow of water in vertical tubes and annuli, were performed at pressures ranging from 1 to 20 MPa and coolant mass fluxes from 500 to 3000 kg/m²s. The liquid was subcooled by 10 deg. C and 40 deg. C at the inlet of the test section. The experiments were performed on two different types of test sections. Experiments with uniformly heated single 7.0 m long tubes were carried out with three different inner tube diameters of 10, 14.9 and 24.7 mm. A series of experiments with non-uniform axial power distribution was also conducted in order to study the effect of the axial heat flux distribution on the CHF conditions in both 7.0 m long single tubes and a 3.65 m long annulus. Several different axial power profiles were employed, with bottom, middle and top power peaks as well as double-humped axial power profiles.
In total more than 100 experiments with uniform axial heat flux distribution and several hundreds

  17. Two-phase wall friction model for the trace computer code

    International Nuclear Information System (INIS)

    Wang Weidong

    2005-01-01

    The wall drag model in the TRAC/RELAP5 Advanced Computational Engine computer code (TRACE) has certain known deficiencies. For example, in an annular flow regime, the code predicts an unphysically high liquid velocity compared to the experimental data. To address those deficiencies, a new wall friction drag package has been developed and implemented in the TRACE code to model the wall drag for a two-phase flow system code. The modeled flow regimes are (1) annular/mist, (2) bubbly/slug, and (3) bubbly/slug with wall nucleation. The new models use the void fraction (instead of the flow quality) as the correlating variable to minimize calculation oscillation. In addition, the models allow for transitions between the three regimes. The annular/mist regime is subdivided into three separate regimes for pure annular flow, annular flow with entrainment, and film breakdown. For adiabatic two-phase bubbly/slug flows, the vapor phase primarily exists outside of the boundary layer, and the wall shear calculation uses the single-phase liquid velocity. The vapor-phase wall friction drag is set to zero for bubbly/slug flows. For bubbly/slug flows with wall nucleation, the bubbles are present within the hydrodynamic boundary layer, and the two-phase wall friction drag is significantly higher, with a pronounced mass flux effect. An empirical correlation has been studied and applied to account for nucleate boiling. Verification and validation tests have been performed, and the test results showed a significant code improvement. (authors)
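    The regime selection logic the abstract outlines, correlating wall drag with void fraction and allowing transitions between regimes, might look schematically like the following. The threshold values are invented for illustration; the actual TRACE transition criteria differ:

```python
# Schematic regime map keyed on void fraction, the correlating variable the
# abstract describes. Thresholds below are hypothetical illustration values.

BUBBLY_SLUG_MAX = 0.75   # hypothetical upper void fraction for bubbly/slug
ANNULAR_MIN = 0.80       # hypothetical lower void fraction for annular/mist

def wall_friction_regime(void_fraction, wall_nucleation=False):
    if not 0.0 <= void_fraction <= 1.0:
        raise ValueError("void fraction must lie in [0, 1]")
    if void_fraction >= ANNULAR_MIN:
        return "annular/mist"
    if void_fraction <= BUBBLY_SLUG_MAX:
        # wall nucleation puts bubbles inside the boundary layer, so it is
        # treated as a distinct regime with higher wall friction
        return "bubbly/slug with wall nucleation" if wall_nucleation else "bubbly/slug"
    # band between the thresholds: transition between the two regimes
    return "transition"

if __name__ == "__main__":
    print(wall_friction_regime(0.3), wall_friction_regime(0.9))
```

    Using the void fraction as the single correlating variable, as the abstract notes, keeps the regime boundaries smooth functions of one state quantity, which helps avoid the oscillation that can arise when regimes are keyed on flow quality.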

  18. Divided multimodal attention sensory trace and context coding strategies in spatially congruent auditory and visual presentation.

    Science.gov (United States)

    Kristjánsson, Tómas; Thorvaldsson, Tómas Páll; Kristjánsson, Arni

    2014-01-01

    Previous research involving both unimodal and multimodal studies suggests that single-response change detection is a capacity-free process while a discriminatory up or down identification is capacity-limited. The trace/context model assumes that this reflects different memory strategies rather than inherent differences between identification and detection. To perform such tasks, one of two strategies is used, a sensory trace or a context coding strategy, and if one is blocked, people will automatically use the other. A drawback to most preceding studies is that stimuli are presented at separate locations, creating the possibility of a spatial confound, which invites alternative interpretations of the results. We describe a series of experiments, investigating divided multimodal attention, without the spatial confound. The results challenge the trace/context model. Our critical experiment involved a gap before a change in volume and brightness, which according to the trace/context model blocks the sensory trace strategy, simultaneously with a roaming pedestal, which should block the context coding strategy. The results clearly show that people can use strategies other than sensory trace and context coding in the tasks and conditions of these experiments, necessitating changes to the trace/context model.

  19. Analysis of an XADS Target with the System Code TRACE

    International Nuclear Information System (INIS)

    Jaeger, Wadim; Sanchez Espinoza, Victor H.; Feng, Bo

    2008-01-01

    Accelerator-driven systems (ADS) present an option to reduce the radioactive waste of the nuclear industry. The experimental Accelerator-Driven System (XADS) has been designed to investigate the feasibility of using ADS on an industrial scale to burn minor actinides. The target section lies in the middle of the subcritical core and is bombarded by a proton beam to produce spallation neutrons. The thermal energy produced from this reaction requires a heat removal system for the target section. The target is cooled by liquid lead-bismuth-eutectics (LBE) in the primary system which in turn transfers the heat via a heat exchanger (HX) to the secondary coolant, Diphyl THT (DTHT), a synthetic diathermic fluid. Since this design is still in development, a detailed investigation of the system is necessary to evaluate the behavior during normal and transient operations. Due to the lack of experimental facilities and data for ADS, the analyses are mostly done using thermal hydraulic codes. In addition to evaluating the thermal hydraulics of the XADS, this paper also benchmarks a new code developed by the NRC, TRACE, against other established codes. The events used in this study are beam power switch-on/off transients and a loss of heat sink accident. The obtained results from TRACE were in good agreement with the results of various other codes. (authors)

  20. Simulation of the turbine trip transient with the code Trace; Simulacion del transitorio disparo de turbina con el codigo TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Mejia S, D. M.; Filio L, C., E-mail: dulcemaria.mejia@cnsns.gob.mx [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Jose Ma. Barragan No. 779, Col. Narvarte, 03020 Mexico D. F. (Mexico)

    2014-10-15

    In this paper, the results are shown of the simulation of the turbine trip transient that occurred in Unit 1 of the Laguna Verde nuclear power plant (NPP-LV), carried out with the model of this unit for the best-estimate code Trace. The results obtained with the code Trace are compared with those obtained from the Process Information Integral System (PIIS) of the NPP-LV. The reactor pressure, the level behavior in the downcomer, the steam flow, and the flow rate through the recirculation circuits are compared. The results of the simulation for the operating power of 2027 MWt show agreement with the PIIS data. (Author)

  1. TRACE/VALKIN: a neutronics-thermohydraulics coupled code to analyze strong 3D transients

    Energy Technology Data Exchange (ETDEWEB)

    Rafael Miro; Gumersindo Verdu; Ana Maria Sanchez [Chemical and Nuclear Engineering Department. Polytechnic University of Valencia. Cami de Vera s/n. 46022 Valencia (Spain); Damian Ginestar [Applied Mathematics Department. Polytechnic University of Valencia. Cami de Vera s/n. 46022 Valencia (Spain)

    2005-07-01

    Full text of publication follows: A nuclear reactor simulator consists mainly of two different blocks, which solve the models used for the basic physical phenomena taking place in the reactor. Thus, there is a neutronic module, which simulates the neutron balance in the reactor core, and a thermal-hydraulics module, which simulates the heat transfer in the fuel, the heat transfer from the fuel to the water, and the different condensation and evaporation processes taking place in the reactor core and in the condenser systems. TRACE is a two-phase, two-fluid thermal-hydraulic reactor systems analysis code. The TRACE acronym stands for TRAC/RELAP Advanced Computational Engine, reflecting its ability to run both RELAP5 and TRAC legacy input models. It includes a three-dimensional kinetics module called PARCS for performing advanced analysis of coupled core thermal-hydraulic/kinetics problems. The TRACE-VALKIN code is a new time-domain analysis code to study transients in LWRs. This code uses the best-estimate code TRACE to account for the heat transfer and thermal-hydraulic processes, together with a 3D neutronics module. This module has two options: the MODKIN option, which makes use of a modal method based on the assumption that the neutronic flux can be approximately expanded in terms of the dominant lambda modes associated with a static configuration of the reactor core, and the NOKIN option, which uses a one-step backward discretization of the neutron diffusion equation. The lambda modes are obtained using the Implicitly Restarted Arnoldi approach or the Jacobi-Davidson algorithm. To check the performance of the coupled code TRACE-VALKIN against complex 3D neutronic transients, using the cross-section tables generated with the translator SIMTAB from SIMULATE to TRACE/VALKIN, the Cofrentes NPP SCRAM-61 transient is simulated. Cofrentes NPP is a General Electric BWR-6 design located in the region of Valencia (Spain). 
It is in operation since 1985 and currently in its fifteenth

  2. Assessment of GOTHIC and TRACE codes against selected PANDA experiments on a Passive Containment Condenser

    Energy Technology Data Exchange (ETDEWEB)

    Papini, Davide, E-mail: davide.papini@psi.ch; Adamsson, Carl; Andreani, Michele; Prasser, Horst-Michael

    2014-10-15

    Highlights: • Code comparison on the performance of a Passive Containment Condenser. • Simulation of separate effect tests with pure steam and non-condensable gases. • Role of the secondary side and accuracy of pool boiling models are discussed. • GOTHIC and TRACE predict the experimental performance with slight underestimation. • Recirculatory flow pattern with injection of light non-condensable gas is inferred. - Abstract: Typical passive safety systems for ALWRs (Advanced Light Water Reactors) rely on the condensation of steam to remove the decay heat from the core or the containment. In the present paper the three-dimensional containment code GOTHIC and the one-dimensional system code TRACE are compared on the calculation of a variety of phenomena characterizing the response of a passive condenser submerged in a boiling pool. The investigation addresses the conditions of interest for the Passive Containment Cooling System (PCCS) proposed for the ESBWR (Economic Simplified Boiling Water Reactor). The analysis of selected separate effect tests carried out on a PCC (Passive Containment Condenser) unit in the PANDA large-scale thermal-hydraulic facility is presented to assess the code predictions. Both pure steam conditions (operating pressure of 3 bar, 6 bar and 9 bar) and the effect on the condensation heat transfer of non-condensable gases heavier than steam (air) and lighter than steam (helium) are considered. The role of the secondary side (pool side) heat transfer on the condenser performance is examined too. In general, this study shows that both the GOTHIC and TRACE codes are able to reasonably predict the heat transfer capability of the PCC as well as the influence of non-condensable gas on the system. A slight underestimation of the condenser performance is obtained with both codes. For those tests where the experimental and simulated efficiencies agree better the possibility of compensating errors among different parts of the heat transfer

  3. [Transposition errors during learning to reproduce a sequence by the right- and the left-hand movements: simulation of positional and movement coding].

    Science.gov (United States)

    Liakhovetskiĭ, V A; Bobrova, E V; Skopin, G N

    2012-01-01

    Transposition errors during the reproduction of a hand movement sequence make it possible to obtain important information on the internal representation of this sequence in the motor working memory. Analysis of such errors showed that learning to reproduce sequences of the left-hand movements improves the system of positional coding (coding of positions), while learning of the right-hand movements improves the system of vector coding (coding of movements). Learning of the right-hand movements after the left-hand performance involved the system of positional coding "imposed" by the left hand. Learning of the left-hand movements after the right-hand performance activated the system of vector coding. Transposition errors during learning to reproduce movement sequences can be explained by a neural network using either vector coding or both vector and positional coding.
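The distinction between positional and vector coding can be made concrete with a toy sequence representation. This is a sketch of the two coding schemes only, not the authors' neural network model; the sequence and coordinates are invented.

```python
def positional_code(positions):
    """Positional coding: store every target position absolutely."""
    return list(positions)

def vector_code(positions):
    """Vector coding: store the start point plus successive displacements."""
    moves = [(bx - ax, by - ay)
             for (ax, ay), (bx, by) in zip(positions, positions[1:])]
    return positions[0], moves

def decode_vectors(start, moves):
    """Replay the movements. An error in one stored vector shifts every
    later position, whereas a corrupted stored position shifts only
    itself -- one way transposition errors can betray the code in use."""
    path = [start]
    for dx, dy in moves:
        x, y = path[-1]
        path.append((x + dx, y + dy))
    return path

seq = [(0, 0), (1, 0), (1, 2), (3, 2)]
start, moves = vector_code(seq)
print(decode_vectors(start, moves) == seq)
```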

  4. Neutronic / thermal-hydraulic coupling with the code system Trace / Parcs

    International Nuclear Information System (INIS)

    Mejia S, D. M.; Del Valle G, E.

    2015-09-01

    The models developed for the Parcs and Trace codes, corresponding to cycle 15 of Unit 1 of the Laguna Verde nuclear power plant, are described. The first is focused on the neutronic simulation and the second on the thermal hydraulics. The model developed for Parcs consists of a core of 444 fuel assemblies wrapped in a radial reflector layer and two axial reflector layers, one above and one below the core. The core consists of 27 axial planes in total. The model for Trace includes the vessel and its internal components as well as various safety systems. The coupling between the two codes is achieved through two maps that allow their intercommunication. Both codes are used in coupled form to perform a dynamic simulation that acceptably reaches a steady state, from which the closure of all the main steam isolation valves (MSIVs) is carried out, followed by the actuation of the safety relief valves (SRVs) and the ECCS. The results for the power and for the reactivities introduced by the moderator density and the fuel temperature, as well as the total reactivity, are shown. Data are also provided on the behavior of the pressure in the steam dome, the water level in the downcomer, and the flow through the MSIVs and SRVs. The results for the power, the pressure in the steam dome and the water level in the downcomer are explained and show agreement with the actions of the MSIVs, SRVs and ECCS. (Author)
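The two intercommunication maps can be pictured as a many-to-one assignment of neutronic nodes to thermal-hydraulic channels: power is collapsed onto channels, and channel feedback is expanded back onto assemblies. The sketch below uses an invented 4-assembly/2-channel layout (the real model has 444 assemblies) and is not PARCS/TRACE input syntax.

```python
# Invented mapping: which T/H channel each fuel assembly belongs to.
assembly_to_channel = {0: 0, 1: 0, 2: 1, 3: 1}

def collapse_power(assembly_power, mapping, n_channels):
    """Neutronics -> thermal hydraulics: sum assembly powers per channel."""
    channel_power = [0.0] * n_channels
    for a, p in enumerate(assembly_power):
        channel_power[mapping[a]] += p
    return channel_power

def expand_feedback(channel_values, mapping):
    """Thermal hydraulics -> neutronics: every assembly receives the
    moderator density / fuel temperature of its channel."""
    return [channel_values[mapping[a]] for a in sorted(mapping)]

power_ch = collapse_power([10.0, 12.0, 9.0, 11.0], assembly_to_channel, 2)
density  = expand_feedback([720.0, 735.0], assembly_to_channel)  # kg/m3
print(power_ch, density)
```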

  5. A versatile ray-tracing code for studying rf wave propagation in toroidal magnetized plasmas

    International Nuclear Information System (INIS)

    Peysson, Y; Decker, J; Morini, L

    2012-01-01

    A new ray-tracing code named C3PO has been developed to study the propagation of arbitrary electromagnetic radio-frequency (rf) waves in magnetized toroidal plasmas. Its structure is designed for maximum flexibility regarding the choice of coordinate system and dielectric model. The versatility of this code makes it particularly suitable for integrated modeling systems. Using a coordinate system that reflects the nested structure of magnetic flux surfaces in tokamaks, fast and accurate calculations inside the plasma separatrix can be performed using analytical derivatives of a spline-Fourier interpolation of the axisymmetric toroidal MHD equilibrium. Applications to the reversed-field pinch magnetic configuration are also included. The effects of 3D perturbations of the axisymmetric toroidal MHD equilibrium, due to the discreteness of the magnetic coil system or to plasma fluctuations, are also studied in an original quasi-optical approach. Using a Runge–Kutta–Fehlberg method for solving the set of ordinary differential equations, the ray-tracing code is extensively benchmarked against analytical models and other codes for lower hybrid and electron cyclotron waves. (paper)
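The Runge–Kutta–Fehlberg scheme mentioned above can be sketched compactly. The integrator below uses the standard RKF 4(5) tableau with a fixed step (C3PO's actual ray equations and step control are not reproduced); the test problem is a paraxial ray in a parabolic-index medium, x'' = -g²x, whose exact path is x(z) = cos(gz), with the focusing strength g chosen arbitrarily.

```python
import math

def rkf45_step(f, t, y, h):
    """One Runge-Kutta-Fehlberg 4(5) step; returns the 5th-order update
    and the embedded 4th/5th-order error estimate."""
    def comb(*terms):
        return [yi + h * sum(c * k[j] for c, k in terms)
                for j, yi in enumerate(y)]
    k1 = f(t, y)
    k2 = f(t + h/4,      comb((1/4, k1)))
    k3 = f(t + 3*h/8,    comb((3/32, k1), (9/32, k2)))
    k4 = f(t + 12*h/13,  comb((1932/2197, k1), (-7200/2197, k2), (7296/2197, k3)))
    k5 = f(t + h,        comb((439/216, k1), (-8.0, k2), (3680/513, k3),
                              (-845/4104, k4)))
    k6 = f(t + h/2,      comb((-8/27, k1), (2.0, k2), (-3544/2565, k3),
                              (1859/4104, k4), (-11/40, k5)))
    y5 = comb((16/135, k1), (6656/12825, k3), (28561/56430, k4),
              (-9/50, k5), (2/55, k6))
    y4 = comb((25/216, k1), (1408/2565, k3), (2197/4104, k4), (-1/5, k5))
    err = max(abs(a - b) for a, b in zip(y4, y5))
    return y5, err

g = 0.5                                   # focusing strength (assumed)
ray = lambda z, s: [s[1], -g * g * s[0]]  # state s = [x, dx/dz]
state, z, h = [1.0, 0.0], 0.0, 0.01
while z < 2 * math.pi / g - 1e-12:
    state, _ = rkf45_step(ray, z, state, h)
    z += h
print(abs(state[0] - math.cos(g * z)))    # error vs. the analytic ray
```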

  6. High fidelity analysis of a BWR fuel assembly with the COBRA-TF/PARCS and TRACE codes

    International Nuclear Information System (INIS)

    Abarca, A.; Miro, R.; Barrachina, T.; Verdu, G.; Soler, A.

    2013-01-01

    The growing importance of a detailed reactor core and fuel assembly description for light water reactors (LWRs), as well as of sub-channel safety analysis, requires high fidelity models and coupled neutronic/thermal-hydraulic codes. Hand in hand with advances in computer technology, nuclear safety analysis is beginning to use more detailed thermal hydraulics and neutronics. Previously, a PWR core model and a 16 by 16 fuel assembly model were developed to test and validate our COBRA-TF/PARCS v2.7 (CTF/PARCS) coupled code. In this work, a comparison of the modeling and simulation advantages and disadvantages of a modern 10 by 10 BWR fuel assembly with the CTF/PARCS and TRACE codes has been carried out. The objective of the comparison is to make known the main advantages of using sub-channel codes to perform high resolution nuclear safety analysis. Sub-channel codes like CTF permit obtaining accurate predictions, in two-phase flow regimes, of the thermal-hydraulic parameters important to safety, with high local resolution. The modeled BWR fuel assembly has 91 fuel rods (81 full length and 10 partial length fuel rods) and a large square central water rod. This assembly has been modeled in high detail with the CTF code, using the BWR modeling parameters provided by TRACE. The same neutronic PARCS model has been used for the simulation with both codes. To compare the codes, a coupled steady state calculation has been performed. (author)

  7. Assessing flow paths in a karst aquifer based on multiple dye tracing tests using stochastic simulation and the MODFLOW-CFP code

    Science.gov (United States)

    Assari, Amin; Mohammadi, Zargham

    2017-09-01

    Karst systems show high spatial variability of hydraulic parameters over small distances, and this makes their modeling a difficult task with several uncertainties. Interconnections of fractures play a major role in the transport of groundwater, but many of the stochastic methods in use do not have the capability to reproduce these complex structures. A methodology is presented for the quantification of tortuosity using the single normal equation simulation (SNESIM) algorithm and a groundwater flow model. A training image was produced based on the statistical parameters of fractures and then used in the simulation process. The SNESIM algorithm was used to generate 75 realizations of the four classes of fractures in a karst aquifer in Iran. The results from six dye tracing tests were used to assign hydraulic conductivity values to each class of fractures. In the next step, the MODFLOW-CFP and MODPATH codes were consecutively implemented to compute the groundwater flow paths. The 9,000 flow paths obtained from the MODPATH code were further analyzed to calculate the tortuosity factor. Finally, the hydraulic conductivity values calculated from the dye tracing experiments were refined using the actual flow paths of groundwater. The key outcomes of this research are: (1) a methodology for the quantification of tortuosity; (2) hydraulic conductivities that are otherwise incorrectly estimated (biased low) by empirical equations that assume Darcian (laminar) flow with parallel rather than tortuous streamlines; and (3) an understanding of the scale-dependence and non-normal distributions of tortuosity.
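The tortuosity factor computed from the particle paths has a simple definition: travelled path length divided by the straight-line (chord) distance between the endpoints. A minimal sketch with invented coordinates, not the study's flow paths:

```python
import math

def path_length(vertices):
    """Sum of segment lengths along a polyline flow path."""
    return sum(math.dist(a, b) for a, b in zip(vertices, vertices[1:]))

def tortuosity(vertices):
    """Path length over the chord between start and end (always >= 1)."""
    return path_length(vertices) / math.dist(vertices[0], vertices[-1])

straight = [(0, 0), (5, 0), (10, 0)]
winding  = [(0, 0), (3, 4), (6, 0), (10, 3), (10, 0)]
print(tortuosity(straight), tortuosity(winding))
# In the study this factor would be averaged over the ~9,000 simulated
# paths to refine the conductivities from the dye tracing tests.
```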

  8. Digitized forensics: retaining a link between physical and digital crime scene traces using QR-codes

    Science.gov (United States)

    Hildebrandt, Mario; Kiltz, Stefan; Dittmann, Jana

    2013-03-01

    The digitization of physical traces from crime scenes in forensic investigations in effect creates a digital chain-of-custody and entails the challenge of creating a link between the two or more representations of the same trace. In order to be forensically sound, the two security aspects of integrity and authenticity especially need to be maintained at all times. In particular, adherence to authenticity using technical means proves to be a challenge at the boundary between the physical object and its digital representations. In this article we propose a new method of linking physical objects with their digital counterparts using two-dimensional bar codes and additional meta-data accompanying the acquired data, for integration into the conventional documentation of the collection of items of evidence (the bagging and tagging process). Using the QR-code, chosen as an exemplary implementation of a bar code, and a model of the forensic process, we also supply a means to integrate our suggested approach into forensically sound proceedings as described by Holder et al.1 We use the example of digital dactyloscopy as a forensic discipline, where progress is currently being made by digitizing some of the processing steps. We show an exemplary demonstrator of the suggested approach using a smartphone as a mobile device for the verification of the physical trace, to extend the chain-of-custody from the physical to the digital domain. Our evaluation of the demonstrator addresses the readability and the verification of its contents. Using various devices, we can read the bar code despite its limited size of 42 x 42 mm and the rather large amount of embedded data. Furthermore, the QR-code's error correction features help to recover the contents of damaged codes. Subsequently, our appended digital signature allows for detecting malicious manipulations of the embedded data.
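The essential content of such a tag — a reference to the physical item, a hash of its digital representation, and a signature binding them — can be sketched as below. The record layout is invented, and a keyed HMAC stands in for the paper's digital signature scheme; the resulting payload would then be rendered as a QR-code.

```python
import hashlib, hmac, json

KEY = b"examiner-secret"   # stand-in; a real deployment would use
                           # asymmetric signatures, not a shared key

def make_tag(evidence_id, acquisition_bytes, meta):
    """Payload for the QR label: case metadata plus a hash of the
    digitized trace, and a signature over the complete record."""
    record = dict(meta, evidence_id=evidence_id,
                  sha256=hashlib.sha256(acquisition_bytes).hexdigest())
    payload = json.dumps(record, sort_keys=True)
    sig = hmac.new(KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload, sig

def verify_tag(payload, sig):
    expected = hmac.new(KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

payload, sig = make_tag("case-42/item-7", b"<scanned fingerprint data>",
                        {"collected_by": "examiner A", "scene": "lab"})
print(verify_tag(payload, sig))
print(verify_tag(payload.replace("item-7", "item-8"), sig))  # tampered
```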

  9. Adaptation and implementation of the TRACE code for transient analysis of lead-cooled fast reactor designs

    International Nuclear Information System (INIS)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2014-01-01

    The article describes the changes implemented in the TRACE code to include thermodynamic tables for liquid lead drawn from experimental results. It then explains the process of developing a thermal-hydraulic model of the ALFRED prototype and the analysis of a selection of representative transients conducted within the framework of international research projects. The study demonstrates the applicability of the TRACE code to simulating lead-cooled fast reactor designs and shows the high safety margins available in this technology to accommodate the most severe transients identified in its safety study. (Author)

  10. Toward Reproducible Computational Research: An Empirical Analysis of Data and Code Policy Adoption by Journals.

    Directory of Open Access Journals (Sweden)

    Victoria Stodden

    Full Text Available Journal policy on research data and code availability is an important part of the ongoing shift toward publishing reproducible computational science. This article extends the literature by studying journal data sharing policies by year (for both 2011 and 2012) for a referent set of 170 journals. We make a further contribution by evaluating code sharing policies, supplemental materials policies, and open access status for these 170 journals for each of 2011 and 2012. We build a predictive model of open data and code policy adoption as a function of impact factor and publisher and find higher impact journals more likely to have open data and code policies and scientific societies more likely to have open data and code policies than commercial publishers. We also find open data policies tend to lead open code policies, and we find no relationship between open data and code policies and either supplemental material policies or open access journal status. Of the journals in this study, 38% had a data policy, 22% had a code policy, and 66% had a supplemental materials policy as of June 2012. This reflects a striking one year increase of 16% in the number of data policies, a 30% increase in code policies, and a 7% increase in the number of supplemental materials policies. We introduce a new dataset to the community that categorizes data and code sharing, supplemental materials, and open access policies in 2011 and 2012 for these 170 journals.

  11. Toward Reproducible Computational Research: An Empirical Analysis of Data and Code Policy Adoption by Journals.

    Science.gov (United States)

    Stodden, Victoria; Guo, Peixuan; Ma, Zhaokun

    2013-01-01

    Journal policy on research data and code availability is an important part of the ongoing shift toward publishing reproducible computational science. This article extends the literature by studying journal data sharing policies by year (for both 2011 and 2012) for a referent set of 170 journals. We make a further contribution by evaluating code sharing policies, supplemental materials policies, and open access status for these 170 journals for each of 2011 and 2012. We build a predictive model of open data and code policy adoption as a function of impact factor and publisher and find higher impact journals more likely to have open data and code policies and scientific societies more likely to have open data and code policies than commercial publishers. We also find open data policies tend to lead open code policies, and we find no relationship between open data and code policies and either supplemental material policies or open access journal status. Of the journals in this study, 38% had a data policy, 22% had a code policy, and 66% had a supplemental materials policy as of June 2012. This reflects a striking one year increase of 16% in the number of data policies, a 30% increase in code policies, and a 7% increase in the number of supplemental materials policies. We introduce a new dataset to the community that categorizes data and code sharing, supplemental materials, and open access policies in 2011 and 2012 for these 170 journals.
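A predictive model of this kind is typically a logistic regression of policy adoption on impact factor and publisher type. The sketch below fits such a model by plain gradient ascent on synthetic journals generated to mimic the paper's finding (higher impact and society publishers adopt more often); neither the data nor the coefficients are the study's.

```python
import math, random

random.seed(1)
# Synthetic journals: (log impact factor, society-publisher flag) -> policy.
TRUE_W = (-2.5, 1.5, 1.0)                       # bias, impact, society (assumed)
data = []
for _ in range(300):
    x1, x2 = random.uniform(0.0, 3.0), float(random.random() < 0.5)
    p = 1 / (1 + math.exp(-(TRUE_W[0] + TRUE_W[1]*x1 + TRUE_W[2]*x2)))
    data.append(((x1, x2), 1.0 if random.random() < p else 0.0))

w = [0.0, 0.0, 0.0]
for _ in range(1500):           # gradient ascent on the log-likelihood
    grad = [0.0, 0.0, 0.0]
    for (x1, x2), y in data:
        p = 1 / (1 + math.exp(-(w[0] + w[1]*x1 + w[2]*x2)))
        for j, xj in enumerate((1.0, x1, x2)):
            grad[j] += (y - p) * xj
    w = [wi + 0.5 * gi / len(data) for wi, gi in zip(w, grad)]

print("fitted coefficients:", [round(wi, 2) for wi in w])
```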

  12. Statistical safety evaluation of BWR turbine trip scenario using coupled neutron kinetics and thermal hydraulics analysis code SKETCH-INS/TRACE5.0

    International Nuclear Information System (INIS)

    Ichikawa, Ryoko; Masuhara, Yasuhiro; Kasahara, Fumio

    2012-01-01

    The Best Estimate Plus Uncertainty (BEPU) method has been prepared for the regulatory cross-check analysis at the Japan Nuclear Energy Safety Organization (JNES) on the basis of the three-dimensional neutron-kinetics/thermal-hydraulics coupled code SKETCH-INS/TRACE5.0. In the preparation, TRACE5.0 was verified against the large-scale thermal-hydraulic tests carried out at the NUPEC facility. These tests were focused on the pressure drop of steam-liquid two-phase flow and on the void fraction distribution. From the comparison of the experimental data with other codes (RELAP5/MOD3.3 and TRAC-BF1), TRACE5.0 was judged to perform better than the other codes. It was confirmed that TRACE5.0 is highly reliable for thermal-hydraulic behavior and can be used as a best-estimate code for the statistical safety evaluation. Next, the coupled code SKETCH-INS/TRACE5.0 was applied to the turbine trip tests performed at the Peach Bottom-2 BWR4 plant. The turbine trip event shows a rapid power peak due to void collapse with the pressure increase. The analyzed peak value of the core power is simulated better than with the previous version, SKETCH-INS/TRAC-BF1. Finally, the statistical safety evaluation using SKETCH-INS/TRACE5.0 was applied to the loss-of-load transient to examine the influence of the choice of sampling method. (author)
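BEPU evaluations of this kind commonly size their random sampling with Wilks' formula, which fixes the number of code runs needed for a one-sided tolerance bound. Whether JNES used exactly this recipe is not stated in the abstract, so the sketch below is generic.

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest number N of code runs such that the largest observed
    output bounds the `coverage` quantile with probability `confidence`
    (first-order, one-sided Wilks): 1 - coverage**N >= confidence."""
    n = 1
    while 1 - coverage**n < confidence:
        n += 1
    return n

print(wilks_sample_size(0.95, 0.95))   # the classic 95/95 answer: 59
print(wilks_sample_size(0.95, 0.99))
```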

  13. Non-coding RNA detection methods combined to improve usability, reproducibility and precision

    Directory of Open Access Journals (Sweden)

    Kreikemeyer Bernd

    2010-09-01

    Full Text Available Abstract Background Non-coding RNAs gain more attention as their diverse roles in many cellular processes are discovered. At the same time, the need for efficient computational prediction of ncRNAs increases with the pace of sequencing technology. Existing tools are based on various approaches and techniques, but none of them provides a reliable ncRNA detector yet. Consequently, a natural approach is to combine existing tools. Due to a lack of standard input and output formats, the combination and comparison of existing tools is difficult. Also, for genomic scans they often need to be incorporated into detection workflows using custom scripts, which decreases transparency and reproducibility. Results We developed a Java-based framework to integrate existing tools and methods for ncRNA detection. This framework enables users to construct transparent detection workflows and to combine and compare different methods efficiently. We demonstrate the effectiveness of combining detection methods in case studies with the small genomes of Escherichia coli, Listeria monocytogenes and Streptococcus pyogenes. With the combined method, we gained 10% to 20% precision for sensitivities from 30% to 80%. Further, we investigated Streptococcus pyogenes for novel ncRNAs. Using multiple methods integrated by our framework, we determined four highly probable candidates. We verified all four candidates experimentally using RT-PCR. Conclusions We have created an extensible framework for practical, transparent and reproducible combination and comparison of ncRNA detection methods. We have proven the effectiveness of this approach in tests and by guiding experiments to find new ncRNAs. The software is freely available under the GNU General Public License (GPL, version 3) at http://www.sbi.uni-rostock.de/moses along with source code, screen shots, examples and tutorial material.
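The precision gain from combining detectors comes from requiring agreement between tools, at some cost in sensitivity. A toy illustration with invented predictions — this is not the framework's API, only the underlying idea:

```python
def metrics(predicted, truth):
    """Precision and sensitivity of a candidate set against known ncRNAs."""
    tp = len(predicted & truth)
    precision = tp / len(predicted) if predicted else 1.0
    sensitivity = tp / len(truth)
    return precision, sensitivity

truth  = {f"ncRNA{i}" for i in range(10)}
tool_a = truth | {"fp1", "fp2", "fp3", "fp4"}                 # sensitive, noisy
tool_b = {f"ncRNA{i}" for i in range(8)} | {"fp1", "fp5"}

consensus = tool_a & tool_b    # keep only candidates both tools report
for name, pred in [("A", tool_a), ("B", tool_b), ("A&B", consensus)]:
    p, s = metrics(pred, truth)
    print(f"{name:4s} precision={p:.2f} sensitivity={s:.2f}")
```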

  14. Validation and application of the system code TRACE for safety related investigations of innovative nuclear energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Wadim

    2011-12-19

    The system code TRACE is the latest development of the U.S. Nuclear Regulatory Commission (US NRC). TRACE, developed for the analysis of operational conditions, transients and accidents of light water reactors (LWR), is a best-estimate code with two-fluid, six-equation models for mass, energy, and momentum conservation, and related closure models. Since TRACE is mainly applied to LWR-specific issues, the validation process related to innovative nuclear systems (liquid metal cooled systems, systems operated with supercritical water, etc.) is very limited, almost nonexistent. In this work, an essential contribution to the validation of TRACE related to lead and lead-alloy cooled systems, as well as to systems operated with supercritical water, is provided in a consistent and coordinated way. In a first step, model discrepancies in the TRACE source code were removed. These inconsistencies caused the wrong prediction of the thermophysical properties of supercritical water and lead-bismuth eutectic, and hence the incorrect prediction of heat-transfer-relevant characteristic numbers like the Reynolds or Prandtl number. In addition to the correction of the models predicting these quantities, models describing the thermophysical properties of lead and Diphyl THT (a synthetic heat transfer medium) were implemented. Several experiments and numerical benchmarks were used to validate the modified TRACE version. These experiments, mainly focused on wall-to-fluid heat transfer, revealed that not only the thermophysical properties but also the heat transfer models were afflicted with inconsistencies. The models for heat transfer to liquid metals were enhanced so that the code can now distinguish between pipe and bundle flow and use the appropriate correlation. Heat transfer to supercritical water was previously not modeled in TRACE at all; completely new routines were implemented to overcome that issue.
The comparison of the calculations to the experiments showed, on one hand, the necessity
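Property implementations of this kind come down to table lookup with interpolation. The sketch below tabulates molten-lead density from a commonly quoted linear fit, ρ(T) ≈ 11367 − 1.1944·T kg/m³ (T in K), and interpolates in it; the values and table layout are illustrative, not TRACE's internal routines.

```python
from bisect import bisect_left

# Density of molten lead, kg/m3, tabulated from the linear fit
# rho(T) = 11367 - 1.1944*T (T in K) -- a commonly quoted correlation.
T_K = [650.0, 800.0, 950.0, 1100.0]
RHO = [11367 - 1.1944 * t for t in T_K]

def rho_lead(t):
    """Piecewise-linear interpolation in the property table."""
    if not T_K[0] <= t <= T_K[-1]:
        raise ValueError("temperature outside table range")
    i = max(1, bisect_left(T_K, t))
    f = (t - T_K[i - 1]) / (T_K[i] - T_K[i - 1])
    return RHO[i - 1] + f * (RHO[i] - RHO[i - 1])

print(round(rho_lead(700.0), 1))
```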

  15. Critical review of conservation equations for two-phase flow in the U.S. NRC TRACE code

    International Nuclear Information System (INIS)

    Wulff, Wolfgang

    2011-01-01

    Research highlights: → Field equations as implemented in TRACE are incorrect. → Boundary conditions needed for cooling of nuclear fuel elements are wrong. → The two-fluid model in TRACE is not closed. → Three-dimensional flow modeling in TRACE has no basis. - Abstract: The field equations for two-phase flow in the computer code TRAC/RELAP Advanced Computational Engine or TRACE are examined to determine their validity, their capabilities and limitations in resolving nuclear reactor safety issues. TRACE was developed for the NRC to predict thermohydraulic phenomena in nuclear power plants during operational transients and postulated accidents. TRACE is based on the rigorously derived and well-established two-fluid field equations for 1-D and 3-D two-phase flow. It is shown that: (1) The two-fluid field equations for mass conservation as implemented in TRACE are wrong, because local mass balances in TRACE are in conflict with mass conservation for the whole reactor system. (2) Wrong equations of motion are used in TRACE in place of momentum balances, compromising at branch points the prediction of momentum transfer between, and the coupling of, loops in hydraulic networks by impedance (form loss and wall shear) and by inertia, and thereby the simulation of reactor component interactions. (3) Most seriously, TRACE calculation of heat transfer from fuel elements is incorrect for single and two-phase flows, because the corresponding equation of the TRACE Manual is wrong. (4) Boundary conditions for momentum and energy balances in TRACE are restricted to flow regimes with single-phase wall contact, because TRACE lacks constitutive relations for solid-fluid exchange of momentum and heat in prevailing flow regimes. Without a quantified assessment of consequences from (3) to (4), predictions of phasic fluid velocities, fuel temperatures and important safety parameters, e.g., peak clad temperature, are questionable. Moreover, TRACE cannot predict 3-D single- or

  16. Comet assay in reconstructed 3D human epidermal skin models—investigation of intra- and inter-laboratory reproducibility with coded chemicals

    Science.gov (United States)

    Pfuhler, Stefan

    2013-01-01

    Reconstructed 3D human epidermal skin models are being used increasingly for safety testing of chemicals. Based on EpiDerm™ tissues, an assay was developed in which the tissues were topically exposed to test chemicals for 3h followed by cell isolation and assessment of DNA damage using the comet assay. Inter-laboratory reproducibility of the 3D skin comet assay was initially demonstrated using two model genotoxic carcinogens, methyl methane sulfonate (MMS) and 4-nitroquinoline-n-oxide, and the results showed good concordance among three different laboratories and with in vivo data. In Phase 2 of the project, intra- and inter-laboratory reproducibility was investigated with five coded compounds with different genotoxicity liability tested at three different laboratories. For the genotoxic carcinogens MMS and N-ethyl-N-nitrosourea, all laboratories reported a dose-related and statistically significant increase (P 30% cell loss), and the overall response was comparable in all laboratories despite some differences in doses tested. The results of the collaborative study for the coded compounds were generally reproducible among the laboratories involved and intra-laboratory reproducibility was also good. These data indicate that the comet assay in EpiDerm™ skin models is a promising model for the safety assessment of compounds with a dermal route of exposure. PMID:24150594

  17. TRACE and TRAC-BF1 benchmark against Leibstadt plant data during the event inadvertent opening of relief valves

    Energy Technology Data Exchange (ETDEWEB)

    Sekhri, A.; Baumann, P. [Kernkraftwerk Leibstadt AG, 5325 Leibstadt (Switzerland); Wicaksono, D. [Swiss Federal Inst. of Technology Zurich ETH, 8092 Zurich (Switzerland); Miro, R.; Barrachina, T.; Verdu, G. [Inst. for Industrial, Radiophysical and Environmental Safety ISIRYM, Universitat Politecnica de Valencia UPV, Cami de Vera s/n, 46021 Valencia (Spain)

    2012-07-01

    In the framework of introducing the TRACE code into the transient analysis system codes for the Leibstadt Power Plant (KKL), a conversion process of the existing TRAC-BF1 model to TRACE has been started within KKL. In the first step, a TRACE thermal-hydraulic model for KKL has been developed based on the existing TRAC-BF1 model. In order to assess the code models, a simulation of a plant transient event is required. To this end, simulations of the inadvertent opening of 8 relief valves event have been performed. The event occurred at KKL during normal operation, and it started when 8 relief valves opened, resulting in depressurization of the Reactor Pressure Vessel (RPV). The reactor was shut down safely by SCRAM at low level. The high pressure core spray (HPCS) and the reactor core isolation cooling (RCIC) were started manually in order to compensate for the level drop. The remaining water in the feedwater (FW) lines flashes due to saturation conditions originating from the RPV depressurization and refills the reactor downcomer. The plant boundary conditions have been used in the simulations, and the FW flow rate has been adjusted for better prediction. The simulations reproduce the plant data with good agreement. It can be concluded that the existing TRAC-BF1 model has been used successfully to develop the TRACE model and that the results of the calculations show good agreement with the plant recorded data. Besides the modeling assessment, the TRACE and TRAC-BF1 capabilities to reproduce the plant's physical behavior during the transient have shown satisfactory results. The first step of developing the KKL model for TRACE has been successfully achieved, and this model will be further developed in order to simulate more complex plant behavior such as a turbine trip. (authors)

  18. TRACE and TRAC-BF1 benchmark against Leibstadt plant data during the event inadvertent opening of relief valves

    International Nuclear Information System (INIS)

    Sekhri, A.; Baumann, P.; Wicaksono, D.; Miro, R.; Barrachina, T.; Verdu, G.

    2012-01-01

    In the framework of introducing the TRACE code into the transient analysis system codes for the Leibstadt Power Plant (KKL), a conversion process of the existing TRAC-BF1 model to TRACE has been started within KKL. In the first step, a TRACE thermal-hydraulic model for KKL has been developed based on the existing TRAC-BF1 model. In order to assess the code models, a simulation of a plant transient event is required. To this end, simulations of the inadvertent opening of 8 relief valves event have been performed. The event occurred at KKL during normal operation, and it started when 8 relief valves opened, resulting in depressurization of the Reactor Pressure Vessel (RPV). The reactor was shut down safely by SCRAM at low level. The high pressure core spray (HPCS) and the reactor core isolation cooling (RCIC) were started manually in order to compensate for the level drop. The remaining water in the feedwater (FW) lines flashes due to saturation conditions originating from the RPV depressurization and refills the reactor downcomer. The plant boundary conditions have been used in the simulations, and the FW flow rate has been adjusted for better prediction. The simulations reproduce the plant data with good agreement. It can be concluded that the existing TRAC-BF1 model has been used successfully to develop the TRACE model and that the results of the calculations show good agreement with the plant recorded data. Besides the modeling assessment, the TRACE and TRAC-BF1 capabilities to reproduce the plant's physical behavior during the transient have shown satisfactory results. The first step of developing the KKL model for TRACE has been successfully achieved, and this model will be further developed in order to simulate more complex plant behavior such as a turbine trip. (authors)

  19. Neutronic / thermal-hydraulic coupling with the code system Trace / Parcs; Acoplamiento neutronico / termohidraulico con el sistema de codigos TRACE / PARCS

    Energy Technology Data Exchange (ETDEWEB)

    Mejia S, D. M. [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan 779, Col. Narvarte, 03020 Ciudad de Mexico (Mexico); Del Valle G, E., E-mail: dulcemaria.mejia@cnsns.gob.mx [IPN, Escuela Superior de Fisica y Matematicas, Av. IPN s/n, Col. Lindavista, 07738 Ciudad de Mexico (Mexico)

    2015-09-15

    The models developed for the Parcs and Trace codes, corresponding to cycle 15 of Unit 1 of the Laguna Verde nuclear power plant, are described. The first is focused on the neutronic simulation and the second on the thermal hydraulics. The model developed for Parcs consists of a core of 444 fuel assemblies wrapped in a radial reflector layer and two axial reflector layers, one above and one below the core. The core consists of 27 axial planes in total. The model for Trace includes the vessel and its internal components as well as various safety systems. The coupling between the two codes is achieved through two maps that allow their intercommunication. Both codes are used in coupled form to perform a dynamic simulation that acceptably reaches a steady state, from which the closure of all the main steam isolation valves (MSIVs) is carried out, followed by the actuation of the safety relief valves (SRVs) and the ECCS. The results for the power and for the reactivities introduced by the moderator density and the fuel temperature, as well as the total reactivity, are shown. Data are also provided on the behavior of the pressure in the steam dome, the water level in the downcomer, and the flow through the MSIVs and SRVs. The results for the power, the pressure in the steam dome and the water level in the downcomer are explained and show agreement with the actions of the MSIVs, SRVs and ECCS. (Author)

  20. Mise en Scene: Conversion of Scenarios to CSP Traces for the Requirements-to-Design-to-Code Project

    Science.gov (United States)

    Carter, John D.; Gardner, William B.; Rash, James L.; Hinchey, Michael G.

    2007-01-01

    The "Requirements-to-Design-to-Code" (R2D2C) project at NASA's Goddard Space Flight Center is based on deriving a formal specification expressed in Communicating Sequential Processes (CSP) notation from system requirements supplied in the form of CSP traces. The traces, in turn, are to be extracted from scenarios, a user-friendly medium often used to describe the required behavior of computer systems under development. This work, called Mise en Scene, defines a new scenario medium (Scenario Notation Language, SNL) suitable for control-dominated systems, coupled with a two-stage process for automatic translation of scenarios to a new trace medium (Trace Notation Language, TNL) that encompasses CSP traces. Mise en Scene is offered as an initial solution to the problem of the scenarios-to-traces "D2" phase of R2D2C. A survey of the "scenario" concept and some case studies are also provided.
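The flavor of the scenarios-to-traces translation can be conveyed with a deliberately tiny stand-in notation. The real SNL and TNL languages are richer and are not reproduced here; the `when … then …` syntax below is invented for illustration only.

```python
import re

def scenario_to_trace(lines):
    """Translate 'when <event> then <event>' scenario steps into a flat,
    CSP-style event trace (a toy stand-in for the SNL -> TNL stages)."""
    trace = []
    for line in lines:
        m = re.fullmatch(r"when (\w+) then (\w+)", line.strip())
        if m is None:
            raise ValueError(f"unparsable scenario step: {line!r}")
        trace.extend(m.groups())
    return trace

scenario = ["when buttonPress then motorOn",
            "when limitSwitch then motorOff"]
trace = scenario_to_trace(scenario)
print("<" + ", ".join(trace) + ">")
```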

  1. Towards Reproducibility in Computational Hydrology

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, to evolve (or reject) hypotheses and models of how environmental systems function, and to move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even if they are made available, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al. 2016 [1], we argue that a cultural change is required in the computational hydrology community in order to advance, and make more robust, the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which is relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as
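One concrete step toward standard (2), well-documented workflows, is a manifest that pins the exact code, data and otherwise-unreported choices behind a result. A minimal sketch, with file names and parameters invented:

```python
import hashlib, json

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def build_manifest(code, data, params):
    """Record which code, which data and which parameter choices produced
    a result, so a later re-run can be checked against the same inputs."""
    return {
        "code_sha256": {name: digest(body) for name, body in code.items()},
        "data_sha256": {name: digest(body) for name, body in data.items()},
        "parameters": params,
    }

m1 = build_manifest({"model.py": b"def run(): ..."},
                    {"rainfall.csv": b"t,mm\n0,1.2\n"},
                    {"warmup_days": 30, "solver": "explicit_euler"})
m2 = build_manifest({"model.py": b"def run(): ..."},
                    {"rainfall.csv": b"t,mm\n0,1.2\n"},
                    {"warmup_days": 30, "solver": "explicit_euler"})
print(json.dumps(m1, sort_keys=True) == json.dumps(m2, sort_keys=True))
```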

  2. Simulation of a main steam line break with steam generator tube rupture using TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Gallardo, S.; Querol, A.; Verdu, G. [Departamento de Ingenieria Quimica Y Nuclear, Universitat Politecnica de Valencia, Camino de Vera s/n, 46022, Valencia (Spain)

    2012-07-01

    A simulation of the OECD/NEA ROSA-2 Project Test 5 was performed with the thermal-hydraulic code TRACE5. Test 5, performed in the Large Scale Test Facility (LSTF), reproduced a Main Steam Line Break (MSLB) with a Steam Generator Tube Rupture (SGTR) in a Pressurized Water Reactor (PWR). The result of these simultaneous breaks is a depressurization of the secondary and primary systems in loop B, because both systems are connected through the SGTR. Good agreement was obtained between TRACE5 results and experimental data. TRACE5 qualitatively reproduces the phenomena that occur in this transient: the primary pressure falls after the break, the pressure stagnates after the opening of the relief valve of the intact steam generator, the pressure falls after the two openings of the PORV, and the liquid level in the pressurizer recovers after each closure of the PORV. Furthermore, a sensitivity analysis has been performed to determine the effect of varying the High Pressure Injection (HPI) flow rate in both loops on the evolution of the system pressures. (authors)

  3. An analysis of options available for developing a common laser ray tracing package for Ares and Kull code frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Weeratunga, S K

    2008-11-06

    Ares and Kull are mature code frameworks that support ALE hydrodynamics for a variety of HEDP applications at LLNL, using two widely different meshing approaches. While Ares is based on a 2-D/3-D block-structured mesh data base, Kull is designed to support unstructured, arbitrary polygonal/polyhedral meshes. In addition, both frameworks are capable of running applications on large, distributed-memory parallel machines. Currently, both these frameworks separately support assorted collections of physics packages related to HEDP, including one for the energy deposition by laser/ion-beam ray tracing. This study analyzes the options available for developing a common laser/ion-beam ray tracing package that can be easily shared between these two code frameworks and concludes with a set of recommendations for its development.

  4. An empirical analysis of journal policy effectiveness for computational reproducibility.

    Science.gov (United States)

    Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun

    2018-03-13

    A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy (author provision of data and code postpublication upon request) an improvement over no policy, but currently insufficient for reproducibility.

  5. Adjustments in the Almod 3W2 code models for reproducing the net load trip test in Angra I nuclear power plant

    International Nuclear Information System (INIS)

    Camargo, C.T.M.; Madeira, A.A.; Pontedeiro, A.C.; Dominguez, L.

    1986-09-01

    The recorded traces obtained from the net load trip test in the Angra I NPP yielded the opportunity to make fine adjustments to the ALMOD 3W2 code models. The changes are described and the results are compared against real plant data. (Author) [pt

  6. Adaptation and implementation of the TRACE code for transient analysis in designs of lead-cooled fast reactors

    International Nuclear Information System (INIS)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2015-01-01

    The Lead-Cooled Fast Reactor (LFR) has been identified as one of the promising future reactor concepts in the technology road map of the Generation IV International Forum (GIF) as well as in the Deployment Strategy of the European Sustainable Nuclear Industrial Initiative (ESNII), both aiming at improved sustainability, enhanced safety, economic competitiveness, and proliferation resistance. This new nuclear reactor concept requires the development of computational tools to be applied in design and safety assessments to confirm the improved inherent and passive safety features of this design. One approach to this issue is to modify current computational codes developed for the simulation of Light Water Reactors towards their applicability to the new designs. This paper reports on the modifications performed on the TRACE system code to make it applicable to LFR safety assessments. The capabilities of the modified code are demonstrated on a series of benchmark exercises performed against other safety analysis codes. (Author)

  7. Adaptation and implementation of the TRACE code for transient analysis in designs of lead-cooled fast reactors; Adaptacion y aplicacion del codigo TRACE para el analisis de transitorios en disenos de reactores rapidos refrigerados por plomo

    Energy Technology Data Exchange (ETDEWEB)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2014-07-01

    The article describes the changes implemented in the TRACE code to include thermodynamic tables of liquid lead drawn from experimental results. It then explains the process of developing a thermal-hydraulic model for the ALFRED prototype and the analysis of a selection of representative transients conducted within the framework of international research projects. The study demonstrates the applicability of the TRACE code to simulate designs of lead-cooled fast reactors and shows the high safety margins available in this technology to accommodate the most severe transients identified in its safety study. (Author)

  8. Main considerations for modelling a station blackout scenario with TRACE

    International Nuclear Information System (INIS)

    Querol, Andrea; Turégano, Jara; Lorduy, María; Gallardo, Sergio; Verdú, Gumersindo

    2017-01-01

    In the nuclear safety field, the thermal-hydraulic phenomena that take place during an accident in a nuclear power plant are of special importance. One of the most studied accidents is the Station BlackOut (SBO). The aim of the present work is the analysis of the PKL integral test facility nodalization, using the thermal-hydraulic code TRACE5, to reproduce an SBO accidental scenario. The PKL facility reproduces the main components of the primary and secondary systems of its reference nuclear power plant (Philippsburg II). The results obtained with different nodalizations have been compared: 3D vessel vs 1D vessel, Steam Generator (SG) modelling using PIPE or TEE components, and pressurizer modelling with PIPE or PRIZER components. Both vessel nodalizations (1D vessel and 3D vessel) reproduce the physical phenomena of the experiment; however, there are significant discrepancies between them. The appropriate modelling of the SG is also relevant to the results. The remaining nodalization choices (PIPE or TEE components for the SG, and PIPE or PRIZER components for the pressurizer) do not produce relevant differences in the results. (author)

  9. Main considerations for modelling a station blackout scenario with TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Querol, Andrea; Turégano, Jara; Lorduy, María; Gallardo, Sergio; Verdú, Gumersindo, E-mail: anquevi@upv.es, E-mail: jaturna@upv.es, E-mail: maloral@upv.es, E-mail: sergalbe@iqn.upv.es, E-mail: gverdu@iqn.upv.es [Instituto Universitario de Seguridad Industrial, Radiofísica y Medioambiental (ISIRYM), Universitat Politècnica de València (Spain)

    2017-07-01

    In the nuclear safety field, the thermal-hydraulic phenomena that take place during an accident in a nuclear power plant are of special importance. One of the most studied accidents is the Station BlackOut (SBO). The aim of the present work is the analysis of the PKL integral test facility nodalization, using the thermal-hydraulic code TRACE5, to reproduce an SBO accidental scenario. The PKL facility reproduces the main components of the primary and secondary systems of its reference nuclear power plant (Philippsburg II). The results obtained with different nodalizations have been compared: 3D vessel vs 1D vessel, Steam Generator (SG) modelling using PIPE or TEE components, and pressurizer modelling with PIPE or PRIZER components. Both vessel nodalizations (1D vessel and 3D vessel) reproduce the physical phenomena of the experiment; however, there are significant discrepancies between them. The appropriate modelling of the SG is also relevant to the results. The remaining nodalization choices (PIPE or TEE components for the SG, and PIPE or PRIZER components for the pressurizer) do not produce relevant differences in the results. (author)

  10. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    Science.gov (United States)

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps, and a lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.
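    The extraction step of literate programming (often called "tangling": pulling the executable code out of the literate source) can be sketched in a few lines. The example below uses noweb-style chunk delimiters purely for illustration; Lir's actual chunk syntax may differ.

    ```python
    import re

    def tangle(literate_source: str) -> str:
        """Extract code chunks from a noweb-style literate document.

        A chunk starts with a line '<<name>>=' and ends at a line
        containing only '@' (noweb conventions, assumed for this sketch).
        Everything outside chunks is prose and is discarded.
        """
        code = []
        in_chunk = False
        for line in literate_source.splitlines():
            if re.match(r"<<.*>>=\s*$", line):
                in_chunk = True          # chunk header: start collecting
            elif in_chunk and line.strip() == "@":
                in_chunk = False         # chunk terminator: stop collecting
            elif in_chunk:
                code.append(line)        # code line inside a chunk
        return "\n".join(code)
    ```

    Running `tangle` over a document interleaving prose and chunks returns only the code lines, which can then be written to a file and executed.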

  11. Validation of the U.S. NRC coupled code system TRITON/TRACE/PARCS with the special power excursion reactor test III (SPERT III)

    Energy Technology Data Exchange (ETDEWEB)

    Wang, R. C.; Xu, Y.; Downar, T. [Dept. of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Ann Arbor, MI 48104 (United States); Hudson, N. [RES Div., U.S. NRC, Rockville, MD (United States)

    2012-07-01

    The Special Power Excursion Reactor Test III (SPERT III) was a series of reactivity insertion experiments conducted in the 1950s. This paper describes the validation of the U.S. NRC coupled code system TRITON/PARCS/TRACE for simulating reactivity insertion accidents (RIA) using several of the SPERT III tests. The work here used the SPERT III E-core configuration tests, in which the RIA was initiated by ejecting a control rod. The resulting super-prompt reactivity excursion and negative reactivity feedback produced the familiar bell-shaped power increase and decrease. The energy deposition during such a power peak has important safety consequences and provides a validation basis for coupled multi-physics core codes. The transients of five separate tests are used to benchmark the PARCS/TRACE coupled code. The models were thoroughly validated using the original experiment documentation. (authors)

  12. GRay: A MASSIVELY PARALLEL GPU-BASED CODE FOR RAY TRACING IN RELATIVISTIC SPACETIMES

    Energy Technology Data Exchange (ETDEWEB)

    Chan, Chi-kwan; Psaltis, Dimitrios; Özel, Feryal [Department of Astronomy, University of Arizona, 933 N. Cherry Ave., Tucson, AZ 85721 (United States)

    2013-11-01

    We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This graphics-processing-unit (GPU)-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single-precision floating-point arithmetic on a single GPU exceeds 300 GFLOP (or 1 ns per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing central-processing-unit-based ray-tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and light curves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of Kerr black holes and the photon rings that surround them. We also provide accurate fitting formulae of their dependencies on black hole spin and observer inclination, which can be used to interpret upcoming observations of the black holes at the center of the Milky Way, as well as M87, with the Event Horizon Telescope.

  13. Analysis of an ADS spurious opening event at a BWR/6 by means of the TRACE code

    International Nuclear Information System (INIS)

    Nikitin, Konstantin; Manera, Annalisa

    2011-01-01

    Highlights: → The spurious opening of 8 relief valves of the ADS system in a BWR/6 has been simulated. → The valves' opening results in a fast depressurization and significant loads on the RPV internals. → This event has been modeled by means of the TRACE and TRAC-BF1 codes. The results are in good agreement with the available plant data. - Abstract: The paper presents the results of a post-event analysis of a spurious opening of 8 relief valves of the automatic depressurization system (ADS) that occurred in a BWR/6. The opening of the relief valves results in a fast depressurization (pressure blowdown) of the primary system, which might lead to significant dynamic loads on the RPV and associated internals. In addition, the RPV level swelling caused by the fast depressurization might lead to undesired water carry-over into the steam line and through the safety relief valves (SRVs). Therefore, the transient needs to be characterized in terms of the evolution of pressure, temperature and fluid distribution in the system. This event has been modeled by means of the TRACE and TRAC-BF1 codes. The results are in good agreement with the plant data.

  14. SIMULATE-3 K coupled code applications

    Energy Technology Data Exchange (ETDEWEB)

    Joensson, Christian [Studsvik Scandpower AB, Vaesteraas (Sweden); Grandi, Gerardo; Judd, Jerry [Studsvik Scandpower Inc., Idaho Falls, ID (United States)

    2017-07-15

    This paper describes the coupled code system TRACE/SIMULATE-3 K/VIPRE and the application of this code system to the OECD PWR Main Steam Line Break. A short description is given for the application of the coupled system to analyze DNBR and the flexibility the system creates for the user. This includes the possibility to compare and evaluate the result with the TRACE/SIMULATE-3K (S3K) coupled code, the S3K standalone code (core calculation) as well as performing single-channel calculations with S3K and VIPRE. This is the typical separate-effect-analyses required for advanced calculations in order to develop methodologies to be used for safety analyses in general. The models and methods of the code systems are presented. The outline represents the analysis approach starting with the coupled code system, reactor and core model calculation (TRACE/S3K). This is followed by a more detailed core evaluation (S3K standalone) and finally a very detailed thermal-hydraulic investigation of the hot pin condition (VIPRE).

  15. Reproducibility and Transparency in Ocean-Climate Modeling

    Science.gov (United States)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and the variety of supercomputing technology. We have made progress on this problem in the context of a large project: the development of the new ocean and sea ice models MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data, and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than providing raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers, and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version-controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
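    The checksum bookkeeping described in this abstract (recording digests of experiment output under version control, then detecting when solutions change) can be sketched in a few lines of Python. The function names `checksum_manifest` and `solutions_changed` are illustrative, not part of the MOM6/SIS2 tooling:

    ```python
    import hashlib
    import json
    import pathlib

    def checksum_manifest(outdir: str) -> dict:
        """Map each output file in outdir to its SHA-256 digest."""
        return {
            p.name: hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(pathlib.Path(outdir).glob("*"))
            if p.is_file()
        }

    def solutions_changed(outdir: str, manifest_file: str) -> set:
        """Return the names of output files whose digests differ from the
        committed manifest (including files added or removed)."""
        current = checksum_manifest(outdir)
        recorded = json.loads(pathlib.Path(manifest_file).read_text())
        return {
            name
            for name in current.keys() | recorded.keys()
            if current.get(name) != recorded.get(name)
        }
    ```

    Committing the JSON manifest alongside the configuration makes an answer-changing commit immediately visible: re-running the experiment and comparing digests flags exactly which outputs moved.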

  16. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN......, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms' interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  17. Radiation heat transfer model in a spent fuel pool by TRACE code

    International Nuclear Information System (INIS)

    Sanchez-Saez, F.; Carlos, S.; Villanueva, J.F.; Martorell, S.

    2014-01-01

    Nuclear policies have experienced an important change since the Fukushima Daiichi nuclear plant accident, and the safety of spent fuel has become a central issue among safety concerns. The work presented consists of the thermal-hydraulic simulation of spent fuel pool behavior after a transient involving a loss of coolant through the transfer channel together with a loss of cooling. The simulation is done with the TRACE code. One of the most important variables that define the behavior of the pool is the cladding temperature, whose evolution depends on the heat emission. In this work, convection and radiation heat transfer are considered. When both heat transfer models are considered, a clear delay in achieving the maximum peak cladding temperature (1477 K) is observed compared with the simulation in which only convection heat transfer is considered. (authors)

  18. Analysis of PWR control rod ejection accident with the coupled code system SKETCH-INS/TRACE by incorporating pin power reconstruction model

    International Nuclear Information System (INIS)

    Nakajima, T.; Sakai, T.

    2010-01-01

    The pin power reconstruction model was incorporated in the 3-D nodal kinetics code SKETCH-INS in order to produce accurate calculations of three-dimensional pin power distributions throughout the reactor core. In order to verify the employed pin power reconstruction model, the PWR MOX/UO_2 core transient benchmark problem was analyzed with the coupled code system SKETCH-INS/TRACE incorporating the model, and the influence of the pin power reconstruction model was studied. SKETCH-INS pin power distributions for 3 benchmark problems were compared with the PARCS solutions, which were provided by the host organisation of the benchmark. SKETCH-INS results were in good agreement with the PARCS results. The capability of the employed pin power reconstruction model was confirmed through the analysis of benchmark problems. A PWR control rod ejection benchmark problem was analyzed with the coupled code system SKETCH-INS/TRACE incorporating the pin power reconstruction model. The influence of the pin power reconstruction model was studied by comparing with the result of the conventional node-averaged flux model. The results indicate that the pin power reconstruction model has a significant effect on the pin powers during the transient and hence on the fuel enthalpy.

  19. Ray-tracing 3D dust radiative transfer with DART-Ray: code upgrade and public release

    Science.gov (United States)

    Natale, Giovanni; Popescu, Cristina C.; Tuffs, Richard J.; Clarke, Adam J.; Debattista, Victor P.; Fischera, Jörg; Pasetto, Stefano; Rushton, Mark; Thirlwall, Jordan J.

    2017-11-01

    We present an extensively updated version of the purely ray-tracing 3D dust radiation transfer code DART-Ray. The new version includes five major upgrades: 1) a series of optimizations for the ray-angular density and the scattered radiation source function; 2) the implementation of several data and task parallelizations using hybrid MPI+OpenMP schemes; 3) the inclusion of dust self-heating; 4) the ability to produce surface brightness maps for observers within the models in HEALPix format; 5) the possibility to set the expected numerical accuracy already at the start of the calculation. We tested the updated code with benchmark models where the dust self-heating is not negligible. Furthermore, we performed a study of the extent of the source influence volumes, using galaxy models, which are critical in determining the efficiency of the DART-Ray algorithm. The new code is publicly available, documented for both users and developers, and accompanied by several programmes to create input grids for different model geometries and to import the results of N-body and SPH simulations. These programmes can be easily adapted to different input geometries, and for different dust models or stellar emission libraries.

  20. The Effect of Nitrous Oxide Psychosedation on Pantographic Tracings; A preliminary study

    International Nuclear Information System (INIS)

    Fareed, Kamal

    1989-01-01

    The form and reproducibility of pantographic tracings under the influence of relaxant drugs, and in patients with muscle dysfunction and TMJ disorders, tend to emphasize the dominance of neuromuscular factors. The purpose of this study was to demonstrate the effect of nitrous oxide induced psychosedation on the reproducibility of pantographic tracings of border movements of the mandible. This study included four male subjects (with no signs and symptoms of muscular dysfunction or temporomandibular joint problems). Operator-guided border tracings were recorded using the Denar pantograph. Three sets of tracings were recorded: (1) three tracings prior to sedation (Tracing I); (2) one tracing prior to sedation and two after sedation (Tracing II); (3) three tracings after psychosedation (Tracing III). The coincidence of Tracings I, II, and III was statistically analyzed applying the chi-square (X2) analysis. There was a significant difference in the coincidence of tracings between Tracings I and II (X2 = 14.892). There was no significant difference in the coincidence of tracings between Tracings I and III (X2 = 1.338). This suggests that nitrous oxide psychosedation produces a centrally induced relaxation of the musculature, possibly by eliminating extraneous anxiety producing factors. (author)

  1. Developments in the ray-tracing code Zgoubi for 6-D multiturn tracking in FFAG rings

    International Nuclear Information System (INIS)

    Lemuet, F.; Meot, F.

    2005-01-01

    A geometrical method for 3-D modeling of the magnetic field in scaling and non-scaling FFAG magnets has been installed in the ray-tracing code Zgoubi. The method in particular allows a good simulation of transverse non-linearities, of field fall-offs, and of possible merging fields in configurations of neighboring magnets, while using realistic models of magnetic fields. That yields an efficient tool for FFAG lattice design and optimization, and for 6-D tracking studies. It is applied, for illustration, to the simulation of an acceleration cycle in a 150 MeV radial-sector proton FFAG.

  2. A Computer Library for Ray Tracing in Analytical Media

    International Nuclear Information System (INIS)

    Miqueles, Eduardo; Coimbra, Tiago A; Figueiredo, J J S de

    2013-01-01

    Ray tracing is an important tool not only for forward but also for inverse problems in Geophysics, on which most seismic processing steps depend. However, implementing ray tracing codes can be very time consuming. This article presents a computer library to trace rays in 2.5D media composed of a stack of layers. The velocity profile inside each layer is such that the eikonal equation can be solved analytically; therefore, ray tracing within such a profile is fast and accurate. The great advantage of an analytical ray tracing library is the numerical precision of the computed quantities and the fast execution of the implemented codes. Ray tracing programs have existed for a long time, for example the seis package by Červený, which computes rays numerically. Although numerical methods can solve more general problems, analytical ones can be part of a more sophisticated simulation process where the ray tracing time is completely relevant. We demonstrate the feasibility of our codes using numerical examples.
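    The kind of analytic solvability the abstract refers to can be illustrated with the textbook constant-gradient medium v(z) = v0 + g·z, in which Snell's invariant p = sin(θ0)/v0 gives closed-form travel time and offset (t = (1/g)·ln[tan(θ2/2)/tan(θ1/2)], x = (cos θ1 − cos θ2)/(p·g)). This is a generic sketch of that classic result, not code from the library described; the sketch checks the closed form against brute-force numerical integration:

    ```python
    import math

    def analytic_ray(v0: float, g: float, theta0: float, z: float):
        """Closed-form travel time and horizontal offset from depth 0 to z
        in v(z) = v0 + g*z, for a ray leaving at angle theta0 from vertical."""
        p = math.sin(theta0) / v0               # Snell's ray parameter, invariant
        th1 = theta0
        th2 = math.asin(p * (v0 + g * z))       # ray angle at depth z
        t = (1.0 / g) * math.log(math.tan(th2 / 2) / math.tan(th1 / 2))
        x = (math.cos(th1) - math.cos(th2)) / (p * g)
        return t, x

    def numeric_ray(v0: float, g: float, theta0: float, z: float, n: int = 20000):
        """Midpoint-rule integration of dt = dz/(v cos th), dx = tan(th) dz."""
        p = math.sin(theta0) / v0
        t = x = 0.0
        dz = z / n
        for i in range(n):
            v = v0 + g * (i + 0.5) * dz
            c = math.sqrt(1.0 - (p * v) ** 2)   # cos(theta) at this depth
            t += dz / (v * c)
            x += dz * (p * v) / c               # tan(theta) * dz
        return t, x
    ```

    The analytic version needs two trig evaluations regardless of path length, which is the speed/precision advantage the abstract claims for analytically solvable layer profiles.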

  3. TraceContract

    Science.gov (United States)

    Havelund, Klaus; Barringer, Howard

    2012-01-01

    TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events, and can, for example, be generated by a running program, instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics, including Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
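    The data-parameterized state-machine idea behind such trace monitors can be sketched in a few lines. This is a toy Python illustration, not the TraceContract API (which is a Scala DSL with a different, richer interface): events are plain dicts, the specification here is "every opened file is eventually closed, and only opened files are closed", and violations are collected for reactions to act on.

    ```python
    class Monitor:
        """Toy trace monitor: checks an open/close discipline over a stream
        of events and records specification violations."""

        def __init__(self):
            self.open_files = set()   # state, parameterized by event data
            self.violations = []

        def submit(self, event: dict) -> None:
            """Feed one event into the state machine."""
            kind, f = event["kind"], event["file"]
            if kind == "open":
                self.open_files.add(f)
            elif kind == "close":
                if f not in self.open_files:
                    self.violations.append(f"close without open: {f}")
                else:
                    self.open_files.discard(f)

        def end(self) -> list:
            """End of trace: anything still open violates the liveness rule."""
            for f in sorted(self.open_files):
                self.violations.append(f"never closed: {f}")
            return self.violations
    ```

    A real DSL adds temporal-logic operators and pattern matching on event data; the point here is only that the monitor's state is parameterized by data carried in the events themselves.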

  4. Convergence of macrostates under reproducible processes

    International Nuclear Information System (INIS)

    Rau, Jochen

    2010-01-01

    I show that whenever a system undergoes a reproducible macroscopic process the mutual distinguishability of macrostates, as measured by their relative entropy, diminishes. This extends the second law which regards only ordinary entropies, and hence only the distinguishability between macrostates and one specific reference state (equidistribution). The new result holds regardless of whether the process is linear or nonlinear. Its proof hinges on the monotonicity of quantum relative entropy under arbitrary coarse grainings, even those that cannot be represented by trace-preserving completely positive maps.
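    The monotonicity property the abstract invokes is the standard data-processing inequality for quantum relative entropy; stated in symbols (a textbook formulation, not quoted from the paper):

    ```latex
    S(\rho \,\|\, \sigma) = \operatorname{Tr}\,\rho\,(\ln\rho - \ln\sigma),
    \qquad
    S\!\left(\Phi(\rho)\,\|\,\Phi(\sigma)\right) \;\le\; S(\rho \,\|\, \sigma)
    ```

    for any coarse graining \(\Phi\), in particular any trace-preserving completely positive map. Since the relative entropy of two macrostates can only decrease under such maps, their mutual distinguishability diminishes along any reproducible process, which is the extension of the second law the abstract describes.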

  5. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, is an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow resulting from the distributed peer code review system, was high (0.46).

  6. Reproducing Epidemiologic Research and Ensuring Transparency.

    Science.gov (United States)

    Coughlin, Steven S

    2017-08-15

    Measures for ensuring that epidemiologic studies are reproducible include making data sets and software available to other researchers so they can verify published findings, conduct alternative analyses of the data, and check for statistical errors or programming errors. Recent developments related to the reproducibility and transparency of epidemiologic studies include the creation of a global platform for sharing data from clinical trials and the anticipated future extension of the global platform to non-clinical trial data. Government agencies and departments such as the US Department of Veterans Affairs Cooperative Studies Program have also enhanced their data repositories and data sharing resources. The Institute of Medicine and the International Committee of Medical Journal Editors released guidance on sharing clinical trial data. The US National Institutes of Health has updated their data-sharing policies. In this issue of the Journal, Shepherd et al. (Am J Epidemiol. 2017;186:387-392) outline a pragmatic approach for reproducible research with sensitive data for studies for which data cannot be shared because of legal or ethical restrictions. Their proposed quasi-reproducible approach facilitates the dissemination of statistical methods and codes to independent researchers. Both reproducibility and quasi-reproducibility can increase transparency for critical evaluation, further dissemination of study methods, and expedite the exchange of ideas among researchers. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    Science.gov (United States)

    Springate, David A; Kontopantelis, Evangelos; Ashcroft, Darren M; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David

    2014-01-01

    Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.

  8. A model of polarized-beam AGS in the ray-tracing code Zgoubi

    Energy Technology Data Exchange (ETDEWEB)

    Meot, F. [Brookhaven National Lab. (BNL), Upton, NY (United States); Ahrens, L. [Brookhaven National Lab. (BNL), Upton, NY (United States); Brown, K. [Brookhaven National Lab. (BNL), Upton, NY (United States); Dutheil, Y. [Brookhaven National Lab. (BNL), Upton, NY (United States); Glenn, J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Huang, H. [Brookhaven National Lab. (BNL), Upton, NY (United States); Roser, T. [Brookhaven National Lab. (BNL), Upton, NY (United States); Shoefer, V. [Brookhaven National Lab. (BNL), Upton, NY (United States); Tsoupas, N. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-07-12

    A model of the Alternating Gradient Synchrotron, based on the AGS snapramps, has been developed in the stepwise ray-tracing code Zgoubi. It has been used over the past 5 years in a number of accelerator studies aimed at enhancing RHIC proton beam polarization. It is also used to study and optimize proton and Helion beam polarization in view of future RHIC and eRHIC programs. The AGS model in Zgoubi is operational on-line via three different applications, ’ZgoubiFromSnaprampCmd’, ’AgsZgoubiModel’ and ’AgsModelViewer’, with the latter two essentially interfaces to the former, which is the actual model ’engine’. All three commands are available from the controls system application launcher in the AGS ’StartUp’ menu, or from eponymous commands on shell terminals. The main aspects of the model and of its operation are presented in this technical note; brief excerpts from various studies performed so far are given for illustration, and the means and methods entering ZgoubiFromSnaprampCmd are developed further in the appendix.

  9. Analyses of SBO sequence of VVER1000 reactor using TRACE and MELCOR codes

    International Nuclear Information System (INIS)

    Mazzini, Guido; Kyncl, Milos; Miglierini, Bruno; Kopecek, Vit

    2015-01-01

    In response to the Fukushima accident, the European Commission ordered stress tests to be performed at all European Nuclear Power Plants (NPPs). Due to shortage of time, a number of conclusions in the national stress tests reports were based on engineering judgment only. In the Czech Republic, as a follow-up, a consortium of research organizations and universities has decided to simulate selected stress-test scenarios, in particular Station Black-Out (SBO) and Loss of Ultimate Sink (LoUS), with the aim of verifying the conclusions made in the national stress report and analysing the time response of the respective source term releases. These activities are carried out in the framework of the project 'Prevention, preparedness and mitigation of consequences of Severe Accident (SA) at Czech NPPs in relation to lessons learned from stress tests after Fukushima', financed by the Ministry of Interior. The Research Centre Rez has been working on the preparation of a MELCOR model of a VVER1000 NPP, starting with a nodalization of the plant systems. The basic idea of this paper is to benchmark the MELCOR model against the validated TRACE model, first comparing the steady state and then continuing with a long-term SBO plus a further event up to the beginning of the severe accident. The presented work focuses mainly on the preliminary comparison of the thermo-hydraulics of the two models created with the MELCOR and TRACE codes. After that, preliminary general results of the SA progression showing the hydrogen production and the relocation phenomena are briefly discussed. The scenario is considered closed a few seconds after the break of the lower head. (author)

  10. Software trace cache

    OpenAIRE

    Ramírez Bellido, Alejandro; Larriba Pey, Josep; Valero Cortés, Mateo

    2005-01-01

    We explore the use of compiler optimizations that improve the layout of instructions in memory. The target is to enable the code to make better use of the underlying hardware resources, regardless of the specific details of the processor/architecture, in order to increase fetch performance. The Software Trace Cache (STC) is a code layout algorithm with a broader target than previous layout optimizations. We target not only an improvement in the instruction cache hit rate, but also an increas...
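The chaining idea behind profile-guided code layout can be sketched with a small greedy pass: follow the hottest not-yet-placed successor of each block so that frequently executed paths become straight-line, cache-friendly code. This is a hedged toy sketch in the spirit of the STC, not the actual algorithm from the paper; the function and the example CFG are hypothetical.

```python
def trace_layout(succs, weight, entry):
    """Greedy hot-path chaining in the spirit of the Software Trace Cache.

    succs:  dict mapping each basic block to its successor blocks
    weight: dict mapping (block, successor) edges to profile counts
    entry:  the function entry block
    """
    blocks = set(succs) | {s for ss in succs.values() for s in ss}
    placed, order = set(), []
    seed = entry
    while len(order) < len(blocks):
        b = seed
        while b is not None and b not in placed:    # grow one trace
            order.append(b)
            placed.add(b)
            unplaced = [s for s in succs.get(b, []) if s not in placed]
            # fall through to the hottest not-yet-placed successor
            b = max(unplaced, key=lambda s: weight.get((b, s), 0)) if unplaced else None
        rest = blocks - placed
        # start the next trace at the unplaced block with the most incoming weight
        seed = max(rest, key=lambda x: sum(weight.get((p, x), 0) for p in succs)) if rest else None
    return order

# toy CFG: A branches to B (hot) and C (cold); both rejoin at D
succs = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
counts = {("A", "B"): 90, ("A", "C"): 10, ("B", "D"): 90, ("C", "D"): 10}
order = trace_layout(succs, counts, "A")   # hot path A-B-D laid out contiguously
```

On this toy graph the hot path A, B, D is emitted as one contiguous chain, with the cold block C pushed to the end of the layout.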

  11. Traces et espaces de consommation

    Directory of Open Access Journals (Sweden)

    Franck Cochoy

    2016-10-01

    Full Text Available The advent of mobile digital technologies is contributing to an evolution in the modalities of distribution and consumption. This article examines the use of QR codes, the two-dimensional barcodes that give any user equipped with a smartphone access to online commercial content. They are part of the Internet of Things and thus of the coupling between physical space and the digital world. They also allow the collection of digital traces that are meaningful for professionals, but also for the social sciences. Through these traces, one can understand the new market ties woven between physical space and the development of continuous information flows. Based on the analysis of the traces recorded during visits to the QR codes affixed to three food products (a box of salt, a chocolate bar, a bottle of water), our study seeks to make explicit the theoretical, methodological and analytical stakes of the process of digitizing the commercial space of physical mobility.

  12. Adaptation and application of the TRACE code for the analysis of transients in lead-cooled fast reactor designs

    Energy Technology Data Exchange (ETDEWEB)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2015-07-01

    The Lead-Cooled Fast Reactor (LFR) has been identified as one of the promising future reactor concepts in the technology roadmap of the Generation IV International Forum (GIF) as well as in the Deployment Strategy of the European Sustainable Nuclear Industrial Initiative (ESNII), both aiming at improved sustainability, enhanced safety, economic competitiveness, and proliferation resistance. This new nuclear reactor concept requires the development of computational tools to be applied in design and safety assessments to confirm the improved inherent and passive safety features of this design. One approach to this issue is to modify the current computational codes developed for the simulation of Light Water Reactors towards their applicability to the new designs. This paper reports on the performed modifications of the TRACE system code to make it applicable to LFR safety assessments. The capabilities of the modified code are demonstrated on a series of benchmark exercises performed against other safety analysis codes. (Author)

  13. Translation of the Cofrentes NPP plant model from the TRAC-BF1 code to SNAP-TRACE

    International Nuclear Information System (INIS)

    Escriva, A.; Munuz-Cobo, J. L.; Concejal, A.; Melara, J.; Albendea, M.

    2012-01-01

    The aim is to develop a three-dimensional model of the Cofrentes NPP whose results are consistent when compared with those of the programs in current use (TRAC-BF1, RETRAN), which are validated with plant data. This comparison must be done globally, so that no compensation of errors can take place. To check the correctness of the translation, the results obtained with TRACE have been compared with those of the programs currently in use and the relevant adjustments have been made, taking into account that both the correlations and the models differ between the codes. During the completion of this work several errors were detected that must be corrected in future versions of these tools.

  14. Reply to comment by Añel on "Most computational hydrology is not reproducible, so is it really science?"

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-03-01

    In this article, we reply to a comment made on our previous commentary regarding reproducibility in computational hydrology. Software licensing and version control of code are important technical aspects of making the code and workflows of scientific experiments open and reproducible. However, in our view, it is the cultural change that is the greatest challenge to overcome in achieving reproducible scientific research in computational hydrology. We believe that once the culture and attitude among hydrological scientists change, the details will evolve to cover more (technical) aspects over time.

  15. Properties of galaxies reproduced by a hydrodynamic simulation

    Science.gov (United States)

    Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Sijacki, D.; Xu, D.; Snyder, G.; Bird, S.; Nelson, D.; Hernquist, L.

    2014-05-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the `cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the `metal' and hydrogen content of galaxies on small scales.

  16. Intraoral gothic arch tracing.

    Science.gov (United States)

    Rubel, Barry; Hill, Edward E

    2011-01-01

    In order to create optimum esthetics, function and phonetics in complete denture fabrication, it is necessary to record accurate maxillo-mandibular determinants of occlusion. This requires clinical skill to establish an accurate, verifiable and reproducible vertical dimension of occlusion (VDO) and centric relation (CR). Correct vertical relation depends upon a consideration of several factors, including muscle tone, inter-dental arch space and parallelism of the ridges. Any errors made while taking maxillo-mandibular jaw relation records will result in dentures that are uncomfortable and, possibly, unwearable. The application of a tracing mechanism such as the Gothic arch tracer (a central bearing device) is a demonstrable method of determining centric relation. Intraoral Gothic arch tracers provide the advantage of capturing VDO and CR in an easy-to-use technique for practitioners. Intraoral tracing (Gothic arch tracing) is a preferred method of obtaining consistent positions of the mandible in motion (retrusive, protrusive and lateral) at a comfortable VDO.

  17. BETHSY 9.1b Test Calculation with TRACE Using 3D Vessel Component

    International Nuclear Information System (INIS)

    Berar, O.; Prosek, A.

    2012-01-01

    Recently, several advanced multidimensional computational tools for simulating reactor system behaviour during real and hypothetical transient scenarios have been developed. One such advanced, best-estimate reactor systems code is the TRAC/RELAP Advanced Computational Engine (TRACE), developed by the U.S. Nuclear Regulatory Commission. TRACE comes with a graphical user interface called SNAP (Symbolic Nuclear Analysis Package), intended for pre- and post-processing, running codes, RELAP5-to-TRACE input deck conversion, input deck database generation, etc. The TRACE code is still under development and will eventually provide all the capabilities of RELAP5. The purpose of the present study was therefore to assess the 3D capability of TRACE on the BETHSY 9.1b test. The TRACE input deck was semi-converted (using SNAP and manual corrections) from the RELAP5 input deck. The 3D fluid dynamics within the reactor vessel was modelled and compared to 1D fluid dynamics. The 3D calculation was compared both to the TRACE 1D calculation and to the RELAP5 calculation; the geometry used in TRACE is basically the same, which gives a very good basis for the comparison of the codes. The only exception is the 3D reactor vessel model in the case of the TRACE 3D calculation. TRACE V5.0 Patch 1 and RELAP5/MOD3.3 Patch 4 were used for the calculations. The BETHSY 9.1b test (International Standard Problem no. 27, or ISP-27) was a 5.08 cm equivalent diameter cold leg break without high pressure safety injection and with delayed ultimate procedure. The BETHSY facility was a 3-loop replica of a 900 MWe FRAMATOME pressurized water reactor. For better presentation of the calculated physical phenomena and processes, an animation model was developed using SNAP. In general, the TRACE 3D calculation is in good agreement with the BETHSY 9.1b test. The TRACE 3D calculation results are as good as or better than the RELAP5 results, and the TRACE 3D calculation is not significantly different from the TRACE 1D calculation.

  18. The trace ion module for the Monte Carlo code Eirene, a unified approach to plasma chemistry in the ITER divertor

    International Nuclear Information System (INIS)

    Seebacher, J.; Reiter, D.; Borner, P.

    2007-01-01

    Modelling of kinetic transport effects in magnetic fusion devices is of great importance for understanding the physical processes in both the core and the scrape-off layer (SOL) plasma. For SOL simulation the EIRENE code is a well-established tool for modelling of neutral particle, impurity and radiation transport. Recently a new trace ion transport module (TIM) has been developed and incorporated into EIRENE. The TIM essentially consists of two parts: 1) a trajectory integrator tracing the deterministic motion of a guiding-centre particle in general 3D electric and magnetic fields; 2) a stochastic representation of the Fokker-Planck collision operator in suitable guiding-centre coordinates, treating Coulomb collisions with the background plasma species. The TIM enables integrated SOL simulation packages such as B2-EIRENE, EDGE2D-EIRENE (2D) or EMC3-EIRENE (3D) to treat the physical and chemical processes near the divertor targets and in the bulk of the SOL in greater detail than before, and in particular on a kinetic rather than a fluid level. One of the physics applications is the formation and transport of hydrocarbon molecules and ions in the tokamak divertor, where tritium co-deposition via hydrocarbons remains a serious issue for next-generation fusion devices like ITER. Realistic tokamak modelling scenarios will be discussed with the code packages B2-EIRENE (2D) and EMC3-EIRENE (3D). A brief overview of the theoretical basis of the TIM will be given, including code verification studies of its basic physics properties. Applications to hydrocarbon transport studies in TEXTOR and ITER, comparing present (fluid) approximations in edge modelling with the new extended kinetic model, will be presented. (Author)
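A stochastic representation of a Fokker-Planck collision operator means replacing the diffusion equation by an equivalent random walk of test particles. The toy sketch below does this for the simplest case, Lorentz pitch-angle scattering, with an Euler-Maruyama step; it is an illustration of the general technique, not the actual EIRENE/TIM implementation, and all parameters are made-up toy values.

```python
import math
import random

def pitch_angle_step(mu, nu, dt, rng):
    """One Euler-Maruyama step of Lorentz pitch-angle scattering,
    d(mu) = -nu*mu*dt + sqrt(nu*(1 - mu^2)*dt) * N(0, 1),
    where mu = v_parallel / v. This stochastic process has the
    pitch-angle diffusion equation as its Fokker-Planck equation."""
    kick = math.sqrt(max(0.0, nu * (1.0 - mu * mu) * dt)) * rng.gauss(0.0, 1.0)
    return max(-1.0, min(1.0, mu - nu * mu * dt + kick))

rng = random.Random(0)
nu, dt, steps = 1.0, 0.01, 200      # collision frequency and time step (toy units)
mus = [0.99] * 5000                 # beam-like initial pitch distribution
for _ in range(steps):
    mus = [pitch_angle_step(m, nu, dt, rng) for m in mus]
mean_mu = sum(mus) / len(mus)       # decays roughly like 0.99 * exp(-nu * t)
```

The ensemble average of mu relaxes toward zero at rate nu, i.e. the beam isotropizes, which is exactly the behaviour the analytic operator prescribes.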

  19. Analysis of Uncertainty and Sensitivity with TRACE-SUSA and TRACE-DAKOTA. Application to NUPEC BFBT

    Energy Technology Data Exchange (ETDEWEB)

    Montero-Mayorga, J.; Wadim, J.; Sanchez, V. H.

    2012-07-01

    The aim of this work is to test the capabilities of the new uncertainty tool incorporated into SNAP by simulating experiments with the TRACE code and comparing the results with those obtained when the uncertainty calculation for the same simulations is performed with the SUSA tool.
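Tools such as SUSA and DAKOTA typically propagate input uncertainties by Monte Carlo sampling of the code inputs and then apply order-statistics (Wilks) tolerance limits to the outputs. The following is a minimal sketch of that general GRS-type workflow, with a toy formula standing in for a TRACE run; the model and the input ranges are assumptions for illustration, not values from the paper.

```python
import random

def wilks_sample_size(gamma=0.95, beta=0.95):
    """Smallest N with 1 - gamma**N >= beta: the first-order, one-sided
    Wilks sample size commonly used in nuclear uncertainty analyses."""
    n = 1
    while 1.0 - gamma ** n < beta:
        n += 1
    return n

def model(k, q):
    """Toy stand-in for a code run returning a peak figure of merit."""
    return q / k

rng = random.Random(42)
n_runs = wilks_sample_size()             # 59 runs for a 95%/95% one-sided limit
runs = [model(rng.uniform(0.9, 1.1),     # assumed uncertain input ranges
              rng.uniform(95.0, 105.0))
        for _ in range(n_runs)]
upper_limit = max(runs)                  # one-sided 95%/95% tolerance limit
```

With 59 runs, the sample maximum bounds the 95th percentile of the output with 95% confidence, which is why 59 is the canonical run count in such studies.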

  20. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist Hospital, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one, which were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into other data-processing programs was possible. This program had the merits of simple operation, accurate and detailed coding, and easy adjustment for other programs. Therefore, this program can be used for automation of routine work in the department of radiology.

  1. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, ie. reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scien...

  2. Advanced methodology to simulate boiling water reactor transient using coupled thermal-hydraulic/neutron-kinetic codes

    Energy Technology Data Exchange (ETDEWEB)

    Hartmann, Christoph Oliver

    2016-06-13

    -sets) predicted by SCALE6/TRITON and CASMO. The coupled TRACE/PARCS simulations thereby reproduced the single fuel assembly depletion and stand-alone PARCS results. A turbine trip event that occurred at a BWR plant of type 72 has been investigated in detail using the cross-section libraries generated with SCALE/TRITON and CASMO. The evolution of the integral BWR parameters predicted by the coupled codes using cross-sections from SCALE/TRITON is very close to the global trends calculated using CASMO cross-sections. Further, to implement uncertainty quantification, the PARCS reactor dynamics code was extended (uncertainty module) to facilitate the consideration of the uncertainty of neutron kinetics parameters in coupled TRACE/PARCS simulations. For a postulated pressure perturbation, an uncertainty and sensitivity study was performed using TRACE/PARCS and SUSA. The obtained results illustrate the capability of such methodologies, which are still under development. Based on this analysis, the uncertainty bands for key parameters, e.g. reactivity, as well as the importance ranking of reactor kinetics parameters, could be predicted and identified for this accident scenario.

  3. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators; (2) shared computational resources; (3) declarative model descriptors, ontologies and standardized annotations; (4) model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  4. Using Docker Containers to Extend Reproducibility Architecture for the NASA Earth Exchange (NEX)

    Science.gov (United States)

    Votava, Petr; Michaelis, Andrew; Spaulding, Ryan; Becker, Jeffrey C.

    2016-01-01

    NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data where a focused community can come together to address large-scale challenges in Earth sciences. As NEX has been growing into a petabyte-size platform for analysis, experiments and data production, it has been increasingly important to enable users to easily retrace their steps, identify what datasets were produced by which process chains, and give them the ability to readily reproduce their results. This can be a tedious and difficult task even for a small project, but is almost impossible on large processing pipelines. We have developed an initial reproducibility and knowledge capture solution for the NEX; however, if users want to move the code to another system, whether it is their home institution's cluster, a laptop or the cloud, they have to find, build and install all the required dependencies needed to run their code. This can be a very tedious and tricky process and is a big impediment to moving code to data and to reproducibility outside the original system. The NEX team has tried to assist users who wanted to move their code into OpenNEX on the Amazon cloud by creating custom virtual machines with all the software and dependencies installed, but this, while solving some of the issues, creates a new bottleneck that requires the NEX team to be involved with any new request, updates to virtual machines and general maintenance support. In this presentation, we will describe a solution that integrates NEX and Docker to bridge the gap in code-to-data migration. The core of the solution is semi-automatic conversion of science codes, tools and services that are already tracked and described in the NEX provenance system to Docker, an open-source Linux container software. Docker is available on most computer platforms, easy to install and capable of seamlessly creating and/or executing any application packaged in the appropriate format.
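A container recipe for this kind of code-to-data packaging might look like the following minimal Dockerfile sketch; the base image, script name and requirements file are hypothetical placeholders, not part of the actual NEX/OpenNEX tooling.

```dockerfile
# Hypothetical sketch: freeze an analysis and its dependencies into one image
FROM python:3.10-slim
WORKDIR /app

# install pinned dependencies first, so this layer is cached across code edits
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# add the analysis code itself
COPY analysis.py .

# default command: run the analysis end-to-end
CMD ["python", "analysis.py"]
```

Built once with `docker build -t my-analysis .`, the same image can then be executed unchanged on a laptop, a cluster node or the cloud with `docker run my-analysis`, which is the portability gap the abstract describes.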

  5. About the use of the Monte Carlo code based tracing algorithm and the volume fraction method for Sn full core calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gurevich, M. I.; Oleynik, D. S. [RRC Kurchatov Inst., Kurchatov Sq., 1, 123182, Moscow (Russian Federation); Russkov, A. A.; Voloschenko, A. M. [Keldysh Inst. of Applied Mathematics, Miusskaya Sq., 4, 125047, Moscow (Russian Federation)

    2006-07-01

    The tracing algorithm that is implemented in the geometrical module of the Monte Carlo transport code MCU is applied to calculate the volume fractions of the original materials in the spatial cells of a mesh that overlays the problem geometry. In this way the 3D combinatorial-geometry presentation of the problem geometry used by the MCU code is transformed into user-defined 2D or 3D bit-mapped ones. Next, these data are used in the volume fraction (VF) method to approximate the problem geometry by introducing additional mixtures for spatial cells where a few original materials are included. We have found that in solving realistic 2D and 3D core problems a sufficiently fast convergence of the VF method takes place as the spatial mesh is refined. The proposed variant of implementation of the VF method thus appears to be a suitable geometry interface between Monte Carlo and Sn transport codes. (authors)
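The volume-fraction idea can be sketched as follows: sample each spatial cell on a fine sub-grid, classify every sample point by material, and use the hit fractions to define a mixture for that cell. The sketch below uses simple midpoint sampling as a toy stand-in for the MCU tracing algorithm; all names and the example geometry are illustrative.

```python
import math
from collections import Counter

def volume_fractions(material_at, x0, y0, dx, dy, sub=10):
    """Estimate material volume fractions in one mesh cell by midpoint
    sampling on a sub x sub grid: a toy rasterization of combinatorial
    geometry into per-cell mixtures."""
    counts = Counter()
    for i in range(sub):
        for j in range(sub):
            x = x0 + (i + 0.5) * dx / sub
            y = y0 + (j + 0.5) * dy / sub
            counts[material_at(x, y)] += 1
    total = sub * sub
    return {m: c / total for m, c in counts.items()}

# example: a fuel disc of radius 0.4 centred in a unit cell of moderator
def material_at(x, y):
    return "fuel" if math.hypot(x - 0.5, y - 0.5) < 0.4 else "moderator"

fracs = volume_fractions(material_at, 0.0, 0.0, 1.0, 1.0, sub=50)
# fracs["fuel"] approximates the disc area pi * 0.4**2 ~= 0.503
```

Refining `sub` plays the role of refining the spatial mesh in the abstract: the mixture fractions converge to the exact material volumes.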

  6. Comparative evaluation of trace elements in blood

    International Nuclear Information System (INIS)

    Goeij, J.J.M. de; Tjioe, P.S.; Pries, C.; Zwiers, J.H.L.

    1976-01-01

    The Interuniversitair Reactor Instituut and the Centraal Laboratorium TNO have carried out a joint investigation of neutron activation analysis procedures for the determination of trace elements in blood. A comparative evaluation of five methods, destructive as well as non-destructive, is given. The sensitivity and reproducibility of the procedures are discussed. By combining some of the methods it is possible, starting from 1 ml of blood, to give quantitative information on 14 important trace elements: antimony, arsenic, bromine, cadmium, cobalt, gold, copper, mercury, molybdenum, nickel, rubidium, selenium, iron and zinc. The methods have also been applied to sodium, chromium and potassium.

  7. TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES

    Energy Technology Data Exchange (ETDEWEB)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver, E-mail: jasmina@physics.ucf.edu [Planetary Sciences Group, Department of Physics, University of Central Florida, Orlando, FL 32816-2385 (United States)

    2016-07-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
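The Gibbs-minimization idea behind such equilibrium codes can be illustrated on a toy two-species system. The sketch below minimizes the ideal-gas Gibbs energy along a single reaction coordinate, with element conservation built in by construction (the role TEA's Lagrangian constraints play in general); the standard-potential values are made up for illustration, and a simple ternary search replaces TEA's iterative Lagrangian scheme.

```python
import math

# made-up dimensionless standard chemical potentials (mu0/RT) for a toy
# H2 <-> 2H system at some fixed temperature and pressure
G0 = {"H2": -20.0, "H": -5.0}

def gibbs(xi):
    """Total Gibbs energy (in RT units) along the reaction H2 -> 2H,
    parameterized by the extent xi; element conservation
    (2*n_H2 + n_H = 2) holds by construction."""
    n = {"H2": 1.0 - xi, "H": 2.0 * xi}
    ntot = sum(n.values())
    return sum(ni * (G0[s] + math.log(ni / ntot))
               for s, ni in n.items() if ni > 0.0)

# G is convex along the reaction coordinate, so a ternary search
# converges to the equilibrium extent
lo, hi = 1e-9, 1.0 - 1e-9
for _ in range(200):
    m1 = lo + (hi - lo) / 3.0
    m2 = hi - (hi - lo) / 3.0
    if gibbs(m1) < gibbs(m2):
        hi = m2
    else:
        lo = m1
xi_eq = 0.5 * (lo + hi)   # equilibrium strongly favours H2 for these G0 values
```

At the minimum the chemical potentials satisfy mu_H2 = 2*mu_H, the familiar mass-action condition, so the minimizer reproduces the equilibrium constant without ever writing it down explicitly.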

  8. TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES

    International Nuclear Information System (INIS)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver

    2016-01-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.

  9. LUCID - an optical design and raytrace code

    International Nuclear Information System (INIS)

    Nicholas, D.J.; Duffey, K.P.

    1980-11-01

    A 2D optical design and ray trace code is described. The code can operate either as a geometric optics propagation code or provide a scalar diffraction treatment. There are numerous non-standard options within the code, including design and systems optimisation procedures. A number of illustrative problems relating to the design of optical components for high-power lasers are included. (author)


  10. BWR transient analysis using neutronic / thermal hydraulic coupled codes including uncertainty quantification

    International Nuclear Information System (INIS)

    Hartmann, C.; Sanchez, V.; Tietsch, W.; Stieglitz, R.

    2012-01-01

    KIT, in cooperation with industrial partners, is involved in the development and qualification of best-estimate methodologies for BWR transient analysis. The goal is to establish advanced thermal-hydraulic system codes coupled with 3D reactor dynamics codes in order to perform a more realistic evaluation of BWR behavior under accident conditions. For this purpose, a computational chain based on the lattice code SCALE6/GenPMAXS, the coupled neutronic/thermal-hydraulic code TRACE/PARCS, and a Monte Carlo based uncertainty and sensitivity package (SUSA) has been established and applied to different kinds of Boiling Water Reactor (BWR) transients. This paper describes the multidimensional plant models elaborated for TRACE and PARCS to perform these investigations. For the uncertainty quantification of the coupled code TRACE/PARCS, and specifically to account for the influence of the kinetics parameters in such studies, the PARCS code has been extended to facilitate the change of model parameters, so that the SUSA package can be used with TRACE/PARCS for the uncertainty and sensitivity studies. This approach is presented in detail. The results obtained for a rod drop transient with TRACE/PARCS using the SUSA methodology clearly showed the importance of some kinetic parameters for the transient progression, demonstrating that coupling best-estimate codes with uncertainty and sensitivity tools is very promising and of great importance for the safety assessment of nuclear reactors. (authors)
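The SUSA-style Monte Carlo idea (sample the uncertain input parameters, rerun the model, take order statistics of the output) can be sketched with a toy surrogate standing in for TRACE/PARCS. The prompt-jump formula, the parameter distribution, and all numbers below are illustrative assumptions, not the paper's model.

```python
import random
import statistics

random.seed(42)

def peak_power(beta, rho):
    # Toy surrogate for the transient peak power (not the TRACE/PARCS model):
    # prompt-jump approximation P/P0 = beta / (beta - rho), valid for rho < beta
    return beta / (beta - rho)

# Uncertain kinetic parameter: delayed-neutron fraction beta (hypothetical pdf)
N = 59  # Wilks' formula: 59 runs give a one-sided 95%/95% tolerance bound
rho = 0.0045  # inserted reactivity, held fixed in this sketch
samples = [peak_power(random.gauss(0.0065, 0.0002), rho) for _ in range(N)]
upper_bound = max(samples)  # first-order 95/95 upper tolerance limit per Wilks
mean_peak = statistics.mean(samples)
```

The same sampling loop, with the toy surrogate replaced by a full coupled-code run per sample, is the essence of a SUSA-type uncertainty study.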

  11. Development Status of TRACE model for PGSFR Safety Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Andong; Choi, Yong Won; Kim, Jihun; Bae, Moohoon [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-05-15

    In preparation for the review of the licensing application for the PGSFR, a TRACE model of the PGSFR is being developed, taking into account the sodium-related properties and models in the code. For licensing use, the model uncertainty in the code and the conservative conditions for accident analysis still need to be defined and validated. In addition, the current simulations are applicable only to assembly-averaged assessment, so a pin-wise assessment within the hot assembly also needs to be defined. On the basis of the developed model, PGSFR design changes will be incorporated and the model improved for independent audit calculations supporting the upcoming licensing review. The Prototype Generation IV Sodium-cooled Fast Reactor (PGSFR), rated at 150 MWe, is under development with a target licensing application by 2017. KINS is preparing the review of its licensing application; in particular, the audit calculation tool for transient and accident analysis is being prepared. Since 2012, a TRACE code applicability study has been under way for the sodium-cooled fast reactor. First, the sodium properties and the related heat transfer models in the code were reviewed. The Demonstration Sodium-cooled Fast Reactor (DSFR-600) was modeled and representative DBAs were assessed until the PGSFR design was fixed. The EBR-II Shutdown Heat Removal Test (SHRT) experiment is also being analyzed within an IAEA Coordinated Research Program. In this paper, the status of the PGSFR TRACE model and considerations for SFR DBA assessment are introduced.

  12. A how to guide to reproducible research

    OpenAIRE

    Whitaker, Kirstie

    2018-01-01

    This talk will discuss the perceived and actual barriers experienced by researchers attempting to do reproducible research, and give practical guidance on how they can be overcome. It will include suggestions on how to make your code and data available and usable for others (including a strong suggestion to document both clearly so you don't have to reply to lots of email questions from future users). Specifically it will include a brief guide to version control, collaboration and disseminati...

  13. Study of the source term of radiation of the CDTN GE-PET trace 8 cyclotron with the MCNPX code

    Energy Technology Data Exchange (ETDEWEB)

    Benavente C, J. A.; Lacerda, M. A. S.; Fonseca, T. C. F.; Da Silva, T. A. [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Av. Pte. Antonio Carlos 6627, 31270-901 Belo Horizonte, Minas Gerais (Brazil); Vega C, H. R., E-mail: jhonnybenavente@gmail.com [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas, Zac. (Mexico)

    2015-10-15

    The knowledge of the neutron spectra in a PET cyclotron is important for the optimization of the radiation protection of workers and members of the public. The main objective of this work is to study the radiation source term of the GE-PET trace 8 cyclotron of the Development Center of Nuclear Technology (CDTN/CNEN) using computer simulation by the Monte Carlo method. The MCNPX version 2.7 code was used to calculate the flux of neutrons produced from the interaction of the primary proton beam with the target body and other cyclotron components during 18F production. The source term and the corresponding radiation field were estimated for the bombardment of a H{sub 2}{sup 18}O target with a 75 μA, 16.5 MeV proton beam. The simulated fluxes were compared with those reported by the accelerator manufacturer (GE Healthcare). Results showed that the fluxes estimated with the MCNPX code were about 70% lower than those reported by the manufacturer. The mean energies of the neutrons also differed from those reported by GE Healthcare. It is recommended to investigate other cross-section data and the use of the physical models of the code itself for a complete characterization of the radiation source term. (Author)

  14. Related research with thermo hydraulics safety by means of Trace code

    International Nuclear Information System (INIS)

    Chaparro V, F. J.; Del Valle G, E.; Rodriguez H, A.; Gomez T, A. M.; Sanchez E, V. H.; Jager, W.

    2014-10-01

    This article presents the results of modeling, with the TRACE code, the pressure vessel of a BWR/5 similar to that of the Laguna Verde NPP. A thermal-hydraulic Vessel component capable of simulating the fluid behavior and heat transfer occurring within the reactor vessel was created. The Vessel component consists of a three-dimensional cylinder divided into 19 axial sections, 4 azimuthal sections and two concentric radial rings. The inner ring contains the core and the central part of the reactor, while the outer ring serves as the downcomer. The axial and azimuthal divisions were made so that the dimensions of the internal components and the heights and orientations of the external connections match the reference values of a BWR/5 reactor. The model includes internal components such as fuel assemblies, steam separators, jet pumps and guide tubes, as well as the main external connections such as the steam lines, the feedwater lines and the penetrations of the recirculation system. The model involves significant simplifications, since the aim is to keep symmetry between the azimuthal sections of the vessel: for most internal components a detailed description of the geometry and of the initial values of temperature, pressure, fluid velocity, etc. is lacking, and only the most representative data were considered. Nevertheless, these simulations yield acceptable results for important parameters such as the total flow through the core, the vessel pressure, the void fraction, and the pressure drops across the core and the steam separators. (Author)

  15. Trace explosives sensor testbed (TESTbed)

    Science.gov (United States)

    Collins, Greg E.; Malito, Michael P.; Tamanaha, Cy R.; Hammond, Mark H.; Giordano, Braden C.; Lubrano, Adam L.; Field, Christopher R.; Rogers, Duane A.; Jeffries, Russell A.; Colton, Richard J.; Rose-Pehrsson, Susan L.

    2017-03-01

    A novel vapor delivery testbed, referred to as the Trace Explosives Sensor Testbed, or TESTbed, is demonstrated that is amenable to both high- and low-volatility explosives vapors including nitromethane, nitroglycerine, ethylene glycol dinitrate, triacetone triperoxide, 2,4,6-trinitrotoluene, pentaerythritol tetranitrate, and hexahydro-1,3,5-trinitro-1,3,5-triazine. The TESTbed incorporates a six-port dual-line manifold system allowing for rapid actuation between a dedicated clean air source and a trace explosives vapor source. Explosives and explosives-related vapors can be sourced through a number of means including gas cylinders, permeation tube ovens, dynamic headspace chambers, and a Pneumatically Modulated Liquid Delivery System coupled to a perfluoroalkoxy total-consumption microflow nebulizer. Key features of the TESTbed include continuous and pulseless control of trace vapor concentrations with wide dynamic range of concentration generation, six sampling ports with reproducible vapor profile outputs, limited low-volatility explosives adsorption to the manifold surface, temperature and humidity control of the vapor stream, and a graphical user interface for system operation and testing protocol implementation.

  16. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction.

    Science.gov (United States)

    Watanabe, Eiji; Kitaoka, Akiyoshi; Sakamoto, Kiwako; Yasugi, Masaki; Tanaka, Kenta

    2018-01-01

    The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one of the bases of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.
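The core predictive-coding loop the abstract describes, in which an internal model predicts the sensory input and is refined by the prediction error, can be reduced to a few lines. This is a toy scalar model for illustration only, not PredNet.

```python
import random

random.seed(0)

# Toy predictive-coding loop: the internal model (a single velocity estimate)
# predicts the next position; the prediction error refines the model.
true_velocity = 0.8
lr = 0.1  # learning rate for error-driven updates

model_velocity = 0.0  # internal model starts ignorant
position = 0.0
for step in range(200):
    predicted_next = position + model_velocity                       # prediction
    actual_next = position + true_velocity + random.gauss(0, 0.01)   # sensory input
    error = actual_next - predicted_next                             # prediction error
    model_velocity += lr * error                                     # refine internal model
    position = actual_next
```

After a few hundred observations the internal estimate converges to the true velocity, which is the error-driven refinement the theory posits, stripped of all network machinery.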

  17. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
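The five tracked digital objects enumerated above could be captured in a machine-readable manifest. The sketch below is a hypothetical structure for illustration only, not HydroShare's actual resource metadata schema; all file names and the repository URL are placeholders.

```python
import json

# Hypothetical resource manifest mirroring the five tracked digital objects
# (illustrative structure only, not HydroShare's metadata model).
manifest = {
    "raw_data": ["gauge_discharge.csv", "precip_grid.nc"],
    "processing_scripts": ["clean_gauges.py"],
    "model_inputs": ["forcing.nc", "params.json"],
    "model_results": ["streamflow_sim.csv"],
    "model_code": {
        "repository": "https://example.org/summa-fork",  # placeholder URL
        "dependencies": ["summa==3.0", "netCDF4"],
    },
}

# Serializing the manifest gives a shareable, diff-able provenance record.
serialized = json.dumps(manifest, indent=2, sort_keys=True)
```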

  18. Evaluation of the reproducibility of two techniques used to determine and record centric relation in angle's class I patients

    Directory of Open Access Journals (Sweden)

    Fernanda Paixão

    2007-08-01

    The centric relation is a mandibular position that determines a balanced relation among the temporomandibular joints, the masticatory muscles and the occlusion. This position makes it possible for the dentist to plan and execute oral rehabilitation respecting the physiological principles of the stomatognathic system. The aim of this study was to investigate the reproducibility of centric relation records obtained using two techniques: Dawson's Bilateral Manipulation and Gysi's Gothic Arch Tracing. Twenty volunteers (14 females and 6 males) with no dental loss, presenting occlusal contacts as described in Angle's Class I classification and without signs or symptoms of temporomandibular disorders were selected. All volunteers underwent, five times with a 1-week interval and always at the same time of day, both Dawson's Bilateral Manipulation and Gysi's Gothic Arch Tracing with the aid of an intraoral apparatus. The average standard error of each technique was calculated (Bilateral Manipulation: 0.94; Gothic Arch Tracing: 0.27). The Shapiro-Wilk test was applied, and the results allowed application of Student's t-test (sampling error of 5%). The techniques showed different degrees of variability: Gysi's Gothic Arch Tracing was found to be more accurate than Bilateral Manipulation in reproducing centric relation records.
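The per-technique variability comparison can be sketched as follows, using invented repeated records for a single volunteer; the study's actual values are not reproduced here, and only the relative ordering of the two techniques mirrors its finding.

```python
import math
import statistics

def standard_error(measurements):
    # Standard error of the mean for one subject's repeated records
    return statistics.stdev(measurements) / math.sqrt(len(measurements))

# Hypothetical repeated centric-relation records (mm) for one volunteer,
# five sessions per technique (values invented for illustration).
bilateral_manipulation = [1.9, 3.1, 0.8, 2.6, 1.4]
gothic_arch_tracing = [2.0, 2.2, 1.9, 2.1, 2.0]

se_bm = standard_error(bilateral_manipulation)
se_gat = standard_error(gothic_arch_tracing)
# A smaller standard error indicates a more reproducible technique.
```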

  19. Simulation of a passive auxiliary feedwater system with TRACE5

    Energy Technology Data Exchange (ETDEWEB)

    Lorduy, María; Gallardo, Sergio; Verdú, Gumersindo, E-mail: maloral@upv.es, E-mail: sergalbe@iqn.upv.es, E-mail: gverdu@iqn.upv.es [Instituto Universitario de Seguridad Industrial, Radiofísica y Medioambiental (ISIRYM), València (Spain)

    2017-07-01

    The study of the nuclear power plant accidents that have occurred in recent decades, as well as the probabilistic risk assessments carried out for this type of facility, identify human error as one of the main contingency factors. For this reason, the design and development of Generation III, III+ and IV reactors, which include inherent and passive safety systems, have been promoted. In this work, a TRACE5 model of ATLAS (Advanced Thermal-Hydraulic Test Loop for Accident Simulation) is used to reproduce an accident scenario consisting of a prolonged Station BlackOut (SBO). In particular, test A1.2 of the OECD-ATLAS project is analyzed, whose purpose is to study the cooling of the primary system by supplying water to one of the steam generators from a Passive Auxiliary Feedwater System (PAFS). This safety feature prevents the loss of secondary-system inventory through steam condensation and recirculation; the preservation of a heat sink thus maintains the natural circulation flow rate until stable conditions are restored. For the reproduction of the test, an ATLAS model was adapted to the experimental conditions and a PAFS was incorporated. From the simulation results, the main thermal-hydraulic variables (pressure, flow rates, collapsed water level and temperature) are analyzed in the different circuits and contrasted with the experimental data series. In conclusion, the work shows the capability of the TRACE5 code to correctly simulate the behavior of a passive feedwater system. (author)

  20. Verification of CTF/PARCSv3.2 coupled code in a Turbine Trip scenario

    International Nuclear Information System (INIS)

    Abarca, A.; Hidalga, P.; Miro, R.; Verdu, G.; Sekhri, A.

    2017-01-01

    Multiphysics codes have emerged as a best-estimate approach to simulating core behavior in LWRs. Coupled neutronics and thermal-hydraulics codes are being used and improved to achieve reliable results for reactor safety transient analysis. The implementation of a feedback procedure between the coupled codes at each time step allows a more accurate simulation and a better prediction of the safety limits of the analyzed scenarios. With the objective of testing the recently developed CTF/PARCSv3.2 coupled code, a code-to-code verification against TRACE has been performed for a BWR Turbine Trip scenario. CTF is a thermal-hydraulic subchannel code that features a two-fluid, three-field representation of two-phase flow, while the PARCS code solves the neutron diffusion equation on a 3D nodal distribution. PARCS also allows the use of extended sets of cross-section libraries, in formats such as PMAX or NEMTAB, for a more precise neutronic calculation. Using this option, the neutronic core composition of KKL is built taking advantage of the core-follow database. The results of the simulation are verified against TRACE results. TRACE is used as the reference code for the validation process, since it is a code recommended by the US NRC. The TRACE model includes the full core plus relevant components such as the steam lines and the valves affecting and controlling the turbine trip evolution. The coupled-code performance has been evaluated using the Turbine Trip event that took place at Kernkraftwerk Leibstadt (KKL) during fuel cycle 18. KKL is a Nuclear Power Plant (NPP) located in Leibstadt, Switzerland; it operates a BWR producing 3600 MWt in one-year fuel cycles. The Turbine Trip is a fast transient producing a pressure peak in the reactor followed by a power decrease due to the selected control rod insertion. This kind of transient is very useful to check the feedback performance between both coupled codes due to the fast
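The per-time-step feedback between a thermal-hydraulic solver and a neutronics solver can be sketched as an explicit operator-splitting loop with toy stand-ins for both codes. The physics constants and the Doppler feedback coefficient below are invented; CTF and PARCS exchange far richer state (power shapes, fuel and coolant fields) than a pair of scalars.

```python
# Minimal sketch of explicit per-time-step coupling between a thermal-hydraulic
# solver and a neutronics solver (toy stand-ins, not CTF or PARCS).

def neutronics_step(fuel_temp):
    # Toy power model with Doppler feedback: power drops as fuel heats up
    return 1000.0 * (1.0 - 2.0e-4 * (fuel_temp - 600.0))

def thermal_hydraulics_step(fuel_temp, power, dt):
    # Toy lumped fuel energy balance: heat-up from power, cooling to coolant
    heat_capacity, cooling = 50.0, 2.0
    return fuel_temp + dt * (power - cooling * (fuel_temp - 600.0)) / heat_capacity

fuel_temp, dt = 600.0, 0.1
history = []
for _ in range(2000):
    power = neutronics_step(fuel_temp)                          # neutronics sees latest TH state
    fuel_temp = thermal_hydraulics_step(fuel_temp, power, dt)   # TH sees latest power
    history.append((power, fuel_temp))
```

The negative Doppler feedback makes the loop settle to a self-consistent power/temperature pair, which is exactly the stationary balance a coupled calculation must reproduce before transients are meaningful.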

  1. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction

    Directory of Open Access Journals (Sweden)

    Eiji Watanabe

    2018-03-01

    The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one of the bases of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.

  2. Shear wave elastography for breast masses is highly reproducible.

    Science.gov (United States)

    Cosgrove, David O; Berg, Wendie A; Doré, Caroline J; Skyba, Danny M; Henry, Jean-Pierre; Gay, Joel; Cohen-Bacrie, Claude

    2012-05-01

    To evaluate intra- and interobserver reproducibility of shear wave elastography (SWE) for breast masses. For intraobserver reproducibility, each observer obtained three consecutive SWE images of 758 masses that were visible on ultrasound; 144 (19%) were malignant. Weighted kappa was used to assess the agreement of qualitative elastographic features; the reliability of quantitative measurements was assessed by intraclass correlation coefficients (ICC). For interobserver reproducibility, a blinded observer reviewed the images and agreement on features was determined. Mean age was 50 years; mean mass size was 13 mm. Qualitatively, SWE images were at least reasonably similar for 666/758 (87.9%). Intraclass correlation for SWE diameter, area and perimeter was almost perfect (ICC ≥ 0.94). Intraobserver reliability for maximum and mean elasticity was almost perfect (ICC = 0.84 and 0.87) and was substantial for the ratio of mass-to-fat elasticity (ICC = 0.77). Interobserver agreement was moderate for SWE homogeneity (κ = 0.57), substantial for qualitative colour assessment of maximum elasticity (κ = 0.66), fair for SWE shape (κ = 0.40), fair for B-mode mass margins (κ = 0.38), and moderate for B-mode mass shape (κ = 0.58), orientation (κ = 0.53) and BI-RADS assessment (κ = 0.59). SWE is highly reproducible for assessing elastographic features of breast masses within and across observers. SWE interpretation is at least as consistent as that of BI-RADS ultrasound B-mode features. • Shear wave ultrasound elastography can measure the stiffness of breast tissue • It provides a qualitatively and quantitatively interpretable colour-coded map of tissue stiffness • Intraobserver reproducibility of SWE is almost perfect, while interobserver reproducibility proved to be moderate to substantial • The most reproducible SWE features between observers were SWE image homogeneity and maximum elasticity.
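The single-measure intraclass correlation used for the quantitative measurements can be computed from a one-way ANOVA decomposition. The sketch below uses invented triplicate elasticity readings, not the study's data.

```python
import statistics

def icc_1_1(ratings):
    # One-way random-effects, single-measure ICC(1,1):
    # ICC = (MSB - MSW) / (MSB + (k - 1) * MSW)
    n = len(ratings)        # number of subjects (masses)
    k = len(ratings[0])     # repeated measurements per subject
    grand = statistics.mean(v for row in ratings for v in row)
    row_means = [statistics.mean(row) for row in ratings]
    ssb = k * sum((m - grand) ** 2 for m in row_means)           # between-subject SS
    ssw = sum((v - m) ** 2 for row, m in zip(ratings, row_means) for v in row)  # within SS
    msb = ssb / (n - 1)
    msw = ssw / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical triplicate elasticity readings (kPa) for five masses
# (invented numbers for illustration).
readings = [
    [30.1, 29.8, 30.4],
    [85.0, 86.2, 84.5],
    [12.3, 12.9, 12.1],
    [150.2, 149.0, 151.1],
    [55.5, 54.8, 56.0],
]
icc = icc_1_1(readings)
```

With between-mass spread far larger than the repeat-to-repeat scatter, the ICC approaches 1, i.e. "almost perfect" reliability in the terminology the abstract uses.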

  3. Variable Rate, Adaptive Transform Tree Coding Of Images

    Science.gov (United States)

    Pearlman, William A.

    1988-10-01

    A tree code, asymptotically optimal for stationary Gaussian sources and squared error distortion [2], is used to encode transforms of image sub-blocks. The variance spectrum of each sub-block is estimated and specified uniquely by a set of one-dimensional auto-regressive parameters. The expected distortion is set to a constant for each block and the rate is allowed to vary to meet the given level of distortion. Since the spectrum and rate are different for every block, the code tree differs for every block. Coding simulations for target block distortion of 15 and average block rate of 0.99 bits per pel (bpp) show that very good results can be obtained at high search intensities at the expense of high computational complexity. The results at the higher search intensities outperform a parallel simulation with quantization replacing tree coding. Comparative coding simulations also show that the reproduced image with variable block rate and average rate of 0.99 bpp has 2.5 dB less distortion than a similarly reproduced image with a constant block rate equal to 1.0 bpp.

  4. Au coated PS nanopillars as a highly ordered and reproducible SERS substrate

    Science.gov (United States)

    Kim, Yong-Tae; Schilling, Joerg; Schweizer, Stefan L.; Sauer, Guido; Wehrspohn, Ralf B.

    2017-07-01

    Noble metal nanostructures with nanometer gap sizes provide strong surface-enhanced Raman scattering (SERS), which can be used to detect trace amounts of chemical and biological molecules. Although several approaches to obtaining active SERS substrates have been reported, it remains a challenge to fabricate SERS substrates with high sensitivity and reproducibility using low-cost techniques. In this article, we report on the fabrication of Au-sputtered PS nanopillars, based on a template synthesis method, as highly ordered and reproducible SERS substrates. The SERS substrates are fabricated by anodic aluminum oxide (AAO) template-assisted infiltration of polystyrene (PS), resulting in hemispherical structures, followed by an Au sputtering process. The optimum gap size between adjacent PS nanopillars and the thickness of the Au layers for high SERS sensitivity are investigated. Using the Au-sputtered PS nanopillars as an active SERS substrate, the Raman signal of 4-methylbenzenethiol (4-MBT) at concentrations down to 10⁻⁹ M is identified with good signal reproducibility, showing great potential as a promising tool for SERS-based detection.

  5. Tools for Reproducibility and Extensibility in Scientific Research

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Open inquiry through reproducing results is fundamental to the scientific process. Contemporary research relies on software engineering pipelines to collect, process, and analyze data. The open source projects within Project Jupyter facilitate these objectives by bringing software engineering within the context of scientific communication. We will highlight specific projects that are computational building blocks for scientific communication, starting with the Jupyter Notebook. We will also explore applications of projects that build off of the Notebook such as Binder, JupyterHub, and repo2docker. We will discuss how these projects can individually and jointly improve reproducibility in scientific communication. Finally, we will demonstrate applications of Jupyter software that allow researchers to build upon the code of other scientists, both to extend their work and the work of others.    There will be a follow-up demo session in the afternoon, hosted by iML. Details can be foun...

  6. EpiContactTrace: an R-package for contact tracing during livestock disease outbreaks and for risk-based surveillance.

    Science.gov (United States)

    Nöremark, Maria; Widgren, Stefan

    2014-03-17

    During outbreaks of livestock diseases, contact tracing can be an important part of disease control. Animal movements can also be relevant for risk-based surveillance and sampling, i.e. both when assessing the consequences of an introduction and the likelihood of introduction. In many countries, animal movement data are collected with one of the major objectives being to enable contact tracing. However, an analytical step is often needed to retrieve the appropriate information for contact tracing or surveillance. In this study, an open-source tool was developed to structure livestock movement data to facilitate contact tracing in real time during disease outbreaks and to provide input for risk-based surveillance and sampling. The tool, EpiContactTrace, was written in the R language and uses the network parameters in-degree, out-degree, ingoing contact chain and outgoing contact chain (also called infection chain), which are relevant for forward and backward tracing respectively. The time frames for backward and forward tracing can be specified independently, and the search can be done for one farm at a time or for all farms within the dataset. Different outputs are available: datasets with network measures, contacts visualised on a map, and automatically generated reports for each farm in either HTML or PDF format, intended for the end users, i.e. the veterinary authorities, regional disease control officers and field veterinarians. EpiContactTrace is available as an R package at the R-project website (http://cran.r-project.org/web/packages/EpiContactTrace/). We believe this tool can help in disease control, since it can rapidly structure essential contact information from large datasets. The reproducible reports make the tool robust and independent of manual compilation of data, and the open source makes it accessible and easily adaptable to different needs.
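The outgoing contact chain (infection chain) concept can be sketched as a time-respecting reachability search over movement records. This is a simplified illustration of the idea; EpiContactTrace's actual R implementation differs in detail.

```python
def outgoing_contact_chain(movements, root, t_start, t_end):
    # Farms reachable from `root` through time-respecting movements within
    # the tracing window: a farm can pass infection onward only through
    # movements that leave it at or after the time it was itself reached.
    frontier = [(root, t_start)]
    reached = set()
    while frontier:
        farm, earliest = frontier.pop()
        for src, dst, t in movements:
            if src == farm and earliest <= t <= t_end and dst not in reached:
                reached.add(dst)
                frontier.append((dst, t))  # onward spread only after arrival
    return reached

# Hypothetical movement records: (source farm, destination farm, day)
movements = [("A", "B", 3), ("B", "C", 5), ("C", "D", 2), ("A", "E", 9)]
chain = outgoing_contact_chain(movements, "A", t_start=1, t_end=7)
```

Note that farm D is excluded even though C ships to D, because that movement (day 2) precedes C being reached (day 5); respecting movement order is what separates a contact chain from plain graph reachability.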

  7. SU-E-J-227: Breathing Pattern Consistency and Reproducibility: Comparative Analysis for Supine and Prone Body Positioning

    International Nuclear Information System (INIS)

    Laugeman, E; Weiss, E; Chen, S; Hugo, G; Rosu, M

    2014-01-01

    Purpose: To evaluate and compare the cycle-to-cycle consistency of breathing patterns and their reproducibility over the course of treatment, for supine and prone positioning. Methods: Respiratory traces from 25 patients were recorded during sequential supine/prone 4DCT scans acquired prior to treatment and during the course of treatment (weekly or bi-weekly). For each breathing cycle, the average (AVE), end-of-exhale (EoE) and end-of-inhale (EoI) locations were identified using in-house developed software. In addition, the mean values and variations of the above quantities were computed for each breathing trace. F-tests were used to compare the cycle-to-cycle consistency of all pairs of sequential supine and prone scans. An analysis of variances was also performed using population means for AVE, EoE and EoI to quantify differences between the reproducibility of prone and supine respiration traces over the treatment course. Results: Consistency: Cycle-to-cycle variations are smaller in prone than in supine position in the pre-treatment and during-treatment scans for the AVE, EoE and EoI points, for the majority of patients (differences significant at p<0.05). The few cases where the respiratory pattern had more variability in prone appeared to be random events. Reproducibility: The reproducibility of breathing patterns (supine and prone) improved as treatment progressed, perhaps because patients became more comfortable with the procedure. However, variability in the supine position remained significantly larger than in prone (p<0.05), as indicated by the variance analysis of population means for the pre-treatment and subsequent during-treatment scans. Conclusions: Prone positioning stabilizes breathing patterns in most subjects investigated in this study. Importantly, a parallel analysis of the same group of patients revealed a tendency towards increasing motion amplitude of tumor targets in prone position regardless of their size or location; thus, the choice for body positioning
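An F-test of cycle-to-cycle variability reduces to a ratio of sample variances compared against an F critical value. The sketch below uses invented end-of-exhale positions; the p-value step is omitted, since evaluating the F distribution needs more machinery than the standard library provides.

```python
import statistics

def f_statistic(sample_a, sample_b):
    # Ratio of sample variances; a large ratio is evidence that sample_a is
    # more variable than sample_b (to be compared against an F critical value
    # with len(sample_a)-1 and len(sample_b)-1 degrees of freedom).
    return statistics.variance(sample_a) / statistics.variance(sample_b)

# Hypothetical end-of-exhale positions (mm), one value per breathing cycle
# (invented numbers, not the study's traces).
supine_eoe = [10.2, 11.8, 9.5, 12.4, 10.9, 8.7, 11.1]
prone_eoe = [10.4, 10.6, 10.3, 10.7, 10.5, 10.4, 10.6]

F = f_statistic(supine_eoe, prone_eoe)
```

In this invented example the supine trace is far more scattered cycle to cycle, so F ≫ 1, the direction of the effect the abstract reports for most patients.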

  8. Translation of the Cofrentes NPP plant model from the TRAC-BF1 code to SNAP-TRACE; Traduccion del modelo de planta de CN Cofrentes del codigo TRAC-BF1 a SNAP-TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Escriva, A.; Munoz-Cobo, J. L.; Concejal, A.; Melara, J.; Albendea, M.

    2012-07-01

    The aim is to develop a three-dimensional model of Cofrentes NPP whose results are consistent with those of the programs currently in use (TRAC-BF1, RETRAN) and validated against plant data. This comparison must be made globally, so that it cannot conceal a compensation of errors. To check the correctness of the translation, the results obtained with TRACE were compared with those of the programs currently in use, and the relevant adjustments were made, taking into account that the correlations and models differ between the codes. In the course of this work, several errors were detected that must be corrected in future versions of these tools.

  9. Advanced Presentation of BETHSY 6.2TC Test Results Calculated by RELAP5 and TRACE

    Directory of Open Access Journals (Sweden)

    Andrej Prošek

    2012-01-01

    Full Text Available Today most software applications come with a graphical user interface, including the U.S. Nuclear Regulatory Commission TRAC/RELAP Advanced Computational Engine (TRACE) best-estimate reactor system code. The graphical user interface is called the Symbolic Nuclear Analysis Package (SNAP). The purpose of the present study was to assess the TRACE computer code and the SNAP capabilities for input deck preparation and advanced presentation of results. The BETHSY 6.2 TC test, a 15.24 cm equivalent-diameter horizontal cold leg break, was selected. For the calculations, TRACE V5.0 Patch 1 and RELAP5/MOD3.3 Patch 4 were used. The RELAP5 legacy input deck was converted to a TRACE input deck using SNAP. The comparison of RELAP5 and TRACE against the experimental data showed that the TRACE results are as good as or better than those calculated by RELAP5. The developed animation masks were of great help in comparing the results and investigating the calculated physical phenomena and processes.

  10. TRACE/PARCS modelling of RIPs trip transients for Lungmen ABWR

    Energy Technology Data Exchange (ETDEWEB)

    Chang, C. Y. [Inst. of Nuclear Engineering and Science, National Tsing-Hua Univ., No.101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Lin, H. T.; Wang, J. R. [Inst. of Nuclear Energy Research, No. 1000, Wenhua Rd., Longtan Township, Taoyuan County 32546, Taiwan (China); Shih, C. [Inst. of Nuclear Engineering and Science, Dept. of Engineering and System Science, National Tsing-Hua Univ., No.101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China)

    2012-07-01

    The objectives of this study are to examine the performance of the steady-state results calculated by the Lungmen TRACE/PARCS model against the SIMULATE-3 code, and to use the analytical results of the final safety analysis report (FSAR) to benchmark the Lungmen TRACE/PARCS model. In this study, three power generation methods in TRACE were utilized to analyze the trip transient of three reactor internal pumps (RIPs) for the purpose of validating the TRACE/PARCS model. In general, the comparisons show that the transient responses of key system parameters, including core power, core inlet flow, reactivity, etc., agree well with the FSAR results. Further studies will be performed using the Lungmen TRACE/PARCS model, and after the commercial operation of the Lungmen nuclear power plant the model will be verified. (authors)

  11. Assessment of TRACE Condensation Model Against Reflux Condensation Tests with Noncondensable Gases

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Won; Cheong, Ae Ju; Shin, Andong; Suh, Nam Duk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    TRACE is the latest in a series of advanced, best-estimate reactor system codes developed by the U.S. Nuclear Regulatory Commission for analyzing transient and steady-state neutronic and thermal-hydraulic behavior in light water reactors. TRACE 5.0-patch4 includes a special film condensation model that is expected to replace the default model in a future code release after sufficient testing has been completed. This study assesses the special condensation model against a counter-current flow configuration, using the KAST reflux condensation tests with noncondensable (NC) gases. For this purpose, the predictions of the special model are compared with the experimental data and with those of the default model. The special condensation model of TRACE provides a reasonable estimate of the heat transfer coefficient (HTC), with good agreement at low inlet steam flow rates.

  12. Assessment of TRACE Condensation Model Against Reflux Condensation Tests with Noncondensable Gases

    International Nuclear Information System (INIS)

    Lee, Kyung Won; Cheong, Ae Ju; Shin, Andong; Suh, Nam Duk

    2015-01-01

    TRACE is the latest in a series of advanced, best-estimate reactor system codes developed by the U.S. Nuclear Regulatory Commission for analyzing transient and steady-state neutronic and thermal-hydraulic behavior in light water reactors. TRACE 5.0-patch4 includes a special film condensation model that is expected to replace the default model in a future code release after sufficient testing has been completed. This study assesses the special condensation model against a counter-current flow configuration, using the KAST reflux condensation tests with noncondensable (NC) gases. For this purpose, the predictions of the special model are compared with the experimental data and with those of the default model. The special condensation model of TRACE provides a reasonable estimate of the heat transfer coefficient (HTC), with good agreement at low inlet steam flow rates.

  13. An analysis of reproducibility and non-determinism in HEP software and ROOT data

    Science.gov (United States)

    Ivie, Peter; Zheng, Charles; Lannon, Kevin; Thain, Douglas

    2017-10-01

    Reproducibility is an essential component of the scientific method. In order to validate the correctness or facilitate the extension of a computational result, it should be possible to re-run a published result and verify that the same results are produced. However, reproducing a computational result is surprisingly difficult: non-determinism and other factors may make it impossible to get the same result, even when running the same code on the same machine on the same day. We explore this problem in the context of HEP codes and data, showing three high level methods for dealing with non-determinism in general: 1) Domain specific methods; 2) Domain specific comparisons; and 3) Virtualization adjustments. Using a CMS workflow with output data stored in ROOT files, we use these methods to prevent, detect, and eliminate some sources of non-determinism. We observe improved determinism using pre-determined random seeds, a predictable progression of system timestamps, and fixed process identifiers. Unfortunately, sources of non-determinism continue to exist despite the combination of all three methods. Hierarchical data comparisons also allow us to appropriately ignore some non-determinism when it is unavoidable. We conclude that there is still room for improvement, and identify directions that can be taken in each method to make an experiment more reproducible.
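Two of the strategies in this abstract — preventing non-determinism with pre-determined random seeds, and comparing hierarchically while ignoring unavoidably volatile fields — can be sketched in a few lines. The workflow and field names below are illustrative stand-ins, not the actual CMS code:

```python
import hashlib
import json
import random

def run_analysis(seed):
    """Toy stand-in for a physics workflow: deterministic once the random
    seed is pre-determined (the 'prevention' strategy)."""
    rng = random.Random(seed)
    events = [rng.gauss(0.0, 1.0) for _ in range(1000)]
    return {
        "mean": sum(events) / len(events),   # physics content
        "timestamp": "2017-10-01T00:00:00",  # volatile metadata
    }

def digest(result, ignore=("timestamp",)):
    """Domain-specific comparison: hash only the physics content, deliberately
    ignoring fields that are unavoidably non-deterministic."""
    stable = {k: v for k, v in result.items() if k not in ignore}
    return hashlib.sha256(json.dumps(stable, sort_keys=True).encode()).hexdigest()

# Two runs with the same seed yield identical digests even if metadata differs.
assert digest(run_analysis(42)) == digest(run_analysis(42))
```

The same idea scales up to ROOT files: compare a digest of the physics payload rather than a bitwise hash of the whole file, so that timestamps and process identifiers do not mask genuine agreement.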

  14. The International Atomic Energy Agency Flag Code

    International Nuclear Information System (INIS)

    1999-01-01

    The document reproduces the text of the IAEA Flag Code which was promulgated by the Director General on 15 September 1999, pursuant to the decision of the Board of Governors on 10 June 1999 to adopt an Agency flag as described in document GOV/1999/41 and its use in accordance with a flag code to be promulgated by the Director General

  15. The International Atomic Energy Agency Flag Code

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-11-17

    The document reproduces the text of the IAEA Flag Code which was promulgated by the Director General on 15 September 1999, pursuant to the decision of the Board of Governors on 10 June 1999 to adopt an Agency flag as described in document GOV/1999/41 and its use in accordance with a flag code to be promulgated by the Director General.

  16. Review and proposal for heat transfer predictions at supercritical water conditions using existing correlations and experiments

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Wadim, E-mail: wadim.jaeger@kit.edu [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology, DE-76344 Eggenstein-Leopoldshafen (Germany); Sanchez Espinoza, Victor Hugo [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology, DE-76344 Eggenstein-Leopoldshafen (Germany); Hurtado, Antonio [Technical University of Dresden, Institute of Power Engineering, DE-01062 Dresden (Germany)

    2011-06-15

    Highlights: > Implementation of heat transfer correlations for supercritical water into TRACE. > Simulation of several heat transfer experiments with the modified TRACE version. > Most correlations are not able to reproduce the experimental results. > The Bishop, Sandberg and Tong correlation is most suitable for TRACE applications. - Abstract: This paper summarizes the activities of TRACE code validation at the Institute for Neutron Physics and Reactor Technology related to supercritical water conditions. In particular, the provision of the thermophysical properties and their appropriate use in the wall-to-fluid heat transfer models of the TRACE code are the object of this investigation. In a first step, the thermophysical properties of the original TRACE code were modified in order to account for supercritical conditions. In a second step, existing Nusselt correlations were reviewed and implemented into TRACE, and the available experiments were simulated to identify the most suitable Nusselt correlation(s).

  17. Goya - an MHD equilibrium code for toroidal plasmas

    International Nuclear Information System (INIS)

    Scheffel, J.

    1984-09-01

    A description of the GOYA free-boundary equilibrium code is given. The non-linear Grad-Shafranov equation of ideal MHD is solved in toroidal geometry for plasmas with purely poloidal magnetic fields. The code is based on a field-line-tracing procedure, making storage of a large amount of information on a grid unnecessary. Usage of the code is demonstrated by computations of equilibria for the EXTRAP-T1 device. (Author)
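The field-line-tracing approach mentioned in the abstract — following dx/ds = B/|B| step by step instead of storing field quantities on a grid — can be illustrated with a minimal sketch. The analytic field below (circular field lines around a straight filament) is a toy stand-in, not GOYA's actual equilibrium field:

```python
import math

def b_field(x, y):
    """Illustrative purely poloidal field of a straight current filament:
    field lines are circles around the origin (a toy field, not GOYA's)."""
    r2 = x * x + y * y
    return (-y / r2, x / r2)

def trace_field_line(x, y, ds=1e-3, steps=None):
    """Follow dx/ds = B/|B| with 4th-order Runge-Kutta.
    Only the current point is stored -- no grid is needed."""
    def unit(px, py):
        bx, by = b_field(px, py)
        norm = math.hypot(bx, by)
        return bx / norm, by / norm
    if steps is None:
        steps = int(2 * math.pi * math.hypot(x, y) / ds)  # one full circuit
    for _ in range(steps):
        k1 = unit(x, y)
        k2 = unit(x + 0.5 * ds * k1[0], y + 0.5 * ds * k1[1])
        k3 = unit(x + 0.5 * ds * k2[0], y + 0.5 * ds * k2[1])
        k4 = unit(x + ds * k3[0], y + ds * k3[1])
        x += ds * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6
        y += ds * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6
    return x, y

# A field line started at (1, 0) should close on itself after one circuit.
xf, yf = trace_field_line(1.0, 0.0)
assert math.hypot(xf - 1.0, yf) < 1e-3
```

The closure check at the end is a standard sanity test for a tracer: on a field whose lines are known circles, one circuit must return to the starting point to within the integration tolerance.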

  18. Research related to thermal-hydraulic safety with the TRACE code; Investigaciones relacionadas con seguridad termohidraulica con el codigo TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Chaparro V, F. J.; Del Valle G, E. [IPN, Escuela Superior de Fisica y Matematicas, UP - Adolfo Lopez Mateos, Edif. 9, 07738 Mexico D. F. (Mexico); Rodriguez H, A.; Gomez T, A. M. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Sanchez E, V. H.; Jager, W., E-mail: evalle@esfm.ipn.mx [Karlsruhe Institute of Technology, Hermann-von-Helmholtz Platz I, D-76344 Eggenstein - Leopoldshafen (Germany)

    2014-10-15

    This article presents the results of modeling the pressure vessel of a BWR/5, similar to that of the Laguna Verde NPP, with the TRACE code. A thermal-hydraulic Vessel component was created, capable of simulating the fluid behavior and heat transfer that occur within the reactor vessel. The Vessel component consists of a three-dimensional cylinder divided into 19 axial sections, 4 azimuthal sections and two concentric radial rings. The inner ring contains the core and the central part of the reactor, while the outer ring serves as the downcomer. The axial and azimuthal divisions were chosen so that the dimensions of the internal components and the heights and orientations of the external connections match the reference values of a BWR/5 reactor. The model includes internal components such as fuel assemblies, steam separators, jet pumps and guide tubes, as well as the main external connections such as the steam lines, the feedwater lines and the penetrations of the recirculation system. The model contains significant simplifications, because the objective is to keep symmetry between the azimuthal sections of the vessel; most internal components lack a detailed description of the geometry and of the initial values of temperature, pressure, fluid velocity, etc., since only the most representative data were considered. Even so, these simulations give acceptable results for important parameters such as the total flow through the core, the vessel pressure, the void fraction, and the pressure drops across the core and the steam separators. (Author)

  19. A manual to the MAXRAY program library for reflective and dispersive ray tracing

    International Nuclear Information System (INIS)

    Svensson, S.; Nyholm, R.

    1985-07-01

    A general ray tracing program package for reflective and dispersive X-ray optics is described. The package consists of a number of subroutines written in FORTRAN 77 code giving the necessary tools for ray tracing. The program package is available on request from the authors. (authors)

  20. PLASMOR: A laser-plasma simulation code. Pt. 2

    International Nuclear Information System (INIS)

    Salzman, D.; Krumbein, A.D.; Szichman, H.

    1987-06-01

    This report supplements a previous one which describes the PLASMOR hydrodynamics code. The present report documents the recent changes and additions made to the code. In particular, it describes two new subroutines for radiative preheat, a system of preprocessors which prepare the code before a run, a list of postprocessors which simulate experimental setups, and the basic data sets required to run PLASMOR. In the Appendix, a new computer-based manual which lists the main features of PLASMOR is reproduced.

  1. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination, or reproducibility of results is very poor. On the other hand, a defect can be detected on each of several subsequent examinations, giving high reliability of detection, while the results still have poor reproducibility.

  2. APC-II: an electron beam propagation code

    International Nuclear Information System (INIS)

    Iwan, D.C.; Freeman, J.R.

    1984-05-01

    The computer code APC-II simulates the propagation of a relativistic electron beam through air. APC-II is an updated version of the APC envelope model code. It incorporates an improved conductivity model which significantly extends the range of stable calculations. A number of test cases show that these new models are capable of reproducing the simulations of the original APC code. As the result of a major restructuring and reprogramming of the code, APC-II is now friendly to both the occasional user and the experienced user who wishes to make modifications. Most of the code is in standard ANSI Fortran 77 so that it can be easily transported between machines.

  3. Uncertainty Methods Framework Development for the TRACE Thermal-Hydraulics Code by the U.S.NRC

    International Nuclear Information System (INIS)

    Bajorek, Stephen M.; Gingrich, Chester

    2013-01-01

    The Code of Federal Regulations, Title 10, Part 50.46 requires that Emergency Core Cooling System (ECCS) performance be evaluated for a number of postulated Loss-Of-Coolant Accidents (LOCAs). The rule allows two methods for demonstrating compliance with the acceptance criteria: using a realistic model in the so-called 'Best Estimate' approach, or the more prescriptive approach following Appendix K to Part 50. Because of the conservatism of Appendix K, recent Evaluation Model submittals to the NRC have used the realistic approach. With this approach, the Evaluation Model must demonstrate that the Peak Cladding Temperature (PCT), the Maximum Local Oxidation (MLO) and Core-Wide Oxidation (CWO) remain below their regulatory limits with a 'high probability'. Guidance for Best Estimate calculations following 50.46(a)(1) was provided by Regulatory Guide 1.157. This Guide identified a 95% probability level as being acceptable for comparisons of best-estimate predictions to the applicable regulatory limits, but was vague with respect to acceptable methods for determining the code uncertainty, nor did it specify whether a confidence level should be determined. As a result, vendors have developed Evaluation Models utilizing several different methods to combine uncertainty parameters and determine the PCT and other variables to a high probability. In order to quantify the accuracy of TRACE calculations for a wide variety of applications and to audit Best Estimate calculations made by industry, the NRC is developing its own independent methodology to determine the peak cladding temperature and other parameters of regulatory interest to a high probability. Because several methods are in use, and each vendor's methodology ranges different parameters, the NRC method must be flexible and sufficiently general.
Not only must the method apply to LOCA analysis for conventional light-water reactors, it must also be extendable to new reactor designs and type of analyses where the acceptance criteria are less
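One widely used nonparametric way to turn a limited number of code runs into a "95% probability / 95% confidence" statement — a common choice in best-estimate LOCA methodologies, though not necessarily the specific method the NRC adopted — is Wilks' order-statistics formula, which fixes the number of runs needed so that the largest computed PCT bounds the 95th percentile:

```python
def wilks_sample_size(prob=0.95, conf=0.95):
    """Smallest number of code runs n such that the largest observed output
    bounds the `prob` quantile with confidence `conf` (first-order Wilks):
    the smallest n satisfying 1 - prob**n >= conf."""
    n = 1
    while 1.0 - prob ** n < conf:
        n += 1
    return n

print(wilks_sample_size())  # → 59, the classic 95/95 one-sided result
```

The appeal of this approach is that the run count is independent of how many uncertainty parameters are ranged, which is one reason sampling-based methods differ so much from vendor response-surface methods.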

  4. MCViNE – An object oriented Monte Carlo neutron ray tracing simulation package

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Jiao Y.Y., E-mail: linjiao@ornl.gov [Caltech Center for Advanced Computing Research, California Institute of Technology (United States); Department of Applied Physics and Materials Science, California Institute of Technology (United States); Neutron Data Analysis and Visualization Division, Oak Ridge National Laboratory (United States); Smith, Hillary L. [Department of Applied Physics and Materials Science, California Institute of Technology (United States); Granroth, Garrett E., E-mail: granrothge@ornl.gov [Neutron Data Analysis and Visualization Division, Oak Ridge National Laboratory (United States); Abernathy, Douglas L.; Lumsden, Mark D.; Winn, Barry; Aczel, Adam A. [Quantum Condensed Matter Division, Oak Ridge National Laboratory (United States); Aivazis, Michael [Caltech Center for Advanced Computing Research, California Institute of Technology (United States); Fultz, Brent, E-mail: btf@caltech.edu [Department of Applied Physics and Materials Science, California Institute of Technology (United States)

    2016-02-21

    MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example we used object oriented programming concepts for representing neutron scatterers and detector systems, and recursive algorithms for implementing multiple scattering. Combining these features together in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray tracing packages which facilitates porting instrument models from those codes. Furthermore it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. With simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.
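The recursive treatment of multiple scattering mentioned above can be reduced to a toy sketch: each neutron history is followed recursively, scattering again with some probability until it is absorbed, escapes, or hits a recursion cutoff. The probabilities and cutoff below are arbitrary illustrative values, not MCViNE's API or physics:

```python
import random

def scatter(order, rng, p_scatter=0.6, max_depth=5):
    """Follow one neutron history recursively: with probability p_scatter the
    neutron scatters once more, up to a recursion cutoff. Returns the total
    number of scattering events in the history."""
    if order >= max_depth or rng.random() > p_scatter:
        return order
    return scatter(order + 1, rng, p_scatter, max_depth)

rng = random.Random(0)
orders = [scatter(0, rng) for _ in range(10000)]
single = sum(1 for o in orders if o == 1) / len(orders)
multiple = sum(1 for o in orders if o >= 2) / len(orders)
# With these assumed probabilities, multiple scattering accounts for a sizeable
# fraction of all histories -- which is why treating only single scattering
# would misestimate measured intensities.
```

In a real package the recursive step would sample a physical scattering kernel and propagate the neutron through sample and detector geometry; the recursion structure, however, is exactly what lets scattering mechanisms be turned on and off independently.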

  5. IRSN Code of Ethics and Professional Conduct. Annex VII [TSO Mission Statement and Code of Ethics

    International Nuclear Information System (INIS)

    2018-01-01

    IRSN has adopted, in 2013, a Code of Ethics and Professional Conduct, the contents of which are summarized. As a preamble, it is indicated that the Code, which was adopted in 2013 by the Ethics Commission of IRSN and the Board of IRSN, complies with relevant constitutional and legal requirements. The introduction to the Code presents the role and missions of IRSN in the French system, as well as the various conditions and constraints that frame its action, in particular with respect to ethical issues. It states that the Code sets principles and establishes guidance for addressing these constraints and resolving conflicts that may arise, thus constituting references for the Institute and its staff, and helping IRSN’s partners in their interaction with the Institute. The stipulations of the Code are organized in four articles, reproduced and translated.

  6. Review and proposal for heat transfer predictions at supercritical water conditions using existing correlations and experiments

    International Nuclear Information System (INIS)

    Jaeger, Wadim; Sanchez Espinoza, Victor Hugo; Hurtado, Antonio

    2011-01-01

    Highlights: → Implementation of heat transfer correlations for supercritical water into TRACE. → Simulation of several heat transfer experiments with the modified TRACE version. → Most correlations are not able to reproduce the experimental results. → The Bishop, Sandberg and Tong correlation is most suitable for TRACE applications. - Abstract: This paper summarizes the activities of TRACE code validation at the Institute for Neutron Physics and Reactor Technology related to supercritical water conditions. In particular, the provision of the thermophysical properties and their appropriate use in the wall-to-fluid heat transfer models of the TRACE code are the object of this investigation. In a first step, the thermophysical properties of the original TRACE code were modified in order to account for supercritical conditions. In a second step, existing Nusselt correlations were reviewed and implemented into TRACE, and the available experiments were simulated to identify the most suitable Nusselt correlation(s).

  7. Model of heat transfer by radiation in a pool of spent fuel by TRACE; Modelo de transmision de calor por radiacion en una piscina de combustible gastado mediante TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Saez, F.; Carlos Alberola, S.; Martorell Alsina, J. J.; Villanueva Lopez, J. F.

    2013-07-01

    This work involves the thermal-hydraulic simulation, with the best-estimate TRACE code, of a spent fuel pool undergoing a loss-of-coolant transient in which coolant is lost through the transfer channel. An important variable for following the evolution of the pool is the cladding temperature, whose evolution depends on the heat that is transferred. The simulation considering heat transfer by convection only was compared with the evolution obtained considering convection plus radiation. The proposed radiation model obtains data from the RADGEN code for use in TRACE. With this model, a less conservative evolution of the cladding temperature during the transient is obtained.

  8. Neutron Transmission through Sapphire Crystals

    DEFF Research Database (Denmark)

    of simulations, in order to reproduce the transmission of cold neutrons through sapphire crystals. Those simulations were part of the effort of validating and improving the newly developed interface between the Monte-Carlo neutron transport code MCNP and the Monte Carlo ray-tracing code McStas....

  9. Numerical modeling of flow boiling instabilities using TRACE

    International Nuclear Information System (INIS)

    Kommer, Eric M.

    2015-01-01

    Highlights: • TRACE was used to realistically model boiling instabilities in single and parallel channel configurations. • Model parameters were chosen to exactly mimic other authors' work in order to allow direct comparison of results. • Flow stability maps generated by the model show unstable flow at operating points similar to those of other authors. • The method of adjudicating when a flow is “unstable” is critical in this type of numerical study. - Abstract: Dynamic flow instabilities in two-phase systems are a vitally important area of study due to their effects on a great number of industrial applications, including heat exchangers in nuclear power plants. Several next-generation nuclear reactor designs incorporate once-through steam generators which will exhibit boiling flow instabilities if not properly designed or when operated outside design limits. A number of numerical thermal-hydraulic codes attempt to model instabilities for initial design and for use in accident analysis. TRACE, the Nuclear Regulatory Commission’s newest thermal-hydraulic code, is used in this study to investigate flow instabilities in both single and dual parallel channel configurations. The model parameters are selected to replicate other investigators’ experimental and numerical work in order to allow easy comparison. Particular attention is paid to the similarities between analysis using TRACE Version 5.0 and RELAP5/MOD3.3. Comparison of results is accomplished via flow stability maps non-dimensionalized by the phase change and subcooling numbers. Results of this study show that TRACE does indeed model two-phase flow instabilities, with the transient response closely mimicking that seen in experimental studies. When compared to flow stability maps generated using RELAP, TRACE shows similar results, with differences likely due to the somewhat qualitative criteria used by various authors to determine when the flow is truly unstable.
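The phase change and subcooling numbers used to non-dimensionalize such stability maps follow standard definitions (e.g. Ishii's); the operating-point values below are rough illustrative water properties near 7 MPa, not data from the study:

```python
def phase_change_number(power_w, mass_flow_kg_s, h_fg, rho_f, rho_g):
    """N_pch = (Q / (W * h_fg)) * (rho_f - rho_g) / rho_g
    (channel power Q, mass flow W, latent heat h_fg, phase densities)."""
    return power_w / (mass_flow_kg_s * h_fg) * (rho_f - rho_g) / rho_g

def subcooling_number(dh_sub, h_fg, rho_f, rho_g):
    """N_sub = (dh_sub / h_fg) * (rho_f - rho_g) / rho_g
    (inlet subcooling enthalpy dh_sub)."""
    return dh_sub / h_fg * (rho_f - rho_g) / rho_g

# Illustrative saturated-water properties near 7 MPa (approximate values):
h_fg = 1.505e6               # J/kg
rho_f, rho_g = 740.0, 36.5   # kg/m^3
n_pch = phase_change_number(1.0e6, 5.0, h_fg, rho_f, rho_g)
n_sub = subcooling_number(1.0e5, h_fg, rho_f, rho_g)
# The pair (n_pch, n_sub) locates one operating point on the stability map.
```

Plotting many such operating points, each flagged stable or unstable by the transient simulation, is how the flow stability maps compared in the study are assembled.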

  10. Examining the Reproducibility of 6 Published Studies in Public Health Services and Systems Research.

    Science.gov (United States)

    Harris, Jenine K; B Wondmeneh, Sarah; Zhao, Yiqiang; Leider, Jonathon P

    2018-02-23

    Research replication, or repeating a study de novo, is the scientific standard for building evidence and identifying spurious results. While replication is ideal, it is often expensive and time consuming. Reproducibility, or reanalysis of data to verify published findings, is one proposed minimum alternative standard. While a lack of research reproducibility has been identified as a serious and prevalent problem in biomedical research and a few other fields, little work has been done to examine the reproducibility of public health research. We examined reproducibility in 6 studies from the public health services and systems research subfield of public health research. Following the methods described in each of the 6 papers, we computed the descriptive and inferential statistics for each study. We compared our results with the original study results and examined the percentage differences in descriptive statistics and differences in effect size, significance, and precision of inferential statistics. All project work was completed in 2017. We found consistency between original and reproduced results for each paper in at least 1 of the 4 areas examined. However, we also found some inconsistency. We identified incorrect transcription of results and omitting detail about data management and analyses as the primary contributors to the inconsistencies. Increasing reproducibility, or reanalysis of data to verify published results, can improve the quality of science. Researchers, journals, employers, and funders can all play a role in improving the reproducibility of science through several strategies including publishing data and statistical code, using guidelines to write clear and complete methods sections, conducting reproducibility reviews, and incentivizing reproducible science.
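A reanalysis of this kind hinges on simple arithmetic such as the percentage difference between published and reproduced statistics. The symmetric form below is one plausible definition, not necessarily the one used in the paper:

```python
def percent_difference(original, reproduced):
    """Symmetric percentage difference between a published statistic and its
    reproduced value; 0.0 means an exact match."""
    if original == reproduced:
        return 0.0
    return abs(original - reproduced) / ((abs(original) + abs(reproduced)) / 2) * 100.0

# A reproduced mean of 11.0 against a published 10.0 differs by ~9.5%.
print(round(percent_difference(10.0, 11.0), 1))  # → 9.5
```

Publishing exactly this kind of small comparison script alongside the data is one of the strategies the authors recommend for making reproducibility reviews routine.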

  11. Linac particle tracing simulations

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1979-01-01

    A particle tracing code was developed to study space-charge effects in proton or heavy-ion linear accelerators. The purpose is to study space-charge phenomena as directly as possible without the complications of many accelerator details. Thus, the accelerator is represented simply by harmonic oscillator or impulse restoring forces. Variable parameters as well as mismatched phase-space distributions were studied. This study represents the initial search for those features of the accelerator or of the phase-space distribution that lead to emittance growth.
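A minimal sketch of such a tracing code — a harmonic restoring force, a crude pairwise space-charge kick, and an rms emittance diagnostic for a mismatched distribution — might look as follows; all constants are arbitrary toy values, not parameters from the study:

```python
import math
import random

def rms_emittance(x, v):
    """RMS emittance: sqrt(<x^2><v^2> - <xv>^2) about the beam centroid."""
    n = len(x)
    mx, mv = sum(x) / n, sum(v) / n
    sxx = sum((a - mx) ** 2 for a in x) / n
    svv = sum((b - mv) ** 2 for b in v) / n
    sxv = sum((a - mx) * (b - mv) for a, b in zip(x, v)) / n
    return math.sqrt(max(sxx * svv - sxv ** 2, 0.0))

def track(n_particles=100, n_steps=200, dt=0.01, k_focus=1.0, k_sc=1e-4, seed=1):
    """Trace a mismatched 1D bunch through harmonic focusing plus a crude
    pairwise space-charge kick; return (initial, final) rms emittance."""
    rng = random.Random(seed)
    x = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    v = [rng.gauss(0.0, 0.5) for _ in range(n_particles)]  # mismatched beam
    e0 = rms_emittance(x, v)
    for _ in range(n_steps):
        for i in range(n_particles):
            f = -k_focus * x[i]                  # harmonic restoring force
            for j in range(n_particles):         # repulsive space-charge kick
                if j != i and x[j] != x[i]:
                    f += k_sc * math.copysign(1.0, x[i] - x[j])
            v[i] += f * dt
        for i in range(n_particles):
            x[i] += v[i] * dt
    return e0, rms_emittance(x, v)
```

Comparing the initial and final emittance for matched versus mismatched initial distributions is precisely the kind of numerical experiment the abstract describes.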

  12. Applicability of best-estimate analysis TRACE in terms of natural circulation BWR stability

    International Nuclear Information System (INIS)

    Furuya, Masahiro; Ueda, Nobuyuki; Nishi, Yoshihisa

    2011-01-01

    As part of the international CAMP program of the US Nuclear Regulatory Commission (USNRC), the best-estimate code TRACE is validated against the high-pressure stability database of the SIRIUS-N facility. The TRACE version analyzed is version 5 patch level 2. The SIRIUS-N facility simulates the thermal-hydraulics of the economic simplified BWR (ESBWR). The oscillation period correlates well with the bubble transit time through the chimney region regardless of system pressure, inlet subcooling and heat flux. The numerical results exhibit type-I density wave oscillation characteristics, since a core inlet restriction shifts the stability boundary toward higher inlet subcooling, and a chimney exit restriction enlarges the instability region and the oscillation amplitude. Stability maps with respect to subcooling and heat flux obtained from the TRACE code agree with the experimental data at 1 MPa. As the pressure increases from 2 MPa to 7.2 MPa, the numerical results become much more stable than the experimental results. This is because the two-phase frictional loss is underestimated: the natural circulation flow rate in the calculations is approximately 20% higher than in the experiments. (author)

  13. Geochemical computer codes. A review

    International Nuclear Information System (INIS)

    Andersson, K.

    1987-01-01

    In this report a review of available codes is performed and some code intercomparisons are also discussed. The number of codes treating natural waters (groundwater, lake water, sea water) is large. Most geochemical computer codes treat equilibrium conditions, although some codes with kinetic capability are available. A geochemical equilibrium model consists of a computer code, solving a set of equations by some numerical method, and a database of the thermodynamic data required for the calculations. There are some codes which treat coupled geochemical and transport modeling. Some of these codes solve the equilibrium and transport equations simultaneously, while others solve the equations separately. The coupled codes require a large computer capacity and have thus had limited use so far. Three code intercomparisons have been found in the literature. It may be concluded that there are many codes available for geochemical calculations, but most of them require a user who is quite familiar with the code. The user also has to know the geochemical system in order to judge the reliability of the results. A high-quality database is necessary to obtain a reliable result. The best results may be expected for the major species of natural waters. For more complicated problems, including trace elements, precipitation/dissolution, adsorption, etc., the results seem to be less reliable. (With 44 refs.) (author)
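The "code plus database" structure of a geochemical equilibrium model can be shown at toy scale: a two-constant database and a one-unknown charge balance solved numerically. The system here (dilute CO2 in water, with illustrative 25 °C constants) is a deliberately minimal example of what full codes do for many species at once:

```python
import math

# Minimal "thermodynamic database" (illustrative 25 °C values):
K_W = 1.0e-14      # water autoionization
K_1 = 10 ** -6.35  # H2CO3* = H+ + HCO3-

def charge_balance(h, c_co2):
    """Residual of [H+] = [OH-] + [HCO3-] for a dilute CO2 solution."""
    return h - K_W / h - K_1 * c_co2 / h

def solve_ph(c_co2, lo=1e-12, hi=1.0):
    """The 'numerical method' part of a tiny equilibrium code:
    bisection on the charge balance, in log space since [H+] spans decades."""
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if charge_balance(lo, c_co2) * charge_balance(mid, c_co2) <= 0:
            hi = mid
        else:
            lo = mid
    return -math.log10(math.sqrt(lo * hi))

# Water holding ~1e-5 mol/L dissolved CO2 is mildly acidic, close to the
# familiar pH of unpolluted rain.
print(round(solve_ph(1.0e-5), 2))  # → 5.67
```

Real codes solve hundreds of such mass-action and balance equations simultaneously, which is why the quality of the underlying thermodynamic database dominates the reliability of the result, as the review concludes.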

  14. The Need for Reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility: specifically, whether bitwise-reproducible computation is possible, whether computational research in DOE can improve its publication process, and whether reproducible results can be achieved apart from the peer review process.

  15. Coding of Stimuli by Animals: Retrospection, Prospection, Episodic Memory and Future Planning

    Science.gov (United States)

    Zentall, Thomas R.

    2010-01-01

    When animals code stimuli for later retrieval they can either code them in terms of the stimulus presented (as a retrospective memory) or in terms of the response or outcome anticipated (as a prospective memory). Although retrospective memory is typically assumed (as in the form of a memory trace), evidence of prospective coding has been found…

  16. A search for symmetries in the genetic code

    International Nuclear Information System (INIS)

    Hornos, J.E.M.; Hornos, Y.M.M.

    1991-01-01

    A search for symmetries based on the classification theorem of Cartan for the compact simple Lie algebras is performed to verify to what extent the genetic code is a manifestation of some underlying symmetry. An exact continuous symmetry group cannot be found to reproduce the present, universal code. However a unique approximate symmetry group is compatible with codon assignment for the fundamental amino acids and the termination codon. In order to obtain the actual genetic code, the symmetry must be slightly broken. (author). 27 refs, 3 figs, 6 tabs

  17. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software codes and data to computational results reported in publications. The package contains a collection of (a computational modules, (b data-processing scripts, and (c research papers. Madagascar is distributed on SourceForge under a GPL v2 license https://sourceforge.net/projects/rsf/. By October 2013, more than 70 people from different organizations around the world had contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.

  18. RIA Fuel Codes Benchmark - Volume 1

    International Nuclear Information System (INIS)

    Marchand, Olivier; Georgenthum, Vincent; Petit, Marc; Udagawa, Yutaka; Nagase, Fumihisa; Sugiyama, Tomoyuki; Arffman, Asko; Cherubini, Marco; Dostal, Martin; Klouzal, Jan; Geelhood, Kenneth; Gorzel, Andreas; Holt, Lars; Jernkvist, Lars Olof; Khvostov, Grigori; Maertens, Dietmar; Spykman, Gerold; Nakajima, Tetsuo; Nechaeva, Olga; Panka, Istvan; Rey Gayo, Jose M.; Sagrado Garcia, Inmaculada C.; Shin, An-Dong; Sonnenburg, Heinz Guenther; Umidova, Zeynab; Zhang, Jinzhao; Voglewede, John

    2013-01-01

    Reactivity-initiated accident (RIA) fuel rod codes have been developed over a significant period of time, and they have all shown their ability to reproduce some experimental results with a certain degree of adequacy. However, they sometimes rely on different specific modelling assumptions, the influence of which on the final results of the calculations is difficult to evaluate. The NEA Working Group on Fuel Safety (WGFS) is tasked with advancing the understanding of fuel safety issues by assessing the technical basis for current safety criteria and their applicability to high burnup and to new fuel designs and materials. The group aims at facilitating international convergence in this area, including the review of experimental approaches as well as the interpretation and use of experimental data relevant for safety. As a contribution to this task, WGFS conducted an RIA code benchmark based on RIA tests performed in the Nuclear Safety Research Reactor in Tokai, Japan, and tests performed or planned in the CABRI reactor in Cadarache, France. Emphasis was on assessment of different modelling options for RIA fuel rod codes in terms of reproducing experimental results as well as extrapolating to typical reactor conditions. This report provides a summary of the results of this task. (authors)

  19. Single-phase mixing studies by means of a directly coupled CFD/system-code tool

    International Nuclear Information System (INIS)

    Bertolotto, Davide; Chawla, Rakesh; Manera, Annalisa; Smith, Brian; Prasser, Horst-Michael

    2008-01-01

    The present paper describes the coupling of the 3D computational fluid dynamics (CFD) code CFX with the best estimate thermal-hydraulic code TRACE. Two different coupling schemes, i.e. an explicit and a semi-implicit one, have been tested. Verification of the coupled CFX/TRACE code has first been carried out on the basis of a simple test case consisting of a straight pipe filled with liquid subject to a sudden acceleration. As a second validation step, measurements using advanced instrumentation (wire-mesh sensors) have been performed in a simple, specially constructed test facility consisting of two loops connected by a double T-junction. Comparisons of the measurements are made with calculation results obtained using the coupled codes, as well as the individual codes in stand-alone mode, thereby clearly bringing out the effectiveness of the achieved coupling for simulating situations in which three-dimensional mixing phenomena are important. (authors)

  20. Markov traces and II1 factors in conformal field theory

    International Nuclear Information System (INIS)

    Boer, J. de; Goeree, J.

    1991-01-01

    Using the duality equations of Moore and Seiberg we define for every primary field in a Rational Conformal Field Theory a proper Markov trace and hence a knot invariant. Next we define two nested algebras and show, using results of Ocneanu, how the position of the smaller algebra in the larger one reproduces part of the duality data. A new method for constructing Rational Conformal Field Theories is proposed. (orig.)

  1. Recent progress of an integrated implosion code and modeling of element physics

    International Nuclear Information System (INIS)

    Nagatomo, H.; Takabe, H.; Mima, K.; Ohnishi, N.; Sunahara, A.; Takeda, T.; Nishihara, K.; Nishiguchu, A.; Sawada, K.

    2001-01-01

    Physics of inertial fusion is based on a variety of elements such as compressible hydrodynamics, radiation transport, non-ideal equation of state, non-LTE atomic processes, and relativistic laser-plasma interaction. In addition, the implosion process is not stationary, and fluid dynamics, energy transport and instabilities should be solved simultaneously. In order to study such complex physics, an integrated implosion code including all physics important in the implosion process should be developed. The details of the physics elements should be studied, and the resultant numerical modeling should be installed in the integrated code so that the implosion can be simulated with available computers within realistic CPU time. This task can therefore be separated into two parts. One is to integrate all physics elements into a code, which is strongly related to the development of the hydrodynamic equation solver. We have developed a 2-D integrated implosion code which solves mass, momentum, electron energy, ion energy, equations of state, laser ray-trace, laser absorption, radiation, surface tracing and so on. Reasonable results in simulating Rayleigh-Taylor instability and cylindrical implosion are obtained using this code. The other is code development for each element of physics and verification of these codes. We have made progress in developing a nonlocal electron transport code and 2- and 3-dimensional radiation hydrodynamic codes. (author)

  2. Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry

    Science.gov (United States)

    Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z is less than or equal to 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.
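    The convergence testing described above can be sketched generically: increase the number of discrete rays until an exposure-type quantity stops changing. The slab geometry, the isotropic direction sampling, and the tolerance below are illustrative assumptions, not the 3DHZETRN implementation.

    ```python
    import random

    # Sketch of ray-count convergence testing: a shielding quantity (here a
    # mean slab path length, standing in for the real exposure quantity) is
    # re-estimated with twice as many discrete rays until it stabilizes.
    # The geometry and tolerance are illustrative assumptions.

    def ray_path_length(cos_theta, slab_thickness=10.0, max_path=100.0):
        """Chord length through a slab shield for a ray with direction
        cosine cos_theta; rays going the other way see the capped path."""
        if cos_theta <= 0.0:
            return max_path
        return min(slab_thickness / cos_theta, max_path)

    def mean_path(n_rays, seed=0):
        rng = random.Random(seed)          # fixed seed: nested ray sets
        total = 0.0
        for _ in range(n_rays):
            cos_theta = rng.uniform(-1.0, 1.0)   # isotropic in cos(theta)
            total += ray_path_length(cos_theta)
        return total / n_rays

    def converge(tol=0.5, n0=128, n_max=1 << 20):
        """Double the ray count until successive estimates agree to tol."""
        n, prev = n0, mean_path(n0)
        while n < n_max:
            n *= 2
            cur = mean_path(n)
            if abs(cur - prev) < tol:
                return n, cur
            prev = cur
        return n, prev

    n_rays, estimate = converge()
    ```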

  3. The elimination of ray tracing in Monte Carlo shielding programs

    International Nuclear Information System (INIS)

    Bendall, D.E.

    1988-01-01

    The MONK6 code has clearly demonstrated the advantages of hole tracking, which was devised by Woodcock et al. for use in criticality codes, building on earlier work by Von Neumann. Hole tracking eliminates ray tracing by introducing, for all materials present in the problem, a pseudo scattering reaction that forward scatters without energy loss. The cross section for this reaction is chosen so that the total cross sections of all the materials are equal at a given energy. By this means, tracking takes place with a constant total cross section everywhere, so there is no longer any need to ray trace. The present work extends hole tracking to shielding codes, where it functions in tandem with Russian roulette and splitting. An algorithm has been evolved and its performance is compared with that of the ray-tracing code McBEND. A disadvantage of hole tracking occurs when there is a wide variation in the total cross sections of the materials present. Since tracking uses the total cross section of the material with the maximum cross section, there can be a large number of pseudo collisions in the materials with low total cross sections. In extreme cases, the advantages of hole tracking can be lost through the extra time taken in servicing these pseudo collisions; however, techniques for eliminating this problem are under consideration
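    The hole-tracking scheme described above is today better known as Woodcock or delta tracking. A minimal one-dimensional sketch, with illustrative cross sections rather than MONK6 or McBEND data, looks like this:

    ```python
    import math
    import random

    # Minimal 1-D sketch of hole (Woodcock delta) tracking: every region is
    # padded with a pseudo "hole" cross section so the total cross section
    # is constant everywhere, and flight lengths can be sampled without
    # ray-tracing region boundaries. Cross sections are illustrative.

    def sigma_total(x):
        """Real total cross section along the track (two material slabs)."""
        return 2.0 if x < 5.0 else 0.5   # strong absorber, then a weak one

    SIGMA_MAJ = 2.0  # majorant: the maximum total cross section anywhere

    def distance_to_real_collision(x0, rng):
        """Track from x0 until a real (non-pseudo) collision is accepted."""
        x = x0
        while True:
            # Sample a flight length against the constant majorant.
            x += -math.log(rng.random()) / SIGMA_MAJ
            # Accept as a real collision with probability sigma/sigma_maj;
            # otherwise it is a pseudo (forward) scatter and tracking goes on.
            if rng.random() < sigma_total(x) / SIGMA_MAJ:
                return x

    rng = random.Random(42)
    samples = [distance_to_real_collision(0.0, rng) for _ in range(20000)]
    ```

    With these numbers nearly all collisions occur in the first slab, so the sampled distances average close to its mean free path (0.5); the many rejected pseudo collisions in the weak material illustrate the overhead the abstract warns about.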

  4. ImageJS: Personalized, participated, pervasive, and reproducible image bioinformatics in the web browser

    Directory of Open Access Journals (Sweden)

    Jonas S Almeida

    2012-01-01

    Full Text Available Background: Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. Materials and Methods: ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Results: Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. Conclusions: The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without

  5. ImageJS: Personalized, participated, pervasive, and reproducible image bioinformatics in the web browser.

    Science.gov (United States)

    Almeida, Jonas S; Iriabho, Egiebade E; Gorrepati, Vijaya L; Wilkinson, Sean R; Grüneberg, Alexander; Robbins, David E; Hackney, James R

    2012-01-01

    Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local "download and installation".

  6. Running an open experiment: transparency and reproducibility in soil and ecosystem science

    Science.gov (United States)

    Bond-Lamberty, Ben; Peyton Smith, A.; Bailey, Vanessa

    2016-08-01

    Researchers in soil and ecosystem science, and almost every other field, are being pushed—by funders, journals, governments, and their peers—to increase transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists however lack experience in, and may be unsure of the benefits of, making their data and fully-reproducible analyses publicly available. Here we describe a recent ‘open experiment’, in which we documented every aspect of a soil incubation online, making all raw data, scripts, diagnostics, final analyses, and manuscripts available in real time. We found that using tools such as version control, issue tracking, and open-source statistical software improved data integrity, accelerated our team’s communication and productivity, and ensured transparency. There are many avenues to improve scientific reproducibility and data availability, of which this is only one example, and it is not an approach suited for every experiment or situation. Nonetheless, we encourage the communities in our respective fields to consider its advantages, and to lead rather than follow with respect to scientific reproducibility, transparency, and data availability.

  7. Improving performance of single-path code through a time-predictable memory hierarchy

    DEFF Research Database (Denmark)

    Cilku, Bekim; Puffitsch, Wolfgang; Prokesch, Daniel

    2017-01-01

    …a time-predictable memory hierarchy with a prefetcher that exploits the predictability of execution traces in single-path code to speed up code execution. The new memory hierarchy reduces both the cache-miss penalty time and the cache-miss rate on the instruction cache. The benefit of the approach is demonstrated through…

  8. Magni Reproducibility Example

    DEFF Research Database (Denmark)

    2016-01-01

    An example of how to use the magni.reproducibility package for storing metadata along with results from a computational experiment. The example is based on simulating the Mandelbrot set.
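    The pattern this record describes, keeping provenance metadata alongside a computed result, can be sketched without the actual magni.reproducibility API (whose interface is not reproduced here); the Mandelbrot computation and the metadata fields below are illustrative.

    ```python
    import json
    import sys
    import time

    # Sketch of storing provenance metadata next to the result of a small
    # computational experiment (a tiny Mandelbrot grid). This illustrates
    # the pattern only; it does not use the magni.reproducibility API.

    def mandelbrot(c, max_iter=50):
        """Escape-time iteration count for one point of the complex plane."""
        z = 0j
        for n in range(max_iter):
            z = z * z + c
            if abs(z) > 2.0:
                return n
        return max_iter

    def run_experiment(nx=40, ny=30, max_iter=50):
        grid = [[mandelbrot(complex(-2 + 3 * i / nx, -1.2 + 2.4 * j / ny),
                            max_iter)
                 for i in range(nx)] for j in range(ny)]
        metadata = {
            "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
            "python_version": sys.version.split()[0],
            "parameters": {"nx": nx, "ny": ny, "max_iter": max_iter},
        }
        return grid, metadata

    grid, metadata = run_experiment()
    record = json.dumps({"metadata": metadata})  # sidecar saved with result
    ```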

  9. Epidemic contact tracing via communication traces.

    Directory of Open Access Journals (Sweden)

    Katayoun Farrahi

    Full Text Available Traditional contact tracing relies on knowledge of the interpersonal network of physical interactions, where contagious outbreaks propagate. However, due to privacy constraints and noisy data assimilation, this network is generally difficult to reconstruct accurately. Communication traces obtained by mobile phones are known to be good proxies for the physical interaction network, and they may provide a valuable tool for contact tracing. Motivated by this assumption, we propose a model for contact tracing, where an infection is spreading in the physical interpersonal network, which can never be fully recovered; and contact tracing is occurring in a communication network which acts as a proxy for the first. We apply this dual model to a dataset covering 72 students over a 9 month period, for which both the physical interactions as well as the mobile communication traces are known. Our results suggest that a wide range of contact tracing strategies may significantly reduce the final size of the epidemic, by mainly affecting its peak of incidence. However, we find that for low overlap between the face-to-face and communication interaction network, contact tracing is only efficient at the beginning of the outbreak, due to rapidly increasing costs as the epidemic evolves. Overall, contact tracing via mobile phone communication traces may be a viable option to arrest contagious outbreaks.

  10. Epidemic contact tracing via communication traces.

    Science.gov (United States)

    Farrahi, Katayoun; Emonet, Rémi; Cebrian, Manuel

    2014-01-01

    Traditional contact tracing relies on knowledge of the interpersonal network of physical interactions, where contagious outbreaks propagate. However, due to privacy constraints and noisy data assimilation, this network is generally difficult to reconstruct accurately. Communication traces obtained by mobile phones are known to be good proxies for the physical interaction network, and they may provide a valuable tool for contact tracing. Motivated by this assumption, we propose a model for contact tracing, where an infection is spreading in the physical interpersonal network, which can never be fully recovered; and contact tracing is occurring in a communication network which acts as a proxy for the first. We apply this dual model to a dataset covering 72 students over a 9 month period, for which both the physical interactions as well as the mobile communication traces are known. Our results suggest that a wide range of contact tracing strategies may significantly reduce the final size of the epidemic, by mainly affecting its peak of incidence. However, we find that for low overlap between the face-to-face and communication interaction network, contact tracing is only efficient at the beginning of the outbreak, due to rapidly increasing costs as the epidemic evolves. Overall, contact tracing via mobile phone communication traces may be a viable option to arrest contagious outbreaks.
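    The dual-network model described above can be illustrated with a toy simulation: infection spreads on a physical contact graph, while tracing only sees a communication graph that partially overlaps it. The graphs, rates, and tracing rule below are illustrative assumptions, not the paper's model.

    ```python
    import random

    # Toy dual-network contact tracing: infection spreads on a "physical"
    # ring graph; tracing quarantines neighbours in a separate
    # "communication" graph that copies each physical edge with
    # probability `overlap`. Everything here is an illustrative assumption.

    def simulate(n=200, p_infect=0.3, overlap=0.7, steps=30, seed=1):
        rng = random.Random(seed)
        physical = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
        comm = {i: [j if rng.random() < overlap else rng.randrange(n)
                    for j in physical[i]] for i in range(n)}
        infected, quarantined = {0}, set()
        for _ in range(steps):
            new = set()
            for i in infected - quarantined:
                for j in physical[i]:          # spread on the physical graph
                    if j not in infected and rng.random() < p_infect:
                        new.add(j)
                quarantined.update(comm[i])    # trace via the comm graph
            infected |= new
        return len(infected)                   # final epidemic size

    final_size = simulate()
    ```

    Rerunning `simulate` with different `overlap` values gives a feel for the paper's observation that tracing efficiency depends on how well the communication network proxies the physical one.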

  11. Solar proton exposure of an ICRU sphere within a complex structure part II: Ray-trace geometry.

    Science.gov (United States)

    Slaba, Tony C; Wilson, John W; Badavi, Francis F; Reddell, Brandon D; Bahadori, Amir A

    2016-06-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z ≤ 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency. Published by Elsevier Ltd.

  12. Reproducing a Prospective Clinical Study as a Computational Retrospective Study in MIMIC-II.

    Science.gov (United States)

    Kury, Fabrício S P; Huser, Vojtech; Cimino, James J

    2015-01-01

    In this paper we sought to reproduce, as a computational retrospective study in an EHR database (MIMIC-II), a recent large prospective clinical study: the 2013 publication, by the Japanese Association for Acute Medicine (JAAM), about disseminated intravascular coagulation, in the journal Critical Care (PMID: 23787004). We designed in SQL and Java a set of electronic phenotypes that reproduced the study's data sampling, and used R to perform the same statistical inference procedures. All produced source code is available online at https://github.com/fabkury/paamia2015. Our program identified 2,257 eligible patients in MIMIC-II, and the results remarkably agreed with the prospective study. A minority of the needed data elements was not found in MIMIC-II, and statistically significant inferences were possible in the majority of the cases.
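    The "electronic phenotype" idea above can be sketched in a few lines: an executable predicate encoding the inclusion criteria, applied to patient records. The field names and cut-offs below are hypothetical stand-ins, not the JAAM criteria or the MIMIC-II schema.

    ```python
    # Sketch of an electronic phenotype: a coded predicate over EHR rows.
    # All field names and thresholds are hypothetical illustrations.

    def eligible(patient):
        """Hypothetical inclusion criteria for a retrospective cohort."""
        return (patient["age"] >= 18
                and patient["icu_los_days"] >= 1
                and patient["platelet_count"] is not None)

    cohort = [
        {"age": 45, "icu_los_days": 3, "platelet_count": 80},
        {"age": 17, "icu_los_days": 5, "platelet_count": 120},   # too young
        {"age": 60, "icu_los_days": 2, "platelet_count": None},  # missing lab
    ]
    selected = [p for p in cohort if eligible(p)]
    ```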

  13. Best Practices for Computational Science: Software Infrastructure and Environments for Reproducible and Extensible Research

    OpenAIRE

    Stodden, Victoria; Miguez, Sheila

    2014-01-01

    The goal of this article is to coalesce a discussion around best practices for scholarly research that utilizes computational methods, by providing a formalized set of best practice recommendations to guide computational scientists and other stakeholders wishing to disseminate reproducible research, facilitate innovation by enabling data and code re-use, and enable broader communication of the output of computational scientific research. Scholarly dissemination and communication standards are...

  14. A unified approach to model uptake kinetics of trace elements in complex aqueous – solid solution systems

    International Nuclear Information System (INIS)

    Thien, Bruno M.J.; Kulik, Dmitrii A.; Curti, Enzo

    2014-01-01

    Highlights:
    • There are several models able to describe trace element partitioning in growing minerals.
    • To describe complex systems, those models must be embedded in a geochemical code.
    • We merged two models into a unified one suitable for implementation in a geochemical code.
    • This unified model was tested against coprecipitation experimental data.
    • We explored how our model reacts to solution depletion effects.
    Abstract: Thermodynamics alone is usually not sufficient to predict growth-rate dependencies of trace element partitioning into host mineral solid solutions. In this contribution, two uptake kinetic models were analyzed that are promising in terms of mechanistic understanding and potential for implementation in geochemical modelling codes. The growth Surface Entrapment Model (Watson, 2004) and the Surface Reaction Kinetic Model (DePaolo, 2011) were shown to be complementary, and under certain assumptions were merged into a single analytical expression. This Unified Uptake Kinetics Model was implemented in the GEMS3K and GEM-Selektor codes (http://gems.web.psi.ch), a Gibbs energy minimization package for geochemical modelling. This implementation extends the applicability of the unified uptake kinetics model to account for non-trivial factors influencing trace element partitioning into solid solutions, such as changes in aqueous solution composition and speciation, or depletion effects in closed geochemical systems

  15. Reproducible research: a minority opinion

    Science.gov (United States)

    Drummond, Chris

    2018-01-01

    Reproducible research, a growing movement within many scientific fields, including machine learning, would require that the code used to generate the experimental results be published along with any paper. Probably the most compelling argument for this is that it is simply following good scientific practice, established over the years by the greats of science. The implication is that failure to follow such a practice is unscientific, not a label any machine learning researcher would like to carry. It is further claimed that misconduct is causing a growing crisis of confidence in science, and that without this practice being enforced, science would inevitably fall into disrepute. This viewpoint is becoming ubiquitous, but here I offer a differing opinion. I argue that, far from being central to science, what is being promulgated is a narrow interpretation of how science works. I contend that the consequences are somewhat overstated. I would also contend that the effort necessary to meet the movement's aims, and the general attitude it engenders, would not serve any of the research disciplines well, including our own.

  16. New challenges in ray tracing simulations of X-ray optics

    International Nuclear Information System (INIS)

    Río, M Sánchez del

    2013-01-01

    The construction of new synchrotron sources and the refurbishment and upgrade of existing ones have boosted interest in recent years in X-ray optics simulations for beamline design and optimization. We recently conducted a full renewal of the well-established SHADOW ray-tracing code, resulting in a modular version, SHADOW3, interfaced to multiple programming languages (C, C++, IDL, Python). Some of the new features of SHADOW3 are presented. From the physics point of view, SHADOW3 has been upgraded to deal with lens systems. X-ray partial-coherence applications demand an extension of traditional ray-tracing methods into a hybrid ray-tracing/wave-optics approach. This software development is essential for fulfilling the requests of the ESRF Upgrade Programme, and some examples of calculations are also presented.

  17. TRACK The New Beam Dynamics Code

    CERN Document Server

    Mustapha, Brahim; Ostroumov, Peter; Schnirman-Lessner, Eliane

    2005-01-01

    The new ray-tracing code TRACK was developed* to fulfill the special requirements of the RIA accelerator systems. The RIA lattice includes an ECR ion source, a LEBT containing a MHB and a RFQ, followed by three SC linac sections separated by two stripping stations with appropriate magnetic transport systems. No available beam dynamics code meets all the necessary requirements for an end-to-end simulation of the RIA driver linac. The latest version of TRACK was used for end-to-end simulations of the RIA driver, including errors and beam-loss analysis.** In addition to the standard capabilities, the code includes the following new features: i) multiple charge states; ii) a realistic stripper model; iii) static and dynamic errors; iv) automatic steering to correct for misalignments; v) detailed beam-loss analysis; vi) parallel computing to perform large-scale simulations. Although primarily developed for simulations of the RIA machine, TRACK is a general beam dynamics code. Currently it is being used for the design and ...

  18. Analysis of Void Fraction Distribution and Departure from Nucleate Boiling in Single Subchannel and Bundle Geometries Using Subchannel, System, and Computational Fluid Dynamics Codes

    Directory of Open Access Journals (Sweden)

    Taewan Kim

    2012-01-01

    Full Text Available In order to assess the accuracy and validity of subchannel, system, and computational fluid dynamics codes, the Paul Scherrer Institut has participated in the OECD/NRC PSBT benchmark with the thermal-hydraulic system code TRACE5.0 developed by the US NRC, the subchannel code FLICA4 developed by CEA, and the computational fluid dynamics code STAR-CD developed by CD-adapco. The PSBT benchmark consists of a series of void distribution exercises and departure from nucleate boiling exercises. The results reveal that the predictions of the subchannel code FLICA4 agree with the experimental data reasonably well in both steady-state and transient conditions. The analyses of the single-subchannel experiments by means of the computational fluid dynamics code STAR-CD with the CD-adapco boiling model indicate that the predicted void fraction shows no significant discrepancy from the experiments. The analyses with TRACE point out the necessity of additional assessment of the subcooled boiling model and bulk condensation model of TRACE.

  19. Generalised tally-based decoders for traitor tracing and group testing

    NARCIS (Netherlands)

    Skoric, B.; de Groot, W.

    2015-01-01

    We propose a new type of score function for Tardos traitor tracing codes. It is related to the recently introduced tally-based score function, but it utilizes more of the information available to the decoder. It does this by keeping track of sequences of symbols in the distributed codewords instead

  20. Best Practices for Computational Science: Software Infrastructure and Environments for Reproducible and Extensible Research

    Directory of Open Access Journals (Sweden)

    Victoria Stodden

    2014-07-01

    Full Text Available The goal of this article is to coalesce a discussion around best practices for scholarly research that utilizes computational methods, by providing a formalized set of best practice recommendations to guide computational scientists and other stakeholders wishing to disseminate reproducible research, facilitate innovation by enabling data and code re-use, and enable broader communication of the output of computational scientific research. Scholarly dissemination and communication standards are changing to reflect the increasingly computational nature of scholarly research, primarily to include the sharing of the data and code associated with published results. We also present these Best Practices as a living, evolving, and changing document at http://wiki.stodden.net/Best_Practices.

  1. Code of Practice on the International Transboundary Movement of Radioactive Waste

    International Nuclear Information System (INIS)

    1990-11-01

    On 21 September 1990, the General Conference, by resolution GC(XXXIV)/RES/530, adopted a Code of Practice on the International Transboundary Movement of Radioactive Waste and requested the Director General - inter alia - to take all necessary steps to ensure wide dissemination of the Code of Practice at both the national and the international level. The Code of Practice was elaborated by a Group of Experts established pursuant to resolution GC(XXXII)/RES/490 adopted by the General Conference in 1988. The text of the Code of Practice is reproduced herewith for the information of all Member States

  6. OPR1000 RCP Flow Coastdown Analysis using SPACE Code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong-Hyuk; Kim, Seyun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    The Korean nuclear industry developed a thermal-hydraulic analysis code for the safety analysis of PWRs, named SPACE (Safety and Performance Analysis Code for Nuclear Power Plant). The current loss-of-flow transient analysis of OPR1000 uses the COAST code to calculate the transient RCS (Reactor Coolant System) flow. The COAST code calculates RCS loop flow using pump performance curves and RCP (Reactor Coolant Pump) inertia. In this paper, the SPACE code is used to reproduce the RCS flowrates calculated by the COAST code. A loss-of-flow transient is a transient initiated by a reduction of forced reactor coolant circulation. Typical loss-of-flow transients are complete loss of flow (CLOF) and locked rotor (LR). The OPR1000 RCP flow coastdown analysis was performed with SPACE using a simplified nodalization. A complete loss of flow (4 RCP trip) was analyzed. The results show good agreement with those from COAST, the CE code for calculating RCS flow during loss-of-flow transients. Through this study, we confirmed that the SPACE code can be used instead of the COAST code for RCP flow coastdown analysis.
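
    The physics behind an RCP coastdown calculation can be illustrated with a toy model: a pump flywheel decelerating against a hydraulic torque roughly proportional to the square of its speed, with loop flow taken proportional to pump speed. This is only an illustrative sketch under those assumptions, not the COAST or SPACE model:

    ```python
    def coastdown_curve(omega0, inertia, k, dt, n_steps):
        """Explicit-Euler integration of I*domega/dt = -k*omega**2.

        omega0: initial pump speed, inertia: flywheel moment of inertia,
        k: hydraulic torque coefficient (T = k*omega**2). Returns the
        speed history; loop flow scales with speed.
        """
        omega = omega0
        history = [omega]
        for _ in range(n_steps):
            omega += dt * (-k * omega * omega / inertia)
            history.append(omega)
        return history
    ```

    For this torque law the closed-form solution is omega(t) = omega0 / (1 + (k*omega0/I)*t), the familiar hyperbolic flow decay used in hand checks of coastdown curves.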

  7. An open source approach to enable the reproducibility of scientific workflows in the ocean sciences

    Science.gov (United States)

    Di Stefano, M.; Fox, P. A.; West, P.; Hare, J. A.; Maffei, A. R.

    2013-12-01

    Every scientist should be able to rerun data analyses conducted by his or her team and regenerate the figures in a paper. However, all too often the correct version of a script goes missing, the original raw data is filtered by hand with the filtering process undocumented, or there is a lack of collaboration and communication among scientists working in a team. Here we present 3 different use cases in ocean sciences in which end-to-end workflows are tracked. The main tool deployed to address these use cases is based on a web application (IPython Notebook) that provides the ability to work on very diverse and heterogeneous data and information sources, providing an effective way to share and track changes to the source code used to generate data products and associated metadata, as well as to track the overall workflow provenance to allow versioned reproducibility of a data product. The use cases selected for this work are: 1) A partial reproduction of the Ecosystem Status Report (ESR) for the Northeast U.S. Continental Shelf Large Marine Ecosystem. Our goal with this use case is to enable not just the traceability but also the reproducibility of this biannual report, keeping track of all the processes behind the generation and validation of time-series and spatial data and information products. An end-to-end workflow with code versioning is developed so that indicators in the report may be traced back to the source datasets. 2) Real-time generation of web pages to visualize one of the environmental indicators from the Ecosystem Advisory for the Northeast Shelf Large Marine Ecosystem web site. 3) Data and visualization integration for ocean climate forecasting. In this use case, we focus on a workflow describing how to provide access to online data sources in the NetCDF format and other model data, and make use of multicore processing to generate video animation from time series of gridded data. For each use case we show how complete workflows

  8. Simulations Of Neutron Beam Optic For Neutron Radiography Collimator Using Ray Tracing Methodology

    International Nuclear Information System (INIS)

    Norfarizan Mohd Said; Muhammad Rawi Mohamed Zin

    2014-01-01

    Ray-tracing is a technique for simulating the performance of neutron instruments. McStas, an open-source software package based on a meta-language, is a tool for carrying out ray-tracing simulations. The program has been successfully applied to investigating neutron guide design, flux optimization and other related areas with high complexity and precision. The aim of this paper is to discuss the implementation of the ray-tracing technique with McStas for simulating the performance of the neutron collimation system developed for the imaging system of the TRIGA RTP reactor. The code for the simulation was developed and the results are presented. The analysis of the performance is reported and discussed. (author)

  9. Performance Tuning of x86 OpenMP Codes with MAQAO

    Science.gov (United States)

    Barthou, Denis; Charif Rubial, Andres; Jalby, William; Koliai, Souad; Valensi, Cédric

    Failing to find the best optimization sequence for a given application can lead to compiler-generated code with poor performance or inappropriate code. It is necessary to analyze performance from the generated assembly code to improve the compilation process. This paper presents a tool for the performance analysis of multithreaded codes (OpenMP programs are supported at the moment). MAQAO relies on static performance evaluation to identify compiler optimizations and assess the performance of loops. It exploits static binary rewriting for reading and instrumenting object files or executables. Static binary instrumentation allows the insertion of probes at the instruction level. Memory accesses can be captured to help tune the code, but such traces need to be compressed. MAQAO can analyze the results and provide hints for tuning the code. We show on some examples how this can help users improve their OpenMP applications.

  10. Trace conditioning in insects-keep the trace!

    Science.gov (United States)

    Dylla, Kristina V; Galili, Dana S; Szyszka, Paul; Lüdke, Alja

    2013-01-01

    Trace conditioning is a form of associative learning that can be induced by presenting a conditioned stimulus (CS) and an unconditioned stimulus (US) following each other, but separated by a temporal gap. This gap distinguishes trace conditioning from classical delay conditioning, where the CS and US overlap. To bridge the temporal gap between both stimuli and to form an association between CS and US in trace conditioning, the brain must keep a neural representation of the CS after its termination-a stimulus trace. Behavioral and physiological studies on trace and delay conditioning revealed similarities between the two forms of learning, such as similar memory decay and similar odor identity perception in invertebrates. On the other hand, differences were also reported, such as the requirement of distinct brain structures in vertebrates or disparities in molecular mechanisms in both vertebrates and invertebrates. For example, in commonly used vertebrate conditioning paradigms the hippocampus is necessary for trace but not for delay conditioning, and Drosophila delay conditioning requires the Rutabaga adenylyl cyclase (Rut-AC), which is dispensable in trace conditioning. It is still unknown how the brain encodes CS traces and how they are associated with a US in trace conditioning. Insects serve as powerful models to address the mechanisms underlying trace conditioning, due to their simple brain anatomy, behavioral accessibility and established methods of genetic interference. In this review we summarize the recent progress in insect trace conditioning on the behavioral and physiological level and emphasize similarities and differences compared to delay conditioning. Moreover, we examine proposed molecular and computational models and reassess different experimental approaches used for trace conditioning.

  11. Unveiling Exception Handling Bug Hazards in Android Based on GitHub and Google Code Issues

    NARCIS (Netherlands)

    Coelho, R.; Almeida, L.; Gousios, G.; Van Deursen, A.

    2015-01-01

    This paper reports on a study mining the exception stack traces included in 159,048 issues reported on Android projects hosted in GitHub (482 projects) and Google Code (157 projects). The goal of this study is to investigate whether stack trace information can reveal bug hazards related to exception

  12. Automated Generation of Technical Documentation and Provenance for Reproducible Research

    Science.gov (United States)

    Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.

    2017-12-01

    Data provenance and detailed technical documentation are essential components of high-quality reproducible research, yet they are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right, as it is a time-consuming and often tedious process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, this can be a poor substitute for good technical documentation and is often more difficult for a third party to understand - particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.

  13. Image Tracing: An Analysis of Its Effectiveness in Children's Pictorial Discrimination Learning

    Science.gov (United States)

    Levin, Joel R.; And Others

    1977-01-01

    A total of 45 fifth grade students were the subjects of an experiment offering support for a component of learning strategy (memory imagery). Various theoretical explanations of the image-tracing phenomenon are considered, including depth of processing, dual coding and frequency. (MS)

  14. TRACE/PARCS analysis of the OECD/NEA Oskarshamn-2 BWR stability benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Kozlowski, T. [Univ. of Illinois, Urbana-Champaign, IL (United States); Downar, T.; Xu, Y.; Wysocki, A. [Univ. of Michigan, Ann Arbor, MI (United States); Ivanov, K.; Magedanz, J.; Hardgrove, M. [Pennsylvania State Univ., Univ. Park, PA (United States); March-Leuba, J. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Hudson, N.; Woodyatt, D. [Nuclear Regulatory Commission, Rockville, MD (United States)

    2012-07-01

    On February 25, 1999, the Oskarshamn-2 NPP experienced a stability event which culminated in diverging power oscillations with a decay ratio of about 1.4. The event was successfully modeled by the TRACE/PARCS coupled code system, and further analysis of the event is described in this paper. The results show very good agreement with the plant data, capturing the entire behavior of the transient including the onset of instability, growth of the oscillations (decay ratio) and oscillation frequency. This provides confidence in the prediction of other parameters which are not available from the plant records. The event provides coupled code validation for a challenging BWR stability event, which involves the accurate simulation of neutron kinetics (NK), thermal-hydraulics (TH), and TH/NK coupling. The success of this work has demonstrated the ability of the 3-D coupled systems code TRACE/PARCS to capture the complex behavior of BWR stability events. The problem was released as an international OECD/NEA benchmark, and it is the first benchmark based on measured plant data for a stability event with a DR greater than one. Interested participants are invited to contact the authors for more information. (authors)

  15. Validation of the TRACE code for the system dynamic simulations of the molten salt reactor experiment and the preliminary study on the dual fluid molten salt reactor

    International Nuclear Information System (INIS)

    He, Xun

    2016-01-01

    MSR concept using mathematical tools. In particular, the aim of the first part is to demonstrate the suitability of the TRACE code for similar MSR designs by using a modified version of the TRACE code to perform simulations of steady-state, transient and accidental conditions. The basic approach of this part is to couple the thermal-hydraulic model and the modified point-kinetic model. The equivalent thermal-hydraulic model of the MSRE was built in 1D with three loops, including all the critical main components. The point-kinetic model was improved by considering the precursor drift in order to produce more realistic results for the delayed neutron behavior. Additionally, new working fluids, namely the molten salts, were embedded into the source code of TRACE. Most results of the simulations show good agreement with the ORNL reports and with another recent study, and the errors were predictable and in an acceptable range. Therefore, the necessary code modification of TRACE appears to be successful, and the model will be refined and its functions extended further in order to investigate new MSR designs. The other part of this thesis is a preliminary study on a new molten salt reactor concept, the Dual Fluid Reactor (DFR). The DFR belongs to the group of molten salt fast reactors (MSFR) and has recently been considered an option for minimum-waste and inherently safe operation of nuclear reactors in the future. The DFR uses two separately circulating fluids in the reactor core. One is the fuel salt, based on a mixture of tri-chlorides of uranium and plutonium (UCl_3-PuCl_3), while the other is the coolant, composed of pure lead (Pb). The current work focuses on the basic dynamic behavior of a scaled-down DFR with 500 MW thermal output (DFR-500) instead of its reference design with 3000 MW thermal output (DFR-3000).
For this purpose 10 parallel single fuel channels, as the representative samples

  17. Extensible, Reusable, and Reproducible Computing: A Case Study of PySPH

    International Nuclear Information System (INIS)

    Ramachandran, Prabhu

    2016-01-01

    In this work, the Smoothed Particle Hydrodynamics (SPH) technique is considered as an example of a typical computational research area. PySPH is an open source framework for SPH computations. PySPH is designed to be easy to use. The framework allows a user to implement an entire simulation in pure Python. It is designed to make it easy for scientists to reuse their code and extend the work of others. These important features allow PySPH to facilitate reproducible computational research. Based on the experience with PySPH, general recommendations are suggested for other computational researchers. (paper)

  18. Trace element studies in bioenvironmental samples using 3-MeV protons

    International Nuclear Information System (INIS)

    Walter, R.L.; Willis, R.D.; Gutknecht, W.F.

    1974-01-01

    Trace metal compositions of a wide range of biological, environmental, medical and clinical samples were investigated using proton-induced X-ray emission analysis (PIXEA). The X-rays were detected with a Si(Li) detector and spectra from over 3000 irradiations have been recorded on magnetic tape. The χ² fitting code TRACE developed at our laboratory was used in a semi-automatic mode to extract abundances of elements from S to Cd. Various methods of overcoming analytical problems and specimen preparation difficulties are reported. Results from some samples for typical studies are illustrated along with the reasons for interest in the sample types

  19. Neutron transport study based on assembly modular ray tracing MOC method

    International Nuclear Information System (INIS)

    Tian Chao; Zheng Youqi; Li Yunzhao; Li Shuo; Chai Xiaoming

    2015-01-01

    It is difficult for the MOC method based on Cell Modular Ray Tracing to deal with irregular geometry such as the water gap between PWR lattices. Hence, the neutron transport code NECP-Medlar, based on Assembly Modular Ray Tracing, has been developed. The CMFD method is used to accelerate the transport calculation. The numerical results of the 2D C5G7 benchmark and a typical PWR lattice prove that NECP-Medlar has excellent performance in terms of accuracy and efficiency. Besides, NECP-Medlar can clearly describe the flux distribution of a lattice with a water gap. (authors)

  20. A Denotational Semantics for Communicating Unstructured Code

    Directory of Open Access Journals (Sweden)

    Nils Jähnig

    2015-03-01

    Full Text Available An important property of programming language semantics is that they should be compositional. However, unstructured low-level code contains goto-like commands, making it hard to define a semantics that is compositional. In this paper, we follow the ideas of Saabas and Uustalu to structure low-level code. This gives us the possibility to define a compositional denotational semantics based on least fixed points that allows for the use of inductive verification methods. We capture the semantics of communication using finite traces similar to the denotations of CSP. In addition, we examine properties of this semantics and give an example that demonstrates reasoning about communication and jumps. With this semantics, we lay the foundations for a proof calculus that captures both the semantics of unstructured low-level code and communication.
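
    The flavor of a finite-trace denotational semantics can be conveyed with a toy model in which a program denotes the set of communication traces it may produce: sequential composition concatenates traces and nondeterministic choice unions the sets. This sketch is only in the spirit of CSP-style trace denotations, not the paper's construction (which also handles jumps and least fixed points):

    ```python
    def skip():
        """The program that communicates nothing: one behaviour, the empty trace."""
        return [[]]

    def comm(event):
        """A single communication action, e.g. 'ch!1' (send) or 'ch?x' (receive)."""
        return [[event]]

    def seq(p, q):
        """Sequential composition: every trace of p followed by every trace of q."""
        return [t1 + t2 for t1 in p for t2 in q]

    def choice(p, q):
        """Nondeterministic choice: the union of both sets of traces."""
        return p + q
    ```

    Compositionality shows up directly: the denotation of `seq(p, q)` depends only on the denotations of `p` and `q`, which is exactly the property that goto-laden code makes hard to obtain without first structuring it.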

  1. Adjustments in Almod3W2 transient analysis code to fit Angra 1 NPP experimental data

    International Nuclear Information System (INIS)

    Madeira, A.A.; Camargo, C.T.M.

    1988-01-01

    Some minor modifications were introduced into the ALMOD3W2 code as a consequence of the interest in reproducing the full load rejection test at the Angra 1 NPP. These modifications proved adequate when code results were compared with experimental data. (author) [pt

  2. Model with Peach Bottom Turbine trip and thermal-Hydraulic code TRACE V5P3

    International Nuclear Information System (INIS)

    Mesado, C.; Miro, R.; Barrachina, T.; Verdu, G.

    2014-01-01

    This work is the continuation of work presented at the thirty-ninth annual meeting of the Spanish Nuclear Society, where the semi-automatic translation of the TRAC-BF1 Peach Bottom Turbine Trip thermal-hydraulic model to TRACE was presented. This article is intended to validate the model obtained in TRACE by comparing the results of the translated model with the results of the NEA/OECD BWR Peach Bottom Turbine Trip (PBTT) benchmark, in particular extreme scenario 2 of exercise 3, in which a reactor SCRAM occurs. Among the data present in the benchmark (transient) are: total power, axial power profile, dome pressure, and total reactivity and its components. (Author)

  3. Prediction of adiabatic bubbly flows in TRACE using the interfacial area transport equation

    International Nuclear Information System (INIS)

    Talley, J.; Worosz, T.; Kim, S.; Mahaffy, J.; Bajorek, S.; Tien, K.

    2011-01-01

    The conventional thermal-hydraulic reactor system analysis codes utilize a two-field, two-fluid formulation to model two-phase flows. To close this model, static flow regime transition criteria and algebraic relations are utilized to estimate the interfacial area concentration (a_i). To better reflect the continuous evolution of two-phase flow, an experimental version of TRACE is being developed which implements the interfacial area transport equation (IATE) to replace the flow-regime-based approach. Dynamic estimation of a_i is provided through the use of mechanistic models for bubble coalescence and disintegration. To account for the differences in bubble interactions and drag forces, two-group bubble transport is sought. As such, Group 1 accounts for the transport of spherical and distorted bubbles, while Group 2 accounts for the cap, slug, and churn-turbulent bubbles. Based on this categorization, a two-group IATE applicable to the range of dispersed two-phase flows has been previously developed. Recently, a one-group, one-dimensional, adiabatic IATE has been implemented into the TRACE code with mechanistic models accounting for: (1) bubble breakup due to turbulent impact of an eddy on a bubble, (2) bubble coalescence due to random collision driven by turbulent eddies, and (3) bubble coalescence due to the acceleration of a bubble in the wake region of a preceding bubble. To demonstrate the enhancement of the code's capability using the IATE, experimental data for a_i, void fraction, and bubble velocity measured by a multi-sensor conductivity probe are compared to both the IATE and flow-regime-based predictions. In total, 50 air-water vertical co-current upward and downward bubbly flow conditions in pipes with diameters ranging from 2.54 to 20.32 cm are evaluated. It is found that TRACE, using the conventional flow regime relation, always underestimates a_i. Moreover, the axial trend of the a_i prediction is always quasi-linear because a_i in the
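
    As a rough illustration of what the IATE tracks: for spherical bubbles the interfacial area concentration relates to void fraction and Sauter mean diameter as a_i = 6*alpha/D_sm, and the one-group equation advances a_i with breakup and coalescence source terms. The sketch below is schematic only; the real mechanistic closures (turbulent impact, random collision, wake entrainment) are far more involved than the precomputed rates assumed here, and the names are ours, not TRACE's:

    ```python
    def interfacial_area_concentration(alpha, d_sm):
        """a_i = 6*alpha / D_sm for spherical bubbles.

        alpha: void fraction (-), d_sm: Sauter mean bubble diameter (m);
        returns a_i in 1/m.
        """
        return 6.0 * alpha / d_sm

    def advance_ai(ai, dt, s_ti, s_rc, s_we):
        """One explicit step of a zero-dimensional one-group IATE:
        da_i/dt = S_TI - S_RC - S_WE, where S_TI is a breakup source
        (turbulent impact) and S_RC, S_WE are coalescence sinks (random
        collision, wake entrainment), all supplied here as fixed rates.
        """
        return ai + dt * (s_ti - s_rc - s_we)
    ```

    The point of the dynamic formulation is visible even in this toy: a_i evolves continuously from the local source and sink rates instead of jumping between values fixed by a static flow regime map.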

  4. CMCpy: Genetic Code-Message Coevolution Models in Python

    Science.gov (United States)

    Becich, Peter J.; Stark, Brian P.; Bhat, Harish S.; Ardell, David H.

    2013-01-01

    Code-message coevolution (CMC) models represent coevolution of a genetic code and a population of protein-coding genes (“messages”). Formally, CMC models are sets of quasispecies coupled together for fitness through a shared genetic code. Although CMC models display plausible explanations for the origin of multiple genetic code traits by natural selection, useful modern implementations of CMC models are not currently available. To meet this need we present CMCpy, an object-oriented Python API and command-line executable front-end that can reproduce all published results of CMC models. CMCpy implements multiple solvers for leading eigenpairs of quasispecies models. We also present novel analytical results that extend and generalize applications of perturbation theory to quasispecies models and pioneer the application of a homotopy method for quasispecies with non-unique maximally fit genotypes. Our results therefore facilitate the computational and analytical study of a variety of evolutionary systems. CMCpy is free open-source software available from http://pypi.python.org/pypi/CMCpy/. PMID:23532367
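
    A quasispecies model of the kind CMCpy couples can be sketched compactly: build W = M·diag(f) from a mutation matrix M and fitness vector f, then take the leading eigenpair, whose eigenvalue is the equilibrium mean fitness and whose eigenvector gives the equilibrium genotype frequencies. This is a generic textbook sketch with names of our choosing, not CMCpy's API or one of its solvers:

    ```python
    def quasispecies_matrix(fitness, mutation):
        """W[i][j] = mutation[i][j] * fitness[j]: replicate genotype j at
        rate f[j], mutating it into genotype i with probability M[i][j]."""
        n = len(fitness)
        return [[mutation[i][j] * fitness[j] for j in range(n)] for i in range(n)]

    def leading_eigenpair(w, iters=1000):
        """Power iteration for the dominant eigenpair of a non-negative
        matrix, normalizing with the 1-norm so the eigenvector is a
        frequency distribution and the eigenvalue the mean fitness."""
        n = len(w)
        v = [1.0 / n] * n
        lam = 0.0
        for _ in range(iters):
            nxt = [sum(w[i][j] * v[j] for j in range(n)) for i in range(n)]
            lam = sum(nxt)
            v = [x / lam for x in nxt]
        return lam, v
    ```

    Power iteration fails when the maximally fit genotype is not unique and the spectral gap closes, which is precisely the regime where the homotopy method mentioned in the abstract becomes useful.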

  5. Code of ethics and conduct for European nursing.

    Science.gov (United States)

    Sasso, Loredana; Stievano, Alessandro; González Jurado, Máximo; Rocco, Gennaro

    2008-11-01

    A main identifying factor of professions is professionals' willingness to comply with ethical and professional standards, often defined in a code of ethics and conduct. In a period of intense nursing mobility, if the public are aware that health professionals have committed themselves to the drawing up of a code of ethics and conduct, they will have more trust in the health professional they choose, especially if this person comes from another European Member State. The Code of Ethics and Conduct for European Nursing is a programmatic document for the nursing profession constructed by the FEPI (European Federation of Nursing Regulators) according to Directive 2005/36/EC on the recognition of professional qualifications and Directive 2006/123/EC on services in the internal market, set out by the European Commission. This article describes the construction of the Code and gives an overview of some specific areas of importance. The main text of the Code is reproduced in Appendix 1.

  6. Reproducibility in a multiprocessor system

    Science.gov (United States)

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system, and a scan of the system state.

  7. Combined micro-droplet and thin-film-assisted pre-concentration of lead traces for on-line monitoring using anodic stripping voltammetry.

    Science.gov (United States)

    Belostotsky, Inessa; Gridin, Vladimir V; Schechter, Israel; Yarnitzky, Chaim N

    2003-02-01

    An improved analytical method for airborne lead traces is reported. It is based on using a Venturi scrubber sampling device for simultaneous thin-film stripping and droplet entrapment of aerosol influxes. At least threefold enhancement of the lead-trace pre-concentration is achieved. The sampled traces are analyzed by square-wave anodic stripping voltammetry. The method was tested by a series of pilot experiments. These were performed using contaminant-controlled air intakes. Reproducible calibration plots were obtained. The data were validated by traditional analysis using filter sampling. LODs are comparable with the conventional techniques. The method was successfully applied to on-line and in situ environmental monitoring of lead.

  8. Combined micro-droplet and thin-film-assisted pre-concentration of lead traces for on-line monitoring using anodic stripping voltammetry

    Energy Technology Data Exchange (ETDEWEB)

    Belostotsky, Inessa; Gridin, Vladimir V.; Schechter, Israel; Yarnitzky, Chaim N. [Department of Chemistry, Technion Israel Institute of Technology, 32000, Haifa (Israel)

    2003-02-01

    An improved analytical method for airborne lead traces is reported. It is based on using a Venturi scrubber sampling device for simultaneous thin-film stripping and droplet entrapment of aerosol influxes. At least threefold enhancement of the lead-trace pre-concentration is achieved. The sampled traces are analyzed by square-wave anodic stripping voltammetry. The method was tested by a series of pilot experiments. These were performed using contaminant-controlled air intakes. Reproducible calibration plots were obtained. The data were validated by traditional analysis using filter sampling. LODs are comparable with the conventional techniques. The method was successfully applied to on-line and in situ environmental monitoring of lead. (orig.)

  9. Towards Reproducible Research Data Analyses in LHC Particle Physics

    CERN Document Server

    Simko, Tibor

    2017-01-01

    The reproducibility of research data analysis requires having access not only to the original datasets, but also to the computing environment, the analysis software, and the workflow used to produce the original results. We present the nascent CERN Analysis Preservation platform with a set of tools developed to support particle physics researchers in preserving the knowledge around analyses so that capturing, sharing, reusing and reinterpreting data becomes easier. The presentation will focus on three pillars: (i) capturing structured knowledge information about data analysis processes; (ii) capturing the computing environment, the software code, the datasets, the configuration and other information assets used in data analyses; (iii) re-instantiation of preserved analyses on a containerised computing cloud for the purposes of re-validation and re-interpretation.

  10. Piezoelectric trace vapor calibrator

    International Nuclear Information System (INIS)

    Verkouteren, R. Michael; Gillen, Greg; Taylor, David W.

    2006-01-01

    The design and performance of a vapor generator for calibration and testing of trace chemical sensors are described. The device utilizes piezoelectric ink-jet nozzles to dispense and vaporize precisely known amounts of analyte solutions as monodisperse droplets onto a hot ceramic surface, where the generated vapors are mixed with air before exiting the device. Injected droplets are monitored by microscope with strobed illumination, and the reproducibility of droplet volumes is optimized by adjustment of piezoelectric waveform parameters. Complete vaporization of the droplets occurs only across a 10 °C window within the transition boiling regime of the solvent, and the minimum and maximum rates of trace analyte that may be injected and evaporated are determined by thermodynamic principles and empirical observations of droplet formation and stability. By varying solution concentrations, droplet injection rates, air flow, and the number of active nozzles, the system is designed to deliver, on demand, continuous vapor concentrations across more than six orders of magnitude (nominally 290 fg/l to 1.05 μg/l). Vapor pulses containing femtogram to microgram quantities of analyte may also be generated. Calibrated ranges of three explosive vapors at ng/l levels were generated by the device and directly measured by ion mobility spectrometry (IMS). These data demonstrate expected linear trends within the limited working range of the IMS detector and also exhibit subtle nonlinear behavior from the IMS measurement process.
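
    The delivered concentration in a generator of this kind follows from a simple mass balance: analyte mass per droplet, times injection rate and nozzle count, divided by the dilution air flow. A minimal sketch of that balance follows; all numerical values are illustrative assumptions, not the instrument's actual operating parameters.

```python
def vapor_concentration(droplet_volume_pl, solution_conc_ug_per_ml,
                        rate_hz, n_nozzles, air_flow_l_per_min):
    """Steady-state vapor concentration (ug/L) from a droplet-on-demand generator."""
    # analyte mass per droplet (ug): droplet volume (pL -> mL) x solution concentration
    mass_per_droplet_ug = droplet_volume_pl * 1e-9 * solution_conc_ug_per_ml
    mass_rate_ug_per_s = mass_per_droplet_ug * rate_hz * n_nozzles
    air_flow_l_per_s = air_flow_l_per_min / 60.0
    return mass_rate_ug_per_s / air_flow_l_per_s

# dilute solution, one nozzle, slow drops -> femtogram-per-liter range
low = vapor_concentration(25, 0.2, 1, 1, 1.0)       # ~3e-7 ug/L (about 300 fg/L)
# concentrated solution, four nozzles, fast drops -> microgram-per-liter range
high = vapor_concentration(25, 1000, 1000, 4, 1.0)  # ~6 ug/L
```

    Varying the four operating parameters over plausible ranges spans roughly seven orders of magnitude in concentration, consistent with the dynamic range the abstract describes.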

  11. Contextual sensitivity in scientific reproducibility

    Science.gov (United States)

    Van Bavel, Jay J.; Mende-Siedlecki, Peter; Brady, William J.; Reinero, Diego A.

    2016-01-01

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals, found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility. PMID:27217556

  12. Contextual sensitivity in scientific reproducibility.

    Science.gov (United States)

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals, found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.

  13. ADLIB: A simple database framework for beamline codes

    International Nuclear Information System (INIS)

    Mottershead, C.T.

    1993-01-01

    There are many well-developed codes available for beamline design and analysis. A significant fraction of each of these codes is devoted to processing its own unique input language for describing the problem. None of these large, complex, and powerful codes does everything. Adding a new bit of specialized physics can be a difficult task whose successful completion makes the code even larger and more complex. This paper describes an attempt to move in the opposite direction, toward a family of small, simple, single-purpose physics and utility modules, linked by an open, portable, public domain database framework. These small specialized physics codes begin with the beamline parameters already loaded in the database, and accessible via the handful of subroutines that constitute ADLIB. Such codes are easier to write, and inherently organized in a manner suitable for incorporation in model-based control system algorithms. Examples include programs for analyzing beamline misalignment sensitivities, for simulating and fitting beam steering data, and for translating among MARYLIE, TRANSPORT, and TRACE3D formats.

  14. Standing Together for Reproducibility in Large-Scale Computing: Report on reproducibility@XSEDE

    OpenAIRE

    James, Doug; Wilkins-Diehr, Nancy; Stodden, Victoria; Colbry, Dirk; Rosales, Carlos; Fahey, Mark; Shi, Justin; Silva, Rafael F.; Lee, Kyo; Roskies, Ralph; Loewe, Laurence; Lindsey, Susan; Kooper, Rob; Barba, Lorena; Bailey, David

    2014-01-01

    This is the final report on reproducibility@xsede, a one-day workshop held in conjunction with XSEDE14, the annual conference of the Extreme Science and Engineering Discovery Environment (XSEDE). The workshop's discussion-oriented agenda focused on reproducibility in large-scale computational research. Two important themes capture the spirit of the workshop submissions and discussions: (1) organizational stakeholders, especially supercomputer centers, are in a unique position to promote, enab...

  15. Theory of reproducing kernels and applications

    CERN Document Server

    Saitoh, Saburou

    2016-01-01

    This book provides a large extension of the general theory of reproducing kernels published by N. Aronszajn in 1950, with many concrete applications. In Chapter 1, many concrete reproducing kernels are first introduced with detailed information. Chapter 2 presents a general and global theory of reproducing kernels with basic applications in a self-contained way. Many fundamental operations among reproducing kernel Hilbert spaces are dealt with. Chapter 2 is the heart of this book. Chapter 3 is devoted to the Tikhonov regularization using the theory of reproducing kernels with applications to numerical and practical solutions of bounded linear operator equations. In Chapter 4, the numerical real inversion formulas of the Laplace transform are presented by applying the Tikhonov regularization, where the reproducing kernels play a key role in the results. Chapter 5 deals with ordinary differential equations; Chapter 6 includes many concrete results for various fundamental partial differential equations. In Chapt...
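
    The central identity behind this theory, the reproducing property f(y) = ⟨f, K(·, y)⟩, can be checked concretely in a finite-dimensional case. The sketch below is an illustrative example chosen by the editor, not taken from the book: the two-dimensional RKHS of affine functions f(x) = a + bx on the real line, with kernel K(x, y) = 1 + xy and the Euclidean inner product on coefficient pairs.

```python
# RKHS of affine functions f(x) = a + b*x, represented by coefficient pairs (a, b).
# With kernel K(x, y) = 1 + x*y, the kernel section K(., y) has coefficients (1, y).

def inner(f, g):
    """Inner product on coefficients; makes {1, x} an orthonormal basis."""
    return f[0] * g[0] + f[1] * g[1]

def kernel_section(y):
    """Coefficients of K(., y)."""
    return (1.0, y)

def evaluate(f, x):
    return f[0] + f[1] * x

f = (2.0, -3.0)  # f(x) = 2 - 3x
y = 0.7
# Reproducing property: evaluating f at y equals pairing f with K(., y).
assert inner(f, kernel_section(y)) == evaluate(f, y)
```

    The same pattern, point evaluation realized as an inner product against a kernel section, is what the Tikhonov-regularization and inversion results in the later chapters exploit in infinite dimensions.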

  16. Trace coupled with PARCS benchmark against Leibstadt plant data during the turbine trip test

    Energy Technology Data Exchange (ETDEWEB)

    Sekhri, Abdelkrim; Baumann, Peter, E-mail: abdelkrim.sekhri@kkl.ch, E-mail: peter.Baumann@kkl.ch [Kernkraftwerk Leibstadt AG, Leibstadt (Switzerland); Hidalga, Patricio; Morera, Daniel; Miro, Rafael; Barrachina, Teresa; Verdu, Gumersindo, E-mail: pathigar@etsii.upv.es, E-mail: dmorera@isirym.upv.es, E-mail: rmiro@isirym.upv.es, E-mail: tbarrachina@isirym.upv.es, E-mail: gverdu@isirym.upv.es [Universitat Politecnica de Valencia (ISIRYM/UPV), Valencia, (Spain). Institute for Industrial, Radiophysical and Environmental Safety

    2013-07-01

    In order to enhance the modeling of the Leibstadt Nuclear Power Plant (KKL), a coupling of the 3D neutron kinetics code PARCS with TRACE has been developed. To test its performance, a complex turbine trip transient has been simulated and the results compared with the existing plant data from the turbine trip test. Cross sections have also been generated for this transient and used by PARCS. The thermal-hydraulic TRACE model is retrieved from the already existing model. For the benchmark, the turbine trip transient has been simulated according to the test, resulting in the closure of the turbine control valve (TCV) and the subsequent opening of the bypass valve (TBV). This transient caused a pressure shock wave towards the Reactor Pressure Vessel (RPV), which provoked a decrease of the void level and a consequent slight power excursion. The power control capacity of the system showed a good response with the procedure of a Selected Rod Insertion (SRI), and the recirculation loop performance resulted in a thermal power reduction comparable to the APRM data recorded at the plant. The comparison with plant data shows good agreement in general and assesses the performance of the coupled model. It can therefore be concluded that the coupling of the PARCS and TRACE codes, together with the cross sections used, works successfully for simulating the behavior of the reactor core during complex plant transients. Nevertheless, the TRACE model shall be improved, and the core neutronics corresponding to the test shall be used in the future to allow quantitative comparison between TRACE and plant recorded data. (author)

  17. Trace conditioning in insects – Keep the trace!

    Directory of Open Access Journals (Sweden)

    Kristina V Dylla

    2013-08-01

    Trace conditioning is a form of associative learning that can be induced by presenting a conditioned stimulus (CS) and an unconditioned stimulus (US) following each other, but separated by a temporal gap. This gap distinguishes trace conditioning from classical delay conditioning, where the CS and US overlap. To bridge the temporal gap between both stimuli and to form an association between CS and US in trace conditioning, the brain must keep a neural representation of the CS after its termination, a stimulus trace. Behavioral and physiological studies on trace and delay conditioning revealed similarities between the two forms of learning, such as similar memory decay and similar odor identity perception in invertebrates. On the other hand, differences have also been reported, such as the requirement of distinct brain structures in vertebrates or disparities in molecular mechanisms in both vertebrates and invertebrates. For example, in commonly used vertebrate conditioning paradigms the hippocampus is necessary for trace but not for delay conditioning, and Drosophila delay conditioning requires the Rutabaga adenylyl cyclase, which is dispensable in trace conditioning. It is still unknown how the brain encodes CS traces and how they are associated with a US in trace conditioning. Insects serve as powerful models to address the mechanisms underlying trace conditioning, due to their simple brain anatomy, behavioral accessibility, and established methods of genetic interference. In this review we summarize the recent progress in insect trace conditioning at the behavioral and physiological level and emphasize similarities and differences compared to delay conditioning. Moreover, we examine proposed molecular and computational models and reassess different experimental approaches used for trace conditioning.

  18. Repeat: a framework to assess empirical reproducibility in biomedical research

    Directory of Open Access Journals (Sweden)

    Leslie D. McIntosh

    2017-09-01

    Background: The reproducibility of research is essential to rigorous science, yet significant concerns about the reliability and verifiability of biomedical research have recently been highlighted. Ongoing efforts across several domains of science and policy are working to clarify the fundamental characteristics of reproducibility and to enhance the transparency and accessibility of research. Methods: The aim of the present work is to develop an assessment tool operationalizing key concepts of research transparency in the biomedical domain, specifically for secondary biomedical data research using electronic health record data. The tool (RepeAT) was developed through a multi-phase process that involved coding and extracting recommendations and practices for improving reproducibility from publications and reports across the biomedical and statistical sciences, field testing the instrument, and refining variables. Results: RepeAT includes 119 unique variables grouped into five categories (research design and aim, database and data collection methods, data mining and data cleaning, data analysis, data sharing and documentation). Preliminary results from manually processing 40 scientific manuscripts indicate components of the proposed framework with strong inter-rater reliability, as well as directions for further research and refinement of RepeAT. Conclusions: The use of RepeAT may allow the biomedical community to better understand current practices of research transparency and accessibility among principal investigators. Common adoption of RepeAT may improve the reporting of research practices and the availability of research outputs. Additionally, use of RepeAT will facilitate comparisons of research transparency and accessibility across domains and institutions.
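
    Inter-rater reliability of the kind reported for RepeAT's manual coding is commonly summarized with a chance-corrected agreement statistic. The following is a generic sketch of Cohen's kappa for two raters coding a binary variable; the ratings are invented for illustration and do not reproduce the RepeAT study's actual data or statistics.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters over the same items."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    # expected agreement if both raters coded independently at their base rates
    expected = sum(freq_a[c] / n * freq_b[c] / n for c in freq_a)
    return (observed - expected) / (1 - expected)

a = [1, 1, 0, 1, 0, 1, 1, 0]   # rater 1: variable present/absent per manuscript
b = [1, 0, 0, 1, 0, 1, 1, 1]   # rater 2
print(round(cohens_kappa(a, b), 3))  # 0.467
```

    Raw percent agreement here is 75%, but the chance-corrected kappa is only about 0.47, which is why reliability studies report the corrected statistic rather than simple agreement.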

  19. The establishment and analysis of TRACE model for ultimate response guideline of Chinshan nuclear power plant - 15448

    International Nuclear Information System (INIS)

    Huang, J.J.; Wang, J.R.; Shih, C.; Chen, S.W.; Liao, L.Y.; Lin, H.T.

    2015-01-01

    The purpose of this research is to use the TRACE code to perform a simulation that executes the procedures of the URG (Ultimate Response Guidelines) to deal with Fukushima-like accidents. TRACE is an advanced thermal-hydraulic code developed by the United States Nuclear Regulatory Commission for NPP safety analysis. In this work, TRACE has been used to analyze the thermal-hydraulic model for the URG of the Chinshan nuclear power plant, which is composed of 2 BWR-type reactors. The URG includes 2-stage depressurization, alternative water injection, and removal of decay heat through ejection from the containment. The 2-stage depressurization strategy includes controlled depressurization and emergency depressurization, replacing the traditional one-stage depressurization. Results show that, compared with the one-stage depressurization strategy, the 2-stage strategy is able to reduce the peak cladding temperature (PCT) effectively and requires a much lower minimum flow rate of alternative water injection in the accident.

  20. Simulations of Laboratory Astrophysics Experiments using the CRASH code

    Science.gov (United States)

    Trantham, Matthew; Kuranz, Carolyn; Fein, Jeff; Wan, Willow; Young, Rachel; Keiter, Paul; Drake, R. Paul

    2015-11-01

    Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, magnetized flows, jets, and laser-produced plasmas. This work is funded by the following grants: DEFC52-08NA28616, DE-NA0001840, and DE-NA0002032.

  1. Adaption of the PARCS Code for Core Design Audit Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyong Chol; Lee, Young Jin; Uhm, Jae Beop; Kim, Hyunjik [Nuclear Safety Evaluation, Daejeon (Korea, Republic of); Jeong, Hun Young; Ahn, Seunghoon; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2013-05-15

    The eigenvalue calculation also includes quasi-static core depletion analyses. PARCS implements a variety of features and has been qualified as a regulatory audit code in conjunction with other NRC thermal-hydraulic codes such as TRACE or RELAP5. In this study, as an adaptation effort for audit applications, PARCS is applied to an audit analysis of a reload core design. The lattice physics code HELIOS is used for cross-section generation. The PARCS-HELIOS code system has been established as a core analysis tool. Calculation results have been compared over a wide spectrum of quantities, such as power distribution, critical soluble boron concentration, and rod worth. Reasonable agreement between the audit calculation and the reference results has been found.
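
    The core eigenvalue calculation referred to above is, at heart, a dominant-eigenvalue problem: k-eff is the largest eigenvalue of the fission-source operator. A toy power-iteration sketch on a hypothetical 2x2 operator illustrates the principle; this is not PARCS's actual solver, data, or discretization.

```python
def power_iteration(A, iters=200):
    """Dominant eigenvalue and eigenvector of a small positive matrix."""
    v = [1.0] * len(A)
    lam = 1.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]
        lam = max(abs(x) for x in w)   # eigenvalue estimate (infinity norm)
        v = [x / lam for x in w]       # renormalize the flux shape
    return lam, v

# illustrative 2x2 "fission operator"; its exact dominant eigenvalue is 1.1
k_eff, shape = power_iteration([[0.9, 0.3],
                                [0.2, 0.8]])
print(round(k_eff, 6))  # 1.1
```

    Production codes accelerate this basic iteration (e.g. with Wielandt shifts or Krylov methods), but the quantity being converged, the dominant eigenvalue of the neutron balance, is the same.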

  2. Toric codes and quantum doubles from two-body Hamiltonians

    Energy Technology Data Exchange (ETDEWEB)

    Brell, Courtney G; Bartlett, Stephen D; Doherty, Andrew C [Centre for Engineered Quantum Systems, School of Physics, University of Sydney, Sydney (Australia); Flammia, Steven T, E-mail: cbrell@physics.usyd.edu.au [Perimeter Institute for Theoretical Physics, Waterloo (Canada)

    2011-05-15

    We present here a procedure to obtain the Hamiltonians of the toric code and Kitaev quantum double models as the low-energy limits of entirely two-body Hamiltonians. Our construction makes use of a new type of perturbation gadget based on error-detecting subsystem codes. The procedure is motivated by a projected entangled pair states (PEPS) description of the target models, and reproduces the target models' behavior using only couplings that are natural in terms of the original Hamiltonians. This allows our construction to capture the symmetries of the target models.

  3. Coupling a system code with computational fluid dynamics for the simulation of complex coolant reactivity effects

    International Nuclear Information System (INIS)

    Bertolotto, D.

    2011-11-01

    The current doctoral research is focused on the development and validation of a coupled computational tool, to combine the advantages of computational fluid dynamics (CFD) in analyzing complex flow fields and of state-of-the-art system codes employed for nuclear power plant (NPP) simulations. Such a tool can considerably enhance the analysis of NPP transient behavior, e.g. in the case of pressurized water reactor (PWR) accident scenarios such as Main Steam Line Break (MSLB) and boron dilution, in which strong coolant flow asymmetries and multi-dimensional mixing effects strongly influence the reactivity of the reactor core, as described in Chap. 1. To start with, a literature review on code coupling is presented in Chap. 2, together with the corresponding ongoing projects in the international community. Special reference is made to the framework in which this research has been carried out, i.e. the Paul Scherrer Institute's (PSI) project STARS (Steady-state and Transient Analysis Research for the Swiss reactors). In particular, the codes chosen for the coupling, i.e. the CFD code ANSYS CFX V11.0 and the system code US-NRC TRACE V5.0, are part of the STARS codes system. Their main features are also described in Chap. 2. The development of the coupled tool, named CFX/TRACE from the names of the two constitutive codes, has proven to be a complex and broad-based task, and therefore constraints had to be put on the target requirements, while keeping in mind a certain modularity to allow future extensions to be made with minimal efforts. After careful consideration, the coupling was defined to be on-line, parallel and with non-overlapping domains connected by an interface, which was developed through the Parallel Virtual Machines (PVM) software, as described in Chap. 3. Moreover, two numerical coupling schemes were implemented and tested: a sequential explicit scheme and a sequential semi-implicit scheme. Finally, it was decided that the coupling would be single

  4. TIM, a ray-tracing program for METATOY research and its dissemination

    Science.gov (United States)

    Lambert, Dean; Hamilton, Alasdair C.; Constable, George; Snehanshu, Harsh; Talati, Sharvil; Courtial, Johannes

    2012-03-01

    TIM (The Interactive METATOY) is a ray-tracing program specifically tailored towards our research in METATOYs, which are optical components that appear to be able to create wave-optically forbidden light-ray fields. For this reason, TIM possesses features not found in other ray-tracing programs. TIM can either be used interactively or by modifying the openly available source code; in both cases, it can easily be run as an applet embedded in a web page. Here we describe the basic structure of TIM's source code and how to extend it, and we give examples of how we have used TIM in our own research.
    Program summary. Program title: TIM. Catalogue identifier: AEKY_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKY_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License. No. of lines in distributed program, including test data, etc.: 124 478. No. of bytes in distributed program, including test data, etc.: 4 120 052. Distribution format: tar.gz. Programming language: Java. Computer: any computer capable of running the Java Virtual Machine (JVM) 1.6. Operating system: any; developed under Mac OS X Version 10.6. RAM: typically 145 MB (interactive version running under Mac OS X Version 10.6). Classification: 14, 18. External routines: JAMA [1] (source code included). Nature of problem: visualisation of scenes that include scene objects that create wave-optically forbidden light-ray fields. Solution method: ray tracing. Unusual features: specifically designed to visualise wave-optically forbidden light-ray fields; can visualise ray trajectories; can visualise geometric optic transformations; can create anaglyphs (for viewing with coloured "3D glasses") and random-dot autostereograms of the scene; integrable into web pages. Running time: problem-dependent; typically seconds for a simple scene.
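
    The core operation such a ray tracer performs, propagating a ray to a surface and letting the surface map the incident direction to a new one, can be sketched in a few lines. The example below is not TIM's Java code; it is an illustrative Python sketch of one METATOY-like component, a sheet at z = 0 that rotates the transmitted ray direction about the sheet normal (a direction mapping a thin wave-optical component could not perform).

```python
import math

def intersect_plane_z0(origin, direction):
    """Point where the ray origin + t*direction meets the plane z = 0."""
    t = -origin[2] / direction[2]
    return [origin[i] + t * direction[i] for i in range(3)]

def rotate_about_z(direction, alpha):
    """Component's direction mapping: rotate the ray direction by alpha about the normal."""
    c, s = math.cos(alpha), math.sin(alpha)
    dx, dy, dz = direction
    return [c * dx - s * dy, s * dx + c * dy, dz]

# ray heading towards the sheet at 45 degrees in the x-z plane
origin, direction = [0.0, 0.0, 1.0], [1.0, 0.0, -1.0]
hit = intersect_plane_z0(origin, direction)   # [1.0, 0.0, 0.0]
out = rotate_about_z(direction, math.pi / 2)  # transverse part rotated onto y
```

    A full tracer repeats this intersect-then-map step over all scene objects, picking the nearest intersection each bounce; TIM additionally records the trajectories for visualisation.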

  5. The assessment of containment codes by experiments simulating severe accident scenarios

    International Nuclear Information System (INIS)

    Karwat, H.

    1992-01-01

    Hitherto, a generally applicable validation matrix for codes simulating the containment behaviour under severe accident conditions did not exist. Past code applications have shown that most problems may be traced back to inaccurate thermal-hydraulic parameters governing gas- or aerosol-distribution events. A provisional code-validation matrix is proposed, based on a careful selection of containment experiments performed during recent years in relevant test facilities under various operating conditions. As a first important step, the matrix focuses on the thermal-hydraulic aspects of the containment behaviour after severe accidents. It may be supplemented in the future by additional suitable tests.

  6. Simulating an extreme over-the-horizon optical propagation event over Lake Michigan using a coupled mesoscale modeling and ray tracing framework

    NARCIS (Netherlands)

    Basu, S.

    2017-01-01

    Accurate simulation and forecasting of over-the-horizon propagation events are essential for various civilian and defense applications. We demonstrate the prowess of a newly proposed coupled mesoscale modeling and ray tracing framework in reproducing such an event. Wherever possible, routinely

  7. 3D Laser Imprint Using a Smoother Ray-Traced Power Deposition Method

    Science.gov (United States)

    Schmitt, Andrew J.

    2017-10-01

    Imprinting of laser nonuniformities in directly driven ICF targets is a challenging problem to simulate accurately with large radiation-hydro codes. One of the most challenging aspects is the proper construction of the complex and rapidly changing laser interference structure driving the imprint using the reduced laser-propagation models (usually ray tracing) found in these codes. We have upgraded the modelling capability in our massively parallel fastrad3d code by adding a more realistic EM-wave interference structure. This interference model adds an axial laser speckle to the previous transverse-only laser structure, and can be impressed on our improved smoothed 3D ray-trace package. The latter package, which connects rays to form bundles and performs power-deposition calculations on the bundles, is intended to decrease ray-trace noise (which can mask or add to imprint) while using fewer rays. We apply this improved model to 3D simulations of recent imprint experiments performed on the Omega-EP laser and the Nike laser that examined the reduction of imprinting due to very thin high-Z target coatings. We report on the conditions in which this new model makes a significant impact on the development of laser imprint. Supported by US DoE/NNSA.

  8. SimpleITK Image-Analysis Notebooks: a Collaborative Environment for Education and Reproducible Research.

    Science.gov (United States)

    Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard

    2018-06-01

    Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools that facilitate reproducible research and support scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the use of the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user-friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. SimpleITK provides a large number of examples illustrating educational and research-oriented image analysis workflows for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks .

  9. Assessment of intercentre reproducibility and epidemiological concordance of Legionella pneumophila serogroup 1 genotyping by amplified fragment length polymorphism analysis

    DEFF Research Database (Denmark)

    Fry, N K; Bangsborg, Jette Marie; Bernander, S

    2000-01-01

    The aims of this work were to assess (i) the intercentre reproducibility and epidemiological concordance of amplified fragment length polymorphism analysis for epidemiological typing of Legionella pneumophila serogroup 1, and (ii) the suitability of the method for standardisation and implementation...... by members of the European Working Group on Legionella Infections. Fifty coded isolates comprising two panels of well-characterised strains, a "reproducibility" panel (n=20) and an "epidemiologically related" panel (n=30), were sent to 13 centres in 12 European countries. Analysis was undertaken in each...... using gel analysis software yielded R=1.00 and E=1.00, with 12, 13 or 14 types. This method can be used as a simple, rapid screening tool for epidemiological typing of isolates of Legionella pneumophila serogroup 1. Results demonstrate that the method can be highly reproducible (R=1...

  10. Wood construction codes issues in the United States

    Science.gov (United States)

    Douglas R. Rammer

    2006-01-01

    The current wood construction codes find their origin in the 1935 Wood Handbook: Wood as an Engineering Material, published by the USDA Forest Service. Many of the current design recommendations can be traced back to statements from this book. Since that time, a series of developments, both historical and recent, has led to a multi-layered system for use of wood products in...

  11. Modernized Approach for Generating Reproducible Heterogeneity Using Transmitted-Light for Flow Visualization Experiments

    Science.gov (United States)

    Jones, A. A.; Holt, R. M.

    2017-12-01

    Image capturing in flow experiments has been used for fluid mechanics research since the early 1970s. Interactions of fluid flow between the vadose zone and the permanent water table are of great interest because this zone is responsible for all recharge waters, pollutant transport and irrigation efficiency for agriculture. Griffith et al. (2011) developed an approach in which reproducible "geologically realistic" sand configurations are deposited in sand-filled experimental chambers for light-transmitted flow visualization experiments. This method creates reproducible, reverse-graded, layered (stratified) thin-slab sand chambers for point-source experiments visualizing multiphase flow through porous media. Reverse-graded stratification of sand chambers mimics many naturally occurring sedimentary deposits. Sand-filled chambers allow light to be used as a nonintrusive tool for measuring water saturation in two dimensions (2-D). Homogeneous and heterogeneous sand configurations can be produced to visualize the complex physics of the unsaturated zone. The experimental procedure developed by Griffith et al. (2011) was designed using now outdated and obsolete equipment. We have modernized this approach with a new Parker Deadel linear actuator and programmed projects/code for multiple configurations. We have also updated the Roper CCD software and image-processing software to the latest industry standards. Modernization of the transmitted-light source, robotic equipment, redesigned experimental chambers, and newly developed analytical procedures have greatly reduced time and cost per experiment. We have verified the ability of the new equipment to generate reproducible heterogeneous sand-filled chambers and demonstrated the functionality of the new equipment and procedures by reproducing several gravity-driven fingering experiments conducted by Griffith (2008).
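
A minimal sketch of the transmitted-light idea mentioned above (intensity as a nonintrusive proxy for water saturation), assuming a simple per-pixel linear calibration between a dry (S=0) and a fully saturated (S=1) reference image; the function and numbers are hypothetical, not the authors' procedure.

```python
def saturation_map(image, dry_ref, wet_ref):
    """Per-pixel saturation estimate: linearly interpolate transmitted-light
    intensity between a dry (S=0) and a saturated (S=1) reference image."""
    sat = []
    for row, drow, wrow in zip(image, dry_ref, wet_ref):
        out = []
        for i, d, w in zip(row, drow, wrow):
            span = w - d
            s = (i - d) / span if span else 0.0
            out.append(min(1.0, max(0.0, s)))   # clamp to physical range
        sat.append(out)
    return sat

dry = [[10.0, 12.0], [11.0, 10.0]]     # intensities with no water
wet = [[110.0, 112.0], [111.0, 110.0]] # intensities at full saturation
img = [[60.0, 112.0], [11.0, 35.0]]
print(saturation_map(img, dry, wet))   # [[0.5, 1.0], [0.0, 0.25]]
```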

  12. Improvement of MARS code reflood model

    International Nuclear Information System (INIS)

    Hwang, Moonkyu; Chung, Bub-Dong

    2011-01-01

    A specifically designed heat transfer model for the reflood process, which normally occurs at low flow and low pressure, was originally incorporated in the MARS code. The model is essentially identical to that of the RELAP5/MOD3.3 code. The model, however, is known to have underestimated the peak cladding temperature (PCT), with an earlier turn-over. In this study, the original MARS code reflood model is improved. Based on extensive sensitivity studies for both the hydraulic and wall heat transfer models, it is found that the dispersed flow film boiling (DFFB) wall heat transfer is the most influential process determining the PCT, whereas the interfacial drag model most affects the quenching time through the liquid carryover phenomenon. The model proposed by Bajorek and Young is incorporated for the DFFB wall heat transfer. Both space grid and droplet enhancement models are incorporated. Inverted annular film boiling (IAFB) is modeled by using the original PSI model of the code. The flow transition between DFFB and IAFB is modeled using the TRACE code interpolation. A gas velocity threshold is also added to limit the top-down quenching effect. Assessment calculations are performed with the original and modified MARS codes for the Flecht-Seaset test and the RBHT test. Improvements are observed in terms of the PCT and quenching time predictions in the Flecht-Seaset assessment. In the case of the RBHT assessment, the improvement over the original MARS code is found to be marginal. A space grid effect, however, is clearly seen with the modified version of the MARS code. (author)
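
The regime transition described above (between IAFB and DFFB) can be sketched schematically; the void-fraction thresholds and linear blending below are illustrative assumptions, not the actual TRACE correlation.

```python
def htc_interpolated(alpha, h_iafb, h_dffb, a_lo=0.6, a_hi=0.9):
    """Schematic wall heat-transfer coefficient across the regime change:
    IAFB below void fraction a_lo, DFFB above a_hi, and a linear blend
    in between (thresholds and blending here are illustrative only)."""
    if alpha <= a_lo:
        return h_iafb
    if alpha >= a_hi:
        return h_dffb
    w = (alpha - a_lo) / (a_hi - a_lo)
    return (1.0 - w) * h_iafb + w * h_dffb

print(htc_interpolated(0.75, h_iafb=200.0, h_dffb=50.0))  # midway blend, ~125.0
```

Blending avoids a discontinuous jump in the wall heat flux as the flow regime changes, which would otherwise destabilize the time integration.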

  13. Reproducibility principles, problems, practices, and prospects

    CERN Document Server

    Maasen, Sabine

    2016-01-01

    Featuring peer-reviewed contributions from noted experts in their fields of research, Reproducibility: Principles, Problems, Practices, and Prospects presents state-of-the-art approaches to reproducibility, the gold standard of sound science, from multi- and interdisciplinary perspectives. Including comprehensive coverage for implementing and reflecting the norm of reproducibility in various pertinent fields of research, the book focuses on how the reproducibility of results is applied, how it may be limited, and how such limitations can be understood or even controlled in the natural sciences, computational sciences, life sciences, social sciences, and studies of science and technology. The book presents many chapters devoted to a variety of methods and techniques, as well as their epistemic and ontological underpinnings, which have been developed to safeguard reproducible research and curtail deficits and failures. The book also investigates the political, historical, and social practices that underlie repro...

  14. Consistent Code Qualification Process and Application to WWER-1000 NPP

    International Nuclear Information System (INIS)

    Berthon, A.; Petruzzi, A.; Giannotti, W.; D'Auria, F.; Reventos, F.

    2006-01-01

    Calculation analyses by application of system codes are performed to evaluate the behavior of an NPP or facility during a postulated transient, or to evaluate the capability of the code itself. Such an analysis constitutes a process that involves the code itself, the data of the reference plant, the data about the transient, the nodalization, and the user. All of these elements interact with one another and affect the results. A major issue in the use of a mathematical model is the model's capability to reproduce the plant or facility behavior under steady-state and transient conditions. These aspects constitute two main checks that must be satisfied during the qualification process. The first is related to the realization of a scheme of the reference plant; the second is related to the capability to reproduce the transient behavior. The aim of this paper is to describe the UMAE (Uncertainty Method based on Accuracy Extrapolation) methodology developed at the University of Pisa for qualifying a nodalization and analysing the calculated results, and to perform the uncertainty evaluation of the system code by the CIAU code (Code with the capability of Internal Assessment of Uncertainty). The activity consists of the re-analysis of experiment BL-44 (SBLOCA) performed in the LOBI facility and the analysis of a Kv-scaling calculation of the WWER-1000 NPP nodalization taking test BL-44 as reference. Relap5/Mod3.3 has been used as the thermal-hydraulic system code, and the standard procedure adopted at the University of Pisa has been applied to show the capability of the code to predict the significant aspects of the transient and to obtain a qualified nodalization of the WWER-1000 through a systematic qualitative and quantitative accuracy evaluation. The qualitative accuracy evaluation is based on the selection of Relevant Thermal-hydraulic Aspects (RTAs) and is a prerequisite to the application of the Fast Fourier Transform Based Method (FFTBM) which quantifies
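
The FFTBM mentioned above quantifies code accuracy in the frequency domain; its usual figure of merit is an average amplitude AA = Σ|F(err)| / Σ|F(exp)|, the spectral magnitude of the calculation error normalized by that of the experimental signal. A minimal sketch with a plain DFT and toy signals (not the qualified FFTBM implementation):

```python
import cmath

def dft_mag(x):
    """Magnitudes of the discrete Fourier transform of a real signal."""
    n = len(x)
    return [abs(sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                    for k in range(n)))
            for j in range(n)]

def average_amplitude(calc, exp):
    """AA = sum |F(calc - exp)| / sum |F(exp)|; 0 means perfect agreement,
    larger values mean larger spectral error relative to the data."""
    err = [c - e for c, e in zip(calc, exp)]
    return sum(dft_mag(err)) / sum(dft_mag(exp))

exp = [1.0, 2.0, 3.0, 2.0]            # toy "experimental" trace
calc = [1.1, 2.1, 3.1, 2.1]           # toy "calculated" trace, small offset
print(average_amplitude(calc, exp))    # small positive AA
```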

  15. Presence of trace elements in fishes from the Chaco-Pampeana plain (Argentina)

    Directory of Open Access Journals (Sweden)

    Alejandra V. Volpedo

    2015-06-01

    Full Text Available The Chaco-Pampean plain is one of the largest plains worldwide and presents a wetland macrosystem (lagoons, marshes, rivers, streams, channels). The water quality of these environments is diverse, with different trace elements of natural (As, F, Mo and V) and anthropogenic (Cr, Cu and Pb) origin. The Chaco-Pampean plain has an important diversity of fish species; the species of commercial importance, however, are limited. This paper reviews the studies related to the presence of trace elements in the tissues of commercial fishes (the shad Prochilodus lineatus and the silverside Odontesthes bonariensis) in the Chaco-Pampean plain in recent decades, and discusses the associated information gaps. The presence of trace elements in commercial fish muscle is a major problem for food security. The results showed that most of the elements present in shad occur at levels lower than the maximum limits set by the Argentine Food Code (CAA, 2014). In the case of Pb in shad muscle, the determined values exceed those considered by the European Food Safety Authority. In the case of silversides, the values of As, Pb and Hg are mostly lower than the maxima recommended by the Argentine Food Code. However, according to the European Food Safety Authority, the lower limit on the benchmark dose for a 10% response (BMDL) is the value associated with health risk for As.

  16. Linear scans of hair strands for trace elements by proton induced x-ray emission

    International Nuclear Information System (INIS)

    Jolly, R.K.; Pehrson, G.R.; Gupta, S.K.; Buckle, D.C.; Aceto, H. Jr.

    1974-01-01

    Hair strands obtained from school children in the 10- to 12-year age group were analyzed for trace element concentration as a function of distance from the root by proton-induced x-ray emission, to study the donors' history of exposure to toxic trace metals. These samples were collected from the vicinity of a copper smelter where high levels of As, Cd, Sb, and Pb have been noted. Scans show a continual build-up of Pb as a function of distance from the root, while As shows a reproducible and distinct maximum approximately 10 cm from the root. The concentration of Zn was found to be constant in all samples (without exception) to within the uncertainties of our measurements.

  17. Towards reproducible experimental studies for non-convex polyhedral shaped particles

    Directory of Open Access Journals (Sweden)

    Wilke Daniel N.

    2017-01-01

    Full Text Available The packing density and flat-bottomed hopper discharge of non-convex polyhedral particles are investigated in a systematic experimental study. The motivation for this study is two-fold. Firstly, to establish an approach to deliver quality experimental particle packing data for non-convex polyhedral particles that can be used for characterization and validation purposes of discrete element codes. Secondly, to make the reproducibility of experimental setups as convenient and readily available as possible using affordable and accessible technology. The primary technology for this study is fused deposition modeling, used to 3D print polylactic acid (PLA) particles using readily available 3D printer technology. A total of 8000 biodegradable particles were printed: 1000 white particles and 1000 black particles for each of the four particle types considered in this study. Reproducibility is one benefit of using fused deposition modeling to print particles, but an extremely important additional benefit is that specific particle properties can be explicitly controlled. As an example, in this study the volume fraction of each particle is controlled, i.e. the effective particle density can be adjusted. The particle volume reduces drastically as the non-convexity is increased; however, all printed white particles in this study have the same mass to within 2% of each other.
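
The mass/volume-fraction control mentioned above can be illustrated with a small calculation, assuming solid PLA at roughly 1.24 g/cm3; the function name and numbers are hypothetical, not the authors' print settings.

```python
def infill_fraction(target_mass_g, envelope_volume_cm3, pla_density=1.24):
    """Solid-material volume fraction required for a printed particle of the
    given envelope volume to reach the target mass (PLA ~1.24 g/cm^3;
    all values illustrative)."""
    frac = target_mass_g / (pla_density * envelope_volume_cm3)
    if not 0.0 < frac <= 1.0:
        raise ValueError("target mass unreachable for this envelope volume")
    return frac

# A less convex particle with a smaller envelope volume needs a higher
# infill fraction to keep the same mass as its more convex counterpart.
print(round(infill_fraction(2.0, 3.0), 3))  # 0.538
print(round(infill_fraction(2.0, 2.0), 3))  # 0.806
```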

  18. Towards reproducible experimental studies for non-convex polyhedral shaped particles

    Science.gov (United States)

    Wilke, Daniel N.; Pizette, Patrick; Govender, Nicolin; Abriak, Nor-Edine

    2017-06-01

    The packing density and flat-bottomed hopper discharge of non-convex polyhedral particles are investigated in a systematic experimental study. The motivation for this study is two-fold. Firstly, to establish an approach to deliver quality experimental particle packing data for non-convex polyhedral particles that can be used for characterization and validation purposes of discrete element codes. Secondly, to make the reproducibility of experimental setups as convenient and readily available as possible using affordable and accessible technology. The primary technology for this study is fused deposition modeling, used to 3D print polylactic acid (PLA) particles using readily available 3D printer technology. A total of 8000 biodegradable particles were printed: 1000 white particles and 1000 black particles for each of the four particle types considered in this study. Reproducibility is one benefit of using fused deposition modeling to print particles, but an extremely important additional benefit is that specific particle properties can be explicitly controlled. As an example, in this study the volume fraction of each particle is controlled, i.e. the effective particle density can be adjusted. The particle volume reduces drastically as the non-convexity is increased; however, all printed white particles in this study have the same mass to within 2% of each other.

  19. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and the absence of bar codes at the primary packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems
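
A sketch of what global standards-based identification buys in software terms: GS1 element strings carry application identifiers (AIs) such as (01) GTIN, (17) expiry date, and (10) batch/lot, with variable-length fields terminated by FNC1 (rendered here as the ASCII group separator). The parser below is a simplified illustration covering only a few AIs with toy values, not a production GS1 implementation.

```python
FIXED = {"01": 14, "17": 6}   # GTIN (fixed 14 digits), expiry YYMMDD (fixed 6)
VARIABLE = {"10", "21"}       # batch/lot and serial: variable, FNC1-terminated
GS = "\x1d"                   # FNC1 rendered as the ASCII group separator

def parse_gs1(data: str) -> dict:
    """Split a GS1 element string into {AI: value} (simplified subset)."""
    out, i = {}, 0
    while i < len(data):
        ai = data[i:i + 2]
        i += 2
        if ai in FIXED:
            out[ai] = data[i:i + FIXED[ai]]
            i += FIXED[ai]
        elif ai in VARIABLE:
            end = data.find(GS, i)
            end = len(data) if end == -1 else end
            out[ai] = data[i:end]
            i = end + 1
        else:
            raise ValueError(f"unsupported application identifier {ai!r}")
    return out

# GTIN + expiry + batch, as preprinted by a manufacturer (toy values):
print(parse_gs1("0109506000134352172205311012345AB"))
# {'01': '09506000134352', '17': '220531', '10': '12345AB'}
```

Because the AI structure is the same worldwide, a scan at a district store and a scan at the central warehouse decode to the same fields, which is what enables end-to-end tracking.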

  20. Suspension of the NAB Code and Its Effect on Regulation of Advertising.

    Science.gov (United States)

    Maddox, Lynda M.; Zanot, Eric J.

    1984-01-01

    Traces events leading to the suspension of the Television Code of the National Association of Broadcasters in 1982 and looks at changes that have occurred in the informal and formal regulation of advertising as a result of that suspension. (FL)

  1. Assessment of systems codes and their coupling with CFD codes in thermal–hydraulic applications to innovative reactors

    Energy Technology Data Exchange (ETDEWEB)

    Bandini, G., E-mail: giacomino.bandini@enea.it [Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA) (Italy); Polidori, M. [Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA) (Italy); Gerschenfeld, A.; Pialla, D.; Li, S. [Commissariat à l’Energie Atomique (CEA) (France); Ma, W.M.; Kudinov, P.; Jeltsov, M.; Kööp, K. [Royal Institute of Technology (KTH) (Sweden); Huber, K.; Cheng, X.; Bruzzese, C.; Class, A.G.; Prill, D.P. [Karlsruhe Institute of Technology (KIT) (Germany); Papukchiev, A. [Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) (Germany); Geffray, C.; Macian-Juan, R. [Technische Universität München (TUM) (Germany); Maas, L. [Institut de Radioprotection et de Sûreté Nucléaire (IRSN) (France)

    2015-01-15

    Highlights: • The assessment of RELAP5, TRACE and CATHARE system codes on integral experiments is presented. • Code benchmark of CATHARE, DYN2B, and ATHLET on PHENIX natural circulation experiment. • Grid-free pool modelling based on proper orthogonal decomposition for system codes is explained. • The code coupling methodologies are explained. • The coupling of several CFD/system codes is tested against integral experiments. - Abstract: The THINS project of the 7th Framework EU Program on nuclear fission safety is devoted to the investigation of crosscutting thermal–hydraulic issues for innovative nuclear systems. A significant effort in the project has been dedicated to the qualification and validation of system codes currently employed in thermal–hydraulic transient analysis for nuclear reactors. This assessment is based either on already available experimental data, or on the data provided by test campaigns carried out in the frame of THINS project activities. Data provided by TALL and CIRCE facilities were used in the assessment of system codes for HLM reactors, while the PHENIX ultimate natural circulation test was used as reference for a benchmark exercise among system codes for sodium-cooled reactor applications. In addition, a promising grid-free pool model based on proper orthogonal decomposition is proposed to overcome the limits shown by the thermal–hydraulic system codes in the simulation of pool-type systems. Furthermore, multi-scale system-CFD solutions have been developed and validated for innovative nuclear system applications. For this purpose, data from the PHENIX experiments have been used, and data are provided by the tests conducted with new configuration of the TALL-3D facility, which accommodates a 3D test section within the primary circuit. The TALL-3D measurements are currently used for the validation of the coupling between system and CFD codes.

  2. An RNA-Seq strategy to detect the complete coding and non-coding transcriptome including full-length imprinted macro ncRNAs.

    Directory of Open Access Journals (Sweden)

    Ru Huang

    Full Text Available Imprinted macro non-protein-coding (nc) RNAs are cis-repressor transcripts that silence multiple genes in at least three imprinted gene clusters in the mouse genome. Similar macro or long ncRNAs are abundant in the mammalian genome. Here we present the full coding and non-coding transcriptome of two mouse tissues: differentiated ES cells and fetal head, using an optimized RNA-Seq strategy. The data produced is highly reproducible in different sequencing locations and is able to detect the full length of imprinted macro ncRNAs such as Airn and Kcnq1ot1, whose length ranges between 80-118 kb. Transcripts show a more uniform read coverage when RNA is fragmented with RNA hydrolysis compared with cDNA fragmentation by shearing. Irrespective of the fragmentation method, all coding and non-coding transcripts longer than 8 kb show a gradual loss of sequencing tags towards the 3' end. Comparisons to published RNA-Seq datasets show that the strategy presented here is more efficient in detecting known functional imprinted macro ncRNAs and also indicate that standardization of RNA preparation protocols would increase the comparability of the transcriptome between different RNA-Seq datasets.

  3. Active Inference and Learning in the Cerebellum.

    Science.gov (United States)

    Friston, Karl; Herreros, Ivan

    2016-09-01

    This letter offers a computational account of Pavlovian conditioning in the cerebellum based on active inference and predictive coding. Using eyeblink conditioning as a canonical paradigm, we formulate a minimal generative model that can account for spontaneous blinking, startle responses, and (delay or trace) conditioning. We then establish the face validity of the model using simulated responses to unconditioned and conditioned stimuli to reproduce the sorts of behavior that are observed empirically. The scheme's anatomical validity is then addressed by associating variables in the predictive coding scheme with nuclei and neuronal populations to match the (extrinsic and intrinsic) connectivity of the cerebellar (eyeblink conditioning) system. Finally, we try to establish predictive validity by reproducing selective failures of delay conditioning, trace conditioning, and extinction using (simulated and reversible) focal lesions. Although rather metaphorical, the ensuing scheme can account for a remarkable range of anatomical and neurophysiological aspects of cerebellar circuitry-and the specificity of lesion-deficit mappings that have been established experimentally. From a computational perspective, this work shows how conditioning or learning can be formulated in terms of minimizing variational free energy (or maximizing Bayesian model evidence) using exactly the same principles that underlie predictive coding in perception.
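
The free-energy minimization mentioned above reduces, in the simplest linear-Gaussian case, to gradient descent on precision-weighted prediction errors; the sketch below (assumed notation, not the authors' cerebellar model) converges to the standard precision-weighted compromise between prior belief and sensory data.

```python
def posterior_mode(y, mu_prior, var_y=1.0, var_prior=1.0, lr=0.05, steps=2000):
    """Gradient descent on the free energy
    F = (y - mu)^2 / (2 var_y) + (mu - mu_prior)^2 / (2 var_prior),
    i.e. on precision-weighted sensory and prior prediction errors."""
    mu = mu_prior
    for _ in range(steps):
        grad = -(y - mu) / var_y + (mu - mu_prior) / var_prior
        mu -= lr * grad
    return mu

# With equal precisions the belief settles midway between prior and data.
print(round(posterior_mode(y=2.0, mu_prior=0.0), 3))  # 1.0
```

Shrinking var_y (more reliable data) pulls the belief toward the observation; shrinking var_prior pulls it toward the prior, mirroring how precision weighting gates learning in predictive coding accounts.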

  4. What to do with a Dead Research Code

    Science.gov (United States)

    Nemiroff, Robert J.

    2016-01-01

    The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states to the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and likely deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  5. Testing Reproducibility in Earth Sciences

    Science.gov (United States)

    Church, M. A.; Dudill, A. R.; Frey, P.; Venditti, J. G.

    2017-12-01

    Reproducibility represents how closely the results of independent tests agree when undertaken using the same materials but different conditions of measurement, such as operator, equipment or laboratory. The concept of reproducibility is fundamental to the scientific method as it prevents the persistence of incorrect or biased results. Yet currently the production of scientific knowledge emphasizes rapid publication of previously unreported findings, a culture that has emerged from pressures related to hiring, publication criteria and funding requirements. Awareness and critique of the disconnect between how scientific research should be undertaken, and how it actually is conducted, has been prominent in biomedicine for over a decade, with the fields of economics and psychology more recently joining the conversation. The purpose of this presentation is to stimulate the conversation in earth sciences where, despite implicit evidence in widely accepted classifications, formal testing of reproducibility is rare. As a formal test of reproducibility, two sets of experiments were undertaken with the same experimental procedure, at the same scale, but in different laboratories. Using narrow, steep flumes and spherical glass beads, grain size sorting was examined by introducing fine sediment of varying size and quantity into a mobile coarse bed. The general setup was identical, including flume width and slope; however, there were some variations in the materials, construction and lab environment. Comparison of the results includes examination of the infiltration profiles, sediment mobility and transport characteristics. The physical phenomena were qualitatively reproduced but not quantitatively replicated. Reproduction of results encourages more robust research and reporting, and facilitates exploration of possible variations in data in various specific contexts. Following the lead of other fields, testing of reproducibility can be incentivized through changes to journal

  6. Development of particle and heavy ion transport code system

    International Nuclear Information System (INIS)

    Niita, Koji

    2004-01-01

    The particle and heavy ion transport code system (PHITS) is a three-dimensional, general-purpose Monte Carlo simulation code describing the transport and reactions of particles and heavy ions in materials. It was developed on the basis of NMTC/JAM for the design and safety assessment of J-PARC. What PHITS is, its physical processes and models, and the development history of the code are described. As examples of application, the evaluation of neutron optics, cancer treatment with heavy-particle beams, and cosmic radiation are presented. The JAM and JQMD models are used as the physical models. Neutron motion in a sextupole magnetic field and a gravitational field, PHITS simulation of the track of a 12C beam and of secondary neutron tracks in a small model of the cancer treatment device at HIMAC, and the neutron flux in the Space Shuttle are explained. (S.Y.)
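
The flavor of such Monte Carlo transport can be conveyed by a toy attenuation calculation (not PHITS itself): sampling exponential free paths through a slab reproduces the expected exp(-sigma*t) uncollided transmission.

```python
import math
import random

def transmitted_fraction(sigma_t, thickness, n=100_000, seed=1):
    """Fraction of particles crossing a slab with no collision; free paths
    are sampled from an exponential with mean free path 1/sigma_t."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if -math.log(1.0 - rng.random()) / sigma_t > thickness)
    return hits / n

frac = transmitted_fraction(sigma_t=1.0, thickness=2.0)
print(frac)  # close to exp(-2) ~ 0.135
```

Production codes like PHITS layer nuclear reaction models (JAM, JQMD), geometry, and tallies on top of exactly this kind of free-path sampling.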

  7. Code of Practice on the International Transboundary Movement of Radioactive Waste; Code De Bonne Pratique Sur Le Mouvement Transfrontiere International De Dechets Radioactifs

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-11-03

    On 21 September 1990, the General Conference, by resolution GC(XXXIV)/RES/530, adopted a Code of Practice on the International Transboundary Movement of Radioactive Waste and requested the Director General - inter alia - to take all necessary steps to ensure wide dissemination of the Code of Practice at both the national and the international level. The Code of Practice was elaborated by a Group of Experts established pursuant to resolution GC(XXXII)/RES/490 adopted by the General Conference in 1988. The text of the Code of Practice is reproduced herewith for the information of all Member States

  8. GUIdock: Using Docker Containers with a Common Graphics User Interface to Address the Reproducibility of Research.

    Directory of Open Access Journals (Sweden)

    Ling-Hong Hung

    Full Text Available Reproducibility is vital in science. For complex computational methods, it is often necessary, not just to recreate the code, but also the software and hardware environment to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface.

  9. GUIdock: Using Docker Containers with a Common Graphics User Interface to Address the Reproducibility of Research.

    Science.gov (United States)

    Hung, Ling-Hong; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee

    2016-01-01

    Reproducibility is vital in science. For complex computational methods, it is often necessary, not just to recreate the code, but also the software and hardware environment to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface.
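
The X11-forwarding pattern that GUIdock automates can be sketched by assembling the standard Linux docker run flags (passing DISPLAY through and bind-mounting the X11 socket); the image name below is hypothetical, and on macOS or Windows an X server (e.g. XQuartz or Xming) would additionally be needed.

```python
def x11_docker_cmd(image, display=":0"):
    """docker run arguments for the usual Linux X11-forwarding pattern:
    pass DISPLAY through and bind-mount the X11 socket into the container."""
    return ["docker", "run", "--rm",
            "-e", f"DISPLAY={display}",
            "-v", "/tmp/.X11-unix:/tmp/.X11-unix",
            image]

print(" ".join(x11_docker_cmd("guidock/networkbma")))
```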

  10. Nanoparticle based bio-bar code technology for trace analysis of aflatoxin B1 in Chinese herbs

    Directory of Open Access Journals (Sweden)

    Yu-yan Yu

    2018-04-01

    Full Text Available A novel and sensitive assay for aflatoxin B1 (AFB1) detection has been developed using the bio-bar code assay (BCA). The method relies on polyclonal antibodies encoded with DNA-modified gold nanoparticles (NP) and monoclonal antibodies modified magnetic microparticles (MMP), with subsequent detection of the amplified target in the form of a bio-bar code using a fluorescent quantitative polymerase chain reaction (FQ-PCR) detection method. First, NP probes encoded with DNA that was unique to AFB1 and MMP probes with monoclonal antibodies that bind AFB1 specifically were prepared. Then, the MMP-AFB1-NP sandwich compounds were acquired; dehybridization of the oligonucleotides on the nanoparticle surface allows the determination of the presence of AFB1 by identifying the oligonucleotide sequence released from the NP through FQ-PCR detection. The bio-bar code technique for detecting AFB1 was thus established, with a sensitivity limit of about 10−8 ng/mL, comparable to ELISA assays for the same target, showing that AFB1 can be detected at low attomolar levels with the bio-bar-code amplification approach. This is also the first demonstration of a bio-bar code type assay for the detection of AFB1 in Chinese herbs. Keywords: Aflatoxin B1, Bio-bar code assay, Chinese herbs, Magnetic microparticle probes, Nanoparticle probes

  11. Determination of trace amount of formaldehyde base on a bromate-Malachite Green system.

    Science.gov (United States)

    Tang, Yufang; Chen, Hao; Weng, Chao; Tang, Xiaohui; Zhang, Miaoling; Hu, Tao

    2015-01-25

    A novel catalytic kinetic spectrophotometric method for the determination of trace amounts of formaldehyde (FA) has been established, based on the catalytic effect of trace amounts of FA on the oxidation of Malachite Green (MG) by potassium bromate in a sulfuric acid medium, and is reported for the first time. The reaction was monitored by measuring the decrease in absorbance of MG at 617 nm, allowing a precise determination of FA in the range of 0.003-0.08 μg mL(-1), with a limit of detection down to 1 ng mL(-1). The relative standard deviation of 10 replicate measurements was 1.63%. The method proved to be sensitive, selective and accurate, and was adopted to determine free FA in samples directly with good accuracy and reproducibility. Copyright © 2014 Elsevier B.V. All rights reserved.
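Quantitation in such a kinetic method rests on a linear calibration of the absorbance decrease against concentration over the working range. A minimal sketch of that calibration step follows; the response values are synthetic, invented for illustration, and are not the paper's data.

```python
# Least-squares calibration line over the stated working range (0.003-0.08 ug/mL),
# then inversion to recover an unknown FA concentration. Synthetic data only.
conc = [0.003, 0.01, 0.02, 0.04, 0.08]         # ug/mL standards
delta_abs = [0.05 + 5.0 * c for c in conc]     # synthetic linear response

n = len(conc)
mean_x = sum(conc) / n
mean_y = sum(delta_abs) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, delta_abs))
         / sum((x - mean_x) ** 2 for x in conc))
intercept = mean_y - slope * mean_x

def fa_concentration(measured_delta_abs):
    """Invert the calibration line to recover FA concentration (ug/mL)."""
    return (measured_delta_abs - intercept) / slope

print(round(fa_concentration(0.05 + 5.0 * 0.05), 6))   # -> 0.05
```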

  12. Benchmarking the Multidimensional Stellar Implicit Code MUSIC

    Science.gov (United States)

    Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.

    2017-04-01

    We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton-Krylov method. A physics-based preconditioning technique, which can be adjusted to target varying physics, is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities, and the decay of the Taylor-Green vortex. Additionally we show a test of hydrostatic equilibrium in a stellar environment which is dominated by radiative effects; in this setting the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed and analysed. Each of these test cases was analysed with a simple scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes including ATHENA and the PENCIL code. MUSIC is able to reproduce both the behaviour of established and widely used codes and the results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.
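The "simple scalar diagnostic" idea can be illustrated with the Taylor-Green setup named above: a single number, such as the domain-averaged kinetic energy, is extracted from the field and compared across codes. The sketch below evaluates that diagnostic for the initial 2D Taylor-Green velocity field (a generic example, not MUSIC's actual diagnostic output).

```python
# Domain-averaged kinetic energy of the 2D Taylor-Green field
# u = cos(x) sin(y), v = -sin(x) cos(y) on [0, 2pi)^2; the exact average is 1/4.
import numpy as np

n = 128
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)   # periodic grid
X, Y = np.meshgrid(x, x, indexing="ij")

u = np.cos(X) * np.sin(Y)
v = -np.sin(X) * np.cos(Y)

kinetic_energy = 0.5 * np.mean(u**2 + v**2)   # scalar diagnostic
print(kinetic_energy)                          # -> 0.25 (to machine precision)
```

Tracking how such a diagnostic decays in time is what permits direct comparison between codes when no analytic solution of the full flow exists.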

  13. Parallelization of a three-dimensional whole core transport code DeCART

    Energy Technology Data Exchange (ETDEWEB)

    Jin Young, Cho; Han Gyu, Joo; Ha Yong, Kim; Moon-Hee, Chang [Korea Atomic Energy Research Institute, Yuseong-gu, Daejon (Korea, Republic of)

    2003-07-01

    The parallelization of the DeCART (deterministic core analysis based on ray tracing) code is presented; it reduces the tremendous computing time and memory required for three-dimensional whole-core transport calculations. The parallelization employs the concept of MPI grouping as well as a mixed MPI/OpenMP scheme. Since most of the computing time and memory are used in the MOC (method of characteristics) and multi-group CMFD (coarse mesh finite difference) calculations in DeCART, variables and subroutines related to these two modules are the primary targets for parallelization. Specifically, the ray-tracing module was parallelized using a planar domain decomposition scheme and an angular domain decomposition scheme. The parallel performance of the DeCART code is evaluated by solving a rodded variation of the C5G7MOX three-dimensional benchmark problem and a simplified three-dimensional SMART PWR core problem. In the C5G7MOX problem with 24 CPUs, a maximum speedup of 21 is obtained on an IBM Regatta machine and 22 on a LINUX cluster in the MOC kernel, which indicates good parallel performance of the DeCART code. In the simplified SMART problem, the memory requirement of about 11 GBytes in the single-processor case is reduced to 940 MBytes with 24 processors, which means that the DeCART code can now solve large core problems with affordable LINUX clusters. (authors)
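The reported figures imply a parallel efficiency that is easy to check by hand: speedup divided by processor count. A one-line sketch:

```python
# Parallel efficiency = achieved speedup / number of CPUs (1.0 = ideal scaling).
def parallel_efficiency(speedup, n_cpus):
    """Fraction of ideal linear speedup actually achieved."""
    return speedup / n_cpus

print(parallel_efficiency(21, 24))   # IBM Regatta MOC kernel -> 0.875
print(parallel_efficiency(22, 24))   # LINUX cluster figure, ~0.917
```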

  14. RELAP-7 Code Assessment Plan and Requirement Traceability Matrix

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Junsoo; Choi, Yong-joon; Smith, Curtis L.

    2016-10-01

    RELAP-7, a safety analysis code for nuclear reactor systems, is under development at Idaho National Laboratory (INL). Overall, the code development is directed towards leveraging the advancements in computer science technology, numerical solution methods and physical models of the last decades. Recently, INL has also been making an effort to establish a code assessment plan, which aims to ensure an improved final product quality throughout the RELAP-7 development process. The ultimate goal of this plan is to propose a suitable way to systematically assess the wide range of software requirements for RELAP-7, including the software design, user interface, and technical requirements. To this end, we first survey the literature (i.e., international/domestic reports and research articles) addressing the desirable features generally required of advanced nuclear system safety analysis codes. In addition, the V&V (verification and validation) efforts as well as the legacy issues of several recently developed codes (e.g., RELAP5-3D, TRACE V5.0) are investigated. Lastly, this paper outlines the Requirement Traceability Matrix (RTM) for RELAP-7, which can be used to systematically evaluate and trace the code development process and its present capability.
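At its simplest, a requirement traceability matrix is a mapping from each requirement to the artifacts (tests, documents) that cover it, from which coverage gaps can be read off mechanically. A minimal sketch follows; the requirement IDs and test names are invented for illustration and are not RELAP-7's actual requirements.

```python
# Toy requirement traceability matrix: requirement -> covering test artifacts.
rtm = {
    "REQ-001 (mass conservation)":  ["test_conservation_1d"],
    "REQ-002 (user input checks)":  ["test_bad_deck_rejected", "test_units"],
    "REQ-003 (restart capability)": [],   # not yet covered by any test
}

# A requirement with no covering artifact is a traceability gap.
uncovered = [req for req, tests in rtm.items() if not tests]
print(f"{len(uncovered)} uncovered requirement(s): {uncovered}")
```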

  15. Input research and testing of code TOODY. Quarterly report, July--September 1971

    International Nuclear Information System (INIS)

    Haynie, G.A.

    1997-01-01

    The purpose of this report is to simplify and further explain the input instructions for the TOODY code and to demonstrate the ability of the code to reproduce cylinder test results. It is intended to supplement, not replace, the existing TOODY manual, which should be read and understood before attempting to read this report. Problems arise in the preparation of the input data in four areas: material definition, initial shape definition, the restart feature, and the limiting of output. Aside from these areas, the code is adequately discussed in the manual, 'TOODY, A Computer Program For Calculating Problems Of Motion In Two Dimensions'.

  16. Mantle End-Members: The Trace Element Perspective

    Science.gov (United States)

    Willbold, M.; Stracke, A.; Hofmann, A. W.

    2004-12-01

    Although there are some trace element characteristics common to all EM-type basalts, which distinguish them from HIMU-type basalts (e.g. uniformly high Th/U ratios of 4.7 ± 0.3, and enrichment in Cs-U), each suite of EM-type basalts has unique trace element signatures that distinguish it from any other suite of EM-type basalts. This is especially obvious when comparing the trace element compositions of EM basalts from one isotopic family, for example EM1-type basalts from Tristan, Gough and Pitcairn. Consequently, the trace element systematics of EM-type basalts suggest that there are many different EM-type sources, whereas the isotopic composition of EM-type basalts suggests derivation from two broadly similar sources, i.e. EM1 and EM2. The large variability in subducting sediments with respect to both parent-daughter (e.g. Rb/Sr, Sm/Nd, U/Pb, Th/Pb,...) and other trace element ratios makes it unlikely that there are reproducible mixtures of sediments leading to two different isotopic evolution paths (EM1 and EM2) while preserving a range of incompatible element contents for each isotopic family, as would be required to reconcile the isotopic and trace element characteristics of EM-type basalts. Although this does not a priori argue against sediments as possible source components for OIB, it does argue against two distinct groups of sediments as EM1 and EM2 sources. Further characterization of sources with the same general origin (e.g. a certain type of crust or lithosphere), or identification of processes leading to reservoirs with similar parent-daughter ratios but different incompatible trace element contents, could resolve the apparent conundrum.

  17. Personalized, Shareable Geoscience Dataspaces For Simplifying Data Management and Improving Reproducibility

    Science.gov (United States)

    Malik, T.; Foster, I.; Goodall, J. L.; Peckham, S. D.; Baker, J. B. H.; Gurnis, M.

    2015-12-01

    Research activities are iterative, collaborative, and now data- and compute-intensive. As a result, even the many researchers who work in small laboratories must often create, acquire, manage, and manipulate much diverse data and keep track of complex software. They face difficult data and software management challenges, and data sharing and reproducibility are neglected. There is significant federal investment in powerful cyberinfrastructure, in part to lessen the burden associated with modern data- and compute-intensive research. Similarly, geoscience communities are establishing research repositories to facilitate data preservation. Yet we observe that a large fraction of the geoscience community continues to struggle with data and software management. The reason, studies suggest, is not lack of awareness but rather that tools do not adequately support time-consuming data life cycle activities. Through the NSF/EarthCube-funded GeoDataspace project, we are building personalized, shareable dataspaces that help scientists connect their individual or research group efforts with the community at large. The dataspaces provide a lightweight multiplatform research data management system with tools for recording research activities in what we call geounits, so that a geoscientist can at any time snapshot and preserve, both for their own use and to share with the community, all data and code required to understand and reproduce a study. A software-as-a-service (SaaS) deployment model enhances usability of the core components and integration with widely used software systems. In this talk we will present the open-source GeoDataspace project and demonstrate how it is enabling reproducibility across the geoscience domains of hydrology, space science, and modeling toolkits.
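The snapshot idea behind a geounit can be sketched as recording a content hash for every file in a study directory, so the exact inputs of an analysis can later be verified. The manifest format below is an assumption for illustration, not the GeoDataspace project's own schema.

```python
# Snapshot a study directory as {relative_path: sha256} so inputs are verifiable.
import hashlib
import tempfile
from pathlib import Path

def snapshot(directory):
    """Return a manifest mapping each file's relative path to its SHA-256 hash."""
    root = Path(directory)
    manifest = {}
    for path in sorted(root.rglob("*")):
        if path.is_file():
            manifest[str(path.relative_to(root))] = hashlib.sha256(
                path.read_bytes()).hexdigest()
    return manifest

# Demo on a throwaway directory with one hypothetical input file.
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "input.dat").write_text("porosity = 0.3\n")
    m = snapshot(d)
print(m)
```

Re-running `snapshot` later and comparing manifests detects any silent change to the study's data or code.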

  18. RELAP5 and SIMMER-III code assessment on CIRCE decay heat removal experiments

    International Nuclear Information System (INIS)

    Bandini, Giacomino; Polidori, Massimiliano; Meloni, Paride; Tarantino, Mariano; Di Piazza, Ivan

    2015-01-01

    Highlights: • The CIRCE DHR experiments simulate LOHS+LOF transients in LFR systems. • Decay heat removal by natural circulation through immersed heat exchangers is investigated. • The RELAP5 simulation of the DHR experiments is presented. • The SIMMER-III simulation of the DHR experiments is presented. • The focus is on the transition from forced to natural convection and stratification in a large pool. - Abstract: In the frame of the THINS Project of the 7th Framework EU Program on Nuclear Fission Safety, experiments were carried out on the large-scale LBE-cooled CIRCE facility at the ENEA/Brasimone Research Center to investigate relevant safety aspects associated with the removal of decay heat through heat exchangers (HXs) immersed in the primary circuit of a pool-type lead fast reactor (LFR) under loss of heat sink (LOHS) accidental conditions. The start-up and operation of this decay heat removal (DHR) system rely on natural convection on the primary side and thus might be affected by coolant mixing and temperature stratification phenomena occurring in the LBE pool. The main objectives of the CIRCE experimental campaign were to verify the behavior of the DHR system under representative accidental conditions and to provide a valuable database for the assessment of both CFD and system codes. The reproduced accidental conditions refer to a station blackout scenario, namely a protected LOHS and loss of flow (LOF) transient. In this paper the results of 1D RELAP5 and 2D SIMMER-III simulations are compared with the experimental data of two representative DHR transients, T-4 and T-5, in order to verify the capability of these codes to reproduce both the forced and natural convection conditions observed in the primary circuit and the correct operation of the DHR system for decay heat removal. Both codes are able to reproduce the stationary conditions and, with some uncertainties, the transition to natural convection conditions until the end of the transient phase. The trend

  19. Parallelization characteristics of the DeCART code

    International Nuclear Information System (INIS)

    Cho, J. Y.; Joo, H. G.; Kim, H. Y.; Lee, C. C.; Chang, M. H.; Zee, S. Q.

    2003-12-01

    This report describes the parallelization characteristics of the DeCART code and examines its parallel performance. Parallel computing algorithms were implemented in DeCART to reduce the tremendous computational burden and memory requirement involved in the three-dimensional whole-core transport calculation. In the parallelization of the DeCART code, axial domain decomposition is first realized by using MPI (Message Passing Interface), and then azimuthal angle domain decomposition by using either MPI or OpenMP. When using MPI for both the axial and the angle domain decomposition, the concept of MPI grouping is employed for convenient communication within each communication world. For the parallel computation, most of the computing modules except for the thermal-hydraulic module are parallelized. These parallelized computing modules include the MOC ray tracing, CMFD, NEM, region-wise cross section preparation and cell homogenization modules. For the distributed allocation, most of the MOC and CMFD/NEM variables are allocated only for the assigned planes, which reduces the required memory by the ratio of the number of assigned planes to the number of all planes. The parallel performance of the DeCART code is evaluated by solving two problems, a rodded variation of the C5G7 MOX three-dimensional benchmark problem and a simplified three-dimensional SMART PWR core problem. In terms of parallel performance, the DeCART code shows a good speedup of about 40.1 and 22.4 in the ray-tracing module and about 37.3 and 20.2 in the total computing time when using 48 CPUs on the IBM Regatta and 24 CPUs on the LINUX cluster, respectively. In the comparison between MPI and OpenMP, OpenMP shows somewhat better performance than MPI. Therefore, it is concluded that the first priority in the parallel computation of the DeCART code is the axial domain decomposition by using MPI, then the angular domain using OpenMP, and finally the angular

  20. Enacting the International/Reproducing Eurocentrism

    Directory of Open Access Journals (Sweden)

    Zeynep Gülşah Çapan

    Full Text Available Abstract This article focuses on the way in which Eurocentric conceptualisations of the ‘international’ are reproduced in different geopolitical contexts. Even though the Eurocentrism of International Relations has received growing attention, it has predominantly been concerned with unearthing the Eurocentrism of the ‘centre’, overlooking its varied manifestations in other geopolitical contexts. The article seeks to contribute to discussions about Eurocentrism by examining how different conceptualisations of the international are at work at a particular moment, and how these conceptualisations continue to reproduce Eurocentrism. It will focus on the way in which Eurocentric designations of spatial and temporal hierarchies were reproduced in the context of Turkey through a reading of how the ‘Gezi Park protests’ of 2013 and ‘Turkey’ itself were written into the story of the international.

  1. Validation of a thermal-hydraulic system code on a simple example

    International Nuclear Information System (INIS)

    Kopecek, Vit; Zacha, Pavel

    2014-01-01

    A mathematical model of a U-tube was set up and the analytical solution was calculated and used to assess the numerical solutions obtained with the RELAP5 mod3.3 and TRACE V5 thermal-hydraulic codes. Good agreement between the two types of calculation was obtained.
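The classic analytic benchmark for a U-tube is the frictionless manometer: a liquid column of total length L displaced by z0 oscillates as z(t) = z0·cos(√(2g/L)·t). The sketch below compares a simple time integrator against that closed form, mirroring the code-versus-analytic check described above; the frictionless model and parameter values are a generic illustration, not the paper's specific setup.

```python
# Frictionless U-tube oscillation: z'' = -(2g/L) z, analytic z(t) = z0 cos(w t).
import math

g, L, z0 = 9.81, 1.0, 0.05           # gravity [m/s^2], column length [m], offset [m]
omega = math.sqrt(2.0 * g / L)

def analytic(t):
    return z0 * math.cos(omega * t)

# Semi-implicit (symplectic) Euler integration of the same ODE.
dt, t_end = 1.0e-4, 2.0
z, v, t = z0, 0.0, 0.0
max_err = 0.0
while t < t_end:
    v -= omega**2 * z * dt
    z += v * dt
    t += dt
    max_err = max(max_err, abs(z - analytic(t)))

print(f"max |numerical - analytic| = {max_err:.2e}")
```

Driving the discrepancy to zero as the step size shrinks is exactly the kind of evidence such a validation exercise looks for.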

  2. A multi-national study of reading and tracing skills in novice programmers

    DEFF Research Database (Denmark)

    Lindholm Nielsen, Morten; Lister, Raymond; Adams, Elisabeth Shaw

    2004-01-01

    A study by an ITiCSE 2001 working group ("the McCracken Group") established that many students do not know how to program at the conclusion of their introductory courses. A popular explanation for this incapacity is that the students lack the ability to problem-solve. That is, they lack the ability...... tasks, such as tracing (or "desk checking") through code. This ITiCSE 2004 working group studied the alternative explanation by testing students from seven countries, in two ways. First, students were tested on their ability to predict the outcome of executing a short piece of code. Second, students...... of skills that are a prerequisite for problem-solving.

  3. (U) Second-Order Sensitivity Analysis of Uncollided Particle Contributions to Radiation Detector Responses Using Ray-Tracing

    Energy Technology Data Exchange (ETDEWEB)

    Favorite, Jeffrey A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-30

    The Second-Level Adjoint Sensitivity System (2nd-LASS) that yields the second-order sensitivities of a response of uncollided particles with respect to isotope densities, cross sections, and source emission rates is derived in Refs. 1 and 2. In Ref. 2, we solved problems for the uncollided leakage from a homogeneous sphere and a multiregion cylinder using the PARTISN multigroup discrete-ordinates code. In this memo, we derive solutions of the 2nd-LASS for the particular case when the response is a flux or partial current density computed at a single point on the boundary, and the inner products are computed using ray-tracing. Both the PARTISN approach and the ray-tracing approach are implemented in a computer code, SENSPG. The next section of this report presents the equations of the 1st- and 2nd-LASS for uncollided particles and the first- and second-order sensitivities that use the solutions of the 1st- and 2nd-LASS. Section III presents solutions of the 1st- and 2nd-LASS equations for the case of ray-tracing from a detector point. Section IV presents specific solutions of the 2nd-LASS and derives the ray-trace form of the inner products needed for second-order sensitivities. Numerical results for the total leakage from a homogeneous sphere are presented in Sec. V and for the leakage from one side of a two-region slab in Sec. VI. Section VII is a summary and conclusions.
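The uncollided point-detector response along a single ray has a well-known closed form, φ = S·exp(−Σᵢ Σt,ᵢ·lᵢ)/(4πr²), which is the quantity the ray-tracing inner products above are built from. The sketch below evaluates it for a ray crossing a few material segments; the data are illustrative, not taken from the memo.

```python
# Uncollided flux at a point detector from a point isotropic source:
# attenuate by the optical depth accumulated along the ray, then apply 1/(4 pi r^2).
import math

def uncollided_flux(source_strength, segments, r):
    """segments: list of (total cross section [1/cm], path length [cm]) pairs."""
    optical_depth = sum(sigma_t * length for sigma_t, length in segments)
    return source_strength * math.exp(-optical_depth) / (4.0 * math.pi * r**2)

# One 2 cm segment of Sigma_t = 0.25 /cm between source and a detector at 10 cm.
phi = uncollided_flux(1.0, [(0.25, 2.0)], 10.0)
print(f"{phi:.6e}")
```

Differentiating this expression with respect to the cross sections or densities is what the first- and second-order sensitivity systems formalize.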

  4. Nanoparticle based bio-bar code technology for trace analysis of aflatoxin B1 in Chinese herbs.

    Science.gov (United States)

    Yu, Yu-Yan; Chen, Yuan-Yuan; Gao, Xuan; Liu, Yuan-Yuan; Zhang, Hong-Yan; Wang, Tong-Ying

    2018-04-01

    A novel and sensitive assay for aflatoxin B1 (AFB1) detection has been developed using the bio-bar code assay (BCA). The method relies on DNA-encoded gold nanoparticle (NP) probes carrying polyclonal antibodies and magnetic microparticle (MMP) probes carrying monoclonal antibodies, with subsequent detection of the amplified target (the bio-bar code) by a fluorescent quantitative polymerase chain reaction (FQ-PCR) method. First, NP probes encoded with DNA unique to AFB1 and MMP probes with monoclonal antibodies that bind AFB1 specifically were prepared. Then the MMP-AFB1-NP sandwich compounds were acquired; dehybridization of the oligonucleotides on the nanoparticle surface allows the presence of AFB1 to be determined by identifying, through FQ-PCR, the oligonucleotide sequence released from the NPs. The bio-bar code system for detecting AFB1 was thus established, with a sensitivity limit of about 10−8 ng/mL; compared with ELISA assays for the same target, this shows that AFB1 can be detected at low attomolar levels with the bio-bar-code amplification approach. This is also the first demonstration of a bio-bar code type assay for the detection of AFB1 in Chinese herbs. Copyright © 2017. Published by Elsevier B.V.

  5. Reproducibility of somatosensory spatial perceptual maps.

    Science.gov (United States)

    Steenbergen, Peter; Buitenweg, Jan R; Trojan, Jörg; Veltink, Peter H

    2013-02-01

    Various studies have shown subjects to mislocalize cutaneous stimuli in an idiosyncratic manner. Spatial properties of individual localization behavior can be represented in the form of perceptual maps. Individual differences in these maps may reflect properties of internal body representations, and perceptual maps may therefore be a useful method for studying these representations. For this to be the case, individual perceptual maps need to be reproducible, which has not yet been demonstrated. We assessed the reproducibility of localizations measured twice on subsequent days. Ten subjects participated in the experiments. Non-painful electrocutaneous stimuli were applied at seven sites on the lower arm. Subjects localized the stimuli on a photograph of their own arm, which was presented on a tablet screen overlaying the real arm. Reproducibility was assessed by calculating intraclass correlation coefficients (ICC) for the mean localizations of each electrode site and the slope and offset of regression models of the localizations, which represent scaling and displacement of perceptual maps relative to the stimulated sites. The ICCs of the mean localizations ranged from 0.68 to 0.93; the ICCs of the regression parameters were 0.88 for the intercept and 0.92 for the slope. These results indicate a high degree of reproducibility. We conclude that localization patterns of non-painful electrocutaneous stimuli on the arm are reproducible on subsequent days. Reproducibility is a necessary property of perceptual maps for these to reflect properties of a subject's internal body representations. Perceptual maps are therefore a promising method for studying body representations.
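The intraclass correlation coefficients reported above quantify test-retest reproducibility. A minimal sketch of one common form, the one-way random-effects ICC(1,1) = (MSB − MSW)/(MSB + (k−1)·MSW), follows; the paper does not state which ICC form it used, so this particular form and the toy data are assumptions for illustration.

```python
# One-way random-effects ICC(1,1) from an (n_subjects x k_sessions) data matrix.
import numpy as np

def icc_1_1(data):
    """ICC(1,1): between-subject vs within-subject mean squares."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    subject_means = data.mean(axis=1)
    msb = k * ((subject_means - grand) ** 2).sum() / (n - 1)            # between
    msw = ((data - subject_means[:, None]) ** 2).sum() / (n * (k - 1))  # within
    return (msb - msw) / (msb + (k - 1) * msw)

# Perfect day-to-day agreement (toy data) gives ICC = 1.
perfect = [[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]
print(icc_1_1(perfect))   # -> 1.0
```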

  6. Screening for trace explosives by AccuTOF™-DART®: an in-depth validation study.

    Science.gov (United States)

    Sisco, Edward; Dake, Jeffrey; Bridge, Candice

    2013-10-10

    Ambient ionization mass spectrometry is finding increasing utility as a rapid analysis technique in a number of fields. In forensic science specifically, analysis of many types of samples, including drugs, explosives, inks, bank dye, and lotions, has been shown to be possible using these techniques [1]. This paper focuses on one type of ambient ionization mass spectrometry, Direct Analysis in Real Time Mass Spectrometry (DART-MS or DART), and its viability as a screening tool for trace explosives analysis. In order to assess viability, a validation study was completed which focused on the analysis of trace amounts of nitro and peroxide based explosives. Topics which were studied, and are discussed, include method optimization, reproducibility, sensitivity, development of a search library, discrimination of mixtures, and blind sampling. Advantages and disadvantages of this technique over other similar screening techniques are also discussed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  7. Assessment of heat transfer correlations for supercritical water in the frame of best-estimate code validation

    International Nuclear Information System (INIS)

    Jaeger, Wadim; Espinoza, Victor H. Sanchez; Schneider, Niko; Hurtado, Antonio

    2009-01-01

    Within the frame of the Generation IV International Forum, six innovative reactor concepts are the subject of comprehensive investigations. In some projects supercritical water will be considered as coolant, moderator (as for the High Performance Light Water Reactor) or secondary working fluid (one possible option for Liquid Metal-cooled Fast Reactors). Supercritical water is characterized by a pronounced change of the thermo-physical properties when crossing the pseudo-critical line, which goes hand in hand with a change in the heat transfer (HT) behavior. Hence, it is essential to properly estimate the heat transfer coefficient and subsequently the wall temperature. The scope of this paper is to present and discuss the activities at the Institute for Reactor Safety (IRS) related to the implementation of correlations for wall-to-fluid HT at supercritical conditions in Best-Estimate codes like TRACE, as well as its validation. It is important to validate TRACE before applying it to safety analyses of the HPLWR or of other reactor systems. In the past three decades various experiments have been performed all over the world to reveal the peculiarities of wall-to-fluid HT at supercritical conditions. Several different heat transfer phenomena, such as HT enhancement (due to higher Prandtl numbers in the vicinity of the pseudo-critical point) or HT deterioration (due to strong property variations), were observed. Since TRACE is a component-based system code with a finite volume method, the resolution capabilities are limited and not all physical phenomena can be modeled properly. But Best-Estimate system codes are nowadays the preferred option for safety-related investigations of full plants or other integral systems. Thus, increasing confidence in such codes is of high priority. In this paper, the post-test analysis of experiments with supercritical parameters is presented. For that reason various correlations for the HT, which consider the characteristics
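Wall-to-fluid heat transfer correlations of the kind discussed here typically start from the classic single-phase Dittus-Boelter form, Nu = 0.023·Re^0.8·Pr^0.4, and add wall/bulk property-ratio corrections near the pseudo-critical point. The sketch below shows only the generic baseline form as an assumption for orientation; the specific supercritical correlations assessed in the paper are not listed in this abstract.

```python
# Dittus-Boelter baseline for turbulent forced convection in a tube (heating).
def nusselt_dittus_boelter(re, pr):
    """Nu = 0.023 Re^0.8 Pr^0.4, valid roughly for Re > 1e4, 0.7 < Pr < 160."""
    return 0.023 * re**0.8 * pr**0.4

def heat_transfer_coefficient(re, pr, conductivity, hydraulic_diameter):
    """h = Nu * k / D_h  [W/(m^2 K)] for k in W/(m K) and D_h in m."""
    return nusselt_dittus_boelter(re, pr) * conductivity / hydraulic_diameter

print(round(nusselt_dittus_boelter(1.0e5, 1.0), 6))   # -> 230.0
```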

  8. Reproducibility of surface roughness in reaming

    DEFF Research Database (Denmark)

    Müller, Pavel; De Chiffre, Leonardo

    An investigation of the reproducibility of surface roughness in reaming was performed to document the applicability of this approach for testing cutting fluids. Austenitic stainless steel was used as the workpiece material and HSS reamers as cutting tools. Reproducibility of the results was evaluated...

  9. Development of code PRETOR for stellarator simulation

    International Nuclear Information System (INIS)

    Dies, J.; Fontanet, J.; Fontdecaba, J.M.; Castejon, F.; Alejandre, C.

    1998-01-01

    The Departament de Fisica i Enginyeria Nuclear (DFEN) of the UPC has some experience in the development of the transport code PRETOR. This code has been validated against shots from DIII-D, JET and TFTR, and it has also been used in the simulation of operational scenarios of ITER fast burn termination. Recently, the association EURATOM-CIEMAT has started the operation of the TJ-II stellarator. Due to the need to validate the results given by other transport codes applied to stellarators, and because all of them make some approximations, such as averaging magnitudes on each magnetic surface, it was thought suitable to adapt the PRETOR code to devices without axial symmetry, like stellarators, which is very suitable for the specific needs of the study of TJ-II. Several modifications are required in PRETOR; the main ones concern the models of magnetic equilibrium, geometry, and transport of energy and particles. In order to handle the complex magnetic equilibrium geometry, the powerful numerical code VMEC has been used. This code gives the magnetic surface shape as a Fourier series in terms of the harmonics (m,n). Most of the geometric magnitudes are also obtained from the VMEC results file. The energy and particle transport models will be replaced by other phenomenological models that are better adapted to stellarator simulation. Using the proposed models, the aim is to reproduce experimental data available from present stellarators, paying special attention to the TJ-II of the association EURATOM-CIEMAT. (Author)
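The VMEC-style Fourier representation mentioned above rebuilds a flux-surface shape from harmonics, typically R(θ,ζ) = Σ R_mn·cos(mθ − nζ) and Z(θ,ζ) = Σ Z_mn·sin(mθ − nζ). The two-harmonic sketch below just describes a circular cross-section (major radius 1.5 m, minor radius 0.2 m); real TJ-II coefficients are not used.

```python
# Reconstruct a flux-surface point from a small set of (m, n) Fourier harmonics.
import math

r_harmonics = {(0, 0): 1.5, (1, 0): 0.2}   # R_mn [m]: axis term + circular term
z_harmonics = {(1, 0): 0.2}                # Z_mn [m]

def surface_point(theta, zeta):
    """Return (R, Z) on the surface at poloidal angle theta, toroidal angle zeta."""
    R = sum(c * math.cos(m * theta - n * zeta) for (m, n), c in r_harmonics.items())
    Z = sum(c * math.sin(m * theta - n * zeta) for (m, n), c in z_harmonics.items())
    return R, Z

print(surface_point(0.0, 0.0))   # outboard midplane -> (1.7, 0.0)
```

Adding higher (m,n) pairs is how shaped, non-axisymmetric stellarator surfaces such as TJ-II's are represented.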

  10. GetData Digitizing Program Code: Description, Testing, Training

    International Nuclear Information System (INIS)

    Taova, S.

    2013-01-01

    90 percent of the compilations in our center are obtained by data digitizing, so we are rather interested in the development of different techniques of data digitizing. Plots containing a great number of points and solid lines are the most complicated to digitize. From our point of view, adding automatic or semi-automatic digitizing to the Exfor-Digitizer procedures would significantly simplify this process. We managed to test some freely available program codes. The program GetData Graph Digitizer (www.getdata-graph-digitizer.com) looks most suitable for our purposes. GetData Graph Digitizer is a program for digitizing graphs, plots and maps. Its main features are: - supported graphics formats are TIFF, JPEG, BMP and PCX; - two algorithms for automatic digitizing; - convenient manual digitizing; - a reorder tool for easy point reordering; - save/open workspace, which allows saving the work and returning to it later; - obtained data can be exported to the clipboard; - export to the formats TXT (text file), XLS (MS Excel), XML, DXF (AutoCAD) and EPS (PostScript). GetData Graph Digitizer includes two algorithms for automatic digitizing. Auto trace lines: this method is designed to digitize solid lines. Choose the starting point, and the program will trace the line, stopping at its end. To trace the line use the Operations => Auto trace lines menu or the context menu ('Auto trace lines' item). Click the left mouse button to choose the starting point, or the right mouse button to additionally choose a direction for line tracing. Digitize area: the second way is to set a digitizing area. This method works for any type of lines, including dashed lines. Data points are set at the intersection of the grid with the line. You can choose the type of grid (X grid or Y grid) and set the distance between grid lines. You can also shift the grid so that it passes through a specific X (or Y) value. To digitize an area use the Operations => Digitize area menu
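The "digitize area" algorithm described above reduces to reading the curve's value wherever it crosses a set of grid lines: for a curve stored as traced points, linear interpolation at the X-grid positions yields the data points. A minimal sketch (the traced points are invented, with y = 2x for illustration):

```python
# Grid-intersection digitizing: interpolate a traced curve at fixed X grid lines.
import numpy as np

# traced curve points (as produced by manual or auto tracing); here y = 2x
trace_x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
trace_y = np.array([0.0, 2.0, 4.0, 6.0, 8.0])

x_grid = np.arange(0.5, 4.0, 1.0)            # grid lines at 0.5, 1.5, 2.5, 3.5
y_at_grid = np.interp(x_grid, trace_x, trace_y)

print(list(zip(x_grid.tolist(), y_at_grid.tolist())))
```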

  11. Sensitivity analysis of the titan hybrid deterministic transport code for SPECT simulation

    International Nuclear Information System (INIS)

    Royston, Katherine K.; Haghighat, Alireza

    2011-01-01

    Single photon emission computed tomography (SPECT) has been traditionally simulated using Monte Carlo methods. The TITAN code is a hybrid deterministic transport code that has recently been applied to the simulation of a SPECT myocardial perfusion study. For modeling SPECT, the TITAN code uses a discrete ordinates method in the phantom region and a combined simplified ray-tracing algorithm with a fictitious angular quadrature technique to simulate the collimator and generate projection images. In this paper, we compare the results of an experiment with a physical phantom with predictions from the MCNP5 and TITAN codes. While the results of the two codes are in good agreement, they differ from the experimental data by ∼ 21%. In order to understand these large differences, we conduct a sensitivity study by examining the effect of different parameters including heart size, collimator position, collimator simulation parameter, and number of energy groups. (author)

  12. Development of a model of a NSSS of the PWR reactor with thermo-hydraulic code GOTHIC

    International Nuclear Information System (INIS)

    Gomez Garcia-Torano, I.; Jimenez, G.

    2013-01-01

    The thermal-hydraulic code GOTHIC is often used in the nuclear industry for licensing transient analyses inside the containment of Generation II plants (PWR, BWR) as well as Gen III and III+ designs (AP1000, ESBWR, APWR). After the mass and energy released to the containment, previously calculated by other codes (such as TRACE), are entered, GOTHIC allows the evolution of the basic containment parameters to be calculated in detail.

  13. Validation of a new library of nuclear constants of the WIMS code

    International Nuclear Information System (INIS)

    Aguilar H, F.

    1991-10-01

    The objective of the present work is to reproduce with the WIMS code the experimental results of the thermal benchmark problems TRX-1, TRX-2 and BAPL-1 to BAPL-3. The work proceeded in two stages: the first consisted of using the original library of the code, while in the second a library containing only the elements present in the benchmarks (H-1, O-16, Al-27, U-235 and U-238) was generated. To generate the nuclear data in the WIMS library, the ENDF/B-IV database and the nuclear data processing system NJOY were used; the library itself was generated using the FIXER code. (Author)

  14. Multi-trace deformations in AdS/CFT. Exploring the vacuum structure of the deformed CFT

    Energy Technology Data Exchange (ETDEWEB)

    Papadimitriou, I. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)]|[Center for Mathematical Physics, Hamburg (Germany)

    2007-03-15

    We present a general and systematic treatment of multi-trace deformations in the AdS/CFT correspondence in the large N limit, pointing out and clarifying subtleties relating to the formulation of the boundary value problem on a conformal boundary. We then apply this method to study multi-trace deformations in the presence of a scalar VEV, which requires the coupling to gravity to be taken into account. We show that supergravity solutions subject to 'mixed' boundary conditions are in one-to-one correspondence with critical points of the holographic effective action of the dual theory in the presence of a multi-trace deformation, and we find a number of new exact analytic solutions involving a minimally or conformally coupled scalar field satisfying 'mixed' boundary conditions. These include the generalization to any dimension of the instanton solution recently found in hep-th/0611315. Finally, we provide a systematic method for computing the holographic effective action in the presence of a multi-trace deformation in a derivative expansion away from the conformal vacuum using Hamilton-Jacobi theory. Requiring that this effective action exists and is bounded from below reproduces recent results on the stability of the AdS vacuum in the presence of 'mixed' boundary conditions. (orig.)

  15. Multi-trace deformations in AdS/CFT. Exploring the vacuum structure of the deformed CFT

    International Nuclear Information System (INIS)

    Papadimitriou, I.

    2007-03-01

    We present a general and systematic treatment of multi-trace deformations in the AdS/CFT correspondence in the large N limit, pointing out and clarifying subtleties relating to the formulation of the boundary value problem on a conformal boundary. We then apply this method to study multi-trace deformations in the presence of a scalar VEV, which requires the coupling to gravity to be taken into account. We show that supergravity solutions subject to 'mixed' boundary conditions are in one-to-one correspondence with critical points of the holographic effective action of the dual theory in the presence of a multi-trace deformation, and we find a number of new exact analytic solutions involving a minimally or conformally coupled scalar field satisfying 'mixed' boundary conditions. These include the generalization to any dimension of the instanton solution recently found in hep-th/0611315. Finally, we provide a systematic method for computing the holographic effective action in the presence of a multi-trace deformation in a derivative expansion away from the conformal vacuum using Hamilton-Jacobi theory. Requiring that this effective action exists and is bounded from below reproduces recent results on the stability of the AdS vacuum in the presence of 'mixed' boundary conditions. (orig.)

  16. HYDRASTAR - a code for stochastic simulation of groundwater flow

    International Nuclear Information System (INIS)

    Norman, S.

    1992-05-01

    The computer code HYDRASTAR was developed as a tool for groundwater flow and transport simulations in the SKB 91 safety analysis project. Its conceptual ideas can be traced back to a 1988 report by Shlomo Neuman; see the reference section. The main idea of the code is the treatment of the rock as a stochastic continuum, which separates it from the deterministic methods previously employed by SKB and also from the discrete fracture models. The current report is a comprehensive description of HYDRASTAR, covering such topics as the regularization (upscaling) of a hydraulic conductivity field, unconditional and conditional simulation of stochastic processes, numerical solvers for the hydrology and streamline equations, and finally some proposals for future developments
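    The regularization (upscaling) step mentioned above can be illustrated with a toy example. This is only a sketch of the general idea, not HYDRASTAR's actual algorithm: a fine-grid hydraulic conductivity field is averaged over coarser blocks, here with a geometric mean, a common choice for lognormally distributed conductivities.

    ```python
    import math

    def upscale_geometric(field, block):
        """Average a 1-D conductivity field over blocks of `block` cells,
        using the geometric mean (the arithmetic mean of log-conductivity)."""
        out = []
        for i in range(0, len(field), block):
            cells = field[i:i + block]
            log_mean = sum(math.log(k) for k in cells) / len(cells)
            out.append(math.exp(log_mean))
        return out

    fine = [1e-6, 1e-8, 1e-6, 1e-8]   # conductivities in m/s, synthetic values
    print(upscale_geometric(fine, 2))  # two coarse blocks, each near 1e-7
    ```

    The geometric mean damps the influence of isolated high-conductivity cells; an arithmetic mean would instead be dominated by them, which is why the averaging rule matters when a stochastic continuum field is coarsened for flow simulation.
    
    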

  17. The validation of a pixe system for trace element analysis of biological samples

    Science.gov (United States)

    Saied, S. O.; Crumpton, D.; Francois, P. E.

    1981-03-01

    A PIXE system has been developed for measuring trace element levels in biological samples and a study made of the precision and accuracy achievable. The calibration of the system has been established using thin targets of known elemental composition and the reproducibility studied using protons of energy 2.5 MeV. Both thick and thin samples prepared from NBS bovine liver have been analysed and the elemental ratios present established for a set of replicate samples. These are compared with the results of other workers. Problems relating to sample preparation are discussed.

  18. Investigation of Loop Seal Clearing Phenomena for the ATLAS SBLOCA Long Term Cooling Test using TRACE and MARS-KS

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Min Jeong; Park, M. H.; Marigomen Ralph; Sim, S. K. [Environment and Energy Technology, Daejeon (Korea, Republic of)

    2016-10-15

    During the Design Certification (DC) review of the APR1400, the USNRC raised a long-term cooling safety issue on the effect of loop seal clearing during a cold leg Small Break Loss Of Coolant Accident (SBLOCA), due to the relatively deep cross-over loop compared to US PWRs. The objective of this study is thus to investigate the loop seal clearing phenomena during cold leg slot break SBLOCA long-term cooling and to resolve this safety issue for the APR1400 DC. TRACE and MARS-KS were used to predict the test results and to perform sensitivity studies of the SBLOCA loop seal clearing phenomena. The calculations show that the TRACE code predicts the sequence of Test LTC-CL-04R well. However, compared to the experiment, TRACE over-predicts the primary pressure due to a smaller predicted break flow. MARS-KS predicts the major thermal hydraulic parameters during the transient with reasonable agreement, and reproduces the ATLAS LTC-CL-04R test data better than TRACE thanks to its better prediction of the break flow. Overall, compared to the experiment, both the TRACE and MARS-KS codes show a discrepancy in predicting the loop seal clearing and reformation times. Both TRACE and MARS-KS correctly predict the core water level and fuel cladding temperatures. From this study, it can be said that even though the APR1400 cross-over leg design has slightly deeper loop seals, the effect on the safety of SBLOCA long-term cooling is minimal compared to the SBLOCA cladding failure criteria. Further study of the SBLOCA loop seal clearing phenomena is needed.

  19. Fast Computation of Pulse Height Spectra Using SGRD Code

    Directory of Open Access Journals (Sweden)

    Humbert Philippe

    2017-01-01

    Full Text Available The SGRD (Spectroscopy, Gamma rays, Rapid, Deterministic) code is used for fast calculation of the gamma ray spectrum produced by a spherical shielded source and measured by a detector. The photon source lines originate from the radioactive decay of the unstable isotopes. The emission rate and spectrum of these primary sources are calculated using the DARWIN code. The leakage spectrum is separated into two parts: the uncollided component is transported by ray-tracing, and the scattered component is calculated using a multigroup discrete ordinates method. The pulse height spectrum is then simulated by folding the leakage spectrum with the detector response functions, which are pre-calculated using the MCNP5 code for each considered detector type. An application to the simulation of the gamma spectrum produced by a natural uranium ball coated with plexiglass and measured using a NaI detector is presented.
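    The folding step described above amounts to multiplying the leakage spectrum by a pre-computed detector response matrix. A minimal sketch with invented 3-group numbers (SGRD's actual response functions come from MCNP5 calculations and have far finer energy resolution):

    ```python
    def fold(response, leakage):
        """Fold a leakage spectrum with detector response functions:
        pulse_height[i] = sum_j response[i][j] * leakage[j]."""
        return [sum(r * l for r, l in zip(row, leakage)) for row in response]

    # rows: pulse-height channels, columns: incident photon energy groups;
    # each column is the detector's response to one monoenergetic group
    response = [
        [0.7, 0.1, 0.0],
        [0.2, 0.6, 0.1],
        [0.1, 0.3, 0.9],
    ]
    leakage = [10.0, 5.0, 2.0]   # photons/s per group (illustrative)
    print(fold(response, leakage))
    ```

    Because the response matrix is fixed for a given detector, it can be computed once and reused for every source configuration, which is what makes the deterministic pulse-height simulation fast.
    
    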

  20. Reproducibility of central lumbar vertebral BMD

    International Nuclear Information System (INIS)

    Chan, F.; Pocock, N.; Griffiths, M.; Majerovic, Y.; Freund, J.

    1997-01-01

    Full text: Lumbar vertebral bone mineral density (BMD) using dual X-ray absorptiometry (DXA) has generally been calculated from a region of interest which includes the entire vertebral body. Although this region excludes part of the transverse processes, it does include the outer cortical shell of the vertebra. Recent software has been devised to calculate BMD in a central vertebral region of interest which excludes the outer cortical envelope. Theoretically this area may be more sensitive for detecting osteoporosis, which affects trabecular bone to a greater extent than cortical bone. Apart from the sensitivity of BMD estimation, the reproducibility of any measurement is important owing to the slow rate of change of bone mass. We have evaluated the reproducibility of this new vertebral region of interest in 23 women who had duplicate lumbar spine DXA scans performed on the same day. The patients were repositioned between measurements. Central vertebral analysis was performed for L2-L4, and the reproducibility of area, bone mineral content (BMC) and BMD was calculated as the coefficient of variation; these values were compared with those from the conventional analysis. We have thus shown that the reproducibility of the central BMD is comparable to that of the conventional analysis, which is essential if this technique is to provide any additional clinical data. The reasons for the decreased reproducibility of the area, and hence the BMC, require further investigation
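    The reproducibility metric used above, the coefficient of variation (CV = standard deviation / mean, expressed in percent), can be computed from repeated scans as follows; the BMD values are invented for illustration, and a real precision study would pool the CVs across all subjects.

    ```python
    import statistics

    def cv_percent(values):
        """Coefficient of variation of repeated measurements, in percent."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    duplicate_bmd = [1.02, 1.04]   # g/cm^2, one subject's repeated L2-L4 scans
    print(round(cv_percent(duplicate_bmd), 2))
    ```

    A low CV matters here because annual bone loss is typically only a few percent: a measurement technique whose scan-rescan CV approaches the expected biological change cannot detect that change reliably.
    
    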

  1. Use of trace elements as indicators for underground fluid circulations in karstic environment; Utilisation des elements en trace comme traceurs des circulations souterraines en milieu karstique (site du Lamalou, Herault)

    Energy Technology Data Exchange (ETDEWEB)

    Pane-Escribe, M B

    1995-06-29

    The geochemical study of trace element behaviour in karstic groundwaters has been carried out at the experimental site of Lamalou (Herault, France). Routine measurements of the physico-chemical parameters and of the dissolved element concentrations have been performed during two hydrological cycles. Radon has been monitored by passive detectors and by automatic electronic probes. Trace elements (Sc, Ti, V, Cr, Ni, Cu, Zn, As, Rb, Sr, Mo, Cd, Sb, Cs, Ba, Th, U) were analyzed by Inductively Coupled Plasma Mass Spectrometry (ICP-MS). The first part of this work presents the methodologies employed, in particular the improvement of the analytical performance of ICP-MS for water sample analysis. The detection limit for each considered element has been determined. The short- and long-term reproducibility of the sample analysis has also been tested. The second part of this study presents the treatment and interpretation of the results. This analysis has pointed out the influence of the aquifer structure on the distribution of the chemical elements. The trace and major element concentrations are effectively related to the fracturing state of the reservoir and allow the high transmissivity zones to be distinguished from zones of lower transmissivity. In this mono-lithological context, trace elements appear to be particularly efficient tracers for determining the water origin and circulation. Their spatial and temporal behaviour leads to the identification of three different origins for the water mineralization over the studied area: limestones, clays and external sources (rainfalls and occasional pollutions). (author). 154 refs.

  2. Comparing TCV experimental VDE responses with DINA code simulations

    Science.gov (United States)

    Favez, J.-Y.; Khayrutdinov, R. R.; Lister, J. B.; Lukash, V. E.

    2002-02-01

    The DINA free-boundary equilibrium simulation code has been implemented for TCV, including the full TCV feedback and diagnostic systems. First results showed good agreement with control coil perturbations and correctly reproduced certain non-linear features in the experimental measurements. The latest DINA code simulations, presented in this paper, exploit discharges with different cross-sectional shapes and different vertical instability growth rates which were subjected to controlled vertical displacement events (VDEs), extending previous work with the DINA code on the DIII-D tokamak. The height of the TCV vessel allows observation of the non-linear evolution of the VDE growth rate as regions of different vertical field decay index are crossed. The vertical movement of the plasma is found to be well modelled. For most experiments, DINA reproduces the S-shape of the vertical displacement in TCV with excellent precision. This behaviour cannot be modelled using linear time-independent models because of the predominant exponential shape due to the unstable pole of any linear time-independent model. The other most common equilibrium parameters like the plasma current Ip, the elongation κ, the triangularity δ, the safety factor q, the ratio between the averaged plasma kinetic pressure and the pressure of the poloidal magnetic field at the edge of the plasma βp, and the internal self inductance li also show acceptable agreement. The evolution of the growth rate γ is estimated and compared with the evolution of the closed-loop growth rate calculated with the RZIP linear model, confirming the origin of the observed behaviour.

  3. Comparing TCV experimental VDE responses with DINA code simulations

    International Nuclear Information System (INIS)

    Favez, J.Y.; Khayrutdinov, R.R.; Lister, J.B.; Lukash, V.E.

    2001-10-01

    The DINA free-boundary equilibrium simulation code has been implemented for TCV, including the full TCV feedback and diagnostic systems. First results showed good agreement with control coil perturbations and correctly reproduced certain non-linear features in the experimental measurements. The latest DINA code simulations, presented in this paper, exploit discharges with different cross-sectional shapes and different vertical instability growth rates which were subjected to controlled Vertical Displacement Events (VDEs), extending previous work with the DINA code on the DIII-D tokamak. The height of the TCV vessel allows observation of the non-linear evolution of the VDE growth rate as regions of different vertical field decay index are crossed. The vertical movement of the plasma is found to be well modelled. For most experiments, DINA reproduces the S-shape of the vertical displacement in TCV with excellent precision. This behaviour cannot be modelled using linear time-independent models because of the predominant exponential shape due to the unstable pole of any linear time-independent model. The other most common equilibrium parameters like the plasma current Ip, the elongation κ, the triangularity δ, the safety factor q, the ratio between the averaged plasma kinetic pressure and the pressure of the poloidal magnetic field at the edge of the plasma βp, and the internal self inductance li also show acceptable agreement. The evolution of the growth rate γ is estimated and compared with the evolution of the closed-loop growth rate calculated with the RZIP linear model, confirming the origin of the observed behaviour. (author)

  4. Reproducibility of brain ADC histograms

    International Nuclear Information System (INIS)

    Steens, S.C.A.; Buchem, M.A. van; Admiraal-Behloul, F.; Schaap, J.A.; Hoogenraad, F.G.C.; Wheeler-Kingshott, C.A.M.; Tofts, P.S.; Cessie, S. le

    2004-01-01

    The aim of this study was to assess the effect of differences in acquisition technique on whole-brain apparent diffusion coefficient (ADC) histogram parameters, as well as to assess scan-rescan reproducibility. Diffusion-weighted imaging (DWI) was performed in 7 healthy subjects with b-values 0-800, 0-1000, and 0-1500 s/mm² and fluid-attenuated inversion recovery (FLAIR) DWI with b-values 0-1000 s/mm². All sequences were repeated with and without repositioning. The peak location, peak height, and mean ADC of the ADC histograms and the mean ADC of a region of interest (ROI) in the white matter were compared using paired-sample t tests. Scan-rescan reproducibility was assessed using paired-sample t tests, and repeatability coefficients were reported. With increasing maximum b-values, ADC histograms shifted to lower values, with an increase in peak height (p<0.01). With FLAIR DWI, the ADC histogram shifted to lower values with a significantly higher, narrower peak (p<0.01), although the ROI mean ADC showed no significant differences. For scan-rescan reproducibility, no significant differences were observed. Different DWI pulse sequences give rise to different ADC histograms. With a given pulse sequence, however, ADC histogram analysis is a robust and reproducible technique. Using FLAIR DWI, the partial-voluming effect of cerebrospinal fluid, and thus its confounding effect on histogram analyses, can be reduced

  5. Tissue Trace Elements and Lipid Peroxidation in Breeding Female Bank Voles Myodes glareolus.

    Science.gov (United States)

    Bonda-Ostaszewska, Elżbieta; Włostowski, Tadeusz; Łaszkiewicz-Tiszczenko, Barbara

    2018-04-27

    Recent studies have demonstrated that reproduction reduces oxidative damage in various tissues of small mammal females. The present work was designed to determine whether the reduction of oxidative stress in reproductive bank vole females was associated with changes in tissue trace elements (iron, copper, zinc) that play an essential role in the production of reactive oxygen species. Lipid peroxidation (a marker of oxidative stress) and iron concentration in liver, kidneys, and skeletal muscles of reproducing bank vole females that weaned one litter were significantly lower than in non-reproducing females; linear regression analysis confirmed a positive relation between the tissue iron and lipid peroxidation. The concentrations of copper were significantly lower only in skeletal muscles of reproductive females and correlated positively with lipid peroxidation. No changes in tissue zinc were found in breeding females when compared with non-breeding animals. These data indicate that decreases in tissue iron and copper concentrations may be responsible for the reduction of oxidative stress in reproductive bank vole females.

  6. Use of trace elements as indicators for underground fluid circulations in karstic environment

    International Nuclear Information System (INIS)

    Pane-Escribe, M.B.

    1995-01-01

    The geochemical study of trace element behaviour in karstic groundwaters has been carried out at the experimental site of Lamalou (Herault, France). Routine measurements of the physico-chemical parameters and of the dissolved element concentrations have been performed during two hydrological cycles. Radon has been monitored by passive detectors and by automatic electronic probes. Trace elements (Sc, Ti, V, Cr, Ni, Cu, Zn, As, Rb, Sr, Mo, Cd, Sb, Cs, Ba, Th, U) were analyzed by Inductively Coupled Plasma Mass Spectrometry (ICP-MS). The first part of this work presents the methodologies employed, in particular the improvement of the analytical performance of ICP-MS for water sample analysis. The detection limit for each considered element has been determined. The short- and long-term reproducibility of the sample analysis has also been tested. The second part of this study presents the treatment and interpretation of the results. This analysis has pointed out the influence of the aquifer structure on the distribution of the chemical elements. The trace and major element concentrations are effectively related to the fracturing state of the reservoir and allow the high transmissivity zones to be distinguished from zones of lower transmissivity. In this mono-lithological context, trace elements appear to be particularly efficient tracers for determining the water origin and circulation. Their spatial and temporal behaviour leads to the identification of three different origins for the water mineralization over the studied area: limestones, clays and external sources (rainfalls and occasional pollutions). (author)

  7. Code Differentiation for Hydrodynamic Model Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Henninger, R.J.; Maudlin, P.J.

    1999-06-27

    Use of a hydrodynamics code for experimental data fitting purposes (an optimization problem) requires information about how a computed result changes when the model parameters change. These so-called sensitivities provide the gradient that determines the search direction for modifying the parameters to find an optimal result. Here, the authors apply code-based automatic differentiation (AD) techniques in the forward and adjoint modes to two problems with 12 parameters to obtain these gradients, and compare the computational efficiency and accuracy of the various methods. They fit the pressure trace from a one-dimensional flyer-plate experiment and examine the accuracy for a two-dimensional jet-formation problem. For the flyer-plate experiment, the adjoint mode requires similar or less computer time than the forward methods. Additional parameters will not change the adjoint mode run time appreciably, which is a distinct advantage for this method. Obtaining 'accurate' sensitivities for the jet problem parameters remains problematic.
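    Forward-mode AD, one of the two modes compared above, can be illustrated with dual numbers: each value carries its derivative, and every arithmetic operation propagates both. This toy class is only a sketch of the principle; the study itself applied AD tools to a full hydrodynamics code, not a class like this.

    ```python
    class Dual:
        """A value paired with its derivative with respect to one input."""
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.der + other.der)
        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # product rule: (uv)' = u'v + uv'
            return Dual(self.val * other.val,
                        self.der * other.val + self.val * other.der)
        __rmul__ = __mul__

    def f(x):
        return x * x * x + 2 * x   # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2

    x = Dual(2.0, 1.0)             # seed the input's derivative with 1
    y = f(x)
    print(y.val, y.der)            # f(2) = 12, f'(2) = 14
    ```

    Forward mode needs one such sweep per parameter, which is why the abstract notes that the adjoint (reverse) mode, whose cost is nearly independent of the number of parameters, scales better for the 12-parameter fits.
    
    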

  8. Development of a new method for hydrogen isotope analysis of trace hydrocarbons in natural gas samples

    Directory of Open Access Journals (Sweden)

    Xibin Wang

    2016-12-01

    Full Text Available A new method has been developed for the analysis of the hydrogen isotopic composition of trace hydrocarbons in natural gas samples, using solid phase microextraction (SPME) combined with gas chromatography-isotope ratio mass spectrometry (GC/IRMS). In this study, the SPME technique was introduced to enrich trace hydrocarbons of low abundance and was coupled to GC/IRMS for hydrogen isotopic analysis. The main parameters, including the equilibration time, extraction temperature, and fiber type, were systematically optimized. The results demonstrated not only a high extraction yield but also that no hydrogen isotopic fractionation was observed during the extraction process when the SPME device was fitted with a polydimethylsiloxane/divinylbenzene/carbon molecular sieve (PDMS/DVB/CAR) fiber. The applicability of the SPME-GC/IRMS method was evaluated using natural gas samples collected from different sedimentary basins; the standard deviation (SD) was better than 4‰ for reproducible measurements, and hydrogen isotope values from C1 to C9 could be obtained with satisfying repeatability. The SPME-GC/IRMS method fitted with the PDMS/DVB/CAR fiber is well suited for the preconcentration of trace hydrocarbons, and provides reliable hydrogen isotopic analysis of trace hydrocarbons in natural gas samples.

  9. Examination of reproducibility in microbiological degredation experiments

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Spliid, Henrik; Holst, Helle

    1998-01-01

    Experimental data indicate that certain microbiological degradation experiments have a limited reproducibility. Nine identical batch experiments were carried out on 3 different days to examine reproducibility. A pure culture, isolated from soil, grew with toluene as the only carbon and energy source. Toluene was degraded under aerobic conditions at a constant temperature of 28 degrees C. The experiments were modelled by a Monod model, extended to meet the air/liquid system, and the parameter values were estimated using a statistical nonlinear estimation procedure. Model reduction analysis resulted in a simpler model without the biomass decay term. In order to test for model reduction and reproducibility of parameter estimates, a likelihood ratio test was employed. The limited reproducibility of these experiments implied that all 9 batch experiments could not be described by the same set...

  10. Development of a new simulation code for evaluation of criticality transients involving fissile solution boiling

    International Nuclear Information System (INIS)

    Basoglu, Benan; Yamamoto, Toshihiro; Okuno, Hiroshi; Nomura, Yasushi

    1998-03-01

    In this work, we report on the development of a new computer code named TRACE for predicting the characteristics of criticality excursions involving fissile solutions. TRACE employs point neutronics coupled with simple thermal-hydraulics. The temperature, the radiolytic gas effects, and the boiling phenomena are estimated using the transient heat conduction equation, a lumped-parameter energy model, and a simple boiling model, respectively. To evaluate the model, we compared our results with the results of the CRAC experiments. The agreement in these comparisons is quite satisfactory. (author)
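    The modelling approach described above, point neutronics coupled with simple thermal feedback, can be sketched as follows. All parameter values are illustrative and are not taken from the TRACE code: this is a one-delayed-group point kinetics model with a lumped heat balance and a linear negative temperature feedback, omitting the boiling and radiolytic gas models.

    ```python
    def excursion(reactivity0=0.002, alpha=1e-5, steps=20000, dt=1e-4):
        """Explicit-Euler point kinetics with one delayed neutron group
        and a lumped thermal model providing negative feedback."""
        beta, lam, gen_time = 0.0065, 0.08, 1e-4   # delayed fraction, decay
                                                   # constant (1/s), Λ (s)
        heat_cap = 1000.0                          # J/K, lumped solution mass
        p, temp = 1.0, 20.0                        # power (W), temperature (°C)
        c = beta * p / (lam * gen_time)            # equilibrium precursor conc.
        history = []
        for _ in range(steps):
            rho = reactivity0 - alpha * (temp - 20.0)   # temperature feedback
            dp = ((rho - beta) / gen_time) * p + lam * c
            dc = (beta / gen_time) * p - lam * c
            p += dp * dt
            c += dc * dt
            temp += p * dt / heat_cap              # all fission power as heat
            history.append(p)
        return p, temp, max(history)

    p, temp, peak = excursion()
    print(p, temp, peak)
    ```

    With a positive step reactivity below prompt critical, the power undergoes a prompt jump and then rises on a delayed-neutron period while the feedback term slowly erodes the reactivity, the qualitative behaviour such codes are built to capture.
    
    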

  11. Remodularizing Java Programs for Improved Locality of Feature Implementations in Source Code

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    Explicit traceability between features and source code is known to help programmers to understand and modify programs during maintenance tasks. However, the complex relations between features and their implementations are not evident from the source code of object-oriented Java programs. Consequently, the implementations of individual features are difficult to locate, comprehend, and modify in isolation. In this paper, we present a novel remodularization approach that improves the representation of features in the source code of Java programs. Both forward and reverse restructurings are supported through on-demand bidirectional restructuring between feature-oriented and object-oriented decompositions. The approach includes a feature location phase based on tracing program execution, and a feature representation phase that reallocates classes into a new package structure based on single...

  12. Repeatability and reproducibility of intracellular molar concentration assessed by synchrotron-based x-ray fluorescence microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Merolle, L., E-mail: lucia.merolle@elettra.eu; Gianoncelli, A. [Elettra - Sincrotrone Trieste, 34149 Basovizza, Trieste (Italy); Malucelli, E., E-mail: emil.malucelli@unibo.it; Cappadone, C.; Farruggia, G.; Sargenti, A.; Procopio, A. [Department of Pharmacy and Biotechnology, University of Bologna, Bologna 40127 (Italy); Fratini, M. [Museo Storico della Fisica e Centro Studi e Ricerche Enrico Fermi, Piazza del Viminale 1, 00184 Roma Italy (Italy); Department of Science, Roma Tre University, Via della Vasca Navale 84, I-00146 Rome (Italy); Notargiacomo, A. [Institute for Photonics and Nanotechnology, Consiglio Nazionale delle Richerche, 00156 Rome (Italy); Lombardo, M. [Department of Chemistry “G. Ciamician”, University of Bologna, Bologna 40126 (Italy); Lagomarsino, S. [Institute of Chemical-Physical Processes, Sapienza University of Rome, 00185 Rome (Italy); National Institute of Biostructures and Biosystems, 00136 Rome (Italy); Iotti, S. [Department of Pharmacy and Biotechnology, University of Bologna, Bologna 40127 (Italy); National Institute of Biostructures and Biosystems, 00136 Rome (Italy)

    2016-01-28

    Elemental analysis of biological samples can give information about the content and distribution of elements essential for human life, or of trace elements whose absence is the cause of abnormal biological function or development. However, biological systems contain an ensemble of cells with heterogeneous chemistry and elemental content; therefore, accurate characterization of samples with high cellular heterogeneity may only be achieved by analyzing single cells. Powerful methods in molecular biology are abundant; among them, X-ray microscopy based on synchrotron light sources has been gaining increasing attention thanks to its extreme sensitivity. However, the reproducibility and repeatability of these measurements is one of the major obstacles to achieving statistical significance in single-cell population analysis. In this study, we compared the elemental content of human colon adenocarcinoma cells obtained through three distinct accesses to synchrotron radiation light.

  13. Multi-code analysis of scrape-off layer filament dynamics in MAST

    DEFF Research Database (Denmark)

    Militello, F.; Walkden, N. R.; Farley, T.

    2016-01-01

    velocities of the order of 1 km s^-1, a perpendicular diameter of around 2-3 cm and a density amplitude 2-3.5 times the background plasma. 3D and 2D numerical codes (the STORM module of BOUT++, GBS, HESEL and TOKAM3X) are used to reproduce the motion of the observed filaments with the purpose of validating...

  14. A Message Without a Code?

    Directory of Open Access Journals (Sweden)

    Tom Conley

    1981-01-01

    Full Text Available The photographic paradox is said to be that of a message without a code, a communication lacking a relay or gap essential to the process of communication. Tracing the recurrence of Barthes's definition in the essays included in Image/Music/Text and in La Chambre claire , this paper argues that Barthes's definition is platonic in its will to dematerialize the troubling — graphic — immediacy of the photograph. He writes of the image in order to flee its signature. As a function of media, his categories are written in order to be insufficient and inadequate; to maintain an ineluctable difference between language heard and letters seen; to protect an idiom of loss which the photograph disallows. The article studies the strategies of his definition in «The Photographic Paradox» as instrument of abstraction, opposes the notion of code, in an aural sense, to audio-visual markers of closed relay in advertising, and critiques the layout and order of La Chambre claire in respect to Barthes's ideology of absence.

  15. Modelling of SOL flows and target asymmetries in JET field reversal experiments with EDGE2D code

    International Nuclear Information System (INIS)

    Chankin, A.; Coad, J.; Corrigan, G.

    1999-11-01

    The EDGE2D code with drifts can reproduce the main trends of target asymmetries observed in field reversal experiments. It also reproduces qualitatively the main feature of recent JET results obtained with double-sided reciprocating Langmuir probes introduced near the top of the torus: the reversal of parallel plasma flow with toroidal field reversal. The code results suggest that the major contributor to the observed target asymmetries is the co-current toroidal momentum generated inside the scrape-off layer (SOL) by j_r × B forces due to the presence of large up-down pressure asymmetries. Contrary to previous expectations of the predominant role of E×B drifts in creating target asymmetries, ∇B and centrifugal drifts were found to be mainly responsible for both the parallel flows and the target asymmetries. (author)

  16. Trace analysis of auxiliary feedwater capacity for Maanshan PWR loss-of-normal-feedwater transient

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Che-Hao; Shih, Chunkuan [National Tsing Hua Univ., Taiwan (China). Inst. of Nuclear Engineering and Science; Wang, Jong-Rong; Lin, Hao-Tzu [Atomic Energy Council, Taiwan (China). Inst. of Nuclear Energy Research

    2013-07-01

    Maanshan nuclear power plant is a Westinghouse PWR of Taiwan Power Company (Taipower, TPC). A few years ago, TPC made many assessments in order to uprate the power of Maanshan NPP. The assessments include NSSS (Nuclear Steam Supply System) parameter calculations, uncertainty acceptance, integrity of the pressure vessel, reliability of auxiliary systems, and transient analyses. Since the Fukushima Daiichi accident, it has become necessary to consider transients with multiple failures. Based on this analysis, we further study the auxiliary feedwater capability for the Loss-of-Normal-Feedwater (LONF) transient. LONF is the limiting non-turbine-trip initiated event for ATWS (Anticipated Transient Without Scram), which results in a reduced capability of the secondary system to remove the heat generated in the reactor core. If the turbine fails to trip immediately, the secondary water inventory will decrease significantly before the actuation of the auxiliary feedwater (AFW) system. The heat removal from the primary side decreases, and this leads to increases of primary coolant temperature and pressure. The water level of the pressurizer also increases subsequently. The heat removal through the relief valves and the auxiliary feedwater is not sufficient to fully cope with the heat generation from the primary side. The pressurizer will finally be filled with water, and the RCS pressure might rise above the set point of the relief valves for water discharge. RCS pressure depends on steam generator inventory, primary coolant temperature, negative reactivity feedback, and core power. The RCS pressure may reach its peak after core power reduction. According to the ASME Code Level C service limit criteria, the Reactor Coolant System (RCS) pressure must remain under 22.06 MPa. The USNRC is developing an advanced thermal-hydraulic code named TRACE for nuclear power plant safety analysis. The development of TRACE is based on TRAC, integrating features of RELAP5 and other programs. 
SNAP

  17. Trace analysis of auxiliary feedwater capacity for Maanshan PWR loss-of-normal-feedwater transient

    International Nuclear Information System (INIS)

    Chen, Che-Hao; Shih, Chunkuan; Wang, Jong-Rong; Lin, Hao-Tzu

    2013-01-01

    Maanshan nuclear power plant is a Westinghouse PWR of Taiwan Power Company (Taipower, TPC). A few years ago, TPC made many assessments in order to uprate the power of Maanshan NPP. The assessments include NSSS (Nuclear Steam Supply System) parameter calculations, uncertainty acceptance, integrity of the pressure vessel, reliability of auxiliary systems, and transient analyses. Since the Fukushima Daiichi accident, it has become necessary to consider transients with multiple failures. Based on this analysis, we further study the auxiliary feedwater capability for the Loss-of-Normal-Feedwater (LONF) transient. LONF is the limiting non-turbine-trip initiated event for ATWS (Anticipated Transient Without Scram), which results in a reduced capability of the secondary system to remove the heat generated in the reactor core. If the turbine fails to trip immediately, the secondary water inventory will decrease significantly before the actuation of the auxiliary feedwater (AFW) system. The heat removal from the primary side decreases, and this leads to increases of primary coolant temperature and pressure. The water level of the pressurizer also increases subsequently. The heat removal through the relief valves and the auxiliary feedwater is not sufficient to fully cope with the heat generation from the primary side. The pressurizer will finally be filled with water, and the RCS pressure might rise above the set point of the relief valves for water discharge. RCS pressure depends on steam generator inventory, primary coolant temperature, negative reactivity feedback, and core power. The RCS pressure may reach its peak after core power reduction. According to the ASME Code Level C service limit criteria, the Reactor Coolant System (RCS) pressure must remain under 22.06 MPa. The USNRC is developing an advanced thermal-hydraulic code named TRACE for nuclear power plant safety analysis. The development of TRACE is based on TRAC, integrating features of RELAP5 and other programs. 
SNAP

  18. Guidelines for Reproducibly Building and Simulating Systems Biology Models.

    Science.gov (United States)

    Medley, J Kyle; Goldberg, Arthur P; Karr, Jonathan R

    2016-10-01

    Reproducibility is the cornerstone of the scientific method. However, currently, many systems biology models cannot easily be reproduced. This paper presents methods that address this problem. We analyzed the recent Mycoplasma genitalium whole-cell (WC) model to determine the requirements for reproducible modeling. We determined that reproducible modeling requires both repeatable model building and repeatable simulation. New standards and simulation software tools are needed to enhance and verify the reproducibility of modeling. New standards are needed to explicitly document every data source and assumption, and new deterministic parallel simulation tools are needed to quickly simulate large, complex models. We anticipate that these new standards and software will enable researchers to reproducibly build and simulate more complex models, including WC models.

  19. Evaluations of the CCFL and critical flow models in TRACE for PWR LBLOCA analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jung-Hua; Lin, Hao Tzu [National Tsing Hua Univ., HsinChu, Taiwan (China). Dept. of Engineering and System Science; Wang, Jong-Rong [Atomic Energy Council, Taoyuan County, Taiwan (China). Inst. of Nuclear Energy Research; Shih, Chunkuan [National Tsing Hua Univ., HsinChu, Taiwan (China). Inst. of Nuclear Engineering and Science

    2012-12-15

    This study develops a Maanshan Pressurized Water Reactor (PWR) analysis model using the TRACE (TRAC/RELAP Advanced Computational Engine) code. The Large Break Loss of Coolant Accident (LBLOCA) sequence is analyzed, and the results are compared with the Maanshan Final Safety Analysis Report (FSAR) data. The critical flow and Counter Current Flow Limitation (CCFL) models play an important role in the overall performance of the TRACE LBLOCA prediction. Therefore, a sensitivity study on the discharge coefficients of the critical flow model and on CCFL modeling in different regions is also discussed. The current conclusions show that modeling CCFL in the downcomer has a more significant impact on the peak cladding temperature than modeling CCFL in the hot legs does. No CCFL phenomena occurred in the pressurizer surge line. The best value for the multipliers of the critical flow model is 0.5, with which TRACE consistently predicts the break flow rate in the LBLOCA analysis as shown in the FSAR. (orig.)

  20. Accuracy assessment of a new Monte Carlo based burnup computer code

    International Nuclear Information System (INIS)

    El Bakkari, B.; ElBardouni, T.; Nacir, B.; ElYounoussi, C.; Boulaich, Y.; Meroun, O.; Zoubair, M.; Chakir, E.

    2012-01-01

    Highlights: ► A new burnup code called BUCAL1 was developed. ► BUCAL1 uses the MCNP tallies directly in the calculation of the isotopic inventories. ► Validation of BUCAL1 was done by code-to-code comparison using a VVER-1000 LEU benchmark assembly. ► Differences from benchmark values were found to be ±600 pcm for k∞ and ±6% for the isotopic compositions. ► The effect on reactivity due to the burnup of Gd isotopes is well reproduced by BUCAL1. - Abstract: This study tests the suitability and accuracy of a new in-house Monte Carlo burnup code, called BUCAL1, by investigating and predicting the neutronic behavior of a “VVER-1000 LEU Assembly Computational Benchmark” at the lattice level. BUCAL1 uses MCNP tally information directly in the computation; this approach allows performing straightforward and accurate calculations without having to pass the calculated group fluxes to a separate transmutation code. The ENDF/B-VII evaluated nuclear data library was used in these calculations. Processing of the data library is performed using recent updates of the NJOY99 system. Code-to-code comparisons with the reported OECD/NEA results are presented and analyzed.
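
    The coupling the abstract describes, using tallied one-group fluxes to drive transmutation, reduces at its simplest to a single-nuclide depletion step. The sketch below is illustrative only (the function name, values, and the analytic one-nuclide solution are assumptions, not BUCAL1's actual implementation):

```python
import math

def deplete_nuclide(n0, flux, sigma_a, lam, dt):
    """Analytic single-nuclide depletion over one step:
    dN/dt = -(flux*sigma_a + lambda) * N, so N(dt) = N0 * exp(-removal*dt)."""
    removal = flux * sigma_a + lam  # total removal rate (1/s)
    return n0 * math.exp(-removal * dt)

# Example: burn a fictitious stable absorber for 30 days.
n_end = deplete_nuclide(
    n0=1.0e20,          # initial atom density (atoms/cm^3)
    flux=1.0e14,        # one-group scalar flux (n/cm^2/s), e.g. from a tally
    sigma_a=1.0e-24,    # 1 barn absorption cross-section, in cm^2
    lam=0.0,            # stable nuclide: no radioactive decay
    dt=30 * 24 * 3600,  # 30 days in seconds
)
```

    A full burnup code chains many such nuclides through coupled Bateman equations and re-tallies the flux between steps; this fragment only shows the removal term for one of them.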

  1. Analysing traces of autoinducer-2 requires standardization of the Vibrio harveyi bioassay.

    Science.gov (United States)

    Vilchez, Ramiro; Lemme, André; Thiel, Verena; Schulz, Stefan; Sztajer, Helena; Wagner-Döbler, Irene

    2007-01-01

    Autoinducer-2 (furanosyl borate diester) is a biologically active compound whose role as a universal bacterial signalling molecule is currently under intense investigation. Because of its instability and the low concentrations of it found in biological samples, its detection relies at present on a bioassay that measures the difference in the timing of the luminescence of the Vibrio harveyi BB170 sensor strain with and without externally added AI-2. Here we systematically investigated which parameters affected the fold induction values of luminescence obtained in the bioassay and developed a modified protocol. Our experiments showed that growth and luminescence of V. harveyi BB170 are strongly influenced by trace elements. In particular, addition of Fe(3+) within a certain concentration range to the growth medium of the preinoculum culture improved the reproducibility and reduced the variance of the bioassay. In contrast, trace elements and vitamins introduced directly into the bioassay caused inhibitory effects. The initial density and luminescence of the sensor strain are very important and the values required for these parameters were defined. Borate interferes with the detection of AI-2 by giving false positive results. The response of V. harveyi BB170 to chemically synthesized AI-2 in the bioassay is nonlinear except over a very small concentration range; it is maximum over three orders of magnitude and shows inhibition above 35 microM. Based on the modified protocol, we were able to detect AI-2 in the absence of inhibitors with maximum fold induction values for the positive control (chemically synthesized AI-2) of >120 with a standard deviation of approximately 30% in a reliable and reproducible way.
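
    The fold-induction readout described above is, at its simplest, a ratio of reporter luminescence with and without added AI-2. A minimal sketch (the simple ratio form and all names are assumptions for illustration, not the exact published protocol):

```python
def fold_induction(lum_sample, lum_control):
    """Fold induction: reporter luminescence with the test sample divided
    by the luminescence of the control without added AI-2."""
    if lum_control <= 0:
        raise ValueError("control luminescence must be positive")
    return lum_sample / lum_control

# e.g. a positive control reading 1.3e6 relative light units against a
# 1.0e4 RLU background corresponds to the >120-fold range reported above
ratio = fold_induction(1.3e6, 1.0e4)
```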

  2. A molecular dynamics simulation code ISIS

    International Nuclear Information System (INIS)

    Kambayashi, Shaw

    1992-06-01

    Computer simulation based on the molecular dynamics (MD) method has become an important tool, complementary to experiments and theoretical calculations, in a wide range of scientific fields such as physics, chemistry, and biology. In the MD method, the Newtonian equations of motion of classical particles are integrated numerically to reproduce a phase-space trajectory of the system. In the 1980s, several new techniques were developed for simulation at constant temperature and/or constant pressure, making it convenient to compare simulation results with experiments. We first summarize the MD method for both microcanonical and canonical simulations. Then we present an overview of the newly developed ISIS (Isokinetic Simulation of Soft-spheres) code and its performance on various computers, including vector processors. The ISIS code can perform MD simulations under constant-temperature conditions by using the isokinetic constraint method. The equations of motion are integrated by a very accurate fifth-order finite-difference algorithm. The bookkeeping method is also utilized to reduce the computational time. Furthermore, the ISIS code is well suited to vector processing: speedup ratios ranging from 16 to 24 are obtained on a VP2600/10 vector processor. (author)
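
    Constant-temperature MD of the kind described can be illustrated with a toy velocity-Verlet integrator that rescales velocities each step so the kinetic energy stays fixed, a crude stand-in for the isokinetic constraint. Everything here (names, forces, parameters) is hypothetical and unrelated to the actual ISIS code:

```python
import math

def kinetic_energy(vs, m=1.0):
    """Total kinetic energy of a set of 1-D particle velocities."""
    return 0.5 * m * sum(v * v for v in vs)

def velocity_verlet_isokinetic(xs, vs, force, dt, target_ke, m=1.0, steps=100):
    """Velocity-Verlet integration with per-step velocity rescaling
    that holds the kinetic energy at target_ke."""
    fs = [force(x) for x in xs]
    for _ in range(steps):
        # position update, then force at the new positions, then velocity update
        xs = [x + v * dt + 0.5 * (f / m) * dt * dt for x, v, f in zip(xs, vs, fs)]
        new_fs = [force(x) for x in xs]
        vs = [v + 0.5 * (f + nf) / m * dt for v, f, nf in zip(vs, fs, new_fs)]
        fs = new_fs
        # isokinetic rescaling: pin the kinetic energy to the target value
        ke = kinetic_energy(vs, m)
        if ke > 0.0:
            scale = math.sqrt(target_ke / ke)
            vs = [v * scale for v in vs]
    return xs, vs

# Two independent harmonic oscillators, F = -k x with k = 1
xs, vs = velocity_verlet_isokinetic(
    xs=[1.0, -0.5], vs=[0.0, 0.3],
    force=lambda x: -1.0 * x, dt=0.01, target_ke=0.05)
```

    A production code would use a higher-order integrator (the abstract mentions a fifth-order scheme) and neighbour bookkeeping to avoid the O(N²) force loop; the rescaling step above only conveys the constraint idea.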

  3. ER@CEBAF: Modeling code developments

    Energy Technology Data Exchange (ETDEWEB)

    Meot, F. [Brookhaven National Lab. (BNL), Upton, NY (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Roblin, Y. [Brookhaven National Lab. (BNL), Upton, NY (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-04-13

    A proposal for a multiple-pass, high-energy, energy-recovery experiment using CEBAF is under preparation in the frame of a JLab-BNL collaboration. In view of beam dynamics investigations regarding this project, in addition to the existing Elegant model, a version of CEBAF is being developed in the stepwise ray-tracing code Zgoubi. Beyond the ER experiment, it is also planned to use the latter for the study of polarization transport in the presence of synchrotron radiation, down to the Hall D line where a 12 GeV polarized beam can be delivered. This Note briefly reports on the preliminary steps and outcomes, based on an Elegant-to-Zgoubi translation.

  4. Genotypic variability enhances the reproducibility of an ecological study.

    Science.gov (United States)

    Milcu, Alexandru; Puga-Freitas, Ruben; Ellison, Aaron M; Blouin, Manuel; Scheu, Stefan; Freschet, Grégoire T; Rose, Laura; Barot, Sebastien; Cesarz, Simone; Eisenhauer, Nico; Girin, Thomas; Assandri, Davide; Bonkowski, Michael; Buchmann, Nina; Butenschoen, Olaf; Devidal, Sebastien; Gleixner, Gerd; Gessler, Arthur; Gigon, Agnès; Greiner, Anna; Grignani, Carlo; Hansart, Amandine; Kayler, Zachary; Lange, Markus; Lata, Jean-Christophe; Le Galliard, Jean-François; Lukac, Martin; Mannerheim, Neringa; Müller, Marina E H; Pando, Anne; Rotter, Paula; Scherer-Lorenzen, Michael; Seyhun, Rahme; Urban-Mead, Katherine; Weigelt, Alexandra; Zavattaro, Laura; Roy, Jacques

    2018-02-01

    Many scientific disciplines are currently experiencing a 'reproducibility crisis' because numerous scientific findings cannot be repeated consistently. A novel but controversial hypothesis postulates that stringent levels of environmental and biotic standardization in experimental studies reduce reproducibility by amplifying the impacts of laboratory-specific environmental factors not accounted for in study designs. A corollary to this hypothesis is that a deliberate introduction of controlled systematic variability (CSV) in experimental designs may lead to increased reproducibility. To test this hypothesis, we had 14 European laboratories run a simple microcosm experiment using grass (Brachypodium distachyon L.) monocultures and grass and legume (Medicago truncatula Gaertn.) mixtures. Each laboratory introduced environmental and genotypic CSV within and among replicated microcosms established in either growth chambers (with stringent control of environmental conditions) or glasshouses (with more variable environmental conditions). The introduction of genotypic CSV led to 18% lower among-laboratory variability in growth chambers, indicating increased reproducibility, but had no significant effect in glasshouses where reproducibility was generally lower. Environmental CSV had little effect on reproducibility. Although there are multiple causes for the 'reproducibility crisis', deliberately including genetic variability may be a simple solution for increasing the reproducibility of ecological studies performed under stringently controlled environmental conditions.

  5. The Economics of Reproducibility in Preclinical Research.

    Directory of Open Access Journals (Sweden)

    Leonard P Freedman

    2015-06-01

    Full Text Available Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B) per year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.

  6. Predictive modelling of the impact of argon injection on H-mode plasmas in JET with the RITM code

    International Nuclear Information System (INIS)

    Unterberg, B; Kalupin, D; Tokar', M Z; Corrigan, G; Dumortier, P; Huber, A; Jachmich, S; Kempenaars, M; Kreter, A; Messiaen, A M; Monier-Garbet, P; Ongena, J; Puiatti, M E; Valisa, M; Hellermann, M von

    2004-01-01

    Self-consistent modelling of energy and particle transport of the plasma background and impurities has been performed with the code RITM for argon seeded high density H-mode plasmas in JET. The code can reproduce both the profiles in the plasma core and the structure of the edge pedestal. The impact of argon on core transport is found to be small; in particular, no significant change in confinement is observed in both experimental and modelling results. The same transport model, which has been used to reproduce density peaking in the radiative improved mode in TEXTOR, reveals a flat density profile in Ar seeded JET H-mode plasmas in agreement with the experimental observations. This behaviour is attributed to the rather flat profile of the safety factor in the bulk of H-mode discharges

  7. Does systematic variation improve the reproducibility of animal experiments?

    NARCIS (Netherlands)

    Jonker, R.M.; Guenther, A.; Engqvist, L.; Schmoll, T.

    2013-01-01

    Reproducibility of results is a fundamental tenet of science. In this journal, Richter et al. [1] tested whether systematic variation in experimental conditions (heterogenization) affects the reproducibility of results. Comparing this approach with the current standard of ensuring reproducibility

  8. On initial Brain Activity Mapping of episodic and semantic memory code in the hippocampus.

    Science.gov (United States)

    Tsien, Joe Z; Li, Meng; Osan, Remus; Chen, Guifen; Lin, Longian; Wang, Phillip Lei; Frey, Sabine; Frey, Julietta; Zhu, Dajiang; Liu, Tianming; Zhao, Fang; Kuang, Hui

    2013-10-01

    It has been widely recognized that understanding the brain code will require large-scale recording and decoding of brain activity patterns. In 2007, with support from the Georgia Research Alliance, we launched the Brain Decoding Project Initiative, with a basic idea now similarly advocated by the BRAIN project and the Brain Activity Map proposal. As the planning of the BRAIN project is currently underway, we share our insights and lessons from our efforts in mapping real-time episodic memory traces in the hippocampus of freely behaving mice. We show that appropriate large-scale statistical methods are essential to decipher and measure real-time memory traces and neural dynamics. We also provide an example of how carefully designed, sometimes thinking-outside-the-box behavioral paradigms can be highly instrumental in unraveling the memory-coding cell assembly organizing principle in the hippocampus. Our observations to date have led us to conclude that the specific-to-general categorical and combinatorial feature-coding cell assembly mechanism represents an emergent property enabling neural networks to generate and organize not only episodic memory but also semantic knowledge and imagination. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Method for calculating internal radiation and ventilation with the ADINAT heat-flow code

    International Nuclear Information System (INIS)

    Butkovich, T.R.; Montan, D.N.

    1980-01-01

    One objective of the spent fuel test in Climax Stock granite (SFTC) is to correctly model the thermal transport, and the changes in the stress field and accompanying displacements, from the application of the thermal loads. We have chosen the ADINA and ADINAT finite element codes for these calculations. ADINAT is a heat transfer code compatible with the ADINA displacement and stress analysis code. The heat flow problem encountered at SFTC requires a code with conduction, radiation, and ventilation capabilities, which the present version of ADINAT does not have. We have devised a method for calculating internal radiation and ventilation with the ADINAT code. This method effectively reproduces the results from the TRUMP multi-dimensional finite difference code, which correctly models radiative heat transport between drift surfaces, conductive and convective thermal transport to and through air in the drifts, and mass flow of air in the drifts. The temperature histories for each node in the finite element mesh calculated with ADINAT using this method can be used directly in the ADINA thermal-mechanical calculation

  10. Sticks and Stones: Why First Amendment Absolutism Fails When Applied to Campus Harassment Codes.

    Science.gov (United States)

    Lumsden, Linda

    This paper analyzes how absolutist arguments against campus harassment codes violate the spirit of the first amendment, examining in particular the United States Supreme Court ruling in "RAV v. St. Paul." The paper begins by tracing the current development of first amendment doctrine, analyzing its inadequacy in the campus hate speech…

  11. Post-test analysis of PIPER-ONE PO-IC-2 experiment by RELAP5/MOD3 codes

    International Nuclear Information System (INIS)

    Bovalini, R.; D'Auria, F.; Galassi, G.M.; Mazzini, M.

    1996-11-01

    RELAP5/MOD3.1 was applied to the PO-IC-2 experiment performed in the PIPER-ONE facility, which has been modified to reproduce typical isolation condenser thermal-hydraulic conditions. RELAP5 is a well-known code widely used at the University of Pisa during the past seven years. RELAP5/MOD3.1 was the latest version of the code made available by the Idaho National Engineering Laboratory at the time of the reported study. PIPER-ONE is an experimental facility simulating a General Electric BWR-6 with volume and height scaling ratios of 1/2200 and 1/1, respectively. In the frame of the present activity, a once-through heat exchanger immersed in a pool of ambient-temperature water, installed approximately 10 m above the core, was utilized to reproduce qualitatively the phenomenologies expected for the Isolation Condenser in the simplified BWR (SBWR). The PO-IC-2 experiment is a follow-up of PO-SD-8 and has been designed to solve some of the problems encountered in the analysis of the PO-SD-8 experiment. A wide-ranging analysis is presented hereafter, including the use of different code versions

  12. Predictions of bubbly flows in vertical pipes using two-fluid models in CFDS-FLOW3D code

    International Nuclear Information System (INIS)

    Banas, A.O.; Carver, M.B.; Unrau, D.

    1995-01-01

    This paper reports the results of a preliminary study exploring the performance of two sets of two-fluid closure relationships applied to the simulation of turbulent air-water bubbly upflows through vertical pipes. Predictions obtained with the default CFDS-FLOW3D model for dispersed flows were compared with the predictions of a new model (based on the work of Lee), and with the experimental data of Liu. The new model, implemented in the CFDS-FLOW3D code, included additional source terms in the "standard" κ-ε transport equations for the liquid phase, as well as modified model coefficients and wall functions. All simulations were carried out in a 2-D axisymmetric format, collapsing the general multifluid framework of CFDS-FLOW3D to the two-fluid (air-water) case. The newly implemented model consistently improved predictions of radial-velocity profiles of both phases, but failed to accurately reproduce the experimental phase-distribution data. This shortcoming was traced to the neglect of anisotropic effects in the modelling of liquid-phase turbulence. In this sense, the present investigation should be considered as the first step toward the ultimate goal of developing a theoretically sound and universal CFD-type two-fluid model for bubbly flows in channels

  13. Predictions of bubbly flows in vertical pipes using two-fluid models in CFDS-FLOW3D code

    Energy Technology Data Exchange (ETDEWEB)

    Banas, A.O.; Carver, M.B. [Chalk River Laboratories (Canada); Unrau, D. [Univ. of Toronto (Canada)

    1995-09-01

    This paper reports the results of a preliminary study exploring the performance of two sets of two-fluid closure relationships applied to the simulation of turbulent air-water bubbly upflows through vertical pipes. Predictions obtained with the default CFDS-FLOW3D model for dispersed flows were compared with the predictions of a new model (based on the work of Lee), and with the experimental data of Liu. The new model, implemented in the CFDS-FLOW3D code, included additional source terms in the "standard" κ-ε transport equations for the liquid phase, as well as modified model coefficients and wall functions. All simulations were carried out in a 2-D axisymmetric format, collapsing the general multifluid framework of CFDS-FLOW3D to the two-fluid (air-water) case. The newly implemented model consistently improved predictions of radial-velocity profiles of both phases, but failed to accurately reproduce the experimental phase-distribution data. This shortcoming was traced to the neglect of anisotropic effects in the modelling of liquid-phase turbulence. In this sense, the present investigation should be considered as the first step toward the ultimate goal of developing a theoretically sound and universal CFD-type two-fluid model for bubbly flows in channels.

  14. Investigation of trace elements in Elbe water by means of instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Motamedi, K.

    1977-01-01

    Investigations of trace elements in Elbe water were carried out as a contribution to environmental research, hydrology, and geochemistry. The method applied - instrumental neutron activation analysis - is described, and problems connected with the course of the analysis - sample taking, handling and preparation, as well as optimization of in-pile irradiation and measurement by means of γ spectrometry - are discussed one by one. The computer programme AKAN, set up for automatic evaluation, is described in more detail; it has a very general concept which makes it suitable for general use. The reliability of the evaluation procedure - the monostandard method - and the reproducibility of the results are discussed. For the studies, samples were taken at different times, each time from 8 positions along a long section of the Elbe. The content of solids was analyzed; in a number of samples, this was done by separating suspended and dissolved materials. Up to 38 elements were analyzed, and their local and time-dependent concentration curves are given. The contents of some elements are compared with the few available data from the literature. Correlation calculations indicate a similar behaviour of single element groups and yield information on the natural origin of the trace elements and on the anthropogenic influence noticeable in the trace element contents. (orig.) [de]

  15. Learning Reproducibility with a Yearly Networking Contest

    KAUST Repository

    Canini, Marco

    2017-08-10

    Better reproducibility of networking research results is currently a major goal that the academic community is striving towards. This position paper makes the case that improving the extent and pervasiveness of reproducible research can be greatly fostered by organizing a yearly international contest. We argue that holding a contest undertaken by a plurality of students will have benefits that are two-fold. First, it will promote hands-on learning of skills that are helpful in producing artifacts at the replicable-research level. Second, it will advance the best practices regarding environments, testbeds, and tools that will aid the tasks of reproducibility evaluation committees by and large.

  16. Thou Shalt Be Reproducible! A Technology Perspective

    Directory of Open Access Journals (Sweden)

    Patrick Mair

    2016-07-01

    Full Text Available This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open-source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies.
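
    As a toy illustration of the dynamic-report idea (real workflows would use R Markdown or knitr, as the article advocates; all names and values here are hypothetical), a script can regenerate the report text and statistics from the raw data on every run, so the prose can never drift from the analysis:

```python
import os
import statistics
import tempfile

def build_report(data, path):
    """Regenerate a tiny Markdown report from the raw data on every run,
    so the reported numbers always come from the analysis itself."""
    mean = statistics.mean(data)
    sd = statistics.stdev(data)
    report = "\n".join([
        "# Analysis report",
        "",
        f"n = {len(data)}, mean = {mean:.2f}, sd = {sd:.2f}",
    ])
    with open(path, "w") as fh:
        fh.write(report)
    return report

report = build_report([4.1, 3.9, 4.4, 4.0, 4.2],
                      os.path.join(tempfile.gettempdir(), "report.md"))
```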

  17. Reproducing ten years of road ageing - Accelerated carbonation and leaching of EAF steel slag

    International Nuclear Information System (INIS)

    Suer, Pascal; Lindqvist, Jan-Erik; Arm, Maria; Frogner-Kockum, Paul

    2009-01-01

    Reuse of industrial aggregates is still hindered by concern for their long-term properties. This paper proposes a laboratory method for accelerated ageing of steel slag, to predict environmental and technical properties, starting from fresh slag. Ageing processes in a 10-year-old asphalt road with steel slag of electric arc furnace (EAF) type in the subbase were identified by scanning electron microscopy (SEM) and leaching tests. Samples from the road centre and the pavement edge were compared with each other and with samples of fresh slag. It was found that slag from the pavement edge showed traces of carbonation and leaching processes, whereas the road centre material was nearly identical to fresh slag, in spite of an accessible particle structure. Batches of moisturized road centre material exposed to oxygen, nitrogen or carbon dioxide (CO2) were used for accelerated ageing. Time (7-14 days), temperature (20-40 °C) and initial slag moisture content (8-20%) were varied to achieve the carbonation (decrease in pH) and leaching that was observed in the pavement edge material. After ageing, water was added to assess leaching of metals and macroelements. 12% moisture, CO2 and seven days at 40 °C gave the lowest pH value. This also reproduced the observed ageing effect for Ca, Cu, Ba, Fe, Mn, Pb (decreased leaching) and for V, Si, and Al (increased leaching). However, ageing effects on SO4, DOC and Cr were not reproduced.

  18. Reproducible research in palaeomagnetism

    Science.gov (United States)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. 
We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined.

  19. Nuclear traces in glass

    International Nuclear Information System (INIS)

    Segovia A, M. de N.

    1978-01-01

    The charged particles produce, in dielectric materials, physical and chemical effects which make evident the damaged zone along the trajectory of the particle. This damaged zone is known as the latent trace. Latent traces can be enlarged by etching the detector material. This treatment preferentially attacks the zones of the material where charged particles have penetrated, producing concavities which can be observed through a low-magnification optical microscope. These concavities are known as developed traces. In this work we describe the characteristics of glass as a detector of fission fragment traces. In the first chapter we present a summary of the existing basic theories explaining the formation of traces in solids. In the second chapter we describe the etching method used for trace development. In the following chapters we determine some characteristics of the traces formed on the glass, such as: the optimum development time; the variation of trace diameter and density with the temperature of the detector; the response of the glass to radiation more penetrating than that of fission fragments; and the distribution of the developed traces and the relation between this distribution and the energies of 252Cf fission fragments. The method which has been used is simple and cheap and can be utilized in laboratories whose resources are limited. The commercial glass which has been employed allows the registration of fission fragments and subsequently the realization of experiments involving both trace counting and particle identification. (author)

  20. [INVITED] Luminescent QR codes for smart labelling and sensing

    Science.gov (United States)

    Ramalho, João F. C. B.; António, L. C. F.; Correia, S. F. H.; Fu, L. S.; Pinho, A. S.; Brites, C. D. S.; Carlos, L. D.; André, P. S.; Ferreira, R. A. S.

    2018-05-01

    QR (Quick Response) codes are two-dimensional barcodes composed of special geometric patterns of black modules in a white square background that can encode different types of information with high density and robustness, and correct errors and physical damage, thus keeping the stored information protected. Recently, these codes have gained increased attention as they offer a simple physical tool for quick access to Web sites for advertising and social interaction. Challenges include increasing the storage capacity limit, even though QR codes can already store approximately 350 times more information than common barcodes and encode different types of characters (e.g., numeric, alphanumeric, kanji and kana). In this work, we fabricate luminescent QR codes based on a poly(methyl methacrylate) substrate coated with organic-inorganic hybrid materials doped with trivalent terbium (Tb3+) and europium (Eu3+) ions, demonstrating an increase of storage capacity per unit area by a factor of two through colour multiplexing, when compared to conventional QR codes. A novel methodology to decode the multiplexed QR codes is developed, based on a colour separation threshold whose decision level is calculated through a maximum-likelihood criterion to minimize the error probability of the demultiplexed modules, maximizing the foreseen total storage capacity. Moreover, the thermal dependence of the emission colour coordinates of the Eu3+/Tb3+-based hybrids enables simultaneous QR code colour multiplexing and temperature sensing (reproducibility higher than 93%), opening new fields of application for QR codes as smart labels for sensing.
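    The colour-separation decision level can be illustrated with a tiny sketch. Assuming equal-variance Gaussian intensity distributions for the "module off" and "module on" classes (an assumption of this example, not a detail taken from the paper; all sample values are invented), the maximum-likelihood threshold reduces to the midpoint of the two class means:

```python
# Illustrative sketch of maximum-likelihood thresholding for one colour
# channel of a multiplexed QR code; sample intensities are made up.
import statistics

def ml_threshold(samples_off, samples_on):
    # For two equal-variance Gaussian classes, the midpoint of the means
    # minimises the probability of misclassifying a module.
    return (statistics.mean(samples_off) + statistics.mean(samples_on)) / 2

def demultiplex(channel, thr):
    # Classify each module of this colour layer as off (0) or on (1).
    return [1 if v > thr else 0 for v in channel]

thr = ml_threshold([0.10, 0.12, 0.09], [0.80, 0.85, 0.78])
bits = demultiplex([0.05, 0.90, 0.11, 0.82], thr)
```

    Repeating this per emission colour (e.g. one threshold for the Tb3+ channel and one for the Eu3+ channel) yields one demultiplexed QR matrix per colour, which is the source of the factor-of-two capacity gain.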

  1. Reproducibility of ultrasonic testing

    International Nuclear Information System (INIS)

    Lecomte, J.-C.; Thomas, Andre; Launay, J.-P.; Martin, Pierre

    The reproducibility of amplitude readings for both artificial and natural reflectors was studied for several combinations of instrument and search unit, all of the same type. This study shows that in industrial inspection, if a range of standardized equipment is used, a margin of error of about 6 decibels has to be taken into account (confidence interval of 95%). This margin is about 4 to 5 dB for natural or artificial defects located in the central area and about 6 to 7 dB for artificial defects located on the back surface. This lack of reproducibility seems to be attributable first to the search unit, then to the instrument and the operator. These results were confirmed by analysis of calibration data obtained from 250 tests performed by 25 operators under shop conditions, where the margin of error was higher than the 6 dB obtained in the study.

  2. Digital coherent detection research on Brillouin optical time domain reflectometry with simplex pulse codes

    International Nuclear Information System (INIS)

    Hao Yun-Qi; Ye Qing; Pan Zheng-Qing; Cai Hai-Wen; Qu Rong-Hui

    2014-01-01

    The digital coherent detection technique has been investigated without any frequency-scanning device in Brillouin optical time domain reflectometry (BOTDR), where simplex pulse codes are applied in the sensing system. The time domain signal of every code sequence is collected by a data acquisition card (DAQ). A shift-averaging technique is applied in the frequency domain because the local oscillator (LO) in the coherent detection is offset by a fixed frequency from the primary source. With the 31-bit simplex code, the signal-to-noise ratio (SNR) shows a 3.5-dB enhancement over the same number of single-pulse traces, in accordance with theoretical analysis. The frequency fluctuation for simplex codes is 14.01 MHz less than that for a single pulse at 4-m spatial resolution. The results are believed to be beneficial for BOTDR performance improvement. (general)

  3. Tracing the evolution of critical evaluation skills in students' use of the Internet.

    OpenAIRE

    Blumberg, P; Sparks, J

    1999-01-01

    This paper documents the evolving uses of the Internet made by public health graduate students and traces the development of their search methods and critical evaluative criteria. Early in the first semester and again six months later, twenty-four graduate students in a problem-based learning curriculum, which emphasizes evidence-based critical thinking skills, were required to describe their most helpful resources and to evaluate these resources critically. The answers were coded for the typ...

  4. TraceContract: A Scala DSL for Trace Analysis

    Science.gov (United States)

    Barringer, Howard; Havelund, Klaus

    2011-01-01

    In this paper we describe TRACECONTRACT, an API for trace analysis, implemented in the SCALA programming language. We argue that for certain forms of trace analysis the best weapon is a high-level programming language augmented with constructs for temporal reasoning. A trace is a sequence of events, which may for example be generated by a running program, instrumented appropriately to generate events. The API supports writing properties in a notation that combines an advanced form of data-parameterized state machines with temporal logic. The implementation utilizes SCALA's support for defining internal Domain Specific Languages (DSLs). Furthermore, SCALA's combination of object-oriented and functional programming features, including partial functions and pattern matching, makes it an ideal host language for such an API.
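    The flavour of a data-parameterized trace monitor can be conveyed in a few lines. The sketch below is plain Python with invented names (TRACECONTRACT itself is a Scala DSL with far richer temporal operators); it checks the property "every opened resource is eventually closed" over an event trace:

```python
# Toy trace monitor: one pending obligation per resource name.
class OpenCloseMonitor:
    def __init__(self):
        self.pending = set()   # resources opened but not yet closed
        self.errors = []

    def event(self, name, arg):
        if name == "open":
            self.pending.add(arg)
        elif name == "close":
            self.pending.discard(arg)

    def end(self):
        # At the end of the trace every obligation must be discharged.
        self.errors = [f"{r} never closed" for r in sorted(self.pending)]
        return not self.errors

m = OpenCloseMonitor()
for e in [("open", "a.txt"), ("open", "b.txt"), ("close", "a.txt")]:
    m.event(*e)
ok = m.end()   # False: b.txt was never closed
```

    The "data-parameterized" aspect is the set keyed by resource name: the monitor tracks one state per datum rather than a single global state.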

  5. Preserve specimens for reproducibility

    Czech Academy of Sciences Publication Activity Database

    Krell, F.-T.; Klimeš, Petr; Rocha, L. A.; Fikáček, M.; Miller, S. E.

    2016-01-01

    Roč. 539, č. 7628 (2016), s. 168 ISSN 0028-0836 Institutional support: RVO:60077344 Keywords : reproducibility * specimen * biodiversity Subject RIV: EH - Ecology, Behaviour Impact factor: 40.137, year: 2016 http://www.nature.com/nature/journal/v539/n7628/full/539168b.html

  6. Applicability of Coupled Thermalhydraulic Codes for Safety Analysis of Nuclear Reactors

    International Nuclear Information System (INIS)

    Gairola, A.; Bhowmik, P. K.; Shamim, J. A.; Suh, K. Y.

    2014-01-01

    To this end, computational codes like RELAP and TRACE are used to model the thermal-hydraulic response of a nuclear power plant during an accident. By careful modeling and significant user experience, these system codes are able to simulate the behavior of the primary system and the containment to a reasonable extent. A comparatively decoupled simulation is simpler but might not reproduce reality and the physics involved in an accurate manner. Thus simulation using two different system codes is interesting, as the whole system is coupled through the pressure in the containment and the flow through the break. Using this methodology it might be possible to gain new insight into primary and containment behavior by precise simulation of the accident, both in current reactors and in future Gen-III/III+ reactors. Coupled thermal-hydraulic code methodology is still new and requires further investigation. Applicability of such methodology to Gen-II plants has met with limited success; however, a number of situations in which this methodology could be applied are still unexplored, and thus there is room for improvement and modification.

  7. Applicability of Coupled Thermalhydraulic Codes for Safety Analysis of Nuclear Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Gairola, A.; Bhowmik, P. K.; Shamim, J. A.; Suh, K. Y. [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    To this end, computational codes like RELAP and TRACE are used to model the thermal-hydraulic response of a nuclear power plant during an accident. By careful modeling and significant user experience, these system codes are able to simulate the behavior of the primary system and the containment to a reasonable extent. A comparatively decoupled simulation is simpler but might not reproduce reality and the physics involved in an accurate manner. Thus simulation using two different system codes is interesting, as the whole system is coupled through the pressure in the containment and the flow through the break. Using this methodology it might be possible to gain new insight into primary and containment behavior by precise simulation of the accident, both in current reactors and in future Gen-III/III+ reactors. Coupled thermal-hydraulic code methodology is still new and requires further investigation. Applicability of such methodology to Gen-II plants has met with limited success; however, a number of situations in which this methodology could be applied are still unexplored, and thus there is room for improvement and modification.

  8. Reproducibility of 201Tl myocardial imaging

    International Nuclear Information System (INIS)

    McLaughlin, P.R.; Martin, R.P.; Doherty, P.; Daspit, S.; Goris, M.; Haskell, W.; Lewis, S.; Kriss, J.P.; Harrison, D.C.

    1977-01-01

    Seventy-six thallium-201 myocardial perfusion studies were performed on twenty-five patients to assess their reproducibility and the effect of varying the level of exercise on the results of imaging. Each patient had a thallium-201 study at rest. Fourteen patients had studies on two occasions at maximum exercise, and twelve patients had studies both at light and at maximum exercise. Of 70 segments in the 14 patients assessed on each of two maximum exercise tests, 64 (91 percent) were reproducible. Only 53 percent (16/30) of the ischemic defects present at maximum exercise were seen in the light exercise study in the 12 patients assessed at two levels of exercise. Correlation of perfusion defects with arteriographically proven significant coronary stenosis was good for the left anterior descending and right coronary arteries, but not as good for circumflex artery disease. Thallium-201 myocardial imaging at maximum exercise is reproducible within acceptable limits, but careful attention to exercise technique is essential for valid comparative studies

  9. Undefined cellulase formulations hinder scientific reproducibility.

    Science.gov (United States)

    Himmel, Michael E; Abbas, Charles A; Baker, John O; Bayer, Edward A; Bomble, Yannick J; Brunecky, Roman; Chen, Xiaowen; Felby, Claus; Jeoh, Tina; Kumar, Rajeev; McCleary, Barry V; Pletschke, Brett I; Tucker, Melvin P; Wyman, Charles E; Decker, Stephen R

    2017-01-01

    In the shadow of a burgeoning biomass-to-fuels industry, biological conversion of lignocellulose to fermentable sugars in a cost-effective manner is key to the success of second-generation and advanced biofuel production. For the effective comparison of one cellulase preparation to another, cellulase assays are typically carried out with one or more engineered cellulase formulations or natural exoproteomes of known performance serving as positive controls. When these formulations have unknown composition, as is the case with several widely used commercial products, it becomes impossible to compare or reproduce work done today to work done in the future, where, for example, such preparations may not be available. Therefore, being a critical tenet of science publishing, experimental reproducibility is endangered by the continued use of these undisclosed products. We propose the introduction of standard procedures and materials to produce specific and reproducible cellulase formulations. These formulations are to serve as yardsticks to measure improvements and performance of new cellulase formulations.

  10. Space-Time Convolutional Codes over Finite Fields and Rings for Systems with Large Diversity Order

    Directory of Open Access Journals (Sweden)

    B. F. Uchôa-Filho

    2008-06-01

    We propose a convolutional encoder over the finite ring of integers modulo p^k, ℤ_{p^k}, where p is a prime number and k is any positive integer, to generate a space-time convolutional code (STCC). Under this structure, we prove three properties related to the generator matrix of the convolutional code that can be used to simplify the code search procedure for STCCs over ℤ_{p^k}. Some STCCs of large diversity order (≥4) designed under the trace criterion for n = 2, 3, and 4 transmit antennas are presented for various PSK signal constellations.
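    A rate-1/2 convolutional encoder over ℤ_{p^k} is just polynomial multiplication with coefficients reduced modulo p^k. A toy sketch for p = 2, k = 2 (ℤ_4), with generator coefficients chosen purely for illustration (they are not the generator matrices of the paper):

```python
def conv_encode(u, g1, g2, q=4):
    # Rate-1/2 convolutional encoding over Z_q: each output pair is the
    # convolution of the input sequence with the two generators, mod q.
    m = len(g1) - 1                       # encoder memory
    u = list(u) + [0] * m                 # zero-tail termination
    out = []
    for t in range(len(u)):
        c1 = sum(g1[j] * u[t - j] for j in range(len(g1)) if t - j >= 0) % q
        c2 = sum(g2[j] * u[t - j] for j in range(len(g2)) if t - j >= 0) % q
        out.append((c1, c2))
    return out

code = conv_encode([1, 3, 2], g1=[1, 2], g2=[3, 1])   # -> [(1, 3), (1, 2), (0, 1), (0, 2)]
```

    In a space-time setting, the two output streams would feed the two transmit antennas, mapped onto a PSK constellation.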

  11. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

    VHTR, one of the Generation IV reactor concepts, has a relatively high operation temperature and is usually suggested as a heat source for many industrial processes, including the hydrogen production process. Thus, it is vital to trace tritium behavior in the VHTR system and the potential permeation rate to the industrial process. In other words, tritium is a crucial safety issue in the fission reactor system, so it is necessary to understand its behavior, and the development of a tool to enable this is vital. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, has been developed using a chemical process code called gPROMS. The code has several distinctive features, including a non-diluted assumption, flexible applications and the adoption of a distributed permeation model. Owing to these features, BOTANIC is able to analyze a wide range of tritium-level systems and achieves higher accuracy, as it has the capacity to solve distributed models. BOTANIC was successfully developed and verified using analytic solutions and benchmark codes, namely the Tritium Permeation Analysis Code (TPAC) and COMSOL; the results showed very good agreement with the analytical solutions and the calculation results of TPAC and COMSOL. Future work will be focused on total system verification.

  12. Investigation of alpha experiment by severe accident analysis code SAMPSON

    International Nuclear Information System (INIS)

    Baglietto, Emilio; Ninokata, Hisashi; Naitoh, Masanori

    2006-01-01

    The severe accident analysis code SAMPSON is adopted in this work to evaluate its capability of reproducing the complex gap cooling phenomenon. The ALPHA experiment is adopted for validation, where molten aluminum oxide (Al2O3) produced by a thermite reaction is poured into a water-filled hemispherical vessel at an ambient pressure of approximately 1.3 MPa. The spreading and cooling of the debris that has relocated into the pressure vessel lower plenum are simulated, including the analysis of the RPV failure. The model included in the code to mimic water penetration inside the gap is evaluated and improvements are proposed. The importance of introducing a mechanistic approach to describe gap formation and evolution is underlined, as the results show it is necessary in order to correctly reproduce the experimental trends. (author)

  13. A PHYSICAL ACTIVITY QUESTIONNAIRE: REPRODUCIBILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Nicolas Barbosa

    2007-12-01

    This study evaluates the reproducibility and validity of the Quantification de L'Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) of Bogotá's schoolchildren. Comprehension was assessed on 324 students, whereas reproducibility was studied on a different random sample of 162 who were exposed to it twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys who completed the test-retest study. The DEE derived from the questionnaire was compared with laboratory measurements of peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger Test. The reproducibility ICC was 0.96 (95% CI 0.95-0.97); by age categories: 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between the mean DEE estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16 it was 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80), respectively. The QAPACE questionnaire is reproducible and valid for estimating PA and showed a high correlation with Peak VO2 uptake.
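    The single-measure intra-class correlation used for such test-retest agreement comes from a two-way ANOVA decomposition. A sketch of the standard absolute-agreement form ICC(2,1) (a textbook formula, not the authors' code; the example data are invented):

```python
import numpy as np

def icc_2_1(x):
    # x: subjects x occasions (e.g. test and retest scores per child).
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between occasions
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                               # mean squares
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

perfect = icc_2_1(np.array([[1., 1.], [2., 2.], [3., 3.]]))   # identical retest
shifted = icc_2_1(np.array([[1., 2.], [2., 3.], [3., 4.]]))   # systematic shift lowers agreement
```

    A perfectly repeated measurement gives ICC = 1; a constant shift between test and retest lowers the absolute-agreement ICC even though the ranking of subjects is unchanged.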

  14. Chelatable trace zinc causes low, irreproducible KDAC8 activity.

    Science.gov (United States)

    Toro, Tasha B; Edenfield, Samantha A; Hylton, Brandon J; Watt, Terry J

    2018-01-01

    Acetylation is an important regulatory mechanism in cells, and emphasis is being placed on identifying substrates and small molecule modulators of this post-translational modification. However, the reported in vitro activity of the lysine deacetylase KDAC8 is inconsistent across experimental setups, even with the same substrate, complicating progress in the field. We detected trace levels of zinc, a known inhibitor of KDAC8 when present in excess, even in high-quality buffer reagents, at concentrations that are sufficient to significantly inhibit the enzyme under common reaction conditions. We hypothesized that trace zinc in solution could account for the observed variability in KDAC8 activity. We demonstrate that addition of chelators, including BSA, EDTA, and citrate, and/or the use of a phosphate-based buffer instead of the more common tris-based buffer, eliminates the inhibition from low levels of zinc as well as the dependence of specific activity on enzyme concentration. This results in high KDAC8 activity that is consistent across buffer systems, even using low concentrations of enzyme. We report conditions that are suitable for several assays to increase both enzyme activity and reproducibility. Our results have significant implications for approaches used to identify substrates and small molecule modulators of KDAC8 and interpretation of existing data. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  16. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses. It differs from other systems for reproducible analysis in several ways, the two main differences being: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or \LaTeX. The main part of this paper is an example showing how several programs can be used together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

  17. Simulations of vertical disruptions with VDE code: Hiro and Evans currents

    Science.gov (United States)

    Li, Xujing; Di Hu Team; Leonid Zakharov Team; Galkin Team

    2014-10-01

    The recently created numerical code VDE for simulations of vertical instability in tokamaks is presented. The numerical scheme uses the Tokamak MHD model, where the plasma inertia is replaced by the friction force, and an adaptive grid numerical scheme. The code reproduces well the surface currents generated at the plasma boundary by the instability. Five regimes of the vertical instability are presented: (1) vertical instability in a given plasma shaping field without a wall; (2) the same with a wall and magnetic flux ΔΨ|plX < ΔΨ|Xwall (where X corresponds to the X-point of a separatrix); (3) the same with a wall and magnetic flux ΔΨ|plX > ΔΨ|Xwall; (4) vertical instability without a wall with a tile surface in the plasma path; (5) the same in the presence of a wall and a tile surface. The generation of negative Hiro currents along the tile surface, predicted earlier by theory and measured on EAST in 2012, is well reproduced by the simulations. In addition, the instability generates force-free Evans currents at the free plasma surface. A new pattern of reconnection of the plasma with the vacuum magnetic field is discovered. This work is supported by US DoE Contract No. DE-AC02-09-CH11466.

  18. Participant Nonnaiveté and the reproducibility of cognitive psychology.

    Science.gov (United States)

    Zwaan, Rolf A; Pecher, Diane; Paolacci, Gabriele; Bouwmeester, Samantha; Verkoeijen, Peter; Dijkstra, Katinka; Zeelenberg, René

    2017-07-25

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature (three each from the domains of perception/action, memory, and language) and found that they are highly reproducible. Not only can they be reproduced in online environments, but they also can be reproduced with nonnaïve participants with no reduction of effect size. Apparently, some cognitive tasks are so constraining that they encapsulate behavior from external influences, such as testing situation and prior recent experience with the experiment, to yield highly robust effects.

  19. Present and Future Challenges in Trace and Ultra-Trace Analysis

    International Nuclear Information System (INIS)

    Toulhoat, P.

    2005-01-01

    The analysis of trace and ultra-trace elements continuously stimulates progress in analytical chemistry. Environmental chemistry, radiochemistry, biology, health and agri-food are prescribers of trace analyses, with continuously increasing demands: lower detection limits, lower costs and analysis times, and better quality of analytical information. Precise data about the chemical identity and chemical environment of analytes are now requested. Such information, beyond simple numerical data and confidence intervals, is necessary to understand the systems studied and to predict their evolution. From environmental contamination cases, one can envisage the various aspects of a problem, each with its own exigencies and specificities in terms of analytical methods and approaches. The detection of traces and ultra-traces of actinides and fission products has recently been revisited and stimulates new technological developments (non-proliferation issues, waste management). Data on their speciation in geological and biological media are essential for evaluating the safety of nuclear waste repositories. Various techniques are now used to determine speciation in liquid samples or on surfaces, with tremendous spatial resolutions or sensitivities. A new revolution in analytical chemistry is expected with the development of micro- or nano-analytical technologies. (author)

  20. Parallel Computing Characteristics of Two-Phase Thermal-Hydraulics code, CUPID

    International Nuclear Information System (INIS)

    Lee, Jae Ryong; Yoon, Han Young

    2013-01-01

    The parallelized CUPID code has been shown to reproduce multi-dimensional thermal-hydraulic behavior through validation against various conceptual problems and experimental data. In this paper, the parallel performance of the CUPID code is investigated in terms of scalability. Both single- and two-phase simulations are taken into account. Since the scalability of a parallel simulation is known to be better for fine mesh systems, two types of mesh system are considered. In addition, the dependency on the preconditioner for the matrix solver is also compared. The CUPID code was parallelized with a domain decomposition method, and the MPI library was adopted to communicate the information at the interface cells. The scalability for single-phase flow is better than that for two-phase flow due to the smaller number of iterations needed for solving the pressure matrix. As the number of mesh cells increases, the scalability improves. For a given mesh, single-phase flow simulation with a diagonal preconditioner shows the best speedup. However, for two-phase flow simulation, the ILU preconditioner is recommended since it reduces the overall simulation time.
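    Strong-scaling results like those discussed here reduce to speedup and parallel efficiency relative to the single-process run. A small sketch with made-up wall-clock times (not CUPID measurements):

```python
def scaling(times):
    # times: {number of processes: wall-clock time}; requires a serial run (key 1).
    # Returns {n: (speedup, parallel efficiency)} for each process count.
    t1 = times[1]
    return {n: (t1 / t, (t1 / t) / n) for n, t in sorted(times.items())}

# Illustrative timings only.
results = scaling({1: 100.0, 2: 52.0, 4: 28.0, 8: 16.0})
speedup_8, efficiency_8 = results[8]   # 6.25 and 0.78125
```

    Efficiency below 1 at higher process counts reflects communication overhead at the interface cells, which is why scalability improves as the per-process workload (mesh size) grows.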

  1. Trace analysis

    International Nuclear Information System (INIS)

    Warner, M.

    1987-01-01

    What is the current state of quantitative trace analytical chemistry? What are today's research efforts? And what challenges does the future hold? These are some of the questions addressed at a recent four-day symposium sponsored by the National Bureau of Standards (NBS) entitled Accuracy in Trace Analysis - Accomplishments, Goals, Challenges. The two plenary sessions held on the first day of the symposium reviewed the history of quantitative trace analysis, discussed the present situation from academic and industrial perspectives, and summarized future needs. The remaining three days of the symposium consisted of parallel sessions dealing with the measurement process; quantitation in materials; environmental, clinical, and nutrient analysis; and advances in analytical techniques

  2. Study on method of characteristics based on cell modular ray tracing

    International Nuclear Information System (INIS)

    Tang Chuntao; Zhang Shaohong

    2009-01-01

    To address the issue of accurately solving neutron transport problems in complex geometries, the method of characteristics (MOC) is studied in this paper, and a quite effective and memory-saving cell modular ray tracing (CMRT) method is developed; related angle discretization and boundary condition handling issues are discussed. A CMRT-based MOC code, PEACH, is developed and tested against the C5G7 MOX benchmark problem. Numerical results demonstrate that PEACH gives excellent accuracy for both k_eff and pin power distribution in neutron transport problems. (authors)
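    The elementary operation in any MOC ray-tracing module, cell-modular or otherwise, is computing the chord a characteristic ray cuts through a cell; the transport sweep then integrates the angular flux along these segments. A purely geometric sketch for a single rectangular cell (illustrative only, not the PEACH implementation):

```python
import math

def chord_length(w, h, x0, azimuth):
    """Chord length of a ray entering the bottom edge of a w x h cell at x0,
    travelling at angle `azimuth` from the x-axis (0 < azimuth < pi)."""
    dx, dy = math.cos(azimuth), math.sin(azimuth)
    # Distance along the ray to each candidate exit face; the nearest wins.
    t_top = h / dy
    t_side = ((w - x0) / dx) if dx > 0 else ((-x0) / dx) if dx < 0 else float("inf")
    return min(t_top, t_side)

# A 45-degree ray entering a unit cell at the corner exits at the opposite corner.
length = chord_length(1.0, 1.0, 0.0, math.pi / 4)   # sqrt(2)
```

    In cell modular ray tracing, segment data like this are computed once per cell type and reused for every geometrically identical cell, which is the source of the memory saving.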

  3. Systematic heterogenization for better reproducibility in animal experimentation.

    Science.gov (United States)

    Richter, S Helene

    2017-08-31

    The scientific literature is full of articles discussing poor reproducibility of findings from animal experiments as well as failures to translate results from preclinical animal studies to clinical trials in humans. Critics even go so far as to talk about a "reproducibility crisis" in the life sciences, a buzzword that increasingly finds its way into numerous high-impact journals. Viewed from a cynical perspective, Fett's law of the lab ("Never replicate a successful experiment") has thus taken on a completely new meaning. So far, poor reproducibility and translational failures in animal experimentation have mostly been attributed to biased animal data, methodological pitfalls, current publication ethics and animal welfare constraints. More recently, the concept of standardization has also been identified as a potential source of these problems. By reducing within-experiment variation, rigorous standardization regimes limit the inference to the specific experimental conditions. In this way, however, individual phenotypic plasticity is largely neglected, resulting in statistically significant but possibly irrelevant findings that are not reproducible under slightly different conditions. By contrast, systematic heterogenization has been proposed as a concept to improve the representativeness of study populations, contributing to improved external validity and hence improved reproducibility. While some first heterogenization studies are indeed very promising, it is still not clear how this approach can be transferred into practice in a logistically feasible and effective way. Thus, further research is needed to explore different heterogenization strategies as well as alternative routes toward better reproducibility in animal experimentation.

  4. Transparent Runtime Migration of Loop-Based Traces of Processor Instructions to Reconfigurable Processing Units

    Directory of Open Access Journals (Sweden)

    João Bispo

    2013-01-01

    The ability to map instructions running in a microprocessor to a reconfigurable processing unit (RPU), acting as a coprocessor, enables the runtime acceleration of applications and ensures code and possibly performance portability. In this work, we focus on the mapping of loop-based instruction traces (called Megablocks) to RPUs. The proposed approach considers offline partitioning and mapping stages without ignoring their future runtime applicability. We present a toolchain that automatically extracts specific trace-based loops, called Megablocks, from MicroBlaze instruction traces and generates an RPU for executing those loops. Our hardware infrastructure is able to move loop execution from the microprocessor to the RPU transparently, at runtime, and without changing the executable binaries. The toolchain and the system are fully operational. Three FPGA implementations of the system, differing in the hardware interfaces used, were tested and evaluated with a set of 15 application kernels. Speedups ranging from 1.26 to 3.69 were achieved for the best alternative, using a MicroBlaze processor with local memory.
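    Megablock extraction itself works on repeating patterns of basic blocks in the instruction stream; the kernel of the idea, spotting hot backward branches that delimit loops in an address trace, can be sketched in a few lines of Python (the addresses and names are invented, and real detection is considerably more elaborate):

```python
from collections import Counter

def hot_loops(trace):
    # Count backward jumps (target < source): each is a candidate loop
    # back-edge, keyed by (loop start, loop end) addresses.
    loops = Counter()
    for src, dst in zip(trace, trace[1:]):
        if dst < src:
            loops[(dst, src)] += 1
    return loops

# A trace that iterates the block at addresses 4-8 before falling through.
trace = [0, 4, 8, 4, 8, 4, 8, 12]
counts = hot_loops(trace)   # {(4, 8): 2}
```

    Once a hot loop body is identified this way, the offline stages can partition it, map it to the RPU, and redirect execution to the coprocessor when the loop's entry address is next fetched.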

  5. An Open Framework for the Reproducible Study of the Iterated Prisoner’s Dilemma

    Directory of Open Access Journals (Sweden)

    Vincent Knight

    2016-08-01

    The Axelrod library is an open source Python package that allows for reproducible game theoretic research into the Iterated Prisoner’s Dilemma. This area of research began in the 1980s but suffers from a lack of documentation and test code. The goal of the library is to provide such a resource, with facilities for the design of new strategies and interactions between them, as well as for conducting tournaments and ecological simulations for populations of strategies. With a growing collection of 139 strategies, the library is also a platform for an original tournament that, in itself, is of interest to the game theoretic community. This paper describes the Iterated Prisoner’s Dilemma, the Axelrod library and its development, and insights gained from some novel research.
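    For readers unfamiliar with the game, a self-contained miniature of what the library automates (strategies, matches, scoring) might look like the sketch below; the payoff values are the standard ones, but the function names are invented here rather than taken from the Axelrod API:

```python
# Row-player payoffs: R=3 mutual cooperation, P=1 mutual defection,
# T=5 temptation, S=0 sucker (the standard IPD values).
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(own_history, opp_history):
    # Cooperate first, then mirror the opponent's last move.
    return 'C' if not opp_history else opp_history[-1]

def defector(own_history, opp_history):
    return 'D'

def play_match(s1, s2, turns):
    h1, h2, score1, score2 = [], [], 0, 0
    for _ in range(turns):
        m1, m2 = s1(h1, h2), s2(h2, h1)
        p1, p2 = PAYOFF[(m1, m2)]
        h1.append(m1); h2.append(m2)
        score1 += p1; score2 += p2
    return score1, score2

print(play_match(tit_for_tat, defector, 5))  # (4, 9)
```

    The library generalizes this to hundreds of strategies, full round-robin tournaments, noise, and population dynamics.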

  6. Real-time beam tracing for control of the deposition location of electron cyclotron waves

    Energy Technology Data Exchange (ETDEWEB)

    Reich, M., E-mail: matthias.reich@ipp.mpg.de; Bilato, R.; Mszanowski, U.; Poli, E.; Rapson, C.; Stober, J.; Volpe, F.; Zille, R.

    2015-11-15

    Highlights: • We successfully integrated a real-time EC beam tracing code at ASDEX Upgrade. • The calculation of the EC beam deposition location is fast enough for control purposes. • The accuracy of the deposition location calculation exceeds equivalent measurements. • The implementation method is by design portable to larger fusion devices. - Abstract: Plasma control techniques that use electron cyclotron (EC) resonance heating and current drive, such as control of neoclassical tearing modes, require accurate control of the deposition location of EC beams. ASDEX Upgrade has successfully integrated a real-time version of the beam-tracing code TORBEAM into its real-time diagnostic system to act as a globally available module that calculates the current deposition location and its sensitivity from other real-time diagnostic measurements for all its moveable EC wave launchers. Based on a highly (100×) accelerated version of TORBEAM, the software implementation as a diagnostic process uses parallelization and achieves cycle times of 15–20 ms for determining the radial deposition location of 12 beams in the plasma. This cycle time includes data input–output overhead arising from the use of available real-time signals. The system is by design portable to other machines such as ITER.

  7. A novel domain overlapping strategy for the multiscale coupling of CFD with 1D system codes with applications to transient flows

    International Nuclear Information System (INIS)

    Grunloh, T.P.; Manera, A.

    2016-01-01

    Highlights: • A novel domain overlapping coupling method is presented. • Method calculates closure coefficients for system codes based on CFD results. • Convergence and stability are compared with a domain decomposition implementation. • Proposed method is tested in several 1D cases. • Proposed method found to exhibit more favorable convergence and stability behavior. - Abstract: A novel multiscale coupling methodology based on a domain overlapping approach has been developed to couple a computational fluid dynamics code with a best-estimate thermal hydraulic code. The methodology has been implemented in the coupling infrastructure code Janus, developed at the University of Michigan, providing methods for the online data transfer between the commercial computational fluid dynamics code STAR-CCM+ and the US NRC best-estimate thermal hydraulic system code TRACE. Coupling between these two software packages is motivated by the desire to extend the range of applicability of TRACE to scenarios in which local momentum and energy transfer are important, such as three-dimensional mixing. These types of flows are relevant, for example, in the simulation of passive safety systems including large containment pools, or for flow mixing in the reactor pressure vessel downcomer of current light water reactors and integral small modular reactors. The intrafluid shear forces neglected by TRACE equations of motion are readily calculated from computational fluid dynamics solutions. Consequently, the coupling methods used in this study are built around correcting TRACE solutions with data from a corresponding STAR-CCM+ solution. Two coupling strategies are discussed in the paper: one based on a novel domain overlapping approach specifically designed for transient operation, and a second based on the well-known domain decomposition approach. In the present paper, we discuss the application of the two coupling methods to the simulation of open and closed loops in both steady
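    The correction idea behind the domain overlapping approach, a coarse model that keeps solving the whole domain while its closure terms are repeatedly updated from a fine solution on the shared region, can be illustrated with a deliberately simplified fixed-point sketch (all models and coefficients below are invented for illustration; Janus, STAR-CCM+ and TRACE are not involved):

```python
def fine_pressure_drop(v):
    # Stand-in for a CFD result on the overlapped region: the "truth"
    # includes a quadratic term the coarse closure lacks.
    return 0.02 * v + 0.05 * v ** 2

def coupled_solve(v, relax=0.5, tol=1e-10, max_it=100):
    """Domain-overlap style fixed point: the coarse (system-code-like)
    model dp = c * v keeps its own solution, but its closure coefficient
    c is repeatedly corrected from the fine solution, under-relaxed for
    stability."""
    c = 0.02  # initial coarse-model friction closure
    for _ in range(max_it):
        target = fine_pressure_drop(v)
        c_new = target / v                    # coefficient matching the fine dp
        c = (1 - relax) * c + relax * c_new   # under-relaxed update
        if abs(c * v - target) < tol:
            break
    return c * v

v = 2.0
print(abs(coupled_solve(v) - fine_pressure_drop(v)) < 1e-9)  # True
```

    The under-relaxation factor plays the same stabilizing role that relaxation plays in the convergence comparisons discussed in the abstract.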

  8. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2012-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m∙min-1 cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built-up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognize systematic error distorting the performance test....

  9. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2014-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m•min−1 cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built–up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognise systematic error distorting the performance test....

  10. Traces of Drosophila Memory

    Science.gov (United States)

    Davis, Ronald L.

    2012-01-01

    Studies using functional cellular imaging of living flies have identified six memory traces that form in the olfactory nervous system after conditioning with odors. These traces occur in distinct nodes of the olfactory nervous system, form and disappear across different windows of time, and are detected in the imaged neurons as increased calcium influx or synaptic release in response to the conditioned odor. Three traces form at or near acquisition and co-exist with short-term behavioral memory. One trace forms with a delay after learning and co-exists with intermediate-term behavioral memory. Two traces form many hours after acquisition and co-exist with long-term behavioral memory. The transient memory traces may support behavior across the time-windows of their existence. The experimental approaches for dissecting memory formation in the fly, ranging from the molecular to the systems level, make it an ideal organism for dissecting the logic by which the nervous system organizes and stores different temporal forms of memory. PMID:21482352

  11. Simulation of reflooding on two parallel heated channel by TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Zakir, Md. Ghulam [Department of Nuclear Engineering, Chalmers University of Technology, Gothenburg (Sweden)

    2016-07-12

    In case of a Loss-Of-Coolant Accident (LOCA) in a Boiling Water Reactor (BWR), heat generated in the nuclear fuel is not adequately removed because of the decrease of the coolant mass flow rate in the reactor core. This leads to an increase of the fuel temperature that can cause damage to the core and leakage of radioactive fission products. In order to reflood the core and to halt the temperature increase, an Emergency Core Cooling System (ECCS) delivers water under such conditions. This study investigates how the power distribution between two channels affects the process of reflooding when the emergency water is injected from the top of the channels. The peak cladding temperature (PCT) during the LOCA transient is also determined at different axial levels. The thermal-hydraulic system code TRACE has been used. A TRACE model of the two heated channels has been developed, and three hypothetical cases with different power distributions have been studied. Finally, a comparison between simulated and experimental data is presented.

  12. Analysis of the ISP-50 direct vessel injection SBLOCA in the ATLAS facility with the RELAP5/MOD3.3 code

    Energy Technology Data Exchange (ETDEWEB)

    Sharabi, Medhat; Freixa, Jordi [Paul Scherrer Institute, Nuclear Energy and Safety Department, Zurich (Switzerland)

    2012-10-15

    The pressurized water reactor APR1400 adopts DVI (Direct Vessel Injection) for the emergency cooling water in the upper downcomer annulus. The International Standard Problem number 50 (ISP-50) was launched with the aim to investigate thermal hydraulic phenomena during a 50% DVI line break scenario with best estimate codes, making use of the experimental data available from the ATLAS facility located at KAERI. The present work describes the calculation results obtained for the ISP-50 using the RELAP5/MOD3.3 system code. The work aims at validation and assessment of the code to reproduce the observed phenomena and to investigate its limitations in predicting complicated mixing phenomena between the subcooled emergency cooling water and the two-phase flow in the downcomer. The obtained results show that the overall trends of the main test variables are well reproduced by the calculations. In particular, the pressure in the primary system shows excellent agreement with the experiment. The loop seal clearance phenomenon was observed in the calculation and was found to have an important influence on the transient progression. Moreover, the collapsed water levels in the core are accurately reproduced in the simulations. However, the drop in the downcomer level before the activation of the DVI from the safety injection tanks was underestimated due to multi-dimensional phenomena in the downcomer that are not properly captured by one-dimensional simulations.

  13. A computational study on altered theta-gamma coupling during learning and phase coding.

    Directory of Open Access Journals (Sweden)

    Xuejuan Zhang

    There is considerable interest in the role of coupling between theta and gamma oscillations in the brain in the context of learning and memory. Here we have used a neural network model capable of producing coupling of theta phase to gamma amplitude, first to explore its ability to reproduce reported learning-related changes and second to explore memory-span and phase-coding effects. The spiking neural network incorporates two kinetically different GABA(A) receptor-mediated currents to generate both theta and gamma rhythms, and we have found that by selective alteration of both NMDA receptors and GABA(A,slow) receptors it can reproduce learning-related changes in the strength of coupling between theta and gamma, either with or without coincident changes in theta amplitude. When the model was used to explore the relationship between theta and gamma oscillations, working memory capacity and phase coding, it showed that the potential storage capacity of short-term memories, in terms of nested gamma subcycles, coincides with the maximal theta power. Increasing theta power is also related to the precision of theta phase, which functions as a potential timing clock for neuronal firing in the cortex or hippocampus.
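    The theta-gamma coupling such a model produces is typically quantified with a phase-amplitude coupling measure. A minimal sketch of one such measure (an amplitude-weighted mean-vector length on synthetic data; in real analyses phase and amplitude come from band-pass filtered signals via the Hilbert transform):

```python
import math, cmath

def mean_vector_length(phases, amplitudes):
    """Phase-amplitude coupling as the magnitude of the amplitude-weighted
    mean phase vector, normalized by total amplitude (0 = no coupling).
    A simplified variant of the Canolty-style measure, for illustration."""
    v = sum(a * cmath.exp(1j * p) for p, a in zip(phases, amplitudes))
    return abs(v / sum(amplitudes))

n = 1000
theta_phase = [2 * math.pi * 7 * t / n for t in range(n)]  # 7 theta cycles
coupled = [1.0 + math.cos(p) for p in theta_phase]  # gamma amp peaks at phase 0
uncoupled = [1.0] * n                               # gamma amp flat

print(mean_vector_length(theta_phase, coupled) >
      mean_vector_length(theta_phase, uncoupled))  # True
```

    A coupled signal yields a clearly nonzero value because high gamma amplitude clusters at a preferred theta phase; a flat amplitude averages out to nearly zero.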

  14. Effect of Initial Conditions on Reproducibility of Scientific Research

    Science.gov (United States)

    Djulbegovic, Benjamin; Hozo, Iztok

    2014-01-01

    Background: It is estimated that about half of currently published research cannot be reproduced. Many reasons have been offered to explain failures to reproduce scientific research findings, from fraud to issues related to the design, conduct, analysis, or publishing of scientific research. We also postulate a sensitive dependency on initial conditions, by which small changes can result in large differences in the research findings when reproduction is attempted at later times. Methods: We employed a simple logistic regression equation to model the effect of covariates on the initial study findings. We then fed the input from the logistic equation into a logistic map function to model the stability of the results in repeated experiments over time. We illustrate the approach by modeling the effects of different factors on the choice of correct treatment. Results: We found that reproducibility of the study findings depended both on the initial values of all independent variables and on the rate of change in the baseline conditions, the latter being more important. When the changes in the baseline conditions vary by about 3.5 to about 4 between experiments, no research findings can be reproduced. However, when the rate of change between the experiments is ≤2.5, the results become highly predictable between the experiments. Conclusions: Many results cannot be reproduced because of changes in the initial conditions between the experiments. Better control of the baseline conditions between experiments may help improve the reproducibility of scientific findings. PMID:25132705
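    The thresholds in the Results mirror the classic behavior of the logistic map x_{n+1} = r·x_n·(1 − x_n): for r ≤ 2.5 trajectories converge to a stable fixed point, while for r between roughly 3.57 and 4 they are chaotic and sensitively dependent on initial conditions. A short sketch (parameter values chosen to echo the paper's thresholds, not taken from its model):

```python
def logistic_trajectory(r, x0, n):
    """Iterate the logistic map x <- r * x * (1 - x) for n steps."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Stable regime (r <= 2.5): two nearby starting points converge
# to the same fixed point 1 - 1/r, i.e. the "finding" reproduces.
a = logistic_trajectory(2.5, 0.30, 100)[-1]
b = logistic_trajectory(2.5, 0.31, 100)[-1]
print(abs(a - b) < 1e-9)  # True

# Chaotic regime (3.5 < r < 4): the same small change diverges.
c = logistic_trajectory(3.9, 0.30, 100)[-1]
d = logistic_trajectory(3.9, 0.31, 100)[-1]
print(abs(c - d))  # almost surely a large gap: sensitive dependence
```
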

  15. Reproducibility of graph metrics in fMRI networks

    Directory of Open Access Journals (Sweden)

    Qawi K Telesford

    2010-12-01

    The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties of networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging (fMRI) data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data for both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC = 0.86), global efficiency (ICC = 0.83), path length (ICC = 0.79), and local efficiency (ICC = 0.75); the ICC score for degree was found to be low (ICC = 0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was only high in network hubs for clustering coefficient, local efficiency and degree. BA plots were used to test the measurement repeatability of all graph metrics. All graph metrics fell within the limits for repeatability. Together, these results suggest that, with the exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.
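    For reference, an absolute-agreement ICC for two runs can be computed from a two-way ANOVA decomposition. A minimal sketch (single-measure ICC(2,1) restricted to exactly two runs; the data are made up, and clinical analyses should rely on a vetted statistics package):

```python
def icc_absolute_agreement(run1, run2):
    """Two-way random-effects, single-measure ICC(2,1) for two runs."""
    n, k = len(run1), 2
    data = list(zip(run1, run2))
    grand = sum(run1 + run2) / (n * k)
    subj_means = [sum(row) / k for row in data]
    run_means = [sum(run1) / n, sum(run2) / n]
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_run = n * sum((m - grand) ** 2 for m in run_means)
    ss_tot = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_tot - ss_subj - ss_run
    ms_subj = ss_subj / (n - 1)
    ms_run = ss_run / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (
        ms_subj + (k - 1) * ms_err + k * (ms_run - ms_err) / n)

run1 = [1.0, 2.0, 3.0, 4.0, 5.0]          # hypothetical metric, run 1
run2 = [1.1, 2.0, 2.9, 4.2, 4.8]          # same subjects, run 2
print(round(icc_absolute_agreement(run1, run1), 3))  # 1.0 (perfect agreement)
print(round(icc_absolute_agreement(run1, run2), 3))  # 0.996
```
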

  16. Low Enrichment Uranium (LEU)-fueled SLOWPOKE-2 nuclear reactor simulation with the Monte-Carlo based MCNP 4A code

    International Nuclear Information System (INIS)

    Pierre, J.R.M.

    1996-01-01

    Following the commissioning of the Low Enrichment Uranium (LEU) Fuelled SLOWPOKE-2 research reactor at the Royal Military College-College Militaire Royal (RMC-CMR), excess reactivity measurements were conducted over a range of temperature and power. The results showed a maximum excess reactivity of 3.37 mk at 33 °C. Several deterministic models using computer codes like WIMS-CRNL, CITATION, TRIVAC and DRAGON have been used to try to reproduce the excess reactivity and temperature trend of both the LEU and HEU SLOWPOKE-2 reactors. The best simulations had been obtained at Ecole Polytechnique de Montreal. They were able to reproduce the temperature trend of their HEU-fuelled reactor using TRIVAC calculations, but this model over-estimated the absolute value of the excess reactivity by 119 mk. Although calculations using DRAGON did not reproduce the temperature trend as well as TRIVAC, these calculations represented a significant improvement on the absolute value at 20 °C, reducing the discrepancy to 13 mk. Given the advance in computer technology, a probabilistic approach was tried in this work, using the Monte-Carlo N-Particle Transport Code System MCNP 4A, to model the RMC-CMR SLOWPOKE-2 reactor.

  17. A hard tissue cephalometric comparative study between hand tracing and computerized tracing

    Directory of Open Access Journals (Sweden)

    Ramachandra Prabhakar

    2014-01-01

    Aims: To analyze and compare the angular and linear hard tissue cephalometric measurements obtained using hand tracing and computerized tracing with the Nemoceph and Dolphin software systems. Subjects and Methods: A total of 30 cephalograms were randomly chosen for the study using the following criteria: cephalograms of patients with good contrast, no distortion, and minimal radiographic artifacts, acquired using the digital method (Kodak 8000 C), with 12 angular and nine linear parameters selected for the study. Comparisons were determined by post-hoc testing using the Tukey HSD method. The nonparametric tests were performed using the Kruskal-Wallis method. Statistical Analysis Used: ANOVA and post-hoc tests. Results: The results of this study show that there is no significant difference in the angular and linear measurements recorded. The P values were significant at the 0.05 level for two parameters, Co-A and Co-Gn, with the hand-tracing method; this was significant in ANOVA and in the post-hoc test by the Tukey HSD method. Conclusions: This comparative study supports the transition from hand tracing to computerized tracing methodology. In fact, digital computerized tracings were easier and less time consuming, with the same reliability irrespective of the method of tracing.

  18. A Kepler Workflow Tool for Reproducible AMBER GPU Molecular Dynamics.

    Science.gov (United States)

    Purawat, Shweta; Ieong, Pek U; Malmstrom, Robert D; Chan, Garrett J; Yeung, Alan K; Walker, Ross C; Altintas, Ilkay; Amaro, Rommie E

    2017-06-20

    With the drive toward high throughput molecular dynamics (MD) simulations involving ever-greater numbers of simulation replicates run for longer, biologically relevant timescales (microseconds), the need for improved computational methods that facilitate fully automated MD workflows gains more importance. Here we report the development of an automated workflow tool to perform AMBER GPU MD simulations. Our workflow tool capitalizes on the capabilities of the Kepler platform to deliver a flexible, intuitive, and user-friendly environment and the AMBER GPU code for a robust and high-performance simulation engine. Additionally, the workflow tool reduces user input time by automating repetitive processes and facilitates access to GPU clusters, whose high-performance processing power makes simulations of large numerical scale possible. The presented workflow tool facilitates the management and deployment of large sets of MD simulations on heterogeneous computing resources. The workflow tool also performs systematic analysis on the simulation outputs and enhances simulation reproducibility, execution scalability, and MD method development including benchmarking and validation. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  19. Simulation of loss of feedwater transient of MASLWR test facility by MARS-KS code

    Energy Technology Data Exchange (ETDEWEB)

    Park, Juyeop [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2013-05-15

    The MASLWR test facility is a mock-up of a passive integral-type reactor equipped with a helical coil steam generator. Since the SMART reactor, which is currently being developed domestically, also adopts a helical coil steam generator, KINS joined this ICSP to evaluate the performance of the domestic regulatory audit thermal-hydraulic code (MARS-KS) in various respects, including a wall-to-fluid heat transfer model modification implemented in the code, against an independent international experimental database. The ICSP focuses on two types of transient experiments: a loss of feedwater transient with subsequent ADS operation and long term cooling (SP-2), and normal operating conditions at different power levels (SP-3). In the present study, KINS simulation results obtained with the MARS-KS code (KS-002 version) for the SP-2 experiment are presented in detail, and conclusions on MARS-KS code performance drawn from this simulation are described. The performance of the MARS-KS code is evaluated through the simulation of the loss of feedwater transient of the MASLWR test facility. The steady-state run shows that the helical-coil-specific heat transfer models implemented in the code are reasonable. However, the transient run also shows that the three-dimensional effects within the HPC and the axial conduction effect through the HTP are not well reproduced by the code.

  20. Reproducing ten years of road ageing - Accelerated carbonation and leaching of EAF steel slag

    Energy Technology Data Exchange (ETDEWEB)

    Suer, Pascal, E-mail: pascal.suer@swedgeo.se [Swedish Geotechnical Institute, Linkoeping (Sweden); Lindqvist, Jan-Erik [Swedish Cement and Concrete Research Institute, Boras (Sweden); Arm, Maria; Frogner-Kockum, Paul [Swedish Geotechnical Institute, Linkoeping (Sweden)

    2009-09-01

    Reuse of industrial aggregates is still hindered by concern for their long-term properties. This paper proposes a laboratory method for accelerated ageing of steel slag, to predict environmental and technical properties, starting from fresh slag. Ageing processes in a 10-year old asphalt road with steel slag of electric arc furnace (EAF) type in the subbase were identified by scanning electron microscopy (SEM) and leaching tests. Samples from the road centre and the pavement edge were compared with each other and with samples of fresh slag. It was found that slag from the pavement edge showed traces of carbonation and leaching processes, whereas the road centre material was nearly identical to fresh slag, in spite of an accessible particle structure. Batches of moisturized road centre material exposed to oxygen, nitrogen or carbon dioxide (CO2) were used for accelerated ageing. Time (7-14 days), temperature (20-40 °C) and initial slag moisture content (8-20%) were varied to achieve the carbonation (decrease in pH) and leaching that was observed in the pavement edge material. After ageing, water was added to assess leaching of metals and macroelements. 12% moisture, CO2 and seven days at 40 °C gave the lowest pH value. This also reproduced the observed ageing effect for Ca, Cu, Ba, Fe, Mn, Pb (decreased leaching) and for V, Si, and Al (increased leaching). However, ageing effects on SO4, DOC and Cr were not reproduced.

  1. Beyond Bundles - Reproducible Software Environments with GNU Guix

    CERN Multimedia

    CERN. Geneva; Wurmus, Ricardo

    2018-01-01

    Building reproducible data analysis pipelines and numerical experiments is a key challenge for reproducible science, in which tools to reproduce software environments play a critical role. The advent of “container-based” deployment tools such as Docker and Singularity has made it easier to replicate software environments. These tools are very much about bundling the bits of software binaries in a convenient way, not so much about describing how software is composed. Science is not just about replicating, though—it demands the ability to inspect and to experiment. In this talk we will present GNU Guix, a software management toolkit. Guix departs from container-based solutions in that it enables declarative composition of software environments. It is comparable to “package managers” like apt or yum, but with a significant difference: Guix provides accurate provenance tracking of build artifacts, and bit-reproducible software. We will illustrate the many ways in which Guix can improve how software en...

  2. Development of a CAD-based neutron transport code with the method of characteristics

    International Nuclear Information System (INIS)

    Chen Zhenping; Wang Dianxi; He Tao; Wang Guozhong; Zheng Huaqing

    2012-01-01

    The main problem determining whether the method of characteristics (MOC) can be used in complicated and highly heterogeneous geometry is how to combine an effective geometry processing method with MOC. In this study, a new approach making use of MCAM, a Multi-Calculation Automatic Modeling program for Neutronics and Radiation Transport developed by the FDS Team, for geometry description and ray tracing of particle transport was put forward to solve the geometry problem mentioned above. Based on this theory and approach, a two-dimensional neutron transport code was developed and integrated into VisualBUS, also developed by the FDS Team. Several benchmarks were used to verify the validity of the code, and the numerical results agreed very well with the reference values, which indicates the accuracy and feasibility of the method and the MOC code. (authors)
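    The geometry kernel such a code needs is the computation of the track lengths a characteristic ray cuts through each region. For a uniform rectangular mesh (far simpler than the CAD geometry MCAM handles) this can be sketched as follows (toy code, angle restricted to the first quadrant):

```python
import math

def trace_segments(x0, y0, angle, cell=1.0, nx=4, ny=4):
    """Cells crossed by a characteristic ray and the track length in each,
    on a uniform nx-by-ny grid of square cells. A toy stand-in for the
    CAD-based ray tracing described above; angle must lie in (0, pi/2)."""
    dx, dy = math.cos(angle), math.sin(angle)
    segs, x, y = [], x0, y0
    while 0 <= x < nx * cell and 0 <= y < ny * cell:
        # Distances along the ray to the next vertical/horizontal grid line:
        tx = ((math.floor(x / cell) + 1) * cell - x) / dx
        ty = ((math.floor(y / cell) + 1) * cell - y) / dy
        t = min(tx, ty)
        segs.append((int(x // cell), int(y // cell), t))
        # Nudge past the boundary to avoid landing exactly on a grid line:
        x += (t + 1e-12) * dx
        y += (t + 1e-12) * dy
    return segs

# A 45-degree ray from the origin crosses the diagonal cells:
for cx, cy, length in trace_segments(0.0, 0.0, math.pi / 4):
    print(cx, cy, round(length, 6))  # each track length is ~sqrt(2)
```

    The MOC then sweeps many such rays at many angles, attenuating the angular flux segment by segment.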

  3. [Natural head position's reproducibility on photographs].

    Science.gov (United States)

    Eddo, Marie-Line; El Hayeck, Émilie; Hoyeck, Maha; Khoury, Élie; Ghoubril, Joseph

    2017-12-01

    The purpose of this study is to evaluate the reproducibility over time of natural head position on profile photographs. Our sample is composed of 96 students (20-30 years old) at the department of dentistry of Saint Joseph University in Beirut. Two profile photographs were taken in natural head position about a week apart. No significant differences were found between T0 and T1 (E = 1.065°). Many studies have confirmed this reproducibility over time. Natural head position can be adopted as an orientation for profile photographs in orthodontics. © EDP Sciences, SFODF, 2017.

  4. Comparison of Absolute Apparent Diffusion Coefficient (ADC) Values in ADC Maps Generated Across Different Postprocessing Software: Reproducibility in Endometrial Carcinoma.

    Science.gov (United States)

    Ghosh, Adarsh; Singh, Tulika; Singla, Veenu; Bagga, Rashmi; Khandelwal, Niranjan

    2017-12-01

    Apparent diffusion coefficient (ADC) maps are usually generated by built-in software provided by the MRI scanner vendors; however, various open-source postprocessing software packages are available for image manipulation and parametric map generation. The purpose of this study is to establish the reproducibility of absolute ADC values obtained using different postprocessing software programs. DW images with three b values were obtained with a 1.5-T MRI scanner, and the trace images were obtained. ADC maps were automatically generated by the in-line software provided by the vendor during image generation and were also separately generated with postprocessing software. These ADC maps were compared on the basis of ROIs using the paired t test, Bland-Altman plots, mountain plots, and Passing-Bablok regression plots. There was a statistically significant difference in the mean ADC values obtained from the different postprocessing software programs when the same baseline trace DW images were used for ADC map generation. For using ADC values as a quantitative cutoff for histologic characterization of tissues, standardization of the postprocessing algorithm is essential across processing software packages, especially in view of the implementation of vendor-neutral archiving.
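    As background, an ADC map assigns to each voxel the decay rate of a mono-exponential fit S(b) = S0·exp(−b·ADC) across the acquired b values. A per-voxel sketch (log-linear least squares on simulated signals; vendor pipelines differ in fitting and noise handling, which is one plausible source of the discrepancies reported):

```python
import math

def adc_from_signals(bvals, signals):
    """Least-squares mono-exponential fit S(b) = S0 * exp(-b * ADC),
    via linear regression of ln(S) on b. ADC is in mm^2/s when b is
    given in s/mm^2. A sketch of the standard calculation only."""
    n = len(bvals)
    logs = [math.log(s) for s in signals]
    mb = sum(bvals) / n
    ml = sum(logs) / n
    slope = sum((b - mb) * (l - ml) for b, l in zip(bvals, logs)) / \
            sum((b - mb) ** 2 for b in bvals)
    return -slope

# Signals simulated for one voxel with ADC = 1.0e-3 mm^2/s, S0 = 1000:
bvals = [0, 500, 1000]                                  # s/mm^2
signals = [1000 * math.exp(-b * 1.0e-3) for b in bvals]
print(adc_from_signals(bvals, signals))  # recovers ~0.001
```
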

  5. The FORTRAN NALAP code adapted to a microcomputer compiler

    International Nuclear Information System (INIS)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso

    2010-01-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA (TEcnologia de Reatores Rapidos Avancados - Technology for Advanced Fast Reactors) project, aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to a microcomputer compiler called Compaq Visual Fortran (Version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly to remove unnecessary routines, eliminate obsolete statements, introduce new ones and include an extended precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: the normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a loss of flow transient; and protected overpower transients. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 1980s. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the usage of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  6. The FORTRAN NALAP code adapted to a microcomputer compiler

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso, E-mail: plobo.a@uol.com.b, E-mail: eduardo@ieav.cta.b, E-mail: fbraz@ieav.cta.b, E-mail: guimarae@ieav.cta.b [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)

    2010-07-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA (TEcnologia de Reatores Rapidos Avancados - Technology for Advanced Fast Reactors) project, aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to a microcomputer compiler called Compaq Visual Fortran (Version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly to remove unnecessary routines, eliminate obsolete statements, introduce new ones and include an extended precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: the normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a loss of flow transient; and protected overpower transients. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 1980s. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the usage of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  7. Audiovisual biofeedback improves diaphragm motion reproducibility in MRI

    Science.gov (United States)

    Kim, Taeho; Pollock, Sean; Lee, Danny; O’Brien, Ricky; Keall, Paul

    2012-01-01

    Purpose: In lung radiotherapy, cycle-to-cycle breathing variations result in four-dimensional computed tomography imaging artifacts, leading to inaccurate beam coverage and tumor targeting. In previous studies, the effect of audiovisual (AV) biofeedback on external respiratory signal reproducibility has been investigated, but internal anatomic motion has not been fully studied. The aim of this study is to test the hypothesis that AV biofeedback improves diaphragm motion reproducibility of internal anatomy using magnetic resonance imaging (MRI). Methods: To test this hypothesis, 15 healthy human subjects were enrolled in an ethics-approved AV biofeedback study consisting of two imaging sessions spaced ∼1 week apart. Within each session, MR images were acquired under free breathing and AV biofeedback conditions. The respiratory signal for the AV biofeedback system was obtained by optical monitoring of an external marker placed on the abdomen. Synchronously, serial thoracic 2D MR images were obtained to measure diaphragm motion, using a fast gradient-recalled-echo MR pulse sequence in both coronal and sagittal planes. The improvement in diaphragm motion reproducibility with the AV biofeedback system was quantified by comparing cycle-to-cycle variability in displacement, respiratory period, and baseline drift. Additionally, the variation in improvement between the two sessions was quantified. Results: The average root mean square error (RMSE) of diaphragm cycle-to-cycle displacement was reduced from 2.6 mm with free breathing to 1.6 mm (a 38% reduction) with AV biofeedback (p-value = 0.012). The diaphragm motion reproducibility improvements with AV biofeedback were consistent with the abdominal motion reproducibility observed from the external marker motion variation. Conclusions: This study was the first to investigate the potential of AV biofeedback to improve the motion

  8. Mercapto-ordered carbohydrate-derived porous carbon electrode as a novel electrochemical sensor for simple and sensitive ultra-trace detection of omeprazole in biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Kalate Bojdi, Majid [Department of Chemistry, Faculty of Science, Shahid Beheshti University, Tehran 1983963113 (Iran, Islamic Republic of); Faculty of Chemistry, Kharazmi (Tarbiat Moallem) University, Tehran (Iran, Islamic Republic of); Behbahani, Mohammad [Department of Chemistry, Faculty of Science, Shahid Beheshti University, Tehran 1983963113 (Iran, Islamic Republic of); Mashhadizadeh, Mohammad Hosein [Faculty of Chemistry, Kharazmi (Tarbiat Moallem) University, Tehran (Iran, Islamic Republic of); Bagheri, Akbar [Department of Chemistry, Faculty of Science, Shahid Beheshti University, Tehran 1983963113 (Iran, Islamic Republic of); Hosseiny Davarani, Saied Saeed, E-mail: ss-hosseiny@sbu.ac.ir [Department of Chemistry, Faculty of Science, Shahid Beheshti University, Tehran 1983963113 (Iran, Islamic Republic of); Farahani, Ali [Department of Chemistry, Faculty of Science, Shahid Beheshti University, Tehran 1983963113 (Iran, Islamic Republic of)

    2015-03-01

    We introduce a mercapto-mesoporous carbon modified carbon paste electrode (mercapto-MP-C-CPE) as a new sensor for trace determination of omeprazole (OM) in biological samples. The synthesized modifier was characterized by thermogravimetric analysis (TGA), differential thermal analysis (DTA), transmission electron microscopy (TEM), Fourier transform infrared spectrometry (FT-IR), X-ray diffraction (XRD), elemental analysis (CHN) and N₂ adsorption surface area measurement (BET). The electrochemical response of the modified CPE toward OM was investigated by cyclic and differential pulse voltammetry (CV and DPV). Under the optimized conditions, the proposed sensor displayed a good electrooxidation response to OM, with a linear range of 0.25 nM to 25 μM and a detection limit of 0.04 nM. The modified electrode offers several advantages: high sensitivity, long-term stability, a wide linear range, ease of preparation, regeneration of the electrode surface by simple polishing, and excellent reproducibility. - Highlights: • A modified nanoporous carbon as a novel sensor • High stability and good repeatability and reproducibility by the prepared sensor • Trace determination of omeprazole • Biological and pharmaceutical samples.

  9. Mercapto-ordered carbohydrate-derived porous carbon electrode as a novel electrochemical sensor for simple and sensitive ultra-trace detection of omeprazole in biological samples

    International Nuclear Information System (INIS)

    Kalate Bojdi, Majid; Behbahani, Mohammad; Mashhadizadeh, Mohammad Hosein; Bagheri, Akbar; Hosseiny Davarani, Saied Saeed; Farahani, Ali

    2015-01-01

    We introduce a mercapto-mesoporous carbon modified carbon paste electrode (mercapto-MP-C-CPE) as a new sensor for trace determination of omeprazole (OM) in biological samples. The synthesized modifier was characterized by thermogravimetric analysis (TGA), differential thermal analysis (DTA), transmission electron microscopy (TEM), Fourier transform infrared spectrometry (FT-IR), X-ray diffraction (XRD), elemental analysis (CHN) and N₂ adsorption surface area measurement (BET). The electrochemical response of the modified CPE toward OM was investigated by cyclic and differential pulse voltammetry (CV and DPV). Under the optimized conditions, the proposed sensor displayed a good electrooxidation response to OM, with a linear range of 0.25 nM to 25 μM and a detection limit of 0.04 nM. The modified electrode offers several advantages: high sensitivity, long-term stability, a wide linear range, ease of preparation, regeneration of the electrode surface by simple polishing, and excellent reproducibility. - Highlights: • A modified nanoporous carbon as a novel sensor • High stability and good repeatability and reproducibility by the prepared sensor • Trace determination of omeprazole • Biological and pharmaceutical samples

  10. Short- and long-term reproducibility of radioisotopic examination of gastric emptying

    Energy Technology Data Exchange (ETDEWEB)

    Jonderko, K. (Silesian School of Medicine, Katowice (Poland). Dept. of Gastroenterology)

    1990-01-01

    Reproducibility of gastric emptying (GE) of a radiolabelled solid meal was assessed. Short-term reproducibility was evaluated on the basis of 12 paired GE examinations performed 1-3 days apart; 12 paired GE examinations taken 3-8 months apart enabled assessment of long-term reproducibility. Reproducibility of the GE parameters was expressed in terms of the coefficient of variation (CV). No significant between-day variation of solid GE was found for either short- or long-term reproducibility. Although slightly higher CV values characterized the long-term reproducibility of the GE parameters considered, the variation of the differences between repeated GE examinations did not differ significantly between short- and long-term reproducibility. The results obtained justify the use of radioisotopic GE measurement for the assessment of both early and late results of pharmacologic or surgical management. (author).
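
    The coefficient of variation used in this record is straightforward to compute; a minimal sketch with hypothetical paired measurements (not the study's data):

```python
import statistics

def coefficient_of_variation(values):
    """CV = sample standard deviation / mean, quoted as a percentage."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Hypothetical paired gastric half-emptying times (minutes) from repeated tests
repeat_measurements = [62.0, 58.0]
cv = coefficient_of_variation(repeat_measurements)
print(round(cv, 1))
```

    A low CV across repeated examinations is what justifies using the technique to detect treatment effects.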

  11. ReproPhylo: An Environment for Reproducible Phylogenomics.

    Directory of Open Access Journals (Sweden)

    Amir Szitenberg

    2015-09-01

    Full Text Available The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, are built into the ReproPhylo system and do not require user intervention or configuration, because the system stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, together with its Git repository, is the primary reproducibility output of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform-independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution.
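
    The 'single serialized Python object' idea can be sketched in a few lines; the project structure and field names below are hypothetical stand-ins, not ReproPhylo's actual classes:

```python
import io
import pickle
import platform
import sys

# Hypothetical stand-in for a serialized workflow object: the analysis
# steps plus explicit environment provenance, kept in one file that a
# later session (or another researcher) can reload unchanged.
project = {
    "loci": ["18S", "28S"],
    "steps": ["align", "trim", "tree_inference"],
    "provenance": {
        "python": sys.version.split()[0],
        "platform": platform.system(),
    },
}

buf = io.BytesIO()            # stands in for the on-disk project file
pickle.dump(project, buf)     # serialize the whole experiment state
buf.seek(0)
restored = pickle.load(buf)   # a new session resumes exactly where it left off
print(restored["steps"] == project["steps"])
```

    In the real system, each change to such a file would additionally be committed to Git, giving provenance across iterations for free.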

  12. Bidirectional holographic codes and sub-AdS locality

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Zhao; Hayden, Patrick; Qi, Xiao-Liang [Stanford Institute for Theoretical Physics,Physics Department, Stanford University, CA 94304-4060 (United States)

    2016-01-28

    Tensor networks implementing quantum error correcting codes have recently been used to construct toy models of holographic duality explicitly realizing some of the more puzzling features of the AdS/CFT correspondence. These models reproduce the Ryu-Takayanagi entropy formula for boundary intervals, and allow bulk operators to be mapped to the boundary in a redundant fashion. These exactly solvable, explicit models have provided valuable insight but nonetheless suffer from many deficiencies, some of which we attempt to address in this article. We propose a new class of tensor network models that subsume the earlier advances and, in addition, incorporate additional features of holographic duality, including: (1) a holographic interpretation of all boundary states, not just those in a “code” subspace, (2) a set of bulk states playing the role of “classical geometries” which reproduce the Ryu-Takayanagi formula for boundary intervals, (3) a bulk gauge symmetry analogous to diffeomorphism invariance in gravitational theories, (4) emergent bulk locality for sufficiently sparse excitations, and (5) the ability to describe geometry at sub-AdS resolutions or even flat space.

  13. A ray-tracing study of electron cyclotron resonance heating in Tokamak plasmas with a superthermal electron tail

    International Nuclear Information System (INIS)

    Montes, A.; Dendy, R.O.

    1987-09-01

    We consider a Tokamak plasma in which the distribution of electron velocities in the direction parallel to the magnetic field has a monotonically decreasing superthermal tail. A fully three-dimensional ray-tracing code, which includes a realistic antenna pattern, toroidal effects, and refraction, is used to calculate the absorption of the extraordinary mode in the nonrelativistic limit away from perpendicular incidence. The ray-tracing approach extends results previously obtained in slab geometry (3-8) to a more realistic configuration; it is also essential in dealing with strong refraction in high-density plasmas. Our analytical model for the tail makes available a wide range of tail shapes and parameters. At low densities small tails (tail fraction [pt

  14. Modelling and Simulation of National Electronic Product Code Network Demonstrator Project

    Science.gov (United States)

    Mo, John P. T.

    The National Electronic Product Code (EPC) Network Demonstrator Project (NDP) was the first large-scale consumer goods track and trace investigation in the world using the full EPC protocol system for applying RFID technology in supply chains. The NDP demonstrated methods of sharing information securely using the EPC Network, providing authentication to interacting parties, and enhancing the ability to track and trace the movement of goods within the entire supply chain involving transactions among multiple enterprises. Due to project constraints, the actual run of the NDP lasted only 3 months, too short to yield consolidated quantitative results. This paper discusses the modelling and simulation of activities in the NDP in a discrete event simulation environment and provides an estimate of the potential benefits that could have been derived from the NDP had it continued for one whole year.

  15. Resurrecting Legacy Code Using Ontosoft Knowledge-Sharing and Digital Object Management to Revitalize and Reproduce Software for Groundwater Management Research

    Science.gov (United States)

    Kwon, N.; Gentle, J.; Pierce, S. A.

    2015-12-01

    Software code developed for research is often used for a relatively short period of time before it is abandoned, lost, or becomes outdated. This unintentional abandonment of code is a real problem in the 21st century scientific process, hindering widespread reusability and increasing the effort needed to develop research software. Potentially important assets, these legacy codes may be resurrected and documented digitally for long-term reuse, often with modest effort. Furthermore, the revived code may be made openly accessible in a public repository for researchers to reuse or improve. For this study, the research team has begun to revive the codebase for the Groundwater Decision Support System (GWDSS), originally developed for participatory decision making to aid urban planning and groundwater management, though it may serve multiple use cases beyond those originally envisioned. GWDSS was designed as a Java-based wrapper with loosely federated commercial and open-source components. If successfully revitalized, GWDSS will be useful for practical applications as a teaching tool and case study for groundwater management, as well as for informing theoretical research. Using the knowledge-sharing approaches documented by the NSF-funded Ontosoft project, digital documentation of GWDSS is underway, from conception to development, deployment, characterization, integration, composition, and dissemination through open-source communities and geosciences modeling frameworks. Information assets, documentation, and examples are shared using open platforms for data sharing and assigned digital object identifiers. Two instances of GWDSS version 3.0 are being created: 1) a virtual machine instance for the original case study to serve as a live demonstration of the decision support tool, assuring the original version is usable, and 2) an open version of the codebase, executable installation files, and developer guide available via an open repository, assuring the source for the

  16. A sensitive method to extract DNA from biological traces present on ammunition for the purpose of genetic profiling.

    Science.gov (United States)

    Dieltjes, Patrick; Mieremet, René; Zuniga, Sofia; Kraaijenbrink, Thirsa; Pijpe, Jeroen; de Knijff, Peter

    2011-07-01

    Exploring technological limits is common practice in forensic DNA research. Reliable genetic profiling based on only a few cells isolated from trace material retrieved from a crime scene is nowadays the rule rather than the exception. At many crime scenes, cartridges, bullets, and casings (jointly abbreviated as CBCs) are regularly found, and even after firing these potentially carry trace amounts of biological material. Since 2003, the Forensic Laboratory for DNA Research has been routinely involved in the forensic investigation of CBCs in the Netherlands. Reliable DNA profiles were frequently obtained from CBCs and used to match suspects, victims, or other crime scene-related DNA traces. In this paper, we describe the sensitive method we developed to extract DNA from CBCs. Using PCR-based genotyping of autosomal short tandem repeats, we obtained reliable and reproducible DNA profiles in 163 out of 616 criminal cases (26.5%) and in 283 out of 4,085 individual CBC items (6.9%) during the period January 2003-December 2009. We discuss practical aspects of the method and the sometimes unexpected effects of using cell lysis buffer on the subsequent investigation of striation patterns on CBCs.

  17. Reproducible Bioconductor workflows using browser-based interactive notebooks and containers.

    Science.gov (United States)

    Almugbel, Reem; Hung, Ling-Hong; Hu, Jiaming; Almutairy, Abeer; Ortogero, Nicole; Tamta, Yashaswi; Yeung, Ka Yee

    2018-01-01

    Bioinformatics publications typically include complex software workflows that are difficult to describe in a manuscript. We describe and demonstrate the use of interactive software notebooks to document and distribute bioinformatics research. We provide a user-friendly tool, BiocImageBuilder, that allows users to easily distribute their bioinformatics protocols through interactive notebooks uploaded to either a GitHub repository or a private server. We present four different interactive Jupyter notebooks using R and Bioconductor workflows to infer differential gene expression, analyze cross-platform datasets, process RNA-seq data and KinomeScan data. These interactive notebooks are available on GitHub. The analytical results can be viewed in a browser. Most importantly, the software contents can be executed and modified. This is accomplished using Binder, which runs the notebook inside software containers, thus avoiding the need to install any software and ensuring reproducibility. All the notebooks were produced using custom files generated by BiocImageBuilder. BiocImageBuilder facilitates the publication of workflows with a point-and-click user interface. We demonstrate that interactive notebooks can be used to disseminate a wide range of bioinformatics analyses. The use of software containers to mirror the original software environment ensures reproducibility of results. Parameters and code can be dynamically modified, allowing for robust verification of published results and encouraging rapid adoption of new methods. Given the increasing complexity of bioinformatics workflows, we anticipate that these interactive software notebooks will become as necessary for documenting software methods as traditional laboratory notebooks have been for documenting bench protocols, and as ubiquitous. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. 

  18. A criticality safety analysis code using a vectorized Monte Carlo method on the HITAC S-810 supercomputer

    International Nuclear Information System (INIS)

    Morimoto, Y.; Maruyama, H.

    1987-01-01

    A vectorized Monte Carlo criticality safety analysis code has been developed on the vector supercomputer HITAC S-810. In this code, a multi-particle tracking algorithm was adopted for effective utilization of the vector processor. A flight analysis with pseudo-scattering was developed to reduce the computational time needed for flight analysis, which represents the bulk of the computational time. This new algorithm achieved a speed-up by a factor of 1.5 over the conventional flight analysis. The code also adopted a Bondarenko-type multigroup cross-section library with 190 groups: 132 groups for the fast and epithermal regions and 58 groups for the thermal region. Evaluation work showed that this code reproduces experimental results for the effective neutron multiplication factor to an accuracy of about 1%. (author)

  19. Reproducibility of computer-aided detection system in digital mammograms

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Cho, Nariya; Cha, Joo Hee; Chung, Hye Kyung; Lee, Sin Ho; Cho, Kyung Soo; Kim, Sun Mi; Moon, Woo Kyung

    2005-01-01

    To evaluate the reproducibility of a computer-aided detection (CAD) system for digital mammograms, we applied the CAD system (ImageChecker M1000-DM, version 3.1; R2 Technology) to full-field digital mammograms. These mammograms were taken twice at an interval of 10-45 days (mean: 25 days) for 34 preoperative patients (breast cancer n=27, benign disease n=7; age range: 20-66 years, mean age: 47.9 years). On the mammograms, lesions were visible in 19 patients and were depicted as 15 masses and 12 calcification clusters. We analyzed the sensitivity, the false positive rate (FPR), and the reproducibility of the CAD marks. The broader sensitivities of the CAD system were 80% (12 of 15) and 67% (10 of 15) for masses, and 100% (12 of 12) for calcification clusters. The strict sensitivities were 50% (15 of 30) and 50% (15 of 30) for masses, and 92% (22 of 24) and 79% (19 of 24) for the clusters. The FPR was 0.21-0.22/image for masses and 0.03-0.04/image for clusters; the total FPR was 0.24-0.26/image. Among 132 mammography images, 59% (78 of 132) were identical regardless of the existence of CAD marks, and 22% (15 of 69) of the images with CAD marks were identical. The reproducibility of CAD marks was 67% (12 of 18) for true positive masses and 71% (17 of 24) for true positive clusters; it was 8% (4 of 53) for false positive masses and 14% (1 of 7) for false positive clusters. The reproducibility of all mass marks was 23% (16 of 71), and that of all cluster marks was 58% (18 of 31). The CAD system showed higher sensitivity and mark reproducibility for the calcification clusters, which are related to breast cancer. Yet the overall reproducibility of CAD marks was low; therefore, the CAD system must be applied with this limitation in mind.

  20. ARDISC (Argonne Dispersion Code): computer programs to calculate the distribution of trace element migration in partially equilibrating media

    International Nuclear Information System (INIS)

    Strickert, R.; Friedman, A.M.; Fried, S.

    1979-04-01

    A computer program (ARDISC, the Argonne Dispersion Code) is described which simulates the migration of nuclides in porous media and includes first order kinetic effects on the retention constants. The code allows for different absorption and desorption rates and solves the coupled migration equations by arithmetic reiterations. Input data needed are the absorption and desorption rates, equilibrium surface absorption coefficients, flow rates and volumes, and media porosities
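
    The migration scheme described above, with separate absorption and desorption rates solved by iteration, can be sketched as a toy cell model; the discretization and rate constants below are illustrative, not the original FORTRAN algorithm:

```python
# Toy column model: the mobile phase advects one cell per step, then
# exchanges with an immobile sorbed phase via first-order kinetics.
k_ads, k_des = 0.3, 0.1      # hypothetical absorption/desorption rates per step
n_cells, n_steps = 20, 15

mobile = [0.0] * n_cells
sorbed = [0.0] * n_cells
mobile[0] = 1.0              # unit pulse of tracer injected at the inlet

for _ in range(n_steps):
    mobile = [0.0] + mobile[:-1]          # advection: shift one cell downstream
    for i in range(n_cells):
        transfer = k_ads * mobile[i] - k_des * sorbed[i]
        mobile[i] -= transfer             # net exchange with the solid phase
        sorbed[i] += transfer

total = sum(mobile) + sum(sorbed)
print(round(total, 6))        # the scheme conserves tracer mass
```

    Because some tracer is held in the sorbed phase each step, the concentration peak lags behind pure advection, which is the retardation effect such codes quantify.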

  1. Predictive coding of music--brain responses to rhythmic incongruity.

    Science.gov (United States)

    Vuust, Peter; Ostergaard, Leif; Pallesen, Karen Johanne; Bailey, Christopher; Roepstorff, Andreas

    2009-01-01

    During the last decades, models of music processing in the brain have mainly discussed the specificity of brain modules involved in processing different musical components. We argue that predictive coding offers an explanatory framework for functional integration in musical processing. Further, we provide empirical evidence for such a network in the analysis of event-related MEG-components to rhythmic incongruence in the context of strong metric anticipation. This is seen in a mismatch negativity (MMNm) and a subsequent P3am component, which have the properties of an error term and a subsequent evaluation in a predictive coding framework. There were both quantitative and qualitative differences in the evoked responses in expert jazz musicians compared with rhythmically unskilled non-musicians. We propose that these differences trace a functional adaptation and/or a genetic pre-disposition in experts which allows for a more precise rhythmic prediction.

  2. A comparison of ray-tracing software for the design of quadrupole microbeam systems

    International Nuclear Information System (INIS)

    Incerti, S.; Smith, R.W.; Merchant, M.; Grime, G.W.; Meot, F.; Serani, L.; Moretto, Ph.; Touzeau, C.; Barberet, Ph.; Habchi, C.; Nguyen, D.T.

    2005-01-01

    For many years the only ray-tracing software available with sufficient precision for the design of quadrupole microbeam focusing systems has been OXRAY and its successor TRAX, developed at Oxford in the 1980s. With the current interest in pushing the beam diameter into the nanometre region, this software has become dated; more importantly, its precision at small displacements may not be sufficient, and new simulation tools are required. Two candidates are Zgoubi, developed at CEA as a general beam line design tool, and the CERN simulation program Geant in its latest version, Geant4. In order to use Geant4, new quadrupole field modules have been developed and implemented. In this paper the capabilities of the three codes TRAX, Zgoubi and Geant4 are reviewed. Comparisons of ray-tracing calculations in a high-demagnification quadrupole probe-forming system for the sub-micron region are presented

  3. Thermal radiation characteristics of nonisothermal cylindrical enclosures using a numerical ray tracing technique

    Science.gov (United States)

    Baumeister, Joseph F.

    1990-01-01

    Analysis of energy emitted from simple or complex cavity designs can lead to intricate solutions due to nonuniform radiosity and irradiation within a cavity. A numerical ray tracing technique was applied to simulate radiation propagating within and from various cavity designs. To obtain the energy balance relationships between isothermal and nonisothermal cavity surfaces and space, the computer code NEVADA was utilized for its statistical technique applied to numerical ray tracing. The analysis method was validated by comparing results with known theoretical and limiting solutions, and the electrical resistance network method. In general, for nonisothermal cavities the performance (apparent emissivity) is a function of cylinder length-to-diameter ratio, surface emissivity, and cylinder surface temperatures. The extent of nonisothermal conditions in a cylindrical cavity significantly affects the overall cavity performance. Results are presented over a wide range of parametric variables for use as a possible design reference.
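
    The statistical ray-tracing idea behind such cavity calculations can be illustrated with a deliberately simplified geometry: a diffuse spherical cavity, where each reflected energy bundle strikes a uniformly distributed wall point, so escape through the opening has a fixed per-bounce probability equal to the opening's area fraction. This sketches the Monte Carlo technique only, not the NEVADA code or the cylindrical geometry of the study:

```python
import random

random.seed(42)

def escape_probability_mc(eps, f, n_bundles=200_000):
    """Trace energy bundles in a diffuse spherical cavity.

    Each bounce, a bundle exits through the opening with probability f
    (the opening's area fraction); otherwise it strikes the wall and is
    absorbed with probability eps, the surface emissivity.
    """
    escaped = 0
    for _ in range(n_bundles):
        while True:
            if random.random() < f:        # bundle leaves through the opening
                escaped += 1
                break
            if random.random() < eps:      # bundle absorbed at the wall
                break
    return escaped / n_bundles

eps, f = 0.5, 0.1
mc = escape_probability_mc(eps, f)
analytic = f / (1.0 - (1.0 - f) * (1.0 - eps))   # geometric-series result
print(abs(mc - analytic) < 0.01)
```

    Comparing the Monte Carlo tally against the closed-form geometric series mirrors the validation step in the abstract, where ray-traced results are checked against known limiting solutions.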

  4. Checking Flight Rules with TraceContract: Application of a Scala DSL for Trace Analysis

    Science.gov (United States)

    Barringer, Howard; Havelund, Klaus; Morris, Robert A.

    2011-01-01

    Typically during the design and development of a NASA space mission, rules and constraints are identified to help reduce reasons for failure during operations. These flight rules are usually captured in a set of indexed tables, containing rule descriptions, rationales for the rules, and other information. Flight rules can be part of manual operations procedures carried out by humans. However, they can also be automated, and either implemented as on-board monitors, or as ground based monitors that are part of a ground data system. In the case of automated flight rules, one considerable expense to be addressed for any mission is the extensive process by which system engineers express flight rules in prose, software developers translate these requirements into code, and then both experts verify that the resulting application is correct. This paper explores the potential benefits of using an internal Scala DSL for general trace analysis, named TRACECONTRACT, to write executable specifications of flight rules. TRACECONTRACT can generally be applied to analysis of for example log files or for monitoring executing systems online.
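
    A flight rule expressed as an executable trace specification can be sketched in a few lines of Python (TRACECONTRACT itself is a Scala DSL; the rule, event format, and names below are hypothetical):

```python
# Hypothetical flight rule: every 'command' event must be acknowledged
# before the next command is issued.

def check_command_ack(trace):
    """Return a list of rule violations found in the event trace."""
    violations = []
    pending = None
    for i, event in enumerate(trace):
        kind = event["kind"]
        if kind == "command":
            if pending is not None:
                violations.append(f"command at {i} issued before ack of {pending}")
            pending = event["name"]
        elif kind == "ack" and event["name"] == pending:
            pending = None
    if pending is not None:
        violations.append(f"command {pending!r} never acknowledged")
    return violations

log = [
    {"kind": "command", "name": "OPEN_VALVE"},
    {"kind": "ack",     "name": "OPEN_VALVE"},
    {"kind": "command", "name": "FIRE_THRUSTER"},
]
print(check_command_ack(log))
```

    The appeal of the DSL approach is that the prose rule, its formalization, and the monitor that checks log files are one and the same artifact.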

  5. TRACE/PARCS validation for BWR stability based on OECD/NEA Oskarshamn-2 benchmark

    International Nuclear Information System (INIS)

    Kozlowski, T.; Roshan, S.; Lefvert, T.; Downar, T.; Xu, Y.; Wysocki, A.; Ivanov, K.; Magedanz, J.; Hardgrove, M.; Netterbrant, C.; March-Leuba, J.; Hudson, N.; Sandervag, O.; Bergman, A.

    2011-01-01

    On February 25, 1999, the Oskarshamn-2 NPP experienced a stability event which culminated in diverging power oscillations with a decay ratio greater than 1.3. The event was successfully modeled with the TRACE/PARCS coupled code system, and the details of the modeling and solution are described in the paper. The obtained results show excellent agreement with the plant data, capturing the entire behavior of the transient, including the onset of instability, the growth of the oscillation (decay ratio), and the oscillation frequency. The event allows coupled-code validation for a BWR with a real, challenging stability event, one that tests the accuracy of neutron kinetics (NK), thermal-hydraulics (TH), and TH/NK coupling. The success of this work has demonstrated the ability of 3-D coupled code systems to capture the complex behavior of BWR stability events. The problem has been released as an international OECD/NEA benchmark, the first benchmark based on measured plant data for a stability event with a DR greater than one. Interested participants are invited to contact the authors for more information. (author)
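
    The decay ratio quoted above is the growth factor between successive oscillation peaks; a value above one means the oscillation diverges. A minimal sketch on a synthetic signal (illustrative only, not the benchmark data):

```python
import math

def decay_ratio(signal):
    """Mean ratio of successive local maxima; DR > 1 indicates a
    diverging oscillation, as in the Oskarshamn-2 event."""
    peaks = [signal[i] for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] > signal[i + 1]]
    ratios = [later / earlier for earlier, later in zip(peaks, peaks[1:])]
    return sum(ratios) / len(ratios)

# Synthetic power signal whose amplitude grows by a factor of 1.3 per cycle
signal = [1.3 ** (0.01 * k) * math.sin(2 * math.pi * 0.01 * k)
          for k in range(600)]
print(round(decay_ratio(signal), 2))
```

    Real plant signals would first be detrended and filtered; this sketch only shows the peak-ratio definition itself.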

  6. Experimental determination of trace-element partitioning between pargasite and a synthetic hydrous andesitic melt

    Science.gov (United States)

    Brenan, J. M.; Shaw, H. F.; Ryerson, F. J.; Phinney, D. L.

    1995-10-01

    therefore offer a means to discern the loss of amphibole from the melting assemblage. Elastic strain theory is applied to the partitioning data after the approaches of Beattie and Blundy and Wood and is used to predict amphibole/melt partition coefficients at conditions of P, T and composition other than those employed in this study. Given values of DCa, DTi and DK from previous partitioning studies, this approach yields amphibole/melt trace-element partition coefficients that reproduce measured values from the literature to within 40-45%. This degree of reproducibility is considered reasonable given that model parameters are derived from partitioning relations involving iron- and potassium-free amphibole.

  7. On Predictive Coding for Erasure Channels Using a Kalman Framework

    DEFF Research Database (Denmark)

    Arildsen, Thomas; Murthi, Manohar; Andersen, Søren Vang

    2009-01-01

    We present a new design method for robust low-delay coding of autoregressive sources for transmission across erasure channels. It is a fundamental rethinking of existing concepts. It considers the encoder a mechanism that produces signal measurements from which the decoder estimates the original signal. The method is based on linear predictive coding and Kalman estimation at the decoder. We employ a novel encoder state-space representation with a linear quantization noise model. The encoder is represented by the Kalman measurement at the decoder. The presented method designs the encoder and decoder offline through an iterative algorithm based on closed-form minimization of the trace of the decoder state error covariance. The design method is shown to provide considerable performance gains, when the transmitted quantized prediction errors are subject to loss, in terms of signal-to-noise ratio...
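
    The encoder/decoder structure described above can be sketched for a scalar AR(1) source: the encoder quantizes the signal, packets are erased at random, and the decoder runs a Kalman filter that skips the measurement update on erasures. All parameters are illustrative, and this simplifies the paper's design (no offline encoder optimization):

```python
import random

random.seed(1)

# Scalar AR(1) source, quantized transmission over an erasure channel,
# and Kalman estimation at the decoder.
a, q = 0.9, 1.0               # AR coefficient, process noise variance
step = 0.5                    # quantizer step size
r = step ** 2 / 12.0          # white-noise model of quantization error
p_loss = 0.2                  # packet erasure probability

x = xhat = 0.0
P = 1.0                       # decoder's state error covariance
sq_err = 0.0
n_steps = 5000
for _ in range(n_steps):
    x = a * x + random.gauss(0.0, q ** 0.5)       # source evolves
    y = round(x / step) * step                    # encoder: quantized sample
    xhat, P = a * xhat, a * a * P + q             # decoder: time update
    if random.random() > p_loss:                  # measurement received
        K = P / (P + r)                           # Kalman gain
        xhat += K * (y - xhat)
        P *= 1.0 - K
    sq_err += (x - xhat) ** 2

mse = sq_err / n_steps
print(mse < 0.6)              # far better than predicting with no measurements
```

    On erased packets the decoder simply propagates its prediction and inflates its error covariance, which is what makes the scheme robust to loss.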

  8. Adsorption of trace gases to ice surfaces: surface, bulk and co-adsorbate effects

    Science.gov (United States)

    Kerbrat, Michael; Bartels-Rausch, Thorsten; Huthwelker, Thomas; Schneebeli, Martin; Pinzer, Bernd; Ammann, Markus

    2010-05-01

    Atmospheric ices frequently interact with trace gases and aerosols, making them an important storage, transport, or reaction medium in the global ecosystem. These interactions also alter the physical properties of the ice particles, with potential consequences for the global irradiation balance and for the relative humidity of surrounding air masses. We present recent results from a set of laboratory experiments of atmospheric relevance investigating the nature of the uptake processes. The focus of this talk is the partitioning of acetic acid and nitrous acid on ice surfaces. The presented results span from simple reversible adsorption experiments of a single trace gas on ice surfaces to more complex, but well controlled, experimental procedures that allowed us to (i) disentangle surface adsorption and uptake into the ice matrix using radioactively labelled trace gases, and (ii) show that simultaneous adsorption of acetic acid and nitrous acid on an ice surface is consistent with the Langmuir co-adsorption model. The experiments were done in a packed ice bed flow tube at atmospheric pressure and at temperatures between 213 and 253 K. The HONO gas-phase mixing ratio was between 0.4 and 137 ppbv; the mixing ratio of acetic acid was between 5 and 160 ppbv. The use of radioactively labelled nitrous acid molecules enabled in situ monitoring of the migration of the trace gas in the flow tube. The measurements showed that the interactions occur not only through adsorption but also via diffusion into polycrystalline ice. A method is suggested to disentangle the bulk and surface processes. The co-adsorption of acetic and nitrous acids was also investigated; the measurements are well reproduced by a competitive Langmuir adsorption model.
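
    The competitive Langmuir model mentioned above has a simple closed form: the fractional coverage of species i is θ_i = K_i·c_i / (1 + Σ_j K_j·c_j), so each co-adsorbate suppresses the others' coverage through the shared denominator. A minimal sketch with hypothetical partition constants and concentrations (not the study's fitted values):

```python
def competitive_langmuir(K, c):
    """Fractional surface coverages for co-adsorbing species.

    theta_i = K_i * c_i / (1 + sum_j K_j * c_j)
    """
    denom = 1.0 + sum(Ki * ci for Ki, ci in zip(K, c))
    return [Ki * ci / denom for Ki, ci in zip(K, c)]

# Hypothetical partition constants and gas-phase concentrations
# (arbitrary but consistent units) for HONO and acetic acid on ice
K = [2.0, 5.0]
c = [0.3, 0.1]
theta_hono, theta_acetic = competitive_langmuir(K, c)
print(round(theta_hono, 3), round(theta_acetic, 3))
```

    Setting one concentration to zero recovers the ordinary single-species Langmuir isotherm.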

  9. Python-Assisted MODFLOW Application and Code Development

    Science.gov (United States)

    Langevin, C.

    2013-12-01

    The U.S. Geological Survey (USGS) has a long history of developing and maintaining free, open-source software for hydrological investigations. The MODFLOW program is one of the most popular hydrologic simulation programs released by the USGS, and it is considered to be the most widely used groundwater flow simulation code. MODFLOW was written using a modular design and a procedural FORTRAN style, which resulted in code that could be understood, modified, and enhanced by many hydrologists. The code is fast, and because it uses standard FORTRAN it can be run on most operating systems. Most MODFLOW users rely on proprietary graphical user interfaces for constructing models and viewing model results. Some recent efforts, however, have focused on construction of MODFLOW models using open-source Python scripts. Customizable Python packages, such as FloPy (https://code.google.com/p/flopy), can be used to generate input files, read simulation results, and visualize results in two and three dimensions. Automating this sequence of steps leads to models that can be reproduced directly from original data and rediscretized in space and time. Python is also being used in the development and testing of new MODFLOW functionality. New packages and numerical formulations can be quickly prototyped and tested first with Python programs before implementation in MODFLOW. This is made possible by the flexible object-oriented design capabilities available in Python, the ability to call FORTRAN code from Python, and the ease with which linear systems of equations can be solved using SciPy, for example. Once new features are added to MODFLOW, Python can then be used to automate comprehensive regression testing and ensure reliability and accuracy of new versions prior to release.
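The script-driven workflow described above (generate input files directly from source data, then automatically check results against a baseline) can be sketched in pure Python. The two-line file format, names, and keys below are illustrative stand-ins, not actual MODFLOW or FloPy conventions:

```python
import pathlib
import tempfile

def write_grid_input(path, nlay, nrow, ncol, delr, delc):
    """Generate a model input deck directly from source parameters
    (hypothetical format, standing in for a MODFLOW package file)."""
    path.write_text(f"{nlay} {nrow} {ncol}\n{delr} {delc}\n")

def read_grid_input(path):
    """Parse the deck back, as a post-processing or testing script would."""
    grid_line, spacing_line = path.read_text().splitlines()
    nlay, nrow, ncol = map(int, grid_line.split())
    delr, delc = map(float, spacing_line.split())
    return {"nlay": nlay, "nrow": nrow, "ncol": ncol, "delr": delr, "delc": delc}

def regression_check(new, baseline, keys=("nlay", "nrow", "ncol")):
    """Automated regression test: a rebuilt model must match its baseline."""
    return all(new[k] == baseline[k] for k in keys)

# Round-trip: the deck is reproducible from the original data alone,
# so the model can be rebuilt or rediscretized at any time.
deck = pathlib.Path(tempfile.mkdtemp()) / "grid.txt"
write_grid_input(deck, 3, 10, 12, 100.0, 50.0)
params = read_grid_input(deck)
```

Because the deck is regenerated from parameters rather than edited by hand, the same script doubles as the regression harness the abstract describes.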

  10. Quantum mechanical path integrals in curved spaces and the type-A trace anomaly

    Energy Technology Data Exchange (ETDEWEB)

    Bastianelli, Fiorenzo [Dipartimento di Fisica ed Astronomia, Università di Bologna,via Irnerio 46, I-40126 Bologna (Italy); INFN, Sezione di Bologna,via Irnerio 46, I-40126 Bologna (Italy); Corradini, Olindo [Dipartimento di Scienze Fisiche, Informatiche e Matematiche,Università di Modena e Reggio Emilia,Via Campi 213/A, I-41125 Modena (Italy); INFN, Sezione di Bologna,via Irnerio 46, I-40126 Bologna (Italy); Vassura, Edoardo [Dipartimento di Fisica ed Astronomia, Università di Bologna,via Irnerio 46, I-40126 Bologna (Italy); INFN, Sezione di Bologna,via Irnerio 46, I-40126 Bologna (Italy)

    2017-04-10

    Path integrals for particles in curved spaces can be used to compute trace anomalies in quantum field theories, and more generally to study properties of quantum fields coupled to gravity in first quantization. While their construction in arbitrary coordinates is well understood, and known to require the use of a regularization scheme, in this article we take up an old proposal of constructing the path integral by using Riemann normal coordinates. The method assumes that curvature effects are taken care of by a scalar effective potential, so that the particle Lagrangian is reduced to that of a linear sigma model interacting with the effective potential. After fixing the correct effective potential, we test the construction on spaces of maximal symmetry and use it to compute heat kernel coefficients and type-A trace anomalies for a scalar field in arbitrary dimensions up to d=12. The results agree with expected ones, which are reproduced with great efficiency and extended to higher orders. We prove explicitly the validity of the simplified path integral on maximally symmetric spaces. This simplified path integral might be of further use in worldline applications, though its application on spaces of arbitrary geometry remains unclear.

  11. Reproducing 2D breast mammography images with 3D printed phantoms

    Science.gov (United States)

    Clark, Matthew; Ghammraoui, Bahaa; Badal, Andreu

    2016-03-01

    Mammography is currently the standard imaging modality used to screen women for breast abnormalities and, as a result, it is a tool of great importance for the early detection of breast cancer. Physical phantoms are commonly used as surrogates of breast tissue to evaluate some aspects of the performance of mammography systems. However, most phantoms do not reproduce the anatomic heterogeneity of real breasts. New fabrication technologies, such as 3D printing, have created the opportunity to build more complex, anatomically realistic breast phantoms that could potentially assist in the evaluation of mammography systems. The primary objective of this work is to present a simple, easily reproducible methodology to design and print 3D objects that replicate the attenuation profile observed in real 2D mammograms. The secondary objective is to evaluate the capabilities and limitations of the competing 3D printing technologies, and characterize the x-ray properties of the different materials they use. Printable phantoms can be created using the open-source code introduced in this work, which processes a raw mammography image to estimate the amount of x-ray attenuation at each pixel, and outputs a triangle mesh object that encodes the observed attenuation map. The conversion from the observed pixel gray value to a column of printed material with equivalent attenuation requires certain assumptions and knowledge of multiple imaging system parameters, such as x-ray energy spectrum, source-to-object distance, compressed breast thickness, and average breast material attenuation. A detailed description of the new software, a characterization of the printed materials using x-ray spectroscopy, and an evaluation of the realism of the sample printed phantoms are presented.
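The gray-value-to-column-height conversion described above is, at its core, an inversion of the Beer-Lambert law. The sketch below assumes a monoenergetic beam and a known attenuation coefficient for the print material, whereas the paper's code also folds in the x-ray spectrum, geometry, and breast thickness:

```python
import math

def column_height(pixel_value, unattenuated_value, mu_material):
    """Height of the printed-material column that reproduces the x-ray
    attenuation encoded in one mammogram pixel.

    Monoenergetic Beer-Lambert sketch: I = I0 * exp(-mu * t), hence
    t = -ln(I / I0) / mu.  The real conversion additionally needs the
    x-ray spectrum, source-to-object distance, and compressed breast
    thickness, as the abstract notes.
    """
    transmission = pixel_value / unattenuated_value
    return -math.log(transmission) / mu_material
```

Applying this per pixel yields the height map that the software then encodes as a triangle mesh for printing.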

  12. Scaling of Thermal-Hydraulic Phenomena and System Code Assessment

    International Nuclear Information System (INIS)

    Wolfert, K.

    2008-01-01

    In the last five decades, large efforts have been undertaken to provide reliable thermal-hydraulic system codes for the analyses of transients and accidents in nuclear power plants. Many separate effects tests and integral system tests were carried out to establish a data base for code development and code validation. In this context, the question has to be answered to what extent the results of down-scaled test facilities represent the thermal-hydraulic behaviour expected in a full-scale nuclear reactor under accident conditions. Scaling principles, developed by many scientists and engineers, present a scientific and technical basis and give valuable orientation for the design of test facilities. However, it is impossible for a down-scaled facility to reproduce all physical phenomena in the correct temporal sequence and in the kind and strength of their occurrence. The designer needs to optimize a down-scaled facility for the processes of primary interest, which inevitably leads to scaling distortions of other, less important processes. Taking these weak points into account, a goal-oriented code validation strategy is required, based on the analyses of separate effects tests and integral system tests as well as transients that occurred in full-scale nuclear reactors. The CSNI validation matrices are an excellent basis for fulfilling this task. Full-scale separate effects tests play an important role here.

  13. Code of Practice on the International Transboundary Movement of Radioactive Waste; Codigo De Practica Sobre Movimientos Internacionales Transfronterizos De Desechos Radiactivos

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-11-14

    On 21 September 1990, the General Conference, by resolution GC(XXXIV)/RES/530, adopted a Code of Practice on the International Transboundary Movement of Radioactive Waste and requested the Director General, inter alia, to take all necessary steps to ensure wide dissemination of the Code of Practice at both the national and the international level. The Code of Practice was elaborated by a Group of Experts established pursuant to resolution GC(XXXII)/RES/490 adopted by the General Conference in 1988. The text of the Code of Practice is reproduced herewith for the information of all Member States.

  14. Prediction Capability of SPACE Code about the Loop Seal Clearing on ATLAS SBLOCA

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Sung Won; Lee, Jong Hyuk; Chung, Bub Dong; Kim, Kyung Doo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    The break size most likely to lead to loop seal reforming was determined to be 4 inches by pre-calculations with RELAP5 and MARS. Many organizations have participated with various system analysis codes, for example RELAP5, MARS, and TRACE. KAERI participated with the SPACE code. SPACE has been developed for the design and safety analysis of nuclear thermal-hydraulic systems; KHNP and other organizations have collaborated on it over the last 10 years, and it is currently undergoing certification. SPACE can analyze the droplet field with a full governing equation set: continuity, momentum, and energy. The SPACE code has participated in the PKL-3 benchmark program as an international activity, and the DSP-04 benchmark problem is a domestic application of SPACE. The cold-leg top-slot break accident of the APR1400 reactor has been modeled and surveyed with the SPACE code. The benchmark experiment, part of the DSP-04 program, was performed in the ATLAS facility. A break size of 4 inches was selected for APR1400, and the corresponding scaled-down break size was modeled in SPACE. Loop seal reforming occurred in all four loops, but the PCT showed no significant behavior.

  15. Monte-Carlo Impurity transport simulations in the edge of the DIII-D tokamak using the MCI code

    International Nuclear Information System (INIS)

    Evans, T.E.; Mahdavi, M.A.; Sager, G.T.; West, W.P.; Fenstermacher, M.E.; Meyer, W.H.; Porter, G.D.

    1995-07-01

    A Monte-Carlo Impurity (MCI) transport code is used to follow trace impurities through multiple ionization states in realistic 2-D tokamak geometries. The MCI code is used to study impurity transport along the open magnetic field lines of the Scrape-off Layer (SOL) and to understand how impurities get into the core from the SOL. An MCI study concentrating on the entrainment of carbon impurity ions by the deuterium background plasma into the DIII-D divertor is discussed. MCI simulation results are compared to experimental DIII-D carbon measurements

  16. Monte-Carlo Impurity transport simulations in the edge of the DIII-D tokamak using the MCI code

    International Nuclear Information System (INIS)

    Evans, T.E.; Sager, G.T.; Mahdavi, M.A.; Porter, G.D.; Fenstermacher, M.E.; Meyer, W.H.

    1995-01-01

    A Monte-Carlo Impurity (MCI) transport code is used to follow trace impurities through multiple ionization states in realistic 2-D tokamak geometries. The MCI code is used to study impurity transport along the open magnetic field lines of the Scrape-off Layer (SOL) and to understand how impurities get into the core from the SOL. An MCI study concentrating on the entrainment of carbon impurity ions by the deuterium background plasma into the DIII-D divertor is discussed. MCI simulation results are compared to experimental DIII-D carbon measurements. 2 refs

  17. Highly reproducible polyol synthesis for silver nanocubes

    Science.gov (United States)

    Han, Hye Ji; Yu, Taekyung; Kim, Woo-Sik; Im, Sang Hyuk

    2017-07-01

    We could synthesize Ag nanocubes highly reproducibly by conducting the polyol synthesis with an HCl etchant in the dark, because the photodecomposition/photoreduction of the AgCl nanoparticles formed at the initial reaction stage was greatly suppressed; consequently, the selective self-nucleation of Ag single crystals and their selective growth could be promoted. In contrast, the reproducibility of Ag nanocube formation was very poor when the synthesis was carried out in light, owing to the photoreduction of AgCl to Ag.

  18. Understanding Notional Machines through Traditional Teaching with Conceptual Contraposition and Program Memory Tracing

    Directory of Open Access Journals (Sweden)

    Jeisson Hidalgo-Céspedes

    2016-08-01

    Full Text Available A correct understanding of how computers run code is mandatory in order to effectively learn to program. Lectures have historically been used in programming courses to teach how computers execute code, and students are assessed through traditional evaluation methods, such as exams. Constructivist learning theory objects to students' passiveness during lessons and to traditional quantitative methods of evaluating a complex cognitive process such as understanding. Constructivism proposes complementary techniques, such as conceptual contraposition and colloquies. We enriched lectures of a “Programming II” (CS2 course combining conceptual contraposition with program memory tracing, then we evaluated students’ understanding of programming concepts through colloquies. Results revealed that these techniques applied to the lecture are insufficient to help students develop satisfactory mental models of the C++ notional machine, and colloquies behaved as the most comprehensive traditional evaluations conducted in the course.

  19. TORBEAM 2.0, a paraxial beam tracing code for electron-cyclotron beams in fusion plasmas for extended physics applications

    Science.gov (United States)

    Poli, E.; Bock, A.; Lochbrunner, M.; Maj, O.; Reich, M.; Snicker, A.; Stegmeir, A.; Volpe, F.; Bertelli, N.; Bilato, R.; Conway, G. D.; Farina, D.; Felici, F.; Figini, L.; Fischer, R.; Galperti, C.; Happel, T.; Lin-Liu, Y. R.; Marushchenko, N. B.; Mszanowski, U.; Poli, F. M.; Stober, J.; Westerhof, E.; Zille, R.; Peeters, A. G.; Pereverzev, G. V.

    2018-04-01

    The paraxial WKB code TORBEAM (Poli, 2001) is widely used for the description of electron-cyclotron waves in fusion plasmas, retaining diffraction effects through the solution of a set of ordinary differential equations. With respect to its original form, the code has undergone significant transformations and extensions, in terms of both the physical model and the spectrum of applications. The code has been rewritten in Fortran 90 and transformed into a library, which can be called from within different (not necessarily Fortran-based) workflows. The models for both absorption and current drive have been extended, including e.g. fully-relativistic calculation of the absorption coefficient, momentum conservation in electron-electron collisions and the contribution of more than one harmonic to current drive. The code can be run also for reflectometry applications, with relativistic corrections for the electron mass. Formulas that provide the coupling between the reflected beam and the receiver have been developed. Accelerated versions of the code are available, with the reduced physics goal of inferring the location of maximum absorption (including or not the total driven current) for a given setting of the launcher mirrors. Optionally, plasma volumes within given flux surfaces and corresponding values of minimum and maximum magnetic field can be provided externally to speed up the calculation of full driven-current profiles. These can be employed in real-time control algorithms or for fast data analysis.
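The reflectometry application mentioned above relies on the beam turning around at the plasma cutoff. A minimal geometric-optics analogue (not the paraxial WKB equations TORBEAM actually solves) integrates the ray Hamiltonian for an unmagnetized slab in illustrative units, where the turning point must land at wp2(x) = omega**2:

```python
def trace_ray_to_cutoff(omega=1.0, slope=1.0, dtau=1e-4):
    """Trace a geometric-optics ray in an unmagnetized plasma slab with a
    linear density profile wp2(x) = slope * x (illustrative units, c = 1).

    Dispersion relation: H = k**2 + wp2(x) - omega**2 = 0.
    Hamilton's ray equations: dx/dtau = 2*k, dk/dtau = -slope.
    The ray must turn around at the cutoff wp2(x) = omega**2.
    """
    x, k = 0.0, omega      # H = 0 at the launch point x = 0
    x_max = 0.0
    while k > -omega:      # integrate through the turning point and back out
        k -= slope * dtau              # semi-implicit Euler: momentum first
        x += 2.0 * k * dtau
        x_max = max(x_max, x)
    return x_max
```

With omega = slope = 1, the cutoff sits at x = 1, and the traced maximum penetration reproduces it to O(dtau); the real code adds diffraction, absorption, and current drive on top of this ray skeleton.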

  20. Large-eddy simulation of stratified atmospheric flows with the CFD code Code-Saturne

    International Nuclear Information System (INIS)

    Dall'Ozzo, Cedric

    2013-01-01

    Large-eddy simulation (LES) of the physical processes in the atmospheric boundary layer (ABL) remains a complex subject. LES models have difficulty capturing the evolution of turbulence under different conditions of stratification. Consequently, LES of the whole diurnal cycle of the ABL, including convective situations in the daytime and stable situations at night, is seldom documented. The simulation of the stable atmospheric boundary layer, which is characterized by small eddies and by weak and sporadic turbulence, is especially difficult. The ability of LES to reproduce real meteorological conditions, particularly in stable situations, is therefore studied with Code-Saturne, the CFD code developed by EDF R and D. The first study consists in validating LES on a quasi-steady-state convective case over homogeneous terrain. The influence of the sub-grid-scale models (Smagorinsky model, Germano-Lilly model, Wong-Lilly model and Wall-Adapting Local Eddy-viscosity model) and the sensitivity to the parametrization method on the mean fields, fluxes and variances are discussed. In a second study, the diurnal cycle of the ABL during the Wangara experiment is simulated. The deviation from the measurements is small during the day, so this work focuses on the difficulties met during the night in simulating the stable atmospheric boundary layer. The impact of the different sub-grid-scale models and the sensitivity to the Smagorinsky constant have been analysed. By coupling radiative forcing with LES, the consequences of infrared and solar radiation for the nocturnal low-level jet and for the thermal gradient close to the surface are exposed. Moreover, the sensitivity of the simulated turbulence intensity to the domain resolution under the strong atmospheric stability of the Wangara experiment is analysed. Finally, a study of the numerical oscillations inherent to Code-Saturne is carried out in order to reduce their effects. (author) [fr]

  1. Reproducible Bioinformatics Research for Biologists

    Science.gov (United States)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  2. Reproducibility, controllability, and optimization of LENR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, David J. [The George Washington University, Washington DC 20052 (United States)

    2006-07-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments would significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR.

  3. Reproducibility, controllability, and optimization of LENR experiments

    International Nuclear Information System (INIS)

    Nagel, David J.

    2006-01-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments would significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR

  4. Addressing challenges in bar-code scanning of large-volume infusion bags.

    Science.gov (United States)

    Raman, Kirthana; Heelon, Mark; Kerr, Gary; Higgins, Thomas L

    2011-08-01

    A hospital pharmacy's efforts to identify and address challenges with bedside scanning of bar codes on large-volume parenteral (LVP) infusion bags are described. Bar-code-assisted medication administration (BCMA) has been shown to reduce medication errors and improve patient safety. After the pilot implementation of a BCMA system and point-of-care scanning procedures at a medical center's intensive care unit, it was noted that nurses' attempted bedside scans of certain LVP bags for product identification purposes often were not successful. An investigation and root-cause analysis, including observation of nurses' scanning technique by a multidisciplinary team, determined that the scanning failures stemmed from the adjacent placement of two bar-code imprints on the LVP bags: one with the product identification code and another, larger imprint with the expiration date and lot number. The nursing staff was educated on a modified scanning technique, which resulted in significantly improved success rates in the scanning of the most commonly used LVP bags. Representatives of the LVP bag manufacturer met with hospital staff to discuss the problem and corrective measures. As part of a subsequent infusion bag redesign, the manufacturer discontinued the use of the bar-code imprint implicated in the scanning failures. Failures in scanning LVP bags were traced to problematic placement of bar-code imprints on the bags. Interdisciplinary collaboration, consultation with the bag manufacturer, and education of the nursing and pharmacy staff resulted in a reduction in scanning failures and the manufacturer's removal of one of the bar codes from its LVP bags.

  5. TRACE Assessment for BWR ATWS Analysis

    International Nuclear Information System (INIS)

    Cheng, L.Y.; Diamond, D.; Cuadra, Arantxa; Raitses, Gilad; Aronson, Arnold

    2010-01-01

    A TRACE/PARCS input model has been developed in order to be able to analyze anticipated transients without scram (ATWS) in a boiling water reactor. The model is based on one developed previously for the Browns Ferry reactor for doing loss-of-coolant accident analysis. This model was updated by adding the control systems needed for ATWS and a core model using PARCS. The control systems were based on models previously developed for the TRAC-B code. The PARCS model is based on information (e.g., exposure and moderator density (void) history distributions) obtained from General Electric Hitachi and cross sections for GE14 fuel obtained from an independent source. The model is able to calculate an ATWS, initiated by the closure of main steam isolation valves, with recirculation pump trip, water level control, injection of borated water from the standby liquid control system and actuation of the automatic depressurization system. The model is not considered complete and recommendations are made on how it should be improved.

  6. Validation matrix for the assessment of thermal-hydraulic codes for VVER LOCA and transients. A report by the OECD support group on the VVER thermal-hydraulic code validation matrix

    International Nuclear Information System (INIS)

    2001-06-01

    This report deals with an internationally agreed experimental test facility matrix for the validation of best estimate thermal-hydraulic computer codes applied for the analysis of VVER reactor primary systems in accident and transient conditions. Firstly, the main physical phenomena that occur during the considered accidents are identified, test types are specified, and test facilities that supplement the CSNI CCVMs and are suitable for reproducing these aspects are selected. Secondly, a list of selected experiments carried out in these facilities has been set down. The criteria to achieve the objectives are outlined. The construction of VVER Thermal-Hydraulic Code Validation Matrix follows the logic of the CSNI Code Validation Matrices (CCVM). Similar to the CCVM it is an attempt to collect together in a systematic way the best sets of available test data for VVER specific code validation, assessment and improvement, including quantitative assessment of uncertainties in the modelling of phenomena by the codes. In addition to this objective, it is an attempt to record information which has been generated in countries operating VVER reactors over the last 20 years so that it is more accessible to present and future workers in that field than would otherwise be the case. (authors)

  7. Reproducing Kernels and Coherent States on Julia Sets

    Energy Technology Data Exchange (ETDEWEB)

    Thirulogasanthar, K., E-mail: santhar@cs.concordia.ca; Krzyzak, A. [Concordia University, Department of Computer Science and Software Engineering (Canada)], E-mail: krzyzak@cs.concordia.ca; Honnouvo, G. [Concordia University, Department of Mathematics and Statistics (Canada)], E-mail: g_honnouvo@yahoo.fr

    2007-11-15

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems.

  8. Reproducing Kernels and Coherent States on Julia Sets

    International Nuclear Information System (INIS)

    Thirulogasanthar, K.; Krzyzak, A.; Honnouvo, G.

    2007-01-01

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems

  9. Optimization of Monte Carlo algorithms and ray tracing on GPUs

    International Nuclear Information System (INIS)

    Bergmann, R.M.; Vujic, J.L.

    2013-01-01

    To take advantage of the computational power of GPUs (Graphical Processing Units), algorithms that work well on CPUs must be modified to conform to the GPU execution model. In this study, typical task-parallel Monte Carlo algorithms have been reformulated in a data-parallel way, and the benefits of doing so are examined. We were able to show that the data-parallel approach greatly improves thread coherency and keeps thread blocks busy, improving GPU utilization compared to the task-parallel approach. Data-parallel does not, however, outperform the task-parallel approach with regard to speedup over CPU. Regarding the ray-tracing acceleration, OptiX shows promise for providing enough ray tracing speed to be used in a full 3D Monte Carlo neutron transport code for reactor calculations. It is important to note that it is necessary to operate on large datasets of particle histories in order to have good performance in both OptiX and the data-parallel algorithm since this reduces the impact of latency. Our paper also shows the need to rewrite standard Monte Carlo algorithms in order to take full advantage of these new, powerful processor architectures
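The task-parallel versus data-parallel reformulation discussed above can be illustrated with a toy absorption process (plain Python standing in for CUDA kernels; the collision model and probabilities are illustrative, not the paper's transport physics):

```python
import random

def collisions_task_parallel(n_particles, p_absorb, rng):
    """Task-parallel style: one worker follows one full particle
    history from birth to absorption (the classic CPU formulation)."""
    total = 0
    for _ in range(n_particles):
        while True:
            total += 1                      # one collision event
            if rng.random() < p_absorb:     # absorbed: history ends
                break
    return total

def collisions_data_parallel(n_particles, p_absorb, rng):
    """Data-parallel style: every sweep advances *all* live particles by
    one event.  The per-sweep body is uniform across particles, which is
    what keeps GPU thread blocks coherent and busy."""
    alive = n_particles
    total = 0
    while alive:
        total += alive                      # every live particle collides once
        alive = sum(rng.random() >= p_absorb for _ in range(alive))
    return total
```

Both formulations sample the same geometric distribution (mean 1/p collisions per history); only the loop structure, and hence the hardware mapping, differs.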

  10. Olfactory memory traces in Drosophila.

    Science.gov (United States)

    Berry, Jacob; Krause, William C; Davis, Ronald L

    2008-01-01

    In Drosophila, the fruit fly, coincident exposure to an odor and an aversive electric shock can produce robust behavioral memory. This behavioral memory is thought to be regulated by cellular memory traces within the central nervous system of the fly. These molecular, physiological, or structural changes in neurons, induced by pairing odor and shock, regulate behavior by altering the neurons' response to the learned environment. Recently, novel in vivo functional imaging techniques have allowed researchers to observe cellular memory traces in intact animals. These investigations have revealed interesting temporal and spatial dynamics of cellular memory traces. First, a short-term cellular memory trace was discovered that exists in the antennal lobe, an early site of olfactory processing. This trace represents the recruitment of new synaptic activity into the odor representation and forms for only a short period of time just after training. Second, an intermediate-term cellular memory trace was found in the dorsal paired medial neuron, a neuron thought to play a role in stabilizing olfactory memories. Finally, a long-term protein synthesis-dependent cellular memory trace was discovered in the mushroom bodies, a structure long implicated in olfactory learning and memory. Therefore, it appears that aversive olfactory associations are encoded by multiple cellular memory traces that occur in different regions of the brain with different temporal domains.

  11. Measuring stone surface area from a radiographic image is accurate and reproducible with the help of an imaging program.

    Science.gov (United States)

    Kurien, Abraham; Ganpule, Arvind; Muthu, V; Sabnis, R B; Desai, Mahesh

    2009-01-01

    The surface area of the stone from a radiographic image is one of the more suitable parameters defining stone bulk. The widely accepted method of measuring stone surface area is to count the number of square millimeters enclosed within a tracing of the stone outline on graph paper. This method is time consuming and cumbersome, with potential for human error, especially when multiple measurements are needed. The purpose of this study was to evaluate the accuracy, efficiency, and reproducibility of a commercially available imaging program, Adobe Photoshop 7.0, for the measurement of stone surface area. The instructions to calculate area using the software are simple and easy in a Windows-based format. The accuracy of the imaging software was estimated by measuring surface areas of shapes of known mathematical areas. The efficiency and reproducibility were then evaluated from radiographs of 20 persons with radiopaque upper-tract urinary stones. The surface areas of stone images were measured using both graph paper and imaging software. Measurements were repeated after 10 days to assess the reproducibility of the techniques. The time taken to measure the area by the two methods was also assessed separately. The accuracy of the imaging software was estimated to be 98.7%. The correlation coefficient between the two methods was R(2) = 0.97. The mean percentage variation using the imaging software was 0.68%, while it was 6.36% with the graph paper. The mean time taken to measure using the image analyzer and graph paper was 1.9 +/- 0.8 minutes and 4.5 +/- 1.08 minutes, respectively. The imaging program thus provides an accurate, efficient, and reproducible method of measuring stone surface area from radiographs compared with manual measurement using graph paper.
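The pixel-counting measurement that such an imaging program automates reduces to thresholding and scaling. A minimal sketch (hypothetical threshold and pixel pitch, not Adobe Photoshop's actual mechanism) is:

```python
def stone_surface_area(image, threshold, mm_per_pixel):
    """Planimetric stone area from a radiographic image: count pixels
    brighter than a radiopacity threshold and scale by the pixel size.
    This automates the square counting done by hand on graph paper."""
    stone_pixels = sum(value > threshold for row in image for value in row)
    return stone_pixels * mm_per_pixel ** 2

# Tiny 2x3 "radiograph": three bright (radiopaque) pixels at 0.5 mm pitch.
area_mm2 = stone_surface_area([[0, 200, 210], [0, 220, 0]], 100, 0.5)
```

Because the count is deterministic for a fixed threshold, repeated measurements agree exactly, which is the reproducibility advantage over manual tracing reported above.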

  12. HashDist: Reproducible, Relocatable, Customizable, Cross-Platform Software Stacks for Open Hydrological Science

    Science.gov (United States)

    Ahmadia, A. J.; Kees, C. E.

    2014-12-01

Developing scientific software is a continuous balance between not reinventing the wheel and getting fragile codes to interoperate with one another. Binary software distributions such as Anaconda provide a robust starting point for many scientific software packages, but this solution alone is insufficient for many scientific software developers. HashDist provides a critical component of the development workflow, enabling highly customizable, source-driven, and reproducible builds for scientific software stacks, available from both the IPython Notebook and the command line. To address these issues, the Coastal and Hydraulics Laboratory at the US Army Engineer Research and Development Center has funded the development of HashDist in collaboration with Simula Research Laboratories and the University of Texas at Austin. HashDist is motivated by a functional approach to package build management, and features intelligent caching of sources and builds, parametrized build specifications, and the ability to interoperate with system compilers and packages. HashDist enables the easy specification of "software stacks", which allow both the novice user to install a default environment and the advanced user to configure every aspect of their build in a modular fashion. As an advanced feature, HashDist builds can be made relocatable, allowing the easy redistribution of binaries on all three major operating systems, as well as on cloud and supercomputing platforms. As a final benefit, all HashDist builds are reproducible, with a build hash specifying exactly how each component of the software stack was installed. This talk discusses the role of HashDist in the hydrological sciences, including its use by the Coastal and Hydraulics Laboratory in the development and deployment of the Proteus Toolkit as well as the Rapid Operational Access and Maneuver Support project. We demonstrate HashDist in action, and show how it can effectively support development, deployment, teaching, and
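The build-hash idea can be illustrated in miniature: hash a canonicalized build specification so that logically identical specs always produce the same identifier. This is a sketch of the concept only, not HashDist's actual hashing scheme, and the package names are made up.

```python
import hashlib
import json

def build_hash(spec):
    """Hash a build specification deterministically: canonical JSON
    (sorted keys, fixed separators) so logically identical specs
    hash to the same identifier."""
    canonical = json.dumps(spec, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

spec_a = {"name": "petsc", "version": "3.4", "deps": ["mpi", "blas"]}
spec_b = {"version": "3.4", "name": "petsc", "deps": ["mpi", "blas"]}
assert build_hash(spec_a) == build_hash(spec_b)    # key order irrelevant
assert build_hash({**spec_a, "version": "3.5"}) != build_hash(spec_a)
```

Because the hash pins down every input to the build, two users who arrive at the same hash have, by construction, built the component the same way.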

  13. Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base

    Science.gov (United States)

    Savage, B.; Snoke, J. A.

    2017-12-01

The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size, from 16 to 32 in the 1980s and from 32 to 64 in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems, including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, variants of OSX processors (PowerPC and x86), and Windows (Cygwin). Traces of these systems are still visible in the source code and associated comments. A major concern while improving and maintaining a routinely used, legacy code is a fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Lab). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. However, there have been thousands of downloads a year of the package, either source code or binaries for specific systems. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC, including the use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion.

  14. Development of Unified Code for Environmental Research by Neutron Activation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Seung Yeon; Kim, Young Sik; Lee, Sang Mi; Chung, Sang Uk; Lee, Kyu Sung; Kang, Sang Hun; Cheon, Ki Hong [Yonsei University, Seoul (Korea, Republic of)

    1997-07-01

Three codes were developed to improve the accuracy and precision of neutron activation analysis by adopting the IAEA's recommended 'GANAAS' program, which has better peak identification and efficiency calibration algorithms than the currently used commercial program. The quantitative analytical capability for trace elements was improved such that the number of detectable elements, including environmentally important elements, was increased. Small and overlapped peaks can be detected more efficiently with good peak shape calibration (energy dependence of peak height, peak base width, and FWHM). Several efficiency functions were added to determine the detector efficiency more accurately, which was the main source of error in neutron activation analysis. Errors caused by the nuclear data themselves were reduced with the introduction of the k0 method. A new graphical program called 'POWER NAA' was developed for the recent personal computer environment, Windows 95, and for data compatibility. It also reduced errors caused by operator mistakes through easy and comfortable operation of the code. 11 refs., 3 tabs., 9 figs. (author)
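One of the additions mentioned is a set of detector efficiency functions. A standard form for gamma-detector efficiency calibration, sketched here with synthetic, hypothetical data (this particular log-log polynomial is a common choice, not necessarily the function used in these codes), is a polynomial fit of log(efficiency) against log(energy):

```python
import numpy as np

# Synthetic calibration points: efficiency vs. gamma energy (keV),
# hypothetical values following a typical falling power law with scatter
energies = np.array([122., 245., 344., 662., 779., 964., 1112., 1408.])
true_eff = 0.5 * energies ** -0.85
measured = true_eff * (1 + 0.01 * np.sin(energies))   # ~1% scatter

# Fit log(eff) as a polynomial in log(E): the classic calibration form
coeffs = np.polyfit(np.log(energies), np.log(measured), deg=2)

def efficiency(E):
    """Interpolated detector efficiency at energy E (keV)."""
    return np.exp(np.polyval(coeffs, np.log(E)))

ref = 0.5 * 500.0 ** -0.85
print(efficiency(500.0), ref)   # fitted value tracks the underlying curve
```

Fitting in log-log space keeps the fit well behaved over the several decades of efficiency spanned by a typical gamma spectrum.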

  15. Improvement of the spallation-reaction simulation code by considering both the high-momentum intranuclear nucleons and the preequilibrium process

    International Nuclear Information System (INIS)

    Ishibashi, K.; Miura, Y.; Sakae, T.

    1990-01-01

In the present study, intranuclear nucleons with a high momentum are introduced into the intranuclear cascade calculation, and preequilibrium effects are considered at the end of the cascade process. The improvements made in the HETC (High Energy Transport Code) are outlined, focusing on intranuclear nucleons with a high momentum and on termination of the intranuclear cascade process. The cutoff energy is discussed, and Monte Carlo calculations based on an excitation model are presented and analyzed. The experimental high energy neutrons in the backward direction are successfully reproduced. The preequilibrium effect is considered in a local manner, introduced as a simple probability density function for terminating the intranuclear cascade process. The resultant neutron spectra reproduce the shoulders of the experimental data in the region of 20 to 50 MeV. The exciton model is coded with a Monte Carlo algorithm. The effect of the exciton model calculation is not appreciable except for intermediate energy neutrons in the backward direction. (N.K.)

  16. Reproducible Computing: a new Technology for Statistics Education and Educational Research

    Science.gov (United States)

    Wessa, Patrick

    2009-05-01

This paper explains how the R Framework (http://www.wessa.net) and a newly developed Compendium Platform (http://www.freestatistics.org) allow us to create, use, and maintain documents that contain empirical research results which can be recomputed and reused in derived work. It is illustrated that this technological innovation can be used to create educational applications that can be shown to support effective learning of statistics and associated analytical skills. It is explained how a Compendium can be created by anyone, without the need to understand the technicalities of scientific word processing (LaTeX) or statistical computing (R code). The proposed Reproducible Computing system allows educational researchers to objectively measure key aspects of the actual learning process based on individual and constructivist activities such as peer review, collaboration in research, and computational experimentation. The system was implemented and tested in three statistics courses in which Compendia were used to create an interactive e-learning environment that simulated the real-world process of empirical scientific research.

  17. Characteristics of ion Bernstein wave heating in JIPPT-II-U tokamak

    International Nuclear Information System (INIS)

    Okamoto, M.; Ono, M.

    1985-11-01

Using a transport code combined with an ion Bernstein wave tokamak ray tracing code, a modelling code for ion Bernstein wave heating has been developed. Using this code, the ion Bernstein wave heating experiment on the JIPPT-II-U tokamak has been analyzed. It is assumed that the resonance layer is formed by the third harmonic of deuterium-like ions, such as fully ionized carbon and oxygen, near the plasma center. For wave absorption mechanisms, electron Landau damping, ion cyclotron harmonic damping, and collisional damping are considered. The characteristics of the ion Bernstein wave heating experiment, such as the ion temperature increase, the strong dependence of the quality factor on the magnetic field strength, and the dependence of the ion temperature increment on the input power, are well reproduced.

  18. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  19. A neural coding scheme reproducing foraging trajectories

    Science.gov (United States)

    Gutiérrez, Esther D.; Cabrera, Juan Luis

    2015-12-01

The movement of many animals may follow Lévy patterns. The underlying generating neuronal dynamics of such a behavior is unknown. In this paper we show that a novel discovery of multifractality in winnerless competition (WLC) systems reveals a potential encoding mechanism that is translatable into two dimensional superdiffusive Lévy movements. The validity of our approach is tested on a conductance based neuronal model showing WLC and through the extraction of Lévy flights inducing fractals from recordings of rat hippocampus during open field foraging. Further insights are gained by analyzing mouse motor cortex neurons and non-motor cell signals. The proposed mechanism provides a plausible explanation for the neuro-dynamical fundamentals of spatial searching patterns observed in animals (including humans) and illustrates a hitherto unknown way to encode information in neuronal temporal series.
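A two-dimensional superdiffusive Lévy trajectory of the kind the paper decodes from neuronal dynamics can be sketched by drawing step lengths from a heavy-tailed power law and headings uniformly at random. The exponent and minimum step length below are illustrative, not taken from the paper.

```python
import numpy as np

def levy_trajectory(n_steps, mu=2.0, l_min=1.0, seed=0):
    """2-D Lévy flight: step lengths ~ power law p(l) ∝ l^-mu for
    l >= l_min (inverse-CDF sampling), directions uniform on [0, 2*pi)."""
    rng = np.random.default_rng(seed)
    u = rng.random(n_steps)
    lengths = l_min * u ** (-1.0 / (mu - 1.0))   # inverse CDF of a Pareto
    angles = rng.uniform(0, 2 * np.pi, n_steps)
    steps = lengths[:, None] * np.c_[np.cos(angles), np.sin(angles)]
    return np.vstack([[0.0, 0.0], np.cumsum(steps, axis=0)])

path = levy_trajectory(1000)
step_lengths = np.hypot(*np.diff(path, axis=0).T)
print(step_lengths.max() / np.median(step_lengths))  # heavy tail: rare huge jumps
```

The resulting path mixes many short steps with rare very long relocations, the signature of superdiffusive search.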

  20. Reproducible diagnosis of Chronic Lymphocytic Leukemia by flow cytometry

    DEFF Research Database (Denmark)

    Rawstron, Andy C; Kreuzer, Karl-Anton; Soosapilla, Asha

    2018-01-01

    The diagnostic criteria for CLL rely on morphology and immunophenotype. Current approaches have limitations affecting reproducibility and there is no consensus on the role of new markers. The aim of this project was to identify reproducible criteria and consensus on markers recommended for the di...

  1. Participant Nonnaiveté and the reproducibility of cognitive psychology

    NARCIS (Netherlands)

    R.A. Zwaan (Rolf); D. Pecher (Diane); G. Paolacci (Gabriele); S. Bouwmeester (Samantha); P.P.J.L. Verkoeijen (Peter); K. Dijkstra (Katinka); R. Zeelenberg (René)

    2017-01-01

    textabstractMany argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature—three each from the domains of perception/action, memory, and language, respectively—and found that they are highly reproducible. Not only can

  2. Trace analysis of semiconductor materials

    CERN Document Server

    Cali, J Paul; Gordon, L

    1964-01-01

    Trace Analysis of Semiconductor Materials is a guidebook concerned with procedures of ultra-trace analysis. This book discusses six distinct techniques of trace analysis. These techniques are the most common and can be applied to various problems compared to other methods. Each of the four chapters basically includes an introduction to the principles and general statements. The theoretical basis for the technique involved is then briefly discussed. Practical applications of the techniques and the different instrumentations are explained. Then, the applications to trace analysis as pertaining

  3. CSNI Integral Test Facility Matrices for Validation of Best-Estimate Thermal-Hydraulic Computer Codes

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Internationally agreed Integral Test Facility (ITF) matrices for validation of realistic thermal hydraulic system computer codes were established. ITF development is mainly for Pressurised Water Reactors (PWRs) and Boiling Water Reactors (BWRs). A separate activity was for Russian Pressurised Water-cooled and Water-moderated Energy Reactors (WWER). Firstly, the main physical phenomena that occur during considered accidents are identified, test types are specified, and test facilities suitable for reproducing these aspects are selected. Secondly, a list of selected experiments carried out in these facilities has been set down. The criteria to achieve the objectives are outlined. In this paper some specific examples from the ITF matrices will also be provided. The matrices will be a guide for code validation, will be a basis for comparisons of code predictions performed with different system codes, and will contribute to the quantification of the uncertainty range of code model predictions. In addition to this objective, the construction of such a matrix is an attempt to record information which has been generated around the world over the last years, so that it is more accessible to present and future workers in that field than would otherwise be the case.

  4. TRACE analysis of Phenix core response to an increase of the core inlet sodium temperature

    Energy Technology Data Exchange (ETDEWEB)

    Chenu, A., E-mail: aurelia.chenu@psi.ch [Paul Scherrer Inst., Villigen PSI (Switzerland); Ecole Polytechnique Federale (Switzerland); Mikityuk, K., E-mail: konstantin.mikityuk@psi.ch [Paul Scherrer Inst., Villigen PSI (Switzerland); Adams, R., E-mail: robert.adams@psi.ch [Paul Scherrer Inst., Villigen PSI (Switzerland); Eidgenossische Technische Hochschule, Zurich (Switzerland); Chawla, R., E-mail: rakesh.chawla@epfl.ch [Paul Scherrer Inst., Villigen PSI (Switzerland); Ecole Polytechnique Federale (Switzerland)

    2011-07-01

This work presents the analysis, using the TRACE code, of the Phenix core response to an inlet sodium temperature increase. The considered experiment was performed in the frame of the Phenix End-Of-Life (EOL) test program of the CEA, prior to the final shutdown of the reactor. It corresponds to a transient following a 40°C increase of the core inlet temperature, which leads to a power decrease of 60%. This work focuses on the first phase of the transient, prior to the reactor scram and pump trip. First, the thermal-hydraulic TRACE model of the core developed for the present analysis is described. The kinetic parameters and feedback coefficients for the point kinetic model were derived from a 3D static neutronic ERANOS model developed in a former study. The calculated kinetic parameters were then optimized, before use, on the basis of the experimental reactivity in order to minimize the error on the power calculation. The different reactivity feedbacks taken into account include various expansion mechanisms that have been specifically implemented in TRACE for analysis of fast-neutron spectrum systems. The point kinetic model has been used to study the sensitivity of the core response to the different feedback effects. The comparison of the calculated results with the experimental data reveals the need to accurately calculate the reactivity feedback coefficients, because the reactor response is very sensitive to small reactivity changes. The study has thus enabled us to assess the sensitivity of the power change to the different reactivity feedbacks and to identify the most important parameters. As such, it furthers the validation of the FAST code system, which is being used to gain a more in-depth understanding of SFR core behavior during accidental transients. (author)
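The point-kinetics-with-feedback approach used here can be sketched in its simplest form: one delayed-neutron group and a single lumped temperature feedback. All constants below are illustrative placeholders, not Phenix data or TRACE's actual model.

```python
# One-group point kinetics with a lumped temperature feedback,
# integrated by explicit Euler (illustrative constants throughout).
beta, lam, Lambda = 0.0035, 0.08, 1e-6   # delayed fraction, decay, gen. time
alpha = -1e-5    # reactivity per K of temperature rise (hypothetical)
h = 0.05         # K of heating per unit power per second (hypothetical)

def run(rho_ext, t_end=20.0, dt=1e-4):
    P, C, dT = 1.0, beta / (lam * Lambda), 0.0   # start at equilibrium
    for _ in range(int(t_end / dt)):
        rho = rho_ext + alpha * dT               # external + feedback
        dPdt = (rho - beta) / Lambda * P + lam * C
        dCdt = beta / Lambda * P - lam * C
        P += dt * dPdt
        C += dt * dCdt
        dT += dt * (h * P - 0.01 * dT)           # crude heat balance
    return P

# A step of negative external reactivity (e.g. from inlet heating)
# drives the power below its initial value
P_final = run(rho_ext=-0.5 * beta)
print(P_final)
```

Even this toy model shows why the abstract stresses accurate feedback coefficients: the computed power trajectory responds strongly to small changes in `alpha` and `rho_ext`.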

  5. Efficient coding of spectrotemporal binaural sounds leads to emergence of the auditory space representation

    Science.gov (United States)

    Młynarski, Wiktor

    2014-01-01

To date a number of studies have shown that receptive field shapes of early sensory neurons can be reproduced by optimizing coding efficiency of natural stimulus ensembles. A still unresolved question is whether the efficient coding hypothesis explains formation of neurons which explicitly represent environmental features of different functional importance. This paper proposes that the spatial selectivity of higher auditory neurons emerges as a direct consequence of learning efficient codes for natural binaural sounds. Firstly, it is demonstrated that a linear efficient coding transform, Independent Component Analysis (ICA), trained on spectrograms of naturalistic simulated binaural sounds extracts spatial information present in the signal. A simple hierarchical ICA extension allowing for decoding of sound position is proposed. Furthermore, it is shown that units revealing spatial selectivity can be learned from a binaural recording of a natural auditory scene. In both cases a relatively small subpopulation of learned spectrogram features suffices to perform accurate sound localization. Representation of the auditory space is therefore learned in a purely unsupervised way by maximizing the coding efficiency and without any task-specific constraints. These results imply that efficient coding is a useful strategy for learning structures which allow for making behaviorally vital inferences about the environment. PMID:24639644
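The efficient-coding step can be illustrated in miniature: after whitening, a two-dimensional ICA reduces to finding the rotation that maximizes non-Gaussianity (here, excess kurtosis). This toy example on synthetic Laplacian sources stands in for the paper's full spectrogram ICA.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two independent non-Gaussian sources, linearly mixed
S = rng.laplace(size=(2, 5000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

# Whiten: zero mean, identity covariance
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

def kurt(y):
    """Excess kurtosis, a simple non-Gaussianity measure."""
    return np.mean(y ** 4) - 3.0

# Scan rotations of the whitened data for maximum |kurtosis|
thetas = np.linspace(0, np.pi / 2, 500)
best = max(thetas, key=lambda t: abs(kurt(np.cos(t) * Z[0] + np.sin(t) * Z[1])))
y1 = np.cos(best) * Z[0] + np.sin(best) * Z[1]
y2 = -np.sin(best) * Z[0] + np.cos(best) * Z[1]

# Each recovered component should correlate strongly with one source
c = np.abs(np.corrcoef(np.vstack([y1, y2, S]))[:2, 2:])
print(c.max(axis=1))
```

Whitening removes all second-order structure, so the remaining rotation must be fixed by higher-order statistics, which is exactly what ICA exploits.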

  6. Modelling of LOCA Tests with the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L [Idaho National Laboratory; Pastore, Giovanni [Idaho National Laboratory; Novascone, Stephen Rhead [Idaho National Laboratory; Spencer, Benjamin Whiting [Idaho National Laboratory; Hales, Jason Dean [Idaho National Laboratory

    2016-05-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.
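The separate-effects burst analyses mentioned above rest on a simple structural idea that can be sketched generically: compare the thin-wall hoop stress in the cladding with a temperature-dependent burst stress. The correlation constants below are placeholders for illustration, not BISON's actual models.

```python
import math

def hoop_stress(p_mpa, radius_mm, thickness_mm):
    """Thin-wall hoop stress: sigma = p * r / t (MPa)."""
    return p_mpa * radius_mm / thickness_mm

def burst_stress(temp_k, a=830.0, b=0.0035):
    """Hypothetical exponential drop of burst strength with temperature."""
    return a * math.exp(-b * (temp_k - 600.0))

def bursts(p_mpa, radius_mm, thickness_mm, temp_k):
    """Burst criterion: hoop stress reaches the burst stress."""
    return hoop_stress(p_mpa, radius_mm, thickness_mm) >= burst_stress(temp_k)

# Same rod pressure: intact when cool, burst once ballooning has
# thinned the wall and the temperature has climbed
print(bursts(8.0, 4.75, 0.57, 700.0))   # False
print(bursts(8.0, 5.50, 0.30, 1100.0))  # True
```

Ballooning both enlarges the radius and thins the wall, so the hoop stress rises just as the heated material weakens, which is why burst timing is sensitive to the local (including azimuthal) temperature.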

  7. An investigation of the sub-grid variability of trace gases and aerosols for global climate modeling

    Directory of Open Access Journals (Sweden)

    Y. Qian

    2010-07-01

One fundamental property and limitation of grid based models is their inability to identify spatial details smaller than the grid cell size. While decades of work have gone into developing sub-grid treatments for clouds and land surface processes in climate models, the quantitative understanding of sub-grid processes and variability for aerosols and their precursors is much poorer. In this study, WRF-Chem is used to simulate the trace gases and aerosols over central Mexico during the 2006 MILAGRO field campaign, with multiple spatial resolutions and emission/terrain scenarios. Our analysis focuses on quantifying the sub-grid variability (SGV) of trace gases and aerosols within a typical global climate model grid cell, i.e. 75×75 km2.

    Our results suggest that a simulation with 3-km horizontal grid spacing adequately reproduces the overall transport and mixing of trace gases and aerosols downwind of Mexico City, while 75-km horizontal grid spacing is insufficient to represent local emission and terrain-induced flows along the mountain ridge, subsequently affecting the transport and mixing of plumes from nearby sources. Therefore, the coarse model grid cell average may not correctly represent aerosol properties measured over polluted areas. Probability density functions (PDFs for trace gases and aerosols show that secondary trace gases and aerosols, such as O3, sulfate, ammonium, and nitrate, are more likely to have a relatively uniform probability distribution (i.e. smaller SGV over a narrow range of concentration values. Mostly inert and long-lived trace gases and aerosols, such as CO and BC, are more likely to have broad and skewed distributions (i.e. larger SGV over polluted regions. Over remote areas, all trace gases and aerosols are more uniformly distributed compared to polluted areas. Both CO and O3 SGV vertical profiles are nearly constant within the PBL during daytime, indicating that trace gases
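The sub-grid variability diagnostic itself is straightforward to compute: aggregate a fine-resolution field into coarse cells and take the within-cell standard deviation. The block sizes below are arbitrary examples, not the study's grids.

```python
import numpy as np

def subgrid_variability(field, block):
    """Within-cell standard deviation of a fine 2-D field aggregated
    into coarse cells of shape (block, block)."""
    ny, nx = field.shape
    assert ny % block == 0 and nx % block == 0
    tiles = field.reshape(ny // block, block, nx // block, block)
    return tiles.std(axis=(1, 3))   # one SGV value per coarse cell

# A 3-km field inside 75-km cells would use block = 25; here block = 4
rng = np.random.default_rng(0)
smooth = np.ones((8, 8))                     # uniform tracer -> SGV = 0
patchy = rng.random((8, 8))                  # heterogeneous tracer
print(subgrid_variability(smooth, 4))        # all zeros
print(subgrid_variability(patchy, 4).shape)  # one value per coarse cell
```

Uniformly distributed tracers (the secondary species over remote areas) give small SGV values; patchy, source-dominated tracers like CO and BC give large ones.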

  8. A Framework for Reproducible Latent Fingerprint Enhancements.

    Science.gov (United States)

    Carasso, Alfred S

    2014-01-01

Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information, in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
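Histogram equalization, the fully reproducible core of the IDL enhancements described, can be sketched in a few lines. This is the textbook CDF remapping for 8-bit images, not NIST's IDL implementation.

```python
import numpy as np

def hist_equalize(img):
    """Map 8-bit gray levels through the image's own normalized CDF,
    spreading the intensity histogram over the full 0-255 range."""
    img = np.asarray(img, dtype=np.uint8)
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum() / img.size
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]

# A low-contrast ramp confined to [100, 140] stretches to span nearly 0-255
low = np.tile(np.linspace(100, 140, 64).astype(np.uint8), (64, 1))
eq = hist_equalize(low)
print(low.min(), low.max(), eq.min(), eq.max())
```

Because the mapping is fully determined by the input image, the same latent print always yields the same enhancement, which is exactly the reproducibility property at issue.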

  9. Modeling ion exchange in clinoptilolite using the EQ3/6 geochemical modeling code

    International Nuclear Information System (INIS)

    Viani, B.E.; Bruton, C.J.

    1992-06-01

    Assessing the suitability of Yucca Mtn., NV as a potential repository for high-level nuclear waste requires the means to simulate ion-exchange behavior of zeolites. Vanselow and Gapon convention cation-exchange models have been added to geochemical modeling codes EQ3NR/EQ6, allowing exchange to be modeled for up to three exchangers or a single exchanger with three independent sites. Solid-solution models that are numerically equivalent to the ion-exchange models were derived and also implemented in the code. The Gapon model is inconsistent with experimental adsorption isotherms of trace components in clinoptilolite. A one-site Vanselow model can describe adsorption of Cs or Sr on clinoptilolite, but a two-site Vanselow exchange model is necessary to describe K contents of natural clinoptilolites
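For a homovalent binary exchange of cations A+ and B+, the Vanselow convention relates the selectivity coefficient to exchanger-phase mole fractions and solution activities, Kv = (x_B * a_A) / (x_A * a_B) with x_A + x_B = 1. A sketch of evaluating the resulting isotherm (illustrative Kv value, ideal solution assumed, not the EQ3/6 implementation):

```python
def vanselow_xB(Kv, a_A, a_B):
    """Exchanger-phase mole fraction of B for homovalent A+/B+
    exchange under the Vanselow convention:
        Kv = (x_B * a_A) / (x_A * a_B),  x_A + x_B = 1
    =>  x_B = Kv*r / (1 + Kv*r),  r = a_B / a_A."""
    r = a_B / a_A
    return Kv * r / (1.0 + Kv * r)

# Illustrative: a strongly preferred trace cation (hypothetical Kv = 100)
xB_trace = vanselow_xB(100.0, a_A=0.1, a_B=1e-5)   # trace loading regime
xB_equal = vanselow_xB(100.0, a_A=0.1, a_B=0.1)    # equal activities
print(xB_trace, xB_equal)
```

In the trace regime the isotherm is nearly linear in a_B, which is why a one-site Vanselow model can describe trace Cs or Sr adsorption while multi-site behavior only appears at higher loadings.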

  10. Study of reproducibility of measurements with the spectrometer of Bonner multispheres

    International Nuclear Information System (INIS)

    Azevedo, G.A.; Pereira, W.W.; Patrao, K.C.S.; Fonseca, E.S.

    2013-01-01

This work aims to study the metrological behavior of the Bonner Multisphere Spectrometer (BMS) of the LN/LNMRI/IRD - Laboratorio Metrologia de Neutrons / Laboratorio Nacional de Metrologia e Radiacao Ionizante / Instituto de Radioprotecao e Dosimetria - for measurements under repeatability and reproducibility conditions. Initially, a simulation was performed by applying the Monte Carlo method, using the MCNP code and following ISO 8529-1 (2001), with sources of californium (252Cf), americium-beryllium (241AmBe), and californium in heavy water (Cf + D2O), all located at a distance of 100 cm from the neutron detector (a 6Li(Eu) crystal scintillator). In this simulation, the neutrons captured by the detector were counted. The source is located at the center of a sphere of radius 300 cm; the impact of neutrons on a point of the sphere wall, which in this case acts as the neutron detector, is analyzed, and from there the number of neutrons colliding with the whole sphere is estimated. The purpose is to obtain the neutron count for different energy bands in a solid field of neutrons, since the spectrum ranges from low to high energies and can also vary within a particular environment. New fields with different sources and moderator materials are intended for use as new reference fields. Measurements are being conducted for these fields with the aim of analyzing the variability conditions of the measurement (repeatability and reproducibility) in the LEN - Laboratorio de Espectrometria de Neutrons of the LN/LMNRI/IRD. Thus, the spectrometer will be used to improve both knowledge of the spectrum and the neutron standard of the laboratory, showing that spectrometry is essential for correct measurement.

  11. Development of a kinetics analysis code for fuel solution combined with thermal-hydraulics analysis code PHOENICS and analysis of natural-cooling characteristic test of TRACY. Contract research

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, Shouichi; Yamane, Yuichi; Miyoshi, Yoshinori [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-03-01

Since exact information is not always acquired in a criticality accident of fuel solution, parametric survey calculations are required for grasping the thermal-hydraulic behavior. On the other hand, practical calculation methods that can reduce the computation time with allowable accuracy are also required, since the conventional method takes a long calculation time. To fulfill these requirements, a two-dimensional (R-Z) nuclear-kinetics analysis code considering thermal-hydraulics, based on the multi-region kinetic equations with one-group neutron energy, was created by incorporation into the general-purpose thermal-hydraulics analysis code PHOENICS. The computation time of the code was shortened by separating the time mesh intervals of the nuclear and heat calculations from that of the hydraulics calculation, and by automatically regulating the time mesh intervals in proportion to the power change rate. A series of analyses was performed for the natural-cooling characteristic test using TRACY, in which the power changed slowly for 5 hours after the transient power resulting from a reactivity insertion of 0.5 dollars. It was found that the code system was able to calculate within a practical time, and showed good prospects of reproducing the experimental values for the power and temperature changes. (author)

  12. Neutron activation analysis of trace elements in foodstuffs

    International Nuclear Information System (INIS)

    Schelenz, R.; Bayat, I.; Fischer, E.

    1976-05-01

For the determination of trace elements in foodstuffs with the aid of neutron activation analysis, the separation of volatile radionuclides after digestion of the sample is of special interest for radiochemical processing. A distillation procedure was developed to give reproducible results; however, optimal conditions were not found for all the volatile radionuclides studied. The required selective separation of Br-82 from the distillate was best achieved by the application of an ion-exchange column-chromatography technique. The computer programs for the evaluation of complex gamma spectra have been developed further. The automatic peak search and peak area determination is based on a computer program using the correlation technique and is carried out with a mini-computer coupled to a multi-channel gamma spectrometer. The results, which are presented in 3 earlier reports relating to this research program, reveal the advantages and disadvantages of the individual steps of the radiochemical separation scheme. Before neutron activation analysis can be introduced on a routine basis, some aspects of the radiochemical process remain to be tested; these studies will be published in a fourth and final report. (orig.) [de
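The correlation-technique peak search mentioned can be sketched as cross-correlating the spectrum with a zero-area Gaussian kernel and flagging strong maxima of the response; the kernel width and threshold below are illustrative choices, not those of the original program.

```python
import numpy as np

def correlation_peak_search(spectrum, width=3.0, threshold=5.0):
    """Correlate the spectrum with a zero-area (mean-subtracted)
    Gaussian so a flat background cancels, then report channels with
    strong local maxima of the response.  Edge channels, where the
    zero padding of the correlation distorts the response, are skipped."""
    half = int(4 * width)
    x = np.arange(-half, half + 1)
    kernel = np.exp(-0.5 * (x / width) ** 2)
    kernel -= kernel.mean()                  # zero net area
    resp = np.correlate(spectrum, kernel, mode="same")
    return [i for i in range(half, len(resp) - half)
            if resp[i] > threshold
            and resp[i] > resp[i - 1] and resp[i] > resp[i + 1]]

# Synthetic spectrum: flat background plus one Gaussian peak at channel 120
chan = np.arange(256)
spec = 10.0 + 50.0 * np.exp(-0.5 * ((chan - 120) / 3.0) ** 2)
print(correlation_peak_search(spec))
```

Because the kernel has zero net area, any flat or slowly varying background correlates to approximately zero, so only peak-shaped structures produce a strong response.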

  13. On the Impact of Zero-padding in Network Coding Efficiency with Internet Traffic and Video Traces

    DEFF Research Database (Denmark)

    Taghouti, Maroua; Roetter, Daniel Enrique Lucani; Pedersen, Morten Videbæk

    2016-01-01

    Random Linear Network Coding (RLNC) theoretical results typically assume that packets have equal sizes while in reality, data traffic presents a random packet size distribution. Conventional wisdom considers zero-padding of original packets as a viable alternative, but its effect can reduce the e...
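The padding inefficiency at issue is easy to quantify: with RLNC, every packet in a generation is zero-padded to the length of the longest one, so the overhead depends entirely on the packet-size distribution. The bimodal size mix below is an illustrative internet-like example, not data from the paper's traces.

```python
import random

def padding_overhead(sizes):
    """Fraction of transmitted payload bytes that are zero padding
    when every packet is padded to the generation's maximum size."""
    padded = len(sizes) * max(sizes)
    return 1.0 - sum(sizes) / padded

random.seed(0)
# Bimodal mix typical of internet traces: many small ACK-sized packets
# mixed with full-MTU packets
sizes = [random.choice([40, 1500]) for _ in range(64)]
print(round(padding_overhead(sizes), 3))
```

A single full-MTU packet in a generation of small packets forces padding on all of them, which is why heavy-tailed or bimodal size distributions make zero-padding so costly.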

  14. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to super computers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast or faster than mainframe computers in compute-bound situations.

  15. User Manual for the NASA Glenn Ice Accretion Code LEWICE: Version 2.0

    Science.gov (United States)

    Wright, William B.

    1999-01-01

A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 2.0 of this code, which is called LEWICE. This version differs from previous releases due to its robustness and its ability to reproduce results accurately for different spacing and time step criteria across computing platforms. It also differs in the extensive effort undertaken to compare the results against the database of ice shapes which have been generated in the NASA Glenn Icing Research Tunnel (IRT). This report will only describe the features of the code related to the use of the program. The report will not describe the inner working of the code or the physical models used. This information is available in the form of several unpublished documents which will be collectively referred to as a Programmers Manual for LEWICE in this report. These reports are intended as an update/replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this code.

  16. Pressure vessel SBLOCA simulation with trace: application to ISTF (Rosa V) - 151

    International Nuclear Information System (INIS)

    Abella, V.; Gallardo, S.; Verdu, G.

    2010-01-01

    In this work, an overview is provided of the results obtained in the simulation of an Upper Head Small Break Loss-Of-Coolant Accident (SBLOCA) under the assumption of total failure of the High Pressure Injection System (HPIS) in the Large Scale Test Facility (LSTF). In previous works, an SBLOCA located in the Pressure Vessel (PV) lower plenum was simulated with TRACE. In that case, an asymmetrical steam generator secondary-side depressurization was produced as an accident management action at the steam generator in the loop without pressurizer, after generation of the safety injection signal, to achieve a determined depressurization rate in the primary system. The new SBLOCA scenario has been simulated and the results compared with experimental values, with the purpose of completing the analysis of PV SBLOCAs. This study is developed in the frame of the OECD/NEA ROSA Project Test 6-1 (SB-PV-9 in JAEA). Finally, the present paper represents a contribution to the study of the safety analysis of vessel SBLOCAs and to the assessment of the predictability of thermal-hydraulic codes like TRACE. (authors)

  17. Introducing GAMER: A Fast and Accurate Method for Ray-tracing Galaxies Using Procedural Noise

    Science.gov (United States)

    Groeneboom, N. E.; Dahle, H.

    2014-03-01

    We developed a novel approach for fast and accurate ray-tracing of galaxies using procedural noise fields. Our method allows for efficient and realistic rendering of synthetic galaxy morphologies, where individual components such as the bulge, disk, stars, and dust can be synthesized in different wavelengths. These components follow empirically motivated overall intensity profiles but contain an additional procedural noise component that gives rise to complex natural patterns that mimic interstellar dust and star-forming regions. These patterns produce more realistic-looking galaxy images than using analytical expressions alone. The method is fully parallelized and creates accurate high- and low-resolution images that can be used, for example, in codes simulating strong and weak gravitational lensing. In addition to having a user-friendly graphical user interface, the C++ software package GAMER is easy to implement into an existing code.
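    The core idea, an empirically motivated intensity profile modulated by a procedural noise field, can be sketched as below. The exponential disk profile, the minimal value-noise implementation and all parameters are illustrative assumptions, not GAMER's actual code:

```python
# Sketch of procedural-noise galaxy rendering: a smooth radial profile
# (here an exponential disk) multiplied by a 2D value-noise perturbation.
import math, random

def value_noise(x, y, seed=0):
    """Bilinear interpolation of hashed lattice values: minimal 2D value noise."""
    def lattice(ix, iy):
        rnd = random.Random((ix * 73856093) ^ (iy * 19349663) ^ seed)
        return rnd.random()
    ix, iy = math.floor(x), math.floor(y)
    fx, fy = x - ix, y - iy
    v00, v10 = lattice(ix, iy), lattice(ix + 1, iy)
    v01, v11 = lattice(ix, iy + 1), lattice(ix + 1, iy + 1)
    top = v00 + (v10 - v00) * fx
    bot = v01 + (v11 - v01) * fx
    return top + (bot - top) * fy

def disk_intensity(x, y, scale_length=1.0, noise_amp=0.4):
    """Exponential disk profile with a multiplicative noise perturbation."""
    r = math.hypot(x, y)
    profile = math.exp(-r / scale_length)
    noise = 1.0 + noise_amp * (value_noise(3 * x, 3 * y) - 0.5)
    return profile * noise
```

    Sampling `disk_intensity` over a pixel grid gives a smooth disk broken up by noise-driven mottling, the effect the abstract describes for dust and star-forming regions.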

  18. Introducing GAMER: A fast and accurate method for ray-tracing galaxies using procedural noise

    International Nuclear Information System (INIS)

    Groeneboom, N. E.; Dahle, H.

    2014-01-01

    We developed a novel approach for fast and accurate ray-tracing of galaxies using procedural noise fields. Our method allows for efficient and realistic rendering of synthetic galaxy morphologies, where individual components such as the bulge, disk, stars, and dust can be synthesized in different wavelengths. These components follow empirically motivated overall intensity profiles but contain an additional procedural noise component that gives rise to complex natural patterns that mimic interstellar dust and star-forming regions. These patterns produce more realistic-looking galaxy images than using analytical expressions alone. The method is fully parallelized and creates accurate high- and low-resolution images that can be used, for example, in codes simulating strong and weak gravitational lensing. In addition to having a user-friendly graphical user interface, the C++ software package GAMER is easy to implement into an existing code.

  19. Introducing GAMER: A fast and accurate method for ray-tracing galaxies using procedural noise

    Energy Technology Data Exchange (ETDEWEB)

    Groeneboom, N. E.; Dahle, H., E-mail: nicolaag@astro.uio.no [Institute of Theoretical Astrophysics, University of Oslo, P.O. Box 1029 Blindern, N-0315 Oslo (Norway)

    2014-03-10

    We developed a novel approach for fast and accurate ray-tracing of galaxies using procedural noise fields. Our method allows for efficient and realistic rendering of synthetic galaxy morphologies, where individual components such as the bulge, disk, stars, and dust can be synthesized in different wavelengths. These components follow empirically motivated overall intensity profiles but contain an additional procedural noise component that gives rise to complex natural patterns that mimic interstellar dust and star-forming regions. These patterns produce more realistic-looking galaxy images than using analytical expressions alone. The method is fully parallelized and creates accurate high- and low-resolution images that can be used, for example, in codes simulating strong and weak gravitational lensing. In addition to having a user-friendly graphical user interface, the C++ software package GAMER is easy to implement into an existing code.

  20. Anisotropic ray trace

    Science.gov (United States)

    Lam, Wai Sze Tiffany

    Optical components made of anisotropic materials, such as crystal polarizers and crystal waveplates, are widely used in many complex optical systems, such as display systems, microlithography and biomedical imaging, and induce more complex aberrations than optical components made of isotropic materials. The goal of this dissertation is to accurately simulate the performance of optical systems with anisotropic materials using polarization ray tracing. This work extends the polarization ray tracing calculus to incorporate ray tracing through anisotropic materials, including uniaxial, biaxial and optically active materials. The 3D polarization ray tracing calculus is an invaluable tool for analyzing the polarization properties of an optical system. The 3x3 polarization ray tracing P matrix developed for the anisotropic ray trace assists in tracking the 3D polarization transformations along a ray path through a series of surfaces in an optical system. To better represent anisotropic light-matter interactions, the definition of the P matrix is generalized to incorporate not only the polarization change at a refraction/reflection interface, but also the optical phase accumulated as light propagates through the anisotropic medium. This enables realistic modeling of crystalline polarization elements, such as crystal waveplates and crystal polarizers. The wavefront and polarization aberrations of these anisotropic components are more complex than those of isotropic optical components and can be evaluated from the resultant P matrix for each eigen-wavefront as well as for the overall image. One incident ray refracting or reflecting into an anisotropic medium produces two eigenpolarizations or eigenmodes propagating in different directions. The ray parameters of these modes necessary for the anisotropic ray trace are described in Chapter 2, and the algorithms to calculate the P matrix from these ray parameters are described in Chapter 3.
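    The bookkeeping behind the P matrix calculus is an ordered matrix product along the ray path. The sketch below shows only that ordering convention; the example matrices are illustrative stand-ins, not values from the dissertation:

```python
# Cumulative polarization effect along a ray path as the ordered product
# of per-surface 3x3 P matrices (later surfaces multiply on the left).

def matmul3(a, b):
    """Product of two 3x3 matrices stored as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def cumulative_p(p_matrices):
    """P_total = P_n ... P_2 P_1 for surfaces encountered in order."""
    total = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    for p in p_matrices:
        total = matmul3(p, total)
    return total
```

    Multiplying each new surface's P matrix on the left preserves the order in which the ray encounters the surfaces.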

  1. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes.

  2. Trace element profiles in modern horse molar enamel as tracers of seasonality: Evidence from micro-XRF, LA-ICP-MS and stable isotope analysis

    Science.gov (United States)

    de Winter, Niels; Goderis, Steven; van Malderen, Stijn; Vanhaecke, Frank; Claeys, Philippe

    2016-04-01

    A combination of laboratory micro-X-ray Fluorescence (μXRF) and stable carbon and oxygen isotope analysis shows that trace element profiles from modern horse molars reveal a seasonal pattern that co-varies with seasonality in the oxygen isotope records of enamel carbonate from the same teeth. A combination of six cheek teeth (premolars and molars) from the same individual yields a seasonal isotope and trace element record of approximately three years recorded during the growth of the molars. This record shows that reproducible measurements of various trace element ratios (e.g., Sr/Ca, Zn/Ca, Fe/Ca, K/Ca and S/Ca) lag the seasonal pattern in oxygen isotope records by 2-3 months. Laser Ablation-ICP-Mass Spectrometry (LA-ICP-MS) analysis on a cross-section of the first molar of the same individual is compared to the bench-top tube-excitation μXRF results to test the robustness of the measurements and to compare both methods. Furthermore, trace element (e.g. Sr, Zn, Mg and Ba) profiles perpendicular to the growth direction of the same tooth, as well as profiles parallel to the growth direction, are measured with LA-ICP-MS and μXRF to study the internal distribution of trace element ratios in two dimensions. Results of this extensive complementary line-scanning procedure show the robustness of state-of-the-art laboratory micro-XRF scanning for the measurement of trace elements in bioapatite. The comparison highlights the advantages and disadvantages of both methods for trace element analysis and illustrates their complementarity. Results on internal variation within the teeth shed light on the origins of trace elements in mammal teeth and their potential use for paleo-environmental reconstruction.
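    A lag of the kind reported (trace element ratios trailing the oxygen isotope record by 2-3 months) could in principle be quantified by a cross-correlation shift search, sketched below. The sinusoidal monthly records are synthetic stand-ins for the measured profiles:

```python
# Sketch: find the sample shift that maximizes the Pearson correlation
# between two seasonal records. The data are synthetic, not study data.
import math

def best_lag(a, b, max_lag):
    """Return the shift of a relative to b (in samples) that best aligns them."""
    def corr(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        den = math.sqrt(sum((xi - mx) ** 2 for xi in x) *
                        sum((yi - my) ** 2 for yi in y))
        return num / den
    scores = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            x, y = a[lag:], b[:len(b) - lag]
        else:
            x, y = a[:len(a) + lag], b[-lag:]
        scores[lag] = corr(x, y)
    return max(scores, key=scores.get)

# Monthly samples over ~3 years; the trace record lags the isotopes by 2 months.
months = range(36)
isotope = [math.sin(2 * math.pi * m / 12) for m in months]
trace = [math.sin(2 * math.pi * (m - 2) / 12) for m in months]
```

    Here `best_lag(trace, isotope, 6)` recovers the two-month offset built into the synthetic series.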

  3. Validation and reproducibility of an Australian caffeine food frequency questionnaire.

    Science.gov (United States)

    Watson, E J; Kohler, M; Banks, S; Coates, A M

    2017-08-01

    The aim of this study was to measure the validity and reproducibility of a caffeine food frequency questionnaire (C-FFQ) developed for the Australian population. The C-FFQ was designed to assess average daily caffeine consumption using four categories of food and beverages: energy drinks; soft drinks/soda; coffee and tea; and chocolate (food and drink). Participants completed a seven-day food diary immediately followed by the C-FFQ on two consecutive days. The questionnaire was first piloted in 20 adults, and then a validity/reproducibility study was conducted (n = 90 adults). The C-FFQ showed moderate correlations (r = .60), fair agreement (mean difference 63 mg) and reasonable quintile rankings, indicating fair to moderate agreement with the seven-day food diary. To test reproducibility, the C-FFQ was compared to itself and showed strong correlations (r = .90), good quintile rankings and strong kappa values (κ = 0.65), indicating strong reproducibility. The C-FFQ shows adequate validity and reproducibility and will aid researchers in Australia to quantify caffeine consumption.
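    The two agreement measures quoted, a correlation and a mean difference (bias), are straightforward to compute from paired intakes. The values below are invented examples, not study data:

```python
# Sketch: Pearson correlation and mean difference between questionnaire
# and diary caffeine intakes. All intakes are hypothetical.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

def mean_difference(x, y):
    """Average of (questionnaire - diary): the bias in mg/day."""
    return sum(a - b for a, b in zip(x, y)) / len(x)

# Hypothetical daily caffeine intakes (mg/day): C-FFQ vs seven-day diary.
ffq =   [120.0, 250.0, 80.0, 310.0, 190.0]
diary = [100.0, 180.0, 95.0, 240.0, 150.0]
```

    A positive mean difference indicates the questionnaire overestimates intake relative to the diary, as with the 63 mg figure reported.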

  4. QCA Gray Code Converter Circuits Using LTEx Methodology

    Science.gov (United States)

    Mukherjee, Chiradeep; Panda, Saradindu; Mukhopadhyay, Asish Kumar; Maji, Bansibadan

    2018-04-01

    Quantum-dot Cellular Automata (QCA) is a prominent nanotechnology paradigm considered for continuing computation in the deep sub-micron regime. QCA realizations of several multilevel arithmetic logic unit circuits have been introduced in recent years. However, although high fan-in Binary-to-Gray (B2G) and Gray-to-Binary (G2B) converters exist in processor-based architectures, little attention has been paid to the QCA instantiation of the Gray code converters, which are anticipated to be used in 8-bit, 16-bit, 32-bit or even wider addressable machines with Gray code addressing schemes. In this work, the two-input Layered T module is presented to exploit the operation of an Exclusive-OR gate (namely the LTEx module) as an elemental block. A defect-tolerance analysis of the two-input LTEx module has been carried out to establish the scalability and reproducibility of the LTEx module in complex circuits. Novel formulations exploiting the operability of the LTEx module are proposed to instantiate area- and delay-efficient B2G and G2B converters which can be used in Gray code addressing schemes. Moreover, this work formulates QCA design metrics such as O-Cost, effective area, delay and Cost α for the n-bit converter layouts.
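    Independent of the QCA layout, the logical function a B2G/G2B converter pair realizes is the standard reflected-binary relation, which can be stated compactly:

```python
# Reflected-binary (Gray) code conversion: B2G XORs each bit with its
# higher neighbor; G2B undoes it with a running XOR over shifted copies.

def binary_to_gray(b: int) -> int:
    """Reflected-binary Gray code of a non-negative integer."""
    return b ^ (b >> 1)

def gray_to_binary(g: int) -> int:
    """Inverse conversion via a running XOR over the shifted code word."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b
```

    For example, `binary_to_gray(5)` is `7`, and the Gray sequence for 0..7 is 0, 1, 3, 2, 6, 7, 5, 4, with each step flipping exactly one bit, the property that makes Gray code addressing attractive.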

  5. Effect of high image compression on the reproducibility of cardiac Sestamibi reporting

    International Nuclear Information System (INIS)

    Thomas, P.; Allen, L.; Beuzeville, S.

    1999-01-01

    Full text: Compression algorithms have been mooted to minimize the storage space and transmission times of digital images. We assessed the impact of high-level lossy compression using JPEG and wavelet algorithms on the image quality and reporting accuracy of cardiac Sestamibi studies. Twenty stress/rest Sestamibi cardiac perfusion studies were reconstructed into horizontal short, vertical long and horizontal long axis slices using conventional methods. Each of these six sets of slices was aligned for reporting and saved (uncompressed) as a bitmap. This bitmap was then compressed using JPEG compression, then decompressed and saved as a bitmap for later viewing. This process was repeated using the original bitmap and wavelet compression. Finally, a second copy of the original bitmap was made. All 80 bitmaps were randomly coded to ensure blind reporting. The bitmaps were read blinded, by consensus of two experienced nuclear medicine physicians, using a 5-point scale and 25 cardiac segments. Subjective image quality was also reported using a 3-point scale. Samples of the compressed images were also subtracted from the original bitmap for visual comparison of differences. Results showed an average compression ratio of 23:1 for wavelet and 13:1 for JPEG. Image subtraction showed only very minor discordance between the original and compressed images. There was no significant difference in subjective quality between the compressed and uncompressed images, and no significant difference in the reporting reproducibility of the identical bitmap copy, the JPEG image or the wavelet image compared with the original bitmap. Use of the high compression algorithms described had no significant impact on the reporting reproducibility and subjective image quality of cardiac Sestamibi perfusion studies.

  6. Reproducibility of (n,γ) gamma ray spectrum in Pb under different ENDF/B releases

    Energy Technology Data Exchange (ETDEWEB)

    Kebwaro, J.M., E-mail: jeremiahkebwaro@gmail.com [Department of Physical Sciences, Karatina University, P.O. Box 1957-10101, Karatina (Kenya); He, C.H.; Zhao, Y.L. [School of Nuclear Science and Technology, Xian Jiaotong University, Xian, Shaanxi 710049 (China)

    2016-04-15

    Radiative capture reactions are of interest in shielding design and other fundamental research. In this study, the reproducibility of (n,γ) reactions in Pb when cross-section data from different ENDF/B releases are used in the Monte-Carlo code MCNP was investigated. Pb was selected for this study because it is widely used in shielding applications where capture reactions are likely to occur. Four different neutron spectra were declared as source in the MCNP model, which consisted of a simple spherical geometry. The gamma ray spectra due to the capture reactions were recorded at 10 cm from the center of the sphere. The results reveal that the gamma ray spectrum produced by ENDF/B-V is in reasonable agreement with that produced when ENDF/B-VI.6 is used. However, the spectrum produced by ENDF/B-VII does not reveal any primary gamma rays in the higher energy region (E > 3 MeV). It is further observed that the intensities of the capture gamma rays produced when the various releases are used differ by some margin, showing that the results are not reproducible. The generated spectra also vary with the spectrum of the source neutrons. The discrepancies observed among the various ENDF/B releases could raise concerns for end users and need to be addressed properly during benchmarking calculations before the next release. The conversion from ENDF to the ACE format that is supplied with MCNP should also be examined, because errors might have arisen during the evaluation.

  7. Ray-tracing analysis of electron-cyclotron-resonance heating in straight stellarators

    International Nuclear Information System (INIS)

    Kato, K.

    1983-05-01

    A ray-tracing computer code is developed and implemented to simulate electron cyclotron resonance heating (ECRH) in stellarators. A straight stellarator model is developed to simulate the confinement geometry. Following a review of ECRH, a cold plasma model is used to define the dispersion relation. To calculate the wave power deposition, a finite temperature damping approximation is used. 3-D ray equations in cylindrical coordinates are derived and put into suitable forms for computation. The three computer codes, MAC, HERA, and GROUT, developed for this research, are described next. ECRH simulation is then carried out for three models including Heliotron E and Wendelstein VII A. Investigated aspects include launching position and mode scan, frequency detuning, helical effects, start-up, and toroidal effects. Results indicate: (1) an elliptical waveguide radiation pattern, with its long axis oriented half-way between the toroidal axis and the saddle point line, is more efficient than a circular one; and (2) mid-plane, high field side launch is favored for both O- and X-waves
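    One quantity any ECRH simulation of this kind must evaluate is the local electron cyclotron frequency, which fixes where a launched wave meets the fundamental resonance. A minimal sketch, restricted to the fundamental harmonic and SI units (not the code's actual dispersion machinery):

```python
# Electron cyclotron resonance condition: f_wave = f_ce = eB / (2 pi m_e).
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C (CODATA)
E_MASS = 9.1093837015e-31    # electron mass, kg (CODATA)

def cyclotron_freq_hz(b_tesla):
    """Fundamental electron cyclotron frequency f_ce for field B."""
    return E_CHARGE * b_tesla / (2 * math.pi * E_MASS)

def resonant_field(f_wave_hz):
    """Magnetic field strength at which a wave of frequency f meets f_ce."""
    return 2 * math.pi * E_MASS * f_wave_hz / E_CHARGE
```

    At B = 1 T the fundamental resonance sits near 28 GHz; in a ray-tracing code the resonance surface is located by evaluating this condition along the computed magnetic field of the confinement geometry.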

  8. Derivatization reaction-based surface-enhanced Raman scattering (SERS) for detection of trace acetone.

    Science.gov (United States)

    Zheng, Ying; Chen, Zhuo; Zheng, Chengbin; Lee, Yong-Ill; Hou, Xiandeng; Wu, Li; Tian, Yunfei

    2016-08-01

    A facile method was developed for determination of trace volatile acetone by coupling a derivatization reaction to surface-enhanced Raman scattering (SERS). With iodide modified Ag nanoparticles (Ag IMNPs) as the SERS substrate, acetone without obvious Raman signal could be converted to SERS-sensitive species via a chemical derivatization reaction with 2,4-dinitrophenylhydrazine (2,4-DNPH). In addition, acetone can be effectively separated from liquid phase with a purge-sampling device and then any serious interference from sample matrices can be significantly reduced. The optimal conditions for the derivatization reaction and the SERS analysis were investigated in detail, and the selectivity and reproducibility of this method were also evaluated. Under the optimal conditions, the limit of detection (LOD) for acetone was 5 mg L(-1) or 0.09 mM (3σ). The relative standard deviation (RSD) for 80 mg L(-1) acetone (n = 9) was 1.7%. This method was successfully used for the determination of acetone in artificial urine and human urine samples with spiked recoveries ranging from 92% to 110%. The present method is convenient, sensitive, selective, reliable and suitable for analysis of trace acetone, and it could have a promising clinical application in early diabetes diagnosis. Copyright © 2016 Elsevier B.V. All rights reserved.
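    The quoted 3σ limit of detection follows the usual definition, LOD = 3 × (standard deviation of the blank signal) / (calibration slope). The blank intensities and slope below are invented for illustration, not the study's calibration:

```python
# Sketch: 3-sigma limit of detection from blank replicates and a
# calibration slope. All numbers are hypothetical.
import math

def lod_3sigma(blank_signals, slope):
    """LOD = 3 * sd(blank) / slope, in the concentration units of the slope."""
    n = len(blank_signals)
    mean = sum(blank_signals) / n
    sd = math.sqrt(sum((s - mean) ** 2 for s in blank_signals) / (n - 1))
    return 3 * sd / slope

# Hypothetical: ten blank intensities (counts) and a slope in counts per mg/L.
blanks = [101.0, 99.5, 100.8, 98.9, 100.2, 99.7, 101.3, 100.1, 99.4, 100.6]
slope = 0.45
```

    The same recipe, applied to the study's measured blanks and calibration curve, yields the reported 5 mg L(-1) figure's form, if not its value.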

  9. Reproducibility of temporomandibular joint tomography. Influence of shifted X-ray beam and tomographic focal plane on reproducibility

    International Nuclear Information System (INIS)

    Saito, Masashi

    1999-01-01

    A proper tomographic focal plane and x-ray beam direction are the most important factors in obtaining accurate images of the temporomandibular joint (TMJ). In this study, to clarify the magnitude of the effect of these two factors on image quality, we evaluated the reproducibility of tomograms by measuring the distortion when the x-ray beam was shifted from the correct center of the object. The effects of deviation of the tomographic focal plane on image quality were evaluated by the MTF (Modulation Transfer Function). Two types of tomograms, the plane type and the rotational type, were used in this study. A TMJ model was made from Teflon for the purpose of evaluation by shifting the x-ray beam. The x-ray images were obtained by tilting the model from 0 to 10 degrees in 2-degree increments. These x-ray images were processed for computer image analysis, and then the distance between the condyle and the joint space was measured. To evaluate the influence of a shifted tomographic focal plane on image sharpness, the x-ray images from each setting were analyzed by the MTF. To obtain the MTF, a 'knife-edge' made from Pb was used. The images were scanned with a microdensitometer at the central focal plane and at 0, 0.5 and 1 mm away, respectively. The density curves were analyzed by Fourier analysis and the MTF was calculated. The reproducibility of the images worsened as the x-ray beam was shifted; this tendency was similar for both tomogram types. Object characteristics such as the anterior and posterior portions of the joint space affected the deterioration of the reproducibility of the tomography. Deviation of the tomographic focal plane also decreased the reproducibility of the x-ray images. The rotational type showed a better MTF, but it deteriorated seriously with slight changes of the tomographic focal plane. Conversely, the plane type showed a lower MTF, but the image was stable under shifting of the tomographic focal plane. (author)
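    The knife-edge MTF computation described follows a standard chain: the edge scan gives an edge spread function (ESF), its derivative gives the line spread function (LSF), and the Fourier magnitude of the LSF, normalized at zero frequency, is the MTF. A sketch, with a smoothed synthetic edge standing in for the microdensitometer scan:

```python
# Sketch: MTF from a knife-edge scan (ESF -> LSF -> normalized |DFT|).
import cmath, math

def mtf_from_esf(esf):
    # Line spread function: finite difference of the edge scan.
    lsf = [esf[i + 1] - esf[i] for i in range(len(esf) - 1)]
    n = len(lsf)
    def dft_mag(k):
        return abs(sum(lsf[j] * cmath.exp(-2j * math.pi * k * j / n)
                       for j in range(n)))
    dc = dft_mag(0)
    # Normalize so MTF(0) = 1; keep frequencies up to Nyquist.
    return [dft_mag(k) / dc for k in range(n // 2)]

# Synthetic ESF: an error-function edge blurred over a few samples.
esf = [0.5 * (1 + math.erf((i - 32) / 4.0)) for i in range(64)]
mtf = mtf_from_esf(esf)
```

    A broader blur in the ESF produces a faster roll-off in the MTF, which is exactly the degradation the study measures as the focal plane deviates.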

  10. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2 s, 6 s and 18 s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies: multiple-coding and single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5 s, the geometric mean of 2 s and 6 s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.

  11. Interactive Stable Ray Tracing

    DEFF Research Database (Denmark)

    Dal Corso, Alessandro; Salvi, Marco; Kolb, Craig

    2017-01-01

    Interactive ray tracing applications running on commodity hardware can suffer from objectionable temporal artifacts due to a low sample count. We introduce stable ray tracing, a technique that improves temporal stability without the over-blurring and ghosting artifacts typical of temporal post-pr...

  12. Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flow rate in parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flow-rate conditions, constant or varying in time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code, and has as its complement FLID, a one-channel, two-dimensional code. (authors)
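    One balance a multi-channel code of this kind must enforce is that parallel channels share a common pressure drop. Under a simple quadratic friction law, ΔP = k·m², assumed here purely for illustration (not CACTUS's actual correlations), the flow split has a closed form:

```python
# Sketch: flow split among parallel channels with equal pressure drop,
# assuming dP = k_i * m_i**2 per channel and a fixed total mass flow.
# Then m_i is proportional to 1/sqrt(k_i).
import math

def split_flow(total_flow, k_coeffs):
    """Per-channel flow rates giving equal pressure drop across channels."""
    inv = [1 / math.sqrt(k) for k in k_coeffs]
    s = sum(inv)
    return [total_flow * w / s for w in inv]
```

    For instance, with a total flow of 10 kg/s and resistance coefficients [1, 4], the low-resistance channel carries twice the flow of the other, and both see the same pressure drop.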

  13. Dysplastic naevus: histological criteria and their inter-observer reproducibility.

    Science.gov (United States)

    Hastrup, N; Clemmensen, O J; Spaun, E; Søndergaard, K

    1994-06-01

    Forty melanocytic lesions were examined in a pilot study, which was followed by a final series of 100 consecutive melanocytic lesions, in order to evaluate the inter-observer reproducibility of the histological criteria proposed for the dysplastic naevus. The specimens were examined in a blind fashion by four observers. Analysis by kappa statistics showed poor reproducibility of nuclear features, while reproducibility of architectural features was acceptable, improving in the final series. Consequently, we cannot apply the combined criteria of cytological and architectural features with any confidence in the diagnosis of dysplastic naevus, and, until further studies have documented that architectural criteria alone will suffice in the diagnosis of dysplastic naevus, we, as pathologists, shall avoid this term.
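    The kappa statistic used for the inter-observer analysis compares observed agreement with the agreement expected by chance. A minimal two-observer sketch; the paired yes/no ratings are invented examples, not study data:

```python
# Sketch: Cohen's kappa for two observers rating the same lesions.
# kappa = (p_observed - p_expected) / (1 - p_expected).

def cohens_kappa(ratings_a, ratings_b):
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    p_exp = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)
```

    Perfect agreement gives κ = 1 and chance-level agreement gives κ = 0, which is why the study reads low κ values for nuclear features as poor reproducibility.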

  14. Reproducible and controllable induction voltage adder for scaled beam experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko [Department of Energy Sciences, Tokyo Institute of Technology, 4259 Nagatsuta, Midori-ku, Yokohama 226-8502 (Japan)

    2016-08-15

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments.

  15. Manual tracing versus smartphone application (app) tracing: a comparative study.

    Science.gov (United States)

    Sayar, Gülşilay; Kilinc, Delal Dara

    2017-11-01

    This study aimed to compare the results of conventional manual cephalometric tracing with those acquired by smartphone application cephalometric tracing. The cephalometric radiographs of 55 patients (25 females and 30 males) were traced via the manual and app methods and were subsequently examined with Steiner's analysis. Five skeletal measurements, five dental measurements and two soft tissue measurements were made based on 21 landmarks. The time required by each method was also compared. SNA (Sella, Nasion, A point angle) and SNB (Sella, Nasion, B point angle) values for the manual method were statistically lower (p < .001) than those for the app method. The ANB value for the manual method was statistically lower than that for the app method. L1-NB (°) and upper lip protrusion values for the manual method were statistically higher than those for the app method. Go-GN/SN, U1-NA (°) and U1-NA (mm) values for the manual method were statistically lower than those for the app method. No differences between the two methods were found in the L1-NB (mm), occlusal plane to SN, interincisal angle or lower lip protrusion values. Although statistically significant differences were found between the two methods, cephalometric tracing proceeded faster with the app method than with the manual method.
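    Paired comparisons of the kind reported (e.g., manual SNA values lower than app values) reduce to summarizing per-patient differences. The angle values below are invented, not study data:

```python
# Sketch: mean and standard deviation of per-patient (manual - app)
# differences, the quantities a paired test is built on.
import math

def paired_summary(manual, app):
    diffs = [m - a for m, a in zip(manual, app)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean, sd

# Hypothetical SNA angles (degrees) for five patients, both methods.
manual_sna = [80.5, 82.0, 79.0, 81.5, 83.0]
app_sna = [81.0, 82.6, 79.8, 82.0, 83.9]
```

    A consistently negative mean difference, relative to its standard deviation, is what drives the reported "manual lower than app" findings.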

  16. Trace detection of tetrahydrocannabinol (THC) with a SERS-based capillary platform prepared by the in situ microwave synthesis of AgNPs.

    Science.gov (United States)

    Yüksel, Sezin; Schwenke, Almut M; Soliveri, Guido; Ardizzone, Silvia; Weber, Karina; Cialla-May, Dana; Hoeppener, Stephanie; Schubert, Ulrich S; Popp, Jürgen

    2016-10-05

    In the present study, an ultra-sensitive and highly reproducible novel SERS-based capillary platform was developed and utilized for the trace detection of tetrahydrocannabinol (THC). The approach combines the advantages of microwave-assisted nanoparticle synthesis, plasmonics and capillary forces. By employing a microwave-assisted preparation method, glass capillaries were reproducibly coated with silver nanoparticles in a batch fabrication process that required a processing time of 3 min without needing to use any pre-surface modifications or add surfactants. The coated capillaries exhibited an excellent SERS activity with a high reproducibility and enabled the detection of low concentrations of target molecules. At the same time, only a small amount of analyte and a short and simple incubation process was required. The developed platform was applied to the spectroscopic characterization of tetrahydrocannabinol (THC) and its identification at concentration levels down to 1 nM. Thus, a highly efficient detection system for practical applications, e.g., in drug monitoring/detection, is introduced, which can be fabricated at low cost by using microwave-assisted batch synthesis techniques. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Validation of the coupled TRACE/PARCS model of KKL using plant data from the 2001 turbine trip event; Validacion del modelo acoplado TRACE/PARCS de KKL por medio de los datos de planta del evento disparo de turbina 2001

    Energy Technology Data Exchange (ETDEWEB)

    Hidalga, P.; Sekhri, A.; Baumann, P.; Miro, R.; Barrachina, T.; Morera, D.; Verdu, G.

    2014-07-01

    In order to improve the modeling of the Leibstadt nuclear power plant (KKL), the 3D-mesh neutronic code PARCS and the thermal-hydraulic code TRACE have been used. This work is part of the development of a multi-scale, multi-physics methodology allowing the analysis of reactor transients with the available simulation tools. To check the validity of the model, a complex transient of the turbine-trip type has been simulated, based on the actual plant event, and the results have been compared. At the same time, the effective cross sections corresponding to the cycle in which the event occurred have been generated using the SIMTAB methodology, developed by ISIRYM jointly with IBERINCO. The TRACE model is based on the pre-existing TRAC-BF1 model. The comparison with plant data shows good agreement with the simulation results. As a result, it can be concluded that the KKL coupled TRACE/PARCS model, with effective cross-section generation by SIMTAB, provides a satisfactory analysis of a complex turbine-trip transient. (Author)

  18. International standard problem (ISP) no. 41 follow up exercise: Containment iodine computer code exercise: parametric studies

    Energy Technology Data Exchange (ETDEWEB)

    Ball, J.; Glowa, G.; Wren, J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Ewig, F. [GRS Koln (Germany); Dickenson, S. [AEAT, (United Kingdom); Billarand, Y.; Cantrel, L. [IPSN (France); Rydl, A. [NRIR (Czech Republic); Royen, J. [OECD/NEA (France)

    2001-11-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I{sup -} concentration. The codes used in this exercise were IODE(IPSN), IODE(NRIR), IMPAIR(GRS), INSPECT(AEAT), IMOD(AECL) and LIRIC(AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN (IPSN, France) experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (author)

  19. International standard problem (ISP) no. 41 follow up exercise: Containment iodine computer code exercise: parametric studies

    International Nuclear Information System (INIS)

    Ball, J.; Glowa, G.; Wren, J.; Ewig, F.; Dickenson, S.; Billarand, Y.; Cantrel, L.; Rydl, A.; Royen, J.

    2001-11-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I- concentration. The codes used in this exercise were IODE(IPSN), IODE(NRIR), IMPAIR(GRS), INSPECT(AEAT), IMOD(AECL) and LIRIC(AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN (IPSN, France) experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (author)

  20. GeoTrust Hub: A Platform For Sharing And Reproducing Geoscience Applications

    Science.gov (United States)

    Malik, T.; Tarboton, D. G.; Goodall, J. L.; Choi, E.; Bhatt, A.; Peckham, S. D.; Foster, I.; Ton That, D. H.; Essawy, B.; Yuan, Z.; Dash, P. K.; Fils, G.; Gan, T.; Fadugba, O. I.; Saxena, A.; Valentic, T. A.

    2017-12-01

    Recent requirements of scholarly communication emphasize the reproducibility of scientific claims. Text-based research papers are considered poor mediums to establish reproducibility. Papers must be accompanied by "research objects", aggregations of digital artifacts that together with the paper provide an authoritative record of a piece of research. We will present GeoTrust Hub (http://geotrusthub.org), a platform for creating, sharing, and reproducing reusable research objects. GeoTrust Hub provides tools for scientists to create `geounits'--reusable research objects. Geounits are self-contained, annotated, and versioned containers that describe and package computational experiments in an efficient and light-weight manner. Geounits can be shared on public repositories such as HydroShare and FigShare, and, using their respective APIs, reproduced on provisioned clouds. The latter feature gives science applications a lifetime beyond sharing, wherein they can be independently verified and trust established as they are repeatedly reused. Through research use cases from several geoscience laboratories across the United States, we will demonstrate how the tools provided by GeoTrust Hub, along with HydroShare as its public repository for geounits, are advancing the state of reproducible research in the geosciences. For each use case, we will address different computational reproducibility requirements. Our first use case will be an example of setup reproducibility, which enables a scientist to set up and reproduce an output from a model with complex configuration and development environments. Our second use case will be an example of algorithm/data reproducibility, wherein a shared model/dataset can be substituted with an alternate one to verify model output results, and finally an example of interactive reproducibility, in which an experiment is dependent on specific versions of data to produce the result. Toward this we will use software and data

  1. Photometric determination of trace cadmium in waste water drained from uranium mining and hydrometallurgy

    International Nuclear Information System (INIS)

    Zhu Zihui; Gu Gang; Xu Quanxiu

    1987-09-01

    Cadmium (Cd) ions react with dithizone to form a pink-to-red color that can be extracted with chloroform and measured photometrically. The dithizone method is one of the standard methods for determining trace Cd in environmental waste water. This method, however, is not suitable for measuring trace Cd in the waste water drained from a uranium mining and hydrometallurgy factory, because this kind of waste water contains magnesium ions at concentrations as high as 1500 mg/L. A further drawback is that the method requires a large amount of potassium cyanide. The authors therefore used potassium fluoride as a precipitator to remove the excess magnesium ions from the experimental system, and reduced the amount of potassium cyanide to 1/20 of the original usage. The experimental results indicated that the modified method was very satisfactory for both simulated samples and actual samples of waste water drained from uranium mining and hydrometallurgy plants. In summary, this modified method has a higher sensitivity, with a minimum detectable quantity of 0.02 ppm, and it is accurate and reproducible, with a recovery rate of 100 ± 5%.
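
The quoted recovery rate of 100 ± 5% corresponds to the standard spike-recovery check used to validate such methods. A minimal sketch of that arithmetic, with illustrative concentrations that are not data from the study:

```python
def recovery_percent(spiked_reading, unspiked_reading, spike_added):
    """Spike recovery: percentage of a known added amount that is measured back."""
    return 100.0 * (spiked_reading - unspiked_reading) / spike_added

# Hypothetical concentrations in ppm (not values from the paper)
base = 0.10        # Cd found in the untreated sample
spike = 0.20       # Cd deliberately added to the sample
measured = 0.308   # Cd found in the spiked sample

r = recovery_percent(measured, base, spike)
print(f"recovery = {r:.1f}%")               # 104.0%
within_spec = abs(r - 100.0) <= 5.0         # the paper's 100 +/- 5% criterion
print("within 100 +/- 5%:", within_spec)    # True
```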

  2. CERN Analysis Preservation: A Novel Digital Library Service to Enable Reusable and Reproducible Research

    CERN Document Server

    AUTHOR|(CDS)2079501; Chen, Xiaoli; Dani, Anxhela; Dasler, Robin Lynnette; Delgado Fernandez, Javier; Fokianos, Pamfilos; Herterich, Patricia Sigrid; Simko, Tibor

    2016-01-01

    The latest policy developments require immediate action for data preservation, as well as reproducible and Open Science. To address this, an unprecedented digital library service is presented to enable the High-Energy Physics community to preserve and share their research objects (such as data, code, documentation, notes) throughout their research process. While facing the challenges of a “big data” community, the internal service builds on existing internal databases to make the process as easy and intrinsic as possible for researchers. Given the “work in progress” nature of the objects preserved, versioning is supported. It is expected that the service will not only facilitate better preservation techniques in the community, but will foremost make collaborative research easier as detailed metadata and novel retrieval functionality provide better access to ongoing works. This new type of e-infrastructure, fully integrated into the research workflow, could help in fostering Open Science practices acro...

  3. Applications of the 3-dim ICRH global wave code FISIC and comparison with other models

    International Nuclear Information System (INIS)

    Kruecken, T.; Brambilla, M.

    1989-01-01

    Numerical simulations of two ICRF heating experiments in ASDEX are presented, using the FISIC code to solve the integrodifferential wave equations in the finite Larmor radius (FLR) approximation, and comparing the results with ray-tracing models. The different models show good agreement on the whole; we can, however, identify a few interesting toroidal effects, in particular on the efficiency of mode conversion and on the propagation of ion Bernstein waves. (author)

  4. METHES: A Monte Carlo collision code for the simulation of electron transport in low temperature plasmas

    Science.gov (United States)

    Rabie, M.; Franck, C. M.

    2016-06-01

    We present a freely available MATLAB code for the simulation of electron transport in arbitrary gas mixtures in the presence of uniform electric fields. For steady-state electron transport, the program provides the transport coefficients, reaction rates and the electron energy distribution function. The program uses established Monte Carlo techniques and is compatible with the electron scattering cross section files from the open-access Plasma Data Exchange Project LXCat. The code is written in object-oriented design, allowing the tracing and visualization of the spatiotemporal evolution of electron swarms and the temporal development of the mean energy and the electron number due to attachment and/or ionization processes. We benchmark our code with well-known model gases as well as the real gases argon, N2, O2, CF4, SF6 and mixtures of N2 and O2.
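
As a rough illustration of the established Monte Carlo technique such codes build on (not the actual METHES implementation), the null-collision method samples electron free-flight times from a constant upper-bound collision frequency and then accepts "real" collisions with probability nu(E)/nu_max. The collision-frequency model and all numbers below are made up for the sketch:

```python
import math
import random

random.seed(1)

NU_MAX = 1.0e12  # hypothetical upper bound on the collision frequency, 1/s

def nu_real(energy_eV):
    """Hypothetical energy-dependent collision frequency (stand-in for cross-section data)."""
    return NU_MAX * energy_eV / (energy_eV + 10.0)

def free_flight_time():
    """Sample a flight time from the exponential distribution with rate NU_MAX."""
    return -math.log(1.0 - random.random()) / NU_MAX

def step(energy_eV):
    """One null-collision step: sample a flight time, then accept or reject the collision."""
    dt = free_flight_time()
    collided = random.random() < nu_real(energy_eV) / NU_MAX
    return dt, collided

# At 5 eV the acceptance probability is 5/15, so about a third of the
# sampled collisions should be real ones.
real = sum(1 for _ in range(10000) if step(5.0)[1])
print(f"real-collision fraction ≈ {real / 10000:.2f}")  # close to 5/15 ≈ 0.33
```

Real swarm codes replace `nu_real` with tabulated cross sections (e.g. from LXCat) and update the electron energy and direction at each accepted collision; the rejection step is what lets them use a single exponential sampling rate everywhere.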

  5. Reproducibility of the chamber scarification test

    DEFF Research Database (Denmark)

    Andersen, Klaus Ejner

    1996-01-01

    The chamber scarification test is a predictive human skin irritation test developed to rank the irritation potential of products and ingredients meant for repeated use on normal and diseased skin. 12 products or ingredients can be tested simultaneously on the forearm skin of each volunteer. The test combines scratching of the skin at each test site with subsequent closed patch tests with the products, repeated daily for 3 days. The test is performed on groups of human volunteers; a skin irritant substance or product is included in each test as a positive control (...) high reproducibility of the test. Further, intra-individual variation in skin reaction to the 2 control products in 26 volunteers, who participated twice, is shown, which supports the conclusion that the chamber scarification test is a useful short-term human skin irritation test with high reproducibility.

  6. Architecture proposal for the use of QR code in supply chain management

    Directory of Open Access Journals (Sweden)

    Dalton Matsuo Tavares

    2012-01-01

    Full Text Available Supply chain traceability and visibility are key concerns for many companies. Radio-Frequency Identification (RFID) is an enabling technology that allows identification of objects in a fully automated manner via radio waves. Nevertheless, this technology has limited acceptance and high costs. This paper presents a research effort undertaken to design a track-and-trace solution for supply chains, using the quick response code (QR Code) as a less complex and cost-effective alternative to RFID in supply chain management (SCM). A first architecture proposal using open source software is presented as a proof of concept. The system architecture is presented in order to achieve tag generation, image acquisition and pre-processing, product inventory and tracking. A prototype system for tag identification is developed and discussed at the end of the paper to demonstrate its feasibility.

  7. Shaping Southeast Asia: Tracing Tourism Imaginaries in Guidebooks and Travel Blogs

    Directory of Open Access Journals (Sweden)

    Felix M. Bergmeister

    2015-12-01

    Full Text Available Tourism constitutes both an economic activity and a cultural force that involves a dynamic interplay between travelers and their ideas about the societies they visit. This paper traces the construction and negotiation of "tourism imaginaries" (Salazar, 2012) in popular guidebooks and independent travel-blogs, critically examining questions of representation and power relations in a Southeast Asian context. Employing critical discourse analysis, this paper investigates how particular Southeast Asian destinations are represented from a Western perspective. Whereas long-established commercial media such as guidebooks function mainly to communicate destination images to the reader, recent participatory media formats (e.g. travel-blogs) are more experience-based and enable tourists to form ideas about foreign places in idiosyncratic ways. The preliminary insights of this study show that, in the examples under review, hegemonic narratives from guidebooks are reproduced rather than critically challenged and subverted.

  8. Traces generating what was there

    CERN Document Server

    2017-01-01

    Traces keep time contained and make visible what was there. Going back to the art of trace-reading, they continue to be a fundamental resource for scientific knowledge production. The contributions study, from the biology laboratory to the large colliders of particle physics, techniques involved in the production of material traces. Following their changes over two centuries, this collection shows the continuities they have in the digital age.

  9. TXRF analysis of trace metals in thin silicon nitride films

    International Nuclear Information System (INIS)

    Vereecke, G.; Arnauts, S.; Verstraeten, K.; Schaekers, M.; Heyrts, M.M.

    2000-01-01

    As critical dimensions of integrated circuits continue to decrease, high dielectric constant materials such as silicon nitride are being considered to replace silicon dioxide in capacitors and transistors. The achievement of low levels of metal contamination in these layers is critical for high performance and reliability. Existing methods of quantitative analysis of trace metals in silicon nitride require large amounts of sample (from about 0.1 to 1 g, compared to a mass of 0.2 mg for a 2 nm thick film on an 8'' silicon wafer), and involve digestion steps not applicable to films on wafers, or non-standard techniques such as neutron activation analysis. A novel approach has recently been developed to analyze trace metals in thin films with analytical techniques currently used in the semiconductor industry. Sample preparation consists of three steps: (1) decomposition of the silicon nitride matrix by moist HF condensed at the wafer surface to form ammonium fluosilicate; (2) vaporization of the fluosilicate by a short heat treatment at 300 °C; (3) collection of contaminants by scanning the wafer surface with a solution droplet (VPD-DSC procedure). The determination of trace metals is performed by drying the droplet on the wafer and analyzing the residue by TXRF, which offers the advantage of multi-elemental analysis with no dilution of the sample. The lower limits of detection for metals in 2 nm thick films on 8'' silicon wafers range from about 10 to 200 ng/g. The present study will focus on the matrix effects and the possible loss of analyte associated with the evaporation of the fluosilicate salt, in relation to the accuracy and the reproducibility of the method. The benefits of using an internal standard will be assessed. Results will be presented from both model samples (ammonium fluoride contaminated with metallic salts) and real samples (silicon nitride films from a production tool). (author)

  10. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Anderson, Joanna E.; Aarts, Alexander A.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahník, Štěpán; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Brüning, Jovita; Calhoun-Sauls, Ann; Callahan, Shannon P.; Chagnon, Elizabeth; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Christopherson, Cody D.; Cillessen, Linda; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Conn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Penna, Nicolás Delia; Den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernández-Castilla, Belén; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Glöckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise J.; 
Hung, Cathy O Y; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jäkel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knežević, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniël|info:eu-repo/dai/nl/298811855; Lane, Kristin A.; Lassetter, Bethany; Lazarević, Ljiljana B.; Le Bel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, Melissa; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; MacKinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Möschl, Marcus; Motyl, Matt; Müller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michèle B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. Nathan; Pitts, Michael; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schlegelmilch, René; Schmidt, Kathleen; Scholz, Sabine; Seibel, Larissa; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. 
Colleen; Skorinko, Jeanine L M; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; Te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valášek, Milan; Van't Veer, Anna E.; Van Aert, Robbie; Van Assen, Marcel|info:eu-repo/dai/nl/407629971; Van Bork, Riet; Van De Ven, Mathijs; Van Den Bergh, Don; Van Der Hulst, Marije; Van Dooren, Roel; Van Doorn, Johnny; Van Renswoude, Daan R.; Van Rijn, Hedderik; Vanpaemel, Wolf; Echeverría, Alejandro Vásquez; Vazquez, Melissa; Velez, Natalia; Vermue, Marieke; Verschoor, Mark; Vianello, Michelangelo; Voracek, Martin; Vuu, Gina; Wagenmakers, Eric Jan; Weerdmeester, Joanneke; Welsh, Ashlee; Westgate, Erin C.; Wissink, Joeri; Wood, Michael; Woods, Andy; Wright, Emily; Wu, Sining; Zeelenberg, Marcel; Zuni, Kellylynn

    2015-01-01

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available.

  11. Evaluation Codes from an Affine Variety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particularly nice examples of affine variety codes. Our study includes a reformulation of the usual methods to estimate the minimum distances of evaluation codes into the setting of affine variety codes. Finally we describe the connection to the theory of one-point geometric Goppa codes.
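
As a toy illustration of the general idea (not an example from the paper): an evaluation code is obtained by evaluating a chosen space of polynomials at the points of an affine variety. Evaluating the monomials {1, x, y} at the four points of the affine plane over F_2 gives a [4, 3, 2] code, and for a code this small the minimum distance can simply be checked by brute force:

```python
from itertools import product

q = 2
# All points of the affine plane over F_2.
points = list(product(range(q), repeat=2))            # [(0,0), (0,1), (1,0), (1,1)]
# The chosen space of polynomials is spanned by the monomials 1, x, y.
monomials = [lambda x, y: 1, lambda x, y: x, lambda x, y: y]

# Generator matrix: one row per monomial, one column per evaluation point.
G = [[m(x, y) % q for (x, y) in points] for m in monomials]

# Enumerate all codewords (messages in F_2^3) and find the minimum distance.
codewords = set()
for msg in product(range(q), repeat=len(G)):
    cw = tuple(sum(c * g for c, g in zip(msg, col)) % q for col in zip(*G))
    codewords.add(cw)

dmin = min(sum(1 for s in cw if s) for cw in codewords if any(cw))
print(f"[n={len(points)}, k={len(G)}, d={dmin}] code")   # [n=4, k=3, d=2]
```

This particular code is the first-order Reed-Muller code RM(1,2); the techniques in the paper are about estimating `dmin` without such exhaustive enumeration.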

  12. B2-B2.5 code benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Dekeyser, W.; Baelmans, M; Voskoboynikov, S.; Rozhansky, V.; Reiter, D.; Wiesen, S.; Kotov, V.; Boerner, P.

    2011-01-15

    ITER-IO currently (and since about 15 years) employs the SOLPS4.xxx code for its divertor design, currently version SOLPS4.3. SOLPS4.xxx is a special variant of the B2-EIRENE code, which was originally developed by a European consortium (FZ Juelich, AEA Culham, ERM Belgium/KU Leuven) in the late eighties and early nineties of the last century under NET contracts. Until today, even the very similar edge plasma codes within the SOLPS family, if run on a seemingly identical choice of physical parameters, still sometimes disagree significantly with each other. It is obvious that in computational engineering applications, as they have been carried out for the various ITER divertor aspects with SOLPS4.3 for more than a decade now, any transition from one code to another must be fully backward compatible, or, at least, the origin of differences in the results must be identified and fully understood quantitatively. In this report we document efforts undertaken in 2010 to ultimately eliminate this issue. For the kinetic EIRENE part within SOLPS this backward compatibility (back until 1996) was basically achieved (V. Kotov, 2004-2006), and SOLPS4.3 is now essentially up to date with the current EIRENE master maintained at FZ Juelich. In order to achieve a similar level of reproducibility for the plasma fluid (B2, B2.5) part, we follow a similar strategy, which is quite distinct from the previous SOLPS benchmark attempts: the codes are "disintegrated" and pieces of them are run on the smallest (i.e. simplest) problems. Only after full quantitative understanding is achieved is the code model enlarged and integrated again, piece by piece, until, hopefully, a fully backward compatible B2/B2.5 ITER edge plasma simulation is achieved. The status of this code dis-integration effort and its findings until now (Nov. 2010) are documented in the present technical note. This work was initiated in a small workshop by the three partner teams of KU Leuven, St. Petersburg

  13. Microwave transport in EBT distribution manifolds using Monte Carlo ray-tracing techniques

    International Nuclear Information System (INIS)

    Lillie, R.A.; White, T.L.; Gabriel, T.A.; Alsmiller, R.G. Jr.

    1983-01-01

    Ray-tracing Monte Carlo calculations have been carried out using an existing Monte Carlo radiation transport code to obtain estimates of the microwave power exiting the torus coupling links in EBT microwave manifolds. The microwave power loss and polarization at surface reflections were accounted for by treating the microwaves as plane waves reflecting off plane surfaces. Agreement on the order of 10% was obtained between the measured and calculated output power distribution for an existing EBT-S toroidal manifold. A cost-effective iterative procedure utilizing the Monte Carlo history data was implemented to predict design changes which could produce increased manifold efficiency and improved output power uniformity.
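
Treating the microwaves as plane waves reflecting off plane surfaces reduces each bounce to specular reflection plus a power-loss factor per reflection. A minimal sketch of that geometry; the reflectivity value is illustrative, not taken from the report:

```python
def reflect(d, n):
    """Specular reflection of direction d off a plane with unit normal n: d - 2(d.n)n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def power_after(bounces, reflectivity=0.98):
    """Fraction of launched power surviving a given number of lossy wall reflections."""
    return reflectivity ** bounces

d = (1.0, -1.0, 0.0)   # incoming ray direction, heading down toward the wall
n = (0.0, 1.0, 0.0)    # unit normal of the wall
print(reflect(d, n))                                 # (1.0, 1.0, 0.0): bounced back upward
print(f"power after 20 bounces: {power_after(20):.3f}")  # about 0.668
```

A Monte Carlo manifold calculation launches many such rays, tallies the surviving power at each exit port, and (as in the report) can reuse the stored ray histories to evaluate proposed geometry changes cheaply.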

  14. Analysis of experiments performed at University of Hannover with Relap5/Mod2 and Cathare codes on fluid dynamic effects in the fuel element top nozzle area during refilling and reflooding

    International Nuclear Information System (INIS)

    Ambrosini, W.; D'Auria, F.; Di Marco, P.; Fantappie, G.; Giot, G.; Emmerechts, D.; Seynhaeve, J.M.; Zhang, J.

    1989-11-01

    The experimental data on flooding and CCFL in the fuel element top nozzle area collected at the University of Hannover have been analyzed with the RELAP5/MOD2 and CATHARE V.1.3 codes. Preliminary sensitivity calculations have been performed to evaluate the influence of various parameters and code options on the results. In addition, an a priori rational assessment procedure has been applied for those parameters not specified in the experimental data (e.g. energy loss coefficients in flow restrictions). This procedure is based on single-phase flow pressure drops, and no further tuning has been performed to fit the experimental data. The reported experimental data and some others demonstrate the complex relationship among the involved physical quantities (film thickness, pressure drop, etc.) even in a simple geometrical condition with well defined boundary conditions. In the application of the two advanced codes to the selected CCFL experiments it appears that sophisticated models do not satisfactorily simulate the measured phenomena, mainly when situations similar to nuclear reactors are dealt with (rod bundles). This result should be evaluated considering that: - multi-dimensional phenomena occurring in flooding experiments are not well reproducible with the one-dimensional models implemented in the two codes; - a rational and reproducible procedure has been used to fix some boundary conditions (K-tuning); there is evidence that more tuning can be used to get results closer to the experimental ones in each specific situation; - the uncertainty bands in measured experimental results are not (entirely) specified. The work performed demonstrated that further applications of the present codes to CCFL experiments appear to be of little use. New models should be tested and implemented before any attempt to reproduce CCFL in experimental facilities by system codes.

  15. Efficient coding of spectrotemporal binaural sounds leads to emergence of the auditory space representation

    Directory of Open Access Journals (Sweden)

    Wiktor Mlynarski

    2014-03-01

    Full Text Available To date a number of studies have shown that receptive field shapes of early sensory neurons can be reproduced by optimizing the coding efficiency of natural stimulus ensembles. A still unresolved question is whether the efficient coding hypothesis explains the formation of neurons which explicitly represent environmental features of different functional importance. This paper proposes that the spatial selectivity of higher auditory neurons emerges as a direct consequence of learning efficient codes for natural binaural sounds. Firstly, it is demonstrated that a linear efficient coding transform, Independent Component Analysis (ICA), trained on spectrograms of naturalistic simulated binaural sounds, extracts spatial information present in the signal. A simple hierarchical ICA extension allowing for decoding of sound position is proposed. Furthermore, it is shown that units revealing spatial selectivity can be learned from a binaural recording of a natural auditory scene. In both cases a relatively small subpopulation of learned spectrogram features suffices to perform accurate sound localization. Representation of the auditory space is therefore learned in a purely unsupervised way by maximizing the coding efficiency and without any task-specific constraints. These results imply that efficient coding is a useful strategy for learning structures which allow for making behaviorally vital inferences about the environment.

  16. Comparison of TITAN hybrid deterministic transport code and MCNP5 for simulation of SPECT

    International Nuclear Information System (INIS)

    Royston, K.; Haghighat, A.; Yi, C.

    2010-01-01

    Traditionally, Single Photon Emission Computed Tomography (SPECT) simulations use Monte Carlo methods. The hybrid deterministic transport code TITAN has recently been applied to the simulation of a SPECT myocardial perfusion study. The TITAN SPECT simulation uses the discrete ordinates formulation in the phantom region and a simplified ray-tracing formulation outside of the phantom. A SPECT model has been created in the Monte Carlo N-Particle (MCNP5) code for comparison. In MCNP5 the collimator is directly modeled, but TITAN instead simulates the effect of collimator blur using a circular ordinate splitting technique. Projection images created using the TITAN code are compared to results using MCNP5 for three collimator acceptance angles. Normalized projection images for 2.97 deg, 1.42 deg and 0.98 deg collimator acceptance angles had maximum relative differences of 21.3%, 11.9% and 8.3%, respectively. Visually the images are in good agreement. Profiles through the projection images were plotted, showing that the TITAN results followed the shape of the MCNP5 results with some differences in magnitude. A timing comparison on 16 processors found that the TITAN code completed the calculation 382 to 2787 times faster than MCNP5. Both codes exhibit good parallel performance. (author)
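
The maximum relative differences quoted above are the usual pixel-by-pixel way of comparing two normalized projection images against a reference. A minimal sketch of that comparison on small made-up value lists (not actual TITAN or MCNP5 output):

```python
def max_relative_difference(reference, test, eps=1e-12):
    """Largest |test - ref| / |ref| over pixels where the reference is non-negligible."""
    worst = 0.0
    for r, t in zip(reference, test):
        if abs(r) > eps:  # skip (near-)zero reference pixels to avoid division blow-up
            worst = max(worst, abs(t - r) / abs(r))
    return worst

# Hypothetical normalized projection values (MCNP5 as reference, TITAN as test)
mcnp5 = [0.10, 0.50, 1.00, 0.40]
titan = [0.11, 0.48, 0.97, 0.40]

print(f"max relative difference: {100 * max_relative_difference(mcnp5, titan):.1f}%")  # 10.0%
```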

  17. Mobile code security

    Science.gov (United States)

    Ramalingam, Srikumar

    2001-11-01

    A highly secure mobile agent system is very important for a mobile computing environment. The security issues in mobile agent systems comprise protecting mobile hosts from malicious agents, protecting agents from other malicious agents, protecting hosts from other malicious hosts and protecting agents from malicious hosts. Using traditional security mechanisms the first three security problems can be solved. Apart from using trusted hardware, very few approaches exist to protect mobile code from malicious hosts. Some of the approaches to solve this problem are the use of trusted computing, computing with encrypted functions, steganography, cryptographic traces, Seal Calculus, etc. This paper focuses on the simulation of some of these existing techniques in the designed mobile language. Some new approaches to solve the malicious network problem and the agent tampering problem are developed using a public key encryption system and steganographic concepts. The approaches are based on encrypting and hiding the partial solutions of the mobile agents. The partial results are stored and the address of the storage is destroyed as the agent moves from one host to another host. This allows only the originator to make use of the partial results. Through these approaches some of the existing problems are solved.

  18. Reproducibility problems of in-service ultrasonic testing results

    International Nuclear Information System (INIS)

    Honcu, E.

    1974-01-01

    The reproducibility of ultrasonic testing results is the basic precondition for its successful application to in-service inspection of changes in the quality of components of nuclear power installations. From this point of view, the results of periodic ultrasonic inspections are not satisfactory. Nevertheless, the ultrasonic pulse method is suitable for evaluating the quality of most components of nuclear installations, and is often the sole method that can be recommended for inspection on technical and economic grounds. (J.B.)

  19. Experimental benchmark of non-local-thermodynamic-equilibrium plasma atomic physics codes; Validation experimentale des codes de physique atomique des plasmas hors equilibre thermodynamique local

    Energy Technology Data Exchange (ETDEWEB)

    Nagels-Silvert, V

    2004-09-15

    The main purpose of this thesis is to obtain experimental data for testing and validating atomic physics codes that deal with non-local-thermodynamic-equilibrium plasmas. The first part is dedicated to the spectroscopic study of xenon and krypton plasmas produced by a nanosecond laser pulse interacting with a gas jet. A Thomson scattering diagnostic allowed us to measure independently plasma parameters such as electron temperature, electron density and the average ionisation state. We obtained time-integrated spectra in the range between 5 and 10 angstroms. Using the Relac code, we identified about one hundred xenon lines between 8.6 and 9.6 angstroms, and found previously unreported krypton lines between 5.2 and 7.5 angstroms. In a second experiment we extended the wavelength range to the XUV domain. The Averroes/Transpec code was tested in the ranges from 9 to 15 angstroms and from 10 to 130 angstroms; the first range was well reproduced, while the second requires a more complex data analysis. The second part is dedicated to the spectroscopic study of aluminium, selenium and samarium plasmas in the femtosecond regime. We designed a frequency-domain interferometry diagnostic that allowed us to measure the expansion velocity of the rear side of the target. With an adequate isothermal model, this parameter yielded the plasma electron temperature. Spectra and emission times of various lines from the aluminium and selenium plasmas were reproduced satisfactorily with the Averroes/Transpec code coupled to the Film and Multif hydrodynamic codes. (A.C.)

  1. Reverse ray tracing for transformation optics.

    Science.gov (United States)

    Hu, Chia-Yu; Lin, Chun-Hung

    2015-06-29

    Ray tracing is an important technique for predicting optical system performance. In the field of transformation optics, the Hamiltonian equations of motion for ray tracing are well known. The numerical solutions to the Hamiltonian equations of motion are affected by the complexities of the inhomogeneous and anisotropic indices of the optical device. To the best of our knowledge, no previous work has been conducted on ray tracing for transformation optics with extreme inhomogeneity and anisotropy. In this study, we present the use of 3D reverse ray tracing in transformation optics. The reverse ray tracing is derived from Fermat's principle based on a sweeping method instead of finding the full solution to ordinary differential equations. The sweeping method is employed to obtain the eikonal function. The wave vectors are then obtained from the gradient of the eikonal function map in the transformed space to acquire the illuminance. Because only the rays at the points of interest have to be traced, reverse ray tracing provides an efficient approach to investigating the illuminance of a system. This approach is useful in any form of transformation optics where the material property tensor is a symmetric positive definite matrix. The performance and analysis of three transformation optics designs with inhomogeneous and anisotropic indices are explored. The ray trajectories and illuminances in these demonstration cases are successfully solved by the proposed reverse ray tracing method.
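
    The step of extracting wave vectors from the gradient of an eikonal map can be sketched numerically. This minimal example assumes a toy eikonal map for a homogeneous medium (the sweeping solver itself, and the transformed-space machinery of the paper, are omitted):

```python
import numpy as np

# Toy eikonal map T(x, y) for a homogeneous medium of index n: T = n * r.
# In the actual method T would come from a sweeping solver; only the
# gradient step is sketched here.
n = 1.5
x = np.linspace(-1.0, 1.0, 201)
y = np.linspace(-1.0, 1.0, 201)
X, Y = np.meshgrid(x, y, indexing="ij")
T = n * np.hypot(X, Y)

# Wave vectors are the gradient of the eikonal function.
kx, ky = np.gradient(T, x, y)
magnitude = np.hypot(kx, ky)  # should equal n away from the source point
```

Away from the source, |∇T| recovers the local refractive index, and the (kx, ky) field gives the ray directions at the points of interest without integrating the full Hamiltonian equations.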

  2. Effective Form of Reproducing the Total Financial Potential of Ukraine

    Directory of Open Access Journals (Sweden)

    Portna Oksana V.

    2015-03-01

    Full Text Available Development of scientific principles for reproducing the total financial potential of a country, and of its effective form, is an urgent problem in both the theoretical and the practical aspects of the study. Its solution is intended to ensure the active mobilization and effective use of the total financial potential of Ukraine and, as a result, its expanded reproduction, which would help realize the internal capacities for stabilizing the national economy. The purpose of the article is to disclose the essence of the effective form of reproducing the total financial potential of a country and to analyze the results of reproducing the total financial potential of Ukraine. It is argued that the basis for the effective form of reproduction is the volume and flow of resources that are associated with the «real» economy and that determine the dynamics of GDP, i.e. the resource and process forms of reproducing the total financial potential of Ukraine (which precede the effective one). The analysis of reproducing the total financial potential of Ukraine shows that, over the analyzed period, the financial possibilities of the country increased, yet the total financial potential showed a steady downward dynamic: the amount of resources involved in production and creating net value added and GDP is reproduced on a restricted basis. Growth of the total financial potential of Ukraine is connected only with extensive quantitative factors rather than intensive qualitative changes.

  3. The PARTRAC code: Status and recent developments

    Science.gov (United States)

    Friedland, Werner; Kundrat, Pavel

    Biophysical modeling is of particular value for predicting radiation effects due to manned space missions. PARTRAC is an established tool for Monte Carlo-based simulations of radiation track structures, damage induction in cellular DNA and its repair [1]. Dedicated modules describe the interactions of ionizing particles with the traversed medium, the production and reactions of reactive species, and score DNA damage determined by overlapping track structures with multi-scale chromatin models. The DNA repair module describes the repair of DNA double-strand breaks (DSB) via the non-homologous end-joining pathway; the code explicitly simulates the spatial mobility of individual DNA ends in parallel with their processing by major repair enzymes [2]. To simulate the yields and kinetics of radiation-induced chromosome aberrations, the repair module has been extended to track the chromosome origin of ligated fragments as well as the presence of centromeres [3]. PARTRAC calculations have been benchmarked against experimental data on various biological endpoints induced by photon and ion irradiation. The calculated DNA fragment distributions after photon and ion irradiation reproduce the corresponding experimental data and their dose- and LET-dependence. However, in particular for high-LET radiation, many short DNA fragments are predicted below the detection limits of the measurements, so that the experiments significantly underestimate DSB yields for high-LET radiation [4]. The DNA repair module correctly describes the LET-dependent repair kinetics after 60Co gamma-rays and different N-ion radiation qualities [2]. First calculations of the induction of chromosome aberrations overestimated the absolute yields of dicentrics, but correctly reproduced their relative dose-dependence and the difference between gamma- and alpha-particle irradiation [3]. Recent developments of the PARTRAC code include a model of hetero- vs euchromatin structures to enable

  4. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach focused on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and the groupcast index coding problem is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  5. Validation of the ATHLET-code 2.1A by calculation of the ECTHOR experiment; Validierung des ATHLET-Codes 2.1A anhand des Einzeleffekt-Tests ECTHOR

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Andreas; Sarkadi, Peter; Schaffrath, Andreas [TUEV NORD SysTec GmbH und Co. KG, Hamburg (Germany)

    2010-05-15

    Before a numerical code (e.g. ATHLET) is used to simulate physical phenomena that are new or unknown to the code and/or the user, the user must establish the applicability of the code, and his own experience in handling it, by means of a so-called validation: parametric studies with the code are executed and the results are compared with verified experimental data. Corresponding reference values are available in the form of so-called single-effect tests (e.g. ECTHOR). In this work the system code ATHLET Mod. 2.1 Cycle A is validated by post-test calculation of the ECTHOR experiment. The ECTHOR tests investigated the clearing of a water-filled model of a loop seal by means of an air stream, including momentum exchange at the phase interface, under adiabatic and atmospheric conditions. The post-test calculations show that the analytical results meet the experimental data within the reproducibility of the experiments. Further findings of the parametric studies are: - The experimental results obtained with the water-air system (ECTHOR) can be transferred to a water-steam system if the densities of the phases are equal in both cases. - The initial water level in the loop seal has no influence on the results as long as the gas mass flow is increased moderately. - The loop seal is appropriately nodalized if the mean length of the control volumes corresponds to approximately 1.5 times the hydraulic pipe diameter. (orig.)
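
    The nodalization guideline in the last finding (control-volume length of about 1.5 times the hydraulic diameter) can be turned into a small sizing helper. A sketch only; the function name and the ceiling rounding are our own choices, not part of the ATHLET study:

```python
import math

def n_control_volumes(pipe_length_m, hydraulic_diameter_m, factor=1.5):
    """Number of control volumes when each node is ~factor x D_h long.

    Rounds up so the mean node length never exceeds the guideline.
    """
    node_length = factor * hydraulic_diameter_m
    return math.ceil(pipe_length_m / node_length)
```

For a 3 m pipe with D_h = 0.5 m this suggests four nodes of 0.75 m each.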

  7. Inclusion of models to describe severe accident conditions in the fuel simulation code DIONISIO

    Energy Technology Data Exchange (ETDEWEB)

    Lemes, Martín; Soba, Alejandro [Sección Códigos y Modelos, Gerencia Ciclo del Combustible Nuclear, Comisión Nacional de Energía Atómica, Avenida General Paz 1499, 1650 San Martín, Provincia de Buenos Aires (Argentina); Daverio, Hernando [Gerencia Reactores y Centrales Nucleares, Comisión Nacional de Energía Atómica, Avenida General Paz 1499, 1650 San Martín, Provincia de Buenos Aires (Argentina); Denis, Alicia [Sección Códigos y Modelos, Gerencia Ciclo del Combustible Nuclear, Comisión Nacional de Energía Atómica, Avenida General Paz 1499, 1650 San Martín, Provincia de Buenos Aires (Argentina)

    2017-04-15

    The simulation of fuel rod behavior is a complex task that demands not only accurate models of the numerous phenomena occurring in the pellet, the cladding and the internal rod atmosphere but also an adequate interconnection between them. In recent years several models have been incorporated into the DIONISIO code with the purpose of increasing its precision and reliability. After the regrettable events at Fukushima, the need for codes capable of simulating nuclear fuels under accident conditions has come to the fore. Heat removal occurs in a quite different way than during normal operation, and this fact imposes a completely new set of conditions on the fuel materials. A detailed description of the different regimes the coolant may exhibit in such a wide variety of scenarios requires a thermal-hydraulic formulation that is not suitable for inclusion in a fuel performance code; moreover, a number of reliable, well-established codes already perform this task. Nevertheless, keeping in mind the purpose of building a code focused on fuel behavior, a subroutine was developed for the DIONISIO code that performs a simplified analysis of the coolant in a PWR, restricted to the more representative situations, and provides the fuel simulation with the boundary conditions necessary to reproduce accident situations. In the present work this subroutine is described, and comparisons with experimental data and with thermal-hydraulic codes are presented. It is verified that, in spite of its comparative simplicity, the predictions of this module of DIONISIO do not differ significantly from those of the specific, complex codes.

  8. Reproducibility of scoring emphysema by HRCT

    International Nuclear Information System (INIS)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R.; Erkinjuntti-Pekkanen, R.

    2002-01-01

    Purpose: We evaluated the reproducibility of three visual scoring methods for emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests, including spirometry and diffusing capacity, were performed. Results: Intraobserver kappa values for all three methods were good (0.51-0.74); interobserver kappa values varied from 0.35 to 0.72. The Sanders' and severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.
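
    Observer-agreement figures of this kind are typically Cohen's kappa coefficients, which correct raw agreement for agreement expected by chance. A generic sketch (not the scoring software used in the study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical scores on the same cases."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # observed agreement: fraction of cases scored identically
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement from each rater's marginal category frequencies
    pa, pb = Counter(rater_a), Counter(rater_b)
    expected = sum(pa[c] * pb[c] for c in set(pa) | set(pb)) / n**2
    return (observed - expected) / (1 - expected)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which is why the 0.51-0.74 intraobserver range is read as "good".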

  9. Using prediction markets to estimate the reproducibility of scientific research

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988

  10. Using prediction markets to estimate the reproducibility of scientific research.

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.
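
    The claim that a significant finding with a low prior needs a well-powered replication can be made concrete with a standard Bayes calculation. The 9% prior is the paper's reported median; the power and significance level below are illustrative assumptions of ours:

```python
def prob_true_given_significant(prior, power=0.8, alpha=0.05):
    """Posterior probability that a hypothesis is true given one
    statistically significant result (standard positive-predictive-value
    formula: true positives over all positives)."""
    true_pos = prior * power
    false_pos = (1 - prior) * alpha
    return true_pos / (true_pos + false_pos)
```

With a 9% prior, a single significant result raises the probability of truth only to about 61%; a confirming replication (applying the same update again) is what pushes it high.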

  11. Composting in small laboratory pilots: Performance and reproducibility

    International Nuclear Information System (INIS)

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-01-01

    Highlights: ► We design an innovative small-scale composting device including six 4-l reactors. ► We investigate the performance and reproducibility of composting on a small scale. ► Thermophilic conditions are established by self-heating in all replicates. ► Biochemical transformations, organic matter losses and stabilisation are realistic. ► The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale (4-l) reactors were designed for monitoring O2 consumption and CO2 emissions and for characterising the biochemical evolution of organic matter. Good reproducibility was found for the six replicates, with coefficients of variation generally lower than 19% for all parameters. Intense self-heating ensured a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot-water-soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fraction by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or in on-site experiments, except for lignin degradation, which was less extensive than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose this experimental device for research requiring a mass reduction of the initial composted waste mixtures.

  12. ICOOL: A Simulation Code for Ionization Cooling of Muon Beams

    International Nuclear Information System (INIS)

    Fernow, R. C.

    1999-01-01

    Current ideas [1,2] for designing a high luminosity muon collider require significant cooling of the phase space of the muon beams. The only known method that can cool the beams in a time comparable to the muon lifetime is ionization cooling [3,4]. This method requires directing the particles in the beam at a large angle through a low Z absorber material in a strong focusing magnetic channel and then restoring the longitudinal momentum with an rf cavity. We have developed a new 3-D tracking code ICOOL for examining possible configurations for muon cooling. A cooling system is described in terms of a series of longitudinal regions with associated material and field properties. The tracking takes place in a coordinate system that follows a reference orbit through the system. The code takes into account decays and interactions of ∼50-500 MeV/c muons in matter. Material geometry regions include cylinders and wedges. A number of analytic models are provided for describing the field configurations. Simple diagnostics are built into the code, including calculation of emittances and correlations, longitudinal traces, histograms and scatter plots. A number of auxiliary files can be generated for post-processing analysis by the user
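
    The emittance diagnostic mentioned above is conventionally the RMS emittance of the particle distribution. A minimal sketch of the generic statistical formula (the ICOOL internals are not reproduced here):

```python
import numpy as np

def rms_emittance(x, xp):
    """Unnormalized RMS emittance from particle positions x and angles x':
    sqrt(<x^2><x'^2> - <x x'>^2), with means removed."""
    x = x - x.mean()
    xp = xp - xp.mean()
    return np.sqrt(x.var() * xp.var() - (x * xp).mean() ** 2)
```

A fully correlated beam (x' proportional to x) has zero RMS emittance, while uncorrelated spread in position and angle gives the product of the two RMS widths, which is why ionization cooling aims to shrink both.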

  13. Measurement of Selected Organic Trace Gases During TRACE-P

    Science.gov (United States)

    Atlas, Elliot

    2004-01-01

    Major goals of the TRACE-P mission were: 1) to investigate the chemical composition of radiatively important gases, aerosols, and their precursors in the Asian outflow over the western Pacific, and 2) to describe and understand the chemical evolution of the Asian outflow as it is transported and mixed into the global troposphere. The research performed as part of this proposal addressed these major goals with a study of the organic chemical composition of gases in the TRACE-P region. This work was a close collaboration with the Blake/Rowland research group at UC-Irvine, and they have provided a separate report for their funded effort.

  14. Investigation of Two-Phase Flow Regime Maps for Development of Thermal-Hydraulic Analysis Codes

    International Nuclear Information System (INIS)

    Kim, Kyung Doo; Kim, Byoung Jae; Lee, Seong Wook

    2010-04-01

    This report is a literature survey on models and correlations for determining the flow patterns that are used to simulate thermal-hydraulics in nuclear reactors. Determination of flow patterns is the basis for obtaining physical values of wall/interfacial friction, wall/interfacial heat transfer, and droplet entrainment/de-entrainment. Not only existing system codes (such as RELAP5-3D, TRAC-M, MARS, TRACE and CATHARE) but also up-to-date research was reviewed to find models and correlations.

  15. Trace element analysis with PIXE using Trombay 5.5 MeV Van de Graaff accelerator

    International Nuclear Information System (INIS)

    Govil, Rekha; Kataria, S.K.; Kapoor, S.S.; Madan Lal; Nadkarni, D.M.; Rama Rao, P.N.; Viswanathan, K.V.

    1980-01-01

    The work on trace element analysis using the proton-induced X-ray emission (PIXE) technique with the proton beam from the 5.5 MeV Van de Graaff accelerator at Trombay is described. The experimental set-up consisted of an indigenously built 220 eV resolution Si(Li) X-ray spectrometer and a target chamber with arrangements to mount up to twelve targets. In the present work, a variety of samples of biological nature, monazite mineral and some other samples were analyzed along with a standard multi-element sample. The sample preparation technique for the different samples is also given in the report. For quantitative estimation of trace elements, a computer code developed earlier was used. The proton-induced X-ray spectra of various samples and their computer fits are presented, and quantitative results for some selected samples are also given. The minimum detection limits achieved with the present set-up are also given. (auth.)

  16. Reproducibility of esophageal scintigraphy using semi-solid yoghurt

    Energy Technology Data Exchange (ETDEWEB)

    Imai, Yukinori; Kinoshita, Manabu; Asakura, Yasushi; Kakinuma, Tohru; Shimoji, Katsunori; Fujiwara, Kenji; Suzuki, Kenji; Miyamae, Tatsuya [Saitama Medical School, Moroyama (Japan)

    1999-10-01

    Esophageal scintigraphy is a non-invasive method that evaluates esophageal function quantitatively. We applied a new technique using semi-solid yoghurt, which allows esophageal function to be evaluated in a sitting position. To evaluate the reproducibility of this method, scintigraphy was performed in 16 healthy volunteers. From the results of four swallows, excluding the first one, the mean coefficients of variation in esophageal transit time and esophageal emptying time were 12.8% and 13.4%, respectively (intraday variation). As regards the interday variation, the method also showed good reproducibility in results obtained on 2 separate days. (author)
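
    The reproducibility measure used above is the coefficient of variation: the standard deviation expressed as a percentage of the mean. A minimal sketch (the population standard deviation is our choice; the abstract does not state which variant was used):

```python
import statistics

def coefficient_of_variation(values):
    """Coefficient of variation in percent: std dev relative to the mean."""
    mean = statistics.fmean(values)
    return 100.0 * statistics.pstdev(values) / mean
```

Applied to repeated transit-time measurements, a CV around 13% means the spread between swallows is about an eighth of the typical value.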

  17. Double-trace deformations of conformal correlations

    Science.gov (United States)

    Giombi, Simone; Kirilin, Vladimir; Perlmutter, Eric

    2018-02-01

    Large N conformal field theories often admit unitary renormalization group flows triggered by double-trace deformations. We compute the change in scalar four-point functions under double-trace flow, to leading order in 1/ N. This has a simple dual in AdS, where the flow is implemented by a change of boundary conditions, and provides a physical interpretation of single-valued conformal partial waves. We extract the change in the conformal dimensions and three-point coefficients of infinite families of double-trace composite operators. Some of these quantities are found to be sign-definite under double-trace flow. As an application, we derive anomalous dimensions of spinning double-trace operators comprised of non-singlet constituents in the O( N) vector model.
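
    For reference, the double-trace deformation discussed above takes the standard schematic form from the large-N literature (our notation, not copied from the paper; f is the flow coupling and O a single-trace scalar):

```latex
\delta S \;=\; \frac{f}{2}\int d^{d}x\; \mathcal{O}^{2}(x),
\qquad
\Delta_{\pm} \;=\; \frac{d}{2} \pm \nu .
```

Along the flow the dimension of O shifts from Δ₋ in the UV to Δ₊ = d − Δ₋ in the IR, which is the field-theory counterpart of the change of boundary conditions in AdS mentioned in the abstract.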

  18. Radiation protection for human exploration of the moon and mars: Application of the mash code system

    International Nuclear Information System (INIS)

    Johnson, J.O.; Santoro, R.T.; Drischler, J.D.; Barnes, J.M.

    1992-01-01

    The Monte Carlo Adjoint Shielding code system -- MASH, developed for the Department of Defense for calculating radiation protection factors for armored vehicles against neutron and gamma radiation, has been used to assess the dose from reactor radiation to an occupant in a habitat on Mars. The capability of MASH to reproduce measured data is summarized to demonstrate the accuracy of the code. The estimation of the radiation environment in an idealized reactor-habitat model is reported to illustrate the merits of the adjoint Monte Carlo procedure for space related studies. The reactor radiation dose for different reactor-habitat surface configurations to a habitat occupant is compared with the natural radiation dose acquired during a 500-day Mars mission

  19. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahnik, Stepan; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Bruening, Jovita; Calhoun-Sauls, Ann; Chagnon, Elizabeth; Callahan, Shannon P.; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Cillessen, Linda; Christopherson, Cody D.; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Cohn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Hartgerink, Chris; Krijnen, Job; Nuijten, Michele B.; van 't Veer, Anna E.; Van Aert, Robbie; van Assen, M.A.L.M.; Wissink, Joeri; Zeelenberg, Marcel

    2015-01-01

    INTRODUCTION Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research

  20. Trace metals, melanin-based pigmentation and their interaction influence immune parameters in feral pigeons (Columba livia).

    Science.gov (United States)

    Chatelain, M; Gasparini, J; Frantz, A

    2016-04-01

    Understanding the effects of trace metals emitted by anthropogenic activities on wildlife is of great concern in urban ecology; yet, information on how they affect individuals, populations, communities and ecosystems remains scarce. In particular, trace metals may impact survival by altering the immune system response to parasites. Plumage melanin is assumed to influence the effects of trace metals on immunity owing to its ability to bind metal ions in feathers and its synthesis being coded by a pleiotropic gene. We thus hypothesized that trace metal exposure would interact with plumage colouration in shaping immune response. We experimentally investigated the interactive effect between exposure to an environmentally relevant range of zinc and/or lead and melanin-based plumage colouration on components of the immune system in feral pigeons (Columba livia). We found that zinc increased anti-keyhole limpet hemocyanin (KLH) IgY primary response maintenance, buffered the negative effect of lead on anti-KLH IgY secondary response maintenance and tended to increase T-cell mediated phytohaemagglutinin (PHA) skin response. Lead decreased the peak of the anti-KLH IgY secondary response. In addition, pheomelanic pigeons exhibited a higher secondary anti-KLH IgY response than did eumelanic ones. Finally, T-cell mediated PHA skin response decreased with increasing plumage eumelanin level of birds exposed to lead. Neither treatments nor plumage colouration correlated with endoparasite intensity. Overall, our study points out the effects of trace metals on some parameters of birds' immunity, independently from other confounding urbanization factors, and underlines the need to investigate their impacts on other life history traits and their consequences in the ecology and evolution of host-parasite interactions.

  1. TCP Packet Trace Analysis. M.S. Thesis

    Science.gov (United States)

    Shepard, Timothy J.

    1991-01-01

    Examination of a trace of packets collected from the network is often the only method available for diagnosing protocol performance problems in computer networks. This thesis explores the use of packet traces to diagnose performance problems of the transport protocol TCP. Unfortunately, manual examination of these traces can be so tedious that effective analysis is not possible. The primary contribution of this thesis is a graphical method of displaying the packet trace which greatly reduces the tediousness of examining a packet trace. The graphical method is demonstrated by the examination of some packet traces of typical TCP connections. The performance of two different implementations of TCP sending data across a particular network path is compared. Traces many thousands of packets long are used to demonstrate how effectively the graphical method simplifies examination of long complicated traces. In the comparison of the two TCP implementations, the burstiness of the TCP transmitter appeared to be related to the achieved throughput. A method of quantifying this burstiness is presented and its possible relevance to understanding the performance of TCP is discussed.
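    The burstiness metric itself is not detailed in this record. As an illustration only (a hedged sketch, not Shepard's actual method), one simple choice is the coefficient of variation of inter-packet gaps computed from (time, cumulative bytes) samples of a trace:

```python
def throughput(trace):
    """Average throughput from a sorted list of (time_s, cumulative_bytes)."""
    (t0, b0), (t1, b1) = trace[0], trace[-1]
    return (b1 - b0) / (t1 - t0)

def burstiness(trace):
    """Coefficient of variation of inter-packet gaps: 0 = perfectly paced."""
    times = [t for t, _ in trace]
    gaps = [b - a for a, b in zip(times, times[1:])]
    mean = sum(gaps) / len(gaps)
    var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
    return var ** 0.5 / mean

smooth = [(0.1 * i, 1460 * i) for i in range(11)]      # evenly paced sender
bursty = [(0.00, 0), (0.01, 1460), (0.02, 2920),       # three-packet bursts
          (1.00, 4380), (1.01, 5840), (1.02, 7300),    # separated by long idle
          (2.00, 8760)]
```

    A perfectly paced sender scores near 0, while a sender emitting back-to-back bursts separated by idle periods scores well above 1.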

  2. Magnet stability and reproducibility

    CERN Document Server

    Marks, N

    2010-01-01

    Magnet stability and reproducibility have become increasingly important as greater precision and beams with smaller dimensions are required for research, medical and other purposes. The observed causes of mechanical and electrical instability are introduced and the engineering arrangements needed to minimize these problems are discussed; the resulting performance of a state-of-the-art synchrotron source (Diamond) is then presented. The need for orbit feedback to obtain the best possible beam stability is briefly introduced, omitting details of the necessary technical equipment, which are outside the scope of the presentation.

  3. Classification of conductance traces with recurrent neural networks

    Science.gov (United States)

    Lauritzen, Kasper P.; Magyarkuti, András; Balogh, Zoltán; Halbritter, András; Solomon, Gemma C.

    2018-02-01

    We present a new automated method for structural classification of the traces obtained in break junction experiments. Using recurrent neural networks trained on the traces of minimal cross-sectional area in molecular dynamics simulations, we successfully separate the traces into two classes: point contact or nanowire. This is done without any assumptions about the expected features of each class. The trained neural network is applied to experimental break junction conductance traces, and it separates the classes as well as the previously used experimental methods. The effect of using partial conductance traces is explored, and we show that the method performs equally well using full or partial traces (as long as the trace just prior to breaking is included). When only the initial part of the trace is included, the results are still better than random chance. Finally, we show that the neural network classification method can be used to classify experimental conductance traces without using simulated results for training, but instead training the network on a few representative experimental traces. This offers a tool to recognize some characteristic motifs of the traces, which can be hard to find by simple data selection algorithms.
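    As an architectural sketch only (pure Python, untrained random weights; the actual work trains recurrent networks on simulated traces), a minimal Elman-style recurrent network that reads a conductance trace one sample at a time and emits a class probability might look like this:

```python
import math
import random

random.seed(0)
H = 4  # hidden units

# Random, untrained weights: this illustrates the forward pass only.
Wx = [random.uniform(-0.5, 0.5) for _ in range(H)]                      # input -> hidden
Wh = [[random.uniform(-0.5, 0.5) for _ in range(H)] for _ in range(H)]  # hidden -> hidden
Wo = [random.uniform(-0.5, 0.5) for _ in range(H)]                      # hidden -> output

def classify(trace):
    """Run the recurrence over the trace; return P(class = 'nanowire')."""
    h = [0.0] * H
    for x in trace:  # one conductance sample per time step
        h = [math.tanh(Wx[i] * x + sum(Wh[i][j] * h[j] for j in range(H)))
             for i in range(H)]
    z = sum(Wo[i] * h[i] for i in range(H))
    return 1.0 / (1.0 + math.exp(-z))  # logistic read-out on the final state

p = classify([1.0, 0.8, 0.5, 0.2, 0.05])  # a toy breaking trace
```

    In practice the weights would be trained (e.g. by backpropagation through time) on labeled simulated traces before the network is applied to experimental ones.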

  4. Béatrice Galinon-Mélénec (dir., L’Homme trace : perspectives anthropologiques des traces contemporaines

    Directory of Open Access Journals (Sweden)

    Annick Monseigne

    2013-04-01

    On the front cover is a photograph that marked a turning point in the history of twentieth-century man: the footprint of the first step of a man on the Moon. Two opening chapters (pp. 15-55), in which the editor of the volume raises the questions of the establishment of symbolic traces, the polysemy of traces, the absence of a trace as an index, the role of indexing in resolving the polysemy of signs, and finally the stakes of the traceability of traces, open the questions relating to ...

  5. Reproducibility of scoring emphysema by HRCT

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R. [Kuopio Univ. Hospital (Finland). Dept. of Clinical Radiology; Erkinjuntti-Pekkanen, R. [Kuopio Univ. Hospital (Finland). Dept. of Pulmonary Diseases

    2002-04-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver kappa values for all three methods were good (0.51-0.74). Interobserver kappa values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.
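    Observer agreement scores in this range are conventionally Cohen's kappa values; a minimal sketch of the statistic follows (the severity categories and ratings below are invented for illustration):

```python
from collections import Counter

def cohen_kappa(a, b):
    """Chance-corrected agreement between two raters' label sequences."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n               # observed agreement
    ca, cb = Counter(a), Counter(b)
    p_exp = sum(ca[k] * cb[k] for k in set(a) | set(b)) / n**2  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

rater1 = ["mild", "mild", "moderate", "severe", "mild", "moderate"]
rater2 = ["mild", "moderate", "moderate", "severe", "mild", "mild"]
kappa = cohen_kappa(rater1, rater2)
```

    Kappa discounts the agreement expected by chance from the raters' marginal label frequencies, which is why two raters can agree on most images yet still score well below 1.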

  6. Additive Manufacturing: Reproducibility of Metallic Parts

    Directory of Open Access Journals (Sweden)

    Konda Gokuldoss Prashanth

    2017-02-01

    The present study deals with the properties of five different metals/alloys fabricated by selective laser melting: Al-12Si, Cu-10Sn and 316L (face-centered cubic structure), and CoCrMo and commercially pure Ti (CP-Ti) (hexagonal close-packed structure). The room temperature tensile properties of Al-12Si samples show good consistency in results within the experimental errors. Similar reproducible results were observed for sliding wear and corrosion experiments. The other metal/alloy systems also show repeatable tensile properties, with the tensile curves overlapping until the yield point. The curves may then follow the same path or show a marginal deviation (~10 MPa) until they reach the ultimate tensile strength, and a negligible difference in ductility levels (of ~0.3%) is observed between the samples. The results show that selective laser melting is a reliable fabrication method to produce metallic materials with consistent and reproducible properties.

  7. Development of a numerical code for the prediction of the long-term behavior of the underground facilities for the high-level radioactive waste disposal

    International Nuclear Information System (INIS)

    Sawada, Masataka; Okada, Tetsuji; Hasegawa, Takuma

    2006-01-01

    Complex phenomena arising from coupled thermo-hydro-mechanical behavior will occur in the near field of a geological disposal site for nuclear waste. Development of a numerical method for evaluating such phenomena is important for rational repository design and safety assessment. To achieve this objective, a numerical model is proposed whose equations evaluate the swelling characteristics of buffer materials on the basis of diffuse double layer theory, and a numerical scheme for thermo-hydro-mechanical coupled analysis including the swelling model is constructed. The proposed swelling model can reproduce the behavior observed during both swelling pressure tests and swelling deformation tests. When the developed numerical code is applied to a laboratory heater test using a bentonite specimen, it reproduces the thermal gradient, the distribution of the degree of saturation and the variation of porosity. The developed numerical code will be applied to well-controlled laboratory tests and full-scale in-situ tests in future work. In order to apply it to the various geochemical conditions around the engineered barrier, chemical components will be coupled into the present numerical code. (author)

  8. Study of the noise propagation in PWR with coupled codes

    International Nuclear Information System (INIS)

    Verdu, G.; Garcia-Fenoll, M.; Abarca, A.; Miro, R.; Barrachina, T.

    2011-01-01

    The in-core detectors provide power distribution monitoring signals for the Reactor Protection System (RPS). Advanced fuel management strategies (high exposure) and power upratings for PWR reactor types have led to an increase in the noise amplitude of the detector signals. In the present work, a study of the propagation along the reactor core of a small perturbation in the moderator density, and of its effects on the core power evolution, is presented using the coupled code RELAP5-MOD3.3/PARCSv2.7. The purpose of these studies is to be able to reproduce and analyze the simulated in-core detector signals. (author)

  9. Spoken word recognition without a TRACE

    Science.gov (United States)

    Hannagan, Thomas; Magnuson, James S.; Grainger, Jonathan

    2013-01-01

    How do we map the rapid input of spoken language onto phonological and lexical representations over time? Attempts at psychologically-tractable computational models of spoken word recognition tend either to ignore time or to transform the temporal input into a spatial representation. TRACE, a connectionist model with broad and deep coverage of speech perception and spoken word recognition phenomena, takes the latter approach, using exclusively time-specific units at every level of representation. TRACE reduplicates featural, phonemic, and lexical inputs at every time step in a large memory trace, with rich interconnections (excitatory forward and backward connections between levels and inhibitory links within levels). As the length of the memory trace is increased, or as the phoneme and lexical inventory of the model is increased to a realistic size, this reduplication of time- (temporal position) specific units leads to a dramatic proliferation of units and connections, begging the question of whether a more efficient approach is possible. Our starting point is the observation that models of visual object recognition—including visual word recognition—have grappled with the problem of spatial invariance, and arrived at solutions other than a fully-reduplicative strategy like that of TRACE. This inspires a new model of spoken word recognition that combines time-specific phoneme representations similar to those in TRACE with higher-level representations based on string kernels: temporally independent (time invariant) diphone and lexical units. This reduces the number of necessary units and connections by several orders of magnitude relative to TRACE. Critically, we compare the new model to TRACE on a set of key phenomena, demonstrating that the new model inherits much of the behavior of TRACE and that the drastic computational savings do not come at the cost of explanatory power. PMID:24058349
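    One simple instance of a string kernel over phoneme sequences is a bag-of-diphones similarity, sketched below; this is illustrative only, not the authors' exact kernel:

```python
from collections import Counter

def diphones(word):
    """Counts of ordered phoneme pairs; here one character = one phoneme."""
    return Counter(word[i:i + 2] for i in range(len(word) - 1))

def kernel(w1, w2):
    """Dot product of diphone count vectors: a time-invariant similarity."""
    d1, d2 = diphones(w1), diphones(w2)
    return sum(d1[k] * d2[k] for k in d1)

sim = kernel("kat", "kats")   # shares the diphones "ka" and "at"
```

    Because the representation is a vector of diphone counts rather than position-specific units, a word matches wherever it occurs in time, which is the kind of invariance the model exploits to avoid TRACE's reduplication of units.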

  10. Calculation code used in criticality analyses for the accident of JCO precipitation tank

    International Nuclear Information System (INIS)

    Miyoshi, Yoshinori

    2000-01-01

    In order to evaluate the nuclear characteristics of the criticality accident that occurred at the nuclear fuel processing facility of JCO, Ltd. (JCO) in Tokai-mura, Ibaraki prefecture, dynamic analyses calculating the power evolution after the onset of the accident, as well as criticality analyses calculating the reactivity added to the precipitation tank, were carried out according to the accident scenario. For the criticality analyses, the continuous-energy Monte Carlo code MCNP was used to calculate the reactivity fed into the precipitation tank as accurately as possible, and the SRAC code system was used to calculate the temperature and void reactivity coefficients, the effective delayed neutron fraction βeff, and the prompt neutron generation time, the parameters governing the transient behavior of a criticality accident. For the dynamic analyses, which had to account for the volume expansion of the solution fuel acting as the heat source and for the radiolytic gas forming in the solution, the power behavior, number of fissions, and so forth in the initial burst were calculated using the TRACE code and a quasi-steady-state code, centered on AGNES-2, whose development is being promoted at JAERI. This report outlines the calculation codes used for the evaluation of the nuclear characteristics and gives an analysis example. (G.K.)
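    A burst calculation of this kind rests on the point-reactor kinetics equations; the sketch below integrates a one-delayed-group version with explicit Euler (parameter values are illustrative, not those of the JCO analysis):

```python
def point_kinetics(rho, beta=0.0065, lam=0.08, Lambda=1e-4,
                   n0=1.0, dt=1e-4, steps=20000):
    """One-delayed-group point kinetics:
         dn/dt = ((rho - beta) / Lambda) * n + lam * C
         dC/dt = (beta / Lambda) * n - lam * C
       Returns the relative power after steps * dt seconds."""
    n = n0
    C = beta * n0 / (lam * Lambda)   # precursors in equilibrium with n0
    for _ in range(steps):
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n, C = n + dt * dn, C + dt * dC
    return n

steady = point_kinetics(rho=0.0)       # critical: power stays flat
excursion = point_kinetics(rho=0.001)  # delayed-supercritical: power rises
```

    Explicit Euler is adequate for a sketch; a production burst code must also couple the kinetics to the feedback effects this record names (solution-fuel expansion, radiolytic gas), which this toy omits.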

  11. Preliminary Numerical Analysis of Convective Heat Transfer Loop Using MARS Code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yongjae; Seo, Gwang Hyeok; Jeun, Gyoodong; Kim, Sung Joong [Hanyang Univ., Seoul (Korea, Republic of)

    2014-05-15

    The MARS code has been developed by adopting two major modules: RELAP5/MOD3 (USA), a one-dimensional (1D) two-fluid model for two-phase flows, and the COBRA-TF code, a three-dimensional (3D), two-fluid, three-field model. Besides MARS, TRACE (USA) is a modernized thermal-hydraulics code designed to consolidate and extend the capabilities of the NRC's three legacy safety codes: TRAC-P, TRAC-B and RELAP. CATHARE (France) is also a thermal-hydraulic system analysis code for pressurized water reactor (PWR) safety. Several studies have compared experimental data with simulation results from the MARS code. Kang et al. conducted natural convection heat transfer experiments on a liquid gallium loop, and the experimental data were compared with MARS simulations. Bang et al. examined the capability of the MARS code to predict condensation heat transfer experiments with a vertical tube containing a non-condensable gas. Moreover, Lee et al. adopted MELCOR, one of the severe accident analysis codes, to evaluate several strategies for severe accident mitigation. The objective of this study is to conduct a preliminary numerical analysis of the experimental loop at HYU using the MARS code, especially in order to provide relevant information on upcoming experiments for the undergraduate students. In this study, the preliminary numerical analysis of the convective heat transfer loop was carried out using the MARS code. The major findings from the numerical simulations can be summarized as follows. In the calculations of the outlet and surface temperatures, several limitations were identified for the upcoming single-phase flow experiments. The comparison of the heat transfer coefficients (HTCs) shows the validity of the prepared input model. This input could give useful information on the experiments. Furthermore, the undergraduate students in the department of nuclear engineering, who are going to take part in the experiments, could prepare the program with the input, and will

  12. Preliminary Numerical Analysis of Convective Heat Transfer Loop Using MARS Code

    International Nuclear Information System (INIS)

    Lee, Yongjae; Seo, Gwang Hyeok; Jeun, Gyoodong; Kim, Sung Joong

    2014-01-01

    The MARS code has been developed by adopting two major modules: RELAP5/MOD3 (USA), a one-dimensional (1D) two-fluid model for two-phase flows, and the COBRA-TF code, a three-dimensional (3D), two-fluid, three-field model. Besides MARS, TRACE (USA) is a modernized thermal-hydraulics code designed to consolidate and extend the capabilities of the NRC's three legacy safety codes: TRAC-P, TRAC-B and RELAP. CATHARE (France) is also a thermal-hydraulic system analysis code for pressurized water reactor (PWR) safety. Several studies have compared experimental data with simulation results from the MARS code. Kang et al. conducted natural convection heat transfer experiments on a liquid gallium loop, and the experimental data were compared with MARS simulations. Bang et al. examined the capability of the MARS code to predict condensation heat transfer experiments with a vertical tube containing a non-condensable gas. Moreover, Lee et al. adopted MELCOR, one of the severe accident analysis codes, to evaluate several strategies for severe accident mitigation. The objective of this study is to conduct a preliminary numerical analysis of the experimental loop at HYU using the MARS code, especially in order to provide relevant information on upcoming experiments for the undergraduate students. In this study, the preliminary numerical analysis of the convective heat transfer loop was carried out using the MARS code. The major findings from the numerical simulations can be summarized as follows. In the calculations of the outlet and surface temperatures, several limitations were identified for the upcoming single-phase flow experiments. The comparison of the heat transfer coefficients (HTCs) shows the validity of the prepared input model. This input could give useful information on the experiments. Furthermore, the undergraduate students in the department of nuclear engineering, who are going to take part in the experiments, could prepare the program with the input, and will

  13. Computer program for optical systems ray tracing

    Science.gov (United States)

    Ferguson, T. J.; Konn, H.

    1967-01-01

    Program traces rays of light through optical systems consisting of up to 65 different optical surfaces and computes the aberrations. For design purposes, paraxial tracings with astigmatism and third-order tracings are provided.
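    The core of such a program is the paraxial y-u trace, alternating free-space transfer with refraction at each surface; a minimal sketch follows (illustrative, not the NASA program itself):

```python
def trace_ray(y, u, surfaces, n0=1.0):
    """Paraxial trace of ray height y and angle u through a surface list.

    surfaces: list of (t, R, n2) = distance to the surface, surface radius,
    and refractive index after the surface.  Refraction uses the paraxial
    relation n2*u2 = n1*u1 - y*(n2 - n1)/R.
    """
    n = n0
    for t, R, n2 in surfaces:
        y = y + u * t                          # transfer to the surface
        u = (n * u - y * (n2 - n) / R) / n2    # paraxial refraction
        n = n2
    return y, u

# Thin biconvex lens in air: R1 = +100, R2 = -100, n = 1.5, zero thickness.
# Lensmaker's equation: 1/f = (1.5 - 1) * (1/100 + 1/100), so f = 100.
y, u = trace_ray(10.0, 0.0, [(0.0, 100.0, 1.5), (0.0, -100.0, 1.0)])
focal_distance = -y / u   # where the initially parallel ray crosses the axis
```

    The initially parallel ray crosses the axis at the focal length predicted by the lensmaker's equation, which is a handy sanity check for any paraxial tracer.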

  14. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low-density parity-check accumulate (LDPCA) codes in this scenario.
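    The rate-adaptation idea can be illustrated without the BCH machinery (this toy is not the paper's construction): the decoder holds Y and keeps requesting segment parities of X over the feedback channel, so the number of requests adapts to how well Y already matches X. An even number of errors inside a segment would slip through this toy's parity check, so the example uses a single mismatch:

```python
def parity(bits, lo, hi):
    return sum(bits[lo:hi]) % 2

def decode_with_feedback(x, y):
    """Recover x from side information y by binary-searching parity mismatches."""
    z, queries = list(y), 0

    def fix(lo, hi):
        nonlocal queries
        queries += 1                 # one parity request over the feedback channel
        if parity(x, lo, hi) == parity(z, lo, hi):
            return                   # segment parities agree: leave it
        if hi - lo == 1:
            z[lo] ^= 1               # isolated mismatch located: flip it
            return
        mid = (lo + hi) // 2
        fix(lo, mid)
        fix(mid, hi)

    fix(0, len(x))
    return z, queries

x = [1, 0, 1, 1, 0, 0, 1, 0]
y = [1, 0, 1, 1, 0, 1, 1, 0]   # side information: x with bit 5 flipped
decoded, requests = decode_with_feedback(x, y)
```

    With high correlation (few mismatches), only a handful of parities are requested; more mismatches trigger more requests, which is the rate-adaptive behavior the feedback channel buys.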

  15. Dugong: a Docker image, based on Ubuntu Linux, focused on reproducibility and replicability for bioinformatics analyses.

    Science.gov (United States)

    Menegidio, Fabiano B; Jabes, Daniela L; Costa de Oliveira, Regina; Nunes, Luiz R

    2018-02-01

    This manuscript introduces and describes Dugong, a Docker image based on Ubuntu 16.04, which automates installation of more than 3500 bioinformatics tools (along with their respective libraries and dependencies), in alternative computational environments. The software operates through a user-friendly XFCE4 graphic interface that allows software management and installation by users not fully familiarized with the Linux command line and provides the Jupyter Notebook to assist in the delivery and exchange of consistent and reproducible protocols and results across laboratories, assisting in the development of open science projects. Source code and instructions for local installation are available at https://github.com/DugongBioinformatics, under the MIT open source license. Luiz.nunes@ufabc.edu.br. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  16. Validation of CATHARE 3D code against UPTF TRAM C3 transients

    International Nuclear Information System (INIS)

    Glantz, Tony; Freitas, Roberto

    2007-01-01

    Within nuclear reactor safety analysis, one of the events that could potentially lead to a recriticality accident in case of a small-break LOCA (SBLOCA) in a pressurized water reactor (PWR) is a boron dilution scenario followed by a coolant mixing transient. Some UPTF experiments can be interpreted as generic boron dilution experiments. The UPTF experiments were originally designed for separate-effects studies focused on multi-dimensional thermal-hydraulic phenomena, but in the TRAM experimental program some tests (the C3 series) address boron mixing. Some of these tests have been used for the validation and assessment of the 3D module of the CATHARE code. The results are very satisfactory: the CATHARE 3D code is able to reproduce correctly the main features of the UPTF TRAM C3 tests, namely the temperature mixing in the cold leg, the formation of a strong stratification in the upper downcomer, the perfectly mixed temperature in the lower downcomer and the strong stratification in the lower plenum. These results are also compared with the CFX-5 and TRIO-U code results for these tests. (author)

  17. Self-complementary circular codes in coding theory.

    Science.gov (United States)

    Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz

    2018-04-01

    Self-complementary circular codes are involved in pairing genetic processes. A maximal [Formula: see text] self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16 2017, J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph [Formula: see text] associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of the graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size [Formula: see text] trinucleotides, in particular for maximal circular codes ([Formula: see text] trinucleotides). For codes of small-size [Formula: see text] trinucleotides, some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.
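    In the spirit of the graph approach of Fimmel et al. (2016), the sketch below builds the directed graph of a trinucleotide code (edges b1 → b2b3 and b1b2 → b3 for each trinucleotide b1b2b3) and tests circularity as acyclicity, alongside the self-complementarity check:

```python
COMP = {"A": "T", "T": "A", "C": "G", "G": "C"}

def revcomp(t):
    """Reversed complement of a trinucleotide."""
    return "".join(COMP[b] for b in reversed(t))

def is_self_complementary(X):
    return {revcomp(t) for t in X} == set(X)

def is_circular(X):
    """A trinucleotide code is circular iff its associated graph is acyclic."""
    edges = {}
    for t in X:
        edges.setdefault(t[0], set()).add(t[1:])   # b1   -> b2b3
        edges.setdefault(t[:2], set()).add(t[2])   # b1b2 -> b3

    def has_cycle(v, path):
        if v in path:
            return True
        return any(has_cycle(w, path | {v}) for w in edges.get(v, ()))

    return not any(has_cycle(v, frozenset()) for v in edges)
```

    For example, {AAC, GTT} is self-complementary (GTT is the reversed complement of AAC) and circular, while {AAA} is not circular: its graph contains the cycle A → AA → A.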

  18. Accurate determination of trace amounts of phosphorus in geological samples by inductively coupled plasma atomic emission spectrometry with ion-exchange separation

    International Nuclear Information System (INIS)

    Asoh, Kazuya; Ebihara, Mitsuru

    2013-01-01

    Highlights: • We set up an effective ICP-AES procedure for determining trace P in rock samples. • Some certified values of P for reference rock samples were proved to be doubtful. • Accurate and reliable data were presented for a suite of geological reference rocks. -- Abstract: In order to determine trace amounts of phosphorus in geological and cosmochemical rock samples, simple and reliable analytical schemes using an ICP-AES instrument were investigated. A conventional ICP-AES procedure could determine phosphorus contents at the level of several hundred μg g⁻¹ with reasonable reproducibility (… μg g⁻¹; 1σ). An ICP-AES procedure coupled with matrix separation using cation and anion exchange resins could lower the quantification level down to 1 μg g⁻¹ or even lower under the present experimental conditions. The matrix-separation ICP-AES procedure developed in this study was applied to twenty-one geological reference samples issued by the Geological Survey of Japan. The obtained values vary from 1250 μg g⁻¹ for JB-3 (basalt) to 2.07 μg g⁻¹ for JCt-1 (carbonate). Matrix-separation ICP-AES yielded reasonable reproducibility (less than 8.3%; 1σ) for three replicate analyses of all the samples analyzed. In comparing our data with certified values as well as literature or reported values, there appears to be a large discrepancy between our values and certified/reported values regardless of phosphorus content. Based on the reproducibility of our data and the analytical capability of the matrix-separation ICP-AES procedure developed in this study (in terms of quantification limit, recovery, selectivity of the analyte through the pre-concentration process, etc.), it is concluded that the certified values for several reference standard rocks should be reevaluated and revised accordingly. It may further be pointed out that some phosphorus data reported in the literature should be critically evaluated when they are to be

  19. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross-correlation (CC) and practical code length that can support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also suppress the effect of phase-induced intensity noise (PIIN). In this paper, we propose new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC based on the Jordan block matrix, constructed by simple algebraic means. Four sets of DEU code families based on the code weight W and number of users N for the combinations (even, even), (even, odd), (odd, odd) and (odd, even) are constructed. This gives the DEU code greater flexibility in the selection of code weight and number of users. These features make this code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms reported codes. In addition, simulation results taken from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU has better performance and can support long spans with high data rates.
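    For background (generic SAC-OCDMA, not the DEU construction itself): the in-phase cross correlation of two 0/1 codewords is the number of chip positions where both carry a 1, and an 'ideal' code keeps it at 1 for every distinct pair. The codewords below are hypothetical weight-3 examples with that property:

```python
def in_phase_cc(c1, c2):
    """In-phase cross correlation: count of chips where both codewords are 1."""
    return sum(a & b for a, b in zip(c1, c2))

# Hypothetical weight-3 codewords over 7 chips, pairwise CC = 1:
codebook = [
    (1, 1, 1, 0, 0, 0, 0),
    (1, 0, 0, 1, 1, 0, 0),
    (1, 0, 0, 0, 0, 1, 1),
]
```

    Holding the pairwise CC fixed at 1 is what allows a balanced-detection SAC receiver to cancel MAI and limit PIIN.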

  20. Melting mode and source lithology inferred from trace element systematics in historical olivine from Lanzarote, Canary Islands.

    Science.gov (United States)

    Gómez-Ulla, Alejandra; Sigmarsson, Olgeir; Guðfinnsson, Guðmundur H.

    2017-04-01

    Trace element concentrations and ratios in olivine phenocrysts, such as fractionation-corrected Ni x (FeO/MgO) and Fe/Mn, have been shown to be useful probes of a pyroxenite-derived component in mixtures of primary mantle melts (e.g. Sobolev et al., 2007). For instance, higher Ni and lower Mn and Ca contents are expected in partial melts of pyroxenite compared to those of lherzolite. We have measured trace element concentrations in olivine from the 1730-1736 AD (Timanfaya) and 1824 AD eruptions in Lanzarote (Canary Islands), which erupted mafic and mantle-nodule-bearing magmas ranging in composition from highly silica-undersaturated basanite through alkali basalt to tholeiite. The early basanite exhibits the largest olivine trace element variation, covering the range of those from MORB and OIB worldwide, whereas the later-erupted tholeiite has values typical of pyroxenite-derived melts. The Fo value decreased systematically with time during the 1730-36 eruption, and the proportion of silica-saturated primary melt increased in the parental magma mixture with time. At the end of the eruption, tholeiite magmas crystallized olivine with increasing concentrations of Mn and Ca and higher Ca/Al at relatively uniform Ni x (FeO/MgO) and Fe/Mn, all of which is readily explained by increased decompression melting at lower temperature. The basanite from the eruption that took place in 1824 AD has olivine with an even higher Fo value and trace element variability similar to those of the Timanfaya basanite. The fact that the Lanzarote basanite contains olivine with trace element systematics spanning those of MORB and pyroxenite melts can be explained by CO2-flux melting of a lithologically heterogeneous source, generating the diverse compositions. Initial reactive porous flow through depleted oceanic lithosphere and equilibration with dunitic restite of percolating pyroxenite melt may have amplified the observed Ni depletion in olivine of the earliest basanite. The fact that olivine compositions and