WorldWideScience

Sample records for tracing code reproducing

  1. The MIMIC Code Repository: enabling reproducibility in critical care research.

    Science.gov (United States)

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
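
    As a minimal illustration of the kind of reproducible concept extraction the repository standardizes (this sketch is ours, not taken from the repository itself), the following Python snippet computes ICU length of stay from the MIMIC-III icustays table; the connection parameters are hypothetical and assume a local PostgreSQL load of MIMIC-III.

      # Hypothetical connection to a local PostgreSQL load of MIMIC-III;
      # schema and table names follow the repository's convention.
      import psycopg2

      QUERY = """
          SELECT icustay_id,
                 EXTRACT(EPOCH FROM (outtime - intime)) / 3600.0 AS los_hours
          FROM mimiciii.icustays
          WHERE outtime IS NOT NULL;
      """

      with psycopg2.connect(dbname="mimic", user="mimicuser") as conn:
          with conn.cursor() as cur:
              cur.execute(QUERY)
              for icustay_id, los_hours in cur.fetchmany(5):
                  print(icustay_id, round(los_hours, 1))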

  2. Particle tracing code for multispecies gas

    International Nuclear Information System (INIS)

    Eaton, R.R.; Fox, R.L.; Vandevender, W.H.

    1979-06-01

    Details are presented for the development of a computer code designed to calculate the flow of a multispecies gas mixture using particle tracing techniques. The current technique eliminates the need for a full simulation by utilizing local time-averaged velocity distribution functions to obtain the dynamic properties for probable collision partners. The development of this concept reduces the statistical scatter experienced in conventional Monte Carlo simulations. The technique is applicable to flow problems involving gas mixtures with disparate masses and trace constituents in the Knudsen number, Kn, range from 1.0 to less than 0.01. The resulting code has previously been used to analyze several aerodynamic isotope enrichment devices.

  3. The Alba ray tracing code: ART

    Science.gov (United States)

    Nicolas, Josep; Barla, Alessandro; Juanhuix, Jordi

    2013-09-01

    The Alba ray tracing code (ART) is a suite of Matlab functions and tools for the ray tracing simulation of x-ray beamlines. The code is structured in different layers, which allow its usage as part of optimization routines as well as easy control from a graphical user interface. Additional tools for slope error handling and for grating efficiency calculations are also included. Generic characteristics of ART include the accumulation of rays to improve statistics without memory limitations, while still providing normalized values of flux and resolution in physically meaningful units.
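
    The accumulation-of-rays idea can be pictured as histogramming successive batches of rays so that statistics grow while memory stays bounded. The sketch below is illustrative only (ART itself is Matlab; a Gaussian stand-in replaces the traced beamline):

      import numpy as np

      rng = np.random.default_rng(0)
      bins = np.linspace(-5e-3, 5e-3, 201)       # detector coordinate (m)
      counts = np.zeros(bins.size - 1)

      n_batches, rays_per_batch = 100, 100_000
      for _ in range(n_batches):
          # stand-in for tracing one batch of rays through the beamline model
          x = rng.normal(0.0, 1e-3, rays_per_batch)
          counts += np.histogram(x, bins)[0]

      total = n_batches * rays_per_batch
      flux_per_bin = counts / total              # normalized flux distribution
      print("peak bin fraction:", flux_per_bin.max())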

  4. Papa, a Particle Tracing Code in Pascal

    NARCIS (Netherlands)

    Haselhoff, E.H.; Haselhoff, Eltjo H.; Ernst, G.J.

    1990-01-01

    During the design of a 10 μm high-gain FEL oscillator (TEUFEL Project) we developed a new particle-tracing code to perform simulations of thermionic- and photo-cathode electron injectors/accelerators. The program allows predictions of current, energy and beam emittance in a user-specified linac.

  5. Assessment of TRACE code against CHF experiments

    International Nuclear Information System (INIS)

    Audrius Jasiulevicius; Rafael Macian-Juan; Paul Coddington

    2005-01-01

    This paper reports on the validation of the USNRC 'consolidated' code TRACE with data obtained during Critical Heat Flux (CHF) experiments in single channels and round and annular tubes. CHF is one of the key reactor safety parameters, because it determines the conditions for the onset of transition boiling in the core rod bundles, leading to the low heat transfer rates characteristic of the post-CHF heat transfer regime. In the context of the participation of PSI in the international programme for uncertainty analysis BEMUSE, we have carried out extensive work for the validation of some important TRACE models. The present work is aimed at assessing the range of validity of the CHF correlations and post-CHF heat transfer models currently included in TRACE. The heat transfer experiments selected for the assessment were performed at the Royal Institute of Technology (RIT) in Stockholm, Sweden and at the Atomic Energy Establishment in Winfrith, UK. The experimental investigations of CHF and post-CHF heat transfer at RIT, for flow of water in vertical tubes and an annulus, were performed at pressures ranging from 1 to 20 MPa and coolant mass fluxes from 500 to 3000 kg/m²s. The liquid was subcooled by 10 deg. C and 40 deg. C at the inlet of the test section. The experiments were performed on two different types of test sections. Experiments with uniformly heated single 7.0 m long tubes were carried out with three different inner tube diameters of 10, 14.9 and 24.7 mm. A series of experiments with non-uniform axial power distribution were also conducted in order to study the effect of the axial heat flux distribution on the CHF conditions, in both 7.0 m long single tubes and a 3.65 m long annulus. Several different axial power profiles were employed, with bottom, middle and top power peaks as well as double-humped axial power profiles. In total, more than 100 experiments with uniform axial heat flux distribution and several hundreds

  6. Perspective on the audit calculation for SFR using TRACE code

    Energy Technology Data Exchange (ETDEWEB)

    Shin, An Dong; Choi, Yong Won; Bang, Young Suk; Bae, Moo Hoon; Huh, Byung Gil; Seol, Kwang One [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2012-10-15

    The Korean Sodium Cooled Fast Reactor (SFR) is being developed by KAERI. The prototype SFR will be the first SFR submitted for licensing. KINS recently started research programs to prepare for the licensing of new design concepts. Safety analysis for a given reactor is based on computational estimation with conservatism and/or modeling uncertainty. For the audit calculation for the sodium cooled fast reactor (SFR), the TRACE code is considered as one of the analytical tools, since TRACE already includes sodium-related properties and models and has been applied abroad to liquid metal coolant systems. The applicability of the TRACE code to the SFR is checked before the actual audit calculation. In this study, Demonstration Fast Reactor (DFR) 600 steady-state conditions are simulated to identify areas for modeling improvements in the TRACE code.

  7. Evaluation of ATLAS 100% DVI Line Break Using TRACE Code

    International Nuclear Information System (INIS)

    Huh, Byung Gil; Bang, Young Seok; Cheong, Ae Ju; Woo, Sweng Woong

    2011-01-01

    ATLAS (Advanced Thermal-Hydraulic Test Loop for Accident Simulation) is an integral effect test facility at KAERI. Its installation was completed in 2005 to simulate accidents in the OPR1000 and the APR1400. Since then, several tests for LBLOCA and DVI line break scenarios have been performed successfully to resolve the safety issues of the APR1400. In particular, a DVI line break is considered a distinct part of the SBLOCA spectrum in the APR1400, because the DVI line is directly connected to the reactor vessel and the thermal-hydraulic behaviors are expected to be different from those for cold leg injection. However, there are not enough experimental data for the DVI line break. Therefore, integral effect data for the DVI line break from ATLAS are very useful and available for the improvement and validation of safety codes. For the DVI line break in ATLAS, several analyses using the MARS and RELAP codes were performed in the ATLAS DSP (Domestic Standard Problem) meetings. However, the TRACE code had not yet been used to simulate a DVI line break. TRACE has been developed as the unified code for reactor thermal-hydraulic analyses at the USNRC. In this study, the 100% DVI line break in ATLAS was evaluated with the TRACE code. The objective of this study is to identify the prediction capability of the TRACE code for the major thermal-hydraulic phenomena of a DVI line break in ATLAS.

  8. Simulation of the turbine trip transient with the code Trace

    International Nuclear Information System (INIS)

    Mejia S, D. M.; Filio L, C.

    2014-10-01

    In this paper the results of the simulation of the turbine trip transient that occurred in Unit 1 of the Laguna Verde nuclear power plant (NPP-LV) are shown, carried out with the model of this unit developed for the best-estimate code Trace. The results obtained with the code Trace are compared with those obtained from the Process Information Integral System (PIIS) of the NPP-LV. The reactor pressure, the level behavior in the downcomer, the steam flow and the flow rate through the recirculation circuits are compared. The results of the simulation for the operating power of 2027 MWt show concordance with the PIIS system. (Author)

  9. Documentation for TRACE: an interactive beam-transport code

    International Nuclear Information System (INIS)

    Crandall, K.R.; Rusthoi, D.P.

    1985-01-01

    TRACE is an interactive, first-order, beam-dynamics computer program. TRACE includes space-charge forces and mathematical models for a number of beamline elements not commonly found in beam-transport codes, such as permanent-magnet quadrupoles, rf quadrupoles, rf gaps, accelerator columns, and accelerator tanks. TRACE provides an immediate graphic display of calculated results, has a powerful and easy-to-use command procedure, includes eight different types of beam-matching or -fitting capabilities, and contains its own internal HELP package. This report describes the models and equations used for each of the transport elements, the fitting procedures, and the space-charge/emittance calculations, and provides detailed instructions for using the code.
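
    As a hedged illustration of the first-order formalism such a code implements (not TRACE's own source), each element reduces to a transfer matrix acting on the beam sigma matrix; the element parameters below are arbitrary and space charge is neglected:

      import numpy as np

      def drift(L):
          # first-order horizontal transfer matrix of a drift of length L (m)
          return np.array([[1.0, L], [0.0, 1.0]])

      def thin_quad(f):
          # thin-lens quadrupole of focal length f (m); f > 0 focuses
          return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

      # toy beamline: drift - quadrupole - drift (space charge neglected)
      M = drift(0.5) @ thin_quad(0.8) @ drift(0.5)

      # propagate a beam sigma matrix (units m^2, m*rad, rad^2)
      sigma0 = np.diag([1e-6, 1e-6])
      sigma1 = M @ sigma0 @ M.T
      print("rms beam size at exit (m):", np.sqrt(sigma1[0, 0]))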

  11. Toward Reproducible Computational Research: An Empirical Analysis of Data and Code Policy Adoption by Journals.

    Science.gov (United States)

    Stodden, Victoria; Guo, Peixuan; Ma, Zhaokun

    2013-01-01

    Journal policy on research data and code availability is an important part of the ongoing shift toward publishing reproducible computational science. This article extends the literature by studying journal data sharing policies by year (for both 2011 and 2012) for a referent set of 170 journals. We make a further contribution by evaluating code sharing policies, supplemental materials policies, and open access status for these 170 journals for each of 2011 and 2012. We build a predictive model of open data and code policy adoption as a function of impact factor and publisher and find higher impact journals more likely to have open data and code policies and scientific societies more likely to have open data and code policies than commercial publishers. We also find open data policies tend to lead open code policies, and we find no relationship between open data and code policies and either supplemental material policies or open access journal status. Of the journals in this study, 38% had a data policy, 22% had a code policy, and 66% had a supplemental materials policy as of June 2012. This reflects a striking one year increase of 16% in the number of data policies, a 30% increase in code policies, and a 7% increase in the number of supplemental materials policies. We introduce a new dataset to the community that categorizes data and code sharing, supplemental materials, and open access policies in 2011 and 2012 for these 170 journals.
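
    A minimal sketch of the style of predictive model described, with made-up data for illustration only: logistic regression of open-data-policy adoption on impact factor and a society-publisher indicator.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # hypothetical journals: impact factor, society publisher flag, policy flag
      impact_factor = np.array([1.2, 2.5, 4.1, 8.3, 12.0, 30.0, 3.3, 6.7])
      society_pub   = np.array([0,   1,   0,   1,   1,    1,    0,   0])
      has_policy    = np.array([0,   0,   1,   1,   1,    1,    0,   1])

      X = np.column_stack([impact_factor, society_pub])
      model = LogisticRegression().fit(X, has_policy)
      print("coefficients:", model.coef_, "intercept:", model.intercept_)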

  12. Analysis of an XADS Target with the System Code TRACE

    International Nuclear Information System (INIS)

    Jaeger, Wadim; Sanchez Espinoza, Victor H.; Feng, Bo

    2008-01-01

    Accelerator-driven systems (ADS) present an option to reduce the radioactive waste of the nuclear industry. The Experimental Accelerator-Driven System (XADS) has been designed to investigate the feasibility of using ADS on an industrial scale to burn minor actinides. The target section lies in the middle of the subcritical core and is bombarded by a proton beam to produce spallation neutrons. The thermal energy produced by this reaction requires a heat removal system for the target section. The target is cooled by liquid lead-bismuth eutectic (LBE) in the primary system, which in turn transfers the heat via a heat exchanger (HX) to the secondary coolant, Diphyl THT (DTHT), a synthetic diathermic fluid. Since this design is still in development, a detailed investigation of the system is necessary to evaluate its behavior during normal and transient operations. Due to the lack of experimental facilities and data for ADS, the analyses are mostly done using thermal-hydraulic codes. In addition to evaluating the thermal hydraulics of the XADS, this paper also benchmarks TRACE, a new code developed by the NRC, against other established codes. The events used in this study are beam-power switch-on/off transients and a loss-of-heat-sink accident. The results obtained from TRACE were in good agreement with the results of various other codes. (authors)

  13. Evolvix BEST Names for semantic reproducibility across code2brain interfaces.

    Science.gov (United States)

    Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2017-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  14. SolTrace: A Ray-Tracing Code for Complex Solar Optical Systems

    Energy Technology Data Exchange (ETDEWEB)

    Wendelin, Tim [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lewandowski, Allan [Allan Lewandowski Solar Consulting LLC, Evergreen, CO (United States)

    2013-10-01

    SolTrace is an optical simulation tool designed to model optical systems used in concentrating solar power (CSP) applications. The code was first written in early 2003, but has seen significant modifications and changes since its inception, including conversion from a Pascal-based software development platform to C++. SolTrace is unique in that it can model virtually any optical system utilizing the sun as the source. It has been made available for free and as such is in use worldwide by industry, universities, and research laboratories. The fundamental design of the code is discussed, including enhancements and improvements over the earlier version. Comparisons are made with other optical modeling tools, both non-commercial and commercial in nature. Finally, modeled results are shown for some typical CSP systems and, in one case, compared to measured optical data.

  15. A comparison of the reproducibility of manual tracing and on-screen digitization for cephalometric profile variables

    NARCIS (Netherlands)

    Dvortsin, D. P.; Sandham, John; Pruim, G. J.; Dijkstra, P. U.

    2008-01-01

    The aim of this investigation was to analyse and compare the reproducibility of manual cephalometric tracings with on-screen digitization using a soft tissue analysis. A random sample of 20 lateral cephalometric radiographs, in the natural head posture, was selected. On-screen digitization using

  16. Comparative study of boron transport models in NRC Thermal-Hydraulic Code Trace

    Energy Technology Data Exchange (ETDEWEB)

    Olmo-Juan, Nicolás; Barrachina, Teresa; Miró, Rafael; Verdú, Gumersindo; Pereira, Claubia, E-mail: nioljua@iqn.upv.es, E-mail: tbarrachina@iqn.upv.es, E-mail: rmiro@iqn.upv.es, E-mail: gverdu@iqn.upv.es, E-mail: claubia@nuclear.ufmg.br [Institute for Industrial, Radiophysical and Environmental Safety (ISIRYM). Universitat Politècnica de València (Spain); Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2017-07-01

    Recently, interest in the study of various types of transients involving changes in the boron concentration inside the reactor has led to increased interest in developing and studying new models and tools that allow a correct study of boron transport. Accordingly, a significant variety of boron transport models and spatial difference schemes are available in thermal-hydraulic codes such as TRACE. In this work, the results obtained using the different boron transport models implemented in the NRC thermal-hydraulic code TRACE are compared. To do this, a set of models has been created using the different options and configurations that could have an influence on boron transport. These models reproduce a simple event of filling or emptying the boron concentration in a long pipe. Moreover, with the aim of comparing the differences obtained when one-dimensional or three-dimensional components are chosen, many different cases have been modeled using only pipe components or a mix of pipe and vessel components. In addition, the influence of the void fraction on boron transport has been studied and compared under conditions close to a commercial BWR model. Finally, the different cases and boron transport models are compared with each other and with the analytical solution provided by the Burgers equation. From this comparison, important conclusions are drawn that will be the basis for adequately modeling boron transport in TRACE. (author)
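
    To make the modeling issue concrete, the sketch below (our illustration, not TRACE) advects a boron front with a first-order upwind scheme and compares it with the exact step solution; the numerical smearing of the front is the kind of error the different transport models and difference schemes try to limit. The paper's reference solution comes from the Burgers equation; pure advection is used here for brevity.

      import numpy as np

      nx, L, u, dt = 200, 10.0, 1.0, 0.02    # cells, pipe length (m), velocity (m/s), time step (s)
      dx = L / nx                            # CFL = u*dt/dx = 0.4 < 1
      c = np.where(np.arange(nx) * dx < 2.0, 1500.0, 0.0)   # boron (ppm), step profile

      t_end = 4.0
      for _ in range(int(t_end / dt)):
          # first-order upwind update; inlet cell stays borated
          c[1:] -= u * dt / dx * (c[1:] - c[:-1])

      x = (np.arange(nx) + 0.5) * dx
      exact = np.where(x < 2.0 + u * t_end, 1500.0, 0.0)
      print("max |error| near the front (ppm):", np.abs(c - exact).max())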

  17. Non-coding RNA detection methods combined to improve usability, reproducibility and precision

    Directory of Open Access Journals (Sweden)

    Kreikemeyer Bernd

    2010-09-01

    Background: Non-coding RNAs gain more attention as their diverse roles in many cellular processes are discovered. At the same time, the need for efficient computational prediction of ncRNAs increases with the pace of sequencing technology. Existing tools are based on various approaches and techniques, but none of them provides a reliable ncRNA detector yet. Consequently, a natural approach is to combine existing tools. Due to a lack of standard input and output formats, combination and comparison of existing tools is difficult. Also, for genomic scans they often need to be incorporated into detection workflows using custom scripts, which decreases transparency and reproducibility. Results: We developed a Java-based framework to integrate existing tools and methods for ncRNA detection. This framework enables users to construct transparent detection workflows and to combine and compare different methods efficiently. We demonstrate the effectiveness of combining detection methods in case studies with the small genomes of Escherichia coli, Listeria monocytogenes and Streptococcus pyogenes. With the combined method, we gained 10% to 20% precision for sensitivities from 30% to 80%. Further, we investigated Streptococcus pyogenes for novel ncRNAs. Using multiple methods--integrated by our framework--we determined four highly probable candidates. We verified all four candidates experimentally using RT-PCR. Conclusions: We have created an extensible framework for practical, transparent and reproducible combination and comparison of ncRNA detection methods. We have proven the effectiveness of this approach in tests and by guiding experiments to find new ncRNAs. The software is freely available under the GNU General Public License (GPL, version 3) at http://www.sbi.uni-rostock.de/moses along with source code, screen shots, examples and tutorial material.
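
    The combination step can be pictured as interval voting: keep only genomic regions supported by at least k tools, trading sensitivity for precision. The sketch below is our illustration with hypothetical (start, end) predictions, not the framework's Java API:

      def votes(predictions, genome_len):
          # per-base count of how many tools predict an ncRNA at each position
          score = [0] * genome_len
          for tool in predictions:
              for start, end in tool:
                  for i in range(start, end):
                      score[i] += 1
          return score

      def combined(predictions, genome_len, k=2):
          # merge positions supported by at least k tools into regions
          score = votes(predictions, genome_len)
          regions, start = [], None
          for i, s in enumerate(score + [0]):
              if s >= k and start is None:
                  start = i
              elif s < k and start is not None:
                  regions.append((start, i))
                  start = None
          return regions

      tool_a = [(100, 180), (400, 470)]    # hypothetical detector outputs
      tool_b = [(120, 200), (800, 860)]
      print(combined([tool_a, tool_b], 1000))   # -> [(120, 180)]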

  18. Safety related investigations of the VVER-1000 reactor type by the coupled code system TRACE/PARCS

    International Nuclear Information System (INIS)

    Jaeger, Wadim; Lischke, Wolfgang; Sanchez Espinoza, Victor Hugo

    2007-01-01

    This study was performed at the Institute of Reactor Safety at the Research Center Karlsruhe. It is embedded in the ongoing investigations of the international Code Application and Maintenance Program (CAMP) for qualification and validation of system codes like TRACE [1] and PARCS [2]. The reactor type predestined for the validation of these two codes was the Russian-designed VVER-1000, because the OECD/NEA VVER-1000 Coolant Transient Benchmark Phase 2 [3] includes detailed information on the Bulgarian nuclear power plant (NPP) Kozloduy unit 6. The post-test investigations of a coolant mixing experiment have shown that the predicted parameters (coolant temperature, pressure drop, etc.) are in good agreement with the measured data. The coolant mixing pattern, especially in the downcomer, has also been reproduced quite well by TRACE. The coupled code system TRACE/PARCS, which was applied to a postulated main steam line break (MSLB), provides good results compared to reference values and to those of other participants in the benchmark. It can be pointed out that the developed three-dimensional nodalisation of the reactor pressure vessel (RPV) is appropriate for the description of transients where the thermal-hydraulics and the neutronics are strongly linked. (author)

  20. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    Science.gov (United States)

    Springate, David A; Kontopantelis, Evangelos; Ashcroft, Darren M; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David

    2014-01-01

    Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.

  1. Divided multimodal attention: sensory trace and context coding strategies in spatially congruent auditory and visual presentation.

    Science.gov (United States)

    Kristjánsson, Tómas; Thorvaldsson, Tómas Páll; Kristjánsson, Arni

    2014-01-01

    Previous research involving both unimodal and multimodal studies suggests that single-response change detection is a capacity-free process while a discriminatory up or down identification is capacity-limited. The trace/context model assumes that this reflects different memory strategies rather than inherent differences between identification and detection. To perform such tasks, one of two strategies is used, a sensory trace or a context coding strategy, and if one is blocked, people will automatically use the other. A drawback to most preceding studies is that stimuli are presented at separate locations, creating the possibility of a spatial confound, which invites alternative interpretations of the results. We describe a series of experiments, investigating divided multimodal attention, without the spatial confound. The results challenge the trace/context model. Our critical experiment involved a gap before a change in volume and brightness, which according to the trace/context model blocks the sensory trace strategy, simultaneously with a roaming pedestal, which should block the context coding strategy. The results clearly show that people can use strategies other than sensory trace and context coding in the tasks and conditions of these experiments, necessitating changes to the trace/context model.

  2. High fidelity analysis of BWR fuel assembly with COBRA-TF/PARCS and TRACE codes

    International Nuclear Information System (INIS)

    Abarca, A.; Miro, R.; Barrachina, T.; Verdu, G.; Soler, A.

    2013-01-01

    The growing importance of detailed reactor core and fuel assembly descriptions for light water reactors (LWRs), as well as of sub-channel safety analysis, requires high-fidelity models and coupled neutronic/thermal-hydraulic codes. Hand in hand with advances in computer technology, nuclear safety analysis is beginning to use more detailed thermal-hydraulics and neutronics. Previously, a PWR core and a 16 by 16 fuel assembly model were developed to test and validate our COBRA-TF/PARCS v2.7 (CTF/PARCS) coupled code. In this work, the modeling and simulation advantages and disadvantages of the CTF/PARCS and TRACE codes have been compared for a modern 10 by 10 BWR fuel assembly. The objective of the comparison is to make known the main advantages of using sub-channel codes to perform high-resolution nuclear safety analysis. Sub-channel codes like CTF permit accurate predictions, in two-phase flow regimes, of the thermal-hydraulic parameters important to safety, with high local resolution. The modeled BWR fuel assembly has 91 fuel rods (81 full-length and 10 partial-length fuel rods) and a large square central water rod. This assembly has been modeled in high detail with the CTF code, using the BWR modeling parameters provided by TRACE. The same PARCS neutronic model has been used for the simulations with both codes. To compare the codes, a coupled steady-state calculation has been performed. (author)

  3. Simulation of the KAERI PASCAL Test with MARS-KS and TRACE Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Won; Cheong, Aeju; Shin, Andong; Cho, Min Ki [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    In order to validate the operational performance of the PAFS, KAERI has performed an experimental investigation using the PASCAL (PAFS Condensing heat removal Assessment Loop) facility. In this study, we simulated the KAERI PASCAL SS-540-P1 test with the MARS-KS V1.4 and TRACE V5.0 p4 codes to assess the codes' predictability for the condensation heat transfer inside the passive auxiliary feedwater system. The calculated results of heat flux, inner wall surface temperature of the condensing tube, fluid temperature, and steam mass flow rate are compared with the experimental data. The results show that MARS-KS generally under-predicts the heat fluxes. TRACE over-predicts the heat flux at the tube inlet region and under-predicts it at the tube outlet region. The TRACE prediction shows a larger amount of steam condensation, by about 3%, than the MARS-KS prediction.

  4. [Transposition errors during learning to reproduce a sequence by the right- and the left-hand movements: simulation of positional and movement coding].

    Science.gov (United States)

    Liakhovetskiĭ, V A; Bobrova, E V; Skopin, G N

    2012-01-01

    Transposition errors during the reproduction of a hand movement sequence make it possible to obtain important information on the internal representation of this sequence in the motor working memory. Analysis of such errors showed that learning to reproduce sequences of left-hand movements improves the system of positional coding (coding of positions), while learning of right-hand movements improves the system of vector coding (coding of movements). Learning of right-hand movements after left-hand performance involved the system of positional coding "imposed" by the left hand. Learning of left-hand movements after right-hand performance activated the system of vector coding. Transposition errors during learning to reproduce movement sequences can be explained by a neural network using either vector coding or both vector and positional coding.
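
    A toy illustration of the two coding schemes (ours, not the authors' neural network model): a sequence stored as absolute positions versus as displacement vectors. An error in one stored vector shifts every later position of the reproduced path, whereas an error in one stored position stays local.

      positions = [(0, 0), (1, 0), (1, 1), (0, 1)]    # positional code
      # vector code: displacements between successive positions
      vectors = [(b[0] - a[0], b[1] - a[1]) for a, b in zip(positions, positions[1:])]

      def replay(start, vecs):
          # reconstruct positions from a start point and movement vectors
          path = [start]
          for dx, dy in vecs:
              x, y = path[-1]
              path.append((x + dx, y + dy))
          return path

      bad_vectors = [vectors[0], (1, 1)] + vectors[2:]           # one corrupted movement
      bad_positions = positions[:2] + [(9, 9)] + positions[3:]   # one corrupted position
      print(replay(positions[0], bad_vectors))   # error propagates downstream
      print(bad_positions)                       # error stays local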

  5. TRACE/VALKIN: a neutronics-thermohydraulics coupled code to analyze strong 3D transients

    Energy Technology Data Exchange (ETDEWEB)

    Rafael Miro; Gumersindo Verdu; Ana Maria Sanchez [Chemical and Nuclear Engineering Department. Polytechnic University of Valencia. Cami de Vera s/n. 46022 Valencia (Spain); Damian Ginestar [Applied Mathematics Department. Polytechnic University of Valencia. Cami de Vera s/n. 46022 Valencia (Spain)

    2005-07-01

    A nuclear reactor simulator consists mainly of two different blocks, which solve the models used for the basic physical phenomena taking place in the reactor. In this way, there is a neutronic module, which simulates the neutron balance in the reactor core, and a thermal-hydraulics module, which simulates the heat transfer in the fuel, the heat transfer from the fuel to the water, and the different condensation and evaporation processes taking place in the reactor core and in the condenser systems. TRACE is a two-phase, two-fluid thermal-hydraulic reactor systems analysis code. The TRACE acronym stands for TRAC/RELAP Advanced Computational Engine, reflecting its ability to run both RELAP5 and TRAC legacy input models. It includes a three-dimensional kinetics module called PARCS for performing advanced analysis of coupled core thermal-hydraulic/kinetics problems. The TRACE-VALKIN code is a new time-domain analysis code to study transients in LWRs. This code uses the best-estimate code TRACE to account for the heat transfer and thermal-hydraulic processes, and a 3D neutronics module. This module has two options: the MODKIN option, which makes use of a modal method based on the assumption that the neutronic flux can be approximately expanded in terms of the dominant lambda modes associated with a static configuration of the reactor core, and the NOKIN option, which uses a one-step backward discretization of the neutron diffusion equation. The lambda modes are obtained using the Implicitly Restarted Arnoldi approach or the Jacobi-Davidson algorithm. To check the performance of the coupled TRACE-VALKIN code against complex 3D neutronic transients, the Cofrentes NPP SCRAM-61 transient is simulated using the cross-section tables generated with the SIMTAB translator from SIMULATE to TRACE/VALKIN. Cofrentes NPP is a General Electric BWR-6 design located in the Valencia region (Spain). It has been in operation since 1985 and is currently in its fifteenth cycle
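
    A minimal sketch of the lambda-modes computation (our illustration, using an artificial one-group 1-D diffusion operator): the dominant k-eigenpairs are obtained with SciPy's implicitly restarted Arnoldi solver, standing in for the modes a modal kinetics option such as MODKIN expands the flux in.

      import numpy as np
      from scipy.sparse import diags, identity
      from scipy.sparse.linalg import LinearOperator, eigs, splu

      # one-group diffusion on a 1-D slab, zero-flux boundaries; constants illustrative
      n, h = 100, 1.0                          # nodes, node width (cm)
      D, sig_a, nu_sig_f = 1.0, 0.02, 0.025
      lap = diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)) / h**2
      A = -D * lap + sig_a * identity(n)       # loss operator
      lu = splu(A.tocsc())

      # lambda modes: F phi = (1/k) A phi, solved as A^{-1} F phi = k phi
      op = LinearOperator((n, n), matvec=lambda v: nu_sig_f * lu.solve(v))
      vals, vecs = eigs(op, k=3, which="LM")   # implicitly restarted Arnoldi
      print("dominant k-eigenvalues:", np.sort(vals.real)[::-1])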

  6. Assessment of GOTHIC and TRACE codes against selected PANDA experiments on a Passive Containment Condenser

    Energy Technology Data Exchange (ETDEWEB)

    Papini, Davide, E-mail: davide.papini@psi.ch; Adamsson, Carl; Andreani, Michele; Prasser, Horst-Michael

    2014-10-15

    Highlights: • Code comparison on the performance of a Passive Containment Condenser. • Simulation of separate effect tests with pure steam and non-condensable gases. • Role of the secondary side and accuracy of pool boiling models are discussed. • GOTHIC and TRACE predict the experimental performance with slight underestimation. • Recirculatory flow pattern with injection of light non-condensable gas is inferred. - Abstract: Typical passive safety systems for ALWRs (Advanced Light Water Reactors) rely on the condensation of steam to remove the decay heat from the core or the containment. In the present paper the three-dimensional containment code GOTHIC and the one-dimensional system code TRACE are compared on the calculation of a variety of phenomena characterizing the response of a passive condenser submerged in a boiling pool. The investigation addresses the conditions of interest for the Passive Containment Cooling System (PCCS) proposed for the ESBWR (Economic Simplified Boiling Water Reactor). The analysis of selected separate effect tests carried out on a PCC (Passive Containment Condenser) unit in the PANDA large-scale thermal-hydraulic facility is presented to assess the code predictions. Both pure steam conditions (operating pressure of 3 bar, 6 bar and 9 bar) and the effect on the condensation heat transfer of non-condensable gases heavier than steam (air) and lighter than steam (helium) are considered. The role of the secondary side (pool side) heat transfer on the condenser performance is examined too. In general, this study shows that both the GOTHIC and TRACE codes are able to reasonably predict the heat transfer capability of the PCC as well as the influence of non-condensable gas on the system. A slight underestimation of the condenser performance is obtained with both codes. For those tests where the experimental and simulated efficiencies agree better the possibility of compensating errors among different parts of the heat transfer

  7. Adaptation and implementation of the TRACE code for transient analysis of lead-cooled fast reactor designs

    International Nuclear Information System (INIS)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2015-01-01

    The Lead-Cooled Fast Reactor (LFR) has been identified as one of the promising future reactor concepts in the technology road map of the Generation IV International Forum (GIF) as well as in the Deployment Strategy of the European Sustainable Nuclear Industrial Initiative (ESNII), both aiming at improved sustainability, enhanced safety, economic competitiveness, and proliferation resistance. This new nuclear reactor concept requires the development of computational tools to be applied in design and safety assessments to confirm the improved inherent and passive safety features of this design. One approach to this issue is to modify current computational codes developed for the simulation of Light Water Reactors towards applicability to the new designs. This paper reports on the modifications performed on the TRACE system code to make it applicable to LFR safety assessments. The capabilities of the modified code are demonstrated on a series of benchmark exercises performed against other safety analysis codes. (Author)

  8. Digitized forensics: retaining a link between physical and digital crime scene traces using QR-codes

    Science.gov (United States)

    Hildebrandt, Mario; Kiltz, Stefan; Dittmann, Jana

    2013-03-01

    The digitization of physical traces from crime scenes in forensic investigations in effect creates a digital chain-of-custody and entails the challenge of creating a link between the two or more representations of the same trace. In order to be forensically sound, the two security aspects of integrity and authenticity especially need to be maintained at all times. Adherence to authenticity using technical means proves to be a challenge at the boundary between the physical object and its digital representations. In this article we propose a new method of linking physical objects with their digital counterparts using two-dimensional bar codes and additional meta-data accompanying the acquired data, for integration in the conventional documentation of the collection of items of evidence (bagging and tagging process). Using the exemplarily chosen QR-code as a particular implementation of a bar code and a model of the forensic process, we also supply a means to integrate our suggested approach into forensically sound proceedings as described by Holder et al.1 We use the example of digital dactyloscopy as a forensic discipline, where progress is currently being made by digitizing some of the processing steps. We show an exemplary demonstrator of the suggested approach using a smartphone as a mobile device for the verification of the physical trace, to extend the chain-of-custody from the physical to the digital domain. Our evaluation of the demonstrator addresses the readability and the verification of its contents. We can read the bar code despite its limited size of 42 x 42 mm and the rather large amount of embedded data using various devices. Furthermore, the QR-code's error correction features help to recover the contents of damaged codes. Subsequently, our appended digital signature allows for detecting malicious manipulations of the embedded data.
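
    The linking idea can be sketched as follows (our illustration: the paper appends a digital signature, for which a keyed hash is used here as a simpler stand-in, and the metadata fields, key and file name are hypothetical). It uses the third-party qrcode Python library:

      import hashlib
      import hmac
      import json
      import qrcode

      KEY = b"examiner-secret-key"    # placeholder; real key management is out of scope
      meta = {"case": "2013-042", "item": "trace-07",
              "acquired": "2013-03-01T10:22:00Z", "device": "scanner-A"}

      # embed the metadata plus an authentication tag in the printed QR code
      payload = json.dumps(meta, sort_keys=True)
      tag = hmac.new(KEY, payload.encode(), hashlib.sha256).hexdigest()
      qrcode.make(payload + "|" + tag).save("evidence_tag.png")

      # verification on the mobile-device side
      recv_payload, recv_tag = (payload + "|" + tag).rsplit("|", 1)
      ok = hmac.compare_digest(
          hmac.new(KEY, recv_payload.encode(), hashlib.sha256).hexdigest(), recv_tag)
      print("tag verified:", ok)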

  9. Two-phase wall friction model for the TRACE computer code

    International Nuclear Information System (INIS)

    Wang Weidong

    2005-01-01

    The wall drag model in the TRAC/RELAP5 Advanced Computational Engine computer code (TRACE) has certain known deficiencies. For example, in an annular flow regime, the code predicts an unphysically high liquid velocity compared to the experimental data. To address those deficiencies, a new wall frictional drag package has been developed and implemented in the TRACE code to model the wall drag for two-phase flow. The modeled flow regimes are (1) annular/mist, (2) bubbly/slug, and (3) bubbly/slug with wall nucleation. The new models use void fraction (instead of flow quality) as the correlating variable to minimize calculation oscillation. In addition, the models allow for transitions between the three regimes. The annular/mist regime is subdivided into three separate regimes for pure annular flow, annular flow with entrainment, and film breakdown. For adiabatic two-phase bubbly/slug flows, the vapor phase primarily exists outside of the boundary layer, and the wall shear uses the single-phase liquid velocity for the friction calculation. The vapor-phase wall friction drag is set to zero for bubbly/slug flows. For bubbly/slug flows with wall nucleation, the bubbles are present within the hydrodynamic boundary layer, and the two-phase wall friction drag is significantly higher, with a pronounced mass flux effect. An empirical correlation has been studied and applied to account for nucleate boiling. Verification and validation tests have been performed, and the test results showed a significant code improvement. (authors)
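
    A schematic of the regime-selection logic described, with placeholder thresholds and coefficients rather than the actual TRACE correlations:

      def wall_friction(void, mass_flux, nucleate_boiling=False):
          """Pick a wall-friction treatment from the void fraction (sketch only)."""
          if void > 0.8:                         # annular/mist (placeholder threshold)
              regime = "annular/mist"
              f_liquid = 0.005 * (1.0 - void)    # placeholder correlation forms
              f_vapor = 0.005 * void
          else:                                  # bubbly/slug
              regime = "bubbly/slug"
              f_liquid, f_vapor = 0.005, 0.0     # vapor wall drag set to zero
              if nucleate_boiling:
                  # bubbles inside the boundary layer raise the two-phase drag;
                  # placeholder for the empirical mass-flux-dependent enhancement
                  f_liquid *= 2.0
          return regime, f_liquid, f_vapor

      print(wall_friction(0.95, 1000.0))
      print(wall_friction(0.30, 1000.0, nucleate_boiling=True))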

  10. A versatile ray-tracing code for studying rf wave propagation in toroidal magnetized plasmas

    International Nuclear Information System (INIS)

    Peysson, Y; Decker, J; Morini, L

    2012-01-01

    A new ray-tracing code named C3PO has been developed to study the propagation of arbitrary electromagnetic radio-frequency (rf) waves in magnetized toroidal plasmas. Its structure is designed for maximum flexibility regarding the choice of coordinate system and dielectric model. The versatility of this code makes it particularly suitable for integrated modeling systems. Using a coordinate system that reflects the nested structure of magnetic flux surfaces in tokamaks, fast and accurate calculations inside the plasma separatrix can be performed using analytical derivatives of a spline-Fourier interpolation of the axisymmetric toroidal MHD equilibrium. Applications to reverse field pinch magnetic configuration are also included. The effects of 3D perturbations of the axisymmetric toroidal MHD equilibrium, due to the discreteness of the magnetic coil system or plasma fluctuations in an original quasi-optical approach, are also studied. Using a Runge–Kutta–Fehlberg method for solving the set of ordinary differential equations, the ray-tracing code is extensively benchmarked against analytical models and other codes for lower hybrid and electron cyclotron waves. (paper)
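
    Generic ray equations of this kind can be integrated with an embedded Runge-Kutta pair, as in the hedged sketch below, where a toy linear refractive-index profile replaces C3PO's plasma dielectric model:

      import numpy as np
      from scipy.integrate import solve_ivp

      def n(x, z):
          # toy refractive index profile, increasing with x
          return 1.0 + 0.1 * x

      def rays(tau, y):
          # Hamiltonian H = (kx^2 + kz^2 - n^2)/2 = 0 along the ray:
          # dx/dtau = k, dk/dtau = n * grad(n)
          x, z, kx, kz = y
          dn_dx, dn_dz = 0.1, 0.0
          return [kx, kz, n(x, z) * dn_dx, n(x, z) * dn_dz]

      y0 = [0.0, 0.0, 0.0, 1.0]    # launch along z with |k| = n = 1
      sol = solve_ivp(rays, (0.0, 5.0), y0, method="RK45", rtol=1e-8)
      print("ray endpoint (x, z):", sol.y[0, -1], sol.y[1, -1])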

  11. Neutronic / thermal-hydraulic coupling with the code system Trace / Parcs

    International Nuclear Information System (INIS)

    Mejia S, D. M.; Del Valle G, E.

    2015-09-01

    The models developed for the Parcs and Trace codes corresponding to cycle 15 of Unit 1 of the Laguna Verde nuclear power plant are described; the first is focused on the neutronic simulation and the second on the thermal-hydraulics. The model developed for Parcs consists of a core of 444 fuel assemblies wrapped in a radial reflector layer and two axial reflector layers, one above and one below the core. The core consists of 27 axial planes in total. The model for Trace includes the vessel and its internal components as well as various safety systems. The coupling between the two codes is through two maps that allow their intercommunication. Both codes are used in coupled form, performing a dynamic simulation that acceptably reaches a stable state, from which the closure of all the main steam isolation valves (MSIVs) is carried out, followed by the actuation of the safety relief valves (SRVs) and the ECCS. The results for the power and for the reactivities introduced by the moderator density, the fuel temperature and the total reactivity are shown. Data are also provided on the behavior of the pressure in the steam dome, the water level in the downcomer, and the flow through the MSIVs and SRVs. The results are discussed for the power, the pressure in the steam dome and the water level in the downcomer, which show agreement with the actions of the MSIVs, SRVs and ECCS. (Author)

  12. Research related to thermal-hydraulic safety by means of the Trace code; Investigaciones relacionadas con seguridad termohidraulica con el codigo TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Chaparro V, F. J.; Del Valle G, E. [IPN, Escuela Superior de Fisica y Matematicas, UP - Adolfo Lopez Mateos, Edif. 9, 07738 Mexico D. F. (Mexico); Rodriguez H, A.; Gomez T, A. M. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Sanchez E, V. H.; Jager, W., E-mail: evalle@esfm.ipn.mx [Karlsruhe Institute of Technology, Hermann-von-Helmholtz Platz I, D-76344 Eggenstein - Leopoldshafen (Germany)

    2014-10-15

    In this article the results of the design of a pressure vessel model of a BWR/5 similar to that of the Laguna Verde NPP, using the Trace code, are presented. A thermal-hydraulic Vessel component capable of simulating the fluid behavior and heat transfer that occur within the reactor vessel was created. The Vessel component consists of a three-dimensional cylinder divided into 19 axial sections, 4 azimuthal sections and two concentric radial rings. The inner ring is used to contain the core and the central part of the reactor, while the outer ring is used as a downcomer. Axial and azimuthal divisions were made with the intention that the dimensions of the internal components, the heights and the orientation of the external connections match the reference values of a BWR/5-type reactor. The model includes internal components such as fuel assemblies, steam separators, jet pumps and guide tubes, and the main external connections such as steam lines, feed-water lines and penetrations of the recirculation system. The model presents significant simplifications because the objective is to keep symmetry between the azimuthal sections of the vessel. Most internal components lack a detailed description of the geometry and of the initial values of temperature, pressure, fluid velocity, etc., given that only the most representative data were considered; however, with these simplifications the simulations give acceptable results for important parameters such as the total flow through the core, the pressure in the vessel, the void fraction, and the pressure drops across the core and the steam separators. (Author)

  13. Developments in the ray-tracing code Zgoubi for 6-D multiturn tracking in FFAG rings

    International Nuclear Information System (INIS)

    Lemuet, F.; Meot, F.

    2005-01-01

    A geometrical method for 3-D modeling of the magnetic field in scaling and non-scaling FFAG magnets has been installed in the ray-tracing code Zgoubi. The method in particular allows a good simulation of transverse non-linearities, of field fall-offs and of possible merging fields in configurations of neighboring magnets, while using realistic models of magnetic fields. That yields an efficient tool for FFAG lattice design and optimization, and for 6-D tracking studies. It is applied, for illustration, to the simulation of an acceleration cycle in a 150 MeV radial-sector proton FFAG.

  14. Adjustments in the Almod 3W2 code models for reproducing the net load trip test in Angra I nuclear power plant

    International Nuclear Information System (INIS)

    Camargo, C.T.M.; Madeira, A.A.; Pontedeiro, A.C.; Dominguez, L.

    1986-09-01

    The traces recorded during the net load trip test at the Angra I NPP yielded the opportunity to make fine adjustments in the ALMOD 3W2 code models. The changes are described and the results are compared against real plant data. (Author)

  15. GRay: A MASSIVELY PARALLEL GPU-BASED CODE FOR RAY TRACING IN RELATIVISTIC SPACETIMES

    Energy Technology Data Exchange (ETDEWEB)

    Chan, Chi-kwan; Psaltis, Dimitrios; Özel, Feryal [Department of Astronomy, University of Arizona, 933 N. Cherry Ave., Tucson, AZ 85721 (United States)

    2013-11-01

    We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This graphics-processing-unit (GPU)-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single-precision floating-point arithmetic on a single GPU exceeds 300 GFLOP (or 1 ns per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing central-processing-unit-based ray-tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and light curves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of Kerr black holes and the photon rings that surround them. We also provide accurate fitting formulae of their dependencies on black hole spin and observer inclination, which can be used to interpret upcoming observations of the black holes at the center of the Milky Way, as well as M87, with the Event Horizon Telescope.
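
    The stream-processing paradigm can be mimicked on a CPU by advancing a large photon batch with one vectorized integration step per iteration, so every photon follows the same instruction stream. The sketch below is a flat-spacetime stand-in (GRay itself integrates Kerr geodesics in CUDA):

      import numpy as np

      def rhs(state):
          # state: (n_photons, 6) = x, y, z, vx, vy, vz
          deriv = np.empty_like(state)
          deriv[:, :3] = state[:, 3:]    # dx/dt = v
          deriv[:, 3:] = 0.0             # free propagation; geodesic terms would go here
          return deriv

      n, dt = 100_000, 1e-2
      rng = np.random.default_rng(1)
      state = np.concatenate([np.zeros((n, 3)), rng.normal(size=(n, 3))], axis=1)

      for _ in range(100):               # one RK4 step per loop, all photons at once
          k1 = rhs(state)
          k2 = rhs(state + 0.5 * dt * k1)
          k3 = rhs(state + 0.5 * dt * k2)
          k4 = rhs(state + dt * k3)
          state += dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
      print("mean distance travelled:", np.linalg.norm(state[:, :3], axis=1).mean())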

  16. Mise en Scene: Conversion of Scenarios to CSP Traces for the Requirements-to-Design-to-Code Project

    Science.gov (United States)

    Carter, John D.; Gardner, William B.; Rash, James L.; Hinchey, Michael G.

    2007-01-01

    The "Requirements-to-Design-to-Code" (R2D2C) project at NASA's Goddard Space Flight Center is based on deriving a formal specification expressed in Communicating Sequential Processes (CSP) notation from system requirements supplied in the form of CSP traces. The traces, in turn, are to be extracted from scenarios, a user-friendly medium often used to describe the required behavior of computer systems under development. This work, called Mise en Scene, defines a new scenario medium (Scenario Notation Language, SNL) suitable for control-dominated systems, coupled with a two-stage process for automatic translation of scenarios to a new trace medium (Trace Notation Language, TNL) that encompasses CSP traces. Mise en Scene is offered as an initial solution to the problem of the scenarios-to-traces "D2" phase of R2D2C. A survey of the "scenario" concept and some case studies are also provided.

  17. Feasibility Study of Coupling the CASMO-4/TABLES-3/SIMULATE-3 Code System to TRACE/PARCS

    International Nuclear Information System (INIS)

    Demaziere, Christophe; Staalek, Mathias

    2004-12-01

    This report investigates the feasibility of coupling the Studsvik Scandpower CASMO-4/TABLES-3/SIMULATE-3 codes to the US NRC TRACE/PARCS codes. The data required by TRACE/PARCS are actually the ones necessary to run its neutronic module PARCS. Such data are the macroscopic nuclear cross-sections, some microscopic nuclear cross-sections important for the Xenon and Samarium poisoning effects, the Assembly Discontinuity Factors, and the kinetic parameters. All these data can be retrieved from the Studsvik Scandpower codes. The data functionalization is explained in detail for both systems of codes, and the possibility of coupling each of these codes to TRACE/PARCS is discussed. Due to confidentiality restrictions in the use of the CASMO-4 files and to an improper format of the TABLES-3 output files, it is demonstrated that TRACE/PARCS can only be coupled to SIMULATE-3. Specifically dedicated SIMULATE-3 input decks allow easy editing of the neutronic data at specific operating statepoints. Although the data functionalization is different between the two systems of codes, such a procedure permits reconstructing a set of data directly compatible with PARCS.

  18. Radiation heat transfer model in a spent fuel pool by TRACE code

    International Nuclear Information System (INIS)

    Sanchez-Saez, F.; Carlos, S.; Villanueva, J.F.; Martorell, S.

    2014-01-01

    Nuclear policies have experienced an important change since the Fukushima Daiichi nuclear plant accident, and the safety of spent fuel has been a spotlight issue among all the safety concerns. The work presented consists of the thermal-hydraulic simulation, with the TRACE code, of spent fuel pool behavior after a transient involving loss of coolant through the transfer channel together with loss of cooling. One of the most important variables that define the behavior of the pool is the cladding temperature, whose evolution depends on the heat emission. In this work, convection and radiation heat transfer are considered. When both heat transfer models are considered, a clear delay in reaching the maximum peak cladding temperature (1477 K) is observed compared with the simulation in which only convection heat transfer is considered. (authors)
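
    A back-of-envelope comparison of the two surface heat-loss paths shows why the radiative term matters at high cladding temperature; the coefficients below are illustrative, not values from the paper's TRACE model.

      SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W/m^2 K^4
      eps, h = 0.8, 5.0       # emissivity (-), convective coefficient (W/m^2 K), assumed
      T_surr = 400.0          # surrounding temperature (K), assumed

      for T_clad in (800.0, 1200.0, 1477.0):
          q_conv = h * (T_clad - T_surr)                      # convective heat flux
          q_rad = eps * SIGMA * (T_clad**4 - T_surr**4)       # radiative heat flux
          print(f"{T_clad:.0f} K -> conv {q_conv:.0f} W/m2, rad {q_rad:.0f} W/m2")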

  19. Adaptation and implementation of the TRACE code for transient analysis of lead-cooled fast reactor designs

    International Nuclear Information System (INIS)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2014-01-01

    The article describes the changes implemented in the TRACE code to include thermodynamic tables for liquid lead drawn from experimental results. It then explains the process of developing a thermal-hydraulic model for the ALFRED prototype and the analysis of a selection of representative transients conducted within the framework of international research projects. The study demonstrates the applicability of the TRACE code to simulate lead-cooled fast reactor designs and shows the high safety margins available in this technology to accommodate the most severe transients identified in its safety study. (Author)
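
    As an example of the kind of property extension described, liquid lead density is commonly represented by a linear correlation in temperature; the form below follows the widely quoted OECD/NEA lead-bismuth handbook fit, but treat the coefficients as indicative rather than the paper's exact tables.

      def rho_lead(T_kelvin):
          """Liquid lead density (kg/m3), roughly valid between melting (~601 K) and ~2021 K."""
          return 11367.0 - 1.1944 * T_kelvin

      for T in (700.0, 800.0, 900.0):
          print(f"{T:.0f} K -> {rho_lead(T):.1f} kg/m3")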

  20. Comet assay in reconstructed 3D human epidermal skin models—investigation of intra- and inter-laboratory reproducibility with coded chemicals

    Science.gov (United States)

    Pfuhler, Stefan

    2013-01-01

    Reconstructed 3D human epidermal skin models are being used increasingly for safety testing of chemicals. Based on EpiDerm™ tissues, an assay was developed in which the tissues were topically exposed to test chemicals for 3 h followed by cell isolation and assessment of DNA damage using the comet assay. Inter-laboratory reproducibility of the 3D skin comet assay was initially demonstrated using two model genotoxic carcinogens, methyl methane sulfonate (MMS) and 4-nitroquinoline-n-oxide, and the results showed good concordance among three different laboratories and with in vivo data. In Phase 2 of the project, intra- and inter-laboratory reproducibility was investigated with five coded compounds with different genotoxicity liability tested at three different laboratories. For the genotoxic carcinogens MMS and N-ethyl-N-nitrosourea, all laboratories reported a dose-related and statistically significant increase (P < 0.05) in DNA damage at concentrations below the cytotoxicity limit (30% cell loss), and the overall response was comparable in all laboratories despite some differences in doses tested. The results of the collaborative study for the coded compounds were generally reproducible among the laboratories involved and intra-laboratory reproducibility was also good. These data indicate that the comet assay in EpiDerm™ skin models is a promising model for the safety assessment of compounds with a dermal route of exposure. PMID:24150594

  1. Assessing flow paths in a karst aquifer based on multiple dye tracing tests using stochastic simulation and the MODFLOW-CFP code

    Science.gov (United States)

    Assari, Amin; Mohammadi, Zargham

    2017-09-01

    Karst systems show high spatial variability of hydraulic parameters over small distances, and this makes their modeling a difficult task with several uncertainties. Interconnections of fractures have a major role in the transport of groundwater, but many of the stochastic methods in use do not have the capability to reproduce these complex structures. A methodology is presented for the quantification of tortuosity using the single normal equation simulation (SNESIM) algorithm and a groundwater flow model. A training image was produced based on the statistical parameters of the fractures and then used in the simulation process. The SNESIM algorithm was used to generate 75 realizations of the four classes of fractures in a karst aquifer in Iran. The results from six dye tracing tests were used to assign hydraulic conductivity values to each class of fractures. In the next step, the MODFLOW-CFP and MODPATH codes were consecutively implemented to compute the groundwater flow paths. The 9,000 flow paths obtained from the MODPATH code were further analyzed to calculate the tortuosity factor. Finally, the hydraulic conductivity values calculated from the dye tracing experiments were refined using the actual flow paths of the groundwater. The key outcomes of this research are: (1) a methodology for the quantification of tortuosity; (2) hydraulic conductivities, which are incorrectly estimated (biased low) by empirical equations that assume Darcian (laminar) flow with parallel rather than tortuous streamlines; and (3) an understanding of the scale-dependence and non-normal distributions of tortuosity.
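
    The tortuosity factor used here is the along-path length of a flow path divided by the straight-line distance between its end points. A minimal sketch of that computation (Python), with a synthetic particle track standing in for one of the MODPATH flow paths:

        import numpy as np

        def tortuosity(path):
            """Tortuosity of one flow path: along-path length divided by the
            straight-line distance between its end points (>= 1 by definition)."""
            path = np.asarray(path, dtype=float)
            along = np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1))
            straight = np.linalg.norm(path[-1] - path[0])
            return along / straight

        # Synthetic 3D track (m); a real study would loop over MODPATH output.
        track = [(0, 0, 0), (40, 25, -2), (90, 30, -5), (150, 20, -8), (200, 60, -10)]
        print(round(tortuosity(track), 3))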

  2. Analyses of SBO sequence of VVER1000 reactor using TRACE and MELCOR codes

    International Nuclear Information System (INIS)

    Mazzini, Guido; Kyncl, Milos; Miglierini, Bruno; Kopecek, Vit

    2015-01-01

    In response to the Fukushima accident, the European Commission ordered stress tests to be performed on all European Nuclear Power Plants (NPPs). Due to time constraints, a number of conclusions in the national stress test reports were based on engineering judgment only. In the Czech Republic, as a follow-up, a consortium of research organizations and universities decided to simulate selected stress test scenarios, in particular station blackout (SBO) and loss of ultimate sink (LoUS), with the aim of verifying the conclusions made in the national stress report and analysing the time response of the respective source term releases. These activities are carried out within the project 'Prevention, preparedness and mitigation of consequences of Severe Accident (SA) at Czech NPPs in relation to lessons learned from stress tests after Fukushima', financed by the Ministry of Interior. The Research Centre Rez has been working on the preparation of a MELCOR model for a VVER1000 NPP, starting with a plant systems nodalization. The basic idea of this paper is to benchmark the MELCOR model against the validated TRACE model, first comparing the steady state and then a long-term SBO combined with a further failure, up to the onset of the severe accident. The presented work focuses mainly on the preliminary comparison of the thermal hydraulics of the two models created with the MELCOR and TRACE codes. After that, preliminary general results of the SA progression, showing the hydrogen production and the relocation phenomena, are briefly discussed. The scenario is terminated a few seconds after the failure of the lower head. (author)

  3. A model of polarized-beam AGS in the ray-tracing code Zgoubi

    Energy Technology Data Exchange (ETDEWEB)

    Meot, F. [Brookhaven National Lab. (BNL), Upton, NY (United States); Ahrens, L. [Brookhaven National Lab. (BNL), Upton, NY (United States); Brown, K. [Brookhaven National Lab. (BNL), Upton, NY (United States); Dutheil, Y. [Brookhaven National Lab. (BNL), Upton, NY (United States); Glenn, J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Huang, H. [Brookhaven National Lab. (BNL), Upton, NY (United States); Roser, T. [Brookhaven National Lab. (BNL), Upton, NY (United States); Shoefer, V. [Brookhaven National Lab. (BNL), Upton, NY (United States); Tsoupas, N. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-07-12

    A model of the Alternating Gradient Synchrotron, based on the AGS snapramps, has been developed in the stepwise ray-tracing code Zgoubi. It has been used over the past 5 years in a number of accelerator studies aimed at enhancing RHIC proton beam polarization. It is also used to study and optimize proton and Helion beam polarization in view of future RHIC and eRHIC programs. The AGS model in Zgoubi is operational on-line via three different applications, 'ZgoubiFromSnaprampCmd', 'AgsZgoubiModel' and 'AgsModelViewer', the latter two being essentially interfaces to the former, which is the actual model 'engine'. All three commands are available from the controls system application launcher in the AGS 'StartUp' menu, or from eponymous commands on shell terminals. The main aspects of the model and of its operation are presented in this technical note; brief excerpts from various studies performed so far are given for illustration, and the means and methods entering ZgoubiFromSnaprampCmd are developed further in the appendix.

  4. Research related to thermal-hydraulic safety by means of the TRACE code

    International Nuclear Information System (INIS)

    Chaparro V, F. J.; Del Valle G, E.; Rodriguez H, A.; Gomez T, A. M.; Sanchez E, V. H.; Jager, W.

    2014-10-01

    This article presents the results of modeling the pressure vessel of a BWR/5 similar to that of the Laguna Verde NPP, using the TRACE code. A thermohydraulic Vessel component capable of simulating the behavior of the fluids and the heat transfer that occur within the reactor vessel was created. The Vessel component consists of a three-dimensional cylinder divided into 19 axial sections, 4 azimuthal sections and two concentric radial rings. The inner ring is used to contain the core and the central part of the reactor, while the outer ring is used as a downcomer. The axial and azimuthal divisions were made so that the dimensions of the internal components and the heights and orientations of the external connections match the reference values of a BWR/5 reactor. The model includes internal components such as the fuel assemblies, steam separators, jet pumps and guide tubes, as well as the main external connections such as the steam lines, the feedwater lines and the penetrations of the recirculation system. The model contains significant simplifications because the objective is to keep symmetry between the azimuthal sections of the vessel. Most internal components lack a detailed description of the geometry and of the initial values of temperature, pressure, fluid velocity, etc., since only the most representative data were considered; nevertheless, these simulations yield acceptable results for important parameters such as the total flow through the core, the vessel pressure, the void fraction, and the pressure drops across the core and the steam separators. (Author)

  5. Neutronic / thermal-hydraulic coupling with the code system Trace / Parcs; Acoplamiento neutronico / termohidraulico con el sistema de codigos TRACE / PARCS

    Energy Technology Data Exchange (ETDEWEB)

    Mejia S, D. M. [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan 779, Col. Narvarte, 03020 Ciudad de Mexico (Mexico); Del Valle G, E., E-mail: dulcemaria.mejia@cnsns.gob.mx [IPN, Escuela Superior de Fisica y Matematicas, Av. IPN s/n, Col. Lindavista, 07738 Ciudad de Mexico (Mexico)

    2015-09-15

    The models developed for the Parcs and Trace codes for cycle 15 of Unit 1 of the Laguna Verde nuclear power plant are described; the first is devoted to the neutronic simulation and the second to the thermal hydraulics. The model developed for Parcs consists of a core of 444 fuel assemblies wrapped in a radial reflector layer and two axial reflector layers, one above and one below the core. The core comprises 27 axial planes in total. The model for Trace includes the vessel and its internal components as well as various safety systems. The coupling between the two codes is achieved through two maps that allow their intercommunication. Both codes are run in coupled form in a dynamic simulation that acceptably reaches a steady state, from which the closure of all the main steam isolation valves (MSIVs) is carried out, followed by the actuation of the safety relief valves (SRVs) and the ECCS. Results are shown for the power, for the reactivities introduced by the moderator density and the fuel temperature, and for the total reactivity. Data are also provided on the behavior of the steam dome pressure, the water level in the downcomer, and the flow through the MSIVs and SRVs. The results for the power, the steam dome pressure and the downcomer water level are explained; they are consistent with the actions of the MSIVs, SRVs and ECCS. (Author)

  6. Critical review of conservation equations for two-phase flow in the U.S. NRC TRACE code

    International Nuclear Information System (INIS)

    Wulff, Wolfgang

    2011-01-01

    Research highlights: → Field equations as implemented in TRACE are incorrect. → Boundary conditions needed for cooling of nuclear fuel elements are wrong. → The two-fluid model in TRACE is not closed. → Three-dimensional flow modeling in TRACE has no basis. - Abstract: The field equations for two-phase flow in the computer code TRAC/RELAP Advanced Computational Engine or TRACE are examined to determine their validity, their capabilities and limitations in resolving nuclear reactor safety issues. TRACE was developed for the NRC to predict thermohydraulic phenomena in nuclear power plants during operational transients and postulated accidents. TRACE is based on the rigorously derived and well-established two-fluid field equations for 1-D and 3-D two-phase flow. It is shown that: (1) The two-fluid field equations for mass conservation as implemented in TRACE are wrong, because local mass balances in TRACE are in conflict with mass conservation for the whole reactor system, as shown in Section . (2) Wrong equations of motion are used in TRACE in place of momentum balances, compromising at branch points the prediction of momentum transfer between, and the coupling of, loops in hydraulic networks by impedance (form loss and wall shear) and by inertia, and thereby the simulation of reactor component interactions. (3) Most seriously, TRACE calculation of heat transfer from fuel elements is incorrect for single and two-phase flows, because Eq. of the TRACE Manual is wrong (see Section ). (4) Boundary conditions for momentum and energy balances in TRACE are restricted to flow regimes with single-phase wall contact, because TRACE lacks constitutive relations for solid-fluid exchange of momentum and heat in prevailing flow regimes. Without a quantified assessment of consequences from (3) to (4), predictions of phasic fluid velocities, fuel temperatures and important safety parameters, e.g., peak clad temperature, are questionable. Moreover, TRACE cannot predict 3-D single- or

  7. An analysis of options available for developing a common laser ray tracing package for Ares and Kull code frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Weeratunga, S K

    2008-11-06

    Ares and Kull are mature code frameworks that support ALE hydrodynamics for a variety of HEDP applications at LLNL, using two widely different meshing approaches. While Ares is based on a 2-D/3-D block-structured mesh data base, Kull is designed to support unstructured, arbitrary polygonal/polyhedral meshes. In addition, both frameworks are capable of running applications on large, distributed-memory parallel machines. Currently, both these frameworks separately support assorted collections of physics packages related to HEDP, including one for the energy deposition by laser/ion-beam ray tracing. This study analyzes the options available for developing a common laser/ion-beam ray tracing package that can be easily shared between these two code frameworks and concludes with a set of recommendations for its development.

  8. Validation and application of the system code TRACE for safety related investigations of innovative nuclear energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Wadim

    2011-12-19

    The system code TRACE is the latest development of the U.S. Nuclear Regulatory Commission (US NRC). TRACE, developed for the analysis of operational conditions, transients and accidents of light water reactors (LWR), is a best-estimate code with a two-fluid, six-equation model for mass, energy, and momentum conservation, and related closure models. Since TRACE is mainly applied to LWR-specific issues, the validation related to innovative nuclear systems (liquid-metal-cooled systems, systems operated with supercritical water, etc.) is very limited, almost nonexistent. In this work, an essential contribution to the validation of TRACE for lead and lead-alloy cooled systems, as well as for systems operated with supercritical water, is provided in a consistent and unified way. In a first step, model discrepancies in the TRACE source code were removed. These inconsistencies caused incorrect prediction of the thermophysical properties of supercritical water and lead-bismuth eutectic, and hence incorrect prediction of heat-transfer-relevant characteristic numbers such as the Reynolds or Prandtl number. In addition to the correction of the models predicting these quantities, models describing the thermophysical properties of lead and Diphyl THT (a synthetic heat transfer medium) were implemented. Several experiments and numerical benchmarks were used to validate the modified TRACE version. These experiments, mainly focused on wall-to-fluid heat transfer, revealed that not only the thermophysical properties but also the heat transfer models were afflicted with inconsistencies. The models for heat transfer to liquid metals were enhanced so that the code can now distinguish between pipe and bundle flow and use the appropriate correlation. Heat transfer to supercritical water was previously not modeled in TRACE at all; completely new routines were implemented to overcome this issue. The comparison of the calculations with the experiments showed, on one hand, the necessity
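
    As an illustration of the kind of property routines the abstract refers to, the sketch below (Python) implements temperature-dependent lead-bismuth eutectic correlations of the usual handbook polynomial form. The coefficients are open-literature-style assumptions, not the values implemented in TRACE.

        def lbe_density(T):
            """LBE density (kg/m3) for T in K; linear handbook-form correlation
            rho = a - b*T with assumed open-literature coefficients."""
            return 11096.0 - 1.3236 * T

        def lbe_heat_capacity(T):
            """LBE specific heat (J/(kg K)); polynomial fit, same caveat."""
            return 159.0 - 2.72e-2 * T + 7.12e-6 * T**2

        for T in (600.0, 700.0, 800.0):
            print(T, lbe_density(T), lbe_heat_capacity(T))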

  9. Translation of the Cofrentes NPP plant model from the TRAC-BF1 code to SNAP-TRACE

    International Nuclear Information System (INIS)

    Escriva, A.; Munuz-Cobo, J. L.; Concejal, A.; Melara, J.; Albendea, M.

    2012-01-01

    The aim is to develop a three-dimensional model of the Cofrentes NPP whose results are consistent when compared with those of the programs currently in use (TRAC-BF1, RETRAN), which are validated with plant data. This comparison should be done globally, so that no compensation of errors can be carried over. To check the correctness of the translation, the results obtained have been compared with TRACE and with the programs currently in use, and the relevant adjustments have been made, taking into account that both the correlations and the models differ between the codes. During the completion of this work several errors have been detected that must be corrected in future versions of these tools.

  10. Statistical safety evaluation of BWR turbine trip scenario using coupled neutron kinetics and thermal hydraulics analysis code SKETCH-INS/TRACE5.0

    International Nuclear Information System (INIS)

    Ichikawa, Ryoko; Masuhara, Yasuhiro; Kasahara, Fumio

    2012-01-01

    The Best Estimate Plus Uncertainty (BEPU) method has been prepared for the regulatory cross-check analysis at the Japan Nuclear Energy Safety Organization (JNES) on the basis of the three-dimensional neutron-kinetics/thermal-hydraulics coupled code SKETCH-INS/TRACE5.0. In the preparation, TRACE5.0 was verified against large-scale thermal-hydraulic tests carried out at the NUPEC facility. These tests focused on the pressure drop of steam-liquid two-phase flow and on the void fraction distribution. From the comparison of the experimental data with other codes (RELAP5/MOD3.3 and TRAC-BF1), TRACE5.0 was judged better than the other codes. It was confirmed that TRACE5.0 is highly reliable for thermal-hydraulic behavior and can be used as a best-estimate code for statistical safety evaluation. Next, the coupled code SKETCH-INS/TRACE5.0 was applied to the turbine trip tests performed at the Peach Bottom-2 BWR4 plant. A turbine trip event shows a rapid power peak due to void collapse as the pressure increases. The analyzed peak value of the core power is simulated better than with the previous version, SKETCH-INS/TRAC-BF1. Finally, the statistical safety evaluation using SKETCH-INS/TRACE5.0 was applied to a loss-of-load transient to examine the influence of the choice of sampling method. (author)

  11. Model of the Peach Bottom turbine trip with the thermal-hydraulic code TRACE V5P3

    International Nuclear Information System (INIS)

    Mesado, C.; Miro, R.; Barrachina, T.; Verdu, G.

    2014-01-01

    This work is the continuation of work presented previously at the thirty-ninth annual meeting of the Spanish Nuclear Society, where the semi-automatic translation of the TRAC-BF1 thermal-hydraulic model of the Peach Bottom turbine trip to TRACE was presented. This article is intended to validate the model obtained in TRACE by comparing the results of the translated model with the results of the NEA/OECD BWR Peach Bottom Turbine Trip (PBTT) benchmark, in particular extreme scenario 2 of exercise 3, in which there is a SCRAM of the reactor. Among the data compared in the transient benchmark are: total power, axial power profile, dome pressure, and total reactivity and its components. (Author)

  12. Analysis of an ADS spurious opening event at a BWR/6 by means of the TRACE code

    International Nuclear Information System (INIS)

    Nikitin, Konstantin; Manera, Annalisa

    2011-01-01

    Highlights: → The spurious opening of 8 relief valves of the ADS system in a BWR/6 has been simulated. → The valves' opening results in a fast depressurization and significant loads on the RPV internals. → This event has been modeled by means of the TRACE and TRAC-BF1 codes. The results are in good agreement with the available plant data. - Abstract: The paper presents the results of a post-event analysis of a spurious opening of 8 relief valves of the automatic depressurization system (ADS) that occurred in a BWR/6. The opening of the relief valves results in a fast depressurization (pressure blowdown) of the primary system, which might lead to significant dynamic loads on the RPV and associated internals. In addition, the RPV level swell caused by the fast depressurization might lead to undesired water carry-over into the steam line and through the safety relief valves (SRVs). Therefore, the transient needs to be characterized in terms of the evolution of pressure, temperature and fluid distribution in the system. This event has been modeled by means of the TRACE and TRAC-BF1 codes. The results are in good agreement with the plant data.

  13. The trace ion module for the Monte Carlo code Eirene, a unified approach to plasma chemistry in the ITER divertor

    International Nuclear Information System (INIS)

    Seebacher, J.; Reiter, D.; Borner, P.

    2007-01-01

    Modelling of kinetic transport effects in magnetic fusion devices is of great importance for understanding the physical processes in both the core and the scrape-off layer (SOL) plasma. For SOL simulation the EIRENE code is a well-established tool for modelling neutral, impurity and radiation transport. Recently a new trace ion module (TIM) has been developed and incorporated into EIRENE. The TIM essentially consists of two parts: 1) a trajectory integrator tracing the deterministic motion of a guiding-centre particle in general 3D electric and magnetic fields; 2) a stochastic representation of the Fokker-Planck collision operator in suitable guiding-centre coordinates, treating Coulomb collisions with the background plasma species. The TIM enables integrated SOL simulation packages such as B2-EIRENE, EDGE2D-EIRENE (2D) or EMC3-EIRENE (3D) to treat the physical and chemical processes near the divertor targets and in the bulk of the SOL in greater detail than before, and in particular on a kinetic rather than a fluid level. One of the physics applications is the formation and transport of hydrocarbon molecules and ions in the tokamak divertor, where tritium co-deposition via hydrocarbons remains a serious issue for next-generation fusion devices like ITER. Realistic tokamak modelling scenarios are discussed with the code packages B2-EIRENE (2D) and EMC3-EIRENE (3D). A brief overview of the theoretical basis of the TIM is given, including code verification studies of its basic physics properties. Applications to hydrocarbon transport studies in TEXTOR and ITER, comparing present (fluid) approximations in edge modelling with the new extended kinetic model, are presented. (Author)
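
    The stochastic part of such a module can be illustrated by a Langevin (Euler-Maruyama) update of the parallel velocity: deterministic drag toward the background plus Gaussian velocity-space diffusion, one standard stochastic representation of a Fokker-Planck collision operator. The drag and diffusion coefficients below are hypothetical placeholders for the Coulomb collision rates a real trace-ion module would evaluate from the local plasma background (a Python sketch, not EIRENE code).

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical drag nu (1/s) and velocity-space diffusion D (m2/s3).
        nu, D, dt = 1.0e4, 1.0e9, 1.0e-6

        def collision_step(v_par):
            """One Euler-Maruyama step: drag plus Gaussian diffusion."""
            return v_par - nu * v_par * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal()

        v = 1.0e5  # initial parallel velocity (m/s)
        for _ in range(1000):
            v = collision_step(v)
        print(v)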

  14. Study of the radiation source term of the CDTN GE PETtrace 8 cyclotron with the MCNPX code

    Energy Technology Data Exchange (ETDEWEB)

    Benavente C, J. A.; Lacerda, M. A. S.; Fonseca, T. C. F.; Da Silva, T. A. [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Av. Pte. Antonio Carlos 6627, 31270-901 Belo Horizonte, Minas Gerais (Brazil); Vega C, H. R., E-mail: jhonnybenavente@gmail.com [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas, Zac. (Mexico)

    2015-10-15

    Full text: Knowledge of the neutron spectra in a PET cyclotron is important for the optimization of the radiation protection of workers and members of the public. The main objective of this work is to study the radiation source term of the GE PETtrace 8 cyclotron of the Development Center of Nuclear Technology (CDTN/CNEN) using computer simulation by the Monte Carlo method. The MCNPX version 2.7 code was used to calculate the flux of neutrons produced by the interaction of the primary proton beam with the target body and other cyclotron components during 18F production. The estimation of the source term and of the corresponding radiation field was performed for the bombardment of a H{sub 2}{sup 18}O target with protons at a current of 75 μA and an energy of 16.5 MeV. The simulated flux values were compared with those reported by the accelerator manufacturer (GE Healthcare). The results showed that the fluxes estimated with the MCNPX code were about 70% lower than those reported by the manufacturer. The mean energies of the neutrons also differed from those reported by GE Healthcare. It is recommended to investigate other cross-section data and the use of the physical models of the code itself for a complete characterization of the radiation source term. (Author)

  15. Ray-tracing 3D dust radiative transfer with DART-Ray: code upgrade and public release

    Science.gov (United States)

    Natale, Giovanni; Popescu, Cristina C.; Tuffs, Richard J.; Clarke, Adam J.; Debattista, Victor P.; Fischera, Jörg; Pasetto, Stefano; Rushton, Mark; Thirlwall, Jordan J.

    2017-11-01

    We present an extensively updated version of the purely ray-tracing 3D dust radiation transfer code DART-Ray. The new version includes five major upgrades: 1) a series of optimizations for the ray-angular density and the scattered radiation source function; 2) the implementation of several data and task parallelizations using hybrid MPI+OpenMP schemes; 3) the inclusion of dust self-heating; 4) the ability to produce surface brightness maps for observers within the models in HEALPix format; 5) the possibility to set the expected numerical accuracy already at the start of the calculation. We tested the updated code with benchmark models where the dust self-heating is not negligible. Furthermore, we performed a study of the extent of the source influence volumes, using galaxy models, which are critical in determining the efficiency of the DART-Ray algorithm. The new code is publicly available, documented for both users and developers, and accompanied by several programmes to create input grids for different model geometries and to import the results of N-body and SPH simulations. These programmes can be easily adapted to different input geometries, and for different dust models or stellar emission libraries.

  16. Adaptation and implementation of the TRACE code for transient analysis of lead-cooled fast reactor designs; Adaptacion y aplicacion del codigo TRACE para el analisis de transitorios en disenos de reactores rapidos refrigerados por plomo

    Energy Technology Data Exchange (ETDEWEB)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2014-07-01

    The article describes the changes implemented in the TRACE code to include thermodynamic tables of liquid lead drawn from experimental results. It then explains the process of developing a thermohydraulic model of the ALFRED prototype and the analysis of a selection of representative transients conducted within the framework of international research projects. The study demonstrates the applicability of the TRACE code to the simulation of lead-cooled fast reactor designs and shows the high safety margins available in this technology to accommodate the most severe transients identified in its safety study. (Author)

  17. Validation of the U.S. NRC coupled code system TRITON/TRACE/PARCS with the special power excursion reactor test III (SPERT III)

    Energy Technology Data Exchange (ETDEWEB)

    Wang, R. C.; Xu, Y.; Downar, T. [Dept. of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Ann Arbor, MI 48104 (United States); Hudson, N. [RES Div., U.S. NRC, Rockville, MD (United States)

    2012-07-01

    The Special Power Excursion Reactor Test III (SPERT III) was a series of reactivity insertion experiments conducted in the 1950s. This paper describes the validation of the U.S. NRC coupled code system TRITON/PARCS/TRACE for the simulation of reactivity insertion accidents (RIA) using several of the SPERT III tests. The work used the SPERT III E-core configuration tests, in which the RIA was initiated by ejecting a control rod. The resulting super-prompt reactivity excursion and the negative reactivity feedback produced the familiar bell-shaped power increase and decrease. The energy deposition during such a power peak has important safety consequences and provides a validation basis for coupled multi-physics core codes. The transients of five separate tests are used to benchmark the PARCS/TRACE coupled code. The models were thoroughly validated using the original experiment documentation. (authors)

  18. Adaptation and implementation of the TRACE code for transient analysis of lead-cooled fast reactor designs; Adaptacion y aplicacion del codigo TRACE para el analisis de transitorios en disenos de reactores rapidos refrigerados por plomo

    Energy Technology Data Exchange (ETDEWEB)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2015-07-01

    The Lead-cooled Fast Reactor (LFR) has been identified as one of the promising future reactor concepts in the technology roadmap of the Generation IV International Forum (GIF) as well as in the Deployment Strategy of the European Sustainable Nuclear Industrial Initiative (ESNII), both aiming at improved sustainability, enhanced safety, economic competitiveness, and proliferation resistance. This new nuclear reactor concept requires the development of computational tools to be applied in design and safety assessments to confirm the improved inherent and passive safety features of the design. One approach to this issue is to modify current computational codes developed for the simulation of light water reactors towards applicability to the new designs. This paper reports on the modifications made to the TRACE system code to make it applicable to LFR safety assessments. The capabilities of the modified code are demonstrated in a series of benchmark exercises performed against other safety analysis codes. (Author)

  19. Uncertainty Methods Framework Development for the TRACE Thermal-Hydraulics Code by the U.S.NRC

    International Nuclear Information System (INIS)

    Bajorek, Stephen M.; Gingrich, Chester

    2013-01-01

    The Code of Federal Regulations, Title 10, Part 50.46 requires that the Emergency Core Cooling System (ECCS) performance be evaluated for a number of postulated Loss-Of-Coolant-Accidents (LOCAs). The rule allows two methods for calculating the acceptance criteria: using a realistic model in the so-called 'Best Estimate' approach, or following the more prescriptive Appendix K to Part 50. Because of the conservatism of Appendix K, recent Evaluation Model submittals to the NRC used the realistic approach. With this approach, the Evaluation Model must demonstrate that the Peak Cladding Temperature (PCT), the Maximum Local Oxidation (MLO) and the Core-Wide Oxidation (CWO) remain below their regulatory limits with a 'high probability'. Guidance for Best Estimate calculations following 50.46(a)(1) was provided by Regulatory Guide 1.157. This Guide identified a 95% probability level as being acceptable for comparisons of best-estimate predictions to the applicable regulatory limits, but was vague with respect to acceptable methods of determining the code uncertainty. Nor did it specify whether a confidence level should be determined. As a result, vendors have developed Evaluation Models utilizing several different methods to combine uncertainty parameters and to determine the PCT and other variables to a high probability. In order to quantify the accuracy of TRACE calculations for a wide variety of applications and to audit Best Estimate calculations made by industry, the NRC is developing its own independent methodology to determine the peak cladding temperature and other parameters of regulatory interest to a high probability. Because several methods are in use, and each vendor's methodology ranges different parameters, the NRC method must be flexible and sufficiently general. Not only must the method apply to LOCA analysis for conventional light-water reactors, it must also be extendable to new reactor designs and types of analyses where the acceptance criteria are less
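
    One widely used nonparametric route to such 'high probability' statements (not necessarily the NRC methodology discussed here) is Wilks' order-statistics formula, which fixes the number of code runs needed for a 95%-probability/95%-confidence bound. A minimal sketch (Python):

        from math import comb

        def wilks_confidence(n, p=0.95, order=1):
            """Confidence that the order-th largest of n runs bounds the
            p-quantile of the output (one-sided, nonparametric)."""
            return 1.0 - sum(comb(n, i) * (1 - p)**i * p**(n - i)
                             for i in range(order))

        def min_runs(p=0.95, gamma=0.95, order=1):
            """Smallest n with wilks_confidence(n, p, order) >= gamma."""
            n = order
            while wilks_confidence(n, p, order) < gamma:
                n += 1
            return n

        print(min_runs())          # 59 runs for a first-order 95/95 statement
        print(min_runs(order=2))   # 93 runs when the second-largest run is used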

  20. About the use of the Monte-Carlo-code-based tracing algorithm and the volume fraction method for S_n full core calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gurevich, M. I.; Oleynik, D. S. [RRC Kurchatov Inst., Kurchatov Sq., 1, 123182, Moscow (Russian Federation); Russkov, A. A.; Voloschenko, A. M. [Keldysh Inst. of Applied Mathematics, Miusskaya Sq., 4, 125047, Moscow (Russian Federation)

    2006-07-01

    The tracing algorithm implemented in the geometrical module of the Monte-Carlo transport code MCU is applied to calculate the volume fractions of the original materials in the spatial cells of a mesh that overlays the problem geometry. In this way the 3D combinatorial-geometry presentation of the problem geometry used by the MCU code is transformed into user-defined 2D or 3D bit-mapped ones. Next, these data are used in the volume fraction (VF) method to approximate the problem geometry by introducing additional mixtures for spatial cells in which several original materials are included. We have found that in solving realistic 2D and 3D core problems a sufficiently fast convergence of the VF method takes place as the spatial mesh is refined. The proposed variant of implementation of the VF method thus appears suitable as a geometry interface between Monte-Carlo and S{sub n} transport codes. (authors)
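
    The essential step of the VF method, replacing a mesh cell overlapped by several original materials with an additional mixture, reduces to volume-weighted mixing of the macroscopic cross sections. A minimal sketch (Python) with hypothetical numbers:

        import numpy as np

        def mix_cross_sections(fractions, sigmas):
            """Volume-fraction homogenization of one mesh cell: the mixture
            cross section is the volume-weighted sum of the material ones."""
            fractions = np.asarray(fractions, dtype=float)
            sigmas = np.asarray(sigmas, dtype=float)
            assert abs(fractions.sum() - 1.0) < 1e-9
            return float(fractions @ sigmas)

        # Hypothetical cell overlapped 60/30/10 by fuel, clad and moderator,
        # with illustrative one-group total cross sections (1/cm).
        print(mix_cross_sections([0.6, 0.3, 0.1], [0.35, 0.30, 1.10]))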

  1. Resurrecting Legacy Code Using Ontosoft Knowledge-Sharing and Digital Object Management to Revitalize and Reproduce Software for Groundwater Management Research

    Science.gov (United States)

    Kwon, N.; Gentle, J.; Pierce, S. A.

    2015-12-01

    Software code developed for research is often used for a relatively short period of time before it is abandoned, lost, or becomes outdated. This unintentional abandonment of code is a valid problem in the 21st century scientific process, hindering widespread reusability and increasing the effort needed to develop research software. Potentially important assets, these legacy codes may be resurrected and documented digitally for long-term reuse, often with modest effort. Furthermore, the revived code may be openly accessible in a public repository for researchers to reuse or improve. For this study, the research team has begun to revive the codebase for Groundwater Decision Support System (GWDSS), originally developed for participatory decision making to aid urban planning and groundwater management, though it may serve multiple use cases beyond those originally envisioned. GWDSS was designed as a java-based wrapper with loosely federated commercial and open source components. If successfully revitalized, GWDSS will be useful for both practical applications as a teaching tool and case study for groundwater management, as well as informing theoretical research. Using the knowledge-sharing approaches documented by the NSF-funded Ontosoft project, digital documentation of GWDSS is underway, from conception to development, deployment, characterization, integration, composition, and dissemination through open source communities and geosciences modeling frameworks. Information assets, documentation, and examples are shared using open platforms for data sharing and assigned digital object identifiers. Two instances of GWDSS version 3.0 are being created: 1) a virtual machine instance for the original case study to serve as a live demonstration of the decision support tool, assuring the original version is usable, and 2) an open version of the codebase, executable installation files, and developer guide available via an open repository, assuring the source for the

  2. Translation of the Cofrentes NPP plant model from the TRAC-BF1 code to SNAP-TRACE; Traduccion del modelo de planta de CN Cofrentes del codigo TRAC-BF1 a SNAP-TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Escriva, A.; Munuz-Cobo, J. L.; Concejal, A.; Melara, J.; Albendea, M.

    2012-07-01

    The aim is to develop a three-dimensional model of the Cofrentes NPP whose results are consistent when compared with those of the programs currently in use (TRAC-BF1, RETRAN), which are validated with plant data. This comparison should be done globally, so that no compensation of errors can be carried over. To check the correctness of the translation, the results obtained have been compared with TRACE and with the programs currently in use, and the relevant adjustments have been made, taking into account that both the correlations and the models differ between the codes. During the completion of this work several errors have been detected that must be corrected in future versions of these tools.

  3. Analysis of PWR control rod ejection accident with the coupled code system SKETCH-INS/TRACE by incorporating pin power reconstruction model

    International Nuclear Information System (INIS)

    Nakajima, T.; Sakai, T.

    2010-01-01

    The pin power reconstruction model was incorporated into the 3-D nodal kinetics code SKETCH-INS in order to produce accurate calculations of the three-dimensional pin power distribution throughout the reactor core. In order to verify the employed pin power reconstruction model, the PWR MOX/UO_2 core transient benchmark problem was analyzed with the coupled code system SKETCH-INS/TRACE and the influence of the model was studied. SKETCH-INS pin power distributions for three benchmark problems were compared with the PARCS solutions, which were provided by the host organisation of the benchmark. The SKETCH-INS results were in good agreement with the PARCS results, and the capability of the employed pin power reconstruction model was confirmed through the analysis of the benchmark problems. A PWR control rod ejection benchmark problem was then analyzed with the coupled code system SKETCH-INS/TRACE incorporating the pin power reconstruction model, and the influence of the model was studied by comparison with the results of the conventional node-averaged flux model. The results indicate that the pin power reconstruction model has a significant effect on the pin powers during the transient and hence on the fuel enthalpy.

  4. ARDISC (Argonne Dispersion Code): computer programs to calculate the distribution of trace element migration in partially equilibrating media

    International Nuclear Information System (INIS)

    Strickert, R.; Friedman, A.M.; Fried, S.

    1979-04-01

    A computer program (ARDISC, the Argonne Dispersion Code) is described which simulates the migration of nuclides in porous media and includes first order kinetic effects on the retention constants. The code allows for different absorption and desorption rates and solves the coupled migration equations by arithmetic reiterations. Input data needed are the absorption and desorption rates, equilibrium surface absorption coefficients, flow rates and volumes, and media porosities
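
    A minimal sketch in the spirit of the scheme described above (Python, not the ARDISC source): upwind advection of a mobile concentration coupled to a sorbed phase through first-order absorption and desorption rates, with all parameters hypothetical.

        import numpy as np

        # Hypothetical column of 100 cells; courant = fraction of a cell
        # advected per step; ka/kd = absorption/desorption rates (1/step).
        n, courant, ka, kd = 100, 0.5, 0.05, 0.01
        c = np.zeros(n)   # mobile concentration
        s = np.zeros(n)   # sorbed concentration
        c[0] = 1.0        # initial slug of tracer in the first cell

        for _ in range(120):
            # explicit upwind advection of the mobile phase
            c[1:] = c[1:] - courant * (c[1:] - c[:-1])
            c[0] = (1.0 - courant) * c[0]
            # first-order kinetic exchange with the sorbed phase
            transfer = ka * c - kd * s
            c -= transfer
            s += transfer

        print(np.argmax(c), round(c.max(), 4))  # position and height of the peak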

  5. Nanoparticle based bio-bar code technology for trace analysis of aflatoxin B1 in Chinese herbs

    Directory of Open Access Journals (Sweden)

    Yu-yan Yu

    2018-04-01

    Full Text Available. A novel and sensitive assay for aflatoxin B1 (AFB1) detection has been developed using the bio-bar code assay (BCA). The method relies on polyclonal antibodies encoded with DNA-modified gold nanoparticles (NP) and monoclonal antibodies attached to magnetic microparticles (MMP), with subsequent detection of the amplified target in the form of the bio-bar code using fluorescent quantitative polymerase chain reaction (FQ-PCR). First, NP probes encoded with DNA unique to AFB1 and MMP probes with monoclonal antibodies that bind AFB1 specifically were prepared. Then, MMP-AFB1-NP sandwich compounds were formed; dehybridization of the oligonucleotides on the nanoparticle surface allows the presence of AFB1 to be determined by identifying the oligonucleotide sequence released from the NP through FQ-PCR detection. The bio-bar code system for detecting AFB1 was thus established, with a sensitivity limit of about 10⁻⁸ ng/mL; compared with ELISA assays for the same target, this shows that AFB1 can be detected at low attomolar levels with the bio-bar-code amplification approach. This is also the first demonstration of a bio-bar-code-type assay for the detection of AFB1 in Chinese herbs. Keywords: Aflatoxin B1, Bio-bar code assay, Chinese herbs, Magnetic microparticle probes, Nanoparticle probes

  6. Nanoparticle based bio-bar code technology for trace analysis of aflatoxin B1 in Chinese herbs.

    Science.gov (United States)

    Yu, Yu-Yan; Chen, Yuan-Yuan; Gao, Xuan; Liu, Yuan-Yuan; Zhang, Hong-Yan; Wang, Tong-Ying

    2018-04-01

    A novel and sensitive assay for aflatoxin B1 (AFB1) detection has been developed using the bio-bar code assay (BCA). The method relies on polyclonal antibodies encoded with DNA-modified gold nanoparticles (NP) and monoclonal antibodies attached to magnetic microparticles (MMP), with subsequent detection of the amplified target in the form of the bio-bar code using fluorescent quantitative polymerase chain reaction (FQ-PCR). First, NP probes encoded with DNA unique to AFB1 and MMP probes with monoclonal antibodies that bind AFB1 specifically were prepared. Then, MMP-AFB1-NP sandwich compounds were formed; dehybridization of the oligonucleotides on the nanoparticle surface allows the presence of AFB1 to be determined by identifying the oligonucleotide sequence released from the NP through FQ-PCR detection. The bio-bar code system for detecting AFB1 was thus established, with a sensitivity limit of about 10⁻⁸ ng/mL; compared with ELISA assays for the same target, this shows that AFB1 can be detected at low attomolar levels with the bio-bar-code amplification approach. This is also the first demonstration of a bio-bar-code-type assay for the detection of AFB1 in Chinese herbs. Copyright © 2017. Published by Elsevier B.V.

  7. On the Impact of Zero-padding in Network Coding Efficiency with Internet Traffic and Video Traces

    DEFF Research Database (Denmark)

    Taghouti, Maroua; Roetter, Daniel Enrique Lucani; Pedersen, Morten Videbæk

    2016-01-01

    Random Linear Network Coding (RLNC) theoretical results typically assume that packets have equal sizes while in reality, data traffic presents a random packet size distribution. Conventional wisdom considers zero-padding of original packets as a viable alternative, but its effect can reduce the e...
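
    The padding overhead in question can be estimated with a short Monte Carlo experiment: every packet in a generation is padded to the generation's maximum size before encoding. The packet-size mix below is an assumed internet-like illustration, not the traces analysed in the paper (Python sketch):

        import numpy as np

        rng = np.random.default_rng(1)

        def padding_overhead(gen_size, trials=10_000):
            """Average fraction of transmitted bytes that are zero-padding
            when a generation is padded to its maximum packet size."""
            sizes = rng.choice([64, 576, 1500], p=[0.5, 0.2, 0.3],
                               size=(trials, gen_size))
            useful = sizes.sum(axis=1)
            sent = sizes.max(axis=1) * gen_size
            return float(np.mean(1.0 - useful / sent))

        for g in (4, 16, 64):
            print(g, round(padding_overhead(g), 3))  # overhead grows with g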

  8. A neural coding scheme reproducing foraging trajectories

    Science.gov (United States)

    Gutiérrez, Esther D.; Cabrera, Juan Luis

    2015-12-01

    The movement of many animals may follow Lévy patterns. The underlying neuronal dynamics generating such behavior is unknown. In this paper we show that a novel discovery of multifractality in winnerless competition (WLC) systems reveals a potential encoding mechanism that is translatable into two-dimensional superdiffusive Lévy movements. The validity of our approach is tested on a conductance-based neuronal model showing WLC and through the extraction of Lévy-flight-inducing fractals from recordings of rat hippocampus during open-field foraging. Further insights are gained by analyzing mice motor cortex neurons and non-motor cell signals. The proposed mechanism provides a plausible explanation for the neuro-dynamical fundamentals of the spatial searching patterns observed in animals (including humans) and illustrates a hitherto unknown way to encode information in neuronal temporal series.

  9. Towards Reproducibility in Computational Hydrology

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, and evolve (or reject) hypotheses and models of how environmental systems function, and move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produces published results are not regularly made available, and even if they are made available, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al 2016 [1], we argue that a cultural change is required in the computational hydrological community, in order to advance and make more robust the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which are relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as

  10. Validation of the TRACE code for the system dynamic simulations of the molten salt reactor experiment and the preliminary study on the dual fluid molten salt reactor

    International Nuclear Information System (INIS)

    He, Xun

    2016-01-01

    MSR concept using mathematical tools. In particular, the aim of the first part is to demonstrate the suitability of the TRACE code for similar MSR designs by using a modified version of the TRACE code to perform simulations of steady-state, transient and accidental conditions. The basic approach of this part is to couple the thermal-hydraulic model with a modified point-kinetic model. The equivalent thermal-hydraulic model of the MSRE was built in 1D with three loops including all the critical main components. The point-kinetic model was improved by considering the precursor drift, in order to produce more realistic results for the delayed neutron behavior. Additionally, new working fluids, namely the molten salts, were embedded into the source code of TRACE. Most results of the simulations show good agreement with the ORNL reports and with another recent study, and the errors were predictable and within an acceptable range. Therefore, the necessary code modification of TRACE appears to be successful; the model will be refined and its functions extended further in order to investigate new MSR designs. Another part of this thesis is a preliminary study of a new molten salt reactor concept, the Dual Fluid Reactor (DFR). The DFR belongs to the group of molten salt fast reactors (MSFR) and has recently been considered as an option for minimum-waste and inherently safe operation of nuclear reactors in the future. The DFR uses two separately circulating fluids in the reactor core. One is the fuel salt based on a mixture of tri-chlorides of uranium and plutonium (UCl_3-PuCl_3), while the other is the coolant, composed of pure lead (Pb). The current work focuses on the basic dynamic behavior of a scaled-down DFR with 500 MW thermal output (DFR-500) instead of its reference design with 3000 MW thermal output (DFR-3000). For this purpose 10 parallel single fuel channels, as representative samples

  11. Validation of the TRACE code for the system dynamic simulations of the molten salt reactor experiment and the preliminary study on the dual fluid molten salt reactor

    Energy Technology Data Exchange (ETDEWEB)

    He, Xun

    2016-06-14

    one is about the demonstration of a new MSR concept using mathematical tools. In particular, the aim of the first part is to demonstrate the suitability of the TRACE code for similar MSR designs by using a modified version of the TRACE code to perform simulations of steady-state, transient and accidental conditions. The basic approach of this part is to couple the thermal-hydraulic model with a modified point-kinetic model. The equivalent thermal-hydraulic model of the MSRE was built in 1D with three loops including all the critical main components. The point-kinetic model was improved by considering the precursor drift, in order to produce more realistic results for the delayed neutron behavior. Additionally, new working fluids, namely the molten salts, were embedded into the source code of TRACE. Most results of the simulations show good agreement with the ORNL reports and with another recent study, and the errors were predictable and within an acceptable range. Therefore, the necessary code modification of TRACE appears to be successful; the model will be refined and its functions extended further in order to investigate new MSR designs. Another part of this thesis is a preliminary study of a new molten salt reactor concept, the Dual Fluid Reactor (DFR). The DFR belongs to the group of molten salt fast reactors (MSFR) and has recently been considered as an option for minimum-waste and inherently safe operation of nuclear reactors in the future. The DFR uses two separately circulating fluids in the reactor core. One is the fuel salt based on a mixture of tri-chlorides of uranium and plutonium (UCl{sub 3}-PuCl{sub 3}), while the other is the coolant, composed of pure lead (Pb). The current work focuses on the basic dynamic behavior of a scaled-down DFR with 500 MW thermal output (DFR-500) instead of its reference design with 3000 MW thermal output (DFR-3000). For this purpose 10 parallel
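
    The precursor-drift correction mentioned in both versions of this abstract can be illustrated with a one-group point-kinetics sketch for circulating fuel: precursors decay in-core, are swept out with the in-core transit time, and re-enter, partially decayed, after the external-loop delay. All parameters below are illustrative assumptions, not MSRE data (Python):

        import numpy as np

        beta, lam, Lam = 0.0065, 0.1, 4.0e-4   # delayed fraction, decay (1/s), gen. time (s)
        tau_c, tau_l = 8.0, 17.0               # in-core / external-loop transit times (s)
        dt, t_end, rho = 1.0e-3, 60.0, 0.001   # time step (s), duration (s), reactivity

        steps, delay = int(t_end / dt), int(tau_l / dt)
        hist = np.zeros(delay)                 # ring buffer holding C(t - tau_l)
        n, C = 1.0, beta / (lam * Lam)         # static-fuel equilibrium as initial guess

        for k in range(steps):
            C_delayed = hist[k % delay]        # precursors that left tau_l ago
            dn = ((rho - beta) / Lam) * n + lam * C
            dC = (beta / Lam) * n - lam * C - C / tau_c \
                 + C_delayed * np.exp(-lam * tau_l) / tau_c
            hist[k % delay] = C                # store before overwriting the slot
            n, C = n + dn * dt, C + dC * dt

        print(n)  # relative power 60 s after the reactivity step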

  12. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination or reproducibility of results is very poor. On the other hand, a defect can be detected on each of subsequent examinations for higher reliability and still have poor reproducibility of results

  13. The Need for Reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility; specifically, whether bitwise-reproducible computation is possible, whether computational research in DOE can improve its publication process, and whether reproducible results can be achieved apart from the peer review process.

  14. Magni Reproducibility Example

    DEFF Research Database (Denmark)

    2016-01-01

    An example of how to use the magni.reproducibility package for storing metadata along with results from a computational experiment. The example is based on simulating the Mandelbrot set.

  15. Preserve specimens for reproducibility

    Czech Academy of Sciences Publication Activity Database

    Krell, F.-T.; Klimeš, Petr; Rocha, L. A.; Fikáček, M.; Miller, S. E.

    2016-01-01

    Roč. 539, č. 7628 (2016), s. 168 ISSN 0028-0836 Institutional support: RVO:60077344 Keywords : reproducibility * specimen * biodiversity Subject RIV: EH - Ecology, Behaviour Impact factor: 40.137, year: 2016 http://www.nature.com/nature/journal/v539/n7628/full/539168b.html

  16. Reproducibility of ultrasonic testing

    International Nuclear Information System (INIS)

    Lecomte, J.-C.; Thomas, Andre; Launay, J.-P.; Martin, Pierre

    The reproducibility of amplitude readings for both artificial and natural reflectors was studied for several combinations of instrument and search unit, all of the same type. This study shows that in industrial inspection, if a range of standardized equipment is used, a margin of error of about 6 decibels has to be taken into account (confidence interval of 95%). This margin is about 4 to 5 dB for natural or artificial defects located in the central area and about 6 to 7 dB for artificial defects located on the back surface. This lack of reproducibility seems to be attributable first to the search unit and then to the instrument and operator. These results were confirmed by analysis of calibration data obtained from 250 tests performed by 25 operators under shop conditions. The margin of error was higher than the 6 dB obtained in the study.

  17. Magnet stability and reproducibility

    CERN Document Server

    Marks, N

    2010-01-01

    Magnet stability and reproducibility have become increasingly important as greater precision and beams with smaller dimensions are required for research, medical and other purposes. The observed causes of mechanical and electrical instability are introduced, and the engineering arrangements needed to minimize these problems are discussed; the resulting performance of a state-of-the-art synchrotron source (Diamond) is then presented. The need for orbit feedback to obtain the best possible beam stability is briefly introduced, omitting details of the necessary technical equipment, which are outside the scope of the presentation.

  18. Reproducible research in palaeomagnetism

    Science.gov (United States)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  19. Reproducing Epidemiologic Research and Ensuring Transparency.

    Science.gov (United States)

    Coughlin, Steven S

    2017-08-15

    Measures for ensuring that epidemiologic studies are reproducible include making data sets and software available to other researchers so they can verify published findings, conduct alternative analyses of the data, and check for statistical errors or programming errors. Recent developments related to the reproducibility and transparency of epidemiologic studies include the creation of a global platform for sharing data from clinical trials and the anticipated future extension of the global platform to non-clinical trial data. Government agencies and departments such as the US Department of Veterans Affairs Cooperative Studies Program have also enhanced their data repositories and data sharing resources. The Institute of Medicine and the International Committee of Medical Journal Editors released guidance on sharing clinical trial data. The US National Institutes of Health has updated their data-sharing policies. In this issue of the Journal, Shepherd et al. (Am J Epidemiol. 2017;186:387-392) outline a pragmatic approach for reproducible research with sensitive data for studies for which data cannot be shared because of legal or ethical restrictions. Their proposed quasi-reproducible approach facilitates the dissemination of statistical methods and codes to independent researchers. Both reproducibility and quasi-reproducibility can increase transparency for critical evaluation, further dissemination of study methods, and expedite the exchange of ideas among researchers. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. Software trace cache

    OpenAIRE

    Ramírez Bellido, Alejandro; Larriba Pey, Josep; Valero Cortés, Mateo

    2005-01-01

    We explore the use of compiler optimizations, which optimize the layout of instructions in memory. The target is to enable the code to make better use of the underlying hardware resources regardless of the specific details of the processor/architecture in order to increase fetch performance. The Software Trace Cache (STC) is a code layout algorithm with a broader target than previous layout optimizations. We target not only an improvement in the instruction cache hit rate, but also an increas...

  1. TORBEAM 2.0, a paraxial beam tracing code for electron-cyclotron beams in fusion plasmas for extended physics applications

    Science.gov (United States)

    Poli, E.; Bock, A.; Lochbrunner, M.; Maj, O.; Reich, M.; Snicker, A.; Stegmeir, A.; Volpe, F.; Bertelli, N.; Bilato, R.; Conway, G. D.; Farina, D.; Felici, F.; Figini, L.; Fischer, R.; Galperti, C.; Happel, T.; Lin-Liu, Y. R.; Marushchenko, N. B.; Mszanowski, U.; Poli, F. M.; Stober, J.; Westerhof, E.; Zille, R.; Peeters, A. G.; Pereverzev, G. V.

    2018-04-01

    The paraxial WKB code TORBEAM (Poli, 2001) is widely used for the description of electron-cyclotron waves in fusion plasmas, retaining diffraction effects through the solution of a set of ordinary differential equations. With respect to its original form, the code has undergone significant transformations and extensions, in terms of both the physical model and the spectrum of applications. The code has been rewritten in Fortran 90 and transformed into a library, which can be called from within different (not necessarily Fortran-based) workflows. The models for both absorption and current drive have been extended, including e.g. a fully relativistic calculation of the absorption coefficient, momentum conservation in electron-electron collisions and the contribution of more than one harmonic to current drive. The code can also be run for reflectometry applications, with relativistic corrections for the electron mass. Formulas that provide the coupling between the reflected beam and the receiver have been developed. Accelerated versions of the code are available, with the reduced physics goal of inferring the location of maximum absorption (with or without the total driven current) for a given setting of the launcher mirrors. Optionally, plasma volumes within given flux surfaces and corresponding values of minimum and maximum magnetic field can be provided externally to speed up the calculation of full driven-current profiles. These can be employed in real-time control algorithms or for fast data analysis.
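
    Since the abstract emphasizes that the rewritten code is a library callable from non-Fortran workflows, a generic sketch of what such a call can look like from Python via ctypes may be useful; every symbol, argument and file name below is hypothetical, as the actual TORBEAM interface is not described here.

        import ctypes
        import numpy as np

        # Load the compiled Fortran library (hypothetical file name).
        lib = ctypes.CDLL("./libbeam_demo.so")

        # Hypothetical entry point: beam_trace(n, params, result) -> status
        lib.beam_trace.argtypes = [
            ctypes.c_int,
            np.ctypeslib.ndpointer(dtype=np.float64),
            np.ctypeslib.ndpointer(dtype=np.float64),
        ]
        lib.beam_trace.restype = ctypes.c_int

        params = np.array([2.5, 0.1, 140.0])  # e.g. field, density, frequency
        result = np.zeros(8)                  # e.g. absorption location, power
        status = lib.beam_trace(len(params), params, result)
        if status != 0:
            raise RuntimeError(f"beam_trace failed with status {status}")

    This is the pattern that lets one physics kernel serve interactive analysis, integrated modelling workflows and real-time control alike.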

  2. A how to guide to reproducible research

    OpenAIRE

    Whitaker, Kirstie

    2018-01-01

    This talk will discuss the perceived and actual barriers experienced by researchers attempting to do reproducible research, and give practical guidance on how they can be overcome. It will include suggestions on how to make your code and data available and usable for others (including a strong suggestion to document both clearly so you don't have to reply to lots of email questions from future users). Specifically it will include a brief guide to version control, collaboration and disseminati...

  3. Convergence of macrostates under reproducible processes

    International Nuclear Information System (INIS)

    Rau, Jochen

    2010-01-01

    I show that whenever a system undergoes a reproducible macroscopic process the mutual distinguishability of macrostates, as measured by their relative entropy, diminishes. This extends the second law which regards only ordinary entropies, and hence only the distinguishability between macrostates and one specific reference state (equidistribution). The new result holds regardless of whether the process is linear or nonlinear. Its proof hinges on the monotonicity of quantum relative entropy under arbitrary coarse grainings, even those that cannot be represented by trace-preserving completely positive maps.
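
    In symbols (a standard formulation of the quantities involved, not a quotation from the paper): the relative entropy of density operators \rho and \sigma is

        S(\rho \,\|\, \sigma) = \operatorname{Tr}\,\rho\,(\ln\rho - \ln\sigma),

    and the monotonicity property states that under a coarse graining \Phi,

        S(\Phi(\rho) \,\|\, \Phi(\sigma)) \le S(\rho \,\|\, \sigma),

    so the distinguishability of two macrostates can only diminish; the abstract's point is that this holds even for coarse grainings not representable by trace-preserving completely positive maps.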

  4. Reproducible research: a minority opinion

    Science.gov (United States)

    Drummond, Chris

    2018-01-01

    Reproducible research, a growing movement within many scientific fields, including machine learning, would require that the code used to generate the experimental results be published along with any paper. Probably the most compelling argument for this is that it is simply following good scientific practice, established over the years by the greats of science. The implication is that failure to follow such a practice is unscientific, not a label any machine learning researcher would like to carry. It is further claimed that misconduct is causing a growing crisis of confidence in science, and that, without this practice being enforced, science would inevitably fall into disrepute. This viewpoint is becoming ubiquitous, but here I offer a differing opinion. I argue that, far from being central to science, what is being promulgated is a narrow interpretation of how science works. I contend that the consequences are somewhat overstated. I would also contend that the effort necessary to meet the movement's aims, and the general attitude it engenders, would not serve any of the research disciplines well, including our own.

  5. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN......, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  6. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. The program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself, choosing among the upper- and lower-level codes of the selected entry that were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. The proper pathology code was obtained in a similar fashion to the organ code selection. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because the program was written in 'user-defined function' form, decoding of the stored ACR code was achieved by the same program, and it could be incorporated into other data-processing programs. The program has the merits of simple operation, accurate and detailed coding, and easy adaptation to other programs. Therefore, it can be used to automate routine work in the department of radiology.
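
    The two-step lookup described above (an organ code first, then a pathology dictionary selected by the organ code's first digit) can be sketched as follows; the dictionary contents are invented for illustration, and only the '131.3661' example comes from the text.

        # Hypothetical organ dictionary: code -> name.
        ORGANS = {"131": "lung", "132": "pleura"}

        # Hypothetical pathology dictionaries, keyed by leading organ digit.
        PATHOLOGY_FILES = {"1": {"3661": "pneumonia"}}

        def acr_code(organ_query, pathology_query):
            """Build an ACR-style code such as '131.3661' from two lookups."""
            organ = next(code for code, name in ORGANS.items()
                         if organ_query in (code, name))
            # The first digit of the organ code selects the pathology file.
            pathology_dict = PATHOLOGY_FILES[organ[0]]
            pathology = next(code for code, name in pathology_dict.items()
                             if pathology_query in (code, name))
            return f"{organ}.{pathology}"

        print(acr_code("lung", "3661"))  # 131.3661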

  7. Intraoral gothic arch tracing.

    Science.gov (United States)

    Rubel, Barry; Hill, Edward E

    2011-01-01

    In order to create optimum esthetics, function and phonetics in complete denture fabrication, it is necessary to record accurate maxillo-mandibular determinants of occlusion. This requires clinical skill to establish an accurate, verifiable and reproducible vertical dimension of occlusion (VDO) and centric relation (CR). Correct vertical relation depends upon a consideration of several factors, including muscle tone, inter-dental arch space and parallelism of the ridges. Any errors made while taking maxillo-mandibular jaw relation records will result in dentures that are uncomfortable and, possibly, unwearable. The application of a tracing mechanism such as the Gothic arch tracer (a central bearing device) is a demonstrable method of determining centric relation. Intraoral Gothic arch tracers provide the advantage of capturing VDO and CR in an easy-to-use technique for practitioners. Intraoral tracing (Gothic arch tracing) is a preferred method of obtaining consistent positions of the mandible in motion (retrusive, protrusive and lateral) at a comfortable VDO.

  8. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, is an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitated its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  9. Reproducibility in a multiprocessor system

    Science.gov (United States)

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system, and a scan of the system state.

  10. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators; (2) shared computational resources; (3) declarative model descriptors, ontologies and standardized annotations; and (4) model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  11. TraceContract

    Science.gov (United States)

    Havelund, Klaus; Barringer, Howard

    2012-01-01

    TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events, and can, for example, be generated by a running program, instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics, including Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
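
    A toy version of the underlying idea, transposed from Scala to Python with invented names: a specification is a state machine over events, and a reaction runs whenever the trace violates it.

        class Monitor:
            """Tiny trace monitor: feed events, report specification violations."""
            def __init__(self, transitions, initial, on_violation):
                self.transitions = transitions   # (state, event) -> next state
                self.state = initial
                self.on_violation = on_violation

            def step(self, event):
                key = (self.state, event)
                if key in self.transitions:
                    self.state = self.transitions[key]
                else:
                    self.on_violation(self.state, event)

        # Property: every 'open' must be closed before the next 'open'.
        spec = {("closed", "open"): "opened", ("opened", "close"): "closed"}
        monitor = Monitor(spec, "closed",
                          lambda s, e: print(f"violation: '{e}' in state '{s}'"))
        for event in ["open", "close", "close"]:  # a log file, in miniature
            monitor.step(event)                   # third event is a violation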

  12. Contextual sensitivity in scientific reproducibility.

    Science.gov (United States)

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals, found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher's degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.

  13. Trace analysis

    International Nuclear Information System (INIS)

    Warner, M.

    1987-01-01

    What is the current state of quantitative trace analytical chemistry? What are today's research efforts? And what challenges does the future hold? These are some of the questions addressed at a recent four-day symposium sponsored by the National Bureau of Standards (NBS) entitled Accuracy in Trace Analysis - Accomplishments, Goals, Challenges. The two plenary sessions held on the first day of the symposium reviewed the history of quantitative trace analysis, discussed the present situation from academic and industrial perspectives, and summarized future needs. The remaining three days of the symposium consisted of parallel sessions dealing with the measurement process; quantitation in materials; environmental, clinical, and nutrient analysis; and advances in analytical techniques

  14. An empirical analysis of journal policy effectiveness for computational reproducibility.

    Science.gov (United States)

    Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun

    2018-03-13

    A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy (author remission of data and code postpublication upon request) an improvement over no policy, but currently insufficient for reproducibility.

  15. Contextual sensitivity in scientific reproducibility

    Science.gov (United States)

    Van Bavel, Jay J.; Mende-Siedlecki, Peter; Brady, William J.; Reinero, Diego A.

    2016-01-01

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals, found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher’s degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed “hidden moderators”) between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility. PMID:27217556

  16. TRACE and TRAC-BF1 benchmark against Leibstadt plant data during the event inadvertent opening of relief valves

    Energy Technology Data Exchange (ETDEWEB)

    Sekhri, A.; Baumann, P. [Kernkraftwerk Leibstadt AG, 5325 Leibstadt (Switzerland); Wicaksono, D. [Swiss Federal Inst. of Technology Zurich ETH, 8092 Zurich (Switzerland); Miro, R.; Barrachina, T.; Verdu, G. [Inst. for Industrial, Radiophysical and Environmental Safety ISIRYM, Universitat Politecnica de Valencia UPV, Cami de Vera s/n, 46021 Valencia (Spain)

    2012-07-01

    In the framework of introducing the TRACE code among the system codes used for transient analyses at Leibstadt Power Plant (KKL), a conversion of the existing TRAC-BF1 model to TRACE has been started within KKL. In the first step, a TRACE thermal-hydraulic model for KKL has been developed based on the existing TRAC-BF1 model. In order to assess the code models, a simulation of a plant transient event is required; to this end, simulations of the inadvertent opening of 8 relief valves have been performed. The event occurred at KKL during normal operation and started when 8 relief valves opened, resulting in a depressurization of the Reactor Pressure Vessel (RPV). The reactor was shut down safely by SCRAM at low level. The high pressure core spray (HPCS) and the reactor core isolation cooling (RCIC) were started manually in order to compensate for the level drop. The remaining water in the feedwater (FW) lines flashed due to saturation conditions originating from the RPV depressurization and refilled the reactor downcomer. The plant boundary conditions have been used in the simulations, and the FW flow rate has been adjusted for better prediction. The simulations reproduce the plant data with good agreement. It can be concluded that the existing TRAC-BF1 model has been used successfully to develop the TRACE model and that the results of the calculations show good agreement with plant recorded data. Besides the modeling assessment, the TRACE and TRAC-BF1 capabilities to reproduce the plant's physical behavior during the transient have shown satisfactory results. The first step of developing the KKL model for TRACE has thus been achieved, and the model is being further developed in order to simulate more complex plant behavior such as a turbine trip. (authors)

  17. TRACE and TRAC-BF1 benchmark against Leibstadt plant data during the event inadvertent opening of relief valves

    International Nuclear Information System (INIS)

    Sekhri, A.; Baumann, P.; Wicaksono, D.; Miro, R.; Barrachina, T.; Verdu, G.

    2012-01-01

    In the framework of introducing the TRACE code among the system codes used for transient analyses at Leibstadt Power Plant (KKL), a conversion of the existing TRAC-BF1 model to TRACE has been started within KKL. In the first step, a TRACE thermal-hydraulic model for KKL has been developed based on the existing TRAC-BF1 model. In order to assess the code models, a simulation of a plant transient event is required; to this end, simulations of the inadvertent opening of 8 relief valves have been performed. The event occurred at KKL during normal operation and started when 8 relief valves opened, resulting in a depressurization of the Reactor Pressure Vessel (RPV). The reactor was shut down safely by SCRAM at low level. The high pressure core spray (HPCS) and the reactor core isolation cooling (RCIC) were started manually in order to compensate for the level drop. The remaining water in the feedwater (FW) lines flashed due to saturation conditions originating from the RPV depressurization and refilled the reactor downcomer. The plant boundary conditions have been used in the simulations, and the FW flow rate has been adjusted for better prediction. The simulations reproduce the plant data with good agreement. It can be concluded that the existing TRAC-BF1 model has been used successfully to develop the TRACE model and that the results of the calculations show good agreement with plant recorded data. Besides the modeling assessment, the TRACE and TRAC-BF1 capabilities to reproduce the plant's physical behavior during the transient have shown satisfactory results. The first step of developing the KKL model for TRACE has thus been achieved, and the model is being further developed in order to simulate more complex plant behavior such as a turbine trip. (authors)

  18. Testing Reproducibility in Earth Sciences

    Science.gov (United States)

    Church, M. A.; Dudill, A. R.; Frey, P.; Venditti, J. G.

    2017-12-01

    Reproducibility represents how closely the results of independent tests agree when undertaken using the same materials but different conditions of measurement, such as operator, equipment or laboratory. The concept of reproducibility is fundamental to the scientific method as it prevents the persistence of incorrect or biased results. Yet currently the production of scientific knowledge emphasizes rapid publication of previously unreported findings, a culture that has emerged from pressures related to hiring, publication criteria and funding requirements. Awareness and critique of the disconnect between how scientific research should be undertaken, and how it actually is conducted, have been prominent in biomedicine for over a decade, with the fields of economics and psychology more recently joining the conversation. The purpose of this presentation is to stimulate the conversation in earth sciences where, despite implicit evidence in widely accepted classifications, formal testing of reproducibility is rare. As a formal test of reproducibility, two sets of experiments were undertaken with the same experimental procedure, at the same scale, but in different laboratories. Using narrow, steep flumes and spherical glass beads, grain size sorting was examined by introducing fine sediment of varying size and quantity into a mobile coarse bed. The general setup was identical, including flume width and slope; however, there were some variations in the materials, construction and lab environment. Comparison of the results includes examination of the infiltration profiles, sediment mobility and transport characteristics. The physical phenomena were qualitatively reproduced but not quantitatively replicated. Reproduction of results encourages more robust research and reporting, and facilitates exploration of possible variations in data in various specific contexts. Following the lead of other fields, testing of reproducibility can be incentivized through changes to journal

  19. Reproducible Bioinformatics Research for Biologists

    Science.gov (United States)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  20. Reproducibility of brain ADC histograms

    International Nuclear Information System (INIS)

    Steens, S.C.A.; Buchem, M.A. van; Admiraal-Behloul, F.; Schaap, J.A.; Hoogenraad, F.G.C.; Wheeler-Kingshott, C.A.M.; Tofts, P.S.; Cessie, S. le

    2004-01-01

    The aim of this study was to assess the effect of differences in acquisition technique on whole-brain apparent diffusion coefficient (ADC) histogram parameters, as well as to assess scan-rescan reproducibility. Diffusion-weighted imaging (DWI) was performed in 7 healthy subjects with b-values 0-800, 0-1000, and 0-1500 s/mm² and fluid-attenuated inversion recovery (FLAIR) DWI with b-values 0-1000 s/mm². All sequences were repeated with and without repositioning. The peak location, peak height, and mean ADC of the ADC histograms and the mean ADC of a region of interest (ROI) in the white matter were compared using paired-sample t tests. Scan-rescan reproducibility was assessed using paired-sample t tests, and repeatability coefficients were reported. With increasing maximum b-values, ADC histograms shifted to lower values, with an increase in peak height (p<0.01). With FLAIR DWI, the ADC histogram shifted to lower values with a significantly higher, narrower peak (p<0.01), although the ROI mean ADC showed no significant differences. For scan-rescan reproducibility, no significant differences were observed. Different DWI pulse sequences give rise to different ADC histograms. With a given pulse sequence, however, ADC histogram analysis is a robust and reproducible technique. Using FLAIR DWI, the partial-voluming effect of cerebrospinal fluid, and thus its confounding effect on histogram analyses, can be reduced
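
    The histogram parameters named here are straightforward to compute; a sketch on a synthetic ADC map (array shape, bin count and value ranges invented):

        import numpy as np

        def adc_histogram_params(adc_map, bins=200, adc_range=(0.0, 4e-3)):
            """Peak location, peak height and mean of a whole-brain ADC histogram."""
            values = adc_map[adc_map > 0]              # keep brain voxels only
            counts, edges = np.histogram(values, bins=bins, range=adc_range)
            centers = 0.5 * (edges[:-1] + edges[1:])
            peak = counts.argmax()
            return {"peak_location": centers[peak],
                    "peak_height": counts[peak] / counts.sum(),
                    "mean_adc": values.mean()}

        rng = np.random.default_rng(0)
        synthetic = rng.normal(0.8e-3, 0.1e-3, size=(64, 64, 32))  # mm²/s
        print(adc_histogram_params(synthetic))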

  1. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2012-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m·min⁻¹ cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built-up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognize systematic error distorting the performance test....

  2. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2014-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m·min⁻¹ cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built-up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognise systematic error distorting the performance test....
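
    The statistics behind such an assessment are simple; a generic sketch (all sample numbers invented) separating within-operator repeatability from between-operator reproducibility:

        import statistics

        # Hypothetical evaluation parameter (e.g. a surface finish value)
        # measured by three operators rating the same cutting fluid.
        measurements = {
            "operator_A": [1.82, 1.79, 1.85, 1.80, 1.84],
            "operator_B": [1.90, 1.88, 1.93, 1.89, 1.91],
            "operator_C": [1.78, 1.81, 1.76, 1.80, 1.79],
        }

        means = {op: statistics.mean(v) for op, v in measurements.items()}
        within = statistics.mean(statistics.variance(v)
                                 for v in measurements.values())
        between = statistics.variance(means.values())

        print("per-operator means:", means)
        print("repeatability std dev (within):", within ** 0.5)
        print("reproducibility std dev (between):", between ** 0.5)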

  3. Main considerations for modelling a station blackout scenario with TRACE

    International Nuclear Information System (INIS)

    Querol, Andrea; Turégano, Jara; Lorduy, María; Gallardo, Sergio; Verdú, Gumersindo

    2017-01-01

    In the nuclear safety field, the thermal-hydraulic phenomena that take place during an accident in a nuclear power plant are of special importance. One of the most studied accidents is the Station BlackOut (SBO). The aim of the present work is the analysis of the PKL integral test facility nodalization using the thermal-hydraulic code TRACE5 to reproduce an SBO accidental scenario. The PKL facility reproduces the main components of the primary and secondary systems of its reference nuclear power plant (Philippsburg II). The results obtained with different nodalizations have been compared: a 3D vessel vs a 1D vessel, Steam Generator (SG) modelling using PIPE or TEE components, and pressurizer modelling with PIPE or PRIZER components. Both vessel nodalizations (1D vessel and 3D vessel) reproduce the physical phenomena of the experiment; however, there are significant discrepancies between them. The appropriate modelling of the SG is also relevant to the results. The other modelling choices (PIPE or TEE components for the SG and PIPE or PRIZER components for the pressurizer) do not produce relevant differences in the results. (author)

  4. Main considerations for modelling a station blackout scenario with TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Querol, Andrea; Turégano, Jara; Lorduy, María; Gallardo, Sergio; Verdú, Gumersindo, E-mail: anquevi@upv.es, E-mail: jaturna@upv.es, E-mail: maloral@upv.es, E-mail: sergalbe@iqn.upv.es, E-mail: gverdu@iqn.upv.es [Instituto Universitario de Seguridad Industrial, Radiofísica y Medioambiental (ISIRYM), Universitat Politècnica de València (Spain)

    2017-07-01

    In the nuclear safety field, the thermal-hydraulic phenomena that take place during an accident in a nuclear power plant are of special importance. One of the most studied accidents is the Station BlackOut (SBO). The aim of the present work is the analysis of the PKL integral test facility nodalization using the thermal-hydraulic code TRACE5 to reproduce an SBO accidental scenario. The PKL facility reproduces the main components of the primary and secondary systems of its reference nuclear power plant (Philippsburg II). The results obtained with different nodalizations have been compared: a 3D vessel vs a 1D vessel, Steam Generator (SG) modelling using PIPE or TEE components, and pressurizer modelling with PIPE or PRIZER components. Both vessel nodalizations (1D vessel and 3D vessel) reproduce the physical phenomena of the experiment; however, there are significant discrepancies between them. The appropriate modelling of the SG is also relevant to the results. The other modelling choices (PIPE or TEE components for the SG and PIPE or PRIZER components for the pressurizer) do not produce relevant differences in the results. (author)

  5. Linac particle tracing simulations

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1979-01-01

    A particle tracing code was developed to study space-charge effects in proton or heavy-ion linear accelerators. The purpose is to study space-charge phenomena as directly as possible without the complications of many accelerator details. Thus, the accelerator is represented simply by harmonic oscillator or impulse restoring forces. Variable parameters as well as mismatched phase-space distributions were studied. This study represents the initial search for those features of the accelerator or of the phase-space distribution that lead to emittance growth.
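
    In that simplified spirit (harmonic restoring forces standing in for the lattice; all numbers invented), a particle tracer reduces to integrating an ensemble of oscillators and monitoring the phase-space distribution:

        import numpy as np

        def trace(x, v, k=1.0, dt=0.01, steps=1000):
            """Leapfrog-integrate particles in a linear restoring force F = -k x."""
            v = v - 0.5 * dt * k * x          # initial half kick
            for _ in range(steps):
                x = x + dt * v                # drift
                v = v - dt * k * x            # kick
            return x, v + 0.5 * dt * k * x    # resynchronize velocity

        def rms_emittance(x, v):
            """RMS phase-space area; linear forces conserve it, while
            space-charge nonlinearities (not modelled here) make it grow."""
            return np.sqrt(np.mean(x**2) * np.mean(v**2) - np.mean(x * v)**2)

        rng = np.random.default_rng(1)
        x0, v0 = rng.normal(0, 1.0, 10000), rng.normal(0, 2.0, 10000)
        x1, v1 = trace(x0, v0)
        print(rms_emittance(x0, v0), rms_emittance(x1, v1))  # unchanged here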

  6. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    Science.gov (United States)

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps, and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.
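
    The core mechanic of literate programming (extracting executable code from a document organized for human readers) is easy to sketch. The noweb-style chunk markers below are illustrative only, not Lir's actual syntax:

        import re
        import textwrap

        DOCUMENT = textwrap.dedent("""\
            We first load the measurements.

            <<load>>=
            values = [float(s) for s in open("data.txt")]
            @

            Then we report their mean.

            <<report>>=
            print(sum(values) / len(values))
            @
            """)

        def tangle(text):
            """Extract code chunks (between <<name>>= and @) in document order."""
            chunks = re.findall(r"<<(\w+)>>=\n(.*?)\n@", text, flags=re.S)
            return "\n".join(body for _, body in chunks)

        print(tangle(DOCUMENT))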

  7. Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and water-moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. Flow distribution among parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flow-rate conditions, constant or varying with time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code; its complement, FLID, is a single-channel, two-dimensional code. (authors)

  8. Reproducibility of isotope ratio measurements

    International Nuclear Information System (INIS)

    Elmore, D.

    1981-01-01

    The use of an accelerator as part of a mass spectrometer has improved the sensitivity for measuring low levels of long-lived radionuclides by several orders of magnitude. However, the complexity of a large tandem accelerator and beam transport system has made it difficult to match the precision of low energy mass spectrometry. Although uncertainties for accelerator measured isotope ratios as low as 1% have been obtained under favorable conditions, most errors quoted in the literature for natural samples are in the 5 to 20% range. These errors are dominated by statistics and generally the reproducibility is unknown since the samples are only measured once

  9. Evaluation of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2007-01-01

    A study was conducted with the goal of quantifying auditory attributes which underlie listener preference for multichannel reproduced sound. Short musical excerpts were presented in mono, stereo and several multichannel formats to a panel of forty selected listeners. Scaling of auditory attributes......, as well as overall preference, was based on consistency tests of binary paired-comparison judgments and on modeling the choice frequencies using probabilistic choice models. As a result, the preferences of non-expert listeners could be measured reliably at a ratio scale level. Principal components derived...

  10. Tracing Clues

    DEFF Research Database (Denmark)

    Feldt, Liv Egholm

    The past is all messiness and blurred relations. However, we tend to sort the messiness out through rigorous analytical studies, leaving the messiness behind. Carlo Ginzburg's article Clues: Roots of an Evidential Paradigm from 1986 invigorates methodological elements of (historical) research, which...... central methodological elements will be further elaborated and discussed through a historical case study that traces how networks of philanthropic concepts and practices influenced the Danish welfare state in the period from the Danish constitution of 1849 until today. The overall aim of this paper...

  11. Piezoelectric trace vapor calibrator

    International Nuclear Information System (INIS)

    Verkouteren, R. Michael; Gillen, Greg; Taylor, David W.

    2006-01-01

    The design and performance of a vapor generator for calibration and testing of trace chemical sensors are described. The device utilizes piezoelectric ink-jet nozzles to dispense and vaporize precisely known amounts of analyte solutions as monodisperse droplets onto a hot ceramic surface, where the generated vapors are mixed with air before exiting the device. Injected droplets are monitored by microscope with strobed illumination, and the reproducibility of droplet volumes is optimized by adjustment of piezoelectric waveform parameters. Complete vaporization of the droplets occurs only across a 10 °C window within the transition boiling regime of the solvent, and the minimum and maximum rates of trace analyte that may be injected and evaporated are determined by thermodynamic principles and empirical observations of droplet formation and stability. By varying solution concentrations, droplet injection rates, air flow, and the number of active nozzles, the system is designed to deliver--on demand--continuous vapor concentrations across more than six orders of magnitude (nominally 290 fg/l to 1.05 μg/l). Vapor pulses containing femtogram to microgram quantities of analyte may also be generated. Calibrated ranges of three explosive vapors at ng/l levels were generated by the device and directly measured by ion mobility spectrometry (IMS). These data demonstrate expected linear trends within the limited working range of the IMS detector and also exhibit subtle nonlinear behavior from the IMS measurement process
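
    The six-decade dynamic range quoted follows from a simple mass balance; a sketch with invented operating values:

        # Vapor concentration = analyte mass rate / dilution air flow.
        droplet_volume = 50e-12      # L   (a ~50 pL droplet, hypothetical)
        solution_conc = 1e-3         # g/L of analyte in the solvent
        drop_rate = 10.0             # droplets injected per second
        air_flow = 1.0 / 60.0        # L/s (1 L/min of dilution air)

        mass_rate = droplet_volume * solution_conc * drop_rate   # g/s
        concentration = mass_rate / air_flow                     # g/L of air
        print(f"{concentration * 1e15:.0f} fg/L")                # 30000 fg/L

    Varying the solution concentration, drop rate, air flow and number of active nozzles moves this result across the quoted range.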

  12. Simulation of a main steam line break with steam generator tube rupture using TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Gallardo, S.; Querol, A.; Verdu, G. [Departamento de Ingenieria Quimica Y Nuclear, Universitat Politecnica de Valencia, Camino de Vera s/n, 46022, Valencia (Spain)

    2012-07-01

    A simulation of the OECD/NEA ROSA-2 Project Test 5 was made with the thermal-hydraulic code TRACE5. Test 5, performed in the Large Scale Test Facility (LSTF), reproduced a Main Steam Line Break (MSLB) with a Steam Generator Tube Rupture (SGTR) in a Pressurized Water Reactor (PWR). The result of these simultaneous breaks is a depressurization in the secondary and primary systems in loop B, because both systems are connected through the SGTR. Good agreement was obtained between TRACE5 results and experimental data. TRACE5 qualitatively reproduces the phenomena that occur in this transient: the fall of the primary pressure after the break, the stagnation of the pressure after the opening of the relief valve of the intact steam generator, the pressure falls after the two openings of the PORV, and the recovery of the liquid level in the pressurizer after each closure of the PORV. Furthermore, a sensitivity analysis has been performed to determine the effect of varying the High Pressure Injection (HPI) flow rate in both loops on the evolution of the system pressures. (authors)

  13. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e. reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scien...

  14. Trace spaces

    DEFF Research Database (Denmark)

    Fajstrup, Lisbeth; Goubault, Eric; Haucourt, Emmanuel

    2012-01-01

    in the interleaving semantics of a concurrent program, but rather some equivalence classes. The purpose of this paper is to describe a new algorithm to compute such equivalence classes, and a representative per class, which is based on ideas originating in algebraic topology. We introduce a geometric semantics...... of concurrent languages, where programs are interpreted as directed topological spaces, and study its properties in order to devise an algorithm for computing dihomotopy classes of execution paths. In particular, our algorithm is able to compute a control-flow graph for concurrent programs, possibly containing...... loops, which is “as reduced as possible” in the sense that it generates traces modulo equivalence. A preliminary implementation was achieved, showing promising results towards efficient methods to analyze concurrent programs, with very promising results compared to partial-order reduction techniques....

  15. Properties of galaxies reproduced by a hydrodynamic simulation

    Science.gov (United States)

    Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Sijacki, D.; Xu, D.; Snyder, G.; Bird, S.; Nelson, D.; Hernquist, L.

    2014-05-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the `cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the `metal' and hydrogen content of galaxies on small scales.

  16. Reproducibility and Transparency in Ocean-Climate Modeling

    Science.gov (United States)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than provide raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility makes using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
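
    One of the techniques mentioned, checksums on experiment output kept under version control, fits in a few lines; the paths here are invented:

        import hashlib
        import pathlib

        def manifest(output_dir):
            """Checksum every file under an experiment's output directory so the
            manifest can be committed and diffed later to detect solution changes."""
            lines = []
            for path in sorted(pathlib.Path(output_dir).rglob("*")):
                if path.is_file():
                    digest = hashlib.sha256(path.read_bytes()).hexdigest()
                    lines.append(f"{digest}  {path}")
            return "\n".join(lines)

        # Commit this alongside run-time parameters and component versions.
        print(manifest("experiment_01/output"))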

  17. Shear wave elastography for breast masses is highly reproducible.

    Science.gov (United States)

    Cosgrove, David O; Berg, Wendie A; Doré, Caroline J; Skyba, Danny M; Henry, Jean-Pierre; Gay, Joel; Cohen-Bacrie, Claude

    2012-05-01

    To evaluate intra- and interobserver reproducibility of shear wave elastography (SWE) for breast masses. For intraobserver reproducibility, each observer obtained three consecutive SWE images of 758 masses that were visible on ultrasound. 144 (19%) were malignant. Weighted kappa was used to assess the agreement of qualitative elastographic features; the reliability of quantitative measurements was assessed by intraclass correlation coefficients (ICC). For the interobserver reproducibility, a blinded observer reviewed images and agreement on features was determined. Mean age was 50 years; mean mass size was 13 mm. Qualitatively, SWE images were at least reasonably similar for 666/758 (87.9%). Intraclass correlation for SWE diameter, area and perimeter was almost perfect (ICC ≥ 0.94). Intraobserver reliability for maximum and mean elasticity was almost perfect (ICC = 0.84 and 0.87) and was substantial for the ratio of mass-to-fat elasticity (ICC = 0.77). Interobserver agreement was moderate for SWE homogeneity (κ = 0.57), substantial for qualitative colour assessment of maximum elasticity (κ = 0.66), fair for SWE shape (κ = 0.40), fair for B-mode mass margins (κ = 0.38), and moderate for B-mode mass shape (κ = 0.58), orientation (κ = 0.53) and BI-RADS assessment (κ = 0.59). SWE is highly reproducible for assessing elastographic features of breast masses within and across observers. SWE interpretation is at least as consistent as that of BI-RADS ultrasound B-mode features. • Shear wave ultrasound elastography can measure the stiffness of breast tissue • It provides a qualitatively and quantitatively interpretable colour-coded map of tissue stiffness • Intraobserver reproducibility of SWE is almost perfect while interobserver reproducibility proved to be moderate to substantial • The most reproducible SWE features between observers were SWE image homogeneity and maximum elasticity.

  18. Ray Trace

    International Nuclear Information System (INIS)

    Kowalski, S.

    1981-01-01

    During the past decade, a very general RAYTRACE code has been developed at MIT for following the trajectories of charged particles through ion-optical systems. The motion of a particle carrying charge Q is governed by the Lorentz equation, $\mathbf{F} = Q[\mathbf{E} + \mathbf{v} \times \mathbf{B}]$, where $\mathbf{E}$ is the electric field and $\mathbf{B}$ is the magnetic field. In a rectangular (x,y,z) coordinate system the equations of motion along each of the axes may be written as $m\ddot{x} = Q(E_x + v_y B_z - v_z B_y)$, $m\ddot{y} = Q(E_y + v_z B_x - v_x B_z)$, $m\ddot{z} = Q(E_z + v_x B_y - v_y B_x)$. These three differential equations of motion are solved by means of a step-by-step numerical integration with time as the independent variable. Accuracy is limited only by the uncertainties in our knowledge of the electric and magnetic fields. Current versions of the code may be used to calculate trajectories through an arbitrary arrangement of elements including dipoles, quadrupoles, general multipoles, solenoids, velocity selectors, drifts and thin lenses
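
    A minimal sketch of such a step-by-step integration, here using a Boris-style scheme (a common choice for the Lorentz force, though not necessarily RAYTRACE's own integrator); fields, step size and particle are invented:

        import numpy as np

        def boris_step(x, v, q_over_m, E, B, dt):
            """Advance position and velocity one step under F = q(E + v x B)."""
            v_minus = v + 0.5 * dt * q_over_m * E       # first half electric kick
            t = 0.5 * dt * q_over_m * B                 # magnetic rotation vector
            s = 2.0 * t / (1.0 + np.dot(t, t))
            v_prime = v_minus + np.cross(v_minus, t)
            v_plus = v_minus + np.cross(v_prime, s)     # rotated velocity
            v_new = v_plus + 0.5 * dt * q_over_m * E    # second half electric kick
            return x + dt * v_new, v_new

        x, v = np.zeros(3), np.array([1.0e5, 0.0, 0.0])  # m, m/s
        E, B = np.zeros(3), np.array([0.0, 0.0, 0.1])    # V/m, T
        for _ in range(1000):                            # proton gyrating in B
            x, v = boris_step(x, v, 9.58e7, E, B, 1e-9)
        print(x, np.linalg.norm(v))                      # |v| is conserved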

  19. Traces et espaces de consommation

    Directory of Open Access Journals (Sweden)

    Franck Cochoy

    2016-10-01

    The advent of mobile digital technologies is contributing to a change in the modalities of distribution and consumption. This article deals with the use of QR codes, the two-dimensional barcodes that give any user equipped with a smartphone access to online commercial content. They are part of the Internet of Things, and thus of the coupling between physical space and the digital world. They also allow the collection of digital traces that are meaningful both for practitioners and for the social sciences. Through these traces, one can understand the new market ties woven between physical space and the development of continuous information flows. Based on an analysis of the traces recorded by visits to the QR codes placed on three food products (a box of salt, a chocolate bar, and a bottle of water), our study sets out to clarify the theoretical, methodological and analytical stakes of the process of digitization of the physical space of consumer mobility.

  20. Simulation of a passive auxiliary feedwater system with TRACE5

    Energy Technology Data Exchange (ETDEWEB)

    Lorduy, María; Gallardo, Sergio; Verdú, Gumersindo, E-mail: maloral@upv.es, E-mail: sergalbe@iqn.upv.es, E-mail: gverdu@iqn.upv.es [Instituto Universitario de Seguridad Industrial, Radiofísica y Medioambiental (ISIRYM), València (Spain)

    2017-07-01

    The study of the nuclear power plant accidents that have occurred in recent decades, as well as the probabilistic risk assessments carried out for this type of facility, identify human error as one of the main contingency factors. For this reason, the design and development of generation III, III+ and IV reactors, which include inherent and passive safety systems, have been promoted. In this work, a TRACE5 model of ATLAS (Advanced Thermal-Hydraulic Test Loop for Accident Simulation) is used to reproduce an accidental scenario consisting of a prolonged Station BlackOut (SBO). In particular, the A1.2 test of the OECD-ATLAS project is analyzed, whose purpose is to study primary system cooling by means of the water supply to one of the steam generators from a Passive Auxiliary Feedwater System (PAFS). This safety feature prevents the loss of secondary system inventory by means of steam condensation and recirculation; the preservation of a heat sink thus sustains the natural circulation flow until stable conditions are restored. For the reproduction of the test, an ATLAS model has been adapted to the experiment conditions, and a PAFS has been incorporated. From the simulation test results, the main thermal-hydraulic variables (pressure, flow rates, collapsed water level and temperature) are analyzed in the different circuits, contrasting them with the experimental data series. In conclusion, the work shows the capability of the TRACE5 code to correctly simulate the behavior of a passive feedwater system. (author)

  1. Theory of reproducing kernels and applications

    CERN Document Server

    Saitoh, Saburou

    2016-01-01

    This book provides a large extension of the general theory of reproducing kernels published by N. Aronszajn in 1950, with many concrete applications. In Chapter 1, many concrete reproducing kernels are first introduced with detailed information. Chapter 2 presents a general and global theory of reproducing kernels with basic applications in a self-contained way. Many fundamental operations among reproducing kernel Hilbert spaces are dealt with. Chapter 2 is the heart of this book. Chapter 3 is devoted to the Tikhonov regularization using the theory of reproducing kernels with applications to numerical and practical solutions of bounded linear operator equations. In Chapter 4, the numerical real inversion formulas of the Laplace transform are presented by applying the Tikhonov regularization, where the reproducing kernels play a key role in the results. Chapter 5 deals with ordinary differential equations; Chapter 6 includes many concrete results for various fundamental partial differential equations. In Chapt...
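
    For orientation, the defining property behind the book's subject (standard material, not quoted from it): a Hilbert space H of functions on a set E has reproducing kernel K when, for every point p in E,

        K(\cdot, p) \in H
        \qquad\text{and}\qquad
        f(p) = \langle f,\, K(\cdot, p) \rangle_H \quad \text{for all } f \in H.

    Evaluation at a point is thus a continuous linear functional represented by the kernel, which is the property the regularization and inversion applications in the later chapters exploit.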

  2. Epidemic contact tracing via communication traces.

    Directory of Open Access Journals (Sweden)

    Katayoun Farrahi

    Full Text Available Traditional contact tracing relies on knowledge of the interpersonal network of physical interactions, where contagious outbreaks propagate. However, due to privacy constraints and noisy data assimilation, this network is generally difficult to reconstruct accurately. Communication traces obtained by mobile phones are known to be good proxies for the physical interaction network, and they may provide a valuable tool for contact tracing. Motivated by this assumption, we propose a model for contact tracing, where an infection is spreading in the physical interpersonal network, which can never be fully recovered; and contact tracing is occurring in a communication network which acts as a proxy for the first. We apply this dual model to a dataset covering 72 students over a 9 month period, for which both the physical interactions as well as the mobile communication traces are known. Our results suggest that a wide range of contact tracing strategies may significantly reduce the final size of the epidemic, by mainly affecting its peak of incidence. However, we find that for low overlap between the face-to-face and communication interaction network, contact tracing is only efficient at the beginning of the outbreak, due to rapidly increasing costs as the epidemic evolves. Overall, contact tracing via mobile phone communication traces may be a viable option to arrest contagious outbreaks.

  3. Epidemic contact tracing via communication traces.

    Science.gov (United States)

    Farrahi, Katayoun; Emonet, Rémi; Cebrian, Manuel

    2014-01-01

    Traditional contact tracing relies on knowledge of the interpersonal network of physical interactions, where contagious outbreaks propagate. However, due to privacy constraints and noisy data assimilation, this network is generally difficult to reconstruct accurately. Communication traces obtained by mobile phones are known to be good proxies for the physical interaction network, and they may provide a valuable tool for contact tracing. Motivated by this assumption, we propose a model for contact tracing, where an infection is spreading in the physical interpersonal network, which can never be fully recovered; and contact tracing is occurring in a communication network which acts as a proxy for the first. We apply this dual model to a dataset covering 72 students over a 9 month period, for which both the physical interactions as well as the mobile communication traces are known. Our results suggest that a wide range of contact tracing strategies may significantly reduce the final size of the epidemic, by mainly affecting its peak of incidence. However, we find that for low overlap between the face-to-face and communication interaction network, contact tracing is only efficient at the beginning of the outbreak, due to rapidly increasing costs as the epidemic evolves. Overall, contact tracing via mobile phone communication traces may be a viable option to arrest contagious outbreaks.
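
    The dual-network setup in this model is straightforward to prototype. Below is a minimal Python sketch, not the authors' code: infection spreads over a "physical" random graph while tracing quarantines contacts on a partially distinct "communication" graph; all sizes and probabilities are arbitrary example values.

      import random
      import networkx as nx

      random.seed(1)
      N = 200
      physical = nx.erdos_renyi_graph(N, 0.04, seed=1)   # face-to-face contacts
      comm = nx.erdos_renyi_graph(N, 0.04, seed=2)       # phone-call proxy network

      p_infect, p_trace = 0.05, 0.5   # arbitrary per-step probabilities
      infected, quarantined = {0}, set()

      for step in range(50):
          # Infection spreads over *physical* edges from non-quarantined cases.
          new_inf = {v for u in infected - quarantined
                     for v in physical.neighbors(u)
                     if v not in infected and random.random() < p_infect}
          # Tracing acts on the *communication* graph: contacts of known
          # cases are quarantined with probability p_trace.
          new_q = {v for u in infected
                   for v in comm.neighbors(u)
                   if random.random() < p_trace}
          infected |= new_inf
          quarantined |= new_q

      print(f"final size: {len(infected)}/{N}, quarantined: {len(quarantined)}")

    Varying the overlap between the two graphs (here controlled by the seeds) reproduces the qualitative point of the abstract: the lower the overlap, the less tracing on the proxy network contains the epidemic.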

  4. Tools for Reproducibility and Extensibility in Scientific Research

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Open inquiry through reproducing results is fundamental to the scientific process. Contemporary research relies on software engineering pipelines to collect, process, and analyze data. The open source projects within Project Jupyter facilitate these objectives by bringing software engineering within the context of scientific communication. We will highlight specific projects that are computational building blocks for scientific communication, starting with the Jupyter Notebook. We will also explore applications of projects that build off of the Notebook such as Binder, JupyterHub, and repo2docker. We will discuss how these projects can individually and jointly improve reproducibility in scientific communication. Finally, we will demonstrate applications of Jupyter software that allow researchers to build upon the code of other scientists, both to extend their work and the work of others.    There will be a follow-up demo session in the afternoon, hosted by iML. Details can be foun...

  5. An open source approach to enable the reproducibility of scientific workflows in the ocean sciences

    Science.gov (United States)

    Di Stefano, M.; Fox, P. A.; West, P.; Hare, J. A.; Maffei, A. R.

    2013-12-01

    Every scientist should be able to rerun data analyses conducted by his or her team and regenerate the figures in a paper. However, all too often the correct version of a script goes missing, or the original raw data is filtered by hand and the filtering process is undocumented, or there is a lack of collaboration and communication among scientists working in a team. Here we present 3 different use cases in ocean sciences in which end-to-end workflows are tracked. The main tool that is deployed to address these use cases is based on a web application (IPython Notebook) that provides the ability to work on very diverse and heterogeneous data and information sources, providing an effective way to share and track changes to the source code used to generate data products and associated metadata, as well as to track the overall workflow provenance to allow versioned reproducibility of a data product. The use cases selected for this work are: 1) A partial reproduction of the Ecosystem Status Report (ESR) for the Northeast U.S. Continental Shelf Large Marine Ecosystem. Our goal with this use case is to enable not just the traceability but also the reproducibility of this biannual report, keeping track of all the processes behind the generation and validation of time-series and spatial data and information products. An end-to-end workflow with code versioning is developed so that indicators in the report may be traced back to the source datasets. 2) Realtime generation of web pages to visualize one of the environmental indicators from the Ecosystem Advisory for the Northeast Shelf Large Marine Ecosystem web site. 3) Data and visualization integration for ocean climate forecasting. In this use case, we focus on a workflow to describe how to provide access to online data sources in the NetCDF format and other model data, and make use of multicore processing to generate video animation from time series of gridded data. For each use case we show how complete workflows
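
    A condensed Python sketch of the pattern in the third use case: read a gridded NetCDF time series and fan frame rendering out across cores. The file and variable names are hypothetical; the netCDF4 and matplotlib packages are assumed to be available.

      from multiprocessing import Pool

      import matplotlib
      matplotlib.use("Agg")   # headless rendering for batch frame output
      import matplotlib.pyplot as plt
      import netCDF4

      def render_frame(t):
          # Render one time slice of the gridded field to a PNG frame.
          with netCDF4.Dataset("sst_timeseries.nc") as ds:   # hypothetical file
              field = ds.variables["sst"][t]                 # hypothetical variable
          fig, ax = plt.subplots()
          ax.imshow(field, origin="lower")
          fig.savefig(f"frame_{t:04d}.png")
          plt.close(fig)
          return t

      if __name__ == "__main__":
          with Pool() as pool:   # one worker process per core by default
              pool.map(render_frame, range(120))
          # The numbered frames can then be stitched into a video animation.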

  6. LUCID - an optical design and raytrace code

    International Nuclear Information System (INIS)

    Nicholas, D.J.; Duffey, K.P.

    1980-11-01

    A 2D optical design and ray trace code is described. The code can operate either as a geometric optics propagation code or provide a scalar diffraction treatment. There are numerous non-standard options within the code, including design and systems optimisation procedures. A number of illustrative problems relating to the design of optical components in the field of high-power lasers are included. (author)
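
    For readers unfamiliar with geometric-optics propagation, the basic mechanics can be sketched with paraxial ray-transfer matrices. The toy Python example below only illustrates the idea; it is not LUCID code.

      import numpy as np

      def free_space(d):
          # Ray-transfer matrix for propagation over a distance d.
          return np.array([[1.0, d], [0.0, 1.0]])

      def thin_lens(f):
          # Ray-transfer matrix for a thin lens of focal length f.
          return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

      # A fan of parallel rays, each a (height, angle) column vector.
      rays = np.array([[y, 0.0] for y in np.linspace(-1.0, 1.0, 5)]).T

      # Propagate 10 units, hit an f = 5 lens, then travel 5 units to the focus.
      # Matrices compose right-to-left, like the optical elements they model.
      system = free_space(5.0) @ thin_lens(5.0) @ free_space(10.0)
      out = system @ rays

      print(out[0])   # heights ~0: the parallel fan converges at the focal plane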

  7. Reproducibility of surface roughness in reaming

    DEFF Research Database (Denmark)

    Müller, Pavel; De Chiffre, Leonardo

    An investigation on the reproducibility of surface roughness in reaming was performed to document the applicability of this approach for testing cutting fluids. Austenitic stainless steel was used as a workpiece material and HSS reamers as cutting tools. Reproducibility of the results was evaluat...

  8. Reproducibility principles, problems, practices, and prospects

    CERN Document Server

    Maasen, Sabine

    2016-01-01

    Featuring peer-reviewed contributions from noted experts in their fields of research, Reproducibility: Principles, Problems, Practices, and Prospects presents state-of-the-art approaches to reproducibility, the gold standard sound science, from multi- and interdisciplinary perspectives. Including comprehensive coverage for implementing and reflecting the norm of reproducibility in various pertinent fields of research, the book focuses on how the reproducibility of results is applied, how it may be limited, and how such limitations can be understood or even controlled in the natural sciences, computational sciences, life sciences, social sciences, and studies of science and technology. The book presents many chapters devoted to a variety of methods and techniques, as well as their epistemic and ontological underpinnings, which have been developed to safeguard reproducible research and curtail deficits and failures. The book also investigates the political, historical, and social practices that underlie repro...

  9. Geochemical computer codes. A review

    International Nuclear Information System (INIS)

    Andersson, K.

    1987-01-01

    In this report a review of available codes is performed and some code intercomparisons are also discussed. The number of codes treating natural waters (groundwater, lake water, sea water) is large. Most geochemical computer codes treat equilibrium conditions, although some codes with kinetic capability are available. A geochemical equilibrium model consists of a computer code, which solves a set of equations by some numerical method, and a database of the thermodynamic data required for the calculations. There are some codes which treat coupled geochemical and transport modeling. Some of these codes solve the equilibrium and transport equations simultaneously, while others solve the equations separately from each other. The coupled codes require a large computer capacity and have thus had limited use so far. Three code intercomparisons have been found in the literature. It may be concluded that there are many codes available for geochemical calculations, but most of them require a user who is quite familiar with the code. The user also has to know the geochemical system in order to judge the reliability of the results. A high-quality database is necessary to obtain a reliable result. The best results may be expected for the major species of natural waters. For more complicated problems, including trace elements, precipitation/dissolution, adsorption, etc., the results seem to be less reliable. (With 44 refs.) (author)
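
    At its core, a geochemical equilibrium code solves mass-action laws together with mass and charge balances using a numerical root-finder. A minimal Python sketch for a single weak acid HA = H+ + A- in water, with an assumed equilibrium constant and total concentration; this is illustrative only and far simpler than the multi-component codes reviewed above.

      import math
      from scipy.optimize import brentq

      Ka = 1.8e-5      # assumed acid dissociation constant (acetic-acid-like)
      Kw = 1.0e-14     # ion product of water
      C_total = 0.01   # mol/L, assumed total acid concentration

      def charge_balance(h):
          # Mass action + mass balance give the speciation for a trial [H+];
          # the root of the electroneutrality residual is the equilibrium.
          a = Ka * C_total / (Ka + h)   # [A-] from Ka and the acid mass balance
          oh = Kw / h                   # [OH-] from the water equilibrium
          return h - a - oh             # charge balance: [H+] = [A-] + [OH-]

      h = brentq(charge_balance, 1e-12, 1.0)   # bracketing root search
      print(f"equilibrium pH = {-math.log10(h):.2f}")   # ~3.4 for these values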

  10. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of a coding partition. Such a notion generalizes that of a UD code and, for codes that are not UD, allows one to recover "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we consider some relationships between coding partitions and varieties of codes.
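
    Unique decipherability itself, the property that coding partitions relax, can be tested for a finite code with the classical Sardinas-Patterson procedure. A small illustrative Python implementation (a sketch, not taken from the paper):

      def dangling_suffixes(A, B):
          # Nonempty suffixes w such that a = b + w for some a in A, b in B.
          return {a[len(b):] for a in A for b in B if a != b and a.startswith(b)}

      def is_uniquely_decipherable(code):
          # Sardinas-Patterson test for a finite code given as a set of strings.
          code = set(code)
          current = dangling_suffixes(code, code)   # S1 from the codewords
          seen = set()
          while current and not (current & code):
              seen |= current
              # S_{i+1}: match the current suffix set against the code both ways.
              current = (dangling_suffixes(code, current)
                         | dangling_suffixes(current, code)) - seen
          return not (current & code)

      print(is_uniquely_decipherable({"0", "01", "11"}))   # True
      print(is_uniquely_decipherable({"0", "01", "10"}))   # False: "010" is ambiguous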

  11. Comparative evaluation of trace elements in blood

    International Nuclear Information System (INIS)

    Goeij, J.J.M. de; Tjioe, P.S.; Pries, C.; Zwiers, J.H.L.

    1976-01-01

    The Interuniversitair Reactor Instituut and the Centraal Laboratorium TNO have carried out a joint investigation of neutron activation analysis procedures for the determination of trace elements in blood. A comparative evaluation of five methods, destructive as well as non-destructive, is given, and the sensitivity and reproducibility of the procedures are discussed. By combining some of the methods it is possible, starting with 1 ml of blood, to obtain quantitative information on 14 important trace elements: antimony, arsenic, bromine, cadmium, cobalt, gold, copper, mercury, molybdenum, nickel, rubidium, selenium, iron and zinc. The methods have also been applied to sodium, chromium and potassium

  12. Trace explosives sensor testbed (TESTbed)

    Science.gov (United States)

    Collins, Greg E.; Malito, Michael P.; Tamanaha, Cy R.; Hammond, Mark H.; Giordano, Braden C.; Lubrano, Adam L.; Field, Christopher R.; Rogers, Duane A.; Jeffries, Russell A.; Colton, Richard J.; Rose-Pehrsson, Susan L.

    2017-03-01

    A novel vapor delivery testbed, referred to as the Trace Explosives Sensor Testbed, or TESTbed, is demonstrated that is amenable to both high- and low-volatility explosives vapors including nitromethane, nitroglycerine, ethylene glycol dinitrate, triacetone triperoxide, 2,4,6-trinitrotoluene, pentaerythritol tetranitrate, and hexahydro-1,3,5-trinitro-1,3,5-triazine. The TESTbed incorporates a six-port dual-line manifold system allowing for rapid actuation between a dedicated clean air source and a trace explosives vapor source. Explosives and explosives-related vapors can be sourced through a number of means including gas cylinders, permeation tube ovens, dynamic headspace chambers, and a Pneumatically Modulated Liquid Delivery System coupled to a perfluoroalkoxy total-consumption microflow nebulizer. Key features of the TESTbed include continuous and pulseless control of trace vapor concentrations with wide dynamic range of concentration generation, six sampling ports with reproducible vapor profile outputs, limited low-volatility explosives adsorption to the manifold surface, temperature and humidity control of the vapor stream, and a graphical user interface for system operation and testing protocol implementation.

  13. The Effect of Nitrous Oxide Psychosedation on Pantographic Tracings; A preliminary study

    International Nuclear Information System (INIS)

    Fareed, Kamal

    1989-01-01

    The form and reproducibility of pantographic tracings under the influence of relaxant drugs, and in patients with muscle dysfunction and TMJ disorders, tend to emphasize the dominance of neuromuscular factors. The purpose of this study was to demonstrate the effect of nitrous oxide induced psychosedation on the reproducibility of pantographic tracings of border movements of the mandible. This study included four male subjects (with no signs or symptoms of muscular dysfunction or temporomandibular joint problems). Operator-guided border tracings were recorded using the Denar pantograph. Three sets of tracings were recorded: (1) three tracings prior to sedation (Tracing I); (2) one tracing prior to sedation and two after sedation (Tracing II); (3) three tracings after psychosedation (Tracing III). The coincidence of Tracings I, II, and III was statistically analyzed applying the chi-square (χ²) test. There was a significant difference in the coincidence of tracings between Tracings I and II (χ² = 14.892). There was no significant difference between Tracings I and III (χ² = 1.338). This suggests that nitrous oxide psychosedation produces a centrally induced relaxation of the musculature, possibly by eliminating extraneous anxiety-producing factors. (author)

  14. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction.

    Science.gov (United States)

    Watanabe, Eiji; Kitaoka, Akiyoshi; Sakamoto, Kiwako; Yasugi, Masaki; Tanaka, Kenta

    2018-01-01

    The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one of the bases of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.

  15. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction

    Directory of Open Access Journals (Sweden)

    Eiji Watanabe

    2018-03-01

    Full Text Available The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one of the bases of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.
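
    The predictive-coding loop the theory assumes (predict the next input, measure the error, let the error refine the internal estimate) can be reduced to a toy numeric example. The Python sketch below illustrates that loop on a one-dimensional drifting pattern; it is a caricature of the principle, not PredNet.

      import numpy as np

      # A "video": a 1-D pattern drifting one position per frame.
      frames = [np.roll(np.eye(16)[0], t) for t in range(32)]

      eta = 0.5                 # update rate for the internal estimate
      estimate = np.zeros(16)   # the model's internal representation

      for t in range(len(frames) - 1):
          prediction = np.roll(estimate, 1)   # internal model: "things drift right"
          error = frames[t + 1] - prediction  # prediction error on the next frame
          # The error refines the (shifted) internal estimate, as in
          # predictive coding: estimate <- prediction + eta * error.
          estimate = prediction + eta * error

      print(float(np.abs(error).sum()))   # error shrinks as the estimate locks on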

  16. Learning Reproducibility with a Yearly Networking Contest

    KAUST Repository

    Canini, Marco

    2017-08-10

    Better reproducibility of networking research results is currently a major goal that the academic community is striving towards. This position paper makes the case that improving the extent and pervasiveness of reproducible research can be greatly fostered by organizing a yearly international contest. We argue that holding a contest undertaken by a plurality of students will have benefits that are two-fold. First, it will promote hands-on learning of skills that are helpful in producing artifacts at the replicable-research level. Second, it will advance the best practices regarding environments, testbeds, and tools that will aid the tasks of reproducibility evaluation committees by and large.

  17. The Economics of Reproducibility in Preclinical Research.

    Directory of Open Access Journals (Sweden)

    Leonard P Freedman

    2015-06-01

    Full Text Available Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B) per year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.
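
    The headline figure is simple arithmetic on two inputs. A sketch, assuming an annual US preclinical research spend of roughly US$56.4B together with the >50% irreproducibility prevalence cited above; the spend figure is an assumption here, recalled from the study rather than quoted in this record:

      us_preclinical_spend = 56.4e9   # assumed annual US preclinical spend, USD
      irreproducible_rate = 0.50      # assumed prevalence (>50% per the abstract)

      wasted = us_preclinical_spend * irreproducible_rate
      print(f"~${wasted / 1e9:.0f}B per year")   # ~$28B/year, as in the abstract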

  18. Thou Shalt Be Reproducible! A Technology Perspective

    Directory of Open Access Journals (Sweden)

    Patrick Mair

    2016-07-01

    Full Text Available This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies.

  19. Examination of reproducibility in microbiological degredation experiments

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Spliid, Henrik; Holst, Helle

    1998-01-01

    Experimental data indicate that certain microbiological degradation experiments have a limited reproducibility. Nine identical batch experiments were carried out on 3 different days to examine reproducibility. A pure culture, isolated from soil, grew with toluene as the only carbon and energy source. Toluene was degraded under aerobic conditions at a constant temperature of 28 °C. The experiments were modelled by a Monod model, extended to meet the air/liquid system, and the parameter values were estimated using a statistical nonlinear estimation procedure. Model reduction analysis resulted in a simpler model without the biomass decay term. In order to test for model reduction and reproducibility of parameter estimates, a likelihood ratio test was employed. The limited reproducibility for these experiments implied that all 9 batch experiments could not be described by the same set
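
    For context, the basic (unextended) Monod batch model couples biomass X and substrate S through the specific growth rate mu(S) = mu_max * S / (Ks + S). A minimal Python sketch with assumed parameter values; the air/liquid extension and the decay term discussed in the record are omitted:

      from scipy.integrate import solve_ivp

      mu_max, Ks, Y = 0.5, 2.0, 0.6   # assumed: 1/h, mg/L, biomass yield

      def monod(t, y):
          X, S = y
          mu = mu_max * S / (Ks + S)   # Monod specific growth rate
          return [mu * X,              # dX/dt: biomass growth
                  -mu * X / Y]         # dS/dt: substrate (toluene) consumption

      sol = solve_ivp(monod, (0.0, 24.0), [0.05, 50.0])   # X0, S0 assumed
      X_end, S_end = sol.y[:, -1]
      print(f"after 24 h: X = {X_end:.2f}, S = {S_end:.2f}")

    Fitting mu_max, Ks and Y to each batch and comparing the fits with a likelihood ratio test, as the study does, is what makes limited reproducibility show up as incompatible parameter sets.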

  20. Archiving Reproducible Research with R and Dataverse

    DEFF Research Database (Denmark)

    Leeper, Thomas

    2014-01-01

    Reproducible research and data archiving are increasingly important issues in research involving statistical analyses of quantitative data. This article introduces the dvn package, which allows R users to publicly archive datasets, analysis files, codebooks, and associated metadata in Dataverse...

  1. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.

  2. The International Atomic Energy Agency Flag Code

    International Nuclear Information System (INIS)

    1999-01-01

    The document reproduces the text of the IAEA Flag Code which was promulgated by the Director General on 15 September 1999, pursuant to the decision of the Board of Governors on 10 June 1999 to adopt an Agency flag as described in document GOV/1999/41 and its use in accordance with a flag code to be promulgated by the Director General

  3. The International Atomic Energy Agency Flag Code

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-11-17

    The document reproduces the text of the IAEA Flag Code which was promulgated by the Director General on 15 September 1999, pursuant to the decision of the Board of Governors on 10 June 1999 to adopt an Agency flag as described in document GOV/1999/41 and its use in accordance with a flag code to be promulgated by the Director General.

  4. Automated Generation of Technical Documentation and Provenance for Reproducible Research

    Science.gov (United States)

    Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.

    2017-12-01

    Data provenance and detailed technical documentation are essential components of high-quality reproducible research; however, they are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right, as it is a time-consuming and often tedious process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, it can be a poor substitute for good technical documentation and is often more difficult for a third party to understand, particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed, the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.
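
    The design pattern described, keeping the classification rules as data so that documentation and provenance are generated from the same model that produces the output, can be sketched generically. A hypothetical Python miniature, not the authors' tool:

      import json
      from datetime import datetime, timezone

      import numpy as np

      # Two toy input "layers" (e.g., slope and rainfall rasters).
      slope = np.array([[5, 20], [35, 10]])
      rain = np.array([[800, 1200], [1500, 900]])

      # Declarative rules: (name, boolean condition, output class code).
      rules = [
          ("steep_wet", (slope > 30) & (rain > 1000), 3),
          ("wet", rain > 1000, 2),
          ("default", np.ones_like(slope, dtype=bool), 1),
      ]

      # First matching rule wins, mirroring a rule-table classifier.
      classified = np.select([c for _, c, _ in rules], [v for _, _, v in rules])

      # Provenance is derived from the same rule model, so it cannot drift
      # out of sync with the process that produced the output.
      provenance = {
          "generated": datetime.now(timezone.utc).isoformat(),
          "inputs": ["slope", "rain"],
          "rules_applied": [name for name, _, _ in rules],
      }
      print(classified)
      print(json.dumps(provenance, indent=2))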

  5. Reproducibility of central lumbar vertebral BMD

    International Nuclear Information System (INIS)

    Chan, F.; Pocock, N.; Griffiths, M.; Majerovic, Y.; Freund, J.

    1997-01-01

    Full text: Lumbar vertebral bone mineral density (BMD) using dual X-ray absorptiometry (DXA) has generally been calculated from a region of interest which includes the entire vertebral body. Although this region excludes part of the transverse processes, it does include the outer cortical shell of the vertebra. Recent software has been devised to calculate BMD in a central vertebral region of interest which excludes the outer cortical envelope. Theoretically this area may be more sensitive for detecting osteoporosis, which affects trabecular bone to a greater extent than cortical bone. Apart from the sensitivity of BMD estimation, the reproducibility of any measurement is important owing to the slow rate of change of bone mass. We evaluated the reproducibility of this new vertebral region of interest in 23 women who had duplicate lumbar spine DXA scans performed on the same day. The patients were repositioned between each measurement. Central vertebral analysis was performed for L2-L4, and the reproducibility of area, bone mineral content (BMC) and BMD was calculated as the coefficient of variation; these values were compared with those from conventional analysis. We have shown that the reproducibility of the central BMD is comparable to that of the conventional analysis, which is essential if this technique is to provide any additional clinical data. The reasons for the decrease in reproducibility of the area, and hence BMC, require further investigation

  6. Reply to comment by Añel on "Most computational hydrology is not reproducible, so is it really science?"

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-03-01

    In this article, we reply to a comment made on our previous commentary regarding reproducibility in computational hydrology. Software licensing and version control of code are important technical aspects of making code and workflows of scientific experiments open and reproducible. However, in our view, it is the cultural change that is the greatest challenge to overcome to achieve reproducible scientific research in computational hydrology. We believe that from changing the culture and attitude among hydrological scientists, details will evolve to cover more (technical) aspects over time.

  7. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  8. Enacting the International/Reproducing Eurocentrism

    Directory of Open Access Journals (Sweden)

    Zeynep Gülşah Çapan

    Full Text Available Abstract This article focuses on the way in which Eurocentric conceptualisations of the ‘international’ are reproduced in different geopolitical contexts. Even though the Eurocentrism of International Relations has received growing attention, it has predominantly been concerned with unearthing the Eurocentrism of the ‘centre’, overlooking its varied manifestations in other geopolitical contexts. The article seeks to contribute to discussions about Eurocentrism by examining how different conceptualisations of the international are at work at a particular moment, and how these conceptualisations continue to reproduce Eurocentrism. It will focus on the way in which Eurocentric designations of spatial and temporal hierarchies were reproduced in the context of Turkey through a reading of how the ‘Gezi Park protests’ of 2013 and ‘Turkey’ itself were written into the story of the international.

  9. Reproducibility, controllability, and optimization of LENR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, David J. [The George Washington University, Washington DC 20052 (United States)

    2006-07-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments would significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR.

  10. Reproducibility, controllability, and optimization of LENR experiments

    International Nuclear Information System (INIS)

    Nagel, David J.

    2006-01-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments would significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR.

  11. Undefined cellulase formulations hinder scientific reproducibility.

    Science.gov (United States)

    Himmel, Michael E; Abbas, Charles A; Baker, John O; Bayer, Edward A; Bomble, Yannick J; Brunecky, Roman; Chen, Xiaowen; Felby, Claus; Jeoh, Tina; Kumar, Rajeev; McCleary, Barry V; Pletschke, Brett I; Tucker, Melvin P; Wyman, Charles E; Decker, Stephen R

    2017-01-01

    In the shadow of a burgeoning biomass-to-fuels industry, biological conversion of lignocellulose to fermentable sugars in a cost-effective manner is key to the success of second-generation and advanced biofuel production. For the effective comparison of one cellulase preparation to another, cellulase assays are typically carried out with one or more engineered cellulase formulations or natural exoproteomes of known performance serving as positive controls. When these formulations have unknown composition, as is the case with several widely used commercial products, it becomes impossible to compare or reproduce work done today to work done in the future, where, for example, such preparations may not be available. Therefore, being a critical tenet of science publishing, experimental reproducibility is endangered by the continued use of these undisclosed products. We propose the introduction of standard procedures and materials to produce specific and reproducible cellulase formulations. These formulations are to serve as yardsticks to measure improvements and performance of new cellulase formulations.

  12. Reproducibility of somatosensory spatial perceptual maps.

    Science.gov (United States)

    Steenbergen, Peter; Buitenweg, Jan R; Trojan, Jörg; Veltink, Peter H

    2013-02-01

    Various studies have shown subjects to mislocalize cutaneous stimuli in an idiosyncratic manner. Spatial properties of individual localization behavior can be represented in the form of perceptual maps. Individual differences in these maps may reflect properties of internal body representations, and perceptual maps may therefore be a useful method for studying these representations. For this to be the case, individual perceptual maps need to be reproducible, which has not yet been demonstrated. We assessed the reproducibility of localizations measured twice on subsequent days. Ten subjects participated in the experiments. Non-painful electrocutaneous stimuli were applied at seven sites on the lower arm. Subjects localized the stimuli on a photograph of their own arm, which was presented on a tablet screen overlaying the real arm. Reproducibility was assessed by calculating intraclass correlation coefficients (ICC) for the mean localizations of each electrode site and the slope and offset of regression models of the localizations, which represent scaling and displacement of perceptual maps relative to the stimulated sites. The ICCs of the mean localizations ranged from 0.68 to 0.93; the ICCs of the regression parameters were 0.88 for the intercept and 0.92 for the slope. These results indicate a high degree of reproducibility. We conclude that localization patterns of non-painful electrocutaneous stimuli on the arm are reproducible on subsequent days. Reproducibility is a necessary property of perceptual maps for these to reflect properties of a subject's internal body representations. Perceptual maps are therefore a promising method for studying body representations.
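
    The slope/offset representation of a perceptual map is an ordinary linear regression of perceived against actual stimulus positions. A minimal Python sketch with made-up localization data; for brevity, a simple test-retest correlation stands in for the ICC used in the study:

      import numpy as np

      actual = np.linspace(0.0, 30.0, 7)   # stimulated sites (cm along the arm)
      rng = np.random.default_rng(42)
      day1 = 1.1 * actual - 0.8 + rng.normal(0, 0.5, 7)   # perceived, session 1
      day2 = 1.1 * actual - 0.9 + rng.normal(0, 0.5, 7)   # perceived, session 2

      # Slope > 1 means the perceptual map is stretched; the offset shifts it.
      slope1, offset1 = np.polyfit(actual, day1, 1)
      slope2, offset2 = np.polyfit(actual, day2, 1)

      r = np.corrcoef(day1, day2)[0, 1]   # crude test-retest agreement
      print(f"day 1: slope={slope1:.2f}, offset={offset1:.2f}")
      print(f"day 2: slope={slope2:.2f}, offset={offset2:.2f}, r={r:.2f}")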

  13. [Natural head position's reproducibility on photographs].

    Science.gov (United States)

    Eddo, Marie-Line; El Hayeck, Émilie; Hoyeck, Maha; Khoury, Élie; Ghoubril, Joseph

    2017-12-01

    The purpose of this study is to evaluate the reproducibility of natural head position with time on profile photographs. Our sample is composed of 96 students (20-30 years old) at the department of dentistry of Saint Joseph University in Beirut. Two profile photographs were taken in natural head position about a week apart. No significant differences were found between T0 and T1 (E = 1.065°). Many studies confirmed this reproducibility with time. Natural head position can be adopted as an orientation for profile photographs in orthodontics. © EDP Sciences, SFODF, 2017.

  14. Highly reproducible polyol synthesis for silver nanocubes

    Science.gov (United States)

    Han, Hye Ji; Yu, Taekyung; Kim, Woo-Sik; Im, Sang Hyuk

    2017-07-01

    We could synthesize Ag nanocubes highly reproducibly by conducting the polyol synthesis with an HCl etchant in dark conditions, because the photodecomposition/photoreduction of the AgCl nanoparticles formed at the initial reaction stage was greatly suppressed; consequently, the selective self-nucleation of Ag single crystals and their selective growth could be promoted. In contrast, the reproducibility of the formation of Ag nanocubes was very poor when we synthesized them in light conditions, due to the photoreduction of AgCl to Ag.

  15. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses. The system differs from other systems for reproducible analysis in several ways, the two main differences being: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or \LaTeX. The main part of this paper is an example showing how to use these tools together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

  16. Reproducing kernel Hilbert spaces of Gaussian priors

    NARCIS (Netherlands)

    Vaart, van der A.W.; Zanten, van J.H.; Clarke, B.; Ghosal, S.

    2008-01-01

    We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described

  17. Reproducibility of the results in ultrasonic testing

    International Nuclear Information System (INIS)

    Chalaye, M.; Launay, J.P.; Thomas, A.

    1980-12-01

    This memorandum reports the conclusions of tests carried out to evaluate the reproducibility of ultrasonic tests made on welded joints. FRAMATOME has started a study to assess the dispersion of results afforded by the test line and to characterize its behaviour. The tests covered sensors and ultrasonic generators said to be identical to each other (same commercial batch). [fr]

  18. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahnik, Stepan; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Bruening, Jovita; Calhoun-Sauls, Ann; Chagnon, Elizabeth; Callahan, Shannon P.; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Cillessen, Linda; Christopherson, Cody D.; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Cohn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Hartgerink, Chris; Krijnen, Job; Nuijten, Michele B.; van 't Veer, Anna E.; Van Aert, Robbie; van Assen, M.A.L.M.; Wissink, Joeri; Zeelenberg, Marcel

    2015-01-01

    INTRODUCTION Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research

  19. Reproducibility, Controllability, and Optimization of Lenr Experiments

    Science.gov (United States)

    Nagel, David J.

    2006-02-01

    Low-energy nuclear reaction (LENR) measurements are significantly and increasingly reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments.

  20. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Anderson, Joanna E.; Aarts, Alexander A.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahník, Štěpán; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Brüning, Jovita; Calhoun-Sauls, Ann; Callahan, Shannon P.; Chagnon, Elizabeth; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Christopherson, Cody D.; Cillessen, Linda; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Conn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Penna, Nicolás Delia; Den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernández-Castilla, Belén; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Glöckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise J.; Hung, Cathy O Y; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jäkel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knežević, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniël|info:eu-repo/dai/nl/298811855; Lane, Kristin A.; Lassetter, Bethany; Lazarević, Ljiljana B.; Le Bel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, Melissa; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; MacKinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Möschl, Marcus; Motyl, Matt; Müller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michèle B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. 
Nathan; Pitts, Michael; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schlegelmilch, René; Schmidt, Kathleen; Scholz, Sabine; Seibel, Larissa; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. Colleen; Skorinko, Jeanine L M; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; Te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valášek, Milan; Van't Veer, Anna E.; Van Aert, Robbie; Van Assen, Marcel|info:eu-repo/dai/nl/407629971; Van Bork, Riet; Van De Ven, Mathijs; Van Den Bergh, Don; Van Der Hulst, Marije; Van Dooren, Roel; Van Doorn, Johnny; Van Renswoude, Daan R.; Van Rijn, Hedderik; Vanpaemel, Wolf; Echeverría, Alejandro Vásquez; Vazquez, Melissa; Velez, Natalia; Vermue, Marieke; Verschoor, Mark; Vianello, Michelangelo; Voracek, Martin; Vuu, Gina; Wagenmakers, Eric Jan; Weerdmeester, Joanneke; Welsh, Ashlee; Westgate, Erin C.; Wissink, Joeri; Wood, Michael; Woods, Andy; Wright, Emily; Wu, Sining; Zeelenberg, Marcel; Zuni, Kellylynn

    2015-01-01

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available.

  1. A PHYSICAL ACTIVITY QUESTIONNAIRE: REPRODUCIBILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Nicolas Barbosa

    2007-12-01

    Full Text Available This study evaluates the reproducibility and validity of the Quantification de L'Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) in Bogotá's schoolchildren. Comprehension was assessed on 324 students, whereas reproducibility was studied on a different random sample of 162 who were exposed to the questionnaire twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys who completed the test/re-test study. The DEE derived from the questionnaire was compared with laboratory measurements of peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger Test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories: 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between the mean TEE as estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16, it was 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80), respectively. The QAPACE questionnaire is reproducible and valid for estimating PA and showed a high correlation with Peak VO2 uptake

  2. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  3. Towards Reproducible Research Data Analyses in LHC Particle Physics

    CERN Document Server

    Simko, Tibor

    2017-01-01

    The reproducibility of research data analysis requires having access not only to the original datasets, but also to the computing environment, the analysis software and the workflow used to produce the original results. We present the nascent CERN Analysis Preservation platform with a set of tools developed to support particle physics researchers in preserving the knowledge around analyses so that capturing, sharing, reusing and reinterpreting data becomes easier. The presentation will focus on three pillars: (i) capturing structured knowledge about data analysis processes; (ii) capturing the computing environment, the software code, the datasets, the configuration and other information assets used in data analyses; (iii) re-instantiation of preserved analyses on a containerised computing cloud for the purposes of re-validation and re-interpretation.

  4. Repeat: a framework to assess empirical reproducibility in biomedical research

    Directory of Open Access Journals (Sweden)

    Leslie D. McIntosh

    2017-09-01

    Full Text Available Abstract Background The reproducibility of research is essential to rigorous science, yet significant concerns about the reliability and verifiability of biomedical research have recently been highlighted. Ongoing efforts across several domains of science and policy are working to clarify the fundamental characteristics of reproducibility and to enhance the transparency and accessibility of research. Methods The aim of the present work is to develop an assessment tool operationalizing key concepts of research transparency in the biomedical domain, specifically for secondary biomedical data research using electronic health record data. The tool (RepeAT) was developed through a multi-phase process that involved coding and extracting recommendations and practices for improving reproducibility from publications and reports across the biomedical and statistical sciences, field testing the instrument, and refining variables. Results RepeAT includes 119 unique variables grouped into five categories (research design and aim, database and data collection methods, data mining and data cleaning, data analysis, data sharing and documentation). Preliminary results in manually processing 40 scientific manuscripts indicate components of the proposed framework with strong inter-rater reliability, as well as directions for further research and refinement of RepeAT. Conclusions The use of RepeAT may allow the biomedical community to better understand current practices of research transparency and accessibility among principal investigators. Common adoption of RepeAT may improve reporting of research practices and the availability of research outputs. Additionally, use of RepeAT will facilitate comparisons of research transparency and accessibility across domains and institutions.

  5. Does systematic variation improve the reproducibility of animal experiments?

    NARCIS (Netherlands)

    Jonker, R.M.; Guenther, A.; Engqvist, L.; Schmoll, T.

    2013-01-01

    Reproducibility of results is a fundamental tenet of science. In this journal, Richter et al. tested whether systematic variation in experimental conditions (heterogenization) affects the reproducibility of results. Comparing this approach with the current standard of ensuring reproducibility

  6. Interactive Stable Ray Tracing

    DEFF Research Database (Denmark)

    Dal Corso, Alessandro; Salvi, Marco; Kolb, Craig

    2017-01-01

    Interactive ray tracing applications running on commodity hardware can suffer from objectionable temporal artifacts due to a low sample count. We introduce stable ray tracing, a technique that improves temporal stability without the over-blurring and ghosting artifacts typical of temporal post-pr...

  7. Reproducibility of scoring emphysema by HRCT

    International Nuclear Information System (INIS)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R.; Erkinjuntti-Pekkanen, R.

    2002-01-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests, spirometry and diffusing capacity, were measured. Results: Intraobserver agreement values for all three methods were good (0.51-0.74). Interobserver agreement varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  8. Reproducibility of scoring emphysema by HRCT

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R. [Kuopio Univ. Hospital (Finland). Dept. of Clinical Radiology; Erkinjuntti-Pekkanen, R. [Kuopio Univ. Hospital (Finland). Dept. of Pulmonary Diseases

    2002-04-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver kappa values for all three methods were good (0.51-0.74). Interobserver kappa values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  9. Reproducibility of the chamber scarification test

    DEFF Research Database (Denmark)

    Andersen, Klaus Ejner

    1996-01-01

    The chamber scarification test is a predictive human skin irritation test developed to rank the irritation potential of products and ingredients meant for repeated use on normal and diseased skin. 12 products or ingredients can be tested simultaneously on the forearm skin of each volunteer....... The test combines scratching of the skin at each test site with subsequent closed patch tests with the products, repeated daily for 3 days. The test is performed on groups of human volunteers: a skin irritant substance or product is included in each test as a positive control...... high reproducibility of the test. Further, intra-individual variation in skin reaction to the 2 control products in 26 volunteers, who participated 2x, is shown, which supports the conclusion that the chamber scarification test is a useful short-term human skin irritation test with high reproducibility....

  10. Additive Manufacturing: Reproducibility of Metallic Parts

    Directory of Open Access Journals (Sweden)

    Konda Gokuldoss Prashanth

    2017-02-01

    Full Text Available The present study deals with the properties of five different metals/alloys (Al-12Si, Cu-10Sn and 316L, with face-centered cubic structure; CoCrMo and commercially pure Ti (CP-Ti), with hexagonal close-packed structure) fabricated by selective laser melting. The room temperature tensile properties of Al-12Si samples show good consistency in results within the experimental errors. Similar reproducible results were observed for sliding wear and corrosion experiments. The other metal/alloy systems also show repeatable tensile properties, with the tensile curves overlapping until the yield point. The curves may then follow the same path or show a marginal deviation (~10 MPa) until they reach the ultimate tensile strength, and a negligible difference in ductility levels (of ~0.3%) is observed between the samples. The results show that selective laser melting is a reliable fabrication method to produce metallic materials with consistent and reproducible properties.

  11. SIMULATE-3 K coupled code applications

    Energy Technology Data Exchange (ETDEWEB)

    Joensson, Christian [Studsvik Scandpower AB, Vaesteraas (Sweden); Grandi, Gerardo; Judd, Jerry [Studsvik Scandpower Inc., Idaho Falls, ID (United States)

    2017-07-15

    This paper describes the coupled code system TRACE/SIMULATE-3K/VIPRE and the application of this code system to the OECD PWR Main Steam Line Break. A short description is given of the application of the coupled system to analyze DNBR and of the flexibility the system gives the user. This includes the possibility to compare and evaluate results with the TRACE/SIMULATE-3K (S3K) coupled code, the S3K standalone code (core calculation), as well as performing single-channel calculations with S3K and VIPRE. These are the typical separate-effect analyses required for advanced calculations in order to develop methodologies to be used for safety analyses in general. The models and methods of the code systems are presented. The outline follows the analysis approach, starting with the coupled code system and the reactor and core model calculation (TRACE/S3K), followed by a more detailed core evaluation (S3K standalone), and finally a very detailed thermal-hydraulic investigation of the hot pin condition (VIPRE).

  12. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long haul transmissions which use repeaters to compensate for the loss in signal strength on transmission links also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. From a transmission point of view, digital transmission has therefore been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is the term voice coding. This term is more generic in the sense that the
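
    To make the waveform-coding idea above concrete, the following minimal Python sketch implements mu-law companding, the logarithmic compression law used in classic telephone codecs (ITU-T G.711). The test tone and 8-bit quantization are illustrative choices, not details from this record.

    ```python
    import numpy as np

    def mu_law_encode(x, mu=255):
        """Compress a waveform in [-1, 1] with mu-law companding (as in G.711)."""
        return np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)

    def mu_law_decode(y, mu=255):
        """Invert the companding to recover the waveform."""
        return np.sign(y) * np.expm1(np.abs(y) * np.log1p(255)) / mu

    # A 440 Hz tone sampled at the 8 kHz telephone rate (arbitrary test signal).
    x = np.sin(2 * np.pi * 440 * np.arange(0, 0.01, 1 / 8000))
    y = mu_law_encode(x)
    q = np.round((y + 1) / 2 * 255) / 255 * 2 - 1   # quantize to 256 levels (8 bits)
    x_hat = mu_law_decode(q)
    print("max reconstruction error:", np.max(np.abs(x - x_hat)))
    ```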

  13. Reproducibility in cyclostratigraphy: initiating an intercomparison project

    Science.gov (United States)

    Sinnesael, Matthias; De Vleeschouwer, David; Zeeden, Christian; Claeys, Philippe

    2017-04-01

    The study of astronomical climate forcing and the application of cyclostratigraphy have experienced spectacular growth over the last decades. In the field of cyclostratigraphy a broad range of methodological approaches exists. However, comparative study between the different approaches is lacking. Different cases demand different approaches, but with the growing importance of the field, questions arise about reproducibility, uncertainties and standardization of results. The radioisotopic dating community, in particular, has made far-reaching efforts to improve reproducibility and intercomparison of radioisotopic dates and their errors. To satisfy this need in cyclostratigraphy, we initiate a comparable framework for the community. The aims are to investigate and quantify reproducibility of, and uncertainties related to, cyclostratigraphic studies and to provide a platform to discuss the merits and pitfalls of different methodologies and their applicability. With this poster, we ask for feedback from the community on how to design this comparative framework in a useful, meaningful and productive manner. In parallel, we would like to discuss how reproducibility should be tested and what uncertainties should stand for in cyclostratigraphy. We also intend to trigger interest in a cyclostratigraphic intercomparison project. This intercomparison project would imply the analysis of artificial and genuine geological records by individual researchers. All participants would be free to determine their method of choice. However, a handful of criteria will be required for an outcome to be comparable. The different results would be compared (e.g. during a workshop or a special session), and the lessons learned from the comparison could potentially be reported in a review paper. The aim of an intercomparison project is not to rank the different methods according to their merits, but to get insight into which specific methods are most suitable for which

  14. Bad Behavior: Improving Reproducibility in Behavior Testing.

    Science.gov (United States)

    Andrews, Anne M; Cheng, Xinyi; Altieri, Stefanie C; Yang, Hongyan

    2018-01-24

    Systems neuroscience research is increasingly possible through the use of integrated molecular and circuit-level analyses. These studies depend on the use of animal models and, in many cases, molecular and circuit-level analyses associated with genetic, pharmacologic, epigenetic, and other types of environmental manipulations. We illustrate typical pitfalls resulting from poor validation of behavior tests. We describe experimental designs and enumerate controls needed to improve reproducibility in investigating and reporting of behavioral phenotypes.

  15. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  16. A Framework for Reproducible Latent Fingerprint Enhancements.

    Science.gov (United States)

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information, in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
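
    For readers unfamiliar with the basic operation, the sketch below shows a plain NumPy version of global histogram equalization. It is a generic stand-in, not the IDL routine used in the study, but like it the mapping is fully deterministic and therefore reproducible.

    ```python
    import numpy as np

    def histogram_equalize(img, levels=256):
        """Map gray levels of a grayscale image (values in [0, levels-1])
        through its cumulative histogram, spreading contrast over the full
        range; the same input always yields the same output."""
        hist, bins = np.histogram(img.ravel(), bins=levels, range=(0, levels - 1))
        cdf = hist.cumsum().astype(float)
        cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())   # normalize to [0, 1]
        return np.interp(img.ravel(), bins[:-1], cdf * (levels - 1)).reshape(img.shape)
    ```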

  17. Reproducibility of 201Tl myocardial imaging

    International Nuclear Information System (INIS)

    McLaughlin, P.R.; Martin, R.P.; Doherty, P.; Daspit, S.; Goris, M.; Haskell, W.; Lewis, S.; Kriss, J.P.; Harrison, D.C.

    1977-01-01

    Seventy-six thallium-201 myocardial perfusion studies were performed on twenty-five patients to assess their reproducibility and the effect of varying the level of exercise on the results of imaging. Each patient had a thallium-201 study at rest. Fourteen patients had studies on two occasions at maximum exercise, and twelve patients had studies both at light and at maximum exercise. Of 70 segments in the 14 patients assessed on each of two maximum exercise tests, 64 (91 percent) were reproducible. Only 53 percent (16/30) of the ischemic defects present at maximum exercise were seen in the light exercise study in the 12 patients assessed at two levels of exercise. Correlation of perfusion defects with arteriographically proven significant coronary stenosis was good for the left anterior descending and right coronary arteries, but not as good for circumflex artery disease. Thallium-201 myocardial imaging at maximum exercise is reproducible within acceptable limits, but careful attention to exercise technique is essential for valid comparative studies

  18. Nuclear traces in glass

    International Nuclear Information System (INIS)

    Segovia A, M. de N.

    1978-01-01

    Charged particles produce, in dielectric materials, physical and chemical effects which make evident the damaged zone along the trajectory of the particle. This damaged zone is known as the latent trace. Latent traces can be enlarged by etching the detector material. This treatment preferentially attacks the zones of the material where the charged particles have penetrated, producing concavities which can be observed through a low-magnification optical microscope. These concavities are known as developed traces. In this work we describe the characteristics of glass as a detector of fission fragment traces. In the first chapter we present a summary of the existing basic theories explaining the formation of traces in solids. In the second chapter we describe the etching method used for trace development. In the following chapters we determine some characteristics of the traces formed on the glass, such as: the optimum development time; the variation of trace diameter and trace density with the temperature of the detector; the response of glass to radiation more penetrating than that of fission fragments; and the distribution of the developed traces and the relation between this distribution and the energies of 252Cf fission fragments. The method used is simple and cheap and can be employed in laboratories with limited resources. The commercial glass employed allows the registration of fission fragments and hence experiments involving both trace counting and particle identification. (author)

  19. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is formed by mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred at a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code are able to adequately predict the behavior of the reactor. (Author)

  20. A Computer Library for Ray Tracing in Analytical Media

    International Nuclear Information System (INIS)

    Miqueles, Eduardo; Coimbra, Tiago A; Figueiredo, J J S de

    2013-01-01

    Ray tracing is an important tool not only for forward but also for inverse problems in Geophysics, on which most seismic processing steps depend. However, implementing ray tracing codes can be very time consuming. This article presents a computer library to trace rays in 2.5D media composed of a stack of layers. The velocity profile inside each layer is such that the eikonal equation can be solved analytically. Therefore, ray tracing within such a profile is fast and accurate. The great advantage of an analytical ray tracing library is the numerical precision of the computed quantities and the fast execution of the implemented codes. Ray tracing programs have existed for a long time, for example the seis package by Cervený, which uses a numerical approach to compute the ray. While numerical methods can solve more general problems, analytical ones can be part of a more sophisticated simulation process, where the ray tracing time is completely relevant. We demonstrate the feasibility of our codes using numerical examples.
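
    A standard example of such an analytically tractable profile is a layer with a constant velocity gradient, v(z) = v0 + kz, in which rays are circular arcs. The Python sketch below illustrates the general idea (it is not part of the library described above): position and travel time follow in closed form from Snell's law, with no numerical ODE integration. Parameter values are arbitrary.

    ```python
    import numpy as np

    def trace_linear_gradient(theta0_deg, v0=1500.0, k=0.5, n=50):
        """Analytic ray in v(z) = v0 + k*z (k > 0, take-off angle in (0, 90) deg).
        The ray parameter p = sin(theta)/v is conserved, so the raypath is a
        circular arc of radius 1/(p*k)."""
        theta0 = np.radians(theta0_deg)
        p = np.sin(theta0) / v0                      # ray parameter (s/m)
        theta = np.linspace(theta0, np.pi / 2, n)    # trace up to the turning point
        x = (np.cos(theta0) - np.cos(theta)) / (p * k)
        z = (np.sin(theta) - np.sin(theta0)) / (p * k)
        t = np.log(np.tan(theta / 2) / np.tan(theta0 / 2)) / k   # travel time (s)
        return x, z, t

    x, z, t = trace_linear_gradient(theta0_deg=30.0)
    print(f"turning depth: {z[-1]:.1f} m after {t[-1]*1000:.1f} ms")
    ```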

  1. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    a computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other, is a mix of a computer programming syntax and human language. In this sense queer code can...... be understood as both an object and subject of study that intervenes in the world’s ‘becoming' and how material bodies are produced via human and nonhuman practices. Through mixing the natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  2. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, the governing equations and formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault and their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a User's manual, describing simulation procedures, input data preparation, output and example test cases

  3. Standing Together for Reproducibility in Large-Scale Computing: Report on reproducibility@XSEDE

    OpenAIRE

    James, Doug; Wilkins-Diehr, Nancy; Stodden, Victoria; Colbry, Dirk; Rosales, Carlos; Fahey, Mark; Shi, Justin; Silva, Rafael F.; Lee, Kyo; Roskies, Ralph; Loewe, Laurence; Lindsey, Susan; Kooper, Rob; Barba, Lorena; Bailey, David

    2014-01-01

    This is the final report on reproducibility@xsede, a one-day workshop held in conjunction with XSEDE14, the annual conference of the Extreme Science and Engineering Discovery Environment (XSEDE). The workshop's discussion-oriented agenda focused on reproducibility in large-scale computational research. Two important themes capture the spirit of the workshop submissions and discussions: (1) organizational stakeholders, especially supercomputer centers, are in a unique position to promote, enab...

  4. Examining the Reproducibility of 6 Published Studies in Public Health Services and Systems Research.

    Science.gov (United States)

    Harris, Jenine K; B Wondmeneh, Sarah; Zhao, Yiqiang; Leider, Jonathon P

    2018-02-23

    Research replication, or repeating a study de novo, is the scientific standard for building evidence and identifying spurious results. While replication is ideal, it is often expensive and time consuming. Reproducibility, or reanalysis of data to verify published findings, is one proposed minimum alternative standard. While a lack of research reproducibility has been identified as a serious and prevalent problem in biomedical research and a few other fields, little work has been done to examine the reproducibility of public health research. We examined reproducibility in 6 studies from the public health services and systems research subfield of public health research. Following the methods described in each of the 6 papers, we computed the descriptive and inferential statistics for each study. We compared our results with the original study results and examined the percentage differences in descriptive statistics and differences in effect size, significance, and precision of inferential statistics. All project work was completed in 2017. We found consistency between original and reproduced results for each paper in at least 1 of the 4 areas examined. However, we also found some inconsistency. We identified incorrect transcription of results and omitting detail about data management and analyses as the primary contributors to the inconsistencies. Increasing reproducibility, or reanalysis of data to verify published results, can improve the quality of science. Researchers, journals, employers, and funders can all play a role in improving the reproducibility of science through several strategies including publishing data and statistical code, using guidelines to write clear and complete methods sections, conducting reproducibility reviews, and incentivizing reproducible science.
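
    The core of such a reproducibility check is mechanical: recompute each published statistic and compare it against the printed value. A sketch of that bookkeeping follows; the statistic names, values and 1% tolerance are hypothetical, not taken from the six studies.

    ```python
    # Hypothetical comparison of reproduced vs. published descriptive statistics.
    published  = {"mean_age": 47.2, "pct_female": 62.1, "n": 1154}
    reproduced = {"mean_age": 47.3, "pct_female": 61.8, "n": 1154}

    for stat, orig in published.items():
        redo = reproduced[stat]
        pct_diff = 100.0 * abs(redo - orig) / abs(orig) if orig else 0.0
        flag = "OK" if pct_diff < 1.0 else "CHECK"   # tolerance is a choice, not from the paper
        print(f"{stat}: published={orig}, reproduced={redo}, diff={pct_diff:.2f}% [{flag}]")
    ```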

  5. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  6. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project1. The Coding Class project was initiated in the school year 2016/2017 by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the voluntary association...... Coding Pirates2. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator of the research and development environment Digitalisering i Skolen (DiS), from the Institute for School and Learning at Professionshøjskolen Metropol; and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design......, design thinking and design pedagogy, from Forskningslab: It og Læringsdesign (ILD-LAB) at the Department of Communication and Psychology, Aalborg University in Copenhagen. We followed and carried out the evaluation and documentation of the Coding Class project in the period November 2016 to May 2017...

  7. Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  8. ANIMAL code

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1979-01-01

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and its physical model. Temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm. The functions of the algorithm's FORTRAN subroutines and variables are outlined

  9. Network Coding

    Indian Academy of Sciences (India)

    K V Rashmi, Nihar B Shah and P Vijay Kumar. Network Coding. General Article, Resonance – Journal of Science Education, Volume 15, Issue 7, July 2010, pp 604-621. Permanent link: https://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621

  10. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendant of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids
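
    The essence of the Monte Carlo transport method that MCNP implements at scale can be seen in a toy example: sample exponential free paths and count uncollided particles. The Python sketch below is purely illustrative and carries none of MCNP's physics, geometry or variance reduction.

    ```python
    import math
    import random

    def transmit_fraction(mu_t=0.2, thickness=10.0, n=100_000, seed=1):
        """Toy analog Monte Carlo: photons enter a purely absorbing slab with
        total cross section mu_t (1/cm); free paths are sampled from the
        exponential distribution and uncollided escapes are counted."""
        rng = random.Random(seed)
        escaped = sum(1 for _ in range(n)
                      if -math.log(1.0 - rng.random()) / mu_t > thickness)
        return escaped / n

    print("MC estimate:", transmit_fraction())
    print("analytic   :", math.exp(-0.2 * 10.0))   # exp(-mu_t * x)
    ```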

  11. Expander Codes

    Indian Academy of Sciences (India)

    Priti Shankar. Expander Codes - The Sipser–Spielman Construction. General Article, Resonance – Journal of Science Education, Volume 10, Issue 1. Author affiliation: Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India.

  12. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast or faster than mainframe computers in compute-bound situations.

  13. MCViNE – An object oriented Monte Carlo neutron ray tracing simulation package

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Jiao Y.Y., E-mail: linjiao@ornl.gov [Caltech Center for Advanced Computing Research, California Institute of Technology (United States); Department of Applied Physics and Materials Science, California Institute of Technology (United States); Neutron Data Analysis and Visualization Division, Oak Ridge National Laboratory (United States); Smith, Hillary L. [Department of Applied Physics and Materials Science, California Institute of Technology (United States); Granroth, Garrett E., E-mail: granrothge@ornl.gov [Neutron Data Analysis and Visualization Division, Oak Ridge National Laboratory (United States); Abernathy, Douglas L.; Lumsden, Mark D.; Winn, Barry; Aczel, Adam A. [Quantum Condensed Matter Division, Oak Ridge National Laboratory (United States); Aivazis, Michael [Caltech Center for Advanced Computing Research, California Institute of Technology (United States); Fultz, Brent, E-mail: btf@caltech.edu [Department of Applied Physics and Materials Science, California Institute of Technology (United States)

    2016-02-21

    MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example we used object oriented programming concepts for representing neutron scatterers and detector systems, and recursive algorithms for implementing multiple scattering. Combining these features together in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray tracing packages which facilitates porting instrument models from those codes. Furthermore it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. With simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.
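
    The object-oriented and recursive structure described above can be sketched in a few lines. The Python below is a schematic of the design idea only; the class and method names are hypothetical and do not reflect the actual MCViNE API.

    ```python
    import random

    class Neutron:
        def __init__(self, weight=1.0):
            self.weight = weight     # statistical weight carried by the packet
            self.history = []        # record of scattering events

    class Scatterer:
        """Schematic component: decides whether a neutron interacts, and
        recurses so that multiple scattering reuses the same code path."""
        def __init__(self, name, p_scatter, seed=42):
            self.name = name
            self.p_scatter = p_scatter
            self.rng = random.Random(seed)

        def interact(self, neutron, depth=0, max_depth=5):
            if depth >= max_depth or self.rng.random() > self.p_scatter:
                return neutron                              # neutron escapes
            neutron.weight *= 0.9                           # toy attenuation per event
            neutron.history.append((self.name, depth))
            return self.interact(neutron, depth + 1, max_depth)

    n = Scatterer("sample", p_scatter=0.6).interact(Neutron())
    print(n.weight, n.history)
    ```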

  14. Open and reproducible global land use classification

    Science.gov (United States)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and
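
    The data-driven classification mentioned above rests on the self-organizing map algorithm. A minimal NumPy sketch of SOM training is given below for orientation; the grid size, learning schedule and random input data are arbitrary choices, and this is not the implementation used in the GLUES work.

    ```python
    import numpy as np

    def train_som(data, grid=(4, 3), epochs=20, lr0=0.5, sigma0=1.5, seed=0):
        """Minimal self-organizing map: each grid node holds a weight vector;
        the best-matching unit and its neighbours are pulled toward each sample."""
        rng = np.random.default_rng(seed)
        rows, cols = grid
        w = rng.random((rows * cols, data.shape[1]))
        coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
        for epoch in range(epochs):
            lr = lr0 * (1 - epoch / epochs)                  # decaying learning rate
            sigma = sigma0 * (1 - epoch / epochs) + 1e-3     # shrinking neighbourhood
            for x in rng.permutation(data):
                bmu = np.argmin(((w - x) ** 2).sum(axis=1))  # best-matching unit
                d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
                h = np.exp(-d2 / (2 * sigma ** 2))           # neighbourhood kernel
                w += lr * h[:, None] * (x - w)
        return w

    # Each input row stands for one grid cell's indicator vector (intensity, climate, ...).
    archetypes = train_som(np.random.rand(200, 5))
    ```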

  15. Reproducibility in Research: Systems, Infrastructure, Culture

    Directory of Open Access Journals (Sweden)

    Tom Crick

    2017-11-01

    Full Text Available The reproduction and replication of research results have become a major issue for a number of scientific disciplines. In computer science and related computational disciplines such as systems biology, the challenges closely revolve around the ability to implement (and exploit) novel algorithms and models. Taking a new approach from the literature and applying it to a new codebase frequently requires local knowledge missing from the published manuscripts and transient project websites. Alongside this issue, benchmarking and the lack of open, transparent and fair benchmark sets present another barrier to the verification and validation of claimed results. In this paper, we outline several recommendations to address these issues, driven by specific examples from a range of scientific domains. Based on these recommendations, we propose a high-level prototype open automated platform for scientific software development which effectively abstracts specific dependencies from the individual researcher and their workstation, allowing easy sharing and reproduction of results. This new e-infrastructure for reproducible computational science offers the potential to incentivise a culture change and drive the adoption of new techniques to improve the quality and efficiency – and thus reproducibility – of scientific exploration.

  16. Traces of Drosophila Memory

    Science.gov (United States)

    Davis, Ronald L.

    2012-01-01

    Summary Studies using functional cellular imaging of living flies have identified six memory traces that form in the olfactory nervous system after conditioning with odors. These traces occur in distinct nodes of the olfactory nervous system, form and disappear across different windows of time, and are detected in the imaged neurons as increased calcium influx or synaptic release in response to the conditioned odor. Three traces form at, or near, acquisition and co-exist with short-term behavioral memory. One trace forms with a delay after learning and co-exists with intermediate-term behavioral memory. Two traces form many hours after acquisition and co-exist with long-term behavioral memory. The transient memory traces may support behavior across the time-windows of their existence. The experimental approaches for dissecting memory formation in the fly, ranging from the molecular to the systems level, make it an ideal system for dissecting the logic by which the nervous system organizes and stores different temporal forms of memory. PMID:21482352

  17. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)

  18. CANAL code

    International Nuclear Information System (INIS)

    Gara, P.; Martin, E.

    1983-01-01

    The CANAL code presented here optimizes a realistic iron free extraction channel which has to provide a given transversal magnetic field law in the median plane: the current bars may be curved, have finite lengths and cooling ducts and move in a restricted transversal area; terminal connectors may be added, images of the bars in pole pieces may be included. A special option optimizes a real set of circular coils [fr

  19. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    Science.gov (United States)

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams. Copyright © 2015, American Association for the Advancement of Science.
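
    One of the reported criteria, whether the original effect size falls inside the replication's 95% confidence interval, is easy to state precisely. The sketch below uses a normal approximation and invented numbers; it mirrors the criterion, not the paper's actual computations.

    ```python
    def original_in_replication_ci(orig_effect, rep_effect, rep_se, z=1.96):
        """Check whether the original effect size lies inside the replication's
        95% confidence interval (normal approximation)."""
        lo, hi = rep_effect - z * rep_se, rep_effect + z * rep_se
        return lo <= orig_effect <= hi

    # Hypothetical numbers: original effect 0.42, replication 0.20 +/- 0.09 (SE).
    print(original_in_replication_ci(0.42, 0.20, 0.09))   # False: outside the CI
    ```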

  20. Best Practices for Computational Science: Software Infrastructure and Environments for Reproducible and Extensible Research

    OpenAIRE

    Stodden, Victoria; Miguez, Sheila

    2014-01-01

    The goal of this article is to coalesce a discussion around best practices for scholarly research that utilizes computational methods, by providing a formalized set of best practice recommendations to guide computational scientists and other stakeholders wishing to disseminate reproducible research, facilitate innovation by enabling data and code re-use, and enable broader communication of the output of computational scientific research. Scholarly dissemination and communication standards are...

  1. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  2. Atom trap trace analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Z.-T.; Bailey, K.; Chen, C.-Y.; Du, X.; Li, Y.-M.; O'Connor, T. P.; Young, L.

    2000-05-25

    A new method of ultrasensitive trace-isotope analysis has been developed based upon the technique of laser manipulation of neutral atoms. It has been used to count individual 85Kr and 81Kr atoms present in a natural krypton sample with isotopic abundances in the range of 10^-11 and 10^-13, respectively. The atom counts are free of contamination from other isotopes, elements, or molecules. The method is applicable to other trace-isotopes that can be efficiently captured with a magneto-optical trap, and has a broad range of potential applications.

  3. Atom trap trace analysis

    International Nuclear Information System (INIS)

    Lu, Z.-T.; Bailey, K.; Chen, C.-Y.; Du, X.; Li, Y.-M.; O'Connor, T. P.; Young, L.

    2000-01-01

    A new method of ultrasensitive trace-isotope analysis has been developed based upon the technique of laser manipulation of neutral atoms. It has been used to count individual 85Kr and 81Kr atoms present in a natural krypton sample with isotopic abundances in the range of 10^-11 and 10^-13, respectively. The atom counts are free of contamination from other isotopes, elements, or molecules. The method is applicable to other trace-isotopes that can be efficiently captured with a magneto-optical trap, and has a broad range of potential applications

  4. Oscilloscope trace photograph digitizing system (TRACE)

    International Nuclear Information System (INIS)

    Richards, M.; Dabbs, R.D.

    1977-10-01

    The digitizing system allows digitization of photographs or sketches of waveforms; the computer is then used to reduce and analyze the data. The software allows for alignment, calibration, removal of baselines, removal of unwanted points and addition of new points, which makes for a fairly versatile system as far as data reduction and manipulation are concerned. System considerations are introduced first to orient the potential user to the process of digitizing information. The start-up and actual commands for TRACE are discussed. Detailed descriptions of each subroutine and program section are also provided. Three general examples of typical photographs are included. A partial listing of FAWTEK is made available. Once suitable arrays containing the data are arranged, ''GO FA'' (which activates FAWTEK) may be issued and many mathematical operations performed to further analyze the data

  5. A Kepler Workflow Tool for Reproducible AMBER GPU Molecular Dynamics.

    Science.gov (United States)

    Purawat, Shweta; Ieong, Pek U; Malmstrom, Robert D; Chan, Garrett J; Yeung, Alan K; Walker, Ross C; Altintas, Ilkay; Amaro, Rommie E

    2017-06-20

    With the drive toward high throughput molecular dynamics (MD) simulations involving ever-greater numbers of simulation replicates run for longer, biologically relevant timescales (microseconds), the need for improved computational methods that facilitate fully automated MD workflows gains more importance. Here we report the development of an automated workflow tool to perform AMBER GPU MD simulations. Our workflow tool capitalizes on the capabilities of the Kepler platform to deliver a flexible, intuitive, and user-friendly environment and the AMBER GPU code for a robust and high-performance simulation engine. Additionally, the workflow tool reduces user input time by automating repetitive processes and facilitates access to GPU clusters, whose high-performance processing power makes simulations of large numerical scale possible. The presented workflow tool facilitates the management and deployment of large sets of MD simulations on heterogeneous computing resources. The workflow tool also performs systematic analysis on the simulation outputs and enhances simulation reproducibility, execution scalability, and MD method development including benchmarking and validation. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  6. Reproducibility of morphometric X-ray absorptiometry

    International Nuclear Information System (INIS)

    Culton, N.; Pocock, N.

    1999-01-01

    Full text: Morphometric X-ray absorptiometry (MXA) using DXA is potentially a useful clinical tool which may provide additional vertebral fracture information with low radiation exposure. While morphometric analysis is semi-automated, operator intervention is crucial for the accurate positioning of the six data points quantifying the vertebral heights at the anterior, middle and posterior positions. Our study evaluated intra-operator reproducibility of MXA in an elderly patient population and assessed the effect of training and experience on vertebral height precision. Ten patients, with a mean lumbar T score of - 2.07, were studied. Images were processed by a trained operator who had initially only limited morphometric experience. The analysis of the data files were repeated at 2 and 6 weeks, during which time the operator had obtained further experience and training. The intra-operator precision of vertebral height measurements was calculated using the three separate combinations of paired analyses, and expressed as the coefficient of variation. This study confirms the importance of adequate training and attention to detail in MXA analysis. The data indicate that the precision of MXA is adequate for its use in the diagnosis of vertebral fractures, based on a 20% deformity criteria. Use of MXA for monitoring would require approximately an 8% change in vertebral heights to achieve statistical significance

  7. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
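
    The mechanism blamed here, accumulation of single-precision rounding differences, is easy to demonstrate in isolation. The NumPy snippet below is a generic illustration, unrelated to FSL, Freesurfer or CIVET: the same million numbers summed in float32 and float64 already disagree slightly.

    ```python
    import numpy as np

    # Summing identical values in different precisions (or different orders)
    # yields slightly different results; long pipelines can amplify the drift.
    x = np.random.default_rng(0).random(1_000_000)
    s32 = np.sum(x.astype(np.float32))
    s64 = np.sum(x.astype(np.float64))
    print(f"float32 sum: {s32:.6f}")
    print(f"float64 sum: {s64:.6f}")
    print(f"difference : {abs(float(s32) - s64):.2e}")   # nonzero rounding drift
    ```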

  8. Modeling reproducibility of porescale multiphase flow experiments

    Science.gov (United States)

    Ling, B.; Tartakovsky, A. M.; Bao, J.; Oostrom, M.; Battiato, I.

    2017-12-01

    Multi-phase flow in porous media is widely encountered in geological systems. Understanding immiscible fluid displacement is crucial for processes including, but not limited to, CO2 sequestration, non-aqueous phase liquid contamination and oil recovery. Microfluidic devices and porescale numerical models are commonly used to study multiphase flow in biological, geological, and engineered porous materials. In this work, we perform a set of drainage and imbibition experiments in six identical microfluidic cells to study the reproducibility of multiphase flow experiments. We observe significant variations in the experimental results, which are smaller during the drainage stage and larger during the imbibition stage. We demonstrate that these variations are due to sub-porescale geometry differences in microcells (because of manufacturing defects) and variations in the boundary condition (i.e., fluctuations in the injection rate inherent to syringe pumps). Computational simulations are conducted using commercial software STAR-CCM+, both with constant and randomly varying injection rate. Stochastic simulations are able to capture variability in the experiments associated with the varying pump injection rate.

  9. Environment and industrial economy: Challenge of reproducibility

    International Nuclear Information System (INIS)

    Rullani, E.

    1992-01-01

    Historically and methodologically counterposed until now, the environmentalist and the economic approach to environmental problems need to be integrated in a new approach that considers, from one side, the relevance of the ecological equilibria for the economic systems and, from the other side, the economic dimension (in terms of investments and transformations in the production system) of any attempt to achieve a better environment. In order to achieve this integration, both approaches are compelled to give up some cultural habits that have characterized them, and have contributed to over-emphasize the opposition between them. The article shows that both approaches can converge into a new one, in which environment is no longer only an holistic, not bargainable, natural external limit to human activity (as in the environmentalist approach), nor simply a scarce and exhaustible resource (as economics tends to consider it); environment should instead become part of the reproducibility sphere, or, in other words, it must be regarded as part of the output that the economic system provides. This new approach, due to scientific and technological advances, is made possible for an increasing class of environmental problems. In order to do this, an evolution is required, that could be able to convert environmental goals into investment and technological innovation goals, and communicate to the firms the value society assigns to environmental resources. This value, the author suggests, should correspond to the reproduction cost. Various examples of this new approach are analyzed and discussed

  10. Queer Tracings of Genre

    DEFF Research Database (Denmark)

    Balle, Søren Hattesen

    as (re)tracings of genres that appear somehow residual or defunct in a post-modernist poetic context. On the other, they are made to "encode new [and queer, shb] meanings" (Anne Ferry) inasmuch as Ashbery, for instance, doubles and literalizes Dante's false etymology of the word ‘eclogue' (aig- and logos...

  11. The Trace of Superusers

    DEFF Research Database (Denmark)

    Samson, Kristine; Abasolo, José

    2013-01-01

    The city and its public spaces can be seen as a fragmented whole carrying meanings and traces of culture, use and politics with it. Whereas architects impose new stories and meanings on the urban fabric, the city itself is layered and assembled, a collective of social flows and routines a result ...

  12. Third order trace formula

    Indian Academy of Sciences (India)

    Centre for Advanced Scientific Research, Bangalore 560 064, India; Indian Institute of ... In [5,16], Koplienko's trace formula was derived for rational functions φ with poles off R. ...

  13. Advanced methodology to simulate boiling water reactor transient using coupled thermal-hydraulic/neutron-kinetic codes

    Energy Technology Data Exchange (ETDEWEB)

    Hartmann, Christoph Oliver

    2016-06-13

    -sets) predicted by SCALE6/TRITON and CASMO. Thereby the coupled TRACE/PARCS simulations reproduced the single fuel assembly depletion and stand-alone PARCS results. A turbine trip event, which occurred at a BWR plant of type 72, has been investigated in detail using the cross-section libraries generated with SCALE/TRITON and CASMO. The evolution of the integral BWR parameters predicted by the coupled codes using cross-sections from SCALE/TRITON is very close to the global trends calculated using CASMO cross-sections. Further, to implement uncertainty quantification, the PARCS reactor dynamics code was extended (uncertainty module) to facilitate the consideration of the uncertainty of neutron kinetics parameters in coupled TRACE/PARCS simulations. For a postulated pressure perturbation, an uncertainty and sensitivity study was performed using TRACE/PARCS and SUSA. The obtained results illustrate the capability of such methodologies, which are still under development. Based on this analysis, the uncertainty band for key parameters, e.g. reactivity, as well as the importance ranking of reactor kinetics parameters, could be predicted and identified for this accident scenario.
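
    Uncertainty studies of this GRS/SUSA type typically sample the uncertain parameters randomly and bound the output with Wilks' order-statistic rule (59 runs suffice for a first-order one-sided 95%/95% statement). The sketch below illustrates only the sampling logic with a stand-in algebraic model; the parameter distributions are invented and no TRACE/PARCS physics is implied.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def model(params):
        """Stand-in for one coupled code run: returns a scalar figure of merit
        for one sampled set of kinetics parameters (purely illustrative)."""
        beta_eff, lambda_d = params
        return 1.0 + 0.8 * (0.0065 - beta_eff) / 0.0065 + 0.05 * lambda_d

    # Wilks, first order, one-sided: 59 random samples -> 95%/95% upper bound.
    n_runs = 59
    samples = np.column_stack([
        rng.normal(0.0065, 0.0003, n_runs),   # assumed beta_eff uncertainty
        rng.normal(0.08, 0.005, n_runs),      # assumed decay-constant uncertainty
    ])
    outputs = np.array([model(p) for p in samples])
    print("95/95 upper tolerance limit:", outputs.max())
    ```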

  14. Liquid scintigraphic gastric emptying - is it reproducible?

    International Nuclear Information System (INIS)

    Cooper, R.G.; Shuter, B.; Leach, M.; Roach, P.J.

    1999-01-01

    Full text: Radioisotope gastric emptying (GE) studies have been used as a non-invasive technique for motility assessment for many years. In a recent study investigating the correlation of mesenteric vascular changes with GE, six subjects had a repeat study 2-4 months later. Repeat studies were required due to minor technical problems (5 subjects) and a very slow GE (1 subject) on the original study. Subjects drank 275 ml of 'Ensure Plus' mixed with 8 MBq 67Ga-DTPA and were imaged for 2 h while lying supine. GE time-activity curves for each subject were generated and the time to half emptying (T1/2) calculated. Five of the six subjects had more rapid GE on the second study. Three of the subjects had T1/2 values on their second study which were within ± 15 min of their original T1/2. The other three subjects had T1/2 values on their second study which were 36 min, 55 min and 280 min (subject K.H.) less than their original T1/2. Statistical analysis (t-test) was performed on paired T1/2 values. The average T1/2 value was greater in the first study than in the second (149 ± 121 and 86 ± 18 min respectively), although the difference was not statistically significant (P ∼ 0.1). Subjects' anxiety levels were not quantitated during the GE study; however, several major equipment faults occurred during the original study of subject K.H., who became visibly stressed. These results suggest that the reproducibility of GE studies may be influenced by psychological factors
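
    The quantity compared between visits, the half-emptying time T1/2, is read off the time-activity curve. A minimal sketch of that reduction step follows; the mono-exponential test curve is synthetic, not patient data.

    ```python
    import numpy as np

    def half_emptying_time(t_min, counts):
        """Linearly interpolate the time at which the (decay-corrected)
        stomach counts first fall to half of their peak value."""
        counts = np.asarray(counts, float)
        half = counts.max() / 2.0
        below = np.nonzero(counts <= half)[0]
        if below.size == 0:
            return None                  # did not empty halfway during imaging
        i = below[0]
        # interpolate between frames i-1 and i
        f = (counts[i - 1] - half) / (counts[i - 1] - counts[i])
        return t_min[i - 1] + f * (t_min[i] - t_min[i - 1])

    t = np.arange(0, 121, 10)                                   # 2 h study, 10-min frames
    activity = 1000 * np.exp(-t / 90.0)                         # toy emptying curve
    print(f"T1/2 = {half_emptying_time(t, activity):.0f} min")  # ~62 min (90*ln 2)
    ```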

  15. Is my network module preserved and reproducible?

    Directory of Open Access Journals (Sweden)

    Peter Langfelder

    2011-01-01

    Full Text Available In many applications, one is interested in determining which of the properties of a network module change across conditions. For example, to validate the existence of a module, it is desirable to show that it is reproducible (or preserved) in an independent test network. Here we study several types of network preservation statistics that do not require a module assignment in the test network. We distinguish network preservation statistics by the type of the underlying network. Some preservation statistics are defined for a general network (defined by an adjacency matrix) while others are only defined for a correlation network (constructed on the basis of pairwise correlations between numeric variables). Our applications show that the correlation structure facilitates the definition of particularly powerful module preservation statistics. We illustrate that evaluating module preservation is in general different from evaluating cluster preservation. We find that it is advantageous to aggregate multiple preservation statistics into summary preservation statistics. We illustrate the use of these methods in gene co-expression network applications including (1) preservation of the cholesterol biosynthesis pathway in mouse tissues, (2) comparison of human and chimpanzee brain networks, (3) preservation of selected KEGG pathways between human and chimpanzee brain networks, (4) sex differences in human cortical networks, and (5) sex differences in mouse liver networks. While we find no evidence for sex-specific modules in human cortical networks, we find that several human cortical modules are less preserved in chimpanzees. In particular, apoptosis genes are differentially co-expressed between humans and chimpanzees. Our simulation studies and applications show that module preservation statistics are useful for studying differences between the modular structure of networks. Data, R software and accompanying tutorials can be downloaded from the following webpage: http
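
    As a flavor of the correlation-network statistics discussed above, the sketch below computes one simple preservation measure: the correlation between the entries of a module's gene-gene correlation matrices in the reference and test data sets. It is a schematic in NumPy, not the published R implementation.

    ```python
    import numpy as np

    def cor_cor_preservation(ref_data, test_data):
        """Correlate the gene-gene correlation matrices of the same module in a
        reference and a test data set (columns = genes, rows = samples);
        values near 1 suggest the module's correlation structure is preserved."""
        r_ref = np.corrcoef(ref_data, rowvar=False)
        r_test = np.corrcoef(test_data, rowvar=False)
        iu = np.triu_indices_from(r_ref, k=1)          # unique gene pairs only
        return np.corrcoef(r_ref[iu], r_test[iu])[0, 1]

    rng = np.random.default_rng(3)
    base = rng.normal(size=(60, 10))                   # 60 samples x 10 module genes
    noisy = base + 0.5 * rng.normal(size=base.shape)   # "test" set sharing structure
    print(f"cor.cor = {cor_cor_preservation(base, noisy):.2f}")
    ```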

  16. Using Docker Containers to Extend Reproducibility Architecture for the NASA Earth Exchange (NEX)

    Science.gov (United States)

    Votava, Petr; Michaelis, Andrew; Spaulding, Ryan; Becker, Jeffrey C.

    2016-01-01

    NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data where a focused community can come together to address large-scale challenges in Earth sciences. As NEX has been growing into a petabyte-size platform for analysis, experiments and data production, it has been increasingly important to enable users to easily retrace their steps, identify what datasets were produced by which process chains, and give them the ability to readily reproduce their results. This can be a tedious and difficult task even for a small project, but is almost impossible on large processing pipelines. We have developed an initial reproducibility and knowledge capture solution for the NEX; however, if users want to move the code to another system, whether it is their home institution cluster, laptop or the cloud, they have to find, build and install all the required dependencies that would run their code. This can be a very tedious and tricky process and is a big impediment to moving code to data and reproducibility outside the original system. The NEX team has tried to assist users who wanted to move their code into OpenNEX on the Amazon cloud by creating custom virtual machines with all the software and dependencies installed, but this, while solving some of the issues, creates a new bottleneck that requires the NEX team to be involved with any new request, updates to virtual machines and general maintenance support. In this presentation, we will describe a solution that integrates NEX and Docker to bridge the gap in code-to-data migration. The core of the solution is semi-automatic conversion of science codes, tools and services that are already tracked and described in the NEX provenance system, to Docker - an open-source Linux container software. Docker is available on most computer platforms, easy to install and capable of seamlessly creating and/or executing any application packaged in the appropriate format. We
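
    The container packaging described above can be pictured with a minimal Dockerfile. This is a generic sketch under assumed names: the base image, packages and the process_tile.py script are hypothetical stand-ins, not artifacts of the actual NEX system.

    ```dockerfile
    # Hypothetical packaging of one provenance-tracked processing step.
    FROM ubuntu:16.04
    RUN apt-get update && apt-get install -y python python-numpy
    COPY process_tile.py /opt/nex/process_tile.py  # hypothetical science code
    ENTRYPOINT ["python", "/opt/nex/process_tile.py"]
    ```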

  17. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  18. Markov traces and II1 factors in conformal field theory

    International Nuclear Information System (INIS)

    Boer, J. de; Goeree, J.

    1991-01-01

    Using the duality equations of Moore and Seiberg we define for every primary field in a Rational Conformal Field Theory a proper Markov trace and hence a knot invariant. Next we define two nested algebras and show, using results of Ocneanu, how the position of the smaller algebra in the larger one reproduces part of the duality data. A new method for constructing Rational Conformal Field Theories is proposed. (orig.)

  19. Analysis of Uncertainty and Sensitivity with TRACE-SUSA and TRACE-DAKOTA. Application to NUPEC BFTB; Analisis de Incertidumbre y Sensibilidad con TRACE-SUSA y TRACE-DAKOTA. Aplicacion a NUPEC BFTB

    Energy Technology Data Exchange (ETDEWEB)

    Montero-Mayorga, J.; Wadim, J.; Sanchez, V. H.

    2012-07-01

    The aim of this work is to test the capabilities of the new uncertainty analysis tool incorporated into SNAP by simulating experiments with the TRACE code and comparing the results with those obtained from the same simulations using the SUSA tool.

  20. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
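
    For background, the static construction underlying Shannon coding fits in a few lines of Python: a symbol of probability p receives a codeword of length ceil(-log2 p), read off the binary expansion of the cumulative probability. The paper's algorithm maintains such a code dynamically as symbol frequencies evolve; the sketch below shows only the static step.

    ```python
    import math

    def shannon_code(probs):
        """Static Shannon code. Sorting by decreasing probability makes the
        truncated binary expansions of the cumulative sums prefix-free."""
        symbols = sorted(probs, key=probs.get, reverse=True)
        code, cum = {}, 0.0
        for s in symbols:
            length = max(1, math.ceil(-math.log2(probs[s])))
            frac, bits = cum, []
            for _ in range(length):  # first `length` bits of the cumulative sum
                frac *= 2
                bit, frac = divmod(frac, 1)
                bits.append(str(int(bit)))
            code[s] = "".join(bits)
            cum += probs[s]
        return code

    print(shannon_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
    # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
    ```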

  1. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field. Highlights: two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding; Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes; distance properties of convolutional codes; and a downloadable solutions manual.

  2. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for coding theory than codes over classical finite fields.

  3. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M A; Fink, D; Hua, Q; Jacobsen, G E; Lawson, E M; Smith, A M; Tuniz, C [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1997-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as {sup 14}C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, {sup 14}C/{sup 13}C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by {sup 14}C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure {sup 14}C with 0.5% precision. In the two years since becoming operational, more than 1000 {sup 14}C samples have been measured. Recent improvements in precision for {sup 14}C have been achieved with the commissioning of a 59 sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both on the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in

  4. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M.A.; Fink, D.; Hua, Q.; Jacobsen, G.E.; Lawson, E. M.; Smith, A.M.; Tuniz, C. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as {sup 14}C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, {sup 14}C/{sup 13}C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by {sup 14}C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure {sup 14}C with 0.5% precision. In the two years since becoming operational, more than 1000 {sup 14}C samples have been measured. Recent improvements in precision for {sup 14}C have been achieved with the commissioning of a 59 sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both on the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in

  5. On Trace Zero Matrices

    Indian Academy of Sciences (India)

    In this note, we shall try to present an elementary proof of a couple of closely related results which have both proved quite useful, and also indicate possible generalisations. The results we have in mind are the following facts: (a) A complex n x n matrix A has trace 0 if and only if it is expressible in the form A = PQ - QP.

  6. Preconcentration of trace elements

    International Nuclear Information System (INIS)

    Zolotov, Yu. A.; Kuz'min, N.M.

    1990-01-01

    This monograph deals with the theory and practical applications of trace metals preconcentration. It gives general characteristics of the process and describes in detail the methods of preconcentration: solvent extraction, sorption, co-precipitation, volatilization, and others. Special attention is given to preconcentration in combination with subsequent determination methods. The use of preconcentration in analysis of environmental and biological samples, mineral raw materials, high purity substances, and various industrial materials is also considered

  7. A manual to the MAXRAY program library for reflective and dispersive ray tracing

    International Nuclear Information System (INIS)

    Svensson, S.; Nyholm, R.

    1985-07-01

    A general ray tracing program package for reflective and dispersive X-ray optics is described. The package consists of a number of subroutines written in FORTRAN 77 code giving the necessary tools for ray tracing. The program package is available on request from the authors. (authors)

  8. Anisotropic ray trace

    Science.gov (United States)

    Lam, Wai Sze Tiffany

    Optical components made of anisotropic materials, such as crystal polarizers and crystal waveplates, are widely used in many complex optical systems, such as display systems, microlithography and biomedical imaging, and induce more complex aberrations than optical components made of isotropic materials. The goal of this dissertation is to accurately simulate the performance of optical systems with anisotropic materials using polarization ray trace. This work extends the polarization ray tracing calculus to incorporate ray tracing through anisotropic materials, including uniaxial, biaxial and optically active materials. The 3D polarization ray tracing calculus is an invaluable tool for analyzing polarization properties of an optical system. The 3x3 polarization ray tracing P matrix developed for anisotropic ray trace assists tracking the 3D polarization transformations along a ray path through a series of surfaces in an optical system. To better represent the anisotropic light-matter interactions, the definition of the P matrix is generalized to incorporate not only the polarization change at a refraction/reflection interface, but also the induced optical phase accumulation as light propagates through the anisotropic medium. This enables realistic modeling of crystalline polarization elements, such as crystal waveplates and crystal polarizers. The wavefront and polarization aberrations of these anisotropic components are more complex than those of isotropic optical components and can be evaluated from the resultant P matrix for each eigen-wavefront as well as for the overall image. One incident ray refracting or reflecting into an anisotropic medium produces two eigenpolarizations or eigenmodes propagating in different directions. The associated ray parameters of these modes necessary for the anisotropic ray trace are described in Chapter 2. The algorithms to calculate the P matrix from these ray parameters are described in Chapter 3 for
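
    The bookkeeping behind the cumulative P matrix reduces to an ordered matrix product along the ray path. The sketch below is a generic Python illustration of that calculus, with made-up matrices rather than values from the dissertation.

    ```python
    import numpy as np

    def cumulative_P(P_list):
        """Cumulative 3x3 polarization ray-tracing matrix for a ray path.
        P_list holds each surface/medium P matrix in the order the ray
        encounters them; the product is applied right-to-left."""
        P_total = np.eye(3, dtype=complex)
        for P in P_list:
            P_total = P @ P_total
        return P_total

    # Illustrative only: a surface that flips the y component, then identity.
    P1 = np.diag([1, -1, 1]).astype(complex)
    P2 = np.eye(3, dtype=complex)
    print(cumulative_P([P1, P2]))
    ```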

  9. Tracers and tracing methods

    International Nuclear Information System (INIS)

    Leclerc, J.P.

    2001-01-01

    The first international congress on 'Tracers and tracing methods' took place in Nancy in May 2001. The objective of this second congress was to present the current status and trends of tracing methods and their applications. It gave the opportunity to people from different fields to exchange scientific information and knowledge about tracer methodologies and applications. The target participants were the researchers, engineers and technologists of various industrial and research sectors: chemical engineering, environment, food engineering, bio-engineering, geology, hydrology, civil engineering, iron and steel production... Two sessions were planned to cover both fundamental and industrial aspects: 1) fundamental developments (tomography, tracer camera visualization and particle tracking; validation of computational fluid dynamics simulations by tracer experiments and numerical residence time distribution; new tracers and detectors or improvement and development of existing tracing methods; data treatment and modeling; reactive tracer experiments and interpretation); 2) industrial applications (geology, hydrogeology and oil field applications; civil engineering, mineral engineering and metallurgy applications; chemical engineering; environment; food engineering and bio-engineering). The program included 5 plenary lectures, 23 oral communications and around 50 posters. Only 9 presentations are of interest for the INIS database.

  10. PLASMOR: A laser-plasma simulation code. Pt. 2

    International Nuclear Information System (INIS)

    Salzman, D.; Krumbein, A.D.; Szichman, H.

    1987-06-01

    This report supplements a previous one which describes the PLASMOR hydrodynamics code. The present report documents the recent changes and additions made to the code. In particular, it describes two new subroutines for radiative preheat, a system of preprocessors which prepare the code before a run, a list of postprocessors which simulate experimental setups, and the basic data sets required to run PLASMOR. In the Appendix, a new computer-based manual which lists the main features of PLASMOR is reproduced.

  11. Evaluation of the reproducibility of two techniques used to determine and record centric relation in angle's class I patients

    Directory of Open Access Journals (Sweden)

    Fernanda Paixão

    2007-08-01

    Full Text Available The centric relation is a mandibular position that determines a balanced relation among the temporomandibular joints, the chewing muscles and the occlusion. This position makes it possible for the dentist to plan and execute oral rehabilitation respecting the physiological principles of the stomatognathic system. The aim of this study was to investigate the reproducibility of centric relation records obtained using two techniques: Dawson's Bilateral Manipulation and Gysi's Gothic Arch Tracing. Twenty volunteers (14 females and 6 males) with no dental loss, presenting occlusal contacts as described in Angle's class I classification and without signs and symptoms of temporomandibular disorders were selected. All volunteers were submitted five times, at 1-week intervals and always at the same time of day, to Dawson's Bilateral Manipulation and to Gysi's Gothic Arch Tracing with the aid of an intraoral apparatus. The average standard error of each technique was calculated (Bilateral Manipulation 0.94 and Gothic Arch Tracing 0.27). The Shapiro-Wilk test was applied and the results allowed application of Student's t-test (sampling error of 5%). The techniques showed different degrees of variability. The Gysi's Gothic Arch Tracing was found to be more accurate than the Bilateral Manipulation in reproducing the centric relation records.

  12. Goya - an MHD equilibrium code for toroidal plasmas

    International Nuclear Information System (INIS)

    Scheffel, J.

    1984-09-01

    A description of the GOYA free-boundary equilibrium code is given. The non-linear Grad-Shafranov equation of ideal MHD is solved in a toroidal geometry for plasmas with purely poloidal magnetic fields. The code is based on a field line-tracing procedure, making storage of a large amount of information on a grid unnecessary. Usage of the code is demonstrated by computations of equilibria for the EXTRAP-T1 device. (Author)
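
    A field line-tracing procedure of the general kind used by such codes can be sketched as numerical integration of dx/ds = B(x)/|B(x)|. The sketch below uses classical RK4 and an illustrative field; it is not GOYA's actual scheme.

    ```python
    import numpy as np

    def trace_field_line(B, x0, ds=1e-3, n_steps=1000):
        """Integrate dx/ds = B(x)/|B(x)| with classical RK4, where B is
        any callable returning the magnetic field vector at a point."""
        def d(x):
            b = B(x)
            return b / np.linalg.norm(b)
        path = [np.asarray(x0, dtype=float)]
        for _ in range(n_steps):
            x = path[-1]
            k1 = d(x)
            k2 = d(x + 0.5 * ds * k1)
            k3 = d(x + 0.5 * ds * k2)
            k4 = d(x + ds * k3)
            path.append(x + ds * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0)
        return np.array(path)

    # Illustrative poloidal-like field circling the origin: B = (-y, x, 0).
    line = trace_field_line(lambda p: np.array([-p[1], p[0], 0.0]), [1.0, 0.0, 0.0])
    ```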

  13. An analysis of reproducibility and non-determinism in HEP software and ROOT data

    Science.gov (United States)

    Ivie, Peter; Zheng, Charles; Lannon, Kevin; Thain, Douglas

    2017-10-01

    Reproducibility is an essential component of the scientific method. In order to validate the correctness or facilitate the extension of a computational result, it should be possible to re-run a published result and verify that the same results are produced. However, reproducing a computational result is surprisingly difficult: non-determinism and other factors may make it impossible to get the same result, even when running the same code on the same machine on the same day. We explore this problem in the context of HEP codes and data, showing three high level methods for dealing with non-determinism in general: 1) Domain specific methods; 2) Domain specific comparisons; and 3) Virtualization adjustments. Using a CMS workflow with output data stored in ROOT files, we use these methods to prevent, detect, and eliminate some sources of non-determinism. We observe improved determinism using pre-determined random seeds, a predictable progression of system timestamps, and fixed process identifiers. Unfortunately, sources of non-determinism continue to exist despite the combination of all three methods. Hierarchical data comparisons also allow us to appropriately ignore some non-determinism when it is unavoidable. We conclude that there is still room for improvement, and identify directions that can be taken in each method to make an experiment more reproducible.
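
    The first two methods generalize beyond HEP and can be illustrated with a short, generic Python sketch (not the authors' CMS tooling): pre-determine random seeds, and compare results while ignoring fields that are legitimately volatile.

    ```python
    import random
    import numpy as np

    def pin_random_sources(seed=42):
        """Method 1 (domain-specific): pre-determined seeds make re-runs
        draw identical random sequences. A real workflow must also pin
        seeds inside its simulation framework; note that Python hash
        randomization can only be disabled before interpreter start
        (PYTHONHASHSEED in the environment)."""
        random.seed(seed)
        np.random.seed(seed)

    def equivalent(run_a, run_b, volatile=("timestamp", "hostname", "pid")):
        """Method 2 (domain-specific comparison): drop fields that are
        legitimately non-deterministic before comparing result records."""
        strip = lambda d: {k: v for k, v in d.items() if k not in volatile}
        return strip(run_a) == strip(run_b)

    pin_random_sources()
    print(equivalent({"n_events": 100, "timestamp": 1},
                     {"n_events": 100, "timestamp": 2}))  # True
    ```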

  14. Guidelines for Reproducibly Building and Simulating Systems Biology Models.

    Science.gov (United States)

    Medley, J Kyle; Goldberg, Arthur P; Karr, Jonathan R

    2016-10-01

    Reproducibility is the cornerstone of the scientific method. However, currently, many systems biology models cannot easily be reproduced. This paper presents methods that address this problem. We analyzed the recent Mycoplasma genitalium whole-cell (WC) model to determine the requirements for reproducible modeling. We determined that reproducible modeling requires both repeatable model building and repeatable simulation. New standards and simulation software tools are needed to enhance and verify the reproducibility of modeling. New standards are needed to explicitly document every data source and assumption, and new deterministic parallel simulation tools are needed to quickly simulate large, complex models. We anticipate that these new standards and software will enable researchers to reproducibly build and simulate more complex models, including WC models.

  15. Development Status of TRACE model for PGSFR Safety Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Andong; Choi, Yong Won; Kim, Jihun; Bae, Moohoon [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-05-15

    For the preparation of the review of the licensing application for the PGSFR, a TRACE model of the PGSFR is being developed, taking into account the sodium-related properties and models in the code. For licensing use, the model uncertainties in the code need to be quantified, and conservative conditions for accident analysis need to be defined and validated. Current simulations are applicable only to assembly-averaged assessments, so a pin-wise assessment within the hot assembly also needs to be defined. On the basis of the developed model, PGSFR design changes will be incorporated and the model improved for independent audit calculations in the upcoming licensing review. The 150 MWe Prototype Generation IV Sodium-cooled Fast Reactor (PGSFR) is under development, targeting a licensing application by 2017. KINS is preparing the review of its licensing application; in particular, the audit calculation tool for transient and accident analysis is being prepared for the review. Since 2012, a TRACE code applicability study has been under way for sodium-cooled fast reactors. First, the sodium properties and the related heat transfer models in the code were reviewed. The Demonstration Sodium-cooled Fast Reactor (DSFR-600) was modelled and representative DBAs were assessed until the PGSFR design was fixed. The EBR-II Shutdown Heat Removal Test (SHRT) experiment is also being analyzed within an IAEA Coordinated Research Program. In this paper, the status of PGSFR TRACE code modeling and considerations for SFR DBA assessment are introduced.

  16. Mobile code security

    Science.gov (United States)

    Ramalingam, Srikumar

    2001-11-01

    A highly secure mobile agent system is very important for a mobile computing environment. The security issues in mobile agent systems comprise protecting mobile hosts from malicious agents, protecting agents from other malicious agents, protecting hosts from other malicious hosts and protecting agents from malicious hosts. Using traditional security mechanisms the first three security problems can be solved. Apart from using trusted hardware, very few approaches exist to protect mobile code from malicious hosts. Some of the approaches to solve this problem are the use of trusted computing, computing with encrypted functions, steganography, cryptographic traces, Seal Calculus, etc. This paper focuses on the simulation of some of these existing techniques in the designed mobile language. Some new approaches to solve the malicious network problem and the agent tampering problem are developed using public key encryption and steganographic concepts. The approaches are based on encrypting and hiding the partial solutions of the mobile agents. The partial results are stored and the address of the storage is destroyed as the agent moves from one host to another. This allows only the originator to make use of the partial results. Through these approaches some of the existing problems are solved.

  17. Running an open experiment: transparency and reproducibility in soil and ecosystem science

    Science.gov (United States)

    Bond-Lamberty, Ben; Peyton Smith, A.; Bailey, Vanessa

    2016-08-01

    Researchers in soil and ecosystem science, and almost every other field, are being pushed—by funders, journals, governments, and their peers—to increase the transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists however lack experience in, and may be unsure of the benefits of, making their data and fully-reproducible analyses publicly available. Here we describe a recent ‘open experiment’, in which we documented every aspect of a soil incubation online, making all raw data, scripts, diagnostics, final analyses, and manuscripts available in real time. We found that using tools such as version control, issue tracking, and open-source statistical software improved data integrity, accelerated our team’s communication and productivity, and ensured transparency. There are many avenues to improve scientific reproducibility and data availability, of which this is only one example, and it is not an approach suited to every experiment or situation. Nonetheless, we encourage the communities in our respective fields to consider its advantages, and to lead rather than follow with respect to scientific reproducibility, transparency, and data availability.

  18. GUIdock: Using Docker Containers with a Common Graphics User Interface to Address the Reproducibility of Research.

    Directory of Open Access Journals (Sweden)

    Ling-Hong Hung

    Full Text Available Reproducibility is vital in science. For complex computational methods, it is often necessary, not just to recreate the code, but also the software and hardware environment to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface.
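
    On a Linux host, the X11-sharing pattern this approach builds on looks like the following generic docker invocation (GUIdock ships its own per-platform launch scripts; the image name here is a hypothetical placeholder).

    ```bash
    docker run --rm \
      -e DISPLAY=$DISPLAY \
      -v /tmp/.X11-unix:/tmp/.X11-unix \
      example/guidock-image  # hypothetical image name
    ```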

  19. GUIdock: Using Docker Containers with a Common Graphics User Interface to Address the Reproducibility of Research.

    Science.gov (United States)

    Hung, Ling-Hong; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee

    2016-01-01

    Reproducibility is vital in science. For complex computational methods, it is often necessary, not just to recreate the code, but also the software and hardware environment to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface.

  20. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...
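
    The node operation described above amounts to multiplying each incoming length-L packet by its L x L coding matrix and summing over the finite field. A minimal Python sketch over GF(2), with illustrative matrices:

    ```python
    import numpy as np

    def combine_packets(incoming, coding_matrices):
        """Vector network coding at an intermediate node (sketch): multiply
        each length-L packet by its L x L matrix and sum over GF(2)."""
        out = np.zeros_like(incoming[0])
        for pkt, M in zip(incoming, coding_matrices):
            out = (out + M @ pkt) % 2
        return out

    L = 4
    pkts = [np.array([1, 0, 1, 1]), np.array([0, 1, 1, 0])]
    mats = [np.eye(L, dtype=int), np.ones((L, L), dtype=int)]
    print(combine_packets(pkts, mats))
    ```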

  1. Participant Nonnaiveté and the reproducibility of cognitive psychology.

    Science.gov (United States)

    Zwaan, Rolf A; Pecher, Diane; Paolacci, Gabriele; Bouwmeester, Samantha; Verkoeijen, Peter; Dijkstra, Katinka; Zeelenberg, René

    2017-07-25

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature-three each from the domains of perception/action, memory, and language, respectively-and found that they are highly reproducible. Not only can they be reproduced in online environments, but they also can be reproduced with nonnaïve participants with no reduction of effect size. Apparently, some cognitive tasks are so constraining that they encapsulate behavior from external influences, such as testing situation and prior recent experience with the experiment to yield highly robust effects.

  2. Au coated PS nanopillars as a highly ordered and reproducible SERS substrate

    Science.gov (United States)

    Kim, Yong-Tae; Schilling, Joerg; Schweizer, Stefan L.; Sauer, Guido; Wehrspohn, Ralf B.

    2017-07-01

    Noble metal nanostructures with nanometer gap sizes provide strong surface-enhanced Raman scattering (SERS) which can be used to detect trace amounts of chemical and biological molecules. Although several approaches to obtaining active SERS substrates have been reported, it remains a challenge to fabricate SERS substrates with high sensitivity and reproducibility using low-cost techniques. In this article, we report on the fabrication of Au-sputtered PS nanopillars based on a template synthetic method as highly ordered and reproducible SERS substrates. The SERS substrates are fabricated by anodic aluminum oxide (AAO) template-assisted infiltration of polystyrene (PS) resulting in hemispherical structures, and a following Au sputtering process. The optimum gap size between adjacent PS nanopillars and the thickness of the Au layers for high SERS sensitivity are investigated. Using the Au-sputtered PS nanopillars as an active SERS substrate, the Raman signal of 4-methylbenzenethiol (4-MBT) with a concentration down to 10^-9 M is identified with good signal reproducibility, showing great potential as a promising tool for SERS-based detection.

  3. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: • We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. • We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. • We find and classify all 2D homological stabilizer codes. • We find optimal codes among the homological stabilizer codes.

  4. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software codes and data to computational results reported in publications. The package contains a collection of (a computational modules, (b data-processing scripts, and (c research papers. Madagascar is distributed on SourceForge under a GPL v2 license https://sourceforge.net/projects/rsf/. By October 2013, more than 70 people from different organizations around the world have contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.
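
    Madagascar's reproducibility rests on SCons-based processing scripts. A sketch of one such SConstruct file follows, assuming the standard sfspike and sfbandpass modules with illustrative parameters: Flow() records how each file is built, so running scons regenerates every result whose inputs or commands changed.

    ```python
    # SConstruct sketch of a reproducible Madagascar workflow (hedged:
    # module names follow Madagascar conventions, parameters are illustrative).
    from rsf.proj import *

    Flow('spike', None, 'spike n1=1000 k1=300')          # synthetic input trace
    Flow('filtered', 'spike', 'bandpass fhi=2 phase=y')  # band-pass filter it
    Result('filtered', 'graph title="Filtered spike"')   # reproducible figure
    End()
    ```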

  5. Reproducing a Prospective Clinical Study as a Computational Retrospective Study in MIMIC-II.

    Science.gov (United States)

    Kury, Fabrício S P; Huser, Vojtech; Cimino, James J

    2015-01-01

    In this paper we sought to reproduce, as a computational retrospective study in an EHR database (MIMIC-II), a recent large prospective clinical study: the 2013 publication, by the Japanese Association for Acute Medicine (JAAM), about disseminated intravascular coagulation, in the journal Critical Care (PMID: 23787004). We designed in SQL and Java a set of electronic phenotypes that reproduced the study's data sampling, and used R to perform the same statistical inference procedures. All produced source code is available online at https://github.com/fabkury/paamia2015. Our program identified 2,257 eligible patients in MIMIC-II, and the results remarkably agreed with the prospective study. A minority of the needed data elements was not found in MIMIC-II, and statistically significant inferences were possible in the majority of the cases.

  6. Osteoporosis and trace elements

    DEFF Research Database (Denmark)

    Aaseth, J.; Boivin, G.; Andersen, Ole

    2012-01-01

    More than 200 million people are affected by osteoporosis worldwide, as estimated by 2 million annual hip fractures and other debilitating bone fractures (vertebrae compression and Colles' fractures). Osteoporosis is a multi-factorial disease with potential contributions from genetic, endocrine...... in new bone and results in a net gain in bone mass, but may be associated with a tissue of poor quality. Aluminum induces impairment of bone formation. Gallium and cadmium suppresses bone turnover. However, exact involvements of the trace elements in osteoporosis have not yet been fully clarified...

  7. Diagnostic Coding for Epilepsy.

    Science.gov (United States)

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  8. Coding of Neuroinfectious Diseases.

    Science.gov (United States)

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  9. Trace conditioning in insects-keep the trace!

    Science.gov (United States)

    Dylla, Kristina V; Galili, Dana S; Szyszka, Paul; Lüdke, Alja

    2013-01-01

    Trace conditioning is a form of associative learning that can be induced by presenting a conditioned stimulus (CS) and an unconditioned stimulus (US) following each other, but separated by a temporal gap. This gap distinguishes trace conditioning from classical delay conditioning, where the CS and US overlap. To bridge the temporal gap between both stimuli and to form an association between CS and US in trace conditioning, the brain must keep a neural representation of the CS after its termination-a stimulus trace. Behavioral and physiological studies on trace and delay conditioning revealed similarities between the two forms of learning, like similar memory decay and similar odor identity perception in invertebrates. On the other hand differences were reported also, like the requirement of distinct brain structures in vertebrates or disparities in molecular mechanisms in both vertebrates and invertebrates. For example, in commonly used vertebrate conditioning paradigms the hippocampus is necessary for trace but not for delay conditioning, and Drosophila delay conditioning requires the Rutabaga adenylyl cyclase (Rut-AC), which is dispensable in trace conditioning. It is still unknown how the brain encodes CS traces and how they are associated with a US in trace conditioning. Insects serve as powerful models to address the mechanisms underlying trace conditioning, due to their simple brain anatomy, behavioral accessibility and established methods of genetic interference. In this review we summarize the recent progress in insect trace conditioning on the behavioral and physiological level and emphasize similarities and differences compared to delay conditioning. Moreover, we examine proposed molecular and computational models and reassess different experimental approaches used for trace conditioning.

  10. Trace conditioning in insects – Keep the trace!

    Directory of Open Access Journals (Sweden)

    Kristina V Dylla

    2013-08-01

    Full Text Available Trace conditioning is a form of associative learning that can be induced by presenting a conditioned stimulus (CS) and an unconditioned stimulus (US) following each other, but separated by a temporal gap. This gap distinguishes trace conditioning from classical delay conditioning, where the CS and US overlap. To bridge the temporal gap between both stimuli and to form an association between CS and US in trace conditioning, the brain must keep a neural representation of the CS after its termination – a stimulus trace. Behavioral and physiological studies on trace and delay conditioning revealed similarities between the two forms of learning, like similar memory decay and similar odor identity perception in invertebrates. On the other hand, differences were reported also, like the requirement of distinct brain structures in vertebrates or disparities in molecular mechanisms in both vertebrates and invertebrates. For example, in commonly used vertebrate conditioning paradigms the hippocampus is necessary for trace but not for delay conditioning, and Drosophila delay conditioning requires the Rutabaga adenylyl cyclase, which is dispensable in trace conditioning. It is still unknown how the brain encodes CS traces and how they are associated with a US in trace conditioning. Insects serve as powerful models to address the mechanisms underlying trace conditioning, due to their simple brain anatomy, behavioral accessibility and established methods of genetic interference. In this review we summarize the recent progress in insect trace conditioning on the behavioral and physiological level and emphasize similarities and differences compared to delay conditioning. Moreover, we examine proposed molecular and computational models and reassess different experimental approaches used for trace conditioning.

  11. BETHSY 9.1b Test Calculation with TRACE Using 3D Vessel Component

    International Nuclear Information System (INIS)

    Berar, O.; Prosek, A.

    2012-01-01

    Recently, several advanced multidimensional computational tools for simulating reactor system behaviour during real and hypothetical transient scenarios were developed. One such advanced, best-estimate reactor systems code is the TRAC/RELAP Advanced Computational Engine (TRACE), developed by the U.S. Nuclear Regulatory Commission. The advanced TRACE comes with a graphical user interface called SNAP (Symbolic Nuclear Analysis Package). It is intended for pre- and post-processing, running codes, RELAP5 to TRACE input deck conversion, input deck database generation etc. The TRACE code is still under development and will eventually have all the capabilities of RELAP5. The purpose of the present study was therefore to assess the 3D capability of TRACE on the BETHSY 9.1b test. The TRACE input deck was semi-converted (using SNAP and manual corrections) from the RELAP5 input deck. The 3D fluid dynamics within the reactor vessel was modelled and compared to 1D fluid dynamics. The 3D calculation was compared both to the TRACE 1D calculation and the RELAP5 calculation. Namely, the geometry used in TRACE is basically the same, which gives a very good basis for the comparison of the codes. The only exception is the 3D reactor vessel model in the case of the TRACE 3D calculation. TRACE V5.0 Patch 1 and RELAP5/MOD3.3 Patch 4 were used for the calculations. The BETHSY 9.1b test (International Standard Problem no. 27 or ISP-27) was a 5.08 cm equivalent diameter cold leg break without high pressure safety injection and with delayed ultimate procedure. The BETHSY facility was a 3-loop replica of a 900 MWe FRAMATOME pressurized water reactor. For better presentation of the calculated physical phenomena and processes, an animation model using SNAP was developed. In general, the TRACE 3D code calculation is in good agreement with the BETHSY 9.1b test. The TRACE 3D calculation results are as good as or better than the RELAP5 calculated results. Also, the TRACE 3D calculation is not significantly different from TRACE 1D

  12. Reproducible diagnosis of Chronic Lymphocytic Leukemia by flow cytometry

    DEFF Research Database (Denmark)

    Rawstron, Andy C; Kreuzer, Karl-Anton; Soosapilla, Asha

    2018-01-01

    The diagnostic criteria for CLL rely on morphology and immunophenotype. Current approaches have limitations affecting reproducibility and there is no consensus on the role of new markers. The aim of this project was to identify reproducible criteria and consensus on markers recommended for the di...

  13. Genotypic variability enhances the reproducibility of an ecological study.

    Science.gov (United States)

    Milcu, Alexandru; Puga-Freitas, Ruben; Ellison, Aaron M; Blouin, Manuel; Scheu, Stefan; Freschet, Grégoire T; Rose, Laura; Barot, Sebastien; Cesarz, Simone; Eisenhauer, Nico; Girin, Thomas; Assandri, Davide; Bonkowski, Michael; Buchmann, Nina; Butenschoen, Olaf; Devidal, Sebastien; Gleixner, Gerd; Gessler, Arthur; Gigon, Agnès; Greiner, Anna; Grignani, Carlo; Hansart, Amandine; Kayler, Zachary; Lange, Markus; Lata, Jean-Christophe; Le Galliard, Jean-François; Lukac, Martin; Mannerheim, Neringa; Müller, Marina E H; Pando, Anne; Rotter, Paula; Scherer-Lorenzen, Michael; Seyhun, Rahme; Urban-Mead, Katherine; Weigelt, Alexandra; Zavattaro, Laura; Roy, Jacques

    2018-02-01

    Many scientific disciplines are currently experiencing a 'reproducibility crisis' because numerous scientific findings cannot be repeated consistently. A novel but controversial hypothesis postulates that stringent levels of environmental and biotic standardization in experimental studies reduce reproducibility by amplifying the impacts of laboratory-specific environmental factors not accounted for in study designs. A corollary to this hypothesis is that a deliberate introduction of controlled systematic variability (CSV) in experimental designs may lead to increased reproducibility. To test this hypothesis, we had 14 European laboratories run a simple microcosm experiment using grass (Brachypodium distachyon L.) monocultures and grass and legume (Medicago truncatula Gaertn.) mixtures. Each laboratory introduced environmental and genotypic CSV within and among replicated microcosms established in either growth chambers (with stringent control of environmental conditions) or glasshouses (with more variable environmental conditions). The introduction of genotypic CSV led to 18% lower among-laboratory variability in growth chambers, indicating increased reproducibility, but had no significant effect in glasshouses where reproducibility was generally lower. Environmental CSV had little effect on reproducibility. Although there are multiple causes for the 'reproducibility crisis', deliberately including genetic variability may be a simple solution for increasing the reproducibility of ecological studies performed under stringently controlled environmental conditions.

  14. Participant Nonnaiveté and the reproducibility of cognitive psychology

    NARCIS (Netherlands)

    R.A. Zwaan (Rolf); D. Pecher (Diane); G. Paolacci (Gabriele); S. Bouwmeester (Samantha); P.P.J.L. Verkoeijen (Peter); K. Dijkstra (Katinka); R. Zeelenberg (René)

    2017-01-01

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature—three each from the domains of perception/action, memory, and language, respectively—and found that they are highly reproducible. Not only can

  15. Reproducing Kernels and Coherent States on Julia Sets

    Energy Technology Data Exchange (ETDEWEB)

    Thirulogasanthar, K., E-mail: santhar@cs.concordia.ca; Krzyzak, A. [Concordia University, Department of Computer Science and Software Engineering (Canada)], E-mail: krzyzak@cs.concordia.ca; Honnouvo, G. [Concordia University, Department of Mathematics and Statistics (Canada)], E-mail: g_honnouvo@yahoo.fr

    2007-11-15

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems.
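
    The construction summarized here follows the standard coherent-state recipe. In generic notation (not the paper's specific Julia-set family), an orthonormal family {φ_n} of square-integrable vectors yields

    ```latex
    % Standard reproducing-kernel / coherent-state construction (generic):
    K(z, w) = \sum_{n} \phi_n(z)\,\overline{\phi_n(w)}, \qquad
    |z\rangle = K(z, z)^{-1/2} \sum_{n} \overline{\phi_n(z)}\,|e_n\rangle .
    ```

    Each coherent state is then normalized; in such constructions the states typically resolve the identity with respect to a suitable measure on the domain.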

  16. Reproducing Kernels and Coherent States on Julia Sets

    International Nuclear Information System (INIS)

    Thirulogasanthar, K.; Krzyzak, A.; Honnouvo, G.

    2007-01-01

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems

  17. Completely reproducible description of digital sound data with cellular automata

    International Nuclear Information System (INIS)

    Wada, Masato; Kuroiwa, Jousuke; Nara, Shigetoshi

    2002-01-01

    A novel method of compressive and completely reproducible description of digital sound data by means of rule dynamics of CA (cellular automata) is proposed. The digital data of spoken words and music recorded with the standard format of a compact disk are reproduced completely by this method with use of only two rules in a one-dimensional CA without loss of information
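
    For readers unfamiliar with rule dynamics, one update step of a one-dimensional binary CA can be sketched as follows (a generic elementary-CA step with periodic boundaries; the paper's particular rules and sound-encoding scheme are not reproduced here).

    ```python
    import numpy as np

    def ca_step(state, rule):
        """One synchronous update of a 1-D binary CA under an elementary
        (radius-1) rule given as a Wolfram rule number 0..255."""
        rule_bits = [(rule >> i) & 1 for i in range(8)]
        left, right = np.roll(state, 1), np.roll(state, -1)
        idx = 4 * left + 2 * state + right   # neighbourhood as a 3-bit index
        return np.array([rule_bits[i] for i in idx])

    state = np.random.randint(0, 2, 64)
    state = ca_step(state, rule=90)  # rule 90: XOR of the two neighbours
    ```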

  18. Coding of Stimuli by Animals: Retrospection, Prospection, Episodic Memory and Future Planning

    Science.gov (United States)

    Zentall, Thomas R.

    2010-01-01

    When animals code stimuli for later retrieval they can either code them in terms of the stimulus presented (as a retrospective memory) or in terms of the response or outcome anticipated (as a prospective memory). Although retrospective memory is typically assumed (as in the form of a memory trace), evidence of prospective coding has been found…

  19. Extensible, Reusable, and Reproducible Computing: A Case Study of PySPH

    International Nuclear Information System (INIS)

    Ramachandran, Prabhu

    2016-01-01

    In this work, the Smoothed Particle Hydrodynamics (SPH) technique is considered as an example of a typical computational research area. PySPH is an open source framework for SPH computations. PySPH is designed to be easy to use. The framework allows a user to implement an entire simulation in pure Python. It is designed to make it easy for scientists to reuse their code and extend the work of others. These important features allow PySPH to facilitate reproducible computational research. Based on the experience with PySPH, general recommendations are suggested for other computational researchers. (paper)

  20. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...

  1. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  2. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  3. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.
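
    As background, generating one output symbol of an LT code is easy to sketch: draw a degree from the code's degree distribution, choose that many source blocks uniformly, and XOR them. The paper's contribution is new degree distributions that change as feedback arrives; the weights below are an illustrative placeholder.

    ```python
    import random

    def lt_encode_symbol(source_blocks, degree_weights):
        """One LT-code output symbol: sample a degree, pick that many
        distinct source blocks uniformly at random, XOR them together."""
        degree = random.choices(range(1, len(degree_weights) + 1),
                                weights=degree_weights)[0]
        chosen = random.sample(range(len(source_blocks)), degree)
        value = 0
        for i in chosen:
            value ^= source_blocks[i]
        return chosen, value

    blocks = [0b1010, 0b0110, 0b1111, 0b0001]
    print(lt_encode_symbol(blocks, degree_weights=[0.4, 0.3, 0.2, 0.1]))
    ```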

  4. TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES

    Energy Technology Data Exchange (ETDEWEB)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver, E-mail: jasmina@physics.ucf.edu [Planetary Sciences Group, Department of Physics, University of Central Florida, Orlando, FL 32816-2385 (United States)

    2016-07-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
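
    The constrained-minimization structure behind such a calculation can be illustrated with a toy two-species example solved by a general-purpose optimizer. This is not TEA's iterative Lagrangian scheme, and the chemical-potential numbers are made up; it only shows "minimize G subject to elemental abundance balance".

    ```python
    import numpy as np
    from scipy.optimize import minimize

    species = ["H2", "H"]
    mu0 = np.array([-10.0, -5.0])   # hypothetical mu/RT at fixed T and P
    A = np.array([[2.0, 1.0]])      # H atoms contributed by each species
    b = np.array([2.0])             # total elemental H abundance

    def gibbs(n):
        """Dimensionless Gibbs energy of an ideal-gas mixture."""
        n = np.clip(n, 1e-12, None)
        return np.sum(n * (mu0 + np.log(n / n.sum())))

    res = minimize(gibbs, x0=np.array([0.9, 0.2]),
                   bounds=[(1e-12, None)] * len(species),
                   constraints=[{"type": "eq", "fun": lambda n: A @ n - b}])
    print(dict(zip(species, res.x)))  # mostly H2 at these toy potentials
    ```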

  5. TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES

    International Nuclear Information System (INIS)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver

    2016-01-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.

  6. IRSN Code of Ethics and Professional Conduct. Annex VII [TSO Mission Statement and Code of Ethics

    International Nuclear Information System (INIS)

    2018-01-01

    IRSN has adopted, in 2013, a Code of Ethics and Professional Conduct, the contents of which are summarized. As a preamble, it is indicated that the Code, which was adopted in 2013 by the Ethics Commission of IRSN and the Board of IRSN, complies with relevant constitutional and legal requirements. The introduction to the Code presents the role and missions of IRSN in the French system, as well as the various conditions and constraints that frame its action, in particular with respect to ethical issues. It states that the Code sets principles and establishes guidance for addressing these constraints and resolving conflicts that may arise, thus constituting references for the Institute and its staff, and helping IRSN’s partners in their interaction with the Institute. The stipulations of the Code are organized in four articles, reproduced and translated.

  7. Traces generating what was there

    CERN Document Server

    2017-01-01

    Traces keep time contained and make visible what was there. Going back to the art of trace-reading, they continue to be a fundamental resource for scientific knowledge production. The contributions study the techniques involved in the production of material traces, from the biology laboratory to the large colliders of particle physics. Following their changes over two centuries, this collection shows the continuities they have in the digital age.

  8. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AVS2.

  9. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skills.

  10. Trace Mineral Losses in Sweat

    National Research Council Canada - National Science Library

    Chinevere, Troy D; McClung, James P; Cheuvront, Samuel N

    2007-01-01

    Copper, iron and zinc are nutritionally essential trace minerals that confer vital biological roles including the maintenance of cell structure and integrity, regulation of metabolism, immune function...

  11. Trace analysis of semiconductor materials

    CERN Document Server

    Cali, J Paul; Gordon, L

    1964-01-01

    Trace Analysis of Semiconductor Materials is a guidebook concerned with procedures of ultra-trace analysis. This book discusses six distinct techniques of trace analysis. These techniques are the most common and, compared with other methods, can be applied to the widest range of problems. Each of the four chapters includes an introduction to the principles and general statements. The theoretical basis for the technique involved is then briefly discussed. Practical applications of the techniques and the different instrumentations are explained. Then, the applications to trace analysis as pertaining

  12. Trace impurity analyzer

    International Nuclear Information System (INIS)

    Schneider, W.J.; Edwards, D. Jr.

    1979-01-01

    The desirability of long-term reliability for large-scale helium refrigerator systems used on superconducting accelerator magnets has necessitated detection of impurities to levels of a few ppm. An analyzer that measures trace impurity levels of condensable contaminants in concentrations of less than a ppm in 15 atm of He is described. The instrument makes use of the desorption temperature at an indicated pressure of the various impurities to determine the type of contaminant. The pressure rise at that temperature yields a measure of the contaminant level of the impurity. An LN2 cryogenic charcoal trap is also employed to measure air impurities (nitrogen and oxygen) to obtain the full range of contaminant possibilities. The results of this detector, which will be in use on the research and development helium refrigerator of the ISABELLE First-Cell, are described.

  13. Best Practices for Computational Science: Software Infrastructure and Environments for Reproducible and Extensible Research

    Directory of Open Access Journals (Sweden)

    Victoria Stodden

    2014-07-01

    The goal of this article is to coalesce a discussion around best practices for scholarly research that utilizes computational methods, by providing a formalized set of best practice recommendations to guide computational scientists and other stakeholders wishing to disseminate reproducible research, facilitate innovation by enabling data and code re-use, and enable broader communication of the output of computational scientific research. Scholarly dissemination and communication standards are changing to reflect the increasingly computational nature of scholarly research, primarily to include the sharing of the data and code associated with published results. We also present these Best Practices as a living, evolving, and changing document at http://wiki.stodden.net/Best_Practices.

  14. Effect of Initial Conditions on Reproducibility of Scientific Research

    Science.gov (United States)

    Djulbegovic, Benjamin; Hozo, Iztok

    2014-01-01

    Background: It is estimated that about half of currently published research cannot be reproduced. Many reasons have been offered to explain failures to reproduce scientific research findings, from fraud to issues related to the design, conduct, analysis, or publishing of scientific research. We also postulate a sensitive dependency on initial conditions, by which small changes can result in large differences in the research findings when reproduction is attempted at later times. Methods: We employed a simple logistic regression equation to model the effect of covariates on the initial study findings. We then fed the input from the logistic equation into a logistic map function to model stability of the results in repeated experiments over time. We illustrate the approach by modeling effects of different factors on the choice of correct treatment. Results: We found that reproducibility of the study findings depended both on the initial values of all independent variables and on the rate of change in the baseline conditions, the latter being more important. When the rate of change in the baseline conditions between experiments lies between about 3.5 and 4, no research findings could be reproduced. However, when the rate of change between the experiments is ≤2.5, the results become highly predictable between the experiments. Conclusions: Many results cannot be reproduced because of changes in the initial conditions between the experiments. Better control of the baseline conditions between experiments may help improve reproducibility of scientific findings. PMID:25132705
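
    The logistic-map behaviour the authors invoke can be sketched in a few lines: for a growth parameter r ≤ 2.5 repeated "experiments" converge to the same value regardless of small perturbations, while for r between about 3.5 and 4 nearby initial conditions diverge. The parameter and initial values below are illustrative only:

    # Sketch of the logistic-map intuition: x_{n+1} = r * x_n * (1 - x_n).
    def logistic_trajectory(r, x0, steps=50):
        x = x0
        for _ in range(steps):
            x = r * x * (1.0 - x)
        return x

    for r in (2.5, 3.9):
        a = logistic_trajectory(r, 0.400)
        b = logistic_trajectory(r, 0.401)   # slightly perturbed initial condition
        print(f"r={r}: x0=0.400 -> {a:.6f}, x0=0.401 -> {b:.6f}, |diff|={abs(a - b):.6f}")

    At r = 2.5 both trajectories settle on the same fixed point; at r = 3.9 the tiny perturbation produces a completely different outcome, which is the paper's picture of irreproducibility.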

  15. ImageJS: Personalized, participated, pervasive, and reproducible image bioinformatics in the web browser.

    Science.gov (United States)

    Almeida, Jonas S; Iriabho, Egiebade E; Gorrepati, Vijaya L; Wilkinson, Sean R; Grüneberg, Alexander; Robbins, David E; Hackney, James R

    2012-01-01

    Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local "download and installation".

  16. ImageJS: Personalized, participated, pervasive, and reproducible image bioinformatics in the web browser

    Directory of Open Access Journals (Sweden)

    Jonas S Almeida

    2012-01-01

    Background: Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. Materials and Methods: ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Results: Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. Conclusions: The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local "download and installation".

  17. Reproducible and controllable induction voltage adder for scaled beam experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko [Department of Energy Sciences, Tokyo Institute of Technology, 4259 Nagatsuta, Midori-ku, Yokohama 226-8502 (Japan)

    2016-08-15

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments.

  18. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  19. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided for 64 bit on Mac, Linux, and Windows.

  20. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Among the earliest discovered codes that approach the Shannon limit of the channel were the low-density parity-check (LDPC) codes. The term "low density" arises from a property of the parity-check matrix defining the code. We will now define this matrix and the role that it plays in decoding.
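
    To make the parity-check matrix concrete, the sketch below uses the small (7,4) Hamming code as a stand-in: real LDPC matrices are far larger and sparser, but the decoding role of H (checking H·c = 0 mod 2 and reading the syndrome) is the same idea:

    import numpy as np

    # Parity-check matrix of the (7,4) Hamming code: column i is the binary
    # representation of i (LSB in the first row).
    H = np.array([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])

    codeword = np.array([1, 0, 1, 1, 0, 1, 0])   # satisfies H @ c = 0 (mod 2)
    received = codeword.copy()
    received[4] ^= 1                              # the channel flips one bit

    syndrome = H @ received % 2                   # nonzero syndrome flags an error
    # For this H, the syndrome read as binary (LSB first) is the 1-based
    # position of the flipped bit.
    pos = int(syndrome @ np.array([1, 2, 4]))
    corrected = received.copy()
    corrected[pos - 1] ^= 1
    print("syndrome:", syndrome, "-> flip position", pos)
    print("recovered == sent:", np.array_equal(corrected, codeword))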

  1. Manually operated coded switch

    International Nuclear Information System (INIS)

    Barnette, J.H.

    1978-01-01

    The disclosure related to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made

  2. Benchmarking the Multidimensional Stellar Implicit Code MUSIC

    Science.gov (United States)

    Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.

    2017-04-01

    We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton-Krylov method. A physics-based preconditioning technique, which can be adjusted to target varying physics, is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities, and the decay of the Taylor-Green vortex. Additionally, we show a test of hydrostatic equilibrium in a stellar environment that is dominated by radiative effects. In this setting the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed and analysed. Each of these test cases was analysed with a simple scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes, including ATHENA and the PENCIL code. MUSIC is able both to reproduce the behaviour of established and widely used codes and to match results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.
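
    A minimal sketch of the Jacobian-free Newton-Krylov idea mentioned above, using SciPy's generic newton_krylov solver on a toy implicit diffusion step; this is not MUSIC's implementation, and the grid size and tolerances are arbitrary choices:

    import numpy as np
    from scipy.optimize import newton_krylov

    # One backward-Euler step of 1-D diffusion, u_t = u_xx, posed as F(u) = 0.
    N, dt, dx = 50, 1e-3, 1.0 / 50
    u_old = np.sin(np.pi * np.linspace(0, 1, N))  # initial condition

    def residual(u):
        # F(u) = u - u_old - dt * Laplacian(u), with fixed boundary values.
        lap = np.zeros_like(u)
        lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
        return u - u_old - dt * lap

    # newton_krylov approximates Jacobian-vector products by finite differences,
    # so the Jacobian is never formed explicitly ("Jacobian-free").
    u_new = newton_krylov(residual, u_old, f_tol=1e-10)
    print("max change over one implicit step:", np.abs(u_new - u_old).max())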

  3. Olfactory memory traces in Drosophila.

    Science.gov (United States)

    Berry, Jacob; Krause, William C; Davis, Ronald L

    2008-01-01

    In Drosophila, the fruit fly, coincident exposure to an odor and an aversive electric shock can produce robust behavioral memory. This behavioral memory is thought to be regulated by cellular memory traces within the central nervous system of the fly. These molecular, physiological, or structural changes in neurons, induced by pairing odor and shock, regulate behavior by altering the neurons' response to the learned environment. Recently, novel in vivo functional imaging techniques have allowed researchers to observe cellular memory traces in intact animals. These investigations have revealed interesting temporal and spatial dynamics of cellular memory traces. First, a short-term cellular memory trace was discovered that exists in the antennal lobe, an early site of olfactory processing. This trace represents the recruitment of new synaptic activity into the odor representation and forms for only a short period of time just after training. Second, an intermediate-term cellular memory trace was found in the dorsal paired medial neuron, a neuron thought to play a role in stabilizing olfactory memories. Finally, a long-term protein synthesis-dependent cellular memory trace was discovered in the mushroom bodies, a structure long implicated in olfactory learning and memory. Therefore, it appears that aversive olfactory associations are encoded by multiple cellular memory traces that occur in different regions of the brain with different temporal domains.

  4. Coding in Muscle Disease.

    Science.gov (United States)

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  5. TRACE/PARCS modelling of RIPs trip transients for Lungmen ABWR

    Energy Technology Data Exchange (ETDEWEB)

    Chang, C. Y. [Inst. of Nuclear Engineering and Science, National Tsing-Hua Univ., No.101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Lin, H. T.; Wang, J. R. [Inst. of Nuclear Energy Research, No. 1000, Wenhua Rd., Longtan Township, Taoyuan County 32546, Taiwan (China); Shih, C. [Inst. of Nuclear Engineering and Science, Dept. of Engineering and System Science, National Tsing-Hua Univ., No.101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China)

    2012-07-01

    The objectives of this study are to examine how the steady-state results calculated by the Lungmen TRACE/PARCS model compare with those of the SIMULATE-3 code, and to use the analytical results of the final safety analysis report (FSAR) to benchmark the Lungmen TRACE/PARCS model. In this study, three power generation methods in TRACE were utilized to analyze the trip transient of three reactor internal pumps (RIPs) for the purpose of validating the TRACE/PARCS model. In general, the comparisons show that the transient responses of key system parameters agree well with the FSAR results, including core power, core inlet flow, reactivity, etc. Further studies will be performed in the future using the Lungmen TRACE/PARCS model. After the commercial operation of the Lungmen nuclear power plant begins, the TRACE/PARCS model will be verified. (authors)

  6. Reproducibility of corneal, macular and retinal nerve fiber layer ...

    African Journals Online (AJOL)

    ...side the limits of a consulting room. Reproducibility of ... examination, intraocular pressure and corneal thickness ... All OCT measurements were taken between 2 and 5 pm ... CAS-OCT, Slit-lamp OCT, RTVue-100) have shown ICC.

  7. Beyond Bundles - Reproducible Software Environments with GNU Guix

    CERN Multimedia

    CERN. Geneva; Wurmus, Ricardo

    2018-01-01

    Building reproducible data analysis pipelines and numerical experiments is a key challenge for reproducible science, in which tools to reproduce software environments play a critical role. The advent of “container-based” deployment tools such as Docker and Singularity has made it easier to replicate software environments. These tools are very much about bundling the bits of software binaries in a convenient way, not so much about describing how software is composed. Science is not just about replicating, though—it demands the ability to inspect and to experiment. In this talk we will present GNU Guix, a software management toolkit. Guix departs from container-based solutions in that it enables declarative composition of software environments. It is comparable to “package managers” like apt or yum, but with a significant difference: Guix provides accurate provenance tracking of build artifacts, and bit-reproducible software. We will illustrate the many ways in which Guix can improve how software en...

  8. The reproducibility of random amplified polymorphic DNA (RAPD ...

    African Journals Online (AJOL)

    RAPD) profiles of Streptococcus thermophilus strains by using the polymerase chain reaction (PCR). Several factors can cause the amplification of false and non-reproducible bands in the RAPD profiles. We tested three primers, OPI-02 MOD, ...

  9. Assessment of intercentre reproducibility and epidemiological concordance of Legionella pneumophila serogroup 1 genotyping by amplified fragment length polymorphism analysis

    DEFF Research Database (Denmark)

    Fry, N K; Bangsborg, Jette Marie; Bernander, S

    2000-01-01

    The aims of this work were to assess (i) the intercentre reproducibility and epidemiological concordance of amplified fragment length polymorphism analysis for epidemiological typing of Legionella pneumophila serogroup 1, and (ii) the suitability of the method for standardisation and implementation by members of the European Working Group on Legionella Infections. Fifty coded isolates comprising two panels of well-characterised strains, a "reproducibility" panel (n=20) and an "epidemiologically related" panel (n=30), were sent to 13 centres in 12 European countries. Analysis was undertaken in each centre; analysis using gel analysis software yielded R=1.00 and E=1.00, with 12, 13 or 14 types. This method can be used as a simple, rapid screening tool for epidemiological typing of isolates of Legionella pneumophila serogroup 1. Results demonstrate that the method can be highly reproducible (R=1...

  10. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…
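
    The capacity difference mentioned above is easy to see in code. The sketch below uses the third-party qrcode package (assumed installed, e.g. via pip install "qrcode[pil]", and assumed to expose its usual API); the URL and padded payload are arbitrary examples:

    import qrcode

    data = "https://worldwidescience.org/" + "x" * 500   # far beyond a bar code's ~20 digits
    qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_M)
    qr.add_data(data)
    qr.make(fit=True)          # let the library pick the smallest version that fits
    print("QR version chosen:", qr.version)   # grows with payload, up to version 40
    qr.make_image().save("demo_qr.png")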

  11. Systematic heterogenization for better reproducibility in animal experimentation.

    Science.gov (United States)

    Richter, S Helene

    2017-08-31

    The scientific literature is full of articles discussing poor reproducibility of findings from animal experiments as well as failures to translate results from preclinical animal studies to clinical trials in humans. Critics even go so far as to talk about a "reproducibility crisis" in the life sciences, a novel headword that increasingly finds its way into numerous high-impact journals. Viewed from a cynical perspective, Fett's law of the lab "Never replicate a successful experiment" has thus taken on a completely new meaning. So far, poor reproducibility and translational failures in animal experimentation have mostly been attributed to biased animal data, methodological pitfalls, current publication ethics and animal welfare constraints. More recently, the concept of standardization has also been identified as a potential source of these problems. By reducing within-experiment variation, rigorous standardization regimes limit the inference to the specific experimental conditions. In this way, however, individual phenotypic plasticity is largely neglected, resulting in statistically significant but possibly irrelevant findings that are not reproducible under slightly different conditions. By contrast, systematic heterogenization has been proposed as a concept to improve representativeness of study populations, contributing to improved external validity and hence improved reproducibility. While some first heterogenization studies are indeed very promising, it is still not clear how this approach can be transferred into practice in a logistically feasible and effective way. Thus, further research is needed to explore different heterogenization strategies as well as alternative routes toward better reproducibility in animal experimentation.

  12. Tracing Geothermal Fluids

    Energy Technology Data Exchange (ETDEWEB)

    Michael C. Adams; Greg Nash

    2004-03-01

    Geothermal water must be injected back into the reservoir after it has been used for power production. Injection is critical in maximizing the power production and lifetime of the reservoir. To use injectate effectively, the direction and velocity of the injected water must be known or inferred. This information can be obtained by using chemical tracers to track the subsurface flow paths of the injected fluid. Tracers are chemical compounds that are added to the water as it is injected back into the reservoir. The hot production water is monitored for the presence of this tracer using the most sensitive analytic methods that are economically feasible. The amount and concentration pattern of the tracer revealed by this monitoring can be used to evaluate how effective the injection strategy is. However, the tracers must have properties that suit the environment in which they will be used. This requires careful consideration and testing of the tracer properties. In previous and parallel investigations we have developed tracers that are suitable for tracing liquid water. In this investigation, we developed tracers that can be used for steam and mixed water/steam environments. This work will improve the efficiency of injection management in geothermal fields, lowering the cost of energy production and increasing the power output of these systems.

  13. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  14. Reproducing ten years of road ageing - Accelerated carbonation and leaching of EAF steel slag

    International Nuclear Information System (INIS)

    Suer, Pascal; Lindqvist, Jan-Erik; Arm, Maria; Frogner-Kockum, Paul

    2009-01-01

    Reuse of industrial aggregates is still hindered by concern for their long-term properties. This paper proposes a laboratory method for accelerated ageing of steel slag, to predict environmental and technical properties, starting from fresh slag. Ageing processes in a 10-year old asphalt road with steel slag of electric arc furnace (EAF) type in the subbase were identified by scanning electron microscopy (SEM) and leaching tests. Samples from the road centre and the pavement edge were compared with each other and with samples of fresh slag. It was found that slag from the pavement edge showed traces of carbonation and leaching processes, whereas the road centre material was nearly identical to fresh slag, in spite of an accessible particle structure. Batches of moisturized road centre material exposed to oxygen, nitrogen or carbon dioxide (CO2) were used for accelerated ageing. Time (7-14 days), temperature (20-40 °C) and initial slag moisture content (8-20%) were varied to achieve the carbonation (decrease in pH) and leaching that was observed in the pavement edge material. After ageing, water was added to assess leaching of metals and macroelements. 12% moisture, CO2 and seven days at 40 °C gave the lowest pH value. This also reproduced the observed ageing effect for Ca, Cu, Ba, Fe, Mn, Pb (decreased leaching) and for V, Si, and Al (increased leaching). However, ageing effects on SO4, DOC and Cr were not reproduced.

  15. Modernized Approach for Generating Reproducible Heterogeneity Using Transmitted-Light for Flow Visualization Experiments

    Science.gov (United States)

    Jones, A. A.; Holt, R. M.

    2017-12-01

    Image capturing in flow experiments has been used for fluid mechanics research since the early 1970s. Interactions of fluid flow between the vadose zone and permanent water table are of great interest because this zone is responsible for all recharge waters, pollutant transport and irrigation efficiency for agriculture. Griffith, et al. (2011) developed an approach where constructed reproducible "geologically realistic" sand configurations are deposited in sandfilled experimental chambers for light-transmitted flow visualization experiments. This method creates reproducible, reverse graded, layered (stratified) thin-slab sand chambers for point source experiments visualizing multiphase flow through porous media. Reverse-graded stratification of sand chambers mimic many naturally occurring sedimentary deposits. Sandfilled chambers use light as nonintrusive tools for measuring water saturation in two-dimensions (2-D). Homogeneous and heterogeneous sand configurations can be produced to visualize the complex physics of the unsaturated zone. The experimental procedure developed by Griffith, et al. (2011) was designed using now outdated and obsolete equipment. We have modernized this approach with new Parker Deadel linear actuator and programed projects/code for multiple configurations. We have also updated the Roper CCD software and image processing software with the latest in industry standards. Modernization of transmitted-light source, robotic equipment, redesigned experimental chambers, and newly developed analytical procedures have greatly reduced time and cost per experiment. We have verified the ability of the new equipment to generate reproducible heterogeneous sand-filled chambers and demonstrated the functionality of the new equipment and procedures by reproducing several gravity-driven fingering experiments conducted by Griffith (2008).

  16. Suspension of the NAB Code and Its Effect on Regulation of Advertising.

    Science.gov (United States)

    Maddox, Lynda M.; Zanot, Eric J.

    1984-01-01

    Traces events leading to the suspension of the Television Code of the National Association of Broadcasters in 1982 and looks at changes that have occurred in the informal and formal regulation of advertising as a result of that suspension. (FL)

  17. A Denotational Semantics for Communicating Unstructured Code

    Directory of Open Access Journals (Sweden)

    Nils Jähnig

    2015-03-01

    An important property of programming language semantics is that they should be compositional. However, unstructured low-level code contains goto-like commands making it hard to define a semantics that is compositional. In this paper, we follow the ideas of Saabas and Uustalu to structure low-level code. This gives us the possibility to define a compositional denotational semantics based on least fixed points to allow for the use of inductive verification methods. We capture the semantics of communication using finite traces similar to the denotations of CSP. In addition, we examine properties of this semantics and give an example that demonstrates reasoning about communication and jumps. With this semantics, we lay the foundations for a proof calculus that captures both the semantics of unstructured low-level code and communication.
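
    As a toy illustration of trace-collecting semantics for unstructured code, the sketch below interprets a hypothetical goto-style mini-language and records its communication events as a finite trace. It is an operational sketch under invented syntax, not the paper's denotational least-fixed-point construction:

    # program: {label: [instr, ...]}; instr is ('out', v), ('goto', l), or ('halt',).
    def run(program, entry, fuel=100):
        """Execute from `entry`, returning the finite communication trace observed."""
        trace, label = [], entry
        while fuel > 0:
            for instr in program[label]:
                fuel -= 1
                if instr[0] == "out":          # communication event: extend the trace
                    trace.append(instr[1])
                elif instr[0] == "goto":       # unstructured jump to another block
                    label = instr[1]
                    break
                elif instr[0] == "halt":
                    return trace
            else:
                return trace                   # fell off the end of a block
        return trace                           # fuel exhausted: finite prefix only

    prog = {
        "start": [("out", "a"), ("goto", "loop")],
        "loop":  [("out", "b"), ("goto", "loop")],   # diverges; we observe prefixes
    }
    print(run(prog, "start", fuel=8))   # ['a', 'b', 'b', 'b'] -- a finite trace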

  18. Computer program for optical systems ray tracing

    Science.gov (United States)

    Ferguson, T. J.; Konn, H.

    1967-01-01

    Program traces rays of light through optical systems consisting of up to 65 different optical surfaces and computes the aberrations. For design purposes, paraxial tracings with astigmatism and third-order tracings are provided.
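
    The paraxial tracing such a program performs can be illustrated with the standard 2x2 transfer-matrix scheme. The sketch below is a generic textbook construction with made-up lens parameters, not the original program:

    import numpy as np

    # A paraxial ray is the column vector (height y, angle u); surfaces
    # refract, gaps translate, and matrices compose right-to-left.
    def refraction(n1, n2, R):
        """Paraxial refraction at a spherical surface of radius R."""
        return np.array([[1.0, 0.0],
                         [-(n2 - n1) / (n2 * R), n1 / n2]])

    def transfer(d):
        """Free propagation over an axial distance d."""
        return np.array([[1.0, d],
                         [0.0, 1.0]])

    # Biconvex lens in air: R1 = +50 mm, R2 = -50 mm, n = 1.5, 5 mm thick
    # (illustrative numbers).
    lens = refraction(1.5, 1.0, -50.0) @ transfer(5.0) @ refraction(1.0, 1.5, 50.0)

    ray_in = np.array([10.0, 0.0])            # parallel ray at height 10 mm
    y1, u1 = lens @ ray_in
    print(f"after lens: y = {y1:.3f} mm, u = {u1:.5f} rad")

    # Propagating ~49.2 mm (the back focal distance these numbers give)
    # brings a parallel input ray back near the axis, as expected at focus.
    y2, u2 = transfer(49.2) @ lens @ ray_in
    print(f"near focus: y = {y2:.4f} mm")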

  19. Trace formulae for arithmetical systems

    International Nuclear Information System (INIS)

    Bogomolny, E.B.; Georgeot, B.; Giannoni, M.J.; Schmit, C.

    1992-09-01

    For quantum problems on the pseudo-sphere generated by arithmetic groups there exist special trace formulae, called trace formulae for Hecke operators, which permit the reconstruction of wave functions from the knowledge of periodic orbits. After a short discussion of this subject, the Hecke-operator trace formulae are presented for the Dirichlet problem on the modular billiard, which is a prototype of arithmetical systems. The results of numerical computations for these semiclassical-type relations are in good agreement with the directly computed eigenfunctions. (author)
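
    For readers unfamiliar with trace formulae, the generic semiclassical (Gutzwiller-type) relation below shows the structure such formulae share: a smooth term plus an oscillatory sum over classical periodic orbits p and their repetitions r. It is given for orientation only; the Hecke-operator formulae of this record are arithmetic refinements of the idea and are not reproduced here.

    % Schematic Gutzwiller-type trace formula for the density of states,
    % with period T_p, action S_p, monodromy matrix M_p and Maslov index
    % sigma_p of each primitive periodic orbit p.
    \[
      \rho(E) \;\approx\; \bar{\rho}(E)
      \;+\; \frac{1}{\pi\hbar} \sum_{p} \sum_{r=1}^{\infty}
      \frac{T_p}{\sqrt{\left|\det\!\left(M_p^{\,r} - I\right)\right|}}
      \cos\!\left( \frac{r S_p(E)}{\hbar} - r\,\sigma_p \frac{\pi}{2} \right)
    \]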

  20. Reproducibility of computer-aided detection system in digital mammograms

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Cho, Nariya; Cha, Joo Hee; Chung, Hye Kyung; Lee, Sin Ho; Cho, Kyung Soo; Kim, Sun Mi; Moon, Woo Kyung

    2005-01-01

    To evaluate the reproducibility of the computer-aided detection (CAD) system for digital mammograms. We applied the CAD system (ImageChecker M1000-DM, version 3.1; R2 Technology) to full-field digital mammograms. These mammograms were taken twice at an interval of 10-45 days (mean: 25 days) for 34 preoperative patients (breast cancer n=27, benign disease n=7; age range: 20-66 years, mean age: 47.9 years). On the mammograms, lesions were visible in 19 patients and these were depicted as 15 masses and 12 calcification clusters. We analyzed the sensitivity, the false positive rate (FPR) and the reproducibility of the CAD marks. The broader sensitivities of the CAD system were 80% (12 of 15) and 67% (10 of 15) for masses, and those for calcification clusters were 100% (12 of 12). The strict sensitivities were 50% (15 of 30) and 50% (15 of 30) for masses and 92% (22 of 24) and 79% (19 of 24) for the clusters. The FPR for the masses was 0.21-0.22/image, the FPR for the clusters was 0.03-0.04/image and the total FPR was 0.24-0.26/image. Among 132 mammography images, the images that were identical regardless of the existence of CAD marks comprised 59% (78 of 132), and the identical images with CAD marks comprised 22% (15 of 69). The reproducibility of the CAD marks for the true positive mass was 67% (12 of 18) and 71% (17 of 24) for the true positive cluster. The reproducibility of CAD marks for the false positive mass was 8% (4 of 53), and the reproducibility of CAD marks for the false positive clusters was 14% (1 of 7). The reproducibility of the total mass marks was 23% (16 of 71), and the reproducibility of the total cluster marks was 58% (18 of 31). The CAD system showed higher sensitivity and reproducibility of CAD marks for the calcification clusters, which are related to breast cancer. Yet the overall reproducibility of CAD marks was low; therefore, the CAD system must be applied considering this limitation.

  1. Development of code PRETOR for stellarator simulation

    International Nuclear Information System (INIS)

    Dies, J.; Fontanet, J.; Fontdecaba, J.M.; Castejon, F.; Alejandre, C.

    1998-01-01

    The Department de Fisica i Enginyeria Nuclear (DFEN) of the UPC has experience in the development of the transport code PRETOR. This code has been validated with shots of DIII-D, JET and TFTR, and it has also been used in the simulation of operational scenarios of ITER fast burn termination. Recently, the association EURATOM-CIEMAT has started the operation of the TJ-II stellarator. Because the results given by other transport codes applied to stellarators need validation, and because all of them make some approximations, such as averaging magnitudes over each magnetic surface, it was thought suitable to adapt the PRETOR code to devices without axial symmetry, like stellarators, to meet the specific needs of the study of TJ-II. Several modifications are required in PRETOR; the main ones concern the models of magnetic equilibrium, geometry, and transport of energy and particles. In order to handle the complex magnetic equilibrium geometry, the powerful numerical code VMEC has been used. This code gives the magnetic surface shape as a Fourier series in terms of the harmonics (m,n). Most of the geometric magnitudes are also obtained from the VMEC results file. The energy and particle transport models will be replaced by other phenomenological models that are better adapted to stellarator simulation. Using the proposed models, the aim is to reproduce experimental data available from present stellarators, giving special attention to the TJ-II of the association EURATOM-CIEMAT. (Author)
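
    To make the Fourier representation concrete, the sketch below evaluates a VMEC-style surface shape from a handful of (m, n) harmonics. The coefficients and field-period count are invented for illustration; they are not TJ-II data or actual VMEC output:

    import numpy as np

    NFP = 4  # number of field periods (stellarator symmetry), illustrative
    # (m, n): coefficient -- a crude three-harmonic "stellarator-like" surface
    rmnc = {(0, 0): 1.50, (1, 0): 0.25, (1, 1): 0.05}   # R ~ sum cos(m*t - n*NFP*z)
    zmns = {(1, 0): 0.25, (1, 1): 0.05}                  # Z ~ sum sin(m*t - n*NFP*z)

    def surface_point(theta, zeta):
        """Cylindrical (R, Z) of the surface at angles (theta, zeta)."""
        R = sum(c * np.cos(m * theta - n * NFP * zeta) for (m, n), c in rmnc.items())
        Z = sum(c * np.sin(m * theta - n * NFP * zeta) for (m, n), c in zmns.items())
        return R, Z

    # Cross-sections at two toroidal angles show the rotating, non-axisymmetric shape.
    for zeta in (0.0, np.pi / (2 * NFP)):
        pts = [surface_point(t, zeta) for t in np.linspace(0, 2 * np.pi, 5)]
        print(f"zeta={zeta:.3f}:", [(round(R, 3), round(Z, 3)) for R, Z in pts])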

  2. Unveiling Exception Handling Bug Hazards in Android Based on GitHub and Google Code Issues

    NARCIS (Netherlands)

    Coelho, R.; Almeida, L.; Gousios, G.; Van Deursen, A.

    2015-01-01

    This paper reports on a study mining the exception stack traces included in 159,048 issues reported on Android projects hosted in GitHub (482 projects) and Google Code (157 projects). The goal of this study is to investigate whether stack trace information can reveal bug hazards related to exception handling.
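
    A sketch of the kind of extraction such a study requires: pulling Java-style exception names out of free-form issue text with a regular expression and tallying them. This is an illustrative heuristic with made-up issue text, not the authors' pipeline:

    import re

    issue_text = """
    App crashes on rotate. Log attached:
    java.lang.NullPointerException: Attempt to invoke virtual method
        at com.example.app.MainActivity.onCreate(MainActivity.java:42)
        at android.app.Activity.performCreate(Activity.java:5990)
    Caused by: java.lang.IllegalStateException: Fragment not attached
        at com.example.app.util.Session.get(Session.java:17)
    """

    # Match fully qualified exception/error names at the start of a line,
    # optionally preceded by "Caused by:".
    exception_header = re.compile(
        r"^\s*(?:Caused by:\s*)?((?:[a-z_][\w$]*\.)+[A-Z][\w$]*(?:Exception|Error))",
        re.MULTILINE)

    counts = {}
    for exc in exception_header.findall(issue_text):
        counts[exc] = counts.get(exc, 0) + 1
    print(counts)
    # {'java.lang.NullPointerException': 1, 'java.lang.IllegalStateException': 1}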

  3. Using prediction markets to estimate the reproducibility of scientific research

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988

  4. Validation and reproducibility of an Australian caffeine food frequency questionnaire.

    Science.gov (United States)

    Watson, E J; Kohler, M; Banks, S; Coates, A M

    2017-08-01

    The aim of this study was to measure validity and reproducibility of a caffeine food frequency questionnaire (C-FFQ) developed for the Australian population. The C-FFQ was designed to assess average daily caffeine consumption using four categories of food and beverages including: energy drinks; soft drinks/soda; coffee and tea; and chocolate (food and drink). Participants completed a seven-day food diary immediately followed by the C-FFQ on two consecutive days. The questionnaire was first piloted in 20 adults, and then a validity/reproducibility study was conducted (n = 90 adults). The C-FFQ showed moderate correlations (r = .60), fair agreement (mean difference 63 mg) and reasonable quintile rankings, indicating fair to moderate agreement with the seven-day food diary. To test reproducibility, the two administrations of the C-FFQ were compared; they showed strong correlations (r = .90), good quintile rankings and strong kappa values (κ = 0.65), indicating strong reproducibility. The C-FFQ shows adequate validity and reproducibility and will aid researchers in Australia to quantify caffeine consumption.
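
    The statistics reported above (Pearson r, mean difference, and kappa on quintile ranks) can be computed as in the sketch below; the paired daily caffeine values are simulated toy data, not the study's measurements:

    import numpy as np

    rng = np.random.default_rng(0)
    diary = rng.gamma(shape=4, scale=50, size=90)          # mg/day from 7-day diary
    cffq = diary + rng.normal(loc=60, scale=80, size=90)   # C-FFQ tends to run higher

    r = np.corrcoef(diary, cffq)[0, 1]
    mean_diff = np.mean(cffq - diary)

    def quintile(x):
        """Assign 0-4 quintile ranks."""
        return np.digitize(x, np.percentile(x, [20, 40, 60, 80]))

    def cohen_kappa(a, b, k=5):
        """Unweighted Cohen's kappa for two integer ratings in 0..k-1."""
        table = np.zeros((k, k))
        for i, j in zip(a, b):
            table[i, j] += 1
        table /= table.sum()
        po = np.trace(table)                              # observed agreement
        pe = table.sum(axis=1) @ table.sum(axis=0)        # chance agreement
        return (po - pe) / (1 - pe)

    kappa = cohen_kappa(quintile(diary), quintile(cffq))
    print(f"r = {r:.2f}, mean difference = {mean_diff:.0f} mg, kappa = {kappa:.2f}")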

  5. Using prediction markets to estimate the reproducibility of scientific research.

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.

  6. What to do with a Dead Research Code

    Science.gov (United States)

    Nemiroff, Robert J.

    2016-01-01

    The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states to the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and likely deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  7. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  8. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

    This user's manual contains all the necessary information concerning the use of the SEVERO code. This computer code is related to the statistics of extremes: extreme winds, extreme precipitation and flooding hazard risk analysis. (A.C.A.S.)

  9. A Message Without a Code?

    Directory of Open Access Journals (Sweden)

    Tom Conley

    1981-01-01

    The photographic paradox is said to be that of a message without a code, a communication lacking a relay or gap essential to the process of communication. Tracing the recurrence of Barthes's definition in the essays included in Image/Music/Text and in La Chambre claire, this paper argues that Barthes's definition is platonic in its will to dematerialize the troubling — graphic — immediacy of the photograph. He writes of the image in order to flee its signature. As a function of media, his categories are written in order to be insufficient and inadequate; to maintain an ineluctable difference between language heard and letters seen; to protect an idiom of loss which the photograph disallows. The article studies the strategies of his definition in «The Photographic Paradox» as instrument of abstraction, opposes the notion of code, in an aural sense, to audio-visual markers of closed relay in advertising, and critiques the layout and order of La Chambre claire in respect to Barthes's ideology of absence.

  10. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to th...

  11. FERRET data analysis code

    International Nuclear Information System (INIS)

    Schmittroth, F.

    1979-09-01

    A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples
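
    The core of such a least-squares evaluation can be sketched as a generalized least-squares update that merges prior parameter estimates (with a full covariance matrix) and correlated measurements into one consistent result with quantified uncertainties. The numbers below are toy values, not FERRET input, and the code is a generic sketch rather than FERRET's implementation:

    import numpy as np

    x0 = np.array([1.0, 2.0])                 # prior parameter estimates
    C0 = np.array([[0.04, 0.01],              # prior covariance (correlated)
                   [0.01, 0.09]])
    A  = np.array([[1.0, 1.0],                # measurements respond to x via A
                   [2.0, 0.0]])
    y  = np.array([3.3, 1.8])                 # measured values
    V  = np.diag([0.05, 0.02])                # measurement covariance

    # GLS / Bayesian update: x = x0 + K (y - A x0),  C = C0 - K A C0
    S = A @ C0 @ A.T + V                      # innovation covariance
    K = C0 @ A.T @ np.linalg.inv(S)           # gain
    x = x0 + K @ (y - A @ x0)
    C = C0 - K @ A @ C0

    print("updated estimates:", x)
    print("updated uncertainties:", np.sqrt(np.diag(C)))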

  12. Stylize Aesthetic QR Code

    OpenAIRE

    Xu, Mingliang; Su, Hao; Li, Yafei; Li, Xi; Liao, Jing; Niu, Jianwei; Lv, Pei; Zhou, Bing

    2018-01-01

    With the continued proliferation of smart mobile devices, Quick Response (QR) code has become one of the most-used types of two-dimensional code in the world. Aiming at beautifying the appearance of QR codes, existing works have developed a series of techniques to make the QR code more visual-pleasant. However, these works still leave much to be desired, such as visual diversity, aesthetic quality, flexibility, universal property, and robustness. To address these issues, in this paper, we pro...

  13. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    Quick Response code opens the possibility to convey data in a unique way, yet insufficient prevention and protection might lead to QR codes being exploited on behalf of attackers. This thesis starts by presenting a general introduction of background and stating two problems regarding QR code security, which is followed by comprehensive research on both QR code itself and related issues. From the research, a solution taking advantage of cloud and cryptography together with an implementation come af

  14. The quest for improved reproducibility in MALDI mass spectrometry.

    Science.gov (United States)

    O'Rourke, Matthew B; Djordjevic, Steven P; Padula, Matthew P

    2018-03-01

    Reproducibility has been one of the biggest hurdles faced when attempting to develop quantitative protocols for MALDI mass spectrometry. The heterogeneous nature of sample recrystallization has made automated sample acquisition somewhat "hit and miss" with manual intervention needed to ensure that all sample spots have been analyzed. In this review, we explore the last 30 years of literature and anecdotal evidence that has attempted to address and improve reproducibility in MALDI MS. Though many methods have been attempted, we have discovered a significant publication history surrounding the use of nitrocellulose as a substrate to improve homogeneity of crystal formation and therefore reproducibility. We therefore propose that this is the most promising avenue of research for developing a comprehensive and universal preparation protocol for quantitative MALDI MS analysis. © 2016 Wiley Periodicals, Inc. Mass Spec Rev 37:217-228, 2018. © 2016 Wiley Periodicals, Inc.

  15. Dysplastic naevus: histological criteria and their inter-observer reproducibility.

    Science.gov (United States)

    Hastrup, N; Clemmensen, O J; Spaun, E; Søndergaard, K

    1994-06-01

    Forty melanocytic lesions were examined in a pilot study, which was followed by a final series of 100 consecutive melanocytic lesions, in order to evaluate the inter-observer reproducibility of the histological criteria proposed for the dysplastic naevus. The specimens were examined in a blind fashion by four observers. Analysis by kappa statistics showed poor reproducibility of nuclear features, while reproducibility of architectural features was acceptable, improving in the final series. Consequently, we cannot apply the combined criteria of cytological and architectural features with any confidence in the diagnosis of dysplastic naevus, and, until further studies have documented that architectural criteria alone will suffice in the diagnosis of dysplastic naevus, we, as pathologists, shall avoid this term.

  16. Relevant principal factors affecting the reproducibility of insect primary culture.

    Science.gov (United States)

    Ogata, Norichika; Iwabuchi, Kikuo

    2017-06-01

    The primary culture of insect cells often suffers from problems with poor reproducibility in the quality of the final cell preparations. The cellular composition of the explants (cell number and cell types), surgical methods (surgical duration and surgical isolation), and physiological and genetic differences between donors may be critical factors affecting the reproducibility of culture. However, little is known about where biological variation (interindividual differences between donors) ends and technical variation (variance in replication of culture conditions) begins. In this study, we cultured larval fat bodies from the Japanese rhinoceros beetle, Allomyrina dichotoma, and evaluated, using linear mixed models, the effect of interindividual variation between donors on the reproducibility of the culture. We also performed transcriptome analysis of the hemocyte-like cells mainly seen in the cultures using RNA sequencing and ultrastructural analyses of hemocytes using a transmission electron microscope, revealing that the cultured cells have many characteristics of insect hemocytes.

  17. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  18. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  19. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    Two key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  20. Reproducibility of clinical research in critical care: a scoping review.

    Science.gov (United States)

    Niven, Daniel J; McCormick, T Jared; Straus, Sharon E; Hemmelgarn, Brenda R; Jeffs, Lianne; Barnes, Tavish R M; Stelfox, Henry T

    2018-02-21

    The ability to reproduce experiments is a defining principle of science. Reproducibility of clinical research has received relatively little scientific attention. However, it is important as it may inform clinical practice, research agendas, and the design of future studies. We used scoping review methods to examine reproducibility within a cohort of randomized trials examining clinical critical care research and published in the top general medical and critical care journals. To identify relevant clinical practices, we searched the New England Journal of Medicine, The Lancet, and JAMA for randomized trials published up to April 2016. To identify a comprehensive set of studies for these practices, included articles informed secondary searches within other high-impact medical and specialty journals. We included late-phase randomized controlled trials examining therapeutic clinical practices in adults admitted to general medical-surgical or specialty intensive care units (ICUs). Included articles were classified using a reproducibility framework. An original study was the first to evaluate a clinical practice. A reproduction attempt re-evaluated that practice in a new set of participants. Overall, 158 practices were examined in 275 included articles. A reproduction attempt was identified for 66 practices (42%, 95% CI 33-50%). Original studies reported larger effects than reproduction attempts (primary endpoint, risk difference 16.0%, 95% CI 11.6-20.5% vs. 8.4%, 95% CI 6.0-10.8%, P = 0.003). More than half of clinical practices with a reproduction attempt demonstrated effects that were inconsistent with the original study (56%, 95% CI 42-68%), among which a large number were reported to be efficacious in the original study and to lack efficacy in the reproduction attempt (34%, 95% CI 19-52%). Two practices reported to be efficacious in the original study were found to be harmful in the reproduction attempt. A minority of critical care practices with research published

  1. Reproducibility of graph metrics in fMRI networks

    Directory of Open Access Journals (Sweden)

    Qawi K Telesford

    2010-12-01

    The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties of networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging (fMRI) data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data from both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC = 0.86), global efficiency (ICC = 0.83), path length (ICC = 0.79), and local efficiency (ICC = 0.75); the ICC score for degree was found to be low (ICC = 0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was high only in network hubs for clustering coefficient, local efficiency and degree. BA plots were used to test the measurement repeatability of all graph metrics. All graph metrics fell within the limits for repeatability. Together, these results suggest that, with the exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.
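
    As a worked illustration of the statistic used above, a minimal ICC(2,1) computation (two-way random effects, absolute agreement, single measurement) follows; the formula is the standard Shrout-Fleiss form, and the simulated two-run data are purely illustrative:

        import numpy as np

        def icc_2_1(data):
            """ICC(2,1): two-way random effects, absolute agreement, single
            measurement (Shrout-Fleiss). data has shape (n_subjects, k_sessions)."""
            data = np.asarray(data, dtype=float)
            n, k = data.shape
            grand = data.mean()
            ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()
            ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()
            ss_err = ((data - grand) ** 2).sum() - ss_rows - ss_cols
            ms_r = ss_rows / (n - 1)              # between-subject mean square
            ms_c = ss_cols / (k - 1)              # between-session mean square
            ms_e = ss_err / ((n - 1) * (k - 1))   # residual mean square
            return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

        rng = np.random.default_rng(0)
        metric = rng.normal(0.5, 0.1, size=45)          # e.g. a clustering coefficient
        runs = np.column_stack([metric + rng.normal(0, 0.03, size=45) for _ in range(2)])
        print(round(icc_2_1(runs), 2))                  # high ICC: reproducible metric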

  2. Effective Form of Reproducing the Total Financial Potential of Ukraine

    Directory of Open Access Journals (Sweden)

    Portna Oksana V.

    2015-03-01

    Development of scientific principles for reproducing the total financial potential of a country, and of its effective form, is an urgent problem in both the theoretical and practical aspects of the study. Its solution is intended to ensure the active mobilization and effective use of the total financial potential of Ukraine and, as a result, its expanded reproduction as well, which would contribute to realizing the internal capacities for stabilization of the national economy. The purpose of the article is to disclose the essence of the effective form of reproducing the total financial potential of a country and to analyze the results of reproducing the total financial potential of Ukraine. It has been shown that the basis for the effective form of reproducing the total financial potential of a country is the volume and flow of resources that are associated with the «real» economy and that affect and define the dynamics of GDP, i.e. the resource and process forms of reproducing the total financial potential of Ukraine (which precede the effective one). The analysis of reproducing the total financial potential of Ukraine has shown that in the analyzed period the financial possibilities of the country increased, but a steady downward dynamic of the total financial potential was observed. The amount of resources involved in production, creating net value added and GDP, grows only on a restricted basis. Growth of the total financial potential of Ukraine is connected only with extensive quantitative factors rather than with intensive qualitative changes.

  3. Reproducibility problems of in-service ultrasonic testing results

    International Nuclear Information System (INIS)

    Honcu, E.

    1974-01-01

    The reproducibility of the results of ultrasonic testing is the basic precondition for its successful application in in-service inspection of changes in the quality of components of nuclear power installations. The results of periodic ultrasonic inspections are not satisfactory from the point of view of reproducibility. Nevertheless, the ultrasonic pulse method is suitable for evaluating the quality of most components of nuclear installations and is often the sole method that can be recommended for inspection on technical and economic grounds. (J.B.)

  4. Reproducibility of esophageal scintigraphy using semi-solid yoghurt

    Energy Technology Data Exchange (ETDEWEB)

    Imai, Yukinori; Kinoshita, Manabu; Asakura, Yasushi; Kakinuma, Tohru; Shimoji, Katsunori; Fujiwara, Kenji; Suzuki, Kenji; Miyamae, Tatsuya [Saitama Medical School, Moroyama (Japan)

    1999-10-01

    Esophageal scintigraphy is a non-invasive method that evaluates esophageal function quantitatively. We applied a new technique using semi-solid yoghurt, which makes it possible to evaluate esophageal function in a sitting position. To evaluate the reproducibility of this method, scintigraphy was performed in 16 healthy volunteers. From the results of four swallows, excluding the first one, the mean coefficients of variation in esophageal transit time and esophageal emptying time were 12.8% and 13.4%, respectively (intraday variation). As regards the interday variation, the method also showed good reproducibility in the results obtained on 2 separate days. (author)

  5. Reproducing Kernel Method for Solving Nonlinear Differential-Difference Equations

    Directory of Open Access Journals (Sweden)

    Reza Mokhtari

    2012-01-01

    On the basis of reproducing kernel Hilbert space theory, an iterative algorithm for solving some nonlinear differential-difference equations (NDDEs) is presented. The analytical solution is shown in a series form in a reproducing kernel space, and the approximate solution u_n is constructed by truncating the series to n terms. The convergence of u_n to the analytical solution is also proved. Results obtained by the proposed method imply that it can be considered a simple and accurate method for solving such differential-difference problems.
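
    As a minimal illustration of series truncation in a reproducing kernel space (not the authors' iterative scheme for NDDEs), the sketch below approximates a function by a truncated kernel expansion, assuming the standard reproducing kernel K(x, y) = 1 + min(x, y) of the Sobolev space W^1_2[0, 1]; all names are illustrative:

        import numpy as np

        def kernel(x, y):
            # Reproducing kernel of W^1_2[0, 1] with <f, g> = f(0)g(0) + int f'g'
            return 1.0 + np.minimum(x, y)

        def truncated_expansion(f, n):
            """Approximate f by the n-term kernel expansion sum_i c_i K(., x_i)."""
            nodes = np.linspace(0.0, 1.0, n)
            gram = kernel(nodes[:, None], nodes[None, :])
            coeffs = np.linalg.solve(gram, f(nodes))
            return lambda x: kernel(np.asarray(x)[:, None], nodes[None, :]) @ coeffs

        f = lambda x: np.sin(2 * np.pi * x)
        xs = np.linspace(0.0, 1.0, 200)
        for n in (5, 10, 20, 40):
            err = np.max(np.abs(truncated_expansion(f, n)(xs) - f(xs)))
            print(n, err)   # the error shrinks as the truncation length n grows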

  6. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  7. Wood construction codes issues in the United States

    Science.gov (United States)

    Douglas R. Rammer

    2006-01-01

    The current wood construction codes find their origin in the 1935 Wood Handbook: Wood as an Engineering Material, published by the USDA Forest Service. Many of the current design recommendations can be traced back to statements in this book. Since that time, a series of developments, both historical and recent, has led to a multi-layered system for the use of wood products in...

  8. The elimination of ray tracing in Monte Carlo shielding programs

    International Nuclear Information System (INIS)

    Bendall, D.E.

    1988-01-01

    The MONK6 code has clearly demonstrated the advantages of hole tracking, which was devised by Woodcock et al. for use in criticality codes, building on earlier work by von Neumann. Hole tracking eliminates ray tracing by introducing, for all materials present in the problem, a pseudo scattering reaction that forward scatters without energy loss. The cross section for this reaction is chosen so that the total cross sections of all the materials are equal at a given energy. By this means, tracking takes place with a constant total cross section everywhere, so there is no longer any need to ray trace. The present work extends hole tracking to shielding codes, where it functions in tandem with Russian roulette and splitting. An algorithm has been evolved and its performance is compared with the ray-tracing code McBEND. A disadvantage of hole tracking occurs when there is a wide variation in the total cross sections of the materials present. As the tracking uses the total cross section of the material with the maximum cross section, there can be a large number of pseudo collisions in the materials with low total cross sections. In extreme cases, the advantages of hole tracking can be lost through the extra time taken in servicing these pseudo collisions; however, techniques for eliminating this problem are under consideration
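
    A minimal sketch of the hole-tracking (delta-tracking) idea in a 1-D multi-region slab, with uncollided transmission as the tally; the geometry and function names are illustrative, not MONK6's implementation. Flights are sampled everywhere with one majorant cross section, so region boundaries never need to be ray-traced:

        import math
        import random

        def transmission_by_hole_tracking(slabs, n_particles=200_000, seed=1):
            """Estimate uncollided transmission through a 1-D slab stack.
            slabs = [(thickness, sigma_t), ...] in flight order."""
            rng = random.Random(seed)
            sigma_max = max(s for _, s in slabs)       # majorant cross section
            edges, x_edge = [], 0.0                    # cumulative region edges
            for thickness, _ in slabs:
                x_edge += thickness
                edges.append(x_edge)
            transmitted = 0
            for _ in range(n_particles):
                x = 0.0
                while True:
                    x -= math.log(rng.random()) / sigma_max  # flight with sigma_max
                    if x >= edges[-1]:
                        transmitted += 1
                        break
                    region = next(i for i, e in enumerate(edges) if x < e)
                    # Real collision with probability sigma_t/sigma_max; otherwise
                    # a pseudo event that forward scatters without energy loss.
                    if rng.random() < slabs[region][1] / sigma_max:
                        break
            return transmitted / n_particles

        # Check against the analytic answer exp(-sum(sigma_t * thickness)):
        print(transmission_by_hole_tracking([(1.0, 0.5), (1.0, 2.0)]))
        print(math.exp(-(0.5 + 2.0)))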

  9. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  10. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth the current level. The coding scheme paves the way for true electronic mail, in which handwritten, typed, or printed messages or diagrams are sent virtually instantaneously, between buildings or between continents. The scheme, called the Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of the delivered messages is improved over messages transmitted by conventional coding. The coding scheme is compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme is automatically translated to word-processor form.
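
    The brief does not detail USEEM's coder. As a flavor of the noiseless coding family associated with Rice, here is a minimal Golomb-Rice encoder/decoder for nonnegative integers (the adaptive selection of the parameter k, which a real adaptive coder performs, is omitted; all names are illustrative):

        def rice_encode(value, k):
            """Rice (Golomb, m = 2**k) code: unary quotient, '0' stop bit,
            then the k-bit binary remainder. Requires k >= 1."""
            q, r = value >> k, value & ((1 << k) - 1)
            return "1" * q + "0" + format(r, f"0{k}b")

        def rice_decode(bits, k):
            q = bits.index("0")            # length of the unary quotient
            return (q << k) | int(bits[q + 1:q + 1 + k], 2)

        for v in (0, 3, 9, 17):
            word = rice_encode(v, 2)
            assert rice_decode(word, 2) == v
            print(v, word)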

  11. NAGRADATA. Code key. Geology

    International Nuclear Information System (INIS)

    Mueller, W.H.; Schneider, B.; Staeuble, J.

    1984-01-01

    This reference manual provides users of the NAGRADATA system with comprehensive keys to the coding/decoding of geological and technical information to be stored in or retrieved from the databank. Emphasis has been placed on input data coding. When data are retrieved, the translation of stored coded information into plain language is done automatically by computer. Three keys list the complete set of currently defined codes for the NAGRADATA system, namely codes with appropriate definitions, arranged: 1. according to subject matter (thematically); 2. with the codes listed alphabetically; and 3. with the definitions listed alphabetically. Additional explanation is provided for the proper application of the codes and the logic behind the creation of new codes to be used within the NAGRADATA system. NAGRADATA makes use of codes instead of plain language for data storage; this offers the following advantages: speed of data processing, mainly data retrieval; economy of storage requirements; and standardisation of terminology. The nature of this thesaurus-like 'key to codes' makes it impossible either to establish a final form or to cover the entire spectrum of requirements. Therefore, this first issue of codes for NAGRADATA must be considered to represent the current state of progress of a living system, and future editions will be issued in a loose-leaf ring book which can be updated by an organised (updating) service. (author)

  12. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms

  13. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated using the example of the WIMSD code, which belongs among the most popular tools for reactor calculations. Most of the approaches discussed here can be easily adapted to any other lattice code. The description of the code assumes basic knowledge of the reactor lattice, at the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors arising from different sources of library data. (author)

  14. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software, with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
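
    The round trip described above (serialize inputs, run the external application, read its outputs back) is easy to sketch. The following Python sketch mirrors the pattern only; the executable and file names are hypothetical placeholders, not DLLExternalCode's actual conventions:

        import subprocess
        from pathlib import Path

        def run_external_code(inputs, exe="external_app", workdir="run"):
            """Serialize driver inputs, run the external application, and read
            its outputs back: the same round trip DLLExternalCode performs
            between GoldSim and an external code. All names are placeholders."""
            work = Path(workdir)
            work.mkdir(exist_ok=True)
            (work / "input.txt").write_text("\n".join(str(v) for v in inputs))
            subprocess.run([exe, "input.txt"], cwd=work, check=True)
            return [float(tok) for tok in (work / "output.txt").read_text().split()]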

  15. Improvement of MARS code reflood model

    International Nuclear Information System (INIS)

    Hwang, Moonkyu; Chung, Bub-Dong

    2011-01-01

    A specifically designed heat transfer model for the reflood process, which normally occurs at low flow and low pressure, was originally incorporated in the MARS code. The model is essentially identical to that of the RELAP5/MOD3.3 code. The model, however, is known to have under-estimated the peak cladding temperature (PCT), with an earlier turn-over. In this study, the original MARS code reflood model is improved. Based on extensive sensitivity studies for both the hydraulic and wall heat transfer models, it is found that the dispersed flow film boiling (DFFB) wall heat transfer is the most influential process in determining the PCT, whereas the interfacial drag model most affects the quenching time through the liquid carryover phenomenon. The model proposed by Bajorek and Young is incorporated for the DFFB wall heat transfer. Both space grid and droplet enhancement models are incorporated. Inverted annular film boiling (IAFB) is modeled using the original PSI model of the code. The flow transition between the DFFB and IAFB regimes is modeled using the TRACE code interpolation. A gas velocity threshold is also added to limit the top-down quenching effect. Assessment calculations are performed with the original and modified MARS codes for the Flecht-Seaset test and the RBHT test. Improvements are observed in terms of the PCT and quenching time predictions in the Flecht-Seaset assessment. In the case of the RBHT assessment, the improvement over the original MARS code is found to be marginal. A space grid effect, however, is clearly seen with the modified version of the MARS code. (author)

  16. Numerical modeling of flow boiling instabilities using TRACE

    International Nuclear Information System (INIS)

    Kommer, Eric M.

    2015-01-01

    Highlights: • TRACE was used to realistically model boiling instabilities in single and parallel channel configurations. • Model parameters were chosen to exactly mimic other authors' work in order to provide for direct comparison of results. • Flow stability maps generated by the model show unstable flow at operating points similar to those of other authors. • The method of adjudicating when a flow is “unstable” is critical in this type of numerical study. - Abstract: Dynamic flow instabilities in two-phase systems are a vitally important area of study due to their effects on a great number of industrial applications, including heat exchangers in nuclear power plants. Several next-generation nuclear reactor designs incorporate once-through steam generators which will exhibit boiling flow instabilities if not properly designed or when operated outside design limits. A number of numerical thermal hydraulic codes attempt to model instabilities for initial design and for use in accident analysis. TRACE, the Nuclear Regulatory Commission's newest thermal hydraulic code, is used in this study to investigate flow instabilities in both single and dual parallel channel configurations. The model parameters are selected so as to replicate other investigators' experimental and numerical work in order to provide easy comparison. Particular attention is paid to the similarities between analysis using TRACE Version 5.0 and RELAP5/MOD3.3. Comparison of results is accomplished via flow stability maps non-dimensionalized by the phase change and subcooling numbers. Results of this study show that TRACE does indeed model two-phase flow instabilities, with the transient response closely mimicking that seen in experimental studies. When compared to flow stability maps generated using RELAP, TRACE shows similar results, with differences likely due to the somewhat qualitative criteria used by various authors to determine when the flow is truly unstable
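
    The stability maps mentioned above are drawn in the plane of two standard non-dimensional groups. A minimal sketch follows, assuming the Ishii-type definitions commonly used for such maps (the paper's exact forms are not given in the abstract, and the example fluid properties are rough illustrative values for water at 7 MPa):

        def phase_change_number(q, w, h_fg, rho_f, rho_g):
            """N_pch = (q / (w * h_fg)) * (rho_f - rho_g) / rho_g.
            q: heating power [W], w: mass flow rate [kg/s], h_fg: latent
            heat [J/kg], rho_f/rho_g: saturated liquid/vapor density."""
            return q / (w * h_fg) * (rho_f - rho_g) / rho_g

        def subcooling_number(h_f, h_in, h_fg, rho_f, rho_g):
            """N_sub = ((h_f - h_in) / h_fg) * (rho_f - rho_g) / rho_g.
            h_f: saturated liquid enthalpy, h_in: inlet enthalpy [J/kg]."""
            return (h_f - h_in) / h_fg * (rho_f - rho_g) / rho_g

        # One operating point on a stability map:
        print(phase_change_number(q=2.0e5, w=0.3, h_fg=1.5e6, rho_f=740.0, rho_g=36.5))
        print(subcooling_number(h_f=1.27e6, h_in=1.15e6, h_fg=1.5e6, rho_f=740.0, rho_g=36.5))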

  17. Mantle End-Members: The Trace Element Perspective

    Science.gov (United States)

    Willbold, M.; Stracke, A.; Hofmann, A. W.

    2004-12-01

    Although there are some trace element characteristics common to all EM-type basalts, which distinguish them from HIMU-type basalts (e.g. uniformly high Th/U ratios of 4.7 ± 0.3, and enrichment in Cs-U), each suite of EM-type basalts has unique trace element signatures that distinguish it from any other suite of EM-type basalts. This is especially obvious when comparing the trace element compositions of EM basalts from one isotopic family, for example EM1-type basalts from Tristan, Gough and Pitcairn. Consequently, the trace element systematics of EM-type basalts suggest that there are many different EM-type sources, whereas the isotopic compositions of EM-type basalts suggest derivation from two broadly similar sources, i.e. EM1 and EM2. The large variability in subducting sediments with respect to both parent-daughter ratios (e.g. Rb/Sr, Sm/Nd, U/Pb, Th/Pb, ...) and other trace element ratios makes it unlikely that there are reproducible mixtures of sediments leading to two different isotopic evolution paths (EM1 and EM2) while preserving a range of incompatible element contents for each isotopic family, as would be required to reconcile the isotopic and trace element characteristics of EM-type basalts. Although this does not a priori argue against sediments as possible source components for OIB, it does argue against two distinct groups of sediments as EM1 and EM2 sources. Further characterization of sources with the same general origin (e.g. a certain type of crust or lithosphere) or identification of processes leading to reservoirs with similar parent-daughter ratio characteristics but different incompatible trace element contents could resolve the apparent conundrum.

  18. RIA Fuel Codes Benchmark - Volume 1

    International Nuclear Information System (INIS)

    Marchand, Olivier; Georgenthum, Vincent; Petit, Marc; Udagawa, Yutaka; Nagase, Fumihisa; Sugiyama, Tomoyuki; Arffman, Asko; Cherubini, Marco; Dostal, Martin; Klouzal, Jan; Geelhood, Kenneth; Gorzel, Andreas; Holt, Lars; Jernkvist, Lars Olof; Khvostov, Grigori; Maertens, Dietmar; Spykman, Gerold; Nakajima, Tetsuo; Nechaeva, Olga; Panka, Istvan; Rey Gayo, Jose M.; Sagrado Garcia, Inmaculada C.; Shin, An-Dong; Sonnenburg, Heinz Guenther; Umidova, Zeynab; Zhang, Jinzhao; Voglewede, John

    2013-01-01

    Reactivity-initiated accident (RIA) fuel rod codes have been developed over a significant period of time, and they have all shown their ability to reproduce some experimental results with a certain degree of adequacy. However, they sometimes rely on different specific modelling assumptions whose influence on the final results of the calculations is difficult to evaluate. The NEA Working Group on Fuel Safety (WGFS) is tasked with advancing the understanding of fuel safety issues by assessing the technical basis for current safety criteria and their applicability to high burnup and to new fuel designs and materials. The group aims at facilitating international convergence in this area, including the review of experimental approaches as well as the interpretation and use of experimental data relevant for safety. As a contribution to this task, WGFS conducted a RIA code benchmark based on RIA tests performed in the Nuclear Safety Research Reactor in Tokai, Japan, and tests performed or planned in the CABRI reactor in Cadarache, France. Emphasis was on the assessment of different modelling options for RIA fuel rod codes in terms of reproducing experimental results as well as extrapolating to typical reactor conditions. This report provides a summary of the results of this task. (authors)

  19. Reproducible and expedient rice regeneration system using in vitro ...

    African Journals Online (AJOL)

    An inevitable prerequisite for expedient regeneration in rice is the selection of a totipotent explant and the development of an apposite combination of growth hormones. Here, we report a reproducible regeneration protocol in which basal segments of the stems of in vitro grown rice plants were used as explants. Using the protocol ...

  20. Composting in small laboratory pilots: Performance and reproducibility

    International Nuclear Information System (INIS)

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-01-01

    Highlights: ► We design an innovative small-scale composting device including six 4-l reactors. ► We investigate the performance and reproducibility of composting on a small scale. ► Thermophilic conditions are established by self-heating in all replicates. ► Biochemical transformations, organic matter losses and stabilisation are realistic. ► The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale reactors (4 l) were used, with monitoring of O2 consumption and CO2 emissions and characterisation of the biochemical evolution of organic matter. A good reproducibility was found for the six replicates, with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot-water-soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fraction by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or in on-site experiments, excluding the lignin degradation, which was less important than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures.

  1. Intercenter reproducibility of binary typing for Staphylococcus aureus

    NARCIS (Netherlands)

    van Leeuwen, Willem B.; Snoeijers, Sandor; van der Werken-Libregts, Christel; Tuip, Anita; van der Zee, Anneke; Egberink, Diane; de Proost, Monique; Bik, Elisabeth; Lunter, Bjorn; Kluytmans, Jan; Gits, Etty; van Duyn, Inge; Heck, Max; van der Zwaluw, Kim; Wannet, Wim; Noordhoek, Gerda T.; Mulder, Sije; Renders, Nicole; Boers, Miranda; Zaat, Sebastiaan; van der Riet, Daniëlle; Kooistra, Mirjam; Talens, Adriaan; Dijkshoorn, Lenie; van der Reyden, Tanny; Veenendaal, Dick; Bakker, Nancy; Cookson, Barry; Lynch, Alisson; Witte, Wolfgang; Cuny, Christa; Blanc, Dominique; Vernez, Isabelle; Hryniewicz, Waleria; Fiett, Janusz; Struelens, Marc; Deplano, Ariane; Landegent, Jim; Verbrugh, Henri A.; van Belkum, Alex

    2002-01-01

    The reproducibility of the binary typing (BT) protocol developed for epidemiological typing of Staphylococcus aureus was analyzed in a biphasic multicenter study. In a Dutch multicenter pilot study, 10 genetically unique isolates of methicillin-resistant S. aureus (MRSA) were characterized by the BT

  2. Reproducibility of Tactile Assessments for Children with Unilateral Cerebral Palsy

    Science.gov (United States)

    Auld, Megan Louise; Ware, Robert S.; Boyd, Roslyn Nancy; Moseley, G. Lorimer; Johnston, Leanne Marie

    2012-01-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were…

  3. Modeling and evaluating repeatability and reproducibility of ordinal classifications

    NARCIS (Netherlands)

    de Mast, J.; van Wieringen, W.N.

    2010-01-01

    This paper argues that currently available methods for the assessment of the repeatability and reproducibility of ordinal classifications are not satisfactory. The paper aims to study whether we can modify a class of models from Item Response Theory, well established for the study of the reliability

  4. ReproPhylo: An Environment for Reproducible Phylogenomics.

    Directory of Open Access Journals (Sweden)

    Amir Szitenberg

    2015-09-01

    The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built into the ReproPhylo system and does not require user intervention or configuration, because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform-independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution.

  5. Reproducibility of abdominal fat assessment by ultrasound and computed tomography.

    Science.gov (United States)

    Mauad, Fernando Marum; Chagas-Neto, Francisco Abaeté; Benedeti, Augusto César Garcia Saab; Nogueira-Barbosa, Marcello Henrique; Muglia, Valdair Francisco; Carneiro, Antonio Adilton Oliveira; Muller, Enrico Mattana; Elias Junior, Jorge

    2017-01-01

    To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat in correlation with the anthropometric, clinical, and biochemical assessments. Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects, of whom 39 (38.6%) were men and 62 (61.4%) were women, with a mean age of 66.3 years (60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Intra-abdominal thickness was the variable for which the correlation with the areas of abdominal fat was strongest (i.e., the correlation coefficient was highest). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, having higher intraobserver and interobserver reliability than had the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. In the assessment of abdominal fat, the intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility.

  6. Reproducibility of contrast-enhanced transrectal ultrasound of the prostate

    NARCIS (Netherlands)

    Sedelaar, J. P.; Goossen, T. E.; Wijkstra, H.; de la Rosette, J. J.

    2001-01-01

    Transrectal three-dimensional (3-D) contrast-enhanced power Doppler ultrasound (US) is a novel technique for studying possible prostate malignancy. Before studies can be performed to investigate the clinical validity of the technique, reproducibility of the contrast US studies must be proven.

  7. Reproducibility in the assessment of acute pancreatitis with computed tomography

    International Nuclear Information System (INIS)

    Freire Filho, Edison de Oliveira; Vieira, Renata La Rocca; Yamada, Andre Fukunishi; Shigueoka, David Carlos; Bekhor, Daniel; Freire, Maxime Figueiredo de Oliveira; Ajzen, Sergio; D'Ippolito, Giuseppe

    2007-01-01

    Objective: To evaluate the reproducibility of unenhanced and contrast-enhanced computed tomography in the assessment of patients with acute pancreatitis. Materials and methods: Fifty-one unenhanced and contrast-enhanced abdominal computed tomography studies of patients with acute pancreatitis were blindly reviewed by two radiologists (observers 1 and 2). The morphological index was calculated separately for unenhanced and contrast-enhanced computed tomography, and the disease severity index was established. Intraobserver and interobserver reproducibility of computed tomography was measured by means of the kappa index (κ). Results: Interobserver agreement was κ = 0.666, 0.705, 0.648, 0.547 and 0.631, respectively, for the unenhanced and contrast-enhanced morphological indices, presence of pancreatic necrosis, pancreatic necrosis extension, and disease severity index. Intraobserver agreement (observers 1 and 2, respectively) was κ = 0.796 and 0.732 for the unenhanced morphological index; κ = 0.725 and 0.802 for the contrast-enhanced morphological index; κ = 0.674 and 0.849 for presence of pancreatic necrosis; κ = 0.606 and 0.770 for pancreatic necrosis extension; and κ = 0.801 and 0.687 for the disease severity index at computed tomography. Conclusion: Computed tomography for determination of the morphological index and disease severity index in the staging of acute pancreatitis is a quite reproducible method. The absence of contrast enhancement does not affect the reproducibility of the computed tomography morphological index. (author)
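
    As a worked illustration of the kappa index used throughout this entry, a minimal Cohen's kappa computation for two raters follows (the example severity scores are illustrative, not the study's data):

        import numpy as np

        def cohens_kappa(rater1, rater2):
            """Cohen's kappa: chance-corrected agreement between two raters."""
            r1, r2 = np.asarray(rater1), np.asarray(rater2)
            p_obs = np.mean(r1 == r2)
            p_chance = sum(np.mean(r1 == c) * np.mean(r2 == c)
                           for c in np.union1d(r1, r2))
            return (p_obs - p_chance) / (1 - p_chance)

        # Two observers grading 10 studies into severity categories 0-3:
        obs1 = [0, 1, 1, 2, 3, 2, 1, 0, 2, 3]
        obs2 = [0, 1, 2, 2, 3, 2, 1, 1, 2, 3]
        print(round(cohens_kappa(obs1, obs2), 3))   # about 0.726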

  8. Reproducible positioning in chest X-ray radiography

    International Nuclear Information System (INIS)

    1974-01-01

    A device is described that can be used to ensure reproducibility in the positioning of the patient during X-ray radiography of the thorax. Signals are taken from an electrocardiographic monitor and from a device recording the respiratory cycle. Radiography is performed only when two preselected signals coincide

  9. Reproducibility of Manual Platelet Estimation Following Automated Low Platelet Counts

    Directory of Open Access Journals (Sweden)

    Zainab S Al-Hosni

    2016-11-01

    Objectives: Manual platelet estimation is one of the methods used when automated platelet estimates are very low. However, the reproducibility of manual platelet estimation has not been adequately studied. We sought to assess the reproducibility of manual platelet estimation following automated low platelet counts and to evaluate the impact of the level of experience of the person counting on the reproducibility of manual platelet estimates. Methods: In this cross-sectional study, peripheral blood films of patients with platelet counts less than 100 × 10^9/L were retrieved and given to four raters to perform manual platelet estimation independently using a predefined method (average of platelet counts in 10 fields using the 100× objective, multiplied by 20). Data were analyzed using the intraclass correlation coefficient (ICC) as a method of reproducibility assessment. Results: The ICC across the four raters was 0.840, indicating excellent agreement. The median difference between the two most experienced raters was 0 (range: -64 to 78). The level of the platelet estimate by the least-experienced rater predicted the disagreement (p = 0.037). When assessing the difference between pairs of raters, there was no significant difference in the ICC (p = 0.420). Conclusions: The agreement between different raters using manual platelet estimation was excellent. Further confirmation is necessary, with a prospective study using a gold standard method of platelet counts.
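
    The study's predefined estimation rule is a one-line calculation; a minimal sketch follows (the helper name and field counts are illustrative):

        def manual_platelet_estimate(field_counts, multiplier=20):
            """Predefined method from the study: average platelet count over
            10 fields at the 100x objective, multiplied by 20, giving an
            estimate in units of 1e9/L."""
            if len(field_counts) != 10:
                raise ValueError("the method is defined for 10 fields")
            return sum(field_counts) / len(field_counts) * multiplier

        print(manual_platelet_estimate([3, 4, 2, 5, 3, 4, 3, 2, 4, 3]))  # 66.0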

  10. Reproducibility of Quantitative Structural and Physiological MRI Measurements

    Science.gov (United States)

    2017-08-09

    R (https://www.r-project.org/) and SPSS (IBM Corp., Armonk, NY) were used for data analysis. Mean and confidence intervals for each measure are found in Tables 1-7. To assess...visits, and was calculated using a two-way mixed model in SPSS. MCV and MRD values closer to 0 are considered to be the most reproducible, and ICC

  11. Reproducibility of abdominal fat assessment by ultrasound and computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Mauad, Fernando Marum; Chagas-Neto, Francisco Abaete; Benedeti, Augusto Cesar Garcia Saab; Nogueira-Barbosa, Marcello Henrique; Muglia, Valdair Francisco; Carneiro, Antonio Adilton Oliveira; Muller, Enrico Mattana; Elias Junior, Jorge, E-mail: fernando@fatesa.edu.br [Faculdade de Tecnologia em Saude (FATESA), Ribeirao Preto, SP (Brazil); Universidade de Fortaleza (UNIFOR), Fortaleza, CE (Brazil). Departmento de Radiologia; Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Medicina. Departmento de Medicina Clinica; Universidade de Sao Paulo (FFCLRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Filosofia, Ciencias e Letras; Hospital Mae de Deus, Porto Alegre, RS (Brazil)

    2017-05-15

    Objective: To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat in correlation with the anthropometric, clinical, and biochemical assessments. Materials and Methods: Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects, of whom 39 (38.6%) were men and 62 (61.4%) were women, with a mean age of 66.3 years (60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Results: Intra-abdominal thickness was the variable for which the correlation with the areas of abdominal fat was strongest (i.e., the correlation coefficient was highest). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, having higher intraobserver and interobserver reliability than had the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. Conclusion: In the assessment of abdominal fat, the intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility. (author)

  12. High Reproducibility of ELISPOT Counts from Nine Different Laboratories

    DEFF Research Database (Denmark)

    Sundararaman, Srividya; Karulin, Alexey Y; Ansari, Tameem

    2015-01-01

    The primary goal of immune monitoring with ELISPOT is to measure the number of T cells, specific for any antigen, accurately and reproducibly between different laboratories. In ELISPOT assays, antigen-specific T cells secrete cytokines, forming spots of different sizes on a membrane with variable...

  13. Reproducibility of the Pleth Variability Index in premature infants

    NARCIS (Netherlands)

    Den Boogert, W.J. (Wilhelmina J.); H.A. van Elteren (Hugo); T.G. Goos (Tom); I.K.M. Reiss (Irwin); R.C.J. de Jonge (Rogier); V.J. van den Berg (Victor J.)

    2017-01-01

    The aim was to assess the reproducibility of the Pleth Variability Index (PVI), developed for non-invasive monitoring of peripheral perfusion, in preterm neonates below 32 weeks of gestational age. Three PVI measurements were consecutively performed in stable, comfortable preterm

  14. Reproducibility of the Pleth Variability Index in premature infants

    NARCIS (Netherlands)

    Den Boogert, Wilhelmina J.; Van Elteren, Hugo A.; Goos, T.G.; Reiss, Irwin K.M.; De Jonge, Rogier C.J.; van Den Berg, Victor J.

    2017-01-01

    The aim was to assess the reproducibility of the Pleth Variability Index (PVI), developed for non-invasive monitoring of peripheral perfusion, in preterm neonates below 32 weeks of gestational age. Three PVI measurements were consecutively performed in stable, comfortable preterm neonates in the

  15. Annotating with Propp's Morphology of the Folktale: Reproducibility and Trainability

    NARCIS (Netherlands)

    Fisseni, B.; Kurji, A.; Löwe, B.

    2014-01-01

    We continue the study of the reproducibility of Propp’s annotations from Bod et al. (2012). We present four experiments in which test subjects were taught Propp’s annotation system; we conclude that Propp’s system needs a significant amount of training, but that with sufficient time investment, it

  16. Exploring the Coming Repositories of Reproducible Experiments: Challenges and Opportunities

    DEFF Research Database (Denmark)

    Freire, Juliana; Bonnet, Philippe; Shasha, Dennis

    2011-01-01

    Computational reproducibility efforts in many communities will soon give rise to validated software and data repositories of high quality. A scientist in a field may want to query the components of such repositories to build new software workflows, perhaps after adding the scientist’s own algorithms...

  17. TraceContract: A Scala DSL for Trace Analysis

    Science.gov (United States)

    Barringer, Howard; Havelund, Klaus

    2011-01-01

    In this paper we describe TRACECONTRACT, an API for trace analysis, implemented in the SCALA programming language. We argue that for certain forms of trace analysis the best weapon is a high level programming language augmented with constructs for temporal reasoning. A trace is a sequence of events, which may for example be generated by a running program, instrumented appropriately to generate events. The API supports writing properties in a notation that combines an advanced form of data parameterized state machines with temporal logic. The implementation utilizes SCALA's support for defining internal Domain Specific Languages (DSLs). Furthermore SCALA's combination of object oriented and functional programming features, including partial functions and pattern matching, makes it an ideal host language for such an API.

  18. Reproducibility of airway luminal size in asthma measured by HRCT.

    Science.gov (United States)

    Brown, Robert H; Henderson, Robert J; Sugar, Elizabeth A; Holbrook, Janet T; Wise, Robert A

    2017-10-01

    High-resolution CT (HRCT) is a well-established imaging technology used to measure lung and airway morphology in vivo. However, there is a surprising lack of studies examining HRCT reproducibility. The CPAP Trial was a multicenter, randomized, three-parallel-arm, sham-controlled 12-wk clinical trial to assess the use of a nocturnal continuous positive airway pressure (CPAP) device on airway reactivity to methacholine. The lack of a treatment effect of CPAP on clinical or HRCT measures provided an opportunity for the current analysis. We assessed the reproducibility of HRCT imaging over 12 wk. Intraclass correlation coefficients (ICCs) were calculated for individual airway segments, individual lung lobes, both lungs, and air trapping. The ICC [95% confidence interval (CI)] for airway luminal size at total lung capacity ranged from 0.95 (0.91, 0.97) to 0.47 (0.27, 0.69). The ICC (95% CI) for airway luminal size at functional residual capacity ranged from 0.91 (0.85, 0.95) to 0.32 (0.11, 0.65). The ICC measurements for airway distensibility index and wall thickness were lower, ranging from poor (0.08) to moderate (0.63) agreement. The ICC for air trapping at functional residual capacity was 0.89 (0.81, 0.94) and varied only modestly by lobe, from 0.76 (0.61, 0.87) to 0.95 (0.92, 0.97). In stable well-controlled asthmatic subjects, it is possible to reproducibly image unstimulated airway luminal areas over time, by region, and by size at total lung capacity throughout the lungs. Therefore, any changes in luminal size on repeat CT imaging are more likely due to changes in disease state and less likely due to normal variability. NEW & NOTEWORTHY There is a surprising lack

  19. Audiovisual biofeedback improves diaphragm motion reproducibility in MRI

    Science.gov (United States)

    Kim, Taeho; Pollock, Sean; Lee, Danny; O’Brien, Ricky; Keall, Paul

    2012-01-01

    Purpose: In lung radiotherapy, variations in cycle-to-cycle breathing result in four-dimensional computed tomography imaging artifacts, leading to inaccurate beam coverage and tumor targeting. In previous studies, the effect of audiovisual (AV) biofeedback on the external respiratory signal reproducibility has been investigated, but the internal anatomy motion has not been fully studied. The aim of this study is to test the hypothesis that AV biofeedback improves diaphragm motion reproducibility of internal anatomy using magnetic resonance imaging (MRI). Methods: To test the hypothesis, 15 healthy human subjects were enrolled in an ethics-approved AV biofeedback study consisting of two imaging sessions spaced ∼1 week apart. Within each session MR images were acquired under free breathing and AV biofeedback conditions. The respiratory signal for the AV biofeedback system utilized optical monitoring of an external marker placed on the abdomen. Synchronously, serial thoracic 2D MR images were obtained to measure the diaphragm motion, using a fast gradient-recalled-echo MR pulse sequence in both coronal and sagittal planes. The improvement in the diaphragm motion reproducibility using the AV biofeedback system was quantified by comparing cycle-to-cycle variability in displacement, respiratory period, and baseline drift. Additionally, the variation in improvement between the two sessions was also quantified. Results: The average root mean square error (RMSE) of diaphragm cycle-to-cycle displacement was reduced from 2.6 mm with free breathing to 1.6 mm (38% reduction) with the implementation of AV biofeedback. Improvements in the reproducibility of the respiratory period and baseline drift were also observed with AV biofeedback (p-value = 0.012). The diaphragm motion reproducibility improvements with AV biofeedback were consistent with the abdominal motion reproducibility that was observed from the external marker motion variation. Conclusions: This study was the first to investigate the potential of AV biofeedback to improve the motion

  20. A search for symmetries in the genetic code

    International Nuclear Information System (INIS)

    Hornos, J.E.M.; Hornos, Y.M.M.

    1991-01-01

    A search for symmetries based on the classification theorem of Cartan for the compact simple Lie algebras is performed to verify to what extent the genetic code is a manifestation of some underlying symmetry. An exact continuous symmetry group cannot be found to reproduce the present, universal code. However a unique approximate symmetry group is compatible with codon assignment for the fundamental amino acids and the termination codon. In order to obtain the actual genetic code, the symmetry must be slightly broken. (author). 27 refs, 3 figs, 6 tabs

  1. EpiContactTrace: an R-package for contact tracing during livestock disease outbreaks and for risk-based surveillance.

    Science.gov (United States)

    Nöremark, Maria; Widgren, Stefan

    2014-03-17

    During outbreaks of livestock disease, contact tracing can be an important part of disease control. Animal movements can also be of relevance for risk-based surveillance and sampling, i.e., both when assessing the consequences of introduction and the likelihood of introduction. In many countries, animal movement data are collected with one of the major objectives being to enable contact tracing. However, an analytical step is often needed to retrieve appropriate information for contact tracing or surveillance. In this study, an open source tool was developed to structure livestock movement data to facilitate contact tracing in real time during disease outbreaks and to provide input for risk-based surveillance and sampling. The tool, EpiContactTrace, was written in the R language and uses the network parameters in-degree, out-degree, ingoing contact chain and outgoing contact chain (also called infection chain), which are relevant for backward and forward tracing, respectively. The time-frames for backward and forward tracing can be specified independently, and the search can be done for one farm at a time or for all farms within the dataset. Different outputs are available: datasets with network measures, contacts visualised on a map, and automatically generated reports for each farm, in either HTML or PDF format, intended for the end-users, i.e. the veterinary authorities, regional disease control officers and field veterinarians. EpiContactTrace is available as an R-package at the R-project website (http://cran.r-project.org/web/packages/EpiContactTrace/). We believe this tool can help in disease control, since it can rapidly structure essential contact information from large datasets. The reproducible reports make this tool robust and independent of manual compilation of data. The open source code makes it accessible and easily adaptable to different needs.
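
    For illustration, the outgoing contact chain can be computed as a time-respecting reachability count. The following Python sketch mirrors the concept only, not EpiContactTrace's R interface; the movement tuples, window and farm names are illustrative:

        def outgoing_contact_chain(movements, root, t_start, t_end):
            """Holdings reachable from `root` via movements (source, dest, day)
            whose days are non-decreasing along the chain, within the tracing
            window [t_start, t_end]. len(result) is the outgoing contact chain."""
            earliest = {root: t_start}   # holding -> earliest possible exposure day
            for src, dst, day in sorted(movements, key=lambda m: m[2]):
                if t_start <= day <= t_end and src in earliest and earliest[src] <= day:
                    if day < earliest.get(dst, float("inf")):
                        earliest[dst] = day
            del earliest[root]
            return set(earliest)

        moves = [("A", "B", 3), ("B", "C", 5), ("C", "D", 2), ("A", "E", 9)]
        # D is excluded: the C->D movement happened before C could be exposed.
        print(outgoing_contact_chain(moves, "A", 1, 10))   # {'B', 'C', 'E'}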

  2. Measurement of Selected Organic Trace Gases During TRACE-P

    Science.gov (United States)

    Atlas, Elliot

    2004-01-01

    Major goals of the TRACE-P mission were: 1) to investigate the chemical composition of radiatively important gases, aerosols, and their precursors in the Asian outflow over the western Pacific, and 2) to describe and understand the chemical evolution of the Asian outflow as it is transported and mixed into the global troposphere. The research performed as part of this proposal addressed these major goals with a study of the organic chemical composition of gases in the TRACE-P region. This work was a close collaboration with the Blake/Rowland research group at UC-Irvine, and they have provided a separate report for their funded effort.

  3. A Theory of Network Tracing

    Science.gov (United States)

    Acharya, Hrishikesh B.; Gouda, Mohamed G.

    Traceroute is a widely used program for computing the topology of any network in the Internet. Using Traceroute, one starts from a node and chooses any other node in the network. Traceroute obtains the sequence of nodes that occur between these two nodes, as specified by the routing tables in these nodes. Each use of Traceroute in a network produces a trace of nodes that constitute a simple path in this network. In every trace that is produced by Traceroute, each node occurs either by its unique identifier or by the anonymous identifier "*". In this paper, we introduce the first theory aimed at answering the following important question. Is there an algorithm to compute the topology of a network N from a trace set T that is produced by using Traceroute in network N, assuming that each edge in N occurs in at least one trace in T, and that each node in N occurs by its unique identifier in at least one trace in T? We prove that the answer to this question is "No" if N is an even ring or a general network. However, it is "Yes" if N is a tree or an odd ring. The answer is also "No" if N is mostly-regular, but "Yes" if N is a mostly-regular even ring.

  4. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication via the Massey construction. Asymmetric quantum codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane (CSS) construction of stabilizer codes from linear codes containing their dual codes...

  5. Dugong: a Docker image, based on Ubuntu Linux, focused on reproducibility and replicability for bioinformatics analyses.

    Science.gov (United States)

    Menegidio, Fabiano B; Jabes, Daniela L; Costa de Oliveira, Regina; Nunes, Luiz R

    2018-02-01

    This manuscript introduces and describes Dugong, a Docker image based on Ubuntu 16.04, which automates installation of more than 3500 bioinformatics tools (along with their respective libraries and dependencies), in alternative computational environments. The software operates through a user-friendly XFCE4 graphic interface that allows software management and installation by users not fully familiarized with the Linux command line and provides the Jupyter Notebook to assist in the delivery and exchange of consistent and reproducible protocols and results across laboratories, assisting in the development of open science projects. Source code and instructions for local installation are available at https://github.com/DugongBioinformatics, under the MIT open source license. Luiz.nunes@ufabc.edu.br. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  6. An Open Framework for the Reproducible Study of the Iterated Prisoner’s Dilemma

    Directory of Open Access Journals (Sweden)

    Vincent Knight

    2016-08-01

    The Axelrod library is an open source Python package that allows for reproducible game theoretic research into the Iterated Prisoner's Dilemma. This area of research began in the 1980s but suffers from a lack of documentation and test code. The goal of the library is to provide such a resource, with facilities for the design of new strategies and interactions between them, as well as conducting tournaments and ecological simulations for populations of strategies. With a growing collection of 139 strategies, the library is also a platform for an original tournament that, in itself, is of interest to the game theoretic community. This paper describes the Iterated Prisoner's Dilemma, the Axelrod library and its development, and insights gained from some novel research.
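
    A minimal tournament with the library looks like the following sketch; it follows the package's documented interface, but the strategy choice and parameters are illustrative and defaults may differ across versions:

        import axelrod as axl

        players = [axl.Cooperator(), axl.Defector(), axl.TitForTat(), axl.Grudger()]
        tournament = axl.Tournament(players, turns=200, repetitions=10)
        results = tournament.play()     # round-robin iterated prisoner's dilemma
        print(results.ranked_names)     # strategies ordered by median payoff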

  7. CERN Analysis Preservation: A Novel Digital Library Service to Enable Reusable and Reproducible Research

    CERN Document Server

    AUTHOR|(CDS)2079501; Chen, Xiaoli; Dani, Anxhela; Dasler, Robin Lynnette; Delgado Fernandez, Javier; Fokianos, Pamfilos; Herterich, Patricia Sigrid; Simko, Tibor

    2016-01-01

    The latest policy developments require immediate action for data preservation, as well as reproducible and Open Science. To address this, an unprecedented digital library service is presented to enable the High-Energy Physics community to preserve and share their research objects (such as data, code, documentation, notes) throughout their research process. While facing the challenges of a “big data” community, the internal service builds on existing internal databases to make the process as easy and intrinsic as possible for researchers. Given the “work in progress” nature of the objects preserved, versioning is supported. It is expected that the service will not only facilitate better preservation techniques in the community, but will foremost make collaborative research easier as detailed metadata and novel retrieval functionality provide better access to ongoing works. This new type of e-infrastructure, fully integrated into the research workflow, could help in fostering Open Science practices acro...

  8. SU-E-J-227: Breathing Pattern Consistency and Reproducibility: Comparative Analysis for Supine and Prone Body Positioning

    International Nuclear Information System (INIS)

    Laugeman, E; Weiss, E; Chen, S; Hugo, G; Rosu, M

    2014-01-01

    Purpose: To evaluate and compare the cycle-to-cycle consistency of breathing patterns and their reproducibility over the course of treatment, for supine and prone positioning. Methods: Respiratory traces from 25 patients were recorded for sequential supine/prone 4DCT scans acquired prior to treatment and during the course of the treatment (weekly or bi-weekly). For each breathing cycle, the average (AVE), end-of-exhale (EoE) and end-of-inhale (EoI) locations were identified using in-house developed software. In addition, the mean values and variations of the above quantities were computed for each breathing trace. F-tests were used to compare the cycle-to-cycle consistency of all pairs of sequential supine and prone scans. Analysis of variance was also performed using population means for AVE, EoE and EoI to quantify differences between the reproducibility of prone and supine respiration traces over the treatment course. Results: Consistency: cycle-to-cycle variations are smaller in prone than in supine in the pre-treatment and during-treatment scans for the AVE, EoE and EoI points, for the majority of patients (differences significant at p<0.05). The few cases where the respiratory pattern had more variability in prone appeared to be random events. Reproducibility: the reproducibility of breathing patterns (supine and prone) improved as treatment progressed, perhaps due to patients becoming more comfortable with the procedure. However, variability in the supine position remained significantly larger than in prone (p<0.05), as indicated by the variance analysis of population means for the pre-treatment and subsequent during-treatment scans. Conclusions: Prone positioning stabilizes breathing patterns in most subjects investigated in this study. Importantly, a parallel analysis of the same group of patients revealed a tendency towards increasing motion amplitude of tumor targets in the prone position regardless of their size or location; thus, the choice for body positioning
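
    The F-test comparison described in the Methods is a variance-ratio test. A minimal two-sided sketch using SciPy follows; the simulated traces are illustrative stand-ins for supine and prone end-of-exhale positions:

        import numpy as np
        from scipy import stats

        def variance_f_test(x, y):
            """Two-sided F-test for equality of variances of two samples,
            e.g. end-of-exhale positions from supine vs. prone traces."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            f = np.var(x, ddof=1) / np.var(y, ddof=1)
            dfx, dfy = len(x) - 1, len(y) - 1
            tail = stats.f.sf(f, dfx, dfy) if f > 1.0 else stats.f.cdf(f, dfx, dfy)
            return f, min(1.0, 2.0 * tail)

        rng = np.random.default_rng(0)
        supine = rng.normal(0.0, 2.0, 40)   # noisier cycle-to-cycle positions
        prone = rng.normal(0.0, 1.0, 40)
        print(variance_f_test(supine, prone))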

  9. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of the network coding approach, which focuses on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can be easily utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  10. Personalized, Shareable Geoscience Dataspaces For Simplifying Data Management and Improving Reproducibility

    Science.gov (United States)

    Malik, T.; Foster, I.; Goodall, J. L.; Peckham, S. D.; Baker, J. B. H.; Gurnis, M.

    2015-12-01

    Research activities are iterative, collaborative, and now data- and compute-intensive. Such research activities mean that even the many researchers who work in small laboratories must often create, acquire, manage, and manipulate large amounts of diverse data and keep track of complex software. They face difficult data and software management challenges, and data sharing and reproducibility are neglected. There is significant federal investment in powerful cyberinfrastructure, in part to lessen the burden associated with modern data- and compute-intensive research. Similarly, geoscience communities are establishing research repositories to facilitate data preservation. Yet we observe that a large fraction of the geoscience community continues to struggle with data and software management. The reason, studies suggest, is not lack of awareness but rather that tools do not adequately support time-consuming data life cycle activities. Through the NSF/EarthCube-funded GeoDataspace project, we are building personalized, shareable dataspaces that help scientists connect their individual or research group efforts with the community at large. The dataspaces provide a lightweight multiplatform research data management system with tools for recording research activities in what we call geounits, so that a geoscientist can at any time snapshot and preserve, both for their own use and to share with the community, all data and code required to understand and reproduce a study. A software-as-a-service (SaaS) deployment model enhances usability of core components and integration with widely used software systems. In this talk we will present the open-source GeoDataspace project and demonstrate how it is enabling reproducibility across the geoscience domains of hydrology, space science, and modeling toolkits.

  11. [Trace elements of bone tumors].

    Science.gov (United States)

    Kalashnikov, V M; Zaĭchik, V E; Bizer, V A

    1983-01-01

    Using neutron activation analysis with neutrons from a nuclear reactor, the concentrations of 11 trace elements (scandium, iron, cobalt, mercury, rubidium, selenium, silver, antimony, chromium, zinc and terbium) in intact bone and skeletal tumors were measured. 76 specimens of biopsy and surgically resected material from operations for bone tumors and 10 specimens of normal bone tissue obtained at autopsies of cases of sudden death were examined. The concentrations of trace elements and their dispersion patterns in tumor tissue were found to be significantly higher than those in normal bone tissue. Also, the concentrations of some trace elements in tumors differed significantly from those in normal tissue; moreover, they were found to depend on the type and histogenesis of the neoplasm.

  12. Trace elements in human milk

    Energy Technology Data Exchange (ETDEWEB)

    Parr, R M [International Atomic Energy Agency, Vienna (Austria). Div. of Life Sciences

    1983-06-01

    Trace elements are those elements having a concentration lower than 10 ppm in body fluids or tissues. A total of 24 elements, both trace and minor elements, present in human milk have been analysed in this study, employing neutron activation analysis and absorption spectroscopy. The analyses have been carried out collaboratively by several different laboratories and the Agency, which has also served as a coordinating centre. Although the evaluation of the results, altogether 8500 separate values, is still in progress, enough evidence is already available to show some very interesting differences between different study areas and, in some cases, between different socio-economic groups within a single country. The main value of these data will probably be to throw new light on the nutritional requirements of young babies for trace elements.

  13. Reproducibility of (n,γ) gamma ray spectrum in Pb under different ENDF/B releases

    Energy Technology Data Exchange (ETDEWEB)

    Kebwaro, J.M., E-mail: jeremiahkebwaro@gmail.com [Department of Physical Sciences, Karatina University, P.O. Box 1957-10101, Karatina (Kenya); He, C.H.; Zhao, Y.L. [School of Nuclear Science and Technology, Xian Jiaotong University, Xian, Shaanxi 710049 (China)

    2016-04-15

    Radiative capture reactions are of interest in shielding design and other fundamental research. In this study the reproducibility of (n,γ) reactions in Pb when cross-section data from different ENDF/B releases are used in the Monte Carlo code MCNP was investigated. Pb was selected for this study because it is widely used in shielding applications where capture reactions are likely to occur. Four different neutron spectra were declared as source in the MCNP model, which consisted of a simple spherical geometry. The gamma ray spectra due to the capture reactions were recorded at 10 cm from the center of the sphere. The results reveal that the gamma ray spectrum produced by ENDF/B-V is in reasonable agreement with that produced when ENDF/B-VI.6 is used. However, the spectrum produced by ENDF/B-VII does not reveal any primary gamma rays in the higher energy region (E > 3 MeV). It is further observed that the intensities of the capture gamma rays produced when the various releases are used differ by some margin, showing that the results are not reproducible. The generated spectra also vary with the spectrum of the source neutrons. The discrepancies observed among the ENDF/B releases could raise concerns for end users and need to be addressed properly during benchmarking calculations before the next release. The processing from ENDF to ACE format that is supplied with MCNP should also be examined, because errors might have arisen during the evaluation.
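
    A minimal sketch of the kind of library-to-library spectrum comparison described above, assuming each MCNP tally has been post-processed into a two-column text file (upper bin energy in MeV, tally value); the file names are hypothetical.

    ```python
    # Hypothetical sketch: compare binned (n,gamma) spectra from two MCNP runs
    # that differ only in the ENDF/B release used for the Pb cross sections.
    import numpy as np

    e5, f5 = np.loadtxt("pb_capture_endfb5.txt", unpack=True)
    e7, f7 = np.loadtxt("pb_capture_endfb7.txt", unpack=True)
    assert np.allclose(e5, e7), "spectra must share the same energy grid"

    # Relative difference per bin, guarding against empty bins
    mask = f5 > 0
    rel_diff = np.zeros_like(f5)
    rel_diff[mask] = (f7[mask] - f5[mask]) / f5[mask]

    # Flag the region above 3 MeV where primary capture gammas are expected
    high = e5 > 3.0
    print("max |relative difference| above 3 MeV:", np.abs(rel_diff[high]).max())
    ```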

  14. Towards reproducible experimental studies for non-convex polyhedral shaped particles

    Science.gov (United States)

    Wilke, Daniel N.; Pizette, Patrick; Govender, Nicolin; Abriak, Nor-Edine

    2017-06-01

    The packing density and flat-bottomed hopper discharge of non-convex polyhedral particles are investigated in a systematic experimental study. The motivation for this study is two-fold. Firstly, to establish an approach to deliver quality experimental particle packing data for non-convex polyhedral particles that can be used for characterization and validation purposes of discrete element codes. Secondly, to make the reproducibility of experimental setups as convenient and readily available as possible using affordable and accessible technology. The primary technology for this study is fused deposition modeling, used to 3D print polylactic acid (PLA) particles with readily available 3D printer technology. A total of 8000 biodegradable particles were printed, 1000 white particles and 1000 black particles for each of the four particle types considered in this study. Reproducibility is one benefit of using fused deposition modeling to print particles, but an extremely important additional benefit is that specific particle properties can be explicitly controlled. As an example, in this study the volume fraction of each particle can be controlled, i.e., the effective particle density can be adjusted. In this study the particle volume reduces drastically as the non-convexity is increased; however, all printed white particles have the same mass within 2% of each other.

  15. Towards reproducible experimental studies for non-convex polyhedral shaped particles

    Directory of Open Access Journals (Sweden)

    Wilke Daniel N.

    2017-01-01

    Full Text Available The packing density and flat-bottomed hopper discharge of non-convex polyhedral particles are investigated in a systematic experimental study. The motivation for this study is two-fold. Firstly, to establish an approach to deliver quality experimental particle packing data for non-convex polyhedral particles that can be used for characterization and validation purposes of discrete element codes. Secondly, to make the reproducibility of experimental setups as convenient and readily available as possible using affordable and accessible technology. The primary technology for this study is fused deposition modeling, used to 3D print polylactic acid (PLA) particles with readily available 3D printer technology. A total of 8000 biodegradable particles were printed, 1000 white particles and 1000 black particles for each of the four particle types considered in this study. Reproducibility is one benefit of using fused deposition modeling to print particles, but an extremely important additional benefit is that specific particle properties can be explicitly controlled. As an example, in this study the volume fraction of each particle can be controlled, i.e., the effective particle density can be adjusted. In this study the particle volume reduces drastically as the non-convexity is increased; however, all printed white particles have the same mass within 2% of each other.

  16. SimpleITK Image-Analysis Notebooks: a Collaborative Environment for Education and Reproducible Research.

    Science.gov (United States)

    Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard

    2018-06-01

    Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools that facilitate reproducible research and support scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the use of the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user-friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. SimpleITK provides a large number of examples illustrating educational and research-oriented image analysis workflows for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks.
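
    A notebook-style cell in the spirit of these tutorials might look as follows; the input file name is hypothetical, and the pipeline (Gaussian smoothing followed by Otsu thresholding) is just one common example.

    ```python
    # Minimal SimpleITK sketch: read, smooth, segment, and inspect an image.
    import SimpleITK as sitk

    img = sitk.ReadImage("ct_slice.nii.gz", sitk.sitkFloat32)   # hypothetical file
    smoothed = sitk.SmoothingRecursiveGaussian(img, sigma=2.0)  # Gaussian smoothing
    seg = sitk.OtsuThreshold(smoothed, 0, 1)                    # bright voxels -> 1

    arr = sitk.GetArrayFromImage(seg)  # NumPy view for further notebook analysis
    print("segmented voxels:", int(arr.sum()))
    ```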

  17. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated code, etc.). The presentation relates this artistic fascination with code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation from the computer’s materiality. Cramer is thus the voice of a new ‘code avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation

  18. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  19. Theory of epigenetic coding.

    Science.gov (United States)

    Elder, D

    1984-06-07

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code: the combinatorial to coding the identity of units, the non-combinatorial to coding the production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.

  20. DISP1 code

    International Nuclear Information System (INIS)

    Vokac, P.

    1999-12-01

    DISP1 code is a simple tool for assessment of the dispersion of the fission product cloud escaping from a nuclear power plant after an accident. The code makes it possible to tentatively check the feasibility of calculations by more complex PSA3 codes and/or codes for real-time dispersion calculations. The number of input parameters is reasonably low and the user interface is simple enough to allow a rapid processing of sensitivity analyses. All input data entered through the user interface are stored in the text format. Implementation of dispersion model corrections taken from the ARCON96 code enables the DISP1 code to be employed for assessment of the radiation hazard within the NPP area, in the control room for instance. (P.A.)
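
    DISP1 itself is not reproduced here, but screening-level dispersion tools of this kind are typically built around a Gaussian-plume estimate; a textbook sketch with assumed dispersion parameters follows (all numbers are illustrative, not DISP1 values).

    ```python
    # Illustrative Gaussian-plume estimate (not the DISP1 model itself) of the
    # ground-level air concentration from a continuous release, including
    # reflection at the ground.
    import numpy as np

    def gaussian_plume(Q, u, y, H, sigma_y, sigma_z):
        """Ground-level concentration (Bq/m^3).
        Q: release rate (Bq/s), u: wind speed (m/s), y: crosswind offset (m),
        H: effective release height (m); sigma_y, sigma_z (m) are dispersion
        parameters evaluated at the downwind distance of interest."""
        return (Q / (2.0 * np.pi * u * sigma_y * sigma_z)
                * np.exp(-y**2 / (2.0 * sigma_y**2))
                * 2.0 * np.exp(-H**2 / (2.0 * sigma_z**2)))  # ground reflection

    # Example: 1e9 Bq/s release, 5 m/s wind, on the plume axis 1 km downwind,
    # with sigma values assumed for that distance.
    print(gaussian_plume(Q=1e9, u=5.0, y=0.0, H=30.0, sigma_y=80.0, sigma_z=40.0))
    ```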

  1. Adjustments in Almod3W2 transient analysis code to fit Angra 1 NPP experimental data

    International Nuclear Information System (INIS)

    Madeira, A.A.; Camargo, C.T.M.

    1988-01-01

    Some minor modifications were introduced into the ALMOD3W2 code as a consequence of the interest in reproducing the full-load rejection test at the Angra 1 NPP. These modifications proved adequate when code results were compared with experimental data. (author) [pt

  2. Phonological coding during reading.

    Science.gov (United States)

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  3. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  4. Manual tracing versus smartphone application (app) tracing: a comparative study.

    Science.gov (United States)

    Sayar, Gülşilay; Kilinc, Delal Dara

    2017-11-01

    This study aimed to compare the results of conventional manual cephalometric tracing with those acquired with smartphone-application cephalometric tracing. The cephalometric radiographs of 55 patients (25 females and 30 males) were traced via the manual and app methods and were subsequently examined with Steiner's analysis. Five skeletal measurements, five dental measurements and two soft tissue measurements were obtained based on 21 landmarks. The durations of the two methods were also compared. SNA (Sella, Nasion, A point angle) and SNB (Sella, Nasion, B point angle) values for the manual method were statistically lower (p < .001) than those for the app method. The ANB value for the manual method was statistically lower than that of the app method. L1-NB (°) and upper lip protrusion values for the manual method were statistically higher than those for the app method. Go-GN/SN, U1-NA (°) and U1-NA (mm) values for the manual method were statistically lower than those for the app method. No differences between the two methods were found in the L1-NB (mm), occlusal plane to SN, interincisal angle or lower lip protrusion values. Although statistically significant differences were found between the two methods, cephalometric tracing proceeded faster with the app method than with the manual method.

  5. Tracing a planar algebraic curve

    International Nuclear Information System (INIS)

    Chen Falai; Kozak, J.

    1994-09-01

    In this paper, an algorithm that determines a real algebraic curve is outlined. Its basic step is to divide the plane into subdomains that include only simple branches of the algebraic curve without singular points. Each of the branches is then stably and efficiently traced in the particular subdomain. Except for the tracing, the algorithm requires only a couple of simple operations on polynomials that can be carried out exactly if the coefficients are rational, and the determination of zeros of several polynomials of one variable. (author). 5 refs, 4 figs
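
    A minimal predictor-corrector tracer in the spirit of the branch-tracing step described above, using the unit circle as the example curve; the step size and iteration counts are illustrative.

    ```python
    # Trace a simple branch of f(x, y) = 0 by stepping along the tangent
    # (predictor) and pulling back onto the curve with Newton steps (corrector).
    import numpy as np

    def f(p):
        x, y = p
        return x**2 + y**2 - 1.0

    def grad(p):
        x, y = p
        return np.array([2.0 * x, 2.0 * y])

    def trace_branch(p0, step=0.05, n_steps=130):
        p = np.array(p0, dtype=float)
        points = [p.copy()]
        for _ in range(n_steps):
            g = grad(p)
            tangent = np.array([-g[1], g[0]]) / np.linalg.norm(g)
            p = p + step * tangent              # predictor
            for _ in range(5):                  # corrector: Newton back onto f = 0
                g = grad(p)
                p = p - f(p) * g / g.dot(g)
            points.append(p.copy())
        return np.array(points)

    pts = trace_branch((1.0, 0.0))
    print("max |f| along the traced branch:", max(abs(f(p)) for p in pts))
    ```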

  6. Improving performance of single-path code through a time-predictable memory hierarchy

    DEFF Research Database (Denmark)

    Cilku, Bekim; Puffitsch, Wolfgang; Prokesch, Daniel

    2017-01-01

    A time-predictable memory hierarchy with a prefetcher that exploits the predictability of execution traces in single-path code is proposed to speed up code execution. The new memory hierarchy reduces both the cache-miss penalty time and the cache-miss rate on the instruction cache. The benefit of the approach is demonstrated through...

  7. OPR1000 RCP Flow Coastdown Analysis using SPACE Code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong-Hyuk; Kim, Seyun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    The Korean nuclear industry developed a thermal-hydraulic analysis code for the safety analysis of PWRs, named SPACE (Safety and Performance Analysis Code for Nuclear Power Plant). Current loss-of-flow transient analysis of OPR1000 uses the COAST code to calculate the transient RCS (Reactor Coolant System) flow. The COAST code calculates RCS loop flow using pump performance curves and RCP (Reactor Coolant Pump) inertia. In this paper, the SPACE code is used to reproduce the RCS flowrates calculated by the COAST code. The loss-of-flow transient is a transient initiated by a reduction of forced reactor coolant circulation. Typical loss-of-flow transients are complete loss of flow (CLOF) and locked rotor (LR). OPR1000 RCP flow coastdown analysis was performed with SPACE using a simplified nodalization. Complete loss of flow (4 RCP trip) was analyzed. The results show good agreement with those from the COAST code, which is the CE code for calculating RCS flow during loss-of-flow transients. Through this study, we confirmed that the SPACE code can be used instead of the COAST code for RCP flow coastdown analysis.
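
    The abstract notes that coastdown flow follows from the pump performance curve and RCP inertia; a common hand simplification (not the COAST or SPACE model itself) assumes hydraulic torque proportional to speed squared, which yields a hyperbolic flow decay:

    ```python
    # Simplified coastdown: I*domega/dt = -k*omega**2 integrates to
    # omega(t) = omega0 / (1 + t/tau) with tau = I*omega0/T0, and loop flow
    # is taken proportional to pump speed. All numbers are illustrative.
    import numpy as np

    def coastdown_flow(t, tau=10.0):
        """Relative RCS loop flow during coastdown (tau in seconds)."""
        return 1.0 / (1.0 + t / tau)

    for t in np.linspace(0.0, 60.0, 7):
        print(f"t = {t:4.0f} s  ->  relative flow = {coastdown_flow(t):.3f}")
    ```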

  8. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  9. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown in.

  10. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  11. Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry

    Science.gov (United States)

    Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z ≤ 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.
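
    A toy version of the convergence testing described above: sample an increasing number of discrete ray directions from a dose point and check that a ray-traced quantity (here the mean chord length through a spherical shell, standing in for shield thickness) stabilizes. Geometry and numbers are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def mean_shell_thickness(point, n_rays, r_in=1.0, r_out=1.2):
        """Mean chord length through a spherical shell along isotropic rays."""
        u = rng.normal(size=(n_rays, 3))
        u /= np.linalg.norm(u, axis=1, keepdims=True)   # isotropic unit vectors

        def dist_to_sphere(r):
            b = u @ point                   # per-ray dot product with the point
            c = point @ point - r * r       # scalar (point inside: c < 0)
            return -b + np.sqrt(b * b - c)  # positive root of |p + t*u| = r

        return float(np.mean(dist_to_sphere(r_out) - dist_to_sphere(r_in)))

    p = np.array([0.3, 0.0, 0.0])           # off-center dose point
    for n in (100, 1_000, 10_000, 100_000):
        print(f"{n:>7} rays -> mean thickness = {mean_shell_thickness(p, n):.5f}")
    ```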

  12. Efficient Coding of Information: Huffman Coding -RE ...

    Indian Academy of Sciences (India)

    to a stream of equally-likely symbols so as to recover the original stream in the event of errors. ... The source-coding problem is one of finding a mapping from U to a ... probability that the random variable X takes the value x, written as ...
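
    Since the record's subject is Huffman coding, a compact textbook implementation may help; the symbol probabilities below are made up.

    ```python
    # Build a Huffman prefix code from symbol probabilities by repeatedly
    # merging the two least probable subtrees.
    import heapq
    from itertools import count

    def huffman(probs):
        tie = count()  # tie-breaker so equal probabilities never compare dicts
        heap = [(p, next(tie), {s: ""}) for s, p in probs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (p1 + p2, next(tie), merged))
        return heap[0][2]

    code = huffman({"a": 0.45, "b": 0.25, "c": 0.15, "d": 0.10, "e": 0.05})
    for symbol, word in sorted(code.items()):
        print(symbol, "->", word)
    ```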

  13. NR-code: Nonlinear reconstruction code

    Science.gov (United States)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  14. Code of ethics and conduct for European nursing.

    Science.gov (United States)

    Sasso, Loredana; Stievano, Alessandro; González Jurado, Máximo; Rocco, Gennaro

    2008-11-01

    A main identifying factor of professions is professionals' willingness to comply with ethical and professional standards, often defined in a code of ethics and conduct. In a period of intense nursing mobility, if the public are aware that health professionals have committed themselves to the drawing up of a code of ethics and conduct, they will have more trust in the health professional they choose, especially if this person comes from another European Member State. The Code of Ethics and Conduct for European Nursing is a programmatic document for the nursing profession constructed by the FEPI (European Federation of Nursing Regulators) according to Directive 2005/36/EC On recognition of professional qualifications, and Directive 2006/123/EC On services in the internal market, set out by the European Commission. This article describes the construction of the Code and gives an overview of some specific areas of importance. The main text of the Code is reproduced in Appendix 1.

  15. APC-II: an electron beam propagation code

    International Nuclear Information System (INIS)

    Iwan, D.C.; Freeman, J.R.

    1984-05-01

    The computer code APC-II simulates the propagation of a relativistic electron beam through air. APC-II is an updated version of the APC envelope model code. It incorporates an improved conductivity model which significantly extends the range of stable calculations. A number of test cases show that these new models are capable of reproducing the simulations of the original APC code. As the result of a major restructuring and reprogramming of the code, APC-II is now friendly to both the occasional user and the experienced user who wishes to make modifications. Most of the code is in standard ANSI Fortran 77 so that it can be easily transported between machines.

  16. Reproducible analyses of microbial food for advanced life support systems

    Science.gov (United States)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating the overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses was required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  17. Reproducibility of Mammography Units, Film Processing and Quality Imaging

    International Nuclear Information System (INIS)

    Gaona, Enrique

    2003-01-01

    The purpose of this study was to carry out an exploratory survey of the problems of quality control in mammography units and film processors, as a diagnosis of the current situation of mammography facilities. Measurements of reproducibility, optical density, optical difference and gamma index are included. Breast cancer is the most frequently diagnosed cancer and is the second leading cause of cancer death among women in the Mexican Republic. Mammography is a radiographic examination specially designed for detecting breast pathology. We found that the problems of reproducibility of the AEC are smaller than the problems of the film processors, because almost all processors fall outside the acceptable variation limits, and this can affect mammography image quality and the dose to the breast. Only four mammography units met the minimum score established by the ACR and FDA for the phantom image.

  18. Reproducibility of CT bone dosimetry: Operator versus automated ROI definition

    International Nuclear Information System (INIS)

    Louis, O.; Luypaert, R.; Osteaux, M.; Kalender, W.

    1988-01-01

    Intrasubject reproducibility with repeated determination of vertebral mineral density from a given set of CT images was investigated. The region of interest (ROI) in 10 patient scans was selected by four independent operators either manually or with an automated procedure separating cortical and spongeous bone, the operators being requested to interact in ROI selection. The mean intrasubject variation was found to be much lower with the automated process (0.3 to 0.6%) than with the conventional method (2.5 to 5.2%). In a second study, 10 patients were examined twice to determine the reproducibility of CT slice selection by the operator. The errors were of the same order of magnitude as in ROI selection. (orig.)

  19. Timbral aspects of reproduced sound in small rooms. I

    DEFF Research Database (Denmark)

    Bech, Søren

    1995-01-01

    This paper reports some of the influences of individual reflections on the timbre of reproduced sound. A single loudspeaker with frequency-independent directivity characteristics, positioned in a listening room of normal size with frequency-independent absorption coefficients of the room surfaces, has been simulated using an electroacoustic setup. The model included the direct sound, 17 individual reflections, and the reverberant field. The threshold of detection and just-noticeable differences for an increase in level were measured for individual reflections using eight subjects for noise and speech. The results have shown that the first-order floor and ceiling reflections are likely to individually contribute to the timbre of reproduced speech. For a noise signal, additional reflections from the left sidewall will contribute individually. The level of the reverberant field has been found...

  20. Transition questions in clinical practice - validity and reproducibility

    DEFF Research Database (Denmark)

    Lauridsen, Henrik Hein

    2008-01-01

    Understanding a change score is indispensable for interpretation of results from clinical studies. ... Transition questions (TQs) are reproducible in patients with low back pain and/or leg pain. Despite critique of several biases, our results have reinforced the construct validity of TQs as an outcome measure, since only one hypothesis was rejected. On the basis of our findings we have outlined a proposal for a standardised use of transition...

  1. LHC Orbit Correction Reproducibility and Related Machine Protection

    CERN Document Server

    Baer, T; Schmidt, R; Wenninger, J

    2012-01-01

    The Large Hadron Collider (LHC) has an unprecedented nominal stored beam energy of up to 362 MJ per beam. In order to ensure an adequate machine protection by the collimation system, a high reproducibility of the beam position at collimators and special elements like the final focus quadrupoles is essential. This is realized by a combination of manual orbit corrections, feed forward and real time feedback. In order to protect the LHC against inconsistent orbit corrections, which could put the machine in a vulnerable state, a novel software-based interlock system for orbit corrector currents was developed. In this paper, the principle of the new interlock system is described and the reproducibility of the LHC orbit correction is discussed against the background of this system.

  2. Towards reproducibility of research by reuse of IT best practices

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Reproducibility of any research gives much higher credibility both to research results and to the researchers. This is true for any kind of research, including computer science, where a lot of tools and approaches have been developed to ensure reproducibility. In this talk I will focus on basic and seemingly simple principles, which sometimes look too obvious to follow, but help researchers build beautiful and reliable systems that produce consistent, measurable results. My talk will cover, among other things, the problem of embedding machine learning techniques into an analysis strategy. I will also speak about the most common pitfalls in this process and how to avoid them. In addition, I will demonstrate the research environment based on the principles that I will have outlined. About the speaker: Andrey Ustyuzhanin (36) is Head of the CERN partnership program at Yandex. He is involved in the development of the event indexing and event filtering services which Yandex has been providing for the LHCb experiment sinc...

  3. Reproducing ten years of road ageing - Accelerated carbonation and leaching of EAF steel slag

    Energy Technology Data Exchange (ETDEWEB)

    Suer, Pascal, E-mail: pascal.suer@swedgeo.se [Swedish Geotechnical Institute, Linkoeping (Sweden); Lindqvist, Jan-Erik [Swedish Cement and Concrete Research Institute, Boras (Sweden); Arm, Maria; Frogner-Kockum, Paul [Swedish Geotechnical Institute, Linkoeping (Sweden)

    2009-09-01

    Reuse of industrial aggregates is still hindered by concern for their long-term properties. This paper proposes a laboratory method for accelerated ageing of steel slag, to predict environmental and technical properties, starting from fresh slag. Ageing processes in a 10-year-old asphalt road with steel slag of electric arc furnace (EAF) type in the subbase were identified by scanning electron microscopy (SEM) and leaching tests. Samples from the road centre and the pavement edge were compared with each other and with samples of fresh slag. It was found that slag from the pavement edge showed traces of carbonation and leaching processes, whereas the road centre material was nearly identical to fresh slag, in spite of an accessible particle structure. Batches of moisturized road centre material exposed to oxygen, nitrogen or carbon dioxide (CO₂) were used for accelerated ageing. Time (7-14 days), temperature (20-40 °C) and initial slag moisture content (8-20%) were varied to achieve the carbonation (decrease in pH) and leaching that was observed in the pavement edge material. After ageing, water was added to assess leaching of metals and macroelements. 12% moisture, CO₂ and seven days at 40 °C gave the lowest pH value. This also reproduced the observed ageing effect for Ca, Cu, Ba, Fe, Mn, Pb (decreased leaching) and for V, Si, and Al (increased leaching). However, ageing effects on SO₄, DOC and Cr were not reproduced.

  4. Reproducibility of Computer-Aided Detection Marks in Digital Mammography

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Kim, Sun Mi; Im, Jung Gi; Cha, Joo Hee

    2007-01-01

    To evaluate the performance and reproducibility of a computer-aided detection (CAD) system in mediolateral oblique (MLO) digital mammograms taken serially, without release of breast compression. A CAD system was applied preoperatively to full-field digital mammograms of two MLO views taken without release of breast compression in 82 patients (age range: 33-83 years; mean age: 49 years) with previously diagnosed breast cancers. The total number of visible lesion components in the 82 patients was 101: 66 masses and 35 microcalcifications. We analyzed the sensitivity and reproducibility of the CAD marks. The sensitivity of the CAD system for first MLO views was 71% (47/66) for masses and 80% (28/35) for microcalcifications. The sensitivity of the CAD system for second MLO views was 68% (45/66) for masses and 17% (6/35) for microcalcifications. In 84 ipsilateral serial MLO image sets (two patients had bilateral cancers), identical images, regardless of the existence of CAD marks, were obtained for 35% (29/84) and identical images with CAD marks were obtained for 29% (23/78). Identical images, regardless of the existence of CAD marks, for contralateral MLO images were 65% (52/80) and identical images with CAD marks were obtained for 28% (11/39). The reproducibility of CAD marks for the true positive masses in serial MLO views was 84% (42/50) and that for the true positive microcalcifications was 0% (0/34). The CAD system in digital mammograms showed a high sensitivity for detecting masses and microcalcifications. However, the reproducibility of microcalcification marks was very low in MLO views taken serially without release of breast compression. Minute positional changes and patient movement can alter the images and have a significant effect on the algorithm utilized by the CAD for detecting microcalcifications.

  5. The reproducibility of single photon absorptiometry in a clinical setting

    International Nuclear Information System (INIS)

    Valkema, R.; Blokland, J.A.K.; Pauwels, E.K.J.; Papapoulos, S.E.; Bijvoet, O.L.M.

    1989-01-01

    The reproducibility of single photon absorptiometry (SPA) results for detection of changes in bone mineral content (BMC) was evaluated in a clinical setting. During a period of 18 months with 4 different sources, the calibration scans of an aluminium standard had a variation of less than 1% unless the activity of the ¹²⁵I source was low. The calibration procedure was performed weekly and this was sufficient to correct for drift of the system. The short-term reproducibility in patients was assessed with 119 duplicate measurements made in direct succession. The best reproducibility (CV=1.35%) was found for fat-corrected BMC results expressed in g/cm, obtained at the site proximal to the 8 mm space between the radius and ulna. Analysis of all SPA scans made during 1 year (487 scans) showed a failure of the automatic procedure to detect the 8 mm space between the forearm bones in 19 scans (3.9%). A space adjacent to the ulnar styloid was taken as the site for the first scan in these examinations. This problem may be recognized and corrected relatively easily. A significant correlation was found between BMC at the lower arm and BMC of the lumbar spine assessed with dual photon absorptiometry. However, the error of estimation of proximal BMC (SEE=20%) and distal BMC (SEE=19.4%) made these measurements of little value for predicting BMC at the lumbar spine in individuals. The short-term reproducibility in patients combined with the long-term stability of the equipment in our clinical setting showed that SPA is a reliable technique to assess changes in bone mass at the lower arm of 4% between 2 measurements with a confidence level of 95%. (orig.)

  6. Towards reproducible MSMS data preprocessing, quality control and quantification

    OpenAIRE

    Gatto, Laurent; Lilley, Kathryn S.

    2010-01-01

    The development of MSnbase aims at providing researchers dealing with labelled quantitative proteomics data with a transparent, portable, extensible and open-source collaborative framework to easily manipulate and analyse MS2-level raw tandem mass spectrometry data. The implementation in R gives users and developers a great variety of powerful tools to be used in a controlled and reproducible way. Furthermore, MSnbase has been developed following an object-oriented programming paradigm: all i...

  7. Cuban strategy for reproducing, preserving and developing nuclear knowledge

    International Nuclear Information System (INIS)

    Elias Hardy, L.L.; Guzman Martinez, F.; Rodriguez Hoyos, O.E.; Lopez Nunez, A.F.

    2006-01-01

    One of the problems in a changing world is the preservation of knowledge for the next human generation, and nuclear knowledge is not an exception. Cuba has worked on reproducing, preserving, developing and capturing nuclear knowledge, mainly through a higher education centre, the Higher Institute of Nuclear Sciences and Technologies. This institute is a component of a national network for the preparation of manpower not only for nuclear activities but also for environmental and managerial activities. (author)

  8. Regulating Ultrasound Cavitation in order to Induce Reproducible Sonoporation

    Science.gov (United States)

    Mestas, J.-L.; Alberti, L.; El Maalouf, J.; Béra, J.-C.; Gilles, B.

    2010-03-01

    Sonoporation is thought to be linked to cavitation, which generally appears to be a non-reproducible and unstationary phenomenon. In order to obtain an acceptable trade-off between cell mortality and transfection, a regulated cavitation generator based on an acoustical cavitation measurement was developed and tested. The medium to be sonicated is placed in a sample tray. This tray is immersed in degassed water and positioned above the face of a flat ultrasonic transducer (frequency: 445 kHz; intensity range: 0.08-1.09 W/cm2). This configuration is conducive to standing-wave generation through reflection at the air/medium interface in the well, thus enhancing the cavitation phenomenon. Laterally to the transducer, a homemade hydrophone was oriented to receive the acoustical signal from the bubbles. From this spectral signal, recorded at intervals of 5 ms, a cavitation index was calculated as the mean of the cavitation spectrum integrated on a logarithmic scale, and the excitation power is automatically corrected. The device generates a stable and reproducible cavitation level for a wide range of cavitation setpoints, from stable cavitation up to fully developed inertial cavitation. For the ultrasound intensity range used, the time delay of the response is lower than 200 ms. The cavitation regulation device was evaluated in terms of the chemical bubble-collapse effect: hydroxyl radical production was measured on terephthalic acid solutions. In open loop, the results show great variability whatever the excitation power; in closed loop, by contrast, reproducibility is high. This device was implemented for the study of sonodynamic effects. The regulation provides more reproducible results independent of cell medium and experimental conditions (temperature, pressure). Other applications of this regulated cavitation device concern the internalization of particles (quantum dots), molecules (siRNA) or plasmids (GFP, DsRed) into different
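
    A toy sketch of the regulation loop described above, with a proportional correction and an entirely made-up "plant" standing in for the acoustic physics; the gains, ranges, and index definition are illustrative, not the authors' controller.

    ```python
    import numpy as np

    def cavitation_index(spectrum):
        """Mean of the log-scaled hydrophone spectrum (broadband level, dB)."""
        return float(np.mean(10.0 * np.log10(spectrum + 1e-12)))

    def regulate(power, index, setpoint, gain=0.02, p_min=0.08, p_max=1.09):
        """One proportional correction, clipped to the transducer range."""
        return float(np.clip(power + gain * (setpoint - index), p_min, p_max))

    rng = np.random.default_rng(1)
    power, setpoint = 0.3, -20.0                 # W/cm^2, dB (made-up values)
    for step in range(5):
        # Fake plant: broadband level rises with excitation power, plus noise.
        spectrum = rng.lognormal(mean=power * 8.0 - 8.0, sigma=0.3, size=256)
        idx = cavitation_index(spectrum)
        power = regulate(power, idx, setpoint)
        print(f"step {step}: index = {idx:6.2f} dB, power -> {power:.3f} W/cm^2")
    ```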

  9. Serous tubal intraepithelial carcinoma: diagnostic reproducibility and its implications.

    Science.gov (United States)

    Carlson, Joseph W; Jarboe, Elke A; Kindelberger, David; Nucci, Marisa R; Hirsch, Michelle S; Crum, Christopher P

    2010-07-01

    Serous tubal intraepithelial carcinoma (STIC) is detected in between 5% and 7% of women undergoing risk-reduction salpingo-oophorectomy for mutations in the BRCA1 or 2 genes (BRCA+), and seems to play a role in the pathogenesis of many ovarian and "primary peritoneal" serous carcinomas. The recognition of STIC is germane to the management of BRCA+ women; however, the diagnostic reproducibility of STIC is unknown. Twenty-one cases were selected and classified as STIC or benign, using both hematoxylin and eosin and immunohistochemical stains for p53 and MIB-1. Digital images of 30 hematoxylin and eosin-stained STICs (n=14) or benign tubal epithelia (n=16) were photographed and randomized for blind digital review in a PowerPoint format by 6 experienced gynecologic pathologists and 6 pathology trainees. A generalized kappa statistic for multiple raters was calculated for all groups. For all reviewers, the kappa was 0.333, indicating poor reproducibility; kappa was 0.453 for the experienced gynecologic pathologists (fair-to-good reproducibility), and kappa was 0.253 for the pathology residents (poor reproducibility). In the experienced group, 3 of 14 STICs were diagnosed by all 6 reviewers, and 9 of 14 by a majority of the reviewers. These results show that interobserver concordance in the recognition of STIC in high-quality digital images is at best fair-to-good even for experienced gynecologic pathologists, and a proportion cannot be consistently identified even among experienced observers. In view of these findings, a diagnosis of STIC should be corroborated by a second pathologist, if feasible.
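
    The generalized kappa for multiple raters used in this study can be computed with statsmodels; the ratings below are invented (1 = STIC, 0 = benign) and the toy set is far smaller than the study's 30 images by 12 reviewers.

    ```python
    import numpy as np
    from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

    # rows = images, columns = raters (toy data: 6 images x 4 raters)
    ratings = np.array([
        [1, 1, 1, 1],
        [1, 0, 1, 0],
        [0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 1, 0],
        [0, 0, 0, 0],
    ])
    table, _ = aggregate_raters(ratings)  # subjects x categories count table
    print("Fleiss kappa:", round(fleiss_kappa(table), 3))
    ```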

  10. Adaptive Learning in Cartesian Product of Reproducing Kernel Hilbert Spaces

    OpenAIRE

    Yukawa, Masahiro

    2014-01-01

    We propose a novel adaptive learning algorithm based on iterative orthogonal projections in the Cartesian product of multiple reproducing kernel Hilbert spaces (RKHSs). The task is estimating/tracking nonlinear functions that are supposed to contain multiple components such as (i) linear and nonlinear components, or (ii) high- and low-frequency components. In this case, the use of multiple RKHSs permits a compact representation of multicomponent functions. The proposed algorithm is where t...

  11. Reproducibility Test for Thermoluminescence Dosimeter (TLD) Using TLD Radpro

    International Nuclear Information System (INIS)

    Nur Khairunisa Zahidi; Ahmad Bazlie Abdul Kadir; Faizal Azrin Abdul Razalim

    2016-01-01

    Thermoluminescence dosimeters (TLDs) are one type of dosimeter often used as a substitute for the film badge. Like a film badge, a TLD is worn for a period of time and then must be processed to determine the dose received. This study tested the reproducibility of TLDs using the Radpro reader. It aimed to determine the dose registered by TLD-100 chips when irradiated with a Co-60 gamma source and to test the effectiveness of the TLD Radpro reader as a machine for analysing TLDs. Ten TLD-100 chips were irradiated using an Eldorado machine with a Co-60 source at a distance of 5 meters from the source, with a 2 mSv dose exposure. After the irradiation process, the TLD-100 chips were read using the TLD Radpro reader. These steps were repeated nine times to obtain the reproducibility coefficient, r_i. The dose readings obtained from the experiment were almost equivalent to the actual dose. Results show that the average value obtained for the reproducibility coefficient, r_i, is 6.39%, which is less than 10%. In conclusion, the dose obtained from the experiment is considered accurate because its value was almost equivalent to the actual dose, and the TLD Radpro was verified as a good reader for analysing TLDs. (author)

  12. Reproducibility of gene expression across generations of Affymetrix microarrays

    Directory of Open Access Journals (Sweden)

    Haslett Judith N

    2003-06-01

    Full Text Available Abstract Background The development of large-scale gene expression profiling technologies is rapidly changing the norms of biological investigation. But the rapid pace of change itself presents challenges. Commercial microarrays are regularly modified to incorporate new genes and improved target sequences. Although the ability to compare datasets across generations is crucial for any long-term research project, to date no means to allow such comparisons have been developed. In this study the reproducibility of gene expression levels across two generations of Affymetrix GeneChips® (HuGeneFL and HG-U95A) was measured. Results Correlation coefficients were computed for gene expression values across chip generations based on different measures of similarity. Comparing the absolute calls assigned to the individual probe sets across the generations found them to be largely unchanged. Conclusion We show that experimental replicates are highly reproducible, but that reproducibility across generations depends on the degree of similarity of the probe sets and the expression level of the corresponding transcript.

  13. Reproducibility of the Portuguese version of the PEDro Scale

    Directory of Open Access Journals (Sweden)

    Silvia Regina Shiwa

    2011-10-01

    Full Text Available The objective of this study was to test the inter-rater reproducibility of the Portuguese version of the PEDro Scale. Seven physiotherapists rated the methodological quality of 50 reports of randomized controlled trials written in Portuguese and indexed in the PEDro database. Each report was also rated using the English version of the PEDro Scale. Reproducibility was evaluated by comparing two separate ratings of reports written in Portuguese and comparing the Portuguese PEDro score with the English version of the scale. Kappa coefficients ranged from 0.53 to 1.00 for individual items, and an intraclass correlation coefficient (ICC) of 0.82 was observed for the total PEDro score. The standard error of measurement of the scale was 0.58. The Portuguese version of the scale was comparable with the English version, with an ICC of 0.78. The inter-rater reproducibility of the Brazilian Portuguese PEDro Scale is adequate and similar to that of the original English version.

  14. Can cancer researchers accurately judge whether preclinical reports will reproduce?

    Directory of Open Access Journals (Sweden)

    Daniel Benjamin

    2017-06-01

    Full Text Available There is vigorous debate about the reproducibility of research findings in cancer biology. Whether scientists can accurately assess which experiments will reproduce original findings is important to determining the pace at which science self-corrects. We collected forecasts from basic and preclinical cancer researchers on the first 6 replication studies conducted by the Reproducibility Project: Cancer Biology (RP:CB) to assess the accuracy of expert judgments on specific replication outcomes. On average, researchers forecasted a 75% probability of replicating the statistical significance and a 50% probability of replicating the effect size, yet none of these studies successfully replicated on either criterion (for the 5 studies with results reported). Accuracy was related to expertise: experts with higher h-indices were more accurate, whereas experts with more topic-specific expertise were less accurate. Our findings suggest that experts, especially those with specialized knowledge, were overconfident about the RP:CB replicating individual experiments within published reports; researcher optimism likely reflects a combination of overestimating the validity of original studies and underestimating the difficulties of repeating their methodologies.

  15. On the origin of reproducible sequential activity in neural circuits

    Science.gov (United States)

    Afraimovich, V. S.; Zhigulin, V. P.; Rabinovich, M. I.

    2004-12-01

    Robustness and reproducibility of sequential spatio-temporal responses is an essential feature of many neural circuits in sensory and motor systems of animals. The most common mathematical images of dynamical regimes in neural systems are fixed points, limit cycles, chaotic attractors, and continuous attractors (attractive manifolds of neutrally stable fixed points). These are not suitable for the description of reproducible transient sequential neural dynamics. In this paper we present the concept of a stable heteroclinic sequence (SHS), which is not an attractor. SHS opens the way for understanding and modeling of transient sequential activity in neural circuits. We show that this new mathematical object can be used to describe robust and reproducible sequential neural dynamics. Using the framework of a generalized high-dimensional Lotka-Volterra model, that describes the dynamics of firing rates in an inhibitory network, we present analytical results on the existence of the SHS in the phase space of the network. With the help of numerical simulations we confirm its robustness in presence of noise in spite of the transient nature of the corresponding trajectories. Finally, by referring to several recent neurobiological experiments, we discuss possible applications of this new concept to several problems in neuroscience.
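
    A minimal numerical illustration of this framework: a three-unit generalized Lotka-Volterra network with asymmetric (May-Leonard-type) inhibition, whose firing rates visit the saddle points in a reproducible order. The parameters are chosen for illustration only and are not taken from the paper.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    sigma = np.ones(3)                      # growth rates
    rho = np.array([[1.0, 1.8, 0.6],        # asymmetric inhibition matrix
                    [0.6, 1.0, 1.8],        # (0 < beta < 1 < alpha, alpha+beta > 2)
                    [1.8, 0.6, 1.0]])

    def glv(t, x):
        return x * (sigma - rho @ x)        # generalized Lotka-Volterra rates

    sol = solve_ivp(glv, (0.0, 200.0), [0.9, 0.05, 0.05], max_step=0.1)
    winners = np.argmax(sol.y, axis=0)      # dominant unit at each time step

    seq = [int(winners[0])]                 # compress to the visiting sequence
    for w in winners[1:]:
        if w != seq[-1]:
            seq.append(int(w))
    print("sequence of dominant units:", seq[:10])
    ```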

  16. Aveiro method in reproducing kernel Hilbert spaces under complete dictionary

    Science.gov (United States)

    Mai, Weixiong; Qian, Tao

    2017-12-01

    The Aveiro Method is a sparse representation method in reproducing kernel Hilbert spaces (RKHS) that gives orthogonal projections in linear combinations of reproducing kernels over uniqueness sets. It, however, suffers from the determination of uniqueness sets in the underlying RKHS. In fact, in general spaces, uniqueness sets are not easy to identify, let alone the convergence-speed aspect of the Aveiro Method. To avoid those difficulties we propose a new Aveiro Method based on a dictionary and the matching pursuit idea. What we do is, in fact, more: the new Aveiro Method is related to the recently proposed Pre-Orthogonal Greedy Algorithm (P-OGA), involving completion of a given dictionary. The new method is called the Aveiro Method Under Complete Dictionary (AMUCD). The complete dictionary consists of all directional derivatives of the underlying reproducing kernels. We show that, under the boundary vanishing condition, available for the classical Hardy and Paley-Wiener spaces, the complete dictionary enables an efficient expansion of any given element in the Hilbert space. The proposed method reveals new and advanced aspects of both the Aveiro Method and the greedy algorithm.

  17. Validity and reproducibility of a Spanish dietary history.

    Directory of Open Access Journals (Sweden)

    Pilar Guallar-Castillón

    Full Text Available To assess the validity and reproducibility of food and nutrient intake estimated with the electronic diet history of ENRICA (DH-E), which collects information on numerous aspects of the Spanish diet. The validity of food and nutrient intake was estimated using Pearson correlation coefficients between the DH-E and the mean of seven 24-hour recalls collected every 2 months over the previous year. The reproducibility was estimated using intraclass correlation coefficients between two DH-E made one year apart. The correlation coefficients between the DH-E and the mean of seven 24-hour recalls for the main food groups were cereals (r = 0.66), meat (r = 0.66), fish (r = 0.42), vegetables (r = 0.62) and fruits (r = 0.44). The mean correlation coefficient for all 15 food groups considered was 0.53. The correlations for macronutrients were: energy (r = 0.76), proteins (r = 0.58), lipids (r = 0.73), saturated fat (r = 0.73), monounsaturated fat (r = 0.59), polyunsaturated fat (r = 0.57), and carbohydrates (r = 0.66). The mean correlation coefficient for all 41 nutrients studied was 0.55. The intraclass correlation coefficient between the two DH-E was greater than 0.40 for most foods and nutrients. The DH-E shows good validity and reproducibility for estimating usual intake of foods and nutrients.

  18. The Reproducibility of Nuclear Morphometric Measurements in Invasive Breast Carcinoma

    Directory of Open Access Journals (Sweden)

    Pauliina Kronqvist

    1997-01-01

    Full Text Available The intraobserver and interobserver reproducibility of computerized nuclear morphometry was determined in repeated measurements of 212 samples of invasive breast cancer. The influence of biological variation and the selection of the measurement area was also tested. Morphometrically determined mean nuclear profile area (Pearson's r 0.89, grading efficiency (GE) 0.95) and standard deviation (SD) of nuclear profile area (Pearson's r 0.84, GE 0.89) showed high reproducibility. In this respect, nuclear morphometry equals other established methods of quantitative pathology and exceeds the results of subjective grading of nuclear atypia in invasive breast cancer. A training period of eight days was sufficient to produce clear improvement in the consistency of nuclear morphometry results. By estimating the sources of variation it could be shown that the variation associated with the measurement procedure itself is small. Instead, sample-associated variation is responsible for the majority of variation in the measurements (82.9% in mean nuclear profile area and 65.9% in SD of nuclear profile area). This study points out that, when standardized methods are applied, computerized morphometry is a reproducible and reliable method of assessing nuclear atypia in invasive breast cancer. For further improvement, special emphasis should be placed on the rules for selecting the microscope fields and measurement areas.

  19. Trace elements in brazilian soils

    International Nuclear Information System (INIS)

    Rocha, Geraldo Cesar

    1995-01-01

    A literature review of trace elements (Zn, B, Mn, Mo, Cu, Fe, and Cl) in Brazilian soils was prepared, with special attention to their chemical forms and ranges in the soil, extraction methods, and the correlation of amounts in soils with soil properties.

  20. Digital Traces of Information Systems

    DEFF Research Database (Denmark)

    Hedman, Jonas; Srinivasan, Nikhil; Lindgren, Rikard

    2013-01-01

    ... This disconcerting result suggests that IS researchers must pay more attention to the changing landscape of data sources. To motivate and guide fellow colleagues to establish the credibility and reliability of digital traces, we develop a future research agenda that covers both opportunities in theory generation...

  1. Inter-examiner reproducibility of tests for lumbar motor control

    Directory of Open Access Journals (Sweden)

    Elkjaer Arne

    2011-05-01

    Full Text Available Abstract Background Many studies show a relation between reduced lumbar motor control (LMC) and low back pain (LBP). However, test circumstances vary, and during test performance subjects may change position. In other words, the reliability - i.e. reproducibility and validity - of tests for LMC should be based on quantitative data. This has not been considered before. The aim was to analyse the reproducibility of five different quantitative tests for LMC commonly used in daily clinical practice. Methods The five tests for LMC were: repositioning (RPS), sitting forward lean (SFL), sitting knee extension (SKE), and bent knee fall out (BKFO), all measured in cm, and leg lowering (LL), measured in mm Hg. A total of 40 subjects (14 males, 26 females; 25 with and 15 without LBP), with a mean age of 46.5 years (SD 14.8), were examined independently and in random order by two examiners on the same day. LBP subjects were recruited from three physiotherapy clinics with a connection to the clinic's gym or back-school. Non-LBP subjects were recruited from the clinic's staff acquaintances, and from patients without LBP. Results The means and standard deviations for each of the tests were 0.36 (0.27) cm for RPS, 1.01 (0.62) cm for SFL, 0.40 (0.29) cm for SKE, 1.07 (0.52) cm for BKFO, and 32.9 (7.1) mm Hg for LL. All five tests for LMC had high reproducibility, with the following ICCs: 0.90 for RPS, 0.96 for SFL, 0.96 for SKE, 0.94 for BKFO, and 0.98 for LL. Bland and Altman plots showed that most of the differences between examiners A and B were less than 0.20 cm. Conclusion These five tests for LMC displayed excellent reproducibility. However, the diagnostic accuracy of these tests needs to be addressed in larger cohorts of subjects, establishing values for the normal population. Also cut-points between subjects with and without LBP must be determined, taking into account age, level of activity, degree of impairment and participation in sports. Whether reproducibility of these...
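
    The intraclass correlation coefficients reported above can be computed from a two-way ANOVA decomposition of an n-subjects-by-k-examiners score matrix. A minimal Python sketch of ICC(2,1) (two-way random effects, absolute agreement, single measures; the record does not state which ICC variant the authors used, so this is illustrative):

        import numpy as np

        def icc_2_1(x):
            """Two-way random, absolute agreement, single measures: ICC(2,1)."""
            n, k = x.shape
            grand = x.mean()
            ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
            ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # examiners
            resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0) + grand
            ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (
                ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

        # hypothetical RPS scores (cm): 4 subjects measured by 2 examiners
        scores = np.array([[0.31, 0.35], [0.62, 0.58], [0.44, 0.47], [0.20, 0.24]])
        print(f"ICC(2,1) = {icc_2_1(scores):.2f}")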

  2. Cervical vertebrae maturation method morphologic criteria: poor reproducibility.

    Science.gov (United States)

    Nestman, Trenton S; Marshall, Steven D; Qian, Fang; Holton, Nathan; Franciscus, Robert G; Southard, Thomas E

    2011-08-01

    The cervical vertebrae maturation (CVM) method has been advocated as a predictor of peak mandibular growth. A careful review of the literature showed potential methodologic errors that might influence the high reported reproducibility of the CVM method, and we recently established that the reproducibility of the CVM method was poor when these potential errors were eliminated. The purpose of this study was to further investigate the reproducibility of the individual vertebral patterns. In other words, the purpose was to determine which of the individual CVM vertebral patterns could be classified reliably and which could not. Ten practicing orthodontists, trained in the CVM method, evaluated the morphology of cervical vertebrae C2 through C4 from 30 cephalometric radiographs using questions based on the CVM method. The Fleiss kappa statistic was used to assess interobserver agreement when evaluating each cervical vertebrae morphology question for each subject. The Kendall coefficient of concordance was used to assess the level of interobserver agreement when determining a "derived CVM stage" for each subject. Interobserver agreement was high for assessment of the lower borders of C2, C3, and C4 that were either flat or curved in the CVM method, but interobserver agreement was low for assessment of the vertebral bodies of C3 and C4 when they were either trapezoidal, rectangular horizontal, square, or rectangular vertical; this led to the overall poor reproducibility of the CVM method. These findings were reflected in the Fleiss kappa statistic. Furthermore, nearly 30% of the time, individual morphologic criteria could not be combined to generate a final CVM stage because of incompatible responses to the 5 questions. Intraobserver agreement in this study was only 62%, on average, when the inconclusive stagings were excluded as disagreements. Intraobserver agreement was worse (44%) when the inconclusive stagings were included as disagreements. For the group of subjects...
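
    The Fleiss kappa statistic used above generalizes Cohen's kappa to a fixed number of raters. A minimal sketch for an n-subjects-by-m-categories count matrix, with illustrative numbers rather than data from the study:

        import numpy as np

        def fleiss_kappa(counts):
            """counts[i, j] = number of raters assigning subject i to category j."""
            n_raters = counts.sum(axis=1)[0]           # assumed equal for all subjects
            p_j = counts.sum(axis=0) / counts.sum()    # overall category proportions
            p_i = ((counts ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
            p_bar, p_e = p_i.mean(), (p_j ** 2).sum()  # observed vs. chance agreement
            return (p_bar - p_e) / (1 - p_e)

        # 4 radiographs rated by 10 orthodontists into 3 morphology categories
        ratings = np.array([[8, 1, 1], [2, 6, 2], [0, 3, 7], [9, 1, 0]])
        print(f"Fleiss kappa = {fleiss_kappa(ratings):.2f}")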

  3. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate, simultaneously and from a high-level specification, the code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  4. Code of Ethics

    Science.gov (United States)

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  5. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

    Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low-weight codewords, which gives a further advantage in the error floor region.
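
    For illustration, the simplest interleaver of the kind inserted between component codes is a row-column block interleaver; the designs evaluated in the paper are presumably more elaborate, but the principle is the same:

        def block_interleave(bits, rows, cols):
            """Write row-wise, read column-wise; spreads burst errors across codewords."""
            assert len(bits) == rows * cols
            return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

        def block_deinterleave(bits, rows, cols):
            return block_interleave(bits, cols, rows)  # the inverse swaps dimensions

        data = list(range(12))
        tx = block_interleave(data, rows=3, cols=4)
        assert block_deinterleave(tx, rows=3, cols=4) == data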

  6. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  7. Error Correcting Codes

    Indian Academy of Sciences (India)

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  8. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Lene; Pries-Heje, Jan; Dalgaard, Bente

    2013-01-01

    ... is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  9. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  10. Error Correcting Codes

    Indian Academy of Sciences (India)

    Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 3, March. Author affiliation: Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India.

  11. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... International Energy Conservation Code. International Existing Building Code. International Fire Code. International ... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code. International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  12. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to incorrect estimation of the consequences of accidents, and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. "agreement is within 10%". A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)
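
    A minimal sketch of such a quantitative comparison, using the root-mean-square relative deviation between prediction and experiment as the figure of merit (an illustrative choice; the specific statistic proposed by the author is in the paper):

        import numpy as np

        def rms_relative_deviation(predicted, measured):
            """RMS of (prediction - experiment) / experiment, usable as a ranking metric."""
            rel = (np.asarray(predicted) - np.asarray(measured)) / np.asarray(measured)
            return float(np.sqrt(np.mean(rel ** 2)))

        measured = [1.00, 2.10, 3.05, 4.20]                    # hypothetical test data
        candidates = {"codeA": [1.05, 2.00, 3.20, 4.10],       # hypothetical predictions
                      "codeB": [0.90, 2.40, 2.80, 4.60]}
        for name, pred in sorted(candidates.items(),
                                 key=lambda kv: rms_relative_deviation(kv[1], measured)):
            print(name, f"{rms_relative_deviation(pred, measured):.3f}")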

  13. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex and cannot be fully verified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  14. Huffman coding in advanced audio coding standard

    Science.gov (United States)

    Brzuchalski, Grzegorz

    2012-05-01

    This article presents several hardware architectures of the Advanced Audio Coding (AAC) Huffman noiseless encoder, its optimisations, and a working implementation. Much attention has been paid to optimising the demand for hardware resources, especially memory size. The aim of the design was to produce as short a binary stream as possible in this standard. The Huffman encoder, together with the whole audio-video system, has been implemented in FPGA devices.
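
    The noiseless coding stage itself is the classic Huffman procedure. A minimal software sketch of codebook construction and encoding, for orientation only: the paper's contribution is mapping this onto FPGA hardware, and AAC uses predefined codebooks rather than ones built from the data as here:

        import heapq
        from collections import Counter

        def huffman_code(symbols):
            """Build a prefix code from symbol frequencies (classic Huffman)."""
            heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(Counter(symbols).items())]
            heapq.heapify(heap)
            tie = len(heap)                      # tiebreaker keeps comparisons on numbers
            while len(heap) > 1:
                w1, _, c1 = heapq.heappop(heap)  # two least-frequent subtrees
                w2, _, c2 = heapq.heappop(heap)
                merged = {s: "0" + b for s, b in c1.items()}
                merged.update({s: "1" + b for s, b in c2.items()})
                heapq.heappush(heap, [w1 + w2, tie, merged])
                tie += 1
            return heap[0][2]

        code = huffman_code("abracadabra")
        stream = "".join(code[s] for s in "abracadabra")
        print(code, len(stream), "bits")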

  15. TRACK The New Beam Dynamics Code

    CERN Document Server

    Mustapha, Brahim; Ostroumov, Peter; Schnirman-Lessner, Eliane

    2005-01-01

    The new ray-tracing code TRACK was developed* to fulfill the special requirements of the RIA accelerator systems. The RIA lattice includes an ECR ion source, a LEBT containing a MHB and a RFQ, followed by three SC linac sections separated by two stripping stations with appropriate magnetic transport systems. No available beam dynamics code meets all the necessary requirements for an end-to-end simulation of the RIA driver linac. The latest version of TRACK was used for end-to-end simulations of the RIA driver including errors and beam loss analysis.** In addition to the standard capabilities, the code includes the following new features: i) multiple charge states; ii) a realistic stripper model; iii) static and dynamic errors; iv) automatic steering to correct for misalignments; v) detailed beam-loss analysis; vi) parallel computing to perform large scale simulations. Although primarily developed for simulations of the RIA machine, TRACK is a general beam dynamics code. Currently it is being used for the design and ...

  16. Advanced Presentation of BETHSY 6.2TC Test Results Calculated by RELAP5 and TRACE

    Directory of Open Access Journals (Sweden)

    Andrej Prošek

    2012-01-01

    Full Text Available Today most software applications come with a graphical user interface, including the U.S. Nuclear Regulatory Commission TRAC/RELAP Advanced Computational Engine (TRACE) best-estimate reactor system code. The graphical user interface is called the Symbolic Nuclear Analysis Package (SNAP). The purpose of the present study was to assess the TRACE computer code and to assess the SNAP capabilities for input deck preparation and advanced presentation of the results. The BETHSY 6.2 TC test was selected, which is a 15.24 cm equivalent-diameter horizontal cold leg break. For the calculations, TRACE V5.0 Patch 1 and RELAP5/MOD3.3 Patch 4 were used. The RELAP5 legacy input deck was converted to a TRACE input deck using SNAP. The RELAP5 and TRACE comparison to experimental data showed that the TRACE results are as good as or better than the RELAP5 calculated results. The developed animation masks were of great help in comparing the results and investigating the calculated physical phenomena and processes.

  17. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  18. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  19. Measurement of temporal asymmetries of glucose consumption using linear profiles: reproducibility and comparison with visual analysis

    International Nuclear Information System (INIS)

    Matheja, P.; Kuwert, T.; Schaefers, M.; Schaefers, K.; Schober, O.; Diehl, B.; Stodieck, S.R.G.; Ringelstein, E.B.; Schuierer, G.

    1998-01-01

    The aim of our study was to test the reproducibility of this method and to compare its diagnostic performance to that of visual analysis in patients with complex partial seizures (CPS). Regional cerebral glucose consumption (rCMRGLc) was measured interictally in 25 CPS patients and 10 controls using F-18-deoxyglucose and the positron emission tomography (PET) camera ECAT EXACT 47. The PET scans were visually analyzed for the occurrence of unilateral temporal hypometabolism. Furthermore, rCMRGLc was quantified on six contiguous coronal planes by manually tracing maximal values of temporal glucose consumption, thus creating line profiles of temporal glucose consumption for each side. Indices of asymmetry (ASY) were then calculated from these line profiles in four temporal regions and compared to the corresponding 95% confidence intervals of the control data. All analyses were performed by two observers independently from each other and without knowledge of the clinical findings. The agreement between the two observers with regard to focus lateralization was 96% on visual analysis and 100% on quantitative analysis. There was an excellent agreement with regard to focus lateralization between visual and quantitative evaluation. (orig.)
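
    The record does not give the asymmetry formula; a common convention normalizes the left-right difference by the mean of the two sides, and a sketch under that assumption is:

        import numpy as np

        def asymmetry_index(left_profile, right_profile):
            """Conventional asymmetry index 200*(L - R)/(L + R), in percent, per position."""
            L = np.asarray(left_profile, dtype=float)
            R = np.asarray(right_profile, dtype=float)
            return 200.0 * (L - R) / (L + R)

        left = [31.0, 30.2, 29.8, 28.5]    # hypothetical temporal rCMRGLc line profile, left
        right = [30.8, 29.9, 25.1, 24.0]   # right side, hypometabolic posteriorly
        print(np.round(asymmetry_index(left, right), 1))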

  20. Image Tracing: An Analysis of Its Effectiveness in Children's Pictorial Discrimination Learning

    Science.gov (United States)

    Levin, Joel R.; And Others

    1977-01-01

    A total of 45 fifth grade students were the subjects of an experiment offering support for a component of learning strategy (memory imagery). Various theoretical explanations of the image-tracing phenomenon are considered, including depth of processing, dual coding and frequency. (MS)

  1. Generalised tally-based decoders for traitor tracing and group testing

    NARCIS (Netherlands)

    Skoric, B.; de Groot, W.

    2015-01-01

    We propose a new type of score function for Tardos traitor tracing codes. It is related to the recently introduced tally-based score function, but it utilizes more of the information available to the decoder. It does this by keeping track of sequences of symbols in the distributed codewords instead...
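
    For context, the classic symmetric Tardos score to which such tally-based scores are an alternative accumulates, per code position, a bias-dependent weight according to whether the user's symbol matches the pirated copy. A sketch for the binary case with illustrative bias values (the paper's generalized tally-based score, which tracks symbol sequences, is not reproduced here):

        import numpy as np

        def symmetric_tardos_score(user_codeword, pirate_word, biases):
            """Binary symmetric Tardos score; accuse users whose score exceeds a threshold."""
            x, y, p = map(np.asarray, (user_codeword, pirate_word, biases))
            match = (x == y)
            g_match = np.sqrt(np.where(y == 1, (1 - p) / p, p / (1 - p)))
            return float(np.sum(np.where(match, g_match, -1.0 / g_match)))

        rng = np.random.default_rng(1)
        p = rng.uniform(0.1, 0.9, size=100)        # per-position biases
        x = (rng.random(100) < p).astype(int)      # one user's codeword
        y = x.copy(); y[::7] ^= 1                  # pirate copy, mostly matching this user
        print(f"score = {symmetric_tardos_score(x, y, p):.1f}")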

  2. Assessment of precision and reproducibility of a new myograph

    Directory of Open Access Journals (Sweden)

    Piepenbrock Siegfried

    2007-12-01

    Full Text Available Abstract Background The physiological characteristics of muscle activity and the assessment of muscle strength represent important diagnostic information. There are many devices that measure muscle force in humans, but some require voluntary contractions, which are difficult to assess in weak or unconscious patients who are unable to complete a full range of voluntary force assessment tasks. Other devices, which obtain standard muscle contractions by electric stimulation, do not have the technology required to induce and measure reproducible valid contractions at the optimum muscle length. Methods In our study we used a newly developed diagnostic device which accurately measures the reproducibility and time-dependent variability of the force in an individual muscle. A total of 500 in-vivo measurements of supra-maximal isometric single twitch contractions were carried out on the musculus adductor pollicis of 5 test subjects over 10 sessions, with ten repetitions per session. The same protocol was performed on 405 test subjects with two repetitions each to determine a reference interval on healthy subjects. Results Using our test setting, we found a high reproducibility of the muscle contractions of each test subject. The precision of the measurements performed with our device was 98.74%. Only two consecutive measurements are needed in order to assess a real, representative individual value of muscle force. The mean value of the force of contraction was 9.51 N and the 95% reference interval was 4.77–14.25 N. Conclusion The new myograph is a highly reliable measuring device with which the adductor pollicis can be investigated at the optimum length. It has the potential to become a reliable and valid tool for diagnostics in the clinical setting and for monitoring neuromuscular diseases.

  3. Efficient and reproducible mammalian cell bioprocesses without probes and controllers?

    Science.gov (United States)

    Tissot, Stéphanie; Oberbek, Agata; Reclari, Martino; Dreyer, Matthieu; Hacker, David L; Baldi, Lucia; Farhat, Mohamed; Wurm, Florian M

    2011-07-01

    Bioprocesses for recombinant protein production with mammalian cells are typically controlled for several physicochemical parameters including the pH and dissolved oxygen concentration (DO) of the culture medium. Here we studied whether these controls are necessary for efficient and reproducible bioprocesses in an orbitally shaken bioreactor (OSR). Mixing, gas transfer, and volumetric power consumption (P(V)) were determined in both a 5-L OSR and a 3-L stirred-tank bioreactor (STR). The two cultivation systems had a similar mixing intensity, but the STR had a lower volumetric mass transfer coefficient of oxygen (k(L)a) and a higher P(V) than the OSR. Recombinant CHO cell lines expressing either tumor necrosis factor receptor as an Fc fusion protein (TNFR:Fc) or an anti-RhesusD monoclonal antibody were cultivated in the two systems. The 5-L OSR was operated in an incubator shaker with 5% CO(2) in the gas environment but without pH and DO control whereas the STR was operated with or without pH and DO control. Higher cell densities and recombinant protein titers were obtained in the OSR as compared to both the controlled and the non-controlled STRs. To test the reproducibility of a bioprocess in a non-controlled OSR, the two CHO cell lines were each cultivated in parallel in six 5-L OSRs. Similar cell densities, cell viabilities, and recombinant protein titers along with similar pH and DO profiles were achieved in each group of replicates. Our study demonstrated that bioprocesses can be performed in OSRs without pH or DO control in a highly reproducible manner, at least at the scale of operation studied here. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. REPRODUCIBILITY OF MASKED HYPERTENSION AMONG ADULTS 30 YEARS AND OLDER

    Science.gov (United States)

    Viera, Anthony J.; Lin, Feng-Chang; Tuttle, Laura A.; Olsson, Emily; Stankevitz, Kristin; Girdler, Susan S.; Klein, J. Larry; Hinderliter, Alan L.

    2015-01-01

    Objective Masked hypertension (MH) refers to non-elevated office blood pressure (BP) with elevated out-of-office BP, but its reproducibility has not been conclusively established. We examined one-week reproducibility of MH by home BP monitoring (HBPM) and ambulatory BP monitoring (ABPM). Methods We recruited 420 adults not on BP-lowering medication with recent clinic BP between 120/80 and 149/95 mm Hg. For the main comparisons, participants with non-elevated office BP were considered to have MH by ABPM if the awake ABPM average was ≥135/85 mm Hg; they were considered to have MH by HBPM if the HBPM average was ≥135/85 mm Hg. Percent agreements were quantified using kappa. We also examined the prevalence of MH defined using a 24-hour ABPM average ≥130/80 mm Hg. We conducted sensitivity analyses using different threshold BP levels for ABPM-office pairings and HBPM-office pairings for defining MH. Results Prevalence rates of MH based on office-awake ABPM pairings were 44% and 43%, with agreement of 71% (kappa=0.40; 95% CI 0.31–0.49). MH was less prevalent (15% and 17%) using HBPM-office pairings, with agreement of 82% (kappa=0.30; 95% CI 0.16–0.44), and more prevalent when considering the 24-hour average (50% and 48%). MH was also less prevalent when more stringent diagnostic criteria were applied. Office-HBPM pairings and office-awake ABPM pairings had fair agreement on MH classification on both occasions, with kappas of 0.36 and 0.30. Conclusions MH has fair short-term reproducibility, providing further evidence that for some people, out-of-office BP is systematically higher than when measured in the office setting. PMID:24842491

  5. Reproducibility of an aerobic endurance test for nonexpert swimmers.

    Science.gov (United States)

    Veronese da Costa, Adalberto; Costa, Manoel da Cunha; Carlos, Daniel Medeiros; Guerra, Luis Marcos de Medeiros; Silva, Antônio José; Barbosa, Tiago Manoel Cabral Dos Santos

    2012-01-01

    This study aimed to verify the reproducibility of an aerobic test to determine nonexpert swimmers' endurance. The sample consisted of 24 male swimmers (age: 22.79 ± 3.90 years; weight: 74.72 ± 11.44 kg; height: 172.58 ± 4.99 cm; and fat percentage: 15.19% ± 3.21%), who swim for 1 hour three times a week. A new instrument was used in this study (a Progressive Swim Test): the swimmer wore an underwater MP3 player and increased their swimming speed on hearing a beep after every 25 meters. Each swimmer's heart rate was recorded before the test (BHR) and again after the test (AHR). The rate of perceived exertion (RPE) and the number of laps performed (NLP) were also recorded. The sample size was estimated using G*Power software (v 3.0.10; Franz Faul, Kiel University, Kiel, Germany). The descriptive values were expressed as mean and standard deviation. After confirming the normality of the data using both the Shapiro-Wilk and Levene tests, a paired t-test was performed to compare the data. The Pearson's linear correlation (r) and intraclass correlation coefficient (ICC) tests were used to determine relative reproducibility. The standard error of measurement (SEM) and the coefficient of variation (CV) were used to determine absolute reproducibility. The limits of agreement and the bias of the absolute and relative values between days were determined by Bland-Altman plots. All values had a significance level of P < 0.05, with r > 0.50 and ICC > 0.66. The SEM had a variation of ±2% and the CV was ... 0.90 ... The Progressive Swim Test for nonexpert swimmers produces comparable results for noncompetitive swimmers with a favorable degree of reproducibility, thus presenting possible applications for researching the physiological performance of nonexpert swimmers.
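
    The Bland-Altman limits of agreement used above are simply the mean between-day difference plus or minus 1.96 standard deviations of the differences; a minimal sketch with made-up lap counts:

        import numpy as np

        def bland_altman_limits(day1, day2):
            """Bias and 95% limits of agreement between two repeated measurements."""
            d = np.asarray(day1, dtype=float) - np.asarray(day2, dtype=float)
            bias, sd = d.mean(), d.std(ddof=1)
            return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

        laps_day1 = [20, 24, 18, 30, 26, 22]   # hypothetical laps completed, test
        laps_day2 = [21, 23, 19, 29, 27, 21]   # hypothetical laps completed, retest
        bias, (lo, hi) = bland_altman_limits(laps_day1, laps_day2)
        print(f"bias {bias:.2f}, limits of agreement ({lo:.2f}, {hi:.2f})")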

  6. Reproducibility of gallbladder ejection fraction measured by fatty meal cholescintigraphy

    International Nuclear Information System (INIS)

    Al-Muqbel, Kusai M.; Hani, M. N. Hani; Elheis, M. A.; Al-Omari, M. H.

    2010-01-01

    There are conflicting data in the literature regarding the reproducibility of the gallbladder ejection fraction (GBEF) measured by fatty meal cholescintigraphy (CS). We aimed to test the reproducibility of GBEF measured by fatty meal CS. Thirty-five subjects (25 healthy volunteers and 10 patients with chronic abdominal pain) underwent fatty meal CS twice in order to measure GBEF1 and GBEF2. The healthy volunteers underwent a repeat scan within 1-13 months from the first scan. The patients underwent a repeat scan within 1-4 years from the first scan and were not found to have chronic acalculous cholecystitis (CAC). Our standard fatty meal was composed of a 60-g Snickers chocolate bar and 200 ml full-fat yogurt. The mean ± SD values for GBEF1 and GBEF2 were 52±17% and 52±16%, respectively. There was a direct linear correlation between the values of GBEF1 and GBEF2 for the subjects, with a correlation coefficient of 0.509 (p=0.002). Subgroup data analysis of the volunteer group showed that there was significant linear correlation between volunteer values of GBEF1 and GBEF2, with a correlation coefficient of 0.473 (p=0.017). Subgroup data analysis of the non-CAC patient group showed no significant correlation between patient values of GBEF1 and GBEF2, likely due to limited sample size. This study showed that fatty meal CS is a reliable test in gallbladder motility evaluation and that GBEF measured by fatty meal CS is reproducible

  7. Reproducibility of gallbladder ejection fraction measured by fatty meal cholescintigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Al-Muqbel, Kusai M.; Hani, M. N. Hani; Elheis, M. A.; Al-Omari, M. H. [School of Medicine, Jordan University of Science and Technology, Irbid (Jordan)

    2010-12-15

    There are conflicting data in the literature regarding the reproducibility of the gallbladder ejection fraction (GBEF) measured by fatty meal cholescintigraphy (CS). We aimed to test the reproducibility of GBEF measured by fatty meal CS. Thirty-five subjects (25 healthy volunteers and 10 patients with chronic abdominal pain) underwent fatty meal CS twice in order to measure GBEF1 and GBEF2. The healthy volunteers underwent a repeat scan within 1-13 months from the first scan. The patients underwent a repeat scan within 1-4 years from the first scan and were not found to have chronic acalculous cholecystitis (CAC). Our standard fatty meal was composed of a 60-g Snickers chocolate bar and 200 ml full-fat yogurt. The mean ± SD values for GBEF1 and GBEF2 were 52±17% and 52±16%, respectively. There was a direct linear correlation between the values of GBEF1 and GBEF2 for the subjects, with a correlation coefficient of 0.509 (p=0.002). Subgroup data analysis of the volunteer group showed that there was significant linear correlation between volunteer values of GBEF1 and GBEF2, with a correlation coefficient of 0.473 (p=0.017). Subgroup data analysis of the non-CAC patient group showed no significant correlation between patient values of GBEF1 and GBEF2, likely due to limited sample size. This study showed that fatty meal CS is a reliable test in gallbladder motility evaluation and that GBEF measured by fatty meal CS is reproducible

  8. Repeatability and reproducibility of decisions by latent fingerprint examiners.

    Directory of Open Access Journals (Sweden)

    Bradford T Ulery

    Full Text Available The interpretation of forensic fingerprint evidence relies on the expertise of latent print examiners. We tested latent print examiners on the extent to which they reached consistent decisions. This study assessed intra-examiner repeatability by retesting 72 examiners on comparisons of latent and exemplar fingerprints, after an interval of approximately seven months; each examiner was reassigned 25 image pairs for comparison, out of a total pool of 744 image pairs. We compare these repeatability results with reproducibility (inter-examiner) results derived from our previous study. Examiners repeated 89.1% of their individualization decisions, and 90.1% of their exclusion decisions; most of the changed decisions resulted in inconclusive decisions. Repeatability of comparison decisions (individualization, exclusion, inconclusive) was 90.0% for mated pairs, and 85.9% for nonmated pairs. Repeatability and reproducibility were notably lower for comparisons assessed by the examiners as "difficult" than for "easy" or "moderate" comparisons, indicating that examiners' assessments of difficulty may be useful for quality assurance. No false positive errors were repeated (n = 4); 30% of false negative errors were repeated. One percent of latent value decisions were completely reversed (no value even for exclusion vs. of value for individualization). Most of the inter- and intra-examiner variability concerned whether the examiners considered the information available to be sufficient to reach a conclusion; this variability was concentrated on specific image pairs such that repeatability and reproducibility were very high on some comparisons and very low on others. Much of the variability appears to be due to making categorical decisions in borderline cases.

  9. Trace Metals Bioaccumulation Potentials of Three Indigenous ...

    African Journals Online (AJOL)

    grasses as bioaccumulators of trace metals from polluted soils. Seeds of ... transfer factor (TF) showed that Zn was the most bioaccumulated trace metals by all the grasses followed by. Pb, Mn ... was used to de-contaminate copper (Cu) and.

  10. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption: essentially, the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real-world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  11. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each...... instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...

  12. Transport theory and codes

    International Nuclear Information System (INIS)

    Clancy, B.E.

    1986-01-01

    This chapter begins with a neutron transport equation which includes the one dimensional plane geometry problems, the one dimensional spherical geometry problems, and numerical solutions. The section on the ANISN code and its look-alikes covers problems which can be solved; eigenvalue problems; outer iteration loop; inner iteration loop; and finite difference solution procedures. The input and output data for ANISN is also discussed. Two dimensional problems such as the DOT code are given. Finally, an overview of the Monte-Carlo methods and codes are elaborated on
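
    As a schematic illustration of the outer (source) iteration mentioned above, a power iteration for a one-group eigenvalue problem L phi = (1/k) F phi, with made-up loss and fission operators rather than any ANISN discretization:

        import numpy as np

        def outer_iteration(L, F, tol=1e-10, max_outer=500):
            """Power iteration for L phi = (1/k) F phi: returns (k_eff, normalized flux)."""
            phi = np.ones(L.shape[0])
            k = 1.0
            for _ in range(max_outer):
                src = F @ phi / k                    # fission source from the last flux
                phi_new = np.linalg.solve(L, src)    # the 'inner' solve of the loss operator
                k_new = k * (F @ phi_new).sum() / (F @ phi).sum()
                if abs(k_new - k) < tol:
                    return k_new, phi_new / phi_new.sum()
                k, phi = k_new, phi_new
            return k, phi / phi.sum()

        n = 20  # illustrative 1D diffusion-like loss operator and uniform fission source
        L = np.diag(np.full(n, 2.2)) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)
        F = np.diag(np.full(n, 0.25))
        k_eff, flux = outer_iteration(L, F)
        print(f"k_eff = {k_eff:.5f}")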

  13. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables

  14. Reproducibility and Reliability of Repeated Quantitative Fluorescence Angiography

    DEFF Research Database (Denmark)

    Nerup, Nikolaj; Knudsen, Kristine Bach Korsholm; Ambrus, Rikard

    2017-01-01

    INTRODUCTION: When using fluorescence angiography (FA) in perioperative perfusion assessment, repeated measures with re-injections of fluorescent dye (ICG) may be required. However, repeated injections may cause saturation of dye in the tissue, exceeding the limit of fluorescence intensity that the camera can detect. As the emission of fluorescence is dependent on the excitatory light intensity, reduction of this may solve the problem. The aim of the present study was to investigate the reproducibility and reliability of repeated quantitative FA during a reduction of excitatory light...

  15. On weights which admit the reproducing kernel of Bergman type

    Directory of Open Access Journals (Sweden)

    Zbigniew Pasternak-Winiarski

    1992-01-01

    Full Text Available In this paper we consider (1) the weights of integration for which the reproducing kernel of the Bergman type can be defined, i.e., the admissible weights, and (2) the kernels defined by such weights. It is verified that the weighted Bergman kernel has properties analogous to the classical one. We prove several sufficient conditions and necessary and sufficient conditions for a weight to be an admissible weight. We also give an example of a weight which is not of this class. As a positive example we consider the weight μ(z) = (Im z)² defined on the unit disk in ℂ.

  16. Ratio-scaling of listener preference of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian

    2005-01-01

    ...non-trivial assumption in the case of complex spatial sounds. In the present study the Bradley-Terry-Luce (BTL) model was employed to investigate the unidimensionality of preference judgments made by 40 listeners on multichannel reproduced sound. Short musical excerpts played back in eight reproduction modes (mono... music). As a main result, the BTL model was found to predict the choice frequencies well. This implies that listeners were able to integrate the complex nature of the sounds into a unidimensional preference judgment. It further implies the existence of a preference scale on which the reproduction modes...

  17. INFRARED IMAGING OF CARBON AND CERAMIC COMPOSITES: DATA REPRODUCIBILITY

    International Nuclear Information System (INIS)

    Knight, B.; Howard, D. R.; Ringermacher, H. I.; Hudson, L. D.

    2010-01-01

    Infrared NDE techniques have proven to be superior for imaging of flaws in ceramic matrix composites (CMC) and carbon silicon carbide composites (C/SiC). Not only can one obtain accurate depth gauging of flaws such as delaminations and layered porosity in complex-shaped components such as airfoils and other aeronautical components, but also excellent reproducibility of image data is obtainable using the STTOF (Synthetic Thermal Time-of-Flight) methodology. The imaging of large complex shapes is fast and reliable. This methodology as applied to large C/SiC flight components at the NASA Dryden Flight Research Center will be described.

  18. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    Science.gov (United States)

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML), which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence work collectively, but...

  19. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    Directory of Open Access Journals (Sweden)

    Spjuth Ola

    2010-06-01

    Full Text Available Abstract Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML), which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join...

  20. Infrared Imaging of Carbon and Ceramic Composites: Data Reproducibility

    Science.gov (United States)

    Knight, B.; Howard, D. R.; Ringermacher, H. I.; Hudson, L. D.

    2010-02-01

    Infrared NDE techniques have proven to be superior for imaging of flaws in ceramic matrix composites (CMC) and carbon silicon carbide composites (C/SiC). Not only can one obtain accurate depth gauging of flaws such as delaminations and layered porosity in complex-shaped components such as airfoils and other aeronautical components, but also excellent reproducibility of image data is obtainable using the STTOF (Synthetic Thermal Time-of-Flight) methodology. The imaging of large complex shapes is fast and reliable. This methodology as applied to large C/SiC flight components at the NASA Dryden Flight Research Center will be described.

  1. Explicit signal to noise ratio in reproducing kernel Hilbert spaces

    DEFF Research Database (Denmark)

    Gomez-Chova, Luis; Nielsen, Allan Aasbjerg; Camps-Valls, Gustavo

    2011-01-01

    This paper introduces a nonlinear feature extraction method based on kernels for remote sensing data analysis. The proposed approach is based on the minimum noise fraction (MNF) transform, which maximizes the signal variance while also minimizing the estimated noise variance. We here propose an alternative kernel MNF (KMNF) in which the noise is explicitly estimated in the reproducing kernel Hilbert space. This enables KMNF to deal with non-linear relations between the noise and the signal features jointly. Results show that the proposed KMNF provides the most noise-free features when confronted...
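
    The linear MNF transform underlying KMNF amounts to a generalized eigenproblem between the data covariance and an estimated noise covariance. A sketch of the linear version, estimating noise from neighbour differences (the paper's contribution, doing this implicitly in a reproducing kernel Hilbert space, is not reproduced here):

        import numpy as np
        from scipy.linalg import eigh

        def mnf(X):
            """Linear minimum noise fraction: rows = pixels, cols = bands."""
            Xc = X - X.mean(axis=0)
            N = np.diff(Xc, axis=0)                  # noise estimate: neighbour differences
            cov_noise = N.T @ N / (N.shape[0] - 1)
            cov_data = Xc.T @ Xc / (Xc.shape[0] - 1)
            # maximise SNR: generalized eigenproblem cov_data w = lambda cov_noise w
            vals, vecs = eigh(cov_data, cov_noise)
            order = np.argsort(vals)[::-1]           # highest-SNR components first
            return Xc @ vecs[:, order], vals[order]

        rng = np.random.default_rng(0)
        signal = np.sin(np.linspace(0, 6, 500))[:, None] @ rng.random((1, 8))
        X = signal + 0.3 * rng.standard_normal((500, 8))
        components, snr = mnf(X)
        print(np.round(snr[:3], 1))                  # leading noise-fraction eigenvalues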

  2. Distributed trace using central performance counter memory

    Science.gov (United States)

    Satterfield, David L.; Sexton, James C.

    2013-01-22

    A plurality of processing cores and a central storage unit having at least one memory are connected in a daisy chain manner, forming a daisy chain ring layout on an integrated chip. At least one of the plurality of processing cores places trace data on the daisy chain connection for transmitting the trace data to the central storage unit, and the central storage unit detects the trace data and stores the trace data in the memory co-located with the central storage unit.

  3. TRACE Assessment for BWR ATWS Analysis

    International Nuclear Information System (INIS)

    Cheng, L.Y.; Diamond, D.; Cuadra, Arantxa; Raitses, Gilad; Aronson, Arnold

    2010-01-01

    A TRACE/PARCS input model has been developed in order to be able to analyze anticipated transients without scram (ATWS) in a boiling water reactor. The model is based on one developed previously for the Browns Ferry reactor for doing loss-of-coolant accident analysis. This model was updated by adding the control systems needed for ATWS and a core model using PARCS. The control systems were based on models previously developed for the TRAC-B code. The PARCS model is based on information (e.g., exposure and moderator density (void) history distributions) obtained from General Electric Hitachi and cross sections for GE14 fuel obtained from an independent source. The model is able to calculate an ATWS, initiated by the closure of main steam isolation valves, with recirculation pump trip, water level control, injection of borated water from the standby liquid control system and actuation of the automatic depressurization system. The model is not considered complete and recommendations are made on how it should be improved.

  4. Semiclassical structure of trace formulas

    International Nuclear Information System (INIS)

    Littlejohn, R.G.

    1990-01-01

    Trace formulas provide the only general relations known connecting quantum mechanics with classical mechanics in the case that the classical motion is chaotic. In particular, they connect quantal objects such as the density of states with classical periodic orbits. In this paper, several trace formulas, including those of Gutzwiller, Balian and Bloch, Tabor, and Berry, are examined from a geometrical standpoint. New forms of the amplitude determinant in asymptotic theory are developed as tools for this examination. The meaning of caustics in these formulas is revealed in terms of intersections of Lagrangian manifolds in phase space. The periodic orbits themselves appear as caustics of an unstable kind, lying on the intersection of two Lagrangian manifolds in the appropriate phase space. New insight is obtained into the Weyl correspondence and the Wigner function, especially their caustic structures
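
    For orientation, the Gutzwiller trace formula discussed here writes the oscillating part of the density of states as a sum over primitive periodic orbits p and their repetitions k; in its standard form (quoted from the general literature, not from this paper),

        \rho(E) \;\simeq\; \bar\rho(E) \;+\; \frac{1}{\pi\hbar}
            \sum_{p} T_p \sum_{k=1}^{\infty}
            \frac{\cos\!\bigl[k\bigl(S_p(E)/\hbar - \sigma_p\,\pi/2\bigr)\bigr]}
                 {\bigl|\det\bigl(M_p^{\,k} - I\bigr)\bigr|^{1/2}}

    where T_p is the period, S_p the action, M_p the monodromy matrix, and σ_p the Maslov index of orbit p; the amplitude determinants and caustic structures examined in the paper enter through the last two quantities.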

  5. Measuring Trace Hydrocarbons in Silanes

    Science.gov (United States)

    Lesser, L. A.

    1984-01-01

    The technique is rapid and uses standard analytical equipment. Silane gas containing traces of hydrocarbons is injected into a carrier gas of moist nitrogen having about 0.2 percent water vapor. The carrier, water, and silane pass through a short column packed with powdered sodium hydroxide, which combines the moisture and silane to form nonvolatile sodium silicate. The carrier gas, now free of silane but containing the nonreactive hydrocarbons, passes to a silica-gel column where chromatographic separation takes place. The hydrocarbons are measured by flame ionization detection (FID).

  6. Olfactory memory traces in Drosophila

    OpenAIRE

    Berry, Jacob; Krause, William C.; Davis, Ronald L.

    2008-01-01

    In the fruit fly Drosophila, coincident exposure to an odor and an aversive electric shock can produce robust behavioral memory. This behavioral memory is thought to be regulated by cellular memory traces within the central nervous system of the fly. These molecular, physiological or structural changes in neurons, induced by pairing odor and shock, regulate behavior by altering the neurons' response to the learned environment. Recently, novel in vivo functional imaging techniques have allowed...

  7. Trace element emissions from coal

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-09-15

    Trace elements are emitted during coal combustion. The quantity, in general, depends on the physical and chemical properties of the element itself, the concentration of the element in the coal, the combustion conditions and the type of particulate control device used, and its collection efficiency as a function of particle size. Some trace elements become concentrated in certain particle streams following combustion such as bottom ash, fly ash, and flue gas particulate matter, while others do not. Various classification schemes have been developed to describe this partitioning behaviour. These classification schemes generally distinguish between: Class 1: elements that are approximately equally concentrated in the fly ash and bottom ash, or show little or no fine particle enrichment, examples include Mn, Be, Co and Cr; Class 2: elements that are enriched in the fly ash relative to bottom ash, or show increasing enrichment with decreasing particle size, examples include As, Cd, Pb and Sb; Class 3: elements which are emitted in the gas phase (primarily Hg (not discussed in this review), and in some cases, Se). Control of class 1 trace elements is directly related to control of total particulate matter emissions, while control of the class 2 elements depends on collection of fine particulates. Due to the variability in particulate control device efficiencies, emission rates of these elements can vary substantially. The volatility of class 3 elements means that particulate controls have only a limited impact on the emissions of these elements.

  8. Verification of CTF/PARCSv3.2 coupled code in a Turbine Trip scenario

    International Nuclear Information System (INIS)

    Abarca, A.; Hidalga, P.; Miro, R.; Verdu, G.; Sekhri, A.

    2017-01-01

    Multiphysics codes have emerged as a best-estimate approach to simulating core behavior in LWRs. Coupled neutronic and thermal-hydraulic codes are being used and improved to achieve reliable results for reactor safety transient analysis. Implementing the feedback procedure between the coupled codes at each time step allows a more accurate simulation and a better prediction of the safety limits of the analyzed scenarios. With the objective of testing the recently developed CTF/PARCSv3.2 coupled code, a code-to-code verification against TRACE has been performed for a BWR Turbine Trip scenario. CTF is a thermal-hydraulic subchannel code that features a two-fluid, three-field representation of the two-phase flow, while the PARCS code solves the neutron diffusion equation on a 3D nodal distribution. PARCS also allows the use of extended sets of cross-section libraries, in formats such as PMAX or NEMTAB, for a more precise neutronic performance. Using this option, the neutronic core composition of KKL will be modelled taking advantage of the core-follow database. The results of the simulation will be verified against TRACE results. TRACE is used as the reference code for the validation process, since it is a code recommended by the USNRC. The TRACE model includes the full core plus relevant components such as the steam lines and the valves affecting and controlling the turbine trip evolution. The coupled code performance has been evaluated using the Turbine Trip event that took place at Kernkraftwerk Leibstadt (KKL) in fuel cycle 18. KKL is a nuclear power plant (NPP) located in Leibstadt, Switzerland. This NPP operates with a BWR developing 3600 MWt in fuel cycles of one year. The Turbine Trip is a fast transient producing a pressure peak in the reactor followed by a power decrease due to the selected control rod insertion. This kind of transient is very useful for checking the feedback performance between the coupled codes due to the fast...

  9. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase the number of dimensions seen by the network using a linear mapping. Receivers can trade off computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof.
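
    The low field size operations at intermediate nodes amount to random linear network coding over GF(2), i.e., XORs of whole packets. A minimal sketch of binary RLNC encoding and Gauss-Jordan decoding follows; it illustrates only the inner GF(2) code, not Fulcrum's high-field outer expansion:

      import random

      # Minimal sketch: random linear network coding over GF(2) (packet XORs).
      # Fulcrum's inner code operates in this field; the high-field outer
      # expansion is omitted here.
      def encode(packets):
          """One coded packet: a random GF(2) combination of the sources."""
          coeffs = [random.randint(0, 1) for _ in packets]
          if not any(coeffs):
              coeffs[random.randrange(len(packets))] = 1  # avoid all-zero vector
          coded = bytes(len(packets[0]))
          for c, p in zip(coeffs, packets):
              if c:
                  coded = bytes(a ^ b for a, b in zip(coded, p))
          return coeffs, coded

      def decode(coded_packets, k):
          """Gauss-Jordan elimination over GF(2); returns k sources or None."""
          rows = [(list(c), bytearray(p)) for c, p in coded_packets]
          for col in range(k):
              pivot = next((r for r in range(col, len(rows)) if rows[r][0][col]), None)
              if pivot is None:
                  return None  # rank deficient: more coded packets needed
              rows[col], rows[pivot] = rows[pivot], rows[col]
              for r in range(len(rows)):
                  if r != col and rows[r][0][col]:
                      rows[r] = ([a ^ b for a, b in zip(rows[r][0], rows[col][0])],
                                 bytearray(x ^ y for x, y in zip(rows[r][1], rows[col][1])))
          return [bytes(rows[i][1]) for i in range(k)]

      sources = [b"abcd", b"efgh", b"ijkl"]
      coded = [encode(sources) for _ in range(6)]  # redundancy against rank loss
      print(decode(coded, 3))                      # [b'abcd', b'efgh', b'ijkl']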

  10. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed; Ghanem, Bernard; Wonka, Peter

    2018-01-01

    …supervised convolutional sparse coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements…
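
    For context, the unsupervised CSC objective that such a supervised term extends has the standard form below; the supervised loss L_sup, its weight gamma, and the labels y are placeholders, since the record is truncated before the paper's specific choice:

      \min_{\{d_k\},\{z_k\}} \; \tfrac{1}{2}\Big\| x - \sum_k d_k \ast z_k \Big\|_2^2
          \;+\; \lambda \sum_k \| z_k \|_1
          \;+\; \gamma\, \mathcal{L}_{\mathrm{sup}}(\{z_k\}, y)
      \quad \text{s.t.} \quad \| d_k \|_2 \le 1,

    where x is the input image, d_k are the dictionary filters, z_k the sparse code maps, and * denotes convolution; lambda weights the sparsity penalty.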

  11. SASSYS LMFBR systems code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time

  12. OCA Code Enforcement

    Data.gov (United States)

    Montgomery County of Maryland — The Office of the County Attorney (OCA) processes Code Violation Citations issued by County agencies. The citations can be viewed by issued department, issued date...

  13. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)
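
    One of the comparison forms mentioned, power spectral density, is easy to reproduce for a load channel. A minimal sketch using Welch's method on a synthetic signal; the sampling rate and segment length are illustrative choices, not the ESI-80 test setup:

      import numpy as np
      from scipy.signal import welch

      # Minimal sketch: PSD of a simulated load channel via Welch's method.
      # The signal is synthetic: a 1.1 Hz rotor-frequency component plus noise.
      fs = 50.0                                  # sampling rate, Hz
      t = np.arange(0, 600, 1 / fs)              # ten minutes of data
      load = np.sin(2 * np.pi * 1.1 * t) + 0.3 * np.random.randn(t.size)

      f, psd = welch(load, fs=fs, nperseg=4096)
      print(f"peak at {f[np.argmax(psd)]:.2f} Hz")  # recovers the 1.1 Hz peak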

  14. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
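
    The levelization property, that the package "uses" relationships form a directed acyclic graph, can be checked mechanically with a topological sort. A minimal sketch using the Python standard library, with hypothetical package names:

      from graphlib import TopologicalSorter, CycleError

      # Minimal sketch: verify a package set is levelized, i.e. the "uses"
      # relationships form a directed acyclic graph. Names are hypothetical.
      uses = {
          "application": {"physics", "io"},
          "physics":     {"mesh", "utils"},
          "io":          {"utils"},
          "mesh":        {"utils"},
          "utils":       set(),
      }

      try:
          build_order = list(TopologicalSorter(uses).static_order())
          print("levelized; build order:", build_order)   # lowest level first
      except CycleError as err:
          print("not levelized; dependency cycle:", err.args[1])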

  15. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs
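
    The final plot-and-select step, evaluating total cost over a grid of the two design choices and picking the minimum, can be sketched as below; the cost model is a hypothetical stand-in, not the code's actual cost factors:

      import numpy as np

      # Minimal sketch: total system cost over a grid of the two free design
      # choices (injector rise time, ferrite core aspect ratio), mirroring the
      # plot-and-select step described above. The cost model is hypothetical.
      def total_cost(rise_time_ns, aspect_ratio):
          pulsed_power = 120.0 / rise_time_ns                # faster rise costs more
          ferrite = 4.0 * aspect_ratio + 9.0 / aspect_ratio  # material vs. geometry
          return pulsed_power + ferrite

      rise = np.linspace(10, 100, 91)       # ns
      aspect = np.linspace(0.5, 4.0, 71)
      R, A = np.meshgrid(rise, aspect)
      cost = total_cost(R, A)

      i, j = np.unravel_index(np.argmin(cost), cost.shape)
      print(f"optimum: rise time {R[i, j]:.0f} ns, aspect ratio {A[i, j]:.2f}")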

  16. VT ZIP Code Areas

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  17. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  18. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    2001-01-01

    The description of reactor lattice codes is carried out using the example of the WIMSD-5B code. The WIMS code, in its various versions, is the most recognised lattice code. It is used in all parts of the world for calculations of research and power reactors. The version WIMSD-5B is distributed free of charge by the NEA Data Bank. The description of its main features given in the present lecture follows the aspects defined previously for lattice calculations in the lecture on Reactor Lattice Transport Calculations. The spatial models are described, and the approach to the energy treatment is given. Finally, the specific algorithm applied in fuel depletion calculations is outlined. (author)

  19. Highly Efficient and Reproducible Nonfullerene Solar Cells from Hydrocarbon Solvents

    KAUST Repository

    Wadsworth, Andrew; Ashraf, Raja; Abdelsamie, Maged; Pont, Sebastian; Little, Mark; Moser, Maximilian; Hamid, Zeinab; Neophytou, Marios; Zhang, Weimin; Amassian, Aram; Durrant, James R.; Baran, Derya; McCulloch, Iain

    2017-01-01

    With chlorinated solvents unlikely to be permitted for use in solution-processed organic solar cells in industry, there must be a focus on developing nonchlorinated solvent systems. Here we report high-efficiency devices utilizing a low-bandgap donor polymer (PffBT4T-2DT) and a nonfullerene acceptor (EH-IDTBR) from hydrocarbon solvents and without using additives. When mesitylene was used as the solvent, rather than chlorobenzene, an improved power conversion efficiency (11.1%) was achieved without the need for pre- or post-treatments. Despite altering the processing conditions to environmentally friendly solvents and room-temperature coating, grazing incidence X-ray measurements confirmed that active layers processed from hydrocarbon solvents retained the robust nanomorphology obtained with hot-processed chlorinated solvents. The main advantages of hydrocarbon solvent-processed devices, besides the improved efficiencies, were the reproducibility and storage lifetime of devices. Mesitylene devices showed better reproducibility and a shelf life of up to 4000 h, with PCE dropping by only 8% of its initial value.

  20. Everware toolkit. Supporting reproducible science and challenge-driven education.

    Science.gov (United States)

    Ustyuzhanin, A.; Head, T.; Babuschkin, I.; Tiunov, A.

    2017-10-01

    Modern science clearly demands a higher level of reproducibility and collaboration. To make research fully reproducible one has to take care of several aspects: research protocol description, data access, environment preservation, workflow pipeline, and analysis script preservation. Version control systems like git help with the workflow and analysis script parts. Virtualization techniques like Docker or Vagrant can help deal with environments. Jupyter notebooks are a powerful platform for conducting research in a collaborative manner. We present project Everware, which seamlessly integrates git repository management systems such as Github or Gitlab, Docker and Jupyter, helping with a) sharing results of real research and b) boosting education activities. With the help of Everware one can share not only the final artifacts of research but the full depth of the research process. This has been shown to be extremely helpful during the organization of several data analysis hackathons and machine learning schools. Using Everware, participants could start from an existing solution instead of starting from scratch, and could start contributing immediately. Everware allows its users to make use of their own computational resources to run the workflows they are interested in, which leads to higher scalability of the toolkit.

  1. Size Control of Sessile Microbubbles for Reproducibly Driven Acoustic Streaming

    Science.gov (United States)

    Volk, Andreas; Kähler, Christian J.

    2018-05-01

    Acoustically actuated bubbles are receiving growing interest in microfluidic applications, as they induce a streaming field that can be used for particle sorting and fluid mixing. An essential but often unspoken challenge in such applications is to maintain a constant bubble size to achieve reproducible conditions. We present an automated system for the size control of a cylindrical bubble that is formed at a blind side pit of a polydimethylsiloxane microchannel. Using a pressure control system, we adapt the protrusion depth of the bubble into the microchannel to a precision of approximately 0.5 μm on a timescale of seconds. By comparing the streaming fields generated by bubbles of width 80 μm with protrusion depths between -12 and 60 μm, we find that the mean velocity of the induced streaming fields varies by more than a factor of 4. We also find a qualitative change in the topology of the streaming field. Both observations confirm the importance of the bubble size control system for achieving reproducible and reliable bubble-driven streaming experiments.
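
    The size control described can be sketched as a feedback loop that servos the measured protrusion depth toward a set point by adjusting pressure. A toy proportional controller, not the authors' implementation, with an assumed sign convention (higher pressure retracts the bubble):

      # Toy sketch: proportional feedback of bubble protrusion depth via
      # pressure. All numbers and the measure/actuate callbacks are hypothetical.
      def control_bubble(measure_depth_um, set_pressure_pa, target_um,
                         gain_pa_per_um=50.0, tol_um=0.5, p0_pa=101_325.0, steps=200):
          pressure = p0_pa
          for _ in range(steps):
              depth = measure_depth_um()
              error = target_um - depth
              if abs(error) <= tol_um:       # within the ~0.5 um precision reported
                  return pressure
              pressure -= gain_pa_per_um * error  # assumes pressure retracts bubble
              set_pressure_pa(pressure)
          raise TimeoutError("bubble did not settle within the step budget")

      # Toy plant for demonstration: depth responds linearly to applied pressure.
      state = {"pressure": 101_325.0}
      def set_p(p): state["pressure"] = p
      def meas(): return 24.0 - (state["pressure"] - 101_325.0) / 50.0   # um

      print(f"settled at {control_bubble(meas, set_p, target_um=30.0):.0f} Pa")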

  2. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns.

    Directory of Open Access Journals (Sweden)

    Andrea Maesani

    2015-11-01

    The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs (locomotor bouts) matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior.
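
    The core mechanism, fluctuations letting activity escape a stable equilibrium and exceed a locomotion threshold, can be illustrated with a one-dimensional noisy relaxation model. A toy simulation, not the paper's fitted networks:

      import numpy as np

      # Toy illustration of fluctuation-driven bout generation: noisy relaxation
      # toward a stable equilibrium, with "walking" whenever activity exceeds a
      # threshold. Parameters are arbitrary, not fitted to Drosophila data.
      rng = np.random.default_rng(0)
      dt, tau, sigma, threshold = 0.01, 1.0, 0.6, 1.0
      x, n_steps = 0.0, 200_000
      walking = np.empty(n_steps, dtype=bool)

      for i in range(n_steps):
          # Euler-Maruyama step: drift back to the equilibrium x = 0 plus noise.
          x += (-x / tau) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
          walking[i] = x > threshold            # bout = supra-threshold excursion

      bouts = np.diff(walking.astype(int)) == 1  # bout onsets
      print(f"fraction of time walking: {walking.mean():.3f}, bouts: {bouts.sum()}")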

  3. Reproducibility of suppression of Pythium wilt of cucumber by compost

    Directory of Open Access Journals (Sweden)

    Mauritz Vilhelm Vestberg

    2014-10-01

    There is increasing global interest in using compost to suppress soil-borne fungal and bacterial diseases and nematodes. We studied the reproducibility of compost suppressive capacity (SC) against Pythium wilt of cucumber using nine composts produced by the same composting plant in 2008 and 2009. A bioassay was set up in a greenhouse using cucumber inoculated with two strains of Pythium. The composts were used as 20% mixtures (v:v) of a basic steam-sterilized light Sphagnum peat and sand (3:1, v:v). Shoot height was measured weekly during the 5-week experiment. At harvest, the SC was calculated as the % difference in shoot dry weight (DW) between non-inoculated and inoculated cucumbers. The SC was not affected by year of production (2008 or 2009), indicating reproducibility of SC when the raw materials and the composting method are not changed. Differences in shoot height were not as pronounced as those for shoot DW. The results were encouraging, but further studies are still needed for producing compost with guaranteed suppressiveness properties.
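
    The suppressive-capacity measure is a simple percentage difference. One natural reading of it as a function; the exact normalization is not spelled out in the record, so take this as an assumption:

      def suppressive_capacity(dw_noninoculated: float, dw_inoculated: float) -> float:
          """SC as the % difference in shoot dry weight (one plausible reading)."""
          return 100.0 * (dw_noninoculated - dw_inoculated) / dw_noninoculated

      print(suppressive_capacity(12.0, 3.0))   # 75.0 (% difference)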

  4. Reproducibility of tomographic evaluation of posterolateral lumbar arthrodesis consolidation

    Directory of Open Access Journals (Sweden)

    Marcelo Italo Risso Neto

    2015-06-01

    OBJECTIVE: To evaluate interobserver agreement of the Glassman classification for posterolateral lumbar spine arthrodesis. METHODS: One hundred and thirty-four CT scans from patients who underwent posterolateral arthrodesis of the lumbar and lumbosacral spine were evaluated by four observers: two orthopedic surgeons experienced in spine surgery and two in training in this area. Using the reconstructed tomographic images in the oblique coronal plane, 299 operated levels were systematically analyzed for signs of arthrodesis. The appearance of bone healing at each operated level was classified into the five categories proposed by Glassman for posterolateral arthrodesis: 1) bilateral solid arthrodesis; 2) unilateral solid arthrodesis; 3) bilateral partial arthrodesis; 4) unilateral partial arthrodesis; 5) absence of arthrodesis. In a second step, the evaluation of each operated level was divided into two categories: fusion (types 1, 2, 3 and 4) and non-fusion (type 5). Statistical analysis was performed by calculating the Kappa coefficient for the paired analyses between the two experienced observers and between the two observers in training. RESULTS: The interobserver reproducibility, by the kappa coefficient, for the consolidation analysis with the classification divided into 5 types was 0.729 for both the experienced surgeons and the training surgeons. Considering only two categories, the kappa coefficient was 0.745 between experienced surgeons and 0.795 between training surgeons. High concordance was obtained in all analyses. CONCLUSION: High interobserver reproducibility was observed for the classification proposed by Glassman for posterolateral arthrodesis of the lumbar and lumbosacral spine.
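
    Agreement statistics of the kind reported here come from Cohen's kappa on the paired ratings. A minimal sketch for two observers, with toy ratings rather than the study data:

      from collections import Counter

      # Minimal sketch: Cohen's kappa for two observers rating the same levels.
      # The ratings below are toy data, not the study's CT evaluations.
      def cohens_kappa(rater_a, rater_b):
          n = len(rater_a)
          observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
          pa, pb = Counter(rater_a), Counter(rater_b)
          expected = sum(pa[c] * pb[c] for c in pa.keys() | pb.keys()) / n**2
          return (observed - expected) / (1 - expected)

      a = [1, 1, 2, 5, 3, 1, 4, 5, 2, 1]   # Glassman types from observer A
      b = [1, 2, 2, 5, 3, 1, 4, 5, 1, 1]   # ... and from observer B
      print(f"kappa = {cohens_kappa(a, b):.3f}")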

  5. Reproducibility of P-31 spectroscopic imaging of normal human myocardium

    International Nuclear Information System (INIS)

    Tavares, N.J.; Chew, W.; Auffermann, W.; Higgins, C.B.

    1988-01-01

    To assess the reproducibility of P-31 MR spectroscopy of human myocardium, ten normal male volunteers were studied on two separate occasions. Spectra were acquired on a clinical 1.5-T MR imaging unit (Signa, General Electric) using a one-dimensional gated spectroscopic imaging sequence (matrix size, 32 × 256) over 20 minutes. Peaks in the adenosine triphosphate (ATP) region, phosphocreatine (PCR), phosphodiesters (PD), and peaks attributable to 2,3-diphosphoglycerate from blood were observed. Interindividual and intraindividual variability, expressed as standard errors of the mean (mean ± SEM), were 1.54 ± 0.04 (variability among subjects) and 0.04 (variability between first and second studies) for PCR/β-ATP; 0.97 ± 0.18 and 0.06 for PD/β-ATP; and 0.62 ± 0.10 and 0.05 for PD/PCR, respectively. In conclusion, P-31 MR spectroscopy yields consistent and reproducible myocardial spectra that might be useful in the future for the evaluation and monitoring of cardiac disease

  6. Highly Efficient and Reproducible Nonfullerene Solar Cells from Hydrocarbon Solvents

    KAUST Repository

    Wadsworth, Andrew

    2017-06-01

    With chlorinated solvents unlikely to be permitted for use in solution-processed organic solar cells in industry, there must be a focus on developing nonchlorinated solvent systems. Here we report high-efficiency devices utilizing a low-bandgap donor polymer (PffBT4T-2DT) and a nonfullerene acceptor (EH-IDTBR) from hydrocarbon solvents and without using additives. When mesitylene was used as the solvent, rather than chlorobenzene, an improved power conversion efficiency (11.1%) was achieved without the need for pre- or post-treatments. Despite altering the processing conditions to environmentally friendly solvents and room-temperature coating, grazing incidence X-ray measurements confirmed that active layers processed from hydrocarbon solvents retained the robust nanomorphology obtained with hot-processed chlorinated solvents. The main advantages of hydrocarbon solvent-processed devices, besides the improved efficiencies, were the reproducibility and storage lifetime of devices. Mesitylene devices showed better reproducibility and a shelf life of up to 4000 h, with PCE dropping by only 8% of its initial value.

  7. A Bayesian Perspective on the Reproducibility Project: Psychology.

    Science.gov (United States)

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors, a quantity that can be used to express comparative evidence for an hypothesis but also for the null hypothesis, for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., a Bayes factor below 10). The majority of studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempts provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable.
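
    As a concrete instance of the comparative-evidence quantity discussed, a Bayes factor for a simple binomial test (null p = 0.5 versus a uniform prior on p) has a closed form. A toy computation, not one of the project's actual analyses:

      from math import exp, lgamma

      # Toy Bayes factor: k successes in n Bernoulli trials.
      # H0: p = 0.5.  H1: p ~ Uniform(0, 1), i.e. a Beta(1, 1) prior.
      # Marginals: P(k | H0) = C(n, k) * 0.5**n
      #            P(k | H1) = C(n, k) * B(k + 1, n - k + 1)
      def bf10(k: int, n: int) -> float:
          log_beta = lgamma(k + 1) + lgamma(n - k + 1) - lgamma(n + 2)
          return exp(log_beta) / 0.5**n

      print(f"BF10 for 60/100: {bf10(60, 100):.2f}")   # ~0.9: essentially equivocal
      print(f"BF10 for 70/100: {bf10(70, 100):.0f}")   # strong evidence vs p = 0.5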

  8. Automation of cDNA Synthesis and Labelling Improves Reproducibility

    Directory of Open Access Journals (Sweden)

    Daniel Klevebring

    2009-01-01

    Background. Several technologies, such as in-depth sequencing and microarrays, enable large-scale interrogation of genomes and transcriptomes. In this study, we assess reproducibility and throughput by moving all laboratory procedures to a robotic workstation capable of handling superparamagnetic beads. Here, we describe a fully automated procedure for cDNA synthesis and labelling for microarrays, where the purification steps prior to and after labelling are based on precipitation of DNA on carboxylic acid-coated paramagnetic beads. Results. The fully automated procedure allows samples arrayed on a microtiter plate to be processed in parallel without manual intervention, ensuring high reproducibility. We compare our results to a manual sample preparation procedure and, in addition, use a comprehensive reference dataset to show that the protocol described performs better than similar manual procedures. Conclusions. We demonstrate, in an automated gene expression microarray experiment, a reduced variance between replicates, resulting in an increase in the statistical power to detect differentially expressed genes, thus allowing smaller differences between samples to be identified. This protocol can, with minor modifications, be used to create cDNA libraries for other applications such as in-depth analysis using next-generation sequencing technologies.

  9. Critical Care Coding for Neurologists.

    Science.gov (United States)

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  10. Lattice Index Coding

    OpenAIRE

    Natarajan, Lakshmi; Hong, Yi; Viterbo, Emanuele

    2014-01-01

    The index coding problem involves a sender with K messages to be transmitted across a broadcast channel, and a set of receivers each of which demands a subset of the K messages while having prior knowledge of a different subset as side information. We consider the specific case of noisy index coding where the broadcast channel is Gaussian and every receiver demands all the messages from the source. Instances of this communication problem arise in wireless relay networks, sensor networks, and ...
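
    A toy instance of the setting helps: with two messages and each receiver holding the other message as side information, a single XOR broadcast serves both. A minimal binary sketch (the paper itself treats the noisy Gaussian version of this idea with lattice codes):

      # Toy index coding: receiver 1 knows m2, receiver 2 knows m1, and both
      # demand all messages. Broadcasting m1 XOR m2 satisfies both receivers
      # with a single transmission.
      m1, m2 = 0b1011, 0b0110
      broadcast = m1 ^ m2

      recovered_by_rx1 = broadcast ^ m2   # rx1 cancels its side information m2
      recovered_by_rx2 = broadcast ^ m1   # rx2 cancels its side information m1
      assert (recovered_by_rx1, recovered_by_rx2) == (m1, m2)
      print(f"one broadcast ({broadcast:04b}) served two receivers")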

  11. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. It has proved invaluable for gaining a rapid understanding of system dependencies. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work centers on implementing the RELAP5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS, to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5.

  12. Cracking the Gender Codes

    DEFF Research Database (Denmark)

    Rennison, Betina Wolfgang

    2016-01-01

    …extensive work to raise the proportion of women. This has helped slightly, but women remain underrepresented at the corporate top. Why is this so? What can be done to solve it? This article presents five different types of answers relating to five discursive codes: nature, talent, business, exclusion… in leadership management, we must become more aware and take advantage of this complexity. We must crack the codes in order to crack the curve…

  13. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  14. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16-group Hansen-Roach cross sections and P1 scattering, was one of the first multigroup Monte Carlo codes, and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218-group set is distributed with the code) and has a general PN scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k-eff, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes

  15. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiography, like any other technique, needs standards. These standards are widely used, and the methods for applying them are well established; radiography testing is only practical when based on documented regulations. These regulations and guidelines are documented in codes, standards and specifications. In Malaysia, a level-one or basic radiographer may carry out radiography work based on instructions given by a level-two or level-three radiographer. These instructions are produced from the guidelines set out in the documents, and the level-two radiographer must follow the specifications in the standard when writing them. It is therefore clear that radiography is a type of work in which everything must follow the rules. For the code, radiography follows the code of the American Society of Mechanical Engineers (ASME), and the only code available in Malaysia at this time is the rule published by the Atomic Energy Licensing Board (AELB), known as the Practical Code for Radiation Protection in Industrial Radiography. With this code in place, all radiography must follow the regulated rules and standards.

  16. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. Innovative work on improving the coding tree to further reduce encoding time is presented in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content …

  17. Reproducibility of temporomandibular joint tomography. Influence of shifted X-ray beam and tomographic focal plane on reproducibility

    International Nuclear Information System (INIS)

    Saito, Masashi

    1999-01-01

    Proper tomographic focal plane and x-ray beam direction are the most important factors in obtaining accurate images of the temporomandibular joint (TMJ). In this study, to clarify the magnitude of the effect of these two factors on image quality, we evaluated the reproducibility of tomograms by measuring the distortion when the x-ray beam was shifted from the correct center of the object. The effects of deviation of the tomographic focal plane on image quality were evaluated by the MTF (Modulation Transfer Function). Two types of tomograms, the plane type and the rotational type, were used in this study. A TMJ model was made from Teflon for the purpose of evaluation under a shifted x-ray beam. The x-ray images were obtained by tilting the model from 0 to 10 degrees in 2-degree increments. These x-ray images were processed by computer image analysis, and the distance between the condyle and the joint space was then measured. To evaluate the influence of a shifted tomographic focal plane on image sharpness, the x-ray images from each setting were analyzed by MTF. To obtain the MTF, a 'knife edge' made from Pb was used. The images were scanned with a microdensitometer at the central focal plane, and 0, 0.5, and 1 mm away, respectively. The density curves were analyzed by Fourier analysis and the MTF was calculated. The reproducibility of the images worsened as the x-ray beam was shifted, and this tendency was similar for both tomogram types. Object characteristics, such as the anterior and posterior portions of the joint space, affected the deterioration in reproducibility. Deviation of the tomographic focal plane also decreased the reproducibility of the x-ray images. The rotational type showed a better MTF, but it deteriorated markedly with slight changes of the tomographic focal plane. In contrast, the plane type showed a lower MTF, but the image was stable under shifts of the tomographic focal plane. (author)
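
    The knife-edge MTF procedure described, scan the edge, differentiate the edge-spread function, then Fourier-transform the result, can be sketched as follows; the edge profile here is synthetic, standing in for the microdensitometer scans:

      import numpy as np

      # Minimal sketch of a knife-edge MTF computation: edge-spread function
      # (ESF) -> line-spread function (LSF) by differentiation -> |FFT|,
      # normalized at zero frequency. The ESF below is synthetic.
      dx = 0.01                                    # sampling interval, mm
      x = np.arange(-5, 5, dx)
      esf = 0.5 * (1 + np.tanh(x / 0.15))          # blurred Pb-edge profile

      lsf = np.gradient(esf, dx)
      mtf = np.abs(np.fft.rfft(lsf))
      mtf /= mtf[0]                                # unity at f = 0
      freqs = np.fft.rfftfreq(lsf.size, d=dx)      # cycles per mm

      f50 = freqs[np.argmax(mtf < 0.5)]            # first frequency below MTF = 0.5
      print(f"MTF drops below 0.5 at about {f50:.1f} cycles/mm")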

  18. Trace metal speciation: Finally, correctly addressing trace metal issues

    International Nuclear Information System (INIS)

    Donard, O.F.X.

    2001-01-01

    The history of the development of trace metal speciation was discussed, and the reasons behind the relatively slow widespread acceptance of its importance were presented. This was due partly to the lack of availability of commercial instrumentation and partly to the drive towards improving sensitivity in analytical chemistry, which had focused attention on total concentration determinations. The sophistication and control of analytical instrumentation is now such that the spotlight must be turned onto the chemical species of an element present in a sample, since this is what governs its behaviour in the biosphere. Indeed, several companies are currently considering the introduction of instrumentation specifically designed for metal species determination

  19. Adaption of the PARCS Code for Core Design Audit Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyong Chol; Lee, Young Jin; Uhm, Jae Beop; Kim, Hyunjik [Nuclear Safety Evaluation, Daejeon (Korea, Republic of); Jeong, Hun Young; Ahn, Seunghoon; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2013-05-15

    The eigenvalue calculation also includes quasi-static core depletion analyses. PARCS implements a variety of features and has been qualified as a regulatory audit code in conjunction with other NRC thermal-hydraulic codes such as TRACE or RELAP5. In this study, as an adaptation effort for audit applications, PARCS is applied to an audit analysis of a reload core design. The lattice physics code HELIOS is used for cross section generation, and the PARCS-HELIOS code system has been established as a core analysis tool. Results have been compared over a wide spectrum of calculations, such as power distribution, critical soluble boron concentration, and rod worth. Reasonable agreement between the audit calculation and the reference results has been found.

  20. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

    VHTR, one of the Generation IV reactor concepts, has a relatively high operating temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. It is therefore vital to trace tritium behavior in the VHTR system and the potential permeation rate into the industrial process; tritium is a crucial safety issue in the fission reactor system, and a tool for analyzing its behavior is needed. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, is developed using the chemical process code gPROMS. BOTANIC was then verified using analytic solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL. The code has several distinctive features, including a non-diluted assumption, flexible applications, and the adoption of a distributed permeation model. Owing to these features, BOTANIC can analyze a wide range of tritium-level systems and achieves higher accuracy, as it has the capacity to solve distributed models. BOTANIC was successfully developed and verified: the results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will be focused on total system verification
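
    The central quantity such a code evaluates, the permeation rate through a metal wall, is commonly modeled as Arrhenius permeability with Sieverts'-law pressure dependence. A minimal sketch of that standard relation, with illustrative property values rather than BOTANIC's models or data:

      from math import exp, sqrt

      # Minimal sketch: steady-state tritium permeation flux through a metal wall,
      #   J = (P0 * exp(-Ea / (R*T)) / d) * (sqrt(p_up) - sqrt(p_down)),
      # i.e. Arrhenius permeability with Sieverts'-law pressure dependence.
      # Property values below are illustrative placeholders only.
      R = 8.314                        # J/mol/K

      def permeation_flux(T_K, p_up_Pa, p_down_Pa, thickness_m,
                          P0=1.0e-7, Ea=60_000.0):
          permeability = P0 * exp(-Ea / (R * T_K))   # mol/(m*s*Pa^0.5)
          return permeability / thickness_m * (sqrt(p_up_Pa) - sqrt(p_down_Pa))

      # Hot heat-exchanger wall: 1123 K, 1 Pa tritium upstream, vacuum downstream,
      # 2 mm wall thickness.
      print(f"{permeation_flux(1123.0, 1.0, 0.0, 2e-3):.3e} mol/(m^2*s)")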