WorldWideScience

Sample records for tracing code reproducing

  1. Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence

    Science.gov (United States)

    Hildebrandt, Mario; Dittmann, Jana

    2015-03-01

    Optical, nanometer-range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, innovations need to be positively tested and quality ensured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. Firstly, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conforming test set of artificial-sweat printed, computer-generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and their acquisition with the contactless sensor, resulting in 96 sensor images, called scans or acquired samples. The second test set, for inter-sensor reproducibility assessment, consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple feature set in the spatial and frequency domains known from signal processing and test its suitability for six different classifiers that classify scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we suggest comparing the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. The Bagging classifier is in nearly all cases the most reliable classifier in our experiments, and the results are also confirmed by the biometric matching rates.
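
    A minimal Python sketch of the kind of pipeline described above, assuming an illustrative feature set and toy data; the exact features, parameters and training data of the paper are not reproduced here.

        import numpy as np
        from sklearn.ensemble import BaggingClassifier

        def difference_features(scan_a, scan_b):
            """Simple spatial- and frequency-domain statistics of a scan pair."""
            d = scan_a.astype(float) - scan_b.astype(float)
            spec = np.abs(np.fft.fft2(d))  # frequency-domain magnitude
            return np.array([d.mean(), d.std(), np.abs(d).max(),
                             spec.mean(), spec.std()])

        rng = np.random.default_rng(0)
        # Toy stand-in data: 40 pairs of 64x64 "scans" with random labels
        # (1 = small difference / reproducible, 0 = non-reproducible).
        X = np.array([difference_features(rng.random((64, 64)),
                                          rng.random((64, 64)))
                      for _ in range(40)])
        y = rng.integers(0, 2, size=40)

        clf = BaggingClassifier(n_estimators=10, random_state=0).fit(X, y)
        print(clf.predict(X[:3]))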

  2. The MIMIC Code Repository: enabling reproducibility in critical care research.

    Science.gov (United States)

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
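
    A hedged sketch of what such shared concept code enables: a single, reusable definition of a clinical concept (here, ICU length of stay from the MIMIC-III icustays table) that every study computes the same way. The in-memory database and its two rows are toy stand-ins, not repository code or real data.

        import sqlite3

        # In-memory stand-in for a MIMIC-III extract; the column layout follows
        # the real icustays table, the rows are fabricated toy values.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE icustays (subject_id, icustay_id, intime, outtime)")
        conn.executemany("INSERT INTO icustays VALUES (?, ?, ?, ?)",
                         [(1, 101, "2101-01-01 10:00:00", "2101-01-04 16:00:00"),
                          (2, 102, "2101-02-10 08:00:00", "2101-02-11 20:00:00")])

        # One shared definition of "ICU length of stay" in days.
        ICU_LOS_QUERY = """
            SELECT subject_id, icustay_id,
                   julianday(outtime) - julianday(intime) AS los_days
            FROM icustays
        """
        for subject_id, icustay_id, los_days in conn.execute(ICU_LOS_QUERY):
            print(subject_id, icustay_id, round(los_days, 2))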

  3. Papa, a Particle Tracing Code in Pascal

    NARCIS (Netherlands)

    Haselhoff, E.H.; Haselhoff, Eltjo H.; Ernst, G.J.

    1990-01-01

    During the design of a 10 μm high-gain FEL oscillator (TEUFEL Project) we developed a new particle-tracing code to perform simulations of thermionic- and photo-cathode electron injectors/accelerators. The program allows predictions of current, energy and beam emittance in a user-specified linac.

  4. Assessment of TRACE code against CHF experiments

    International Nuclear Information System (INIS)

    Audrius Jasiulevicius; Rafael Macian-Juan; Paul Coddington

    2005-01-01

    This paper reports on the validation of the USNRC 'consolidated' code TRACE with data obtained during Critical Heat Flux (CHF) experiments in single channels and round and annular tubes. CHF is one of the key reactor safety parameters, because it determines the conditions for the onset of transition boiling in the core rod bundles, leading to the low heat transfer rates characteristic of the post-CHF heat transfer regime. In the context of the participation of PSI in the international programme for uncertainty analysis BEMUSE, we have carried out extensive work for the validation of some important TRACE models. The present work is aimed at assessing the range of validity of the CHF correlations and post-CHF heat transfer models currently included in TRACE. The heat transfer experiments selected for the assessment were performed at the Royal Institute of Technology (RIT) in Stockholm, Sweden and at the Atomic Energy Establishment in Winfrith, UK. The experimental investigations of CHF and post-CHF heat transfer at RIT, for flow of water in vertical tubes and an annulus, were performed at pressures ranging from 1 to 20 MPa and coolant mass fluxes from 500 to 3000 kg/m2·s. The liquid was subcooled by 10 deg. C and 40 deg. C at the inlet of the test section. The experiments were performed on two different types of test sections. Experiments with uniformly heated single 7.0 m long tubes were carried out with three different inner tube diameters of 10, 14.9 and 24.7 mm. A series of experiments with non-uniform axial power distribution were also conducted in order to study the effect of the axial heat flux distribution on the CHF conditions, in both 7.0 m long single tubes and a 3.65 m long annulus. Several different axial power profiles were employed, with bottom, middle and top power peaks as well as double-humped axial power profiles. In total, more than 100 experiments with uniform axial heat flux distribution and several hundreds

  5. Simulation of the turbine trip transient with the code Trace

    International Nuclear Information System (INIS)

    Mejia S, D. M.; Filio L, C.

    2014-10-01

    In this paper the results of the simulation of the turbine trip transient that occurred in Unit 1 of the Laguna Verde nuclear power plant (NPP-LV) are shown, carried out with the model of this unit for the best-estimate code Trace. The results obtained by the code Trace are compared with those obtained from the Process Information Integral System (PIIS) of the NPP-LV. The reactor pressure, the level behavior in the downcomer, the steam flow and the flow rate through the recirculation circuits are compared. The results of the simulation for the operation power of 2027 MWt show concordance with the PIIS system. (Author)

  6. Re-run, Repeat, Reproduce, Reuse, Replicate: Transforming Code into Scientific Contributions

    Directory of Open Access Journals (Sweden)

    Fabien C. Y. Benureau

    2018-01-01

    Scientific code is different from production software. Scientific code, by producing results that are then analyzed and interpreted, participates in the elaboration of scientific conclusions. This imposes specific constraints on the code that are often overlooked in practice. We articulate, with a small example, five characteristics that a scientific code in computational science should possess: re-runnable, repeatable, reproducible, reusable, and replicable. The code should be executable (re-runnable) and produce the same result more than once (repeatable); it should allow an investigator to reobtain the published results (reproducible) while being easy to use, understand and modify (reusable); and it should act as an available reference for any ambiguity in the algorithmic descriptions of the article (replicable).
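
    The distinction between re-runnable and repeatable can be made concrete in a few lines of Python (our own toy example, in the spirit of the article's):

        import random

        def experiment(seed=None):
            """A stochastic 'simulation' whose result depends on the RNG state."""
            rng = random.Random(seed)
            return sum(rng.random() for _ in range(1000))

        print(experiment())         # re-runnable: it executes, but differs per run
        print(experiment(seed=42))  # repeatable: identical result on every run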

  7. SolTrace: A Ray-Tracing Code for Complex Solar Optical Systems

    Energy Technology Data Exchange (ETDEWEB)

    Wendelin, Tim [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lewandowski, Allan [Allan Lewandowski Solar Consulting LLC, Evergreen, CO (United States)

    2013-10-01

    SolTrace is an optical simulation tool designed to model optical systems used in concentrating solar power (CSP) applications. The code was first written in early 2003, but has seen significant modifications and changes since its inception, including conversion from a Pascal-based software development platform to C++. SolTrace is unique in that it can model virtually any optical system utilizing the sun as the source. It has been made available for free and as such is in use worldwide by industry, universities, and research laboratories. The fundamental design of the code is discussed, including enhancements and improvements over the earlier version. Comparisons are made with other optical modeling tools, both non-commercial and commercial in nature. Finally, modeled results are shown for some typical CSP systems and, in one case, compared to measured optical data.

  8. Comparative study of boron transport models in NRC Thermal-Hydraulic Code Trace

    Energy Technology Data Exchange (ETDEWEB)

    Olmo-Juan, Nicolás; Barrachina, Teresa; Miró, Rafael; Verdú, Gumersindo; Pereira, Claubia, E-mail: nioljua@iqn.upv.es, E-mail: tbarrachina@iqn.upv.es, E-mail: rmiro@iqn.upv.es, E-mail: gverdu@iqn.upv.es, E-mail: claubia@nuclear.ufmg.br [Institute for Industrial, Radiophysical and Environmental Safety (ISIRYM). Universitat Politècnica de València (Spain); Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2017-07-01

    Recently, interest in the study of various types of transients involving changes in the boron concentration inside the reactor has led to increased interest in developing and studying new models and tools that allow a correct study of boron transport. Accordingly, a significant variety of boron transport models and spatial difference schemes are available in thermal-hydraulic codes such as TRACE. In this work, the results obtained using the different boron transport models implemented in the NRC thermal-hydraulic code TRACE are compared. To do this, a set of models has been created using the different options and configurations that could influence boron transport. These models reproduce a simple event of filling or emptying of the boron concentration in a long pipe. Moreover, with the aim of comparing the differences obtained when one-dimensional or three-dimensional components are chosen, many different cases have been modeled using only pipe components or a mix of pipe and vessel components. In addition, the influence of the void fraction on boron transport has been studied and compared under conditions close to a commercial BWR model. A final collection of the different cases and boron transport models is compared among themselves and against the corresponding analytical solution provided by the Burgers equation. From this comparison, important conclusions are drawn that will be the basis for adequately modeling boron transport in TRACE. (author)
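
    A minimal sketch of the numerical issue at stake, under our own assumptions (first-order upwind differencing, constant unit velocity; this is not a TRACE model): transporting a sharp boron front along a pipe smears the front through numerical diffusion, which is what the comparison against the analytical solution quantifies.

        import numpy as np

        nx, L, v, dt, steps = 200, 10.0, 1.0, 0.02, 250
        dx = L / nx
        x = np.linspace(0.0, L, nx)
        c = np.where(x < 1.0, 1.0, 0.0)   # initial boron concentration front

        for _ in range(steps):            # explicit upwind scheme (v > 0)
            c[1:] -= v * dt / dx * (c[1:] - c[:-1])

        exact = np.where(x < 1.0 + v * dt * steps, 1.0, 0.0)  # advected step
        print("max |numerical - exact| =", np.abs(c - exact).max())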

  9. Analysis of a full-scale integral test in PERSEO facility by using TRACE code

    Science.gov (United States)

    D’Amico, S.; Lombardo, C.; Moscato, I.; Polidori, M.; Vella, G.

    2017-11-01

    Over the last decades, many experimental investigations have been carried out to increase the reliability of passive decay heat removal systems implementing an in-pool immersed heat exchanger. In this framework, a domestic research program on innovative safety systems was carried out, leading to the design and development of the PERSEO facility at the SIET laboratories. The configuration of the system consists of a heat exchanger contained in a small pool which is connected both at the bottom and at the top to a large water reservoir pool. Within the frame of a national research program funded by the Italian Ministry of Economic Development, the DEIM department of the University of Palermo, in cooperation with ENEA, has developed a computational model of the PERSEO facility in order to simulate its behaviour during an integral test. The analysis presented here has been performed using the best-estimate TRACE code and, in order to highlight the capabilities and limits of the TRACE model in reproducing the experimental trends qualitatively and quantitatively, the main results have been compared with the experimental data. The comparison shows that the model is able to predict the overall behaviour of the plant during the meaningful phases of the transient analysed. Nevertheless, some improvements in the modelling of certain components in which complex three-dimensional phenomena take place are suggested in order to reduce some discrepancies observed between code results and test measurements.

  10. Non-coding RNA detection methods combined to improve usability, reproducibility and precision

    Directory of Open Access Journals (Sweden)

    Kreikemeyer Bernd

    2010-09-01

    Background: Non-coding RNAs gain more attention as their diverse roles in many cellular processes are discovered. At the same time, the need for efficient computational prediction of ncRNAs increases with the pace of sequencing technology. Existing tools are based on various approaches and techniques, but none of them provides a reliable ncRNA detector yet. Consequently, a natural approach is to combine existing tools. Due to a lack of standard input and output formats, combination and comparison of existing tools is difficult. Also, for genomic scans they often need to be incorporated in detection workflows using custom scripts, which decreases transparency and reproducibility. Results: We developed a Java-based framework to integrate existing tools and methods for ncRNA detection. This framework enables users to construct transparent detection workflows and to combine and compare different methods efficiently. We demonstrate the effectiveness of combining detection methods in case studies with the small genomes of Escherichia coli, Listeria monocytogenes and Streptococcus pyogenes. With the combined method, we gained 10% to 20% precision for sensitivities from 30% to 80%. Further, we investigated Streptococcus pyogenes for novel ncRNAs. Using multiple methods, integrated by our framework, we determined four highly probable candidates. We verified all four candidates experimentally using RT-PCR. Conclusions: We have created an extensible framework for practical, transparent and reproducible combination and comparison of ncRNA detection methods. We have proven the effectiveness of this approach in tests and by guiding experiments to find new ncRNAs. The software is freely available under the GNU General Public License (GPL, version 3) at http://www.sbi.uni-rostock.de/moses along with source code, screen shots, examples and tutorial material.
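
    The core combination idea fits in a few lines (our own sketch, not the framework's Java code): require agreement of at least k detectors, which trades sensitivity for precision.

        from collections import Counter

        def combine(predictions, k=2):
            """Keep candidate loci reported by at least k detection methods."""
            votes = Counter(locus for method in predictions for locus in method)
            return {locus for locus, n in votes.items() if n >= k}

        # Toy candidate start positions from three hypothetical detectors.
        tool_a = {10, 50, 90, 130}
        tool_b = {10, 55, 90}
        tool_c = {10, 90, 200}
        print(sorted(combine([tool_a, tool_b, tool_c], k=2)))  # [10, 90]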

  11. Safety related investigations of the VVER-1000 reactor type by the coupled code system TRACE/PARCS

    International Nuclear Information System (INIS)

    Jaeger, Wadim; Lischke, Wolfgang; Sanchez Espinoza, Victor Hugo

    2007-01-01

    This study was performed at the Institute of Reactor Safety at the Research Center Karlsruhe. It is embedded in the ongoing investigations of the international Code Application and Maintenance Program (CAMP) for qualification and validation of system codes like TRACE [1] and PARCS [2]. The reactor type predestined for the validation of these two codes was the Russian-designed VVER-1000, because the OECD/NEA VVER-1000 Coolant Transient Benchmark Phase 2 [3] includes detailed information on the Bulgarian nuclear power plant (NPP) Kozloduy unit 6. The post-test investigations of a coolant mixing experiment have shown that the predicted parameters (coolant temperature, pressure drop, etc.) are in good agreement with the measured data. The coolant mixing pattern, especially in the downcomer, has also been reproduced quite well by TRACE. The coupled code system TRACE/PARCS, which was applied to a postulated main steam line break (MSLB), provides good results compared to reference values and those of other participants in the benchmark. It can be pointed out that the developed three-dimensional nodalisation of the reactor pressure vessel (RPV) is appropriate for the description of transients where the thermal-hydraulics and the neutronics are strongly linked. (author)

  12. Simulation of the turbine trip transient with the code Trace; Simulacion del transitorio disparo de turbina con el codigo TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Mejia S, D. M.; Filio L, C., E-mail: dulcemaria.mejia@cnsns.gob.mx [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Jose Ma. Barragan No. 779, Col. Narvarte, 03020 Mexico D. F. (Mexico)

    2014-10-15

    In this paper the results of the simulation of the turbine trip transient that occurred in Unit 1 of the Laguna Verde nuclear power plant (NPP-LV) are shown, carried out with the model of this unit for the best-estimate code Trace. The results obtained by the code Trace are compared with those obtained from the Process Information Integral System (PIIS) of the NPP-LV. The reactor pressure, the level behavior in the downcomer, the steam flow and the flow rate through the recirculation circuits are compared. The results of the simulation for the operation power of 2027 MWt show concordance with the PIIS system. (Author)

  13. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    Directory of Open Access Journals (Sweden)

    David A Springate

    Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.

  14. [Traces of health culture in the 1427 code of Grbalj].

    Science.gov (United States)

    Milovic-Karic, Grozdana; Milovic, Dorde

    2012-01-01

    The Code of Grbalj regulated a number of legal issues in this area and marked the passage from common to statutory law. It contains several curiosities related to the health culture of the time, such as the rule that a barber was not liable for a patient's death due to surgery. In fact, surgery was preceded by a symbolic act in which the barber would hand a razor to the patient, and the patient would hand it back. The intention of this provision was to protect the surgeon from blood feud. As for the corpse, the Code provided that it should be kept at home overnight and buried in the morning. Punitive provisions include stoning of the engaged couple in case of pregnancy, as engagement commanded absolute virtue. The punishment for striking one's parents was to cut off the hand that hit them; the ear was cut off or the nose scarred to permanently mark an adulteress or a woman who stole from the husband's house and sold the stolen property to fill up her belly. Children who stole from the house and sold the property were punished by flogging with a chibouk.

  15. Coupling the beam tracing code TORBEAM and the Fokker-Planck solver RELAX for fast electrons

    NARCIS (Netherlands)

    Maj, O.; Poli, E.; Westerhof, E.

    2012-01-01

    In this paper the interface between the beam tracing code TORBEAM [Poli, Peeters and Pereverzev, Comp. Phys. Comm. 136, 90 (2001)] and the quasi-linear Fokker-Planck solver RELAX [Westerhof, Peeters and Schippers, Rijnhuizen Report No. RR 92-211 CA, 1992] is presented together with preliminary

  16. Numerical discretization analysis of a HTR steam generator model for the thermal-hydraulics code trace

    Directory of Open Access Journals (Sweden)

    Esch Markus

    2014-01-01

    For future high temperature reactor projects, e.g. for electricity production or nuclear process heat applications, the steam generator is a crucial component. A typical design is a helical coil steam generator consisting of several tubes connected in parallel, forming cylinders of different diameters. This type of steam generator was a significant component of the thorium high temperature reactor (THTR). In the work presented, the temperature profile of the THTR steam generator is analyzed with the nodal thermal-hydraulics code TRACE. The influence of the nodalization is investigated within the scope of this study and compared to past experimental results. The results of the standard TRACE code are compared to results using a modified Nusselt number for the primary side. The implemented heat transfer correlation was developed within the past German HTR program. This study shows that both TRACE versions are stable and provides a discussion of the nodalization requirements.
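
    To make the "modified Nusselt number" idea concrete: single-phase forced-convection correlations typically take the Dittus-Boelter form sketched below, and the study swaps in a correlation of the same kind. The modified coefficients here are placeholders, not the values from the German HTR program, which the record does not give.

        def nusselt_dittus_boelter(re, pr):
            """Classic single-phase forced-convection correlation."""
            return 0.023 * re**0.8 * pr**0.4

        def nusselt_modified(re, pr, c=0.021, m=0.8, n=0.4):
            """Same functional form with project-specific coefficients
            (the defaults here are hypothetical)."""
            return c * re**m * pr**n

        for re in (1e4, 1e5):
            print(re, nusselt_dittus_boelter(re, 0.7), nusselt_modified(re, 0.7))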

  17. [Transposition errors during learning to reproduce a sequence by the right- and the left-hand movements: simulation of positional and movement coding].

    Science.gov (United States)

    Liakhovetskiĭ, V A; Bobrova, E V; Skopin, G N

    2012-01-01

    Transposition errors during the reproduction of a hand movement sequence make it possible to obtain important information on the internal representation of this sequence in the motor working memory. Analysis of such errors showed that learning to reproduce sequences of left-hand movements improves the system of positional coding (coding of positions), while learning of right-hand movements improves the system of vector coding (coding of movements). Learning of right-hand movements after left-hand performance involved the system of positional coding "imposed" by the left hand. Learning of left-hand movements after right-hand performance activated the system of vector coding. Transposition errors during learning to reproduce movement sequences can be explained by a neural network using either vector coding or both vector and positional coding.
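
    A toy formalization of the two coding schemes (our reading of the terms, not the authors' model): positional coding stores the target positions themselves, while vector coding stores the displacements between successive positions, so reproduction integrates them and an error in one stored vector shifts every later position.

        positions = [2, 5, 1, 4]           # a sequence of hand positions

        positional_code = list(positions)  # positional coding: positions as-is

        # Vector coding: successive displacements; reproduction integrates them.
        vector_code = [b - a for a, b in zip(positions, positions[1:])]
        reproduced = [positions[0]]
        for v in vector_code:
            reproduced.append(reproduced[-1] + v)

        print(positional_code)  # [2, 5, 1, 4]
        print(vector_code)      # [3, -4, 3]
        print(reproduced)       # [2, 5, 1, 4], unless a vector is corrupted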

  18. Assessment of GOTHIC and TRACE codes against selected PANDA experiments on a Passive Containment Condenser

    Energy Technology Data Exchange (ETDEWEB)

    Papini, Davide, E-mail: davide.papini@psi.ch; Adamsson, Carl; Andreani, Michele; Prasser, Horst-Michael

    2014-10-15

    Highlights: • Code comparison on the performance of a Passive Containment Condenser. • Simulation of separate effect tests with pure steam and non-condensable gases. • Role of the secondary side and accuracy of pool boiling models are discussed. • GOTHIC and TRACE predict the experimental performance with slight underestimation. • Recirculatory flow pattern with injection of light non-condensable gas is inferred. - Abstract: Typical passive safety systems for ALWRs (Advanced Light Water Reactors) rely on the condensation of steam to remove the decay heat from the core or the containment. In the present paper the three-dimensional containment code GOTHIC and the one-dimensional system code TRACE are compared in the calculation of a variety of phenomena characterizing the response of a passive condenser submerged in a boiling pool. The investigation addresses conditions of interest for the Passive Containment Cooling System (PCCS) proposed for the ESBWR (Economic Simplified Boiling Water Reactor). The analysis of selected separate effect tests carried out on a PCC (Passive Containment Condenser) unit in the PANDA large-scale thermal-hydraulic facility is presented to assess the code predictions. Both pure steam conditions (operating pressures of 3 bar, 6 bar and 9 bar) and the effect on the condensation heat transfer of non-condensable gases heavier than steam (air) and lighter than steam (helium) are considered. The role of the secondary side (pool side) heat transfer on the condenser performance is examined too. In general, this study shows that both the GOTHIC and TRACE codes are able to reasonably predict the heat transfer capability of the PCC as well as the influence of non-condensable gas on the system. A slight underestimation of the condenser performance is obtained with both codes. For those tests where the experimental and simulated efficiencies agree better, the possibility of compensating errors among different parts of the heat transfer

  19. Adaptation and implementation of the TRACE code for transient analysis in lead-cooled fast reactor designs

    International Nuclear Information System (INIS)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2015-01-01

    The Lead-Cooled Fast Reactor (LFR) has been identified as one of the promising future reactor concepts in the technology roadmap of the Generation IV International Forum (GIF) as well as in the Deployment Strategy of the European Sustainable Nuclear Industrial Initiative (ESNII), both aiming at improved sustainability, enhanced safety, economic competitiveness, and proliferation resistance. This new nuclear reactor concept requires the development of computational tools to be applied in design and safety assessments to confirm the improved inherent and passive safety features of this design. One approach to this issue is to modify the current computational codes developed for the simulation of Light Water Reactors towards applicability to the new designs. This paper reports on the modifications performed on the TRACE system code to make it applicable to LFR safety assessments. The capabilities of the modified code are demonstrated on a series of benchmark exercises performed against other safety analysis codes. (Author)

  20. Digitized forensics: retaining a link between physical and digital crime scene traces using QR-codes

    Science.gov (United States)

    Hildebrandt, Mario; Kiltz, Stefan; Dittmann, Jana

    2013-03-01

    The digitization of physical traces from crime scenes in forensic investigations in effect creates a digital chain-of-custody and entails the challenge of creating a link between the two or more representations of the same trace. In order to be forensically sound, the two security aspects of integrity and authenticity especially need to be maintained at all times. The adherence to authenticity using technical means proves to be a particular challenge at the boundary between the physical object and its digital representations. In this article we propose a new method of linking physical objects with their digital counterparts using two-dimensional bar codes and additional meta-data accompanying the acquired data, for integration in the conventional documentation of the collection of items of evidence (bagging and tagging process). Using the exemplarily chosen QR-code as a particular implementation of a bar code and a model of the forensic process, we also supply a means to integrate our suggested approach into forensically sound proceedings as described by Holder et al. [1] We use the example of digital dactyloscopy as a forensic discipline where progress is currently being made by digitizing some of the processing steps. We show an exemplary demonstrator of the suggested approach using a smartphone as a mobile device for the verification of the physical trace, to extend the chain-of-custody from the physical to the digital domain. Our evaluation of the demonstrator addresses the readability and the verification of its contents. We can read the bar code despite its limited size of 42 x 42 mm and rather large amount of embedded data using various devices. Furthermore, the QR-code's error correction features help to recover the contents of damaged codes. Subsequently, our appended digital signature allows for detecting malicious manipulations of the embedded data.
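
    A minimal sketch of the general approach (ours, not the authors' implementation): bundle trace meta-data with a digital signature and store both in a QR code, so the printed tag can later be checked for manipulation. The meta-data values are hypothetical; the snippet uses the qrcode and cryptography packages.

        import json
        import qrcode  # pip install qrcode[pil]
        from cryptography.hazmat.primitives.asymmetric.ed25519 import (
            Ed25519PrivateKey)

        meta = {"case": "2013-0042", "item": "latent print #3",
                "acquired": "2013-01-15T10:22:00Z"}  # hypothetical meta-data
        payload = json.dumps(meta, sort_keys=True).encode()

        key = Ed25519PrivateKey.generate()
        tag = {"meta": meta, "sig": key.sign(payload).hex()}
        qrcode.make(json.dumps(tag)).save("evidence_tag.png")

        # Verification side: recompute the payload and check the signature;
        # verify() raises an exception if the embedded data were altered.
        key.public_key().verify(bytes.fromhex(tag["sig"]), payload)
        print("signature valid")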

  1. 3-D TECATE/BREW: Thermal, stress, and birefringent ray-tracing codes for solid-state laser design

    Science.gov (United States)

    Gelinas, R. J.; Doss, S. K.; Nelson, R. G.

    1994-07-01

    This report describes the physics, code formulations, and numerics that are used in the TECATE (totally Eulerian code for anisotropic thermo-elasticity) and BREW (birefringent ray-tracing of electromagnetic waves) codes for laser design. These codes resolve thermal, stress, and birefringent optical effects in 3-D stationary solid-state systems. This suite of three constituent codes is a package referred to as LASRPAK.

  2. Neutronic / thermal-hydraulic coupling with the code system Trace / Parcs

    International Nuclear Information System (INIS)

    Mejia S, D. M.; Del Valle G, E.

    2015-09-01

    The models developed for the Parcs and Trace codes, corresponding to cycle 15 of Unit 1 of the Laguna Verde nuclear power plant, are described; the first is focused on the neutronic simulation and the second on the thermal hydraulics. The model developed for Parcs consists of a core of 444 fuel assemblies wrapped in a radial reflector layer and two axial reflector layers, one above and one below. The core consists of 27 axial planes in total. The model for Trace includes the vessel and its internal components as well as various safety systems. The coupling between the two codes is through two maps that allow their intercommunication. Both codes are used in coupled form, performing a dynamic simulation that acceptably reaches a stable state, from which the closure of all the main steam isolation valves (MSIVs) is carried out, followed by the actuation of the safety relief valves (SRVs) and the ECCS. The results for the power and for the reactivities introduced by the moderator density and the fuel temperature, as well as the total reactivity, are shown. Data are also provided on the behavior of the pressure in the steam dome, the water level in the downcomer, and the flow through the MSIVs and SRVs. The results are explained for the power, the pressure in the steam dome and the water level in the downcomer, which show agreement with the actions of the MSIVs, SRVs and ECCS. (Author)

  3. Research related to thermo-hydraulic safety by means of the Trace code; Investigaciones relacionadas con seguridad termohidraulica con el codigo TRACE

    Energy Technology Data Exchange (ETDEWEB)

    Chaparro V, F. J.; Del Valle G, E. [IPN, Escuela Superior de Fisica y Matematicas, UP - Adolfo Lopez Mateos, Edif. 9, 07738 Mexico D. F. (Mexico); Rodriguez H, A.; Gomez T, A. M. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Sanchez E, V. H.; Jager, W., E-mail: evalle@esfm.ipn.mx [Karlsruhe Institute of Technology, Hermann-von-Helmholtz Platz I, D-76344 Eggenstein - Leopoldshafen (Germany)

    2014-10-15

    In this article the results of the design of a pressure vessel of a BWR/5 similar to that of the Laguna Verde NPP are presented, using the Trace code. A thermo-hydraulic Vessel component capable of simulating the behavior of the fluids and heat transfer that occur within the reactor vessel was created. The Vessel component consists of a three-dimensional cylinder divided into 19 axial sections, 4 azimuthal sections and two concentric radial rings. The inner ring is used to contain the core and the central part of the reactor, while the outer ring is used as a downcomer. Axial and azimuthal divisions were made with the intention that the dimensions of the internal components, the heights and the orientation of the external connections match the reference values of a BWR/5 type reactor. The model includes internal components such as fuel assemblies, steam separators, jet pumps and guide tubes, as well as the main external connections such as steam lines, feed-water lines and penetrations of the recirculation system. The model presents significant simplifications because the objective is to keep symmetry between the azimuthal sections of the vessel. Most internal components lack a detailed description of the geometry, and for the initial values of temperature, pressure, fluid velocity, etc. only the most representative data were considered; nevertheless, with these simplifications the simulations yield acceptable results for important parameters such as the total flow through the core, the pressure in the vessel, the void fraction, and the pressure drop across the core and the steam separators. (Author)

  4. Adjustments in the Almod 3W2 code models for reproducing the net load trip test in Angra I nuclear power plant

    International Nuclear Information System (INIS)

    Camargo, C.T.M.; Madeira, A.A.; Pontedeiro, A.C.; Dominguez, L.

    1986-09-01

    The traces recorded during the net load trip test at the Angra I NPP provided the opportunity to make fine adjustments to the ALMOD 3W2 code models. The changes are described and the results are compared against real plant data. (Author) [pt]

  5. Comparative Analysis of CTF and Trace Thermal-Hydraulic Codes Using OECD/NRC PSBT Benchmark Void Distribution Database

    Directory of Open Access Journals (Sweden)

    M. Avramova

    2013-01-01

    The international OECD/NRC PSBT benchmark has been established to provide a test bed for assessing the capabilities of thermal-hydraulic codes and to encourage advancement in the analysis of fluid flow in rod bundles. The benchmark was based on one of the most valuable databases identified for thermal-hydraulics modeling, developed by NUPEC, Japan. The database includes void fraction and departure from nucleate boiling measurements in a representative PWR fuel assembly. On behalf of the benchmark team, PSU in collaboration with the US NRC has performed supporting calculations using the PSU in-house advanced thermal-hydraulic subchannel code CTF and the US NRC system code TRACE. CTF is a version of COBRA-TF whose models have been continuously improved and validated by the RDFMG group at PSU. TRACE is a reactor systems code developed by the US NRC to analyze transient and steady-state thermal-hydraulic behavior in LWRs, designed to perform best-estimate analyses of LOCA, operational transients, and other accident scenarios in PWRs and BWRs. The paper presents CTF and TRACE models for the PSBT void distribution exercises. Code-to-code and code-to-data comparisons are provided, along with a discussion of the void generation and void distribution models available in the two codes.

  6. Feasibility Study of Coupling the CASMO-4/TABLES-3/SIMULATE-3 Code System to TRACE/PARCS

    International Nuclear Information System (INIS)

    Demaziere, Christophe; Staalek, Mathias

    2004-12-01

    This report investigates the feasibility of coupling the Studsvik Scandpower CASMO-4/TABLES-3/SIMULATE-3 codes to the US NRC TRACE/PARCS codes. The data required by TRACE/PARCS are actually the ones necessary to run its neutronic module PARCS. Such data are the macroscopic nuclear cross-sections, some microscopic nuclear cross-sections important for the xenon and samarium poisoning effects, the Assembly Discontinuity Factors, and the kinetic parameters. All these data can be retrieved from the Studsvik Scandpower codes. The data functionalization is explained in detail for both systems of codes, and the possibility of coupling each of these codes to TRACE/PARCS is discussed. Due to confidentiality restrictions in the use of the CASMO-4 files and to an improper format of the TABLES-3 output files, it is demonstrated that TRACE/PARCS can only be coupled to SIMULATE-3. Specifically dedicated SIMULATE-3 input decks make it easy to edit the neutronic data at specific operating statepoints. Although the data functionalization is different between the two systems of codes, such a procedure permits reconstructing a set of data directly compatible with PARCS.
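
    A schematic of what "data functionalization" means in practice (the choice of state variable and the values below are hypothetical): few-group cross-sections are tabulated at operating statepoints and interpolated by the core simulator.

        import numpy as np

        rho = np.array([0.3, 0.5, 0.7])             # moderator density, g/cm3
        sigma_a2 = np.array([0.085, 0.095, 0.105])  # thermal absorption XS, 1/cm

        def xs_at(density):
            """Linear interpolation between tabulated statepoints."""
            return float(np.interp(density, rho, sigma_a2))

        print(xs_at(0.6))  # 0.1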

  7. Adaptation and implementation of the TRACE code for transient analysis of lead-cooled fast reactor designs

    International Nuclear Information System (INIS)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2014-01-01

    The article describes the changes implemented in the TRACE code to include thermodynamic tables for liquid lead drawn from experimental results. It then explains the process of developing a thermo-hydraulic model of the ALFRED prototype and the analysis of a selection of representative transients conducted within the framework of international research projects. The study demonstrates the applicability of the TRACE code to the simulation of lead-cooled fast reactor designs and shows the high safety margins present in this technology to accommodate the most severe transients identified in its safety study. (Author)

  8. Assessing flow paths in a karst aquifer based on multiple dye tracing tests using stochastic simulation and the MODFLOW-CFP code

    Science.gov (United States)

    Assari, Amin; Mohammadi, Zargham

    2017-09-01

    Karst systems show high spatial variability of hydraulic parameters over small distances, and this makes their modeling a difficult task with several uncertainties. Interconnections of fractures play a major role in the transport of groundwater, but many of the stochastic methods in use do not have the capability to reproduce these complex structures. A methodology is presented for the quantification of tortuosity using the single normal equation simulation (SNESIM) algorithm and a groundwater flow model. A training image was produced based on the statistical parameters of the fractures and then used in the simulation process. The SNESIM algorithm was used to generate 75 realizations of the four classes of fractures in a karst aquifer in Iran. The results from six dye tracing tests were used to assign hydraulic conductivity values to each class of fractures. In the next step, the MODFLOW-CFP and MODPATH codes were consecutively implemented to compute the groundwater flow paths. The 9,000 flow paths obtained from the MODPATH code were further analyzed to calculate the tortuosity factor. Finally, the hydraulic conductivity values calculated from the dye tracing experiments were refined using the actual flow paths of the groundwater. The key outcomes of this research are: (1) a methodology for the quantification of tortuosity; (2) hydraulic conductivities that are incorrectly estimated (biased low) by empirical equations that assume Darcian (laminar) flow with parallel rather than tortuous streamlines; and (3) an understanding of the scale dependence and non-normal distributions of tortuosity.
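
    The tortuosity computation lends itself to a short sketch; we assume the common definition (flow-path length divided by the straight-line distance between its end points), which the record does not spell out.

        import numpy as np

        def tortuosity(path):
            """path: (n, 3) array of x, y, z points along one flow path."""
            steps = np.diff(path, axis=0)
            path_length = np.linalg.norm(steps, axis=1).sum()
            straight = np.linalg.norm(path[-1] - path[0])
            return path_length / straight

        # Toy zig-zag path between two points, e.g. as exported by MODPATH.
        path = np.array([[0, 0, 0], [1, 1, 0], [2, 0, 0], [3, 1, 0], [4, 0, 0]],
                        dtype=float)
        print(tortuosity(path))  # > 1; equals 1 only for a straight line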

  9. A model of polarized-beam AGS in the ray-tracing code Zgoubi

    Energy Technology Data Exchange (ETDEWEB)

    Meot, F. [Brookhaven National Lab. (BNL), Upton, NY (United States); Ahrens, L. [Brookhaven National Lab. (BNL), Upton, NY (United States); Brown, K. [Brookhaven National Lab. (BNL), Upton, NY (United States); Dutheil, Y. [Brookhaven National Lab. (BNL), Upton, NY (United States); Glenn, J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Huang, H. [Brookhaven National Lab. (BNL), Upton, NY (United States); Roser, T. [Brookhaven National Lab. (BNL), Upton, NY (United States); Shoefer, V. [Brookhaven National Lab. (BNL), Upton, NY (United States); Tsoupas, N. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-07-12

    A model of the Alternating Gradient Synchrotron, based on the AGS snapramps, has been developed in the stepwise ray-tracing code Zgoubi. It has been used over the past 5 years in a number of accelerator studies aimed at enhancing RHIC proton beam polarization. It is also used to study and optimize proton and Helion beam polarization in view of future RHIC and eRHIC programs. The AGS model in Zgoubi is operational on-line via three different applications, ’ZgoubiFromSnaprampCmd’, ’AgsZgoubiModel’ and ’AgsModelViewer’, the latter two being essentially interfaces to the former, which is the actual model ’engine’. All three commands are available from the controls system application launcher in the AGS ’StartUp’ menu, or from eponymous commands on shell terminals. Main aspects of the model and of its operation are presented in this technical note; brief excerpts from various studies performed so far are given for illustration, and the means and methods entering ZgoubiFromSnaprampCmd are developed further in the appendix.

  10. Research related to thermo-hydraulic safety by means of the Trace code

    International Nuclear Information System (INIS)

    Chaparro V, F. J.; Del Valle G, E.; Rodriguez H, A.; Gomez T, A. M.; Sanchez E, V. H.; Jager, W.

    2014-10-01

    In this article the results of the design of a pressure vessel of a BWR/5 similar to that of the Laguna Verde NPP are presented, using the Trace code. A thermo-hydraulic Vessel component capable of simulating the behavior of the fluids and heat transfer that occur within the reactor vessel was created. The Vessel component consists of a three-dimensional cylinder divided into 19 axial sections, 4 azimuthal sections and two concentric radial rings. The inner ring is used to contain the core and the central part of the reactor, while the outer ring is used as a downcomer. Axial and azimuthal divisions were made with the intention that the dimensions of the internal components, the heights and the orientation of the external connections match the reference values of a reactor BWR/5 type reactor. The model includes internal components such as fuel assemblies, steam separators, jet pumps and guide tubes, as well as the main external connections such as steam lines, feed-water lines and penetrations of the recirculation system. The model presents significant simplifications because the objective is to keep symmetry between the azimuthal sections of the vessel. Most internal components lack a detailed description of the geometry, and for the initial values of temperature, pressure, fluid velocity, etc. only the most representative data were considered; nevertheless, with these simplifications the simulations yield acceptable results for important parameters such as the total flow through the core, the pressure in the vessel, the void fraction, and the pressure drop across the core and the steam separators. (Author)

  11. Neutronic / thermal-hydraulic coupling with the code system Trace / Parcs; Acoplamiento neutronico / termohidraulico con el sistema de codigos TRACE / PARCS

    Energy Technology Data Exchange (ETDEWEB)

    Mejia S, D. M. [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan 779, Col. Narvarte, 03020 Ciudad de Mexico (Mexico); Del Valle G, E., E-mail: dulcemaria.mejia@cnsns.gob.mx [IPN, Escuela Superior de Fisica y Matematicas, Av. IPN s/n, Col. Lindavista, 07738 Ciudad de Mexico (Mexico)

    2015-09-15

    The models developed for the Parcs and Trace codes, corresponding to cycle 15 of Unit 1 of the Laguna Verde nuclear power plant, are described; the first is focused on the neutronic simulation and the second on the thermal hydraulics. The model developed for Parcs consists of a core of 444 fuel assemblies wrapped in a radial reflector layer and two axial reflector layers, one above and one below. The core consists of 27 axial planes in total. The model for Trace includes the vessel and its internal components as well as various safety systems. The coupling between the two codes is through two maps that allow their intercommunication. Both codes are used in coupled form, performing a dynamic simulation that acceptably reaches a stable state, from which the closure of all the main steam isolation valves (MSIVs) is carried out, followed by the actuation of the safety relief valves (SRVs) and the ECCS. The results for the power and for the reactivities introduced by the moderator density and the fuel temperature, as well as the total reactivity, are shown. Data are also provided on the behavior of the pressure in the steam dome, the water level in the downcomer, and the flow through the MSIVs and SRVs. The results are explained for the power, the pressure in the steam dome and the water level in the downcomer, which show agreement with the actions of the MSIVs, SRVs and ECCS. (Author)

  12. Benchmarking of the WIMS9/PARCS/TRACE code system for neutronic calculations of the Westinghouse AP1000™ reactor

    Energy Technology Data Exchange (ETDEWEB)

    Elsawi, Mohamed A., E-mail: Mohamed.elsawi@kustar.ac.ae; Hraiz, Amal S. Bin, E-mail: Amal.Hraiz@kustar.ac.ae

    2015-11-15

    Highlights: • AP1000 core configuration is challenging due to its high degree of heterogeneity. • The proposed code was used to model the neutronics/TH behavior of the AP1000 reactor. • Enhanced modeling features in WIMS9 facilitated neutronics modeling of the reactor. • The PARCS/TRACE coupled code system was used to model the temperature feedback effects. • Final results showed reasonable agreement with publicly available reactor data. - Abstract: The objective of this paper is to assess the accuracy of the WIMS9/PARCS/TRACE code system for power density calculations of the Westinghouse AP1000™ nuclear reactor, as a representative of modern pressurized water reactors (Gen III+). The cross-section libraries were generated using the lattice physics code WIMS9 (the commercial version of the legacy lattice code WIMSD). Nine different fuel assembly types were analyzed in WIMS9 to generate the two-group cross-sections required by the PARCS core simulator. The nine fuel assembly types were identified based on the distribution of Pyrex discrete burnable absorber (borosilicate glass) and integral fuel burnable absorber (IFBA) rods in each fuel assembly. The generated cross-sections were passed to the coupled core simulator PARCS/TRACE, which performed 3-D, full-core diffusion calculations from within the US NRC Symbolic Nuclear Analysis Package (SNAP) interface. The results, which included assembly power distribution, effective multiplication factor (k_eff), radial and axial power density, and whole-core depletion, were compared to reference Monte Carlo results and to published reactor data available in the AP1000 Design Control Document (DCD). The results of the study show acceptable accuracy of the WIMS9/PARCS/TRACE code in predicting the power density of the AP1000 core and, hence, establish its adequacy in the evaluation of the neutronics parameters of modern PWRs of similar design. The work reported here is new in that it uses, for the first time, the

  13. Qualification of TRACE V5.0 Code against Fast Cooldown Transient in the PKL-III Integral Test Facility

    Directory of Open Access Journals (Sweden)

    Eugenio Coscarelli

    2013-01-01

    The present paper deals with the analytical study of the PKL experiment G3.1, performed using the TRACE code (version 5.0, patch 1). The test G3.1 simulates a fast cooldown transient, namely a main steam line break. This leads to a strong asymmetry caused by an increase of the heat transfer from the primary to the secondary side, which induces a fast cooldown transient in the affected loop of the primary side. The asymmetric overcooling effect requires an assessment of the reactor pressure vessel integrity considering PTS (pressurized thermal shock) and an assessment of potential recriticality following entrainment of colder water into the core area. The aim of this work is the qualification, against experimental data, of the heat transfer capabilities of the TRACE code from the primary to the secondary side in the intact and affected steam generators (SGs) during the rapid depressurization and the boiloff in the affected SG.

  14. An analysis of options available for developing a common laser ray tracing package for Ares and Kull code frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Weeratunga, S K

    2008-11-06

    Ares and Kull are mature code frameworks that support ALE hydrodynamics for a variety of HEDP applications at LLNL, using two widely different meshing approaches. While Ares is based on a 2-D/3-D block-structured mesh database, Kull is designed to support unstructured, arbitrary polygonal/polyhedral meshes. In addition, both frameworks are capable of running applications on large, distributed-memory parallel machines. Currently, both frameworks separately support assorted collections of physics packages related to HEDP, including one for energy deposition by laser/ion-beam ray tracing. This study analyzes the options available for developing a common laser/ion-beam ray tracing package that can be easily shared between the two code frameworks, and concludes with a set of recommendations for its development.

  15. Critical review of conservation equations for two-phase flow in the U.S. NRC TRACE code

    International Nuclear Information System (INIS)

    Wulff, Wolfgang

    2011-01-01

    Research highlights: → Field equations as implemented in TRACE are incorrect. → Boundary conditions needed for cooling of nuclear fuel elements are wrong. → The two-fluid model in TRACE is not closed. → Three-dimensional flow modeling in TRACE has no basis. - Abstract: The field equations for two-phase flow in the computer code TRAC/RELAP Advanced Computational Engine, or TRACE, are examined to determine their validity and their capabilities and limitations in resolving nuclear reactor safety issues. TRACE was developed for the NRC to predict thermohydraulic phenomena in nuclear power plants during operational transients and postulated accidents. TRACE is based on the rigorously derived and well-established two-fluid field equations for 1-D and 3-D two-phase flow. It is shown that: (1) The two-fluid field equations for mass conservation as implemented in TRACE are wrong, because local mass balances in TRACE are in conflict with mass conservation for the whole reactor system. (2) Wrong equations of motion are used in TRACE in place of momentum balances, compromising at branch points the prediction of momentum transfer between, and the coupling of, loops in hydraulic networks by impedance (form loss and wall shear) and by inertia, and thereby the simulation of reactor component interactions. (3) Most seriously, the TRACE calculation of heat transfer from fuel elements is incorrect for single- and two-phase flows, because the corresponding equation of the TRACE Manual is wrong. (4) Boundary conditions for momentum and energy balances in TRACE are restricted to flow regimes with single-phase wall contact, because TRACE lacks constitutive relations for solid-fluid exchange of momentum and heat in prevailing flow regimes. Without a quantified assessment of the consequences of (3) and (4), predictions of phasic fluid velocities, fuel temperatures and important safety parameters, e.g., peak clad temperature, are questionable. Moreover, TRACE cannot predict 3-D single- or
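
    For reference, the textbook 1-D two-fluid phasic mass balance that such a critique takes as its yardstick can be written as follows (standard form, not TRACE's implementation):

        \frac{\partial}{\partial t}\left(\alpha_k \rho_k\right)
          + \frac{1}{A}\frac{\partial}{\partial x}\left(\alpha_k \rho_k u_k A\right)
          = \Gamma_k ,
        \qquad k \in \{\text{liquid}, \text{vapour}\}, \qquad \sum_k \Gamma_k = 0

    where \alpha_k is the phasic volume fraction, \rho_k the density, u_k the velocity, A the flow area and \Gamma_k the interphase mass transfer rate; summing the two phasic balances recovers mixture mass conservation, the property the review says the implementation violates.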

  16. Validation and application of the system code TRACE for safety related investigations of innovative nuclear energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Wadim

    2011-12-19

    The system code TRACE is the latest development of the U.S. Nuclear Regulatory Commission (US NRC). TRACE, developed for the analysis of operational conditions, transients and accidents of light water reactors (LWR), is a best-estimate code with two-fluid, six-equation models for mass, energy, and momentum conservation, and related closure models. Since TRACE is mainly applied to LWR-specific issues, the validation process related to innovative nuclear systems (liquid metal cooled systems, systems operated with supercritical water, etc.) is very limited, almost non-existent. In this work, an essential contribution to the validation of TRACE for lead and lead-alloy cooled systems, as well as systems operated with supercritical water, is provided in a consistent and unified way. In a first step, model discrepancies in the TRACE source code were removed. These inconsistencies caused the wrong prediction of the thermophysical properties of supercritical water and lead-bismuth eutectic, and hence the incorrect prediction of heat transfer relevant characteristic numbers like the Reynolds or Prandtl number. In addition to the correction of the models predicting these quantities, models describing the thermophysical properties of lead and Diphyl THT (a synthetic heat transfer medium) were implemented. Several experiments and numerical benchmarks were used to validate the modified TRACE version. These experiments, mainly focused on wall-to-fluid heat transfer, revealed that not only the thermophysical properties but also the heat transfer models were afflicted with inconsistencies. The models for heat transfer to liquid metals were enhanced so that the code can now distinguish between pipe and bundle flow by using the right correlation. Heat transfer to supercritical water did not exist in TRACE until now; completely new routines were implemented to overcome that issue. The comparison of the calculations to the experiments showed, on one hand, the necessity

  17. Sensory system development influences the ontogeny of hippocampal associative coding and trace eyeblink conditioning.

    Science.gov (United States)

    Goldsberry, Mary E; Kim, Jangjin; Freeman, John H

    2017-09-01

    Until recently, it was believed that hippocampal development was the primary rate-limiting factor in the developmental emergence of hippocampal forms of learning, such as trace eyeblink conditioning (EBC). Indeed, hippocampal neuronal activity shows an age-related increase in both complexity and task responsiveness during trace EBC. However, recent work from our laboratory suggests that sensory system development may also play a role. Training with the earlier-developing somatosensory system results in an earlier emergence of trace EBC in rats, suggesting that the development of sensory input to the hippocampus may influence the development of trace EBC. The goal of the current study was to examine the activity of hippocampal CA1 pyramidal cells during acquisition of trace EBC with an early-developing somatosensory CS. Rat pups were trained with a vibration CS on postnatal days (P) 17-19, P21-23, and P24-26 while CA1 pyramidal cell activity was recorded. Results indicated that CA1 neurons show an age-related increase in responsiveness to trial events. Although the magnitude of neuronal responding showed age-related increases in activity, all three age groups demonstrated learning-related increases in firing rate magnitude and peaks in firing rate were evident both at CS onset and offset. These findings suggest that the ontogeny of trace eyeblink conditioning is related to both hippocampal and sensory system development. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. The new semi-analytic code GalICS 2.0 - reproducing the galaxy stellar mass function and the Tully-Fisher relation simultaneously

    Science.gov (United States)

    Cattaneo, A.; Blaizot, J.; Devriendt, J. E. G.; Mamon, G. A.; Tollet, E.; Dekel, A.; Guiderdoni, B.; Kucukbas, M.; Thob, A. C. R.

    2017-10-01

    GalICS 2.0 is a new semi-analytic code to model the formation and evolution of galaxies in a cosmological context. N-body simulations based on a Planck cosmology are used to construct halo merger trees, track subhaloes, compute spins and measure concentrations. The accretion of gas on to galaxies and the morphological evolution of galaxies are modelled with prescriptions derived from hydrodynamic simulations. Star formation and stellar feedback are described with phenomenological models (as in other semi-analytic codes). GalICS 2.0 computes rotation speeds from the gravitational potential of the dark matter, the disc and the central bulge. As the rotation speed depends not only on the virial velocity but also on the ratio of baryons to dark matter within a galaxy, our calculation predicts a different Tully-Fisher relation from models in which v_rot ∝ v_vir. This is why GalICS 2.0 is able to reproduce the galaxy stellar mass function and the Tully-Fisher relation simultaneously. Our results are also in agreement with halo masses from weak lensing and satellite kinematics, gas fractions, the relation between star formation rate (SFR) and stellar mass, the evolution of the cosmic SFR density, bulge-to-disc ratios, disc sizes and the Faber-Jackson relation.
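
    Schematically (our paraphrase of the description above, not the code's exact expression), the component contributions to the rotation speed add in quadrature through the total gravitational potential:

        v_{\mathrm{rot}}^2(r) = v_{\mathrm{DM}}^2(r) + v_{\mathrm{disc}}^2(r)
                              + v_{\mathrm{bulge}}^2(r),
        \qquad v_i^2(r) = r\,\frac{\partial \Phi_i}{\partial r}

    so a baryon-rich galaxy rotates faster at fixed virial velocity, which is what breaks the v_rot ∝ v_vir scaling and lets the model fit the stellar mass function and the Tully-Fisher relation at the same time.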

  19. TRACE Code Validation for BWR Spray Cooling Injection and CCFL Condition Based on GÖTA Facility Experiments

    Directory of Open Access Journals (Sweden)

    Stefano Racca

    2012-01-01

    Full Text Available Best estimate codes have been used in the past thirty years for the design, licensing, and safety analysis of nuclear power plants (NPPs). Nevertheless, large efforts are necessary for the qualification and assessment of such codes. The aim of this work is to study the main phenomena involved in the emergency spray cooling injection in a Swedish-designed BWR. For this purpose, data from the Swedish separate effect test facility GÖTA have been simulated using TRACE version 5.0 Patch 2. Furthermore, uncertainty calculations have been performed with the propagation-of-input-errors method, and the input parameters that most influence the peak cladding temperature have been identified.
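
    The propagation-of-input-errors method referred to in this record typically sizes the number of code runs with Wilks' formula. As a hedged illustration, the snippet below computes the smallest sample size for a one-sided, first-order nonparametric tolerance limit; for 95% probability at 95% confidence this yields the familiar 59 runs.

        import math

        def wilks_sample_size(coverage=0.95, confidence=0.95):
            """Smallest n satisfying 1 - coverage**n >= confidence
            (one-sided, first-order Wilks tolerance limit)."""
            return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

        print(wilks_sample_size())            # -> 59 runs for a 95/95 statement
        print(wilks_sample_size(0.95, 0.99))  # -> 90 runs for 95/99

    With this approach, the maximum peak cladding temperature over the n runs serves as the 95/95 bounding estimate, regardless of how many uncertain input parameters were varied.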

  20. Analysis of a Station Black-Out transient in SMR by using the TRACE and RELAP5 code

    Science.gov (United States)

    De Rosa, F.; Lombardo, C.; Mascari, F.; Polidori, M.; Chiovaro, P.; D'Amico, S.; Moscato, I.; Vella, G.

    2014-11-01

    The present paper deals with the investigation of the evolution and consequences of a Station Black-Out (SBO) initiating event transient in the SPES3 facility [1]. This facility is an integral simulator of a small modular reactor being built at the SIET laboratories, in the framework of the R&D program on nuclear fission funded by the Italian Ministry of Economic Development and led by ENEA. The SBO transient will be simulated by using the RELAP5 and TRACE nodalizations of the SPES3 facility. Moreover, the analysis will contribute to studying the differences in the code predictions arising from the different modelling approaches with one- and/or three-dimensional components, and to comparing the capability of these codes to describe the SPES3 facility behaviour.

  1. Statistical safety evaluation of BWR turbine trip scenario using coupled neutron kinetics and thermal hydraulics analysis code SKETCH-INS/TRACE5.0

    International Nuclear Information System (INIS)

    Ichikawa, Ryoko; Masuhara, Yasuhiro; Kasahara, Fumio

    2012-01-01

    The Best Estimate Plus Uncertainty (BEPU) method has been prepared for the regulatory cross-check analysis at the Japan Nuclear Energy Safety Organization (JNES) on the basis of the three-dimensional neutron-kinetics/thermal-hydraulics coupled code SKETCH-INS/TRACE5.0. In the preparation, TRACE5.0 was verified against large-scale thermal-hydraulic tests carried out at the NUPEC facility. These tests focused on the pressure drop of steam-liquid two-phase flow and the void fraction distribution. From the comparison of the experimental data with other codes (RELAP5/MOD3.3 and TRAC-BF1), TRACE5.0 was judged better than the other codes. It was confirmed that TRACE5.0 is highly reliable for thermal-hydraulic behavior and can be used as a best-estimate code for statistical safety evaluation. Next, the coupled code SKETCH-INS/TRACE5.0 was applied to turbine trip tests performed at the Peach Bottom-2 BWR4 plant. The turbine trip event shows a rapid power peak due to void collapse with the pressure increase. The peak value of core power is simulated better than with the previous version SKETCH-INS/TRAC-BF1. Finally, the statistical safety evaluation using SKETCH-INS/TRACE5.0 was applied to the loss-of-load transient to examine the influence of the choice of sampling method. (author)

  2. Analysis of an ADS spurious opening event at a BWR/6 by means of the TRACE code

    International Nuclear Information System (INIS)

    Nikitin, Konstantin; Manera, Annalisa

    2011-01-01

    Highlights: • The spurious opening of 8 relief valves of the ADS system in a BWR/6 has been simulated. • The valve opening results in a fast depressurization and significant loads on the RPV internals. • This event has been modeled by means of the TRACE and TRAC-BF1 codes. The results are in good agreement with the available plant data. - Abstract: The paper presents the results of a post-event analysis of a spurious opening of 8 relief valves of the automatic depressurization system (ADS) that occurred in a BWR/6. The opening of the relief valves results in a fast depressurization (pressure blowdown) of the primary system, which might lead to significant dynamic loads on the RPV and associated internals. In addition, the RPV level swelling caused by the fast depressurization might lead to undesired water carry-over into the steam line and through the safety relief valves (SRVs). Therefore, the transient needs to be characterized in terms of the evolution of pressure, temperature and fluid distribution in the system. This event has been modeled by means of the TRACE and TRAC-BF1 codes. The results are in good agreement with the plant data.

  3. Study of the source term of radiation of the CDTN GE-PET trace 8 cyclotron with the MCNPX code

    Energy Technology Data Exchange (ETDEWEB)

    Benavente C, J. A.; Lacerda, M. A. S.; Fonseca, T. C. F.; Da Silva, T. A. [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Av. Pte. Antonio Carlos 6627, 31270-901 Belo Horizonte, Minas Gerais (Brazil); Vega C, H. R., E-mail: jhonnybenavente@gmail.com [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas, Zac. (Mexico)

    2015-10-15

    Full text: The knowledge of the neutron spectra in a PET cyclotron is important for the optimization of radiation protection of the workers and members of the public. The main objective of this work is to study the source term of radiation of the GE-PET trace 8 cyclotron of the Development Center of Nuclear Technology (CDTN/CNEN) using computer simulation by the Monte Carlo method. The MCNPX version 2.7 code was used to calculate the flux of neutrons produced from the interaction of the primary proton beam with the target body and other cyclotron components during 18F production. The estimate of the source term and the corresponding radiation field was performed for the bombardment of a H{sub 2}{sup 18}O target with protons of 75 μA current and 16.5 MeV energy. The simulated fluxes were compared with those reported by the accelerator manufacturer (GE Healthcare). Results showed that the fluxes estimated with the MCNPX code were about 70% lower than those reported by the manufacturer. The mean energies of the neutrons were also different from those reported by GE Healthcare. It is recommended to investigate other cross-section data and the use of the physical models of the code itself for a complete characterization of the source term of radiation. (Author)

  4. Ray-tracing 3D dust radiative transfer with DART-Ray: code upgrade and public release

    Science.gov (United States)

    Natale, Giovanni; Popescu, Cristina C.; Tuffs, Richard J.; Clarke, Adam J.; Debattista, Victor P.; Fischera, Jörg; Pasetto, Stefano; Rushton, Mark; Thirlwall, Jordan J.

    2017-11-01

    We present an extensively updated version of the purely ray-tracing 3D dust radiation transfer code DART-Ray. The new version includes five major upgrades: 1) a series of optimizations for the ray-angular density and the scattered radiation source function; 2) the implementation of several data and task parallelizations using hybrid MPI+OpenMP schemes; 3) the inclusion of dust self-heating; 4) the ability to produce surface brightness maps for observers within the models in HEALPix format; 5) the possibility to set the expected numerical accuracy already at the start of the calculation. We tested the updated code with benchmark models where the dust self-heating is not negligible. Furthermore, using galaxy models, we performed a study of the extent of the source influence volumes, which are critical in determining the efficiency of the DART-Ray algorithm. The new code is publicly available, documented for both users and developers, and accompanied by several programmes to create input grids for different model geometries and to import the results of N-body and SPH simulations. These programmes can be easily adapted to different input geometries, and for different dust models or stellar emission libraries.

  5. Uncertainty Methods Framework Development for the TRACE Thermal-Hydraulics Code by the U.S. NRC

    International Nuclear Information System (INIS)

    Bajorek, Stephen M.; Gingrich, Chester

    2013-01-01

    The Code of Federal Regulations, Title 10, Part 50.46 requires that the Emergency Core Cooling System (ECCS) performance be evaluated for a number of postulated Loss-Of-Coolant Accidents (LOCAs). The rule allows two methods for calculating compliance with the acceptance criteria: using a realistic model in the so-called 'Best Estimate' approach, or the more prescriptive approach following Appendix K to Part 50. Because of the conservatism of Appendix K, recent Evaluation Model submittals to the NRC used the realistic approach. With this approach, the Evaluation Model must demonstrate that the Peak Cladding Temperature (PCT), the Maximum Local Oxidation (MLO) and Core-Wide Oxidation (CWO) remain below their regulatory limits with a 'high probability'. Guidance for Best Estimate calculations following 50.46(a)(1) was provided by Regulatory Guide 1.157. This Guide identified a 95% probability level as being acceptable for comparisons of best-estimate predictions to the applicable regulatory limits, but was vague with respect to acceptable methods for determining the code uncertainty, nor did it specify whether a confidence level should be determined. As a result, vendors have developed Evaluation Models utilizing several different methods to combine uncertainty parameters and determine the PCT and other variables to a high probability. In order to quantify the accuracy of TRACE calculations for a wide variety of applications and to audit Best Estimate calculations made by industry, the NRC is developing its own independent methodology to determine the peak cladding temperature and other parameters of regulatory interest to a high probability. Because several methods are in use, and each vendor's methodology ranges over different parameters, the NRC method must be flexible and sufficiently general. Not only must the method apply to LOCA analysis for conventional light-water reactors, it must also be extendable to new reactor designs and types of analyses where the acceptance criteria are less

  6. A four-column theory for the origin of the genetic code: tracing the evolutionary pathways that gave rise to an optimized code

    Directory of Open Access Journals (Sweden)

    Higgs Paul G

    2009-04-01

    Full Text Available Abstract Background The arrangement of the amino acids in the genetic code is such that neighbouring codons are assigned to amino acids with similar physical properties. Hence, the effects of translational error are minimized with respect to randomly reshuffled codes. Further inspection reveals that it is amino acids in the same column of the code (i.e. same second base) that are similar, whereas those in the same row show no particular similarity. We propose a 'four-column' theory for the origin of the code that explains how the action of selection during the build-up of the code leads to a final code that has the observed properties. Results The theory makes the following propositions. (i) The earliest amino acids in the code were those that are easiest to synthesize non-biologically, namely Gly, Ala, Asp, Glu and Val. (ii) These amino acids are assigned to codons with G at first position. Therefore the first code may have used only these codons. (iii) The code rapidly developed into a four-column code where all codons in the same column coded for the same amino acid: NUN = Val, NCN = Ala, NAN = Asp and/or Glu, and NGN = Gly. (iv) Later amino acids were added sequentially to the code by a process of subdivision of codon blocks in which a subset of the codons assigned to an early amino acid were reassigned to a later amino acid. (v) Later amino acids were added into positions formerly occupied by amino acids with similar properties because this can occur with minimal disruption to the proteins already encoded by the earlier code. As a result, the properties of the amino acids in the final code retain a four-column pattern that is a relic of the earliest stages of code evolution. Conclusion The driving force during this process is not the minimization of translational error, but positive selection for the increased diversity and functionality of the proteins that can be made with a larger amino acid alphabet. Nevertheless, the code that results is one
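
    The four-column structure proposed in this record can be written down directly: in the hypothesised earliest stage, only the second codon base carries information. A minimal sketch, using only the assignments stated in the abstract:

        # Proposed earliest four-column code: the second base alone selects
        # the amino acid (NUN = Val, NCN = Ala, NAN = Asp/Glu, NGN = Gly).
        FOUR_COLUMN_CODE = {"U": "Val", "C": "Ala", "A": "Asp/Glu", "G": "Gly"}

        def early_translate(codon):
            """Translate a codon under the hypothesised four-column code."""
            return FOUR_COLUMN_CODE[codon[1]]  # only the second base matters

        print(early_translate("GUG"))  # Val      (any NUN codon)
        print(early_translate("GAU"))  # Asp/Glu  (any NAN codon)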

  7. Resurrecting Legacy Code Using Ontosoft Knowledge-Sharing and Digital Object Management to Revitalize and Reproduce Software for Groundwater Management Research

    Science.gov (United States)

    Kwon, N.; Gentle, J.; Pierce, S. A.

    2015-12-01

    Software code developed for research is often used for a relatively short period of time before it is abandoned, lost, or becomes outdated. This unintentional abandonment of code is a real problem in the 21st-century scientific process, hindering widespread reusability and increasing the effort needed to develop research software. Potentially important assets, these legacy codes may be resurrected and documented digitally for long-term reuse, often with modest effort. Furthermore, the revived code may be made openly accessible in a public repository for researchers to reuse or improve. For this study, the research team has begun to revive the codebase for the Groundwater Decision Support System (GWDSS), originally developed for participatory decision making to aid urban planning and groundwater management, though it may serve multiple use cases beyond those originally envisioned. GWDSS was designed as a Java-based wrapper with loosely federated commercial and open source components. If successfully revitalized, GWDSS will be useful both for practical applications, as a teaching tool and case study for groundwater management, and for informing theoretical research. Using the knowledge-sharing approaches documented by the NSF-funded Ontosoft project, digital documentation of GWDSS is underway, from conception to development, deployment, characterization, integration, composition, and dissemination through open source communities and geosciences modeling frameworks. Information assets, documentation, and examples are shared using open platforms for data sharing and assigned digital object identifiers. Two instances of GWDSS version 3.0 are being created: 1) a virtual machine instance for the original case study to serve as a live demonstration of the decision support tool, assuring the original version is usable, and 2) an open version of the codebase, executable installation files, and developer guide available via an open repository, assuring the source for the

  8. Analysis of PWR control rod ejection accident with the coupled code system SKETCH-INS/TRACE by incorporating pin power reconstruction model

    International Nuclear Information System (INIS)

    Nakajima, T.; Sakai, T.

    2010-01-01

    The pin power reconstruction model was incorporated into the 3-D nodal kinetics code SKETCH-INS in order to produce accurate calculations of three-dimensional pin power distributions throughout the reactor core. In order to verify the employed pin power reconstruction model, the PWR MOX/UO2 core transient benchmark problem was analyzed with the coupled code system SKETCH-INS/TRACE incorporating the model, and the influence of the pin power reconstruction model was studied. SKETCH-INS pin power distributions for 3 benchmark problems were compared with the PARCS solutions, which were provided by the host organisation of the benchmark. SKETCH-INS results were in good agreement with the PARCS results. The capability of the employed pin power reconstruction model was confirmed through the analysis of the benchmark problems. A PWR control rod ejection benchmark problem was then analyzed with the coupled code system SKETCH-INS/TRACE incorporating the pin power reconstruction model. The influence of the pin power reconstruction model was studied by comparison with the results of the conventional node-averaged flux model. The results indicate that the pin power reconstruction model has a significant effect on the pin powers during the transient and hence on the fuel enthalpy.

  9. ARDISC (Argonne Dispersion Code): computer programs to calculate the distribution of trace element migration in partially equilibrating media

    International Nuclear Information System (INIS)

    Strickert, R.; Friedman, A.M.; Fried, S.

    1979-04-01

    A computer program (ARDISC, the Argonne Dispersion Code) is described which simulates the migration of nuclides in porous media and includes first-order kinetic effects on the retention constants. The code allows for different absorption and desorption rates and solves the coupled migration equations by arithmetic iterations. Input data needed are the absorption and desorption rates, equilibrium surface absorption coefficients, flow rates and volumes, and media porosities.
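
    The coupled equations this record describes — advective transport with separate first-order absorption and desorption rates — can be sketched in a few lines. The explicit finite-difference scheme below is a minimal illustration, not the ARDISC code itself; all parameter values are hypothetical.

        import numpy as np

        def migrate(n_cells=100, n_steps=400, v=1.0, dx=1.0, dt=0.5,
                    k_ads=0.05, k_des=0.01, c_in=1.0):
            """dC/dt = -v dC/dx - k_ads*C + k_des*S   (mobile phase)
               dS/dt =  k_ads*C - k_des*S             (sorbed phase)"""
            c = np.zeros(n_cells)  # dissolved concentration per cell
            s = np.zeros(n_cells)  # surface-absorbed concentration per cell
            for _ in range(n_steps):
                c[0] = c_in                                # constant inlet
                adv = -v * np.diff(c, prepend=c[0]) / dx   # upwind advection
                exch = -k_ads * c + k_des * s              # kinetic exchange
                s = s + dt * (k_ads * c - k_des * s)
                c = c + dt * (adv + exch)
            return c, s

        c, s = migrate()
        # A smaller desorption rate k_des retards the migrating front more,
        # which is the kinetic effect on retention that ARDISC models.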

  10. Nanoparticle based bio-bar code technology for trace analysis of aflatoxin B1 in Chinese herbs.

    Science.gov (United States)

    Yu, Yu-Yan; Chen, Yuan-Yuan; Gao, Xuan; Liu, Yuan-Yuan; Zhang, Hong-Yan; Wang, Tong-Ying

    2018-04-01

    A novel and sensitive assay for aflatoxin B1 (AFB1) detection has been developed using the bio-bar code assay (BCA). The method relies on gold nanoparticles (NPs) modified with DNA and polyclonal antibodies and on magnetic microparticles (MMPs) modified with monoclonal antibodies, with subsequent detection of the amplified target in the form of the bio-bar code using fluorescent quantitative polymerase chain reaction (FQ-PCR). First, NP probes encoded with DNA unique to AFB1 and MMP probes with monoclonal antibodies that bind AFB1 specifically were prepared. Then, the MMP-AFB1-NP sandwich compounds were formed; dehybridization of the oligonucleotides on the nanoparticle surface allows determination of the presence of AFB1 by identifying the oligonucleotide sequence released from the NP through FQ-PCR. The bio-bar code system for detecting AFB1 was established with a sensitivity limit of about 10⁻⁸ ng/mL, comparable to ELISA assays for the same target, showing that AFB1 can be detected at low attomolar levels with the bio-bar-code amplification approach. This is also the first demonstration of a bio-bar code type assay for the detection of AFB1 in Chinese herbs. Copyright © 2017. Published by Elsevier B.V.

  11. RELAP5 and TRACE assessment of the Achilles natural reflood experiment

    International Nuclear Information System (INIS)

    Berar, Ovidiu-Adrian; Prošek, Andrej; Mavko, Borut

    2013-01-01

    Highlights: • A RELAP5 and TRACE assessment of the Achilles natural reflood experiment was performed. • The TRACE input model was converted from a RELAP5 input model. • The RELAP5/MOD3.3 Patch 4 results are compared with TRACE V5 Patch 1 and 3 results. -- Abstract: The RELAP5 computer code has been one of the most widely used best-estimate system codes in the international community for performing nuclear power plant safety analysis. At present, development of RELAP5 by the U.S. Nuclear Regulatory Commission has ceased in favor of the newer TRACE best-estimate system code; thus, the importance of validation and assessment of the TRACE code becomes evident. The present work presents the assessment of the Achilles natural reflood experiment with RELAP5/MOD3.3 Patch 4 and TRACE V5.0 Patch 1 and Patch 3. The TRACE input deck was obtained by conversion of the already existing RELAP5 input deck. The results of the RELAP5/MOD3.3 Patch 4 and TRACE V5.0 Patch 1 and Patch 3 calculations of the Achilles natural reflood experiment were compared against the experimental data. The results show that both TRACE and RELAP5 are capable of reproducing the reflood phenomena at a satisfactory level. However, some discrepancies between the predicted variables and the experimental data suggest further investigation of the TRACE reflood model.

  12. Validation of the TRACE code for the system dynamic simulations of the molten salt reactor experiment and the preliminary study on the dual fluid molten salt reactor

    Energy Technology Data Exchange (ETDEWEB)

    He, Xun

    2016-06-14

    one is about the demonstration of a new MSR concept using mathematical tools. In particular, the aim of the first part is to demonstrate the suitability of the TRACE code for similar MSR designs by using a modified version of the TRACE code to carry out simulations of steady-state, transient and accidental conditions. The basic approach of this part is to couple the thermal-hydraulic model and the modified point-kinetic model. The equivalent thermal-hydraulic model of the MSRE was built in 1D with three loops including all the critical main components. The point-kinetic model was improved by considering the precursor drift in order to produce more realistic results in terms of the delayed neutron behavior. Additionally, new working fluids, namely the molten salts, were embedded into the source code of TRACE. Most results of the simulations show good agreement with the ORNL reports and with another recent study, and the errors were predictable and in an acceptable range. Therefore, the necessary code modification of TRACE appears to be successful, and the model will be refined and its functions extended further in order to investigate new MSR designs. Another part of this thesis is a preliminary study of a new molten salt reactor concept, namely the Dual Fluid Reactor (DFR). The DFR belongs to the group of molten salt fast reactors (MSFR) and is currently considered an option for minimum-waste and inherently safe operation of nuclear reactors in the future. The DFR uses two separately circulating fluids in the reactor core. One is the fuel salt based on a mixture of tri-chlorides of uranium and plutonium (UCl{sub 3}-PuCl{sub 3}), while the other is the coolant, composed of pure lead (Pb). The current work focuses on the basic dynamic behavior of a scaled-down DFR with 500 MW thermal output (DFR-500) instead of its reference design with 3000 MW thermal output (DFR-3000). For this purpose 10 parallel
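
    The modified point-kinetic model mentioned in this record accounts for delayed-neutron precursors being swept out of the core by the circulating fuel salt. A one-group sketch of that idea follows; the re-entry of precursors that survive the external loop is deliberately neglected here, which is an assumption — the thesis model is more complete, and all parameter values are illustrative.

        def circulating_fuel_kinetics(t_end=5.0, dt=1e-4, rho=0.0,
                                      beta=0.0065, lam=0.08, big_lambda=1e-4,
                                      tau_core=4.0):
            """One-group point kinetics with a precursor loss term -C/tau_core
            representing the drift of precursors out of the core:
              dn/dt = ((rho - beta)/Lambda) n + lam C
              dC/dt = (beta/Lambda) n - lam C - C/tau_core"""
            n = 1.0
            c = beta / (big_lambda * lam)  # static (non-circulating) equilibrium
            for _ in range(int(t_end / dt)):
                dn = ((rho - beta) / big_lambda) * n + lam * c
                dc = (beta / big_lambda) * n - lam * c - c / tau_core
                n += dt * dn
                c += dt * dc
            return n

        # With rho = 0 the power drifts downwards: swept-out precursors reduce
        # the effective delayed-neutron fraction, so extra reactivity is needed
        # to stay critical -- the effect a precursor-drift model must capture.
        print(circulating_fuel_kinetics())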

  13. Towards Reproducibility in Computational Hydrology

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, and to evolve (or reject) hypotheses and models of how environmental systems function, moving them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even when they are, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al 2016 [1], we argue that a cultural change is required in the computational hydrological community in order to advance and make more robust the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which is relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as

  14. Simulation of a turbine trip from maximum power level without reactor trip in the TRILLO plant with the code TRACE v5.0 p3; Simulacion de un disparo de turbina desde maximo nivel de potencia sin disparo del reactor en la planta de TRILLO con el codigo TRACE v5.0 p3

    Energy Technology Data Exchange (ETDEWEB)

    Berna, C.; Escriva, A.; Munoz-Cobo, J. L.; Posada, J. M.

    2014-07-01

    This work consists in the simulation, with the TRACE v5.0 p3 code, of a turbine trip transient from maximum power level without reactor trip. In particular, a steady state with conditions very similar to those of a previous simulation made using the RELAP-MOD3 code has been obtained. In the transient, satisfactory results have also been obtained; specifically, the values of pressures, temperatures and mass flows, both in the secondary and in the primary circuit, are very similar in both cases. In conclusion, the ability to reproduce the transient under study with the TRACE v5.0 p3 model of the TRILLO plant has been demonstrated, constituting a step in the verification process of that code. (Author)

  15. The Need for Reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility; specifically, whether bitwise-reproducible computation is possible, whether computational research in DOE should improve its publication process, and whether reproducible results can be achieved apart from the peer review process.

  16. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination, or reproducibility of results is very poor. On the other hand, a defect can be detected on each of several subsequent examinations, for higher reliability, and still have poor reproducibility of results.

  17. Reproducibility in Seismic Imaging

    Directory of Open Access Journals (Sweden)

    González-Verdejo O.

    2012-04-01

    Full Text Available Within the field of exploration seismology, there is interest at the national level in integrating reproducibility into applied, educational and research activities related to seismic processing and imaging. This reproducibility implies the description and organization of the elements involved in numerical experiments. Thus, a researcher, teacher or student can study, verify, repeat, and modify them independently. In this work, we document and adapt reproducibility in seismic processing and imaging to spread this concept and its benefits, and to encourage the use of open source software in this area within our academic and professional environment. We present an enhanced seismic imaging example, of interest in both academic and professional environments, using Mexican seismic data. As a result of this research, we show that it is possible to assimilate, adapt and transfer technology at low cost, using open source software and following a reproducible research scheme.

  18. Simulation with TRACE5 of a small break of 1% in the hot branch; Simulacion con TRACE5 de una Rotura Pequena del 1% en la Rama Caliente

    Energy Technology Data Exchange (ETDEWEB)

    Querol, A.; Gallardo, S.; Verdu, G.

    2013-07-01

    In this work, Test 1-2 of the OECD/NEA ROSA project, which reproduces a 1% break in the hot leg of a pressurized water reactor (PWR), has been simulated with the thermal-hydraulic code TRACE5. The results are compared with the experimental values in order to study the effect of liquid stratification in the hot leg, the geometry and size of the break, and the flow injected by the HPI system.

  19. Preserve specimens for reproducibility

    Czech Academy of Sciences Publication Activity Database

    Krell, F.-T.; Klimeš, Petr; Rocha, L. A.; Fikáček, M.; Miller, S. E.

    2016-01-01

    Roč. 539, č. 7628 (2016), s. 168 ISSN 0028-0836 Institutional support: RVO:60077344 Keywords : reproducibility * specimen * biodiversity Subject RIV: EH - Ecology, Behaviour Impact factor: 40.137, year: 2016 http://www.nature.com/nature/journal/v539/n7628/full/539168b.html

  20. Reproducibility of ultrasonic testing

    International Nuclear Information System (INIS)

    Lecomte, J.-C.; Thomas, Andre; Launay, J.-P.; Martin, Pierre

    The reproducibility of amplitude quotations for both artificial and natural reflectors was studied for several combinations of instrument/search unit, all being of the same type. This study shows that in industrial inspection, if a range of standardized equipment is used, a margin of error of about 6 decibels has to be taken into account (confidence interval of 95%). This margin is about 4 to 5 dB for natural or artificial defects located in the central area and about 6 to 7 dB for artificial defects located on the back surface. This lack of reproducibility seems to be attributable first to the search unit and then to the instrument and operator. These results were confirmed by analysis of calibration data obtained from 250 tests performed by 25 operators under shop conditions. The margin of error was higher than the 6 dB obtained in the study.

  1. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist Hospital, since May 1992. The ACR dictionary consists of 11 files, one for the organ codes and the others for the pathology codes. The organ code is obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one, which are simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file is chosen automatically. In a similar fashion to the organ code selection, the proper pathology code is obtained. An example of an obtained ACR code is '131.3661'. This procedure is reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code is achieved by the same program, and incorporation of this program into another data processing program is possible. This program has the merits of simple operation, accurate and detailed coding, and easy adjustment for other programs. Therefore, this program can be used for automation of routine work in the department of radiology.
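
    The two-level dictionary lookup this record describes — the first digit of the organ code selecting the pathology code file — can be mimicked with nested dictionaries. All entries below are hypothetical placeholders, not actual ACR codes or meanings:

        # Hypothetical stand-ins for the 11 dictionary files described above.
        ORGAN_CODES = {"131": "organ-131 (placeholder name)"}
        PATHOLOGY_FILES = {  # keyed by the first digit of the organ code
            "1": {"3661": "pathology-3661 (placeholder name)"},
        }

        def decode_acr(code):
            """Split an ACR-style code such as '131.3661' into its organ and
            pathology parts and resolve both against the dictionaries."""
            organ, pathology = code.split(".")
            pathology_file = PATHOLOGY_FILES[organ[0]]  # chosen by first digit
            return ORGAN_CODES[organ], pathology_file[pathology]

        print(decode_acr("131.3661"))

    Because the same tables drive both encoding and decoding, storing only the numeric code (as the abstract's 'User's Defined Function' design does) keeps the database compact while remaining fully reversible.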

  2. Kinetic modeling of high-Z tungsten impurity transport in ITER plasmas using the IMPGYRO code in the trace impurity limit

    Science.gov (United States)

    Yamoto, S.; Bonnin, X.; Homma, Y.; Inoue, H.; Hoshino, K.; Hatayama, A.; Pitts, R. A.

    2017-11-01

    In order to obtain a better understanding of tungsten (W) transport processes, we are developing the Monte-Carlo W transport code IMPGYRO. The code has the following characteristics, which are important for calculating W transport: (1) the exact Larmor motion of W ions is computed, so that the effects of drifts are automatically taken into account; (2) Coulomb collisions between W impurities and background plasma ions are modelled using the Binary Collision Model, which provides more precise kinetic calculations of the friction and thermal forces. Using the IMPGYRO code, W production and transport in the ITER geometry have been calculated under two different divertor operation modes (Case A: partially detached state and Case B: high-recycling state) obtained from SOLPS-ITER code suite calculations without the effect of drifts. The resulting W density in the upstream SOL (scrape-off layer) strongly depends on the divertor operation mode. From the comparison of the W impurity transport between Case A and Case B, obtaining a partially detached state is shown to be effective in reducing W impurities in the upstream SOL. The limitations of the employed model and the validity of the above results are discussed, and future problems are summarized for further applications of the IMPGYRO code to ITER plasmas.
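
    Computing the exact Larmor motion so that drifts emerge automatically, as this record describes, is commonly done with a Boris-type particle pusher. Whether IMPGYRO uses exactly this scheme is not stated in the abstract, so the sketch below should be read as a generic illustration of the approach, with invented parameter values.

        import numpy as np

        def boris_push(x, v, q_over_m, e_field, b_field, dt, n_steps):
            """Advance a charged particle with the Boris scheme. Because the
            full gyro-orbit is resolved, ExB and grad-B drifts appear in the
            trajectory without being modelled explicitly."""
            for _ in range(n_steps):
                t = 0.5 * dt * q_over_m * b_field
                s = 2.0 * t / (1.0 + np.dot(t, t))
                v_minus = v + 0.5 * dt * q_over_m * e_field  # half electric kick
                v_prime = v_minus + np.cross(v_minus, t)     # magnetic rotation
                v = v_minus + np.cross(v_prime, s) + 0.5 * dt * q_over_m * e_field
                x = x + v * dt
            return x, v

        x, v = boris_push(x=np.zeros(3), v=np.array([1e4, 0.0, 0.0]),
                          q_over_m=1.0e6, e_field=np.zeros(3),
                          b_field=np.array([0.0, 0.0, 1.0]),
                          dt=1e-8, n_steps=1000)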

  3. Reproducible research in palaeomagnetism

    Science.gov (United States)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  4. Opening Reproducible Research

    Science.gov (United States)

    Nüst, Daniel; Konkol, Markus; Pebesma, Edzer; Kray, Christian; Klötgen, Stephanie; Schutzeichel, Marc; Lorenz, Jörg; Przibytzin, Holger; Kussmann, Dirk

    2016-04-01

    Open access is not only a form of publishing such that research papers become available to the large public free of charge, it also refers to a trend in science that the act of doing research becomes more open and transparent. When science transforms to open access we not only mean access to papers, research data being collected, or data being generated, but also access to the data used and the procedures carried out in the research paper. Increasingly, scientific results are generated by numerical manipulation of data that were already collected, and may involve simulation experiments that are completely carried out computationally. Reproducibility of research findings, the ability to repeat experimental procedures and confirm previously found results, is at the heart of the scientific method (Pebesma, Nüst and Bivand, 2012). As opposed to the collection of experimental data in labs or nature, computational experiments lend themselves very well to reproduction. Some of the reasons why scientists do not publish data and computational procedures that allow reproduction will be hard to change, e.g. privacy concerns in the data, fear of embarrassment or of losing a competitive advantage. Other reasons, however, involve technical aspects, and include the lack of standard procedures to publish such information and the lack of benefits after publishing them. We aim to resolve these two technical aspects. We propose a system that supports the evolution of scientific publications from static papers into dynamic, executable research documents. The DFG-funded experimental project Opening Reproducible Research (ORR) aims for the main aspects of open access, by improving the exchange of, by facilitating productive access to, and by simplifying reuse of research results that are published over the Internet. Central to the project is a new form for creating and providing research results, the executable research compendium (ERC), which not only enables third parties to

  5. Reproducibility in density functional theory calculations of solids

    DEFF Research Database (Denmark)

    Lejaeghere, Kurt; Bihlmayer, Gustav; Björkman, Torbjörn

    2016-01-01

    The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We...

  6. Intraoral gothic arch tracing.

    Science.gov (United States)

    Rubel, Barry; Hill, Edward E

    2011-01-01

    In order to create optimum esthetics, function and phonetics in complete denture fabrication, it is necessary to record accurate maxillo-mandibular determinants of occlusion. This requires clinical skill to establish an accurate, verifiable and reproducible vertical dimension of occlusion (VDO) and centric relation (CR). Correct vertical relation depends upon a consideration of several factors, including muscle tone, inter-dental arch space and parallelism of the ridges. Any errors made while taking maxillo-mandibular jaw relation records will result in dentures that are uncomfortable and, possibly, unwearable. The application of a tracing mechanism such as the Gothic arch tracer (a central bearing device) is a demonstrable method of determining centric relation. Intraoral Gothic arch tracers provide the advantage of capturing VDO and CR in an easy-to-use technique for practitioners. Intraoral tracing (Gothic arch tracing) is a preferred method of obtaining consistent positions of the mandible in motion (retrusive, protrusive and lateral) at a comfortable VDO.

  7. ITK: enabling reproducible research and open science

    Science.gov (United States)

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  8. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Full Text Available Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  9. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    , Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  10. A right to reproduce?

    Science.gov (United States)

    Emson, H E

    1992-10-31

    Conscious control of the environment by Homo sapiens has brought almost total release from the controls of ecology that limit the population of all other species. After a mere 10,000 years, humans have brought the planet close to collapse, and all the debate in the world seems unlikely to save it. A combination of uncontrolled breeding and rapacity is propelling us down the slippery slope first envisioned by Malthus, dragging the rest of the planet along. Only the conscious, and most likely voluntary, reimposition of controls on breeding will reduce the overgrowth of humans, and we have far to go in that direction. "According to the United Nations Universal Declaration of Human Rights (1948, articles 16[I] and 16[III]), men and women of full age without any limitation due to race, nationality or religion have the right to marry and to found a family ... the family is the natural and fundamental group unit of society." The rhetoric of rights without the balancing of responsibilities is wrong in health care, and even more wrong in the context of world population. The mind-set of dominance and exploitation over the rest of creation has meant human reluctance to admit participation in a system where every part is interdependent. We must balance the right to reproduce with its responsible use, valuing interdependence, understanding, and respect, with a duty not to unbalance, damage, or destroy. It is long overdue that we discard every statement of right that is unmatched by the equivalent duty and responsibility.

  11. Trace Replay and Network Simulation Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-22

    TraceR is a trace replay tool built upon the ROSS-based CODES simulation framework. TraceR can be used for predicting network performance and understanding network behavior by simulating messaging in High Performance Computing applications on interconnection networks.

  12. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective: Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods: Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results: Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators; (2) shared computational resources; (3) declarative model descriptors, ontologies and standardized annotations; (4) model sharing repositories and sharing standards. Conclusion: A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance: Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  13. Contextual sensitivity in scientific reproducibility.

    Science.gov (United States)

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher's degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.

  14. Address tracing of parallel systems via TRAPEDS

    Science.gov (United States)

    Stunkel, Craig B.; Janssens, Bob; Fuchs, W. K.

    1992-01-01

    Trace-driven simulation is an important aid in performance analysis of computer systems. Capturing address traces to use in these simulations, however, is a difficult problem for parallel processor architectures. A technique termed TRAPEDS modifies executable code (at the assembly language level) to dynamically collect the address trace from executing code. TRAPEDS has recently been implemented on both a hypercube multicomputer and a shared-memory multiprocessor. Particular attention is focused on strategies for efficiently and accurately collecting traces from both classes of parallel machines. The iPSC/2 hypercube multicomputer implementation traces both user and system code, and performs simulation on-the-fly to avoid large storage costs. Strategies are detailed for mitigating address trace distortion when collecting operating system traces. The Encore Multimax multiprocessor implementation uses a timer-based approach to reflect the interleaving of the processor traces and stores the traces to disc. Time and space overhead results are presented for both TRAPEDS implementations. Experimental cache simulation results derived from iPSC/2 address traces are presented to illustrate the importance of tracing operating system references.
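
    The key TRAPEDS idea described in this record — consuming the trace on-the-fly instead of storing it — can be illustrated with a toy cache simulator: an instrumentation hook appends each referenced address to a small buffer, and the simulator drains the buffer whenever it fills. This is a schematic Python analogue of the assembly-level instrumentation described above, with made-up parameters, not the TRAPEDS implementation itself.

        class DirectMappedCache:
            """Minimal direct-mapped cache model fed by the address trace."""
            def __init__(self, n_lines=256, line_bytes=32):
                self.n_lines, self.line_bytes = n_lines, line_bytes
                self.tags = [None] * n_lines
                self.hits = self.misses = 0

            def access(self, addr):
                line = addr // self.line_bytes
                index, tag = line % self.n_lines, line // self.n_lines
                if self.tags[index] == tag:
                    self.hits += 1
                else:
                    self.misses += 1
                    self.tags[index] = tag

        cache, buffer, BUFFER_LIMIT = DirectMappedCache(), [], 4096

        def record(addr):
            """Hook inserted before every load/store: buffer the address and
            simulate on-the-fly when the buffer fills (no trace hits disk)."""
            buffer.append(addr)
            if len(buffer) >= BUFFER_LIMIT:
                for a in buffer:
                    cache.access(a)
                buffer.clear()

        for i in range(100_000):                # stand-in for the program
            record(0x1000 + 4 * (i % 10_000))   # strided array sweep

        for a in buffer:                        # drain the remainder at exit
            cache.access(a)
        print(cache.hits, cache.misses)

    Draining the buffer into the simulator as the program runs is what lets the iPSC/2 implementation avoid the large storage costs of full trace files.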

  15. Enhancing reproducibility: Failures from Reproducibility Initiatives underline core challenges.

    Science.gov (United States)

    Mullane, Kevin; Williams, Michael

    2017-08-15

    Efforts to address reproducibility concerns in biomedical research include: initiatives to improve journal publication standards and peer review; increased attention to publishing methodological details that enable experiments to be reconstructed; guidelines on standards for study design, implementation, analysis and execution; meta-analyses of multiple studies within a field to synthesize a common conclusion; and the formation of consortia to adopt uniform protocols and internally reproduce data. Another approach to addressing reproducibility is Reproducibility Initiatives (RIs), well-intended, high-profile, systematically peer-vetted initiatives that are intended to replace the traditional process of scientific self-correction. Outcomes from the RIs reported to date have questioned the usefulness of this approach, particularly when the RI outcome differs from other independent self-correction studies that have reproduced the original finding. As a failed RI attempt is a single outcome distinct from the original study, it cannot provide any definitive conclusions, necessitating additional studies that the RI approach has neither the ability nor the intent of conducting, making it a questionable replacement for self-correction. A failed RI attempt also has the potential to damage the reputation of the author of the original finding. Reproduction is frequently confused with replication, an issue that is more than semantic, with the former denoting "similarity" and the latter an "exact copy" - an impossible outcome in research because of known and unknown technical, environmental and motivational differences between the original and reproduction studies. To date, the RI framework has negatively impacted efforts to improve reproducibility, confounding attempts to determine whether a research finding is real. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Epidemiology and trace elements.

    Science.gov (United States)

    Elwood, P C

    1985-08-01

    Basically, epidemiology is the making of measurements of known reproducibility, in a bias-free manner, on representative samples of subjects drawn from defined communities. Epidemiology has become a relatively precise science and its value in medicine is widely appreciated. So too are its limitations: the difficulties in achieving a high response rate, in identifying and controlling confounding factors in the examination of an association, and the ultimate difficulties in distinguishing causation from association. While the value of community-based studies seems to be recognized by those interested in man and his environment, the need for the strict application of epidemiological procedures, and the limitations imposed on conclusions drawn from studies in which these procedures have been compromised, does not seem to be adequately understood. There are certain known links between trace elements in the environment and disease: for example the level of iodine in soil and water and the prevalence of goitre; the level of fluoride in water and the prevalence of dental caries. The investigation of other possible associations is difficult for a number of reasons, including interrelationships between trace elements, confounding of trace element levels (and disease) with social and dietary factors, and the probability that relationships are generally weak. Two conditions in which associations are likely are cardiovascular disease and cancer. Despite research along a number of lines, the relevance of trace elements to cardiovascular disease is not clear, and certainly the apparent association with hardness of domestic water supply seems unlikely to be causal. The same general conclusion seems reasonable for cancer, and although there are a very few well established associations which are likely to be causal, such as exposure to arsenic and skin cancer, the role of trace elements is obscure, and likely to be very small.

  17. Using different approaches to assess the reproducibility of a ...

    African Journals Online (AJOL)

    2011-02-24

    Feb 24, 2011 ... questionnaire (QFFQ) used for assessment of the habitual dietary intake of Setswana-speaking adults in the North West Province of South Africa. ... Food intake was coded and analysed for nutrient intake per day for each subject. ..... and Ovarian Cancer Study Groups reported the reproducibility of a.

  18. Testing Reproducibility in Earth Sciences

    Science.gov (United States)

    Church, M. A.; Dudill, A. R.; Frey, P.; Venditti, J. G.

    2017-12-01

    Reproducibility represents how closely the results of independent tests agree when undertaken using the same materials but different conditions of measurement, such as operator, equipment or laboratory. The concept of reproducibility is fundamental to the scientific method as it prevents the persistence of incorrect or biased results. Yet currently the production of scientific knowledge emphasizes rapid publication of previously unreported findings, a culture that has emerged from pressures related to hiring, publication criteria and funding requirements. Awareness and critique of the disconnect between how scientific research should be undertaken, and how it actually is conducted, have been prominent in biomedicine for over a decade, with the fields of economics and psychology more recently joining the conversation. The purpose of this presentation is to stimulate the conversation in earth sciences where, despite implicit evidence in widely accepted classifications, formal testing of reproducibility is rare. As a formal test of reproducibility, two sets of experiments were undertaken with the same experimental procedure, at the same scale, but in different laboratories. Using narrow, steep flumes and spherical glass beads, grain size sorting was examined by introducing fine sediment of varying size and quantity into a mobile coarse bed. The general setup was identical, including flume width and slope; however, there were some variations in the materials, construction and lab environment. Comparison of the results includes examination of the infiltration profiles, sediment mobility and transport characteristics. The physical phenomena were qualitatively reproduced but not quantitatively replicated. Reproduction of results encourages more robust research and reporting, and facilitates exploration of possible variations in data in various specific contexts. Following the lead of other fields, testing of reproducibility can be incentivized through changes to journal

  19. Main considerations for modelling a station blackout scenario with trace

    Energy Technology Data Exchange (ETDEWEB)

    Querol, Andrea; Turégano, Jara; Lorduy, María; Gallardo, Sergio; Verdú, Gumersindo, E-mail: anquevi@upv.es, E-mail: jaturna@upv.es, E-mail: maloral@upv.es, E-mail: sergalbe@iqn.upv.es, E-mail: gverdu@iqn.upv.es [Instituto Universitario de Seguridad Industrial, Radiofísica y Medioambiental (ISIRYM), Universitat Politècnica de València (Spain)

    2017-07-01

    In the nuclear safety field, the thermal-hydraulic phenomena that take place during an accident in a nuclear power plant are of special importance. One of the most studied accidents is the Station BlackOut (SBO). The aim of the present work is the analysis of the PKL integral test facility nodalization using the thermal-hydraulic code TRACE5 to reproduce an SBO accidental scenario. The PKL facility reproduces the main components of the primary and secondary systems of its reference nuclear power plant (Philippsburg II). The results obtained with different nodalizations have been compared: 3D vessel vs 1D vessel, Steam Generator (SG) modelling using PIPE or TEE components, and pressurizer modelling with PIPE or PRIZER components. Both vessel nodalizations (1D vessel and 3D vessel) reproduce the physical phenomena of the experiment. However, there are significant discrepancies between them. The appropriate modelling of the SG is also relevant to the results. The remaining nodalization choices (PIPE or TEE components for the SG, and PIPE or PRIZER components for the pressurizer) do not produce relevant differences in the results. (author)

  20. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2012-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m∙min-1 cutting speed and 0....

  1. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2014-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m•min−1 cutting speed and 0....

  2. Reproducible Bioinformatics Research for Biologists

    Science.gov (United States)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  3. Context-sensitive trace inlining for Java.

    Science.gov (United States)

    Häubl, Christian; Wimmer, Christian; Mössenböck, Hanspeter

    2013-12-01

    Method inlining is one of the most important optimizations in method-based just-in-time (JIT) compilers. It widens the compilation scope and therefore allows optimizing multiple methods as a whole, which increases performance. However, if method inlining is used too frequently, the compilation time increases and too much machine code is generated. This has negative effects on performance. Trace-based JIT compilers only compile frequently executed paths, so-called traces, instead of whole methods. This may result in faster compilation, less generated machine code, and better optimized machine code. In previous work, we implemented a trace recording infrastructure and a trace-based compiler for Java, by modifying the Java HotSpot VM. Based on this work, we evaluate the effect of trace inlining on performance and the amount of generated machine code. Trace inlining has several major advantages when compared to method inlining. First, trace inlining is more selective than method inlining, because only frequently executed paths are inlined. Second, the recorded traces may capture information about virtual calls, which simplifies inlining. A third advantage is that trace information is context sensitive, so that different method parts can be inlined depending on the specific call site. These advantages allow more aggressive inlining while the amount of generated machine code is still reasonable. We evaluate several inlining heuristics on the benchmark suites DaCapo 9.12 Bach, SPECjbb2005, and SPECjvm2008 and show that our trace-based compiler achieves an up to 51% higher peak performance than the method-based Java HotSpot client compiler. Furthermore, we show that the large compilation scope of our trace-based compiler has a positive effect on other compiler optimizations such as constant folding or null check elimination.
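
    The trace-recording idea described above can be illustrated outside of a JIT. The Python sketch below counts executed paths per call site, so that only hot, context-specific traces would become inlining candidates; the threshold, the clamp function and the call-site labels are hypothetical toy values, not part of the HotSpot work.

    ```python
    from collections import Counter

    # Toy illustration of context-sensitive trace recording: executed paths
    # are counted per call site, so the hot part of a callee can differ
    # between sites. Only traces over the threshold would be inlined.
    HOT = 5
    trace_counts = Counter()

    def record(call_site, path):
        trace_counts[(call_site, path)] += 1

    def clamp(x, lo, hi, call_site):
        path = "lo" if x < lo else "hi" if x > hi else "mid"
        record(call_site, path)
        return max(lo, min(x, hi))

    for v in range(100):
        clamp(v, 0, 10, call_site="loop_A")    # mostly takes the "hi" path
    clamp(-1, 0, 10, call_site="init_B")       # takes the "lo" path once

    hot = [key for key, n in trace_counts.items() if n >= HOT]
    print(hot)   # loop_A's "hi" and "mid" paths are hot; init_B's is not
    ```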

  4. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    Science.gov (United States)

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps, and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.
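
    Lir's actual source format is not shown in the abstract, so the following Python sketch illustrates only the general literate-programming mechanism it builds on: prose and code chunks live in one document, and the executable program is extracted ("tangled") from it. The @code/@end markers and the tiny example program are assumptions for illustration.

    ```python
    import re

    # Minimal sketch of the "tangle" step of literate programming: code is
    # extracted, in order, from a document that interleaves prose and code.
    # The @code/@end chunk markers are hypothetical, not Lir's format.
    LITERATE_SOURCE = """\
    We first keep only complete measurement rows.
    @code
    rows = [r for r in raw_rows if None not in r]
    @end
    Then we count how many rows survived the filter.
    @code
    n_complete = len(rows)
    @end
    """

    def tangle(source):
        """Concatenate all code chunks in document order."""
        chunks = re.findall(r"@code\n(.*?)@end", source, flags=re.DOTALL)
        return "\n".join(chunk.rstrip() for chunk in chunks)

    if __name__ == "__main__":
        raw_rows = [(1, 2), (3, None), (4, 5)]
        exec(tangle(LITERATE_SOURCE))   # run the tangled program
        print(n_complete)               # -> 2
    ```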

  6. Evaluation of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2007-01-01

    A study was conducted with the goal of quantifying auditory attributes which underlie listener preference for multichannel reproduced sound. Short musical excerpts were presented in mono, stereo and several multichannel formats to a panel of forty selected listeners. Scaling of auditory attributes ... from the quantified attributes predict overall preference well. The findings allow for some generalizations within musical program genres regarding the perception of and preference for certain spatial reproduction modes, but for limited generalizations across selections from different musical genres.

  7. Reproducibility of isotope ratio measurements

    International Nuclear Information System (INIS)

    Elmore, D.

    1981-01-01

    The use of an accelerator as part of a mass spectrometer has improved the sensitivity for measuring low levels of long-lived radionuclides by several orders of magnitude. However, the complexity of a large tandem accelerator and beam transport system has made it difficult to match the precision of low energy mass spectrometry. Although uncertainties for accelerator measured isotope ratios as low as 1% have been obtained under favorable conditions, most errors quoted in the literature for natural samples are in the 5 to 20% range. These errors are dominated by statistics and generally the reproducibility is unknown since the samples are only measured once

  8. Progress toward openness, transparency, and reproducibility in cognitive neuroscience.

    Science.gov (United States)

    Gilmore, Rick O; Diaz, Michele T; Wyble, Brad A; Yarkoni, Tal

    2017-05-01

    Accumulating evidence suggests that many findings in psychological science and cognitive neuroscience may prove difficult to reproduce; statistical power in brain imaging studies is low and has not improved recently; software errors in analysis tools are common and can go undetected for many years; and, a few large-scale studies notwithstanding, open sharing of data, code, and materials remain the rare exception. At the same time, there is a renewed focus on reproducibility, transparency, and openness as essential core values in cognitive neuroscience. The emergence and rapid growth of data archives, meta-analytic tools, software pipelines, and research groups devoted to improved methodology reflect this new sensibility. We review evidence that the field has begun to embrace new open research practices and illustrate how these can begin to address problems of reproducibility, statistical power, and transparency in ways that will ultimately accelerate discovery. © 2017 New York Academy of Sciences.

  9. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scien...

  10. Reproducibility and Transparency in Ocean-Climate Modeling

    Science.gov (United States)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than provide raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility makes using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent, testing capability and a means to document model output and analysis.
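
    A minimal sketch of the checksum-manifest practice described above, assuming a hypothetical layout in which experiment outputs are NetCDF files under output/ and the manifest is a JSON file; recomputing the hashes later detects when solutions change.

    ```python
    import hashlib
    import json
    from pathlib import Path

    # Sketch: record a hash of every experiment output, then verify later
    # runs against the stored manifest. File layout is a hypothetical example.

    def sha256_of(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for block in iter(lambda: f.read(1 << 20), b""):
                h.update(block)
        return h.hexdigest()

    def write_manifest(output_dir: Path, manifest: Path) -> None:
        sums = {p.name: sha256_of(p) for p in sorted(output_dir.glob("*.nc"))}
        manifest.write_text(json.dumps(sums, indent=2))

    def verify(output_dir: Path, manifest: Path) -> list:
        """Return the names of outputs whose checksum no longer matches."""
        expected = json.loads(manifest.read_text())
        return [name for name, digest in expected.items()
                if sha256_of(output_dir / name) != digest]
    ```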

  11. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of a coding partition. Such a notion generalizes that of a UD code and, for codes that are not UD, allows one to recover "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we consider some relationships between coding partitions and varieties of codes.

  12. Simulation of a passive auxiliary feedwater system with TRACE5

    Energy Technology Data Exchange (ETDEWEB)

    Lorduy, María; Gallardo, Sergio; Verdú, Gumersindo, E-mail: maloral@upv.es, E-mail: sergalbe@iqn.upv.es, E-mail: gverdu@iqn.upv.es [Instituto Universitario de Seguridad Industrial, Radiofísica y Medioambiental (ISIRYM), València (Spain)

    2017-07-01

    The study of the nuclear power plant accidents that have occurred in recent decades, as well as the probabilistic risk assessments carried out for this type of facility, present human error as one of the main contingency factors. For this reason, the design and development of generation III, III+ and IV reactors, which include inherent and passive safety systems, have been promoted. In this work, a TRACE5 model of ATLAS (Advanced Thermal-Hydraulic Test Loop for Accident Simulation) is used to reproduce an accidental scenario consisting of a prolonged Station BlackOut (SBO). In particular, the A1.2 test of the OECD-ATLAS project is analyzed, whose purpose is to study primary system cooling by means of the water supplied to one of the steam generators from a Passive Auxiliary Feedwater System (PAFS). This safety feature prevents the loss of secondary system inventory by condensing the steam and recirculating it. Thus, the conservation of a heat sink maintains the natural circulation flow rate until stable conditions are restored. For the reproduction of the test, an ATLAS model has been adapted to the experiment conditions, and a PAFS has been incorporated. From the simulation results, the main thermal-hydraulic variables (pressure, flow rates, collapsed water level and temperature) are analyzed in the different circuits and contrasted with the experimental data series. As a conclusion, the work shows the capability of the TRACE5 code to correctly simulate the behavior of a passive feedwater system. (author)

  13. Traces et espaces de consommation

    Directory of Open Access Journals (Sweden)

    Franck Cochoy

    2016-10-01

    The advent of mobile digital technologies is contributing to a shift in the modes of distribution and consumption. This article examines the use of QR codes, the two-dimensional barcodes that give any user equipped with a smartphone access to online commercial content. They form part of the Internet of Things, and thus of the coupling between physical space and the digital universe. They also allow the collection of digital traces that carry meaning for practitioners as well as for the social sciences. Through these traces, one can understand the new market ties woven between physical space and the development of continuous information flows. Based on the analysis of the traces recorded when the QR codes affixed to three food products (a box of salt, a chocolate bar, a bottle of water) were visited, our study seeks to spell out the theoretical, methodological and analytical stakes of the process of digitizing the physical space of retail mobility.

  14. LUCID - an optical design and raytrace code

    International Nuclear Information System (INIS)

    Nicholas, D.J.; Duffey, K.P.

    1980-11-01

    A 2D optical design and ray trace code is described. The code can operate either as a geometric optics propagation code or provide a scalar diffraction treatment. There are numerous non-standard options within the code, including design and systems optimisation procedures. A number of illustrative problems relating to the design of optical components in the field of high power lasers are included. (author)

  15. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    Summary of the most important points from the main report: Documentation and Evaluation of Coding Class.

  16. Geochemical computer codes. A review

    International Nuclear Information System (INIS)

    Andersson, K.

    1987-01-01

    In this report a review of available codes is performed and some code intercomparisons are also discussed. The number of codes treating natural waters (groundwater, lake water, sea water) is large. Most geochemical computer codes treat equilibrium conditions, although some codes with kinetic capability are available. A geochemical equilibrium model consists of a computer code, solving a set of equations by some numerical method, and a database consisting of the thermodynamic data required for the calculations. There are some codes which treat coupled geochemical and transport modeling. Some of these codes solve the equilibrium and transport equations simultaneously, while others solve the equations separately from each other. The coupled codes require a large computer capacity and have thus had limited use as yet. Three code intercomparisons have been found in the literature. It may be concluded that there are many codes available for geochemical calculations, but most of them require a user who is quite familiar with the code. The user also has to know the geochemical system in order to judge the reliability of the results. A high quality database is necessary to obtain a reliable result. The best results may be expected for the major species of natural waters. For more complicated problems, including trace elements, precipitation/dissolution, adsorption, etc., the results seem to be less reliable. (With 44 refs.) (author)

  17. Prospective Coding by Spiking Neurons.

    Directory of Open Access Journals (Sweden)

    Johanni Brea

    2016-06-01

    Animals learn to make predictions, such as associating the sound of a bell with upcoming feeding or predicting a movement that a motor command is eliciting. How predictions are realized on the neuronal level and what plasticity rule underlies their learning is not well understood. Here we propose a biologically plausible synaptic plasticity rule to learn predictions on a single-neuron level on a timescale of seconds. The learning rule allows a spiking two-compartment neuron to match its current firing rate to its own expected future discounted firing rate. For instance, if an originally neutral event is repeatedly followed by an event that elevates the firing rate of a neuron, the originally neutral event will eventually also elevate the neuron's firing rate. The plasticity rule is a form of spike-timing-dependent plasticity in which a presynaptic spike followed by a postsynaptic spike leads to potentiation. Even if the plasticity window has a width of 20 milliseconds, associations on the time scale of seconds can be learned. We illustrate prospective coding with three examples: learning to predict a time-varying input, learning to predict the next stimulus in a delayed paired-associate task, and learning with a recurrent network to reproduce a temporally compressed version of a sequence. We discuss the potential role of the learning mechanism in classical trace conditioning. In the special case that the signal to be predicted encodes reward, the neuron learns to predict the discounted future reward and learning is closely related to the temporal difference learning algorithm TD(λ).
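
    For readers unfamiliar with TD(λ), the following minimal tabular sketch shows how eligibility traces let earlier states learn discounted future signals, which is the algorithmic analogue of the prospective coding described above; the five-state random walk and all constants are hypothetical toy choices, not the paper's neuron model.

    ```python
    import random

    # Tabular TD(lambda) on a 5-state random walk: reward 1 on exiting to
    # the right, 0 on the left. Eligibility traces spread each TD error to
    # recently visited states, so earlier states predict discounted reward.
    def td_lambda(episodes=2000, n=5, alpha=0.1, gamma=0.95, lam=0.8, seed=0):
        rng = random.Random(seed)
        v = [0.0] * n                      # value estimates per state
        for _ in range(episodes):
            e = [0.0] * n                  # eligibility traces
            s = n // 2                     # start in the middle
            while 0 <= s < n:
                s2 = s + rng.choice((-1, 1))
                r = 1.0 if s2 == n else 0.0
                v_next = v[s2] if 0 <= s2 < n else 0.0
                delta = r + gamma * v_next - v[s]   # TD error
                e[s] += 1.0                         # accumulate trace
                for i in range(n):                  # credit recent states
                    v[i] += alpha * delta * e[i]
                    e[i] *= gamma * lam             # traces decay
                s = s2
        return v

    print([round(x, 2) for x in td_lambda()])  # values rise to the right
    ```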

  18. A Pragmatic Approach for Reproducible Research With Sensitive Data.

    Science.gov (United States)

    Shepherd, Bryan E; Blevins Peratikos, Meridith; Rebeiro, Peter F; Duda, Stephany N; McGowan, Catherine C

    2017-08-15

    Reproducible research is important for assessing the integrity of findings and disseminating methods, but it requires making original study data sets publicly available. This requirement is difficult to meet in settings with sensitive data, which can mean that resulting studies are not reproducible. For studies in which data cannot be shared, we propose a pragmatic approach to make research quasi-reproducible. On a publicly available website without restriction, researchers should post 1) analysis code used in the published study, 2) simulated data, and 3) results obtained by applying the analysis code used in the published study to the simulated data. Although it is not a perfect solution, such an approach makes analyses transparent for critical evaluation and dissemination and is therefore a significant improvement over current practice. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
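
    A minimal sketch of the proposed quasi-reproducible bundle: simulated data with the same schema as the sensitive original, the analysis code, and the results of applying that code to the simulation. The variables and the simple risk-comparison analysis are hypothetical examples, not taken from the article.

    ```python
    import csv
    import random

    # Sketch: publish (1) the analysis code, (2) simulated data with the
    # same schema as the sensitive original, (3) results on the simulation.
    def simulate(n, seed=42):
        rng = random.Random(seed)   # fixed seed so readers get the same file
        return [{"age": rng.randint(18, 90),
                 "treated": rng.random() < 0.5,
                 "outcome": rng.random() < 0.3}
                for _ in range(n)]

    def analyze(rows):
        """The same analysis function would be applied to the real data."""
        treated = [r for r in rows if r["treated"]]
        control = [r for r in rows if not r["treated"]]
        rate = lambda g: sum(r["outcome"] for r in g) / len(g)
        return {"risk_treated": rate(treated), "risk_control": rate(control)}

    if __name__ == "__main__":
        rows = simulate(1000)
        with open("simulated_data.csv", "w", newline="") as f:
            w = csv.DictWriter(f, fieldnames=rows[0])
            w.writeheader()
            w.writerows(rows)
        print(analyze(rows))   # post these results next to code and CSV
    ```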

  19. Dosimetric Algorithm to Reproduce Isodose Curves Obtained from a LINAC

    Directory of Open Access Journals (Sweden)

    Julio Cesar Estrada Espinosa

    2014-01-01

    In this work, isodose curves are obtained by the use of a new dosimetric algorithm using numerical data from percentage depth dose (PDD) and the maximum absorbed dose profile, calculated by Monte Carlo for an 18 MV LINAC. The software allows the absorbed dose percentage in the whole irradiated volume to be reproduced quickly and with a good approximation. To validate the results, a full-geometry model of an 18 MV LINAC and a water phantom was constructed. On this model, the simulations were run with the MCNPX code to obtain the PDD and profiles for all depths of the radiation beam. These data were then used by the code to produce the dose percentages at any point of the irradiated volume. The absorbed dose was also reproduced for any voxel size at any point of the irradiated volume, even when the voxels are as small as a single pixel. The dosimetric algorithm is able to reproduce the absorbed dose induced by a radiation beam over a water phantom, considering PDD and profiles, whose maximum percent value is in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days taken when the calculation is carried out by Monte Carlo.
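
    The reconstruction step the abstract describes can be sketched as interpolation over tabulated Monte Carlo data: the dose percentage at a point is approximated as the PDD at the point's depth multiplied by a normalized off-axis profile. The tabulated numbers below are hypothetical placeholders, not the paper's 18 MV data.

    ```python
    import numpy as np

    # Sketch: dose percentage at (x, z) ~ PDD(z) * OAR(x), interpolating
    # tabulated values. All numbers are illustrative placeholders.
    depths = np.array([0.0, 3.5, 10.0, 20.0])   # cm; build-up near 3.5 cm
    pdd = np.array([40.0, 100.0, 78.0, 55.0])   # percent of maximum dose

    off_axis = np.array([0.0, 5.0, 7.0, 10.0])  # cm from the beam axis
    oar = np.array([1.00, 0.98, 0.50, 0.03])    # normalized profile

    def dose_percent(x_cm, z_cm):
        """Approximate percent dose at off-axis distance x and depth z."""
        return float(np.interp(z_cm, depths, pdd)
                     * np.interp(abs(x_cm), off_axis, oar))

    print(round(dose_percent(0.0, 10.0), 1))  # on-axis at 10 cm -> 78.0
    print(round(dose_percent(6.0, 3.5), 1))   # penumbra at dose maximum
    ```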

  20. Development of PARASOL code

    Energy Technology Data Exchange (ETDEWEB)

    Hosokawa, Masanari [Research Organization for Information Science and Technology, Tokai, Ibaraki (Japan); Takizuka, Tomonori [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2000-05-01

    The divertor is expected to play key roles in tokamak reactors, such as ITER, for heat removal, ash exhaust, and impurity shielding. Its performance is being predicted by using comprehensive simulation codes with the fluid model. In the fluid model for scrape-off layer (SOL) and divertor plasmas, various physics models are introduced. A kinetic approach is required to examine the validity of such physics models. One of the most powerful kinetic models is the particle simulation. Therefore a particle code, PARASOL, has been developed and is being used for the simulation study of SOL and divertor plasmas. The PARASOL code treats the plasma bounded by two divertor plates, in which the motions of ions and electrons are traced by using an electrostatic PIC method. Effects of Coulomb collisions are simulated by using a Monte Carlo binary collision model. Motions of neutral particles are traced simultaneously with charged particles. In this report, we describe the physics model of PARASOL, the numerical methods, the configuration of the program, input parameters, output formats, samples of simulation results, and the parallel computing method. The efficiency of parallel computing with the Paragon XP/S15-256 is demonstrated. (author)

  1. Theory of reproducing kernels and applications

    CERN Document Server

    Saitoh, Saburou

    2016-01-01

    This book provides a large extension of the general theory of reproducing kernels published by N. Aronszajn in 1950, with many concrete applications. In Chapter 1, many concrete reproducing kernels are first introduced with detailed information. Chapter 2 presents a general and global theory of reproducing kernels with basic applications in a self-contained way. Many fundamental operations among reproducing kernel Hilbert spaces are dealt with. Chapter 2 is the heart of this book. Chapter 3 is devoted to the Tikhonov regularization using the theory of reproducing kernels with applications to numerical and practical solutions of bounded linear operator equations. In Chapter 4, the numerical real inversion formulas of the Laplace transform are presented by applying the Tikhonov regularization, where the reproducing kernels play a key role in the results. Chapter 5 deals with ordinary differential equations; Chapter 6 includes many concrete results for various fundamental partial differential equations. In Chapt...

  2. Tools for Reproducibility and Extensibility in Scientific Research

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Open inquiry through reproducing results is fundamental to the scientific process. Contemporary research relies on software engineering pipelines to collect, process, and analyze data. The open source projects within Project Jupyter facilitate these objectives by bringing software engineering within the context of scientific communication. We will highlight specific projects that are computational building blocks for scientific communication, starting with the Jupyter Notebook. We will also explore applications of projects that build off of the Notebook such as Binder, JupyterHub, and repo2docker. We will discuss how these projects can individually and jointly improve reproducibility in scientific communication. Finally, we will demonstrate applications of Jupyter software that allow researchers to build upon the code of other scientists, both to extend their work and the work of others.    There will be a follow-up demo session in the afternoon, hosted by iML. Details can be foun...

  3. Epidemic contact tracing via communication traces.

    Science.gov (United States)

    Farrahi, Katayoun; Emonet, Rémi; Cebrian, Manuel

    2014-01-01

    Traditional contact tracing relies on knowledge of the interpersonal network of physical interactions, where contagious outbreaks propagate. However, due to privacy constraints and noisy data assimilation, this network is generally difficult to reconstruct accurately. Communication traces obtained by mobile phones are known to be good proxies for the physical interaction network, and they may provide a valuable tool for contact tracing. Motivated by this assumption, we propose a model for contact tracing, where an infection is spreading in the physical interpersonal network, which can never be fully recovered; and contact tracing is occurring in a communication network which acts as a proxy for the first. We apply this dual model to a dataset covering 72 students over a 9 month period, for which both the physical interactions as well as the mobile communication traces are known. Our results suggest that a wide range of contact tracing strategies may significantly reduce the final size of the epidemic, by mainly affecting its peak of incidence. However, we find that for low overlap between the face-to-face and communication interaction network, contact tracing is only efficient at the beginning of the outbreak, due to rapidly increasing costs as the epidemic evolves. Overall, contact tracing via mobile phone communication traces may be a viable option to arrest contagious outbreaks.
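
    The dual-network model can be sketched in a few lines: infection spreads over physical-contact edges, while tracing can only follow edges of a partially overlapping communication graph. The toy graphs, infection probability and step count below are assumptions for illustration, not the paper's dataset or parameters.

    ```python
    import random

    # Sketch of the dual-network idea: spread on a physical-contact graph,
    # trace (and quarantine) only along communication edges.
    def simulate(physical, communication, p_infect=0.3, steps=10, seed=1):
        rng = random.Random(seed)
        infected, quarantined = {0}, set()
        for _ in range(steps):
            for u in list(infected - quarantined):
                for v in physical.get(u, []):     # spread on physical edges
                    if v not in infected and rng.random() < p_infect:
                        infected.add(v)
                # tracing sees only communication neighbours of a known case
                quarantined.update(communication.get(u, []))
        return infected, quarantined

    physical = {0: [1, 2], 1: [3], 2: [4], 3: [5], 4: [5]}
    communication = {0: [1], 1: [3], 2: [], 3: [5]}   # partial overlap
    print(simulate(physical, communication))
    ```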

  4. Epidemic contact tracing via communication traces.

    Directory of Open Access Journals (Sweden)

    Katayoun Farrahi

    Traditional contact tracing relies on knowledge of the interpersonal network of physical interactions, where contagious outbreaks propagate. However, due to privacy constraints and noisy data assimilation, this network is generally difficult to reconstruct accurately. Communication traces obtained by mobile phones are known to be good proxies for the physical interaction network, and they may provide a valuable tool for contact tracing. Motivated by this assumption, we propose a model for contact tracing, where an infection is spreading in the physical interpersonal network, which can never be fully recovered; and contact tracing is occurring in a communication network which acts as a proxy for the first. We apply this dual model to a dataset covering 72 students over a 9 month period, for which both the physical interactions as well as the mobile communication traces are known. Our results suggest that a wide range of contact tracing strategies may significantly reduce the final size of the epidemic, by mainly affecting its peak of incidence. However, we find that for low overlap between the face-to-face and communication interaction network, contact tracing is only efficient at the beginning of the outbreak, due to rapidly increasing costs as the epidemic evolves. Overall, contact tracing via mobile phone communication traces may be a viable option to arrest contagious outbreaks.

  5. An open source approach to enable the reproducibility of scientific workflows in the ocean sciences

    Science.gov (United States)

    Di Stefano, M.; Fox, P. A.; West, P.; Hare, J. A.; Maffei, A. R.

    2013-12-01

    Every scientist should be able to rerun data analyses conducted by his or her team and regenerate the figures in a paper. However, all too often the correct version of a script goes missing, or the original raw data is filtered by hand and the filtering process is undocumented, or there is lack of collaboration and communication among scientists working in a team. Here we present 3 different use cases in ocean sciences in which end-to-end workflows are tracked. The main tool that is deployed to address these use cases is based on a web application (IPython Notebook) that provides the ability to work on very diverse and heterogeneous data and information sources, providing an effective way to share and track changes to the source code used to generate data products and associated metadata, as well as to track the overall workflow provenance to allow versioned reproducibility of a data product. Use cases selected for this work are: 1) A partial reproduction of the Ecosystem Status Report (ESR) for the Northeast U.S. Continental Shelf Large Marine Ecosystem. Our goal with this use case is to enable not just the traceability but also the reproducibility of this biannual report, keeping track of all the processes behind the generation and validation of time-series and spatial data and information products. An end-to-end workflow with code versioning is developed so that indicators in the report may be traced back to the source datasets. 2) Realtime generation of web pages to be able to visualize one of the environmental indicators from the Ecosystem Advisory for the Northeast Shelf Large Marine Ecosystem web site. 3) Data and visualization integration for ocean climate forecasting. In this use case, we focus on a workflow to describe how to provide access to online data sources in the NetCDF format and other model data, and make use of multicore processing to generate video animation from time series of gridded data. For each use case we show how complete workflows

  6. Parametric Trace Slicing

    Science.gov (United States)

    Rosu, Grigore (Inventor); Chen, Feng (Inventor); Chen, Guo-fang; Wu, Yamei; Meredith, Patrick O. (Inventor)

    2014-01-01

    A program trace is obtained and events of the program trace are traversed. For each event identified in traversing the program trace, a trace slice of which the identified event is a part is identified based on the parameter instance of the identified event. For each trace slice of which the identified event is a part, the identified event is added to an end of a record of the trace slice. These parametric trace slices can be used in a variety of different manners, such as for monitoring, mining, and predicting.
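
    A simplified sketch of the slicing step described above: events are grouped by their parameter instance while preserving order. The full algorithm also dispatches events to slices whose parameter instances extend the event's binding; the exact-match grouping and the event format below are simplifying assumptions.

    ```python
    from collections import defaultdict

    # Sketch: split a parametric trace into one sub-trace (slice) per
    # parameter instance, appending each event at the end of its slice.
    def slice_trace(trace):
        slices = defaultdict(list)
        for event, params in trace:
            slices[params].append(event)   # params must be hashable
        return dict(slices)

    # Monitoring a collection/iterator protocol, parametric in (c, i):
    trace = [
        ("create", (("c", 1), ("i", 1))),
        ("create", (("c", 1), ("i", 2))),
        ("next",   (("c", 1), ("i", 1))),
        ("next",   (("c", 1), ("i", 2))),
    ]
    print(slice_trace(trace))   # two slices, one per iterator instance
    ```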

  7. The International Atomic Energy Agency Flag Code

    International Nuclear Information System (INIS)

    1999-01-01

    The document reproduces the text of the IAEA Flag Code which was promulgated by the Director General on 15 September 1999, pursuant to the decision of the Board of Governors on 10 June 1999 to adopt an Agency flag as described in document GOV/1999/41 and its use in accordance with a flag code to be promulgated by the Director General

  8. Network Coding

    Indian Academy of Sciences (India)

    message symbols downstream, network coding achieves vast performance gains by permitting intermediate nodes to carry out algebraic operations on the incoming data. In this article we present a tutorial introduction to network coding as well as an application to the efficient operation of distributed data-storage networks.
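
    The canonical butterfly-network example makes the idea concrete: an intermediate node forwards the XOR of two messages, and each sink recovers the message it did not receive directly by XOR-ing once more. The message values in this Python sketch are arbitrary.

    ```python
    # Butterfly-network example of network coding: the bottleneck node sends
    # a XOR b instead of forwarding one message, so both sinks decode both.
    a, b = 0b1011, 0b0110        # source messages for sinks t1 and t2

    coded = a ^ b                # intermediate node sends a XOR b downstream

    # Sink t1 receives `a` directly plus the coded packet; t2 receives `b`.
    b_at_t1 = a ^ coded
    a_at_t2 = b ^ coded

    assert (b_at_t1, a_at_t2) == (b, a)   # both sinks recover both messages
    ```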

  9. Comparative evaluation of trace elements in blood

    International Nuclear Information System (INIS)

    Goeij, J.J.M. de; Tjioe, P.S.; Pries, C.; Zwiers, J.H.L.

    1976-01-01

    The Interuniversitair Reactor Instituut and the Centraal Laboratorium TNO have carried out a joint investigation of neutron activation analysis procedures for the determination of trace elements in blood. A comparative evaluation of five methods, destructive as well as non-destructive, is given. The sensitivity and reproducibility of the procedures are discussed. By combining some of the methods it is possible, starting with 1 ml of blood, to give quantitative information on 14 important trace elements: antimony, arsenic, bromine, cadmium, cobalt, gold, copper, mercury, molybdenum, nickel, rubidium, selenium, iron and zinc. The methods have also been applied to sodium, chromium and potassium

  10. Reproducibility of surface roughness in reaming

    DEFF Research Database (Denmark)

    Müller, Pavel; De Chiffre, Leonardo

    An investigation on the reproducibility of surface roughness in reaming was performed to document the applicability of this approach for testing cutting fluids. Austenitic stainless steel was used as a workpiece material and HSS reamers as cutting tools. Reproducibility of the results was evaluat...

  11. Examination of reproducibility in microbiological degradation experiments

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Spliid, Henrik; Holst, Helle

    1998-01-01

    Experimental data indicate that certain microbiological degradation experiments have a limited reproducibility. Nine identical batch experiments were carried out on 3 different days to examine reproducibility. A pure culture, isolated from soil, grew with toluene as the only carbon and energy...

  12. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses. The system differs from other systems for reproducible analysis in several ways. The two main differences are: (1) Several statistics programs can be used in the same document. (2) Documents can be prepared using OpenOffice or ...

  13. Reproducibility principles, problems, practices, and prospects

    CERN Document Server

    Maasen, Sabine

    2016-01-01

    Featuring peer-reviewed contributions from noted experts in their fields of research, Reproducibility: Principles, Problems, Practices, and Prospects presents state-of-the-art approaches to reproducibility, the gold standard of sound science, from multi- and interdisciplinary perspectives. Including comprehensive coverage for implementing and reflecting the norm of reproducibility in various pertinent fields of research, the book focuses on how the reproducibility of results is applied, how it may be limited, and how such limitations can be understood or even controlled in the natural sciences, computational sciences, life sciences, social sciences, and studies of science and technology. The book presents many chapters devoted to a variety of methods and techniques, as well as their epistemic and ontological underpinnings, which have been developed to safeguard reproducible research and curtail deficits and failures. The book also investigates the political, historical, and social practices that underlie repro...

  14. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction

    Directory of Open Access Journals (Sweden)

    Eiji Watanabe

    2018-03-01

    The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one of the bases of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.

  15. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was launched in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association ..., design thinking and design pedagogy, Stine Ejsing-Duun of the research lab IT and Learning Design (ILD-LAB) at the Department of Communication and Psychology, Aalborg University in Copenhagen. We followed, evaluated and documented the Coding Class project in the period from November 2016 to May 2017. ... The Coding Class project is a pilot project in which a number of schools in the Copenhagen and Vejle municipalities launched teaching activities focusing on coding and programming in school. The evaluation and documentation of the project comprise qualitative case studies of selected teaching interventions in the autumn of

  16. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  17. The Economics of Reproducibility in Preclinical Research.

    Directory of Open Access Journals (Sweden)

    Leonard P Freedman

    2015-06-01

    Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B) per year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.

  18. Learning Reproducibility with a Yearly Networking Contest

    KAUST Repository

    Canini, Marco

    2017-08-10

    Better reproducibility of networking research results is currently a major goal that the academic community is striving towards. This position paper makes the case that improving the extent and pervasiveness of reproducible research can be greatly fostered by organizing a yearly international contest. We argue that holding a contest undertaken by a plurality of students will have benefits that are two-fold. First, it will promote hands-on learning of skills that are helpful in producing artifacts at the replicable-research level. Second, it will advance the best practices regarding environments, testbeds, and tools that will aid the tasks of reproducibility evaluation committees by and large.

  19. Thou Shalt Be Reproducible! A Technology Perspective

    Directory of Open Access Journals (Sweden)

    Patrick Mair

    2016-07-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies.

  20. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.

  1. Transparent, reproducible and reusable research in pharmacoepidemiology

    NARCIS (Netherlands)

    Gardarsdottir, Helga; Sauer, Brian C.; Liang, Huifang; Ryan, Patrick; Klungel, Olaf; Reynolds, Robert

    2012-01-01

    Background: Epidemiological research has been criticized as being unreliable. Scientific evidence is strengthened when the study procedures of important findings are transparent, open for review, and easily reproduced by different investigators and in various settings. Studies often have common

  2. Thou Shalt Be Reproducible! A Technology Perspective

    Science.gov (United States)

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486

  3. Archiving Reproducible Research with R and Dataverse

    DEFF Research Database (Denmark)

    Leeper, Thomas

    2014-01-01

    Reproducible research and data archiving are increasingly important issues in research involving statistical analyses of quantitative data. This article introduces the dvn package, which allows R users to publicly archive datasets, analysis files, codebooks, and associated metadata in Dataverse...

  4. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence the end-to-end performance of the digital link becomes essentially independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term for these techniques, often used interchangeably with speech coding, is voice coding. This term is more generic in the sense that the
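
    One classic ingredient of waveform-style speech coding is logarithmic companding before uniform quantization, as in G.711 mu-law, so that quiet speech keeps relative precision. The sketch below is a simplified 8-bit illustration of that idea, not a complete codec.

    ```python
    import math

    # Simplified mu-law companding: compress amplitude logarithmically,
    # quantize uniformly to 8 bits, then invert. Not a full G.711 codec.
    MU = 255.0

    def mu_law_encode(x):
        """Map x in [-1, 1] to an 8-bit code word (0..255)."""
        y = math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)
        return int(round((y + 1.0) * 127.5))   # uniform 8-bit quantizer

    def mu_law_decode(code):
        y = code / 127.5 - 1.0
        return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

    sample = 0.02                                 # a quiet sample
    code = mu_law_encode(sample)
    print(code, round(mu_law_decode(code), 4))    # reconstruction near 0.02
    ```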

  5. Automated Generation of Technical Documentation and Provenance for Reproducible Research

    Science.gov (United States)

    Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.

    2017-12-01

    Data provenance and detailed technical documentation are essential components of high-quality reproducible research; however, they are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right as it is a time consuming and often boring process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, this can be a poor substitute for good technical documentation and is often more difficult for a third-party to understand - particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.

  6. Reply to comment by Añel on "Most computational hydrology is not reproducible, so is it really science?"

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-03-01

    In this article, we reply to a comment made on our previous commentary regarding reproducibility in computational hydrology. Software licensing and version control of code are important technical aspects of making code and workflows of scientific experiments open and reproducible. However, in our view, it is the cultural change that is the greatest challenge to overcome to achieve reproducible scientific research in computational hydrology. We believe that from changing the culture and attitude among hydrological scientists, details will evolve to cover more (technical) aspects over time.

  7. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, the governing equations and formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault, their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a User's manual, describing simulation procedures, input data preparation, output and example test cases

  8. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and softwa...... expression in the public realm. The book’s line of argument defends language against its invasion by economics, arguing that speech continues to underscore the human condition, however paradoxical this may seem in an era of pervasive computing....

  9. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  10. ANIMAL code

    Energy Technology Data Exchange (ETDEWEB)

    Lindemuth, I.R.

    1979-02-28

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and presents its physical model. Temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm, and the functions of the algorithm's FORTRAN subroutines and variables are outlined.

  11. Network Coding

    Indian Academy of Sciences (India)

    Network Coding. K V Rashmi, Nihar B Shah and P Vijay Kumar. General Article, Resonance – Journal of Science Education, Volume 15, Issue 7, July 2010, pp 604-621. Permanent link: https://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621

  12. ANIMAL code

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1979-01-01

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and presents its physical model. Temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm, and the functions of the algorithm's FORTRAN subroutines and variables are outlined.

  13. Expander Codes

    Indian Academy of Sciences (India)

    Codes and Channels. A noisy communication channel is illustrated in Fig- ... nication channel. Suppose we want to transmit a message over the unreliable communication channel so that even if the channel corrupts some of the bits we are able to recover ... is d-regular, meaning thereby that every vertex has degree d.

  14. Expander Codes

    Indian Academy of Sciences (India)

    Expander Codes - The Sipser–Spielman Construction. Priti Shankar. General Article, Resonance – Journal of Science Education, Volume 10, Issue 1. Author affiliation: Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India.

  15. Network Coding

    Indian Academy of Sciences (India)

    Network coding is a technique to increase the amount of information flow in a network by making the key observation that information flow is fundamentally different from commodity flow. Whereas, under traditional methods of operation of data networks, intermediate nodes are restricted to simply forwarding their incoming packets, network coding allows intermediate nodes to combine the packets they receive before forwarding them.
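
    The canonical illustration of this observation is the butterfly network, in which a single coded packet on the bottleneck link lets each of two sinks recover both source bits. A minimal sketch in Python (the function names are ours, for illustration only):

        # Butterfly network: the middle node forwards a XOR b on the shared
        # bottleneck link; each sink combines it with the bit it receives directly.
        def encode(a: int, b: int) -> int:
            return a ^ b

        def sink1(a_direct: int, coded: int):
            return a_direct, a_direct ^ coded   # recovers (a, b)

        def sink2(b_direct: int, coded: int):
            return b_direct ^ coded, b_direct   # recovers (a, b)

        a, b = 1, 0
        c = encode(a, b)
        assert sink1(a, c) == (a, b) and sink2(b, c) == (a, b)

    With plain forwarding, the bottleneck link could carry only one of the two bits per use; coding therefore raises the multicast throughput.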

  16. Enacting the International/Reproducing Eurocentrism

    Directory of Open Access Journals (Sweden)

    Zeynep Gülşah Çapan

    Full Text Available Abstract This article focuses on the way in which Eurocentric conceptualisations of the ‘international’ are reproduced in different geopolitical contexts. Even though the Eurocentrism of International Relations has received growing attention, it has predominantly been concerned with unearthing the Eurocentrism of the ‘centre’, overlooking its varied manifestations in other geopolitical contexts. The article seeks to contribute to discussions about Eurocentrism by examining how different conceptualisations of the international are at work at a particular moment, and how these conceptualisations continue to reproduce Eurocentrism. It will focus on the way in which Eurocentric designations of spatial and temporal hierarchies were reproduced in the context of Turkey through a reading of how the ‘Gezi Park protests’ of 2013 and ‘Turkey’ itself were written into the story of the international.

  17. Highly reproducible polyol synthesis for silver nanocubes

    Science.gov (United States)

    Han, Hye Ji; Yu, Taekyung; Kim, Woo-Sik; Im, Sang Hyuk

    2017-07-01

    We could synthesize Ag nanocubes highly reproducibly by conducting the polyol synthesis with an HCl etchant under dark conditions, because the photodecomposition/photoreduction of the AgCl nanoparticles formed at the initial reaction stage was greatly suppressed; consequently, the selective self-nucleation of Ag single crystals and their selective growth reaction could be promoted. In contrast, the reproducibility of Ag nanocube formation was very poor when the synthesis was conducted under light conditions, owing to the photoreduction of AgCl to Ag.

  18. Generation and validation of traces between requirements and architecture based on formal trace semantics

    NARCIS (Netherlands)

    Göknil, Arda; Ivanov, Ivan; van den Berg, Klaas

    The size and complexity of software systems make integration of new/modified requirements into the software system costly and time-consuming. The impact of requirements changes on other requirements, design elements and source code should be traced to determine parts of the software to be changed.

  19. The Trace of Superusers

    DEFF Research Database (Denmark)

    Samson, Kristine; Abasolo, José

    2013-01-01

    of people’s everyday life. However, traces of culture, the routines and everyday habits of immigrant culture, can both emerge through informal colonization in the everyday and be intentionally designed. By juxtaposing immigrant spatial traces in Santiago Centro with the intentionally designed traces......The city and its public spaces can be seen as a fragmented whole carrying meanings and traces of culture, use and politics with it. Whereas architects impose new stories and meanings on the urban fabric, the city itself is layered and assembled, a collective of social flows and routines, a result...... of immigrant culture at Superkilen, Nørrebro in Copenhagen, this article seeks to discuss how traces influence public space, and how various ideologies and even politics are interwoven into the urban fabric by means of urban traces....

  20. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)

  1. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahník, Štěpán; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Brüning, Jovita; Calhoun-Sauls, Ann; Callahan, Shannon P.; Chagnon, Elizabeth; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Christopherson, Cody D.; Cillessen, Linda; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Conn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Penna, Nicolás Delia; Den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernández-Castilla, Belén; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Glöckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise J.; Hung, Cathy O Y; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jäkel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knežević, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniël; Lane, Kristin A.; Lassetter, Bethany; Lazarević, Ljiljana B.; Le Bel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, Melissa; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; MacKinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Möschl, Marcus; Motyl, Matt; Müller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michèle B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. 
Nathan; Pitts, Michael C.; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schlegelmilch, René; Schmidt, Kathleen; Scholz, Sabine; Seibel, Larissa; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. Colleen; Skorinko, Jeanine L M; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; Te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valášek, Milan; Van't Veer, Anna E.; Van Aert, Robbie; Van Assen, Marcel; Van Bork, Riet; Van De Ven, Mathijs; Van Den Bergh, Don; Van Der Hulst, Marije; Van Dooren, Roel; Van Doorn, Johnny; Van Renswoude, Daan R.; Van Rijn, Hedderik; Vanpaemel, Wolf; Echeverría, Alejandro Vásquez; Vazquez, Melissa; Velez, Natalia; Vermue, Marieke; Verschoor, Mark; Vianello, Michelangelo; Voracek, Martin; Vuu, Gina; Wagenmakers, Eric-Jan; Weerdmeester, Joanneke; Welsh, Ashlee; Westgate, Erin C.; Wissink, Joeri; Wood, Michael; Woods, Andy; Wright, Emily; Wu, Sining; Zeelenberg, Marcel; Zuni, Kellylynn

    2015-01-01

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available.

  2. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahnik, Stepan; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Bruening, Jovita; Calhoun-Sauls, Ann; Chagnon, Elizabeth; Callahan, Shannon P.; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Cillessen, Linda; Christopherson, Cody D.; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Cohn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Hartgerink, Chris; Krijnen, Job; Nuijten, Michele B.; van 't Veer, Anna E.; Van Aert, Robbie; van Assen, M.A.L.M.; Wissink, Joeri; Zeelenberg, Marcel

    2015-01-01

    INTRODUCTION Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research

  3. REPRODUCIBILITY OF CHILDHOOD RESPIRATORY SYMPTOM QUESTIONS

    NARCIS (Netherlands)

    BRUNEKREEF, B; GROOT, B; RIJCKEN, B; HOEK, G; STEENBEKKERS, A; DEBOER, A

    The reproducibility of answers to childhood respiratory symptom questions was investigated by administering two childhood respiratory symptom questionnaires twice, with a one month interval, to the same population of Dutch school children. The questionnaires were completed by the parents of 410

  4. Towards Reproducible Research Data Analyses in LHC Particle Physics

    CERN Document Server

    Simko, Tibor

    2017-01-01

    The reproducibility of research data analysis requires having access not only to the original datasets, but also to the computing environment, the analysis software and the workflow used to produce the original results. We present the nascent CERN Analysis Preservation platform with a set of tools developed to support particle physics researchers in preserving the knowledge around analyses so that capturing, sharing, reusing and reinterpreting data becomes easier. The presentation will focus on three pillars: (i) capturing structured knowledge information about data analysis processes; (ii) capturing the computing environment, the software code, the datasets, the configuration and other information assets used in data analyses; (iii) re-instantiation of preserved analyses on a containerised computing cloud for the purposes of re-validation and re-interpretation.
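
    As a rough sketch of pillar (iii), re-instantiating a preserved analysis amounts to replaying a recorded command inside the captured environment image, with the preserved datasets mounted. The Python snippet below is a minimal illustration driving the docker CLI, not the actual CERN Analysis Preservation tooling; the image name, command and paths are hypothetical:

        import subprocess

        def rerun_analysis(image: str, command: str, data_dir: str) -> int:
            """Replay a preserved analysis: the container image pins the
            computing environment, the read-only volume supplies the preserved
            datasets, and the recorded command re-executes the workflow."""
            return subprocess.run(
                ["docker", "run", "--rm",
                 "-v", f"{data_dir}:/data:ro",
                 image, "sh", "-c", command],
            ).returncode

        # Hypothetical preserved record
        exit_code = rerun_analysis("analysis-env:2017",
                                   "python fit.py /data/sample.root",
                                   "./preserved")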

  5. A PHYSICAL ACTIVITY QUESTIONNAIRE: REPRODUCIBILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Nicolas Barbosa

    2007-12-01

    Full Text Available This study evaluates the reproducibility and validity of the Quantification de L'Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) of Bogotá's schoolchildren. Comprehension was assessed on 324 students, whereas reproducibility was studied on a different random sample of 162 who were exposed to it twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys who completed the test-retest study. The DEE derived from the questionnaire was compared with the laboratory measurement results of peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger Test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories: 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between the mean DEE as estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16 it was 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80), respectively. The QAPACE questionnaire is reproducible and valid for estimating PA and showed a high correlation with Peak VO2 uptake.
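
    For readers unfamiliar with the Bland-Altman method used above, it summarizes test-retest agreement by the mean difference (bias) and the 95% limits of agreement. A minimal Python sketch with hypothetical DEE values (not the study's data):

        import numpy as np

        def bland_altman(test: np.ndarray, retest: np.ndarray):
            """Bias and 95% limits of agreement between two administrations."""
            diff = test - retest
            bias = diff.mean()
            spread = 1.96 * diff.std(ddof=1)
            return bias, (bias - spread, bias + spread)

        # Hypothetical DEE estimates (kcal/day) from two administrations
        test = np.array([2100.0, 1850.0, 2400.0, 1990.0])
        retest = np.array([2050.0, 1900.0, 2380.0, 2010.0])
        print(bland_altman(test, retest))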

  6. Repeat: a framework to assess empirical reproducibility in biomedical research

    Directory of Open Access Journals (Sweden)

    Leslie D. McIntosh

    2017-09-01

    Full Text Available Abstract Background The reproducibility of research is essential to rigorous science, yet significant concerns about the reliability and verifiability of biomedical research have recently been highlighted. Ongoing efforts across several domains of science and policy are working to clarify the fundamental characteristics of reproducibility and to enhance the transparency and accessibility of research. Methods The aim of the present work is to develop an assessment tool operationalizing key concepts of research transparency in the biomedical domain, specifically for secondary biomedical data research using electronic health record data. The tool (RepeAT) was developed through a multi-phase process that involved coding and extracting recommendations and practices for improving reproducibility from publications and reports across the biomedical and statistical sciences, field testing the instrument, and refining variables. Results RepeAT includes 119 unique variables grouped into five categories (research design and aim, database and data collection methods, data mining and data cleaning, data analysis, data sharing and documentation). Preliminary results in manually processing 40 scientific manuscripts indicate components of the proposed framework with strong inter-rater reliability, as well as directions for further research and refinement of RepeAT. Conclusions The use of RepeAT may allow the biomedical community to gain a better understanding of the current practices of research transparency and accessibility among principal investigators. Common adoption of RepeAT may improve reporting of research practices and the availability of research outputs. Additionally, use of RepeAT will facilitate comparisons of research transparency and accessibility across domains and institutions.

  7. Computer codes validation for conditions of core voiding

    International Nuclear Information System (INIS)

    Delja, A.; Hawley, P.

    2011-01-01

    Void generation during a Loss of Coolant Accident (LOCA) in the core of a CANDU reactor is of specific importance because of its strong coupling with reactor neutronics. The capability of computer codes to predict dynamic behaviour and void generation accurately in the temporal and spatial domain of the reactor core is fundamental to the determination of CANDU safety. The Canadian industry has used the RD-14M test facilities for its code validation. The validation exercises for the Canadian computer codes TUF and CATHENA were performed some years ago. Recently, the CNSC has gained access to the USNRC computer code TRACE. This has provided an opportunity to explore the use of this code in CANDU-related applications. As part of regulatory assessment and the resolution of identified Generic Issues (GI), and in an effort to build independent thermal hydraulic computer code assessment capability within the CNSC, preliminary validation exercises were performed using the TRACE computer code for an evaluation of the void generation phenomena. The paper presents a preliminary assessment of the TRACE computer code for an RD-14M channel voiding test. It is also a validation exercise of void generation for the TRACE computer code. The accuracy of the obtained results is discussed and compared with previous validation assessments that were done using the CATHENA and TUF codes. (author)

  8. What is Process Tracing actually tracing?

    DEFF Research Database (Denmark)

    Beach, Derek; Pedersen, Rasmus Brun

    and when we use PT case studies. First, there are differences in what we are actually tracing in the three variants, resulting in different methodological prescriptions for each variant. Second, the types of inferences being made are also different; the variants therefore have different analytical uses...

  9. System Code Models and Capabilities

    International Nuclear Information System (INIS)

    Bestion, D.

    2008-01-01

    System thermalhydraulic codes such as RELAP, TRACE, CATHARE or ATHLET are now commonly used for reactor transient simulations. The whole methodology of code development is described, including the derivation of the system of equations, the analysis of experimental data to obtain closure relations, and the validation process. The characteristics of the models are briefly presented, starting with the basic assumptions, the system of equations and the derivation of closure relationships. Extensive work was devoted during the last three decades to the improvement and validation of these models, which resulted in some homogenisation of the different codes although they were developed separately. The so-called two-fluid model is the common basis of these codes, and it is shown how it can describe both thermal and mechanical nonequilibrium. A review of some important physical models illustrates the main capabilities and limitations of system codes. Attention is drawn to the role of flow regime maps, the various methods for developing closure laws, and the role of interfacial area and turbulence in interfacial and wall transfers. More details are given for interfacial friction laws and their relation to drift flux models. Prediction of choked flow and CCFL (counter-current flow limitation) is also addressed. Based on some limitations of the present generation of codes, perspectives for the future are drawn.
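
    For reference, the backbone of the six-equation two-fluid model mentioned above is a pair of phasic balance equations per conserved quantity. The mass balance in its standard textbook form (individual codes differ in notation and closure terms) reads:

        % alpha_k, rho_k, u_k: volume fraction, density and velocity of phase k;
        % Gamma_k: interfacial mass transfer, constrained by sum_k Gamma_k = 0.
        \frac{\partial (\alpha_k \rho_k)}{\partial t}
          + \nabla \cdot (\alpha_k \rho_k \mathbf{u}_k) = \Gamma_k ,
        \qquad \sum_k \alpha_k = 1 .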

  10. Interactive Stable Ray Tracing

    DEFF Research Database (Denmark)

    Dal Corso, Alessandro; Salvi, Marco; Kolb, Craig

    2017-01-01

    Interactive ray tracing applications running on commodity hardware can suffer from objectionable temporal artifacts due to a low sample count. We introduce stable ray tracing, a technique that improves temporal stability without the over-blurring and ghosting artifacts typical of temporal post-pr...

  11. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  12. Nuclear traces in glass

    International Nuclear Information System (INIS)

    Segovia A, M. de N.

    1978-01-01

    The charged particles produce, in dielectric materials, physical and chemical effects which make evident the damaged zone along the trajectory of the particle. This damaged zone is known as the latent trace. The latent traces can be enlarged by etching the detector material. This treatment preferentially attacks the zones of the material which the charged particles have penetrated, producing concavities which can be observed through a low-magnification optical microscope. These concavities are known as developed traces. In this work we describe the characteristics of glass as a detector of fission fragment traces. In the first chapter we present a summary of the existing basic theories which explain the formation of traces in solids. In the second chapter we describe the etching method used for trace development. In the following chapters we determine some characteristics of the traces formed on the glass, such as: the optimum development time; the variation of trace diameter and density with the temperature of the detector; the response of glass to radiation more penetrating than that of the fission fragments; the distribution of the developed traces; and the relation between this distribution and the energies of 252Cf fission fragments. The method which has been used is simple and cheap and can be utilized in laboratories whose resources are limited. The commercial glass which has been employed allows the registration of fission fragments and subsequently the realization of experiments which involve the counting of traces as well as the identification of particles. (author)

  13. Reproducibility of the chamber scarification test

    DEFF Research Database (Denmark)

    Andersen, Klaus Ejner

    1996-01-01

    The chamber scarification test is a predictive human skin irritation test developed to rank the irritation potential of products and ingredients meant for repeated use on normal and diseased skin. 12 products or ingredients can be tested simultaneously on the forearm skin of each volunteer....... The procedure combines scratching of the skin at each test site with subsequent closed patch tests with the products, repeated daily for 3 days. The test is performed on groups of human volunteers: a skin-irritant substance or product is included in each test as a positive control...... high reproducibility of the test. Further, intra-individual variation in skin reaction to the 2 control products in 26 volunteers, who participated 2x, is shown, which supports the conclusion that the chamber scarification test is a useful short-term human skin irritation test with high reproducibility.

  14. Reproducibility of scoring emphysema by HRCT

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R. [Kuopio Univ. Hospital (Finland). Dept. of Clinical Radiology; Erkinjuntti-Pekkanen, R. [Kuopio Univ. Hospital (Finland). Dept. of Pulmonary Diseases

    2002-04-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ-values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  15. Reproducibility of scoring emphysema by HRCT

    International Nuclear Information System (INIS)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R.; Erkinjuntti-Pekkanen, R.

    2002-01-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ-values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  16. Additive Manufacturing: Reproducibility of Metallic Parts

    Directory of Open Access Journals (Sweden)

    Konda Gokuldoss Prashanth

    2017-02-01

    Full Text Available The present study deals with the properties of five different metals/alloys fabricated by selective laser melting: Al-12Si, Cu-10Sn and 316L (face-centered cubic structure), and CoCrMo and commercially pure Ti (CP-Ti) (hexagonal close-packed structure). The room-temperature tensile properties of Al-12Si samples show good consistency in results within the experimental errors. Similar reproducible results were observed for sliding wear and corrosion experiments. The other metal/alloy systems also show repeatable tensile properties, with the tensile curves overlapping until the yield point. The curves may then follow the same path or show a marginal deviation (~10 MPa) until they reach the ultimate tensile strength, and a negligible difference in ductility levels (of ~0.3%) is observed between the samples. The results show that selective laser melting is a reliable fabrication method to produce metallic materials with consistent and reproducible properties.

  17. MCViNE - An object oriented Monte Carlo neutron ray tracing simulation package

    Science.gov (United States)

    Lin, Jiao Y. Y.; Smith, Hillary L.; Granroth, Garrett E.; Abernathy, Douglas L.; Lumsden, Mark D.; Winn, Barry; Aczel, Adam A.; Aivazis, Michael; Fultz, Brent

    2016-02-01

    MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example we used object oriented programming concepts for representing neutron scatterers and detector systems, and recursive algorithms for implementing multiple scattering. Combining these features together in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray tracing packages which facilitates porting instrument models from those codes. Furthermore it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. With simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.
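
    The object-oriented and recursive ideas described above can be caricatured in a few lines. The Python below is a toy sketch, not the MCViNE API: a scatterer object decides probabilistically whether a neutron interacts again, and recursion implements multiple scattering up to a fixed order.

        import random

        class Scatterer:
            """Toy neutron scatterer; interact() recurses to model
            multiple scattering, in the spirit described above."""
            def __init__(self, scatter_prob: float, max_order: int = 3):
                self.scatter_prob = scatter_prob
                self.max_order = max_order

            def interact(self, neutron: dict, order: int = 0) -> dict:
                # Stop when the neutron escapes or the order cap is reached
                if order >= self.max_order or random.random() > self.scatter_prob:
                    return neutron
                neutron = {"energy": neutron["energy"] * random.uniform(0.8, 1.0),
                           "weight": neutron["weight"] * self.scatter_prob}
                return self.interact(neutron, order + 1)

        sample = Scatterer(scatter_prob=0.4)
        detected = [sample.interact({"energy": 25.0, "weight": 1.0}) for _ in range(5)]

    Turning a scattering mechanism on or off then amounts to swapping kernel objects in and out, which is how simulations of this style let one attribute features of the measured intensity to specific physics.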

  18. Examining the Reproducibility of 6 Published Studies in Public Health Services and Systems Research.

    Science.gov (United States)

    Harris, Jenine K; B Wondmeneh, Sarah; Zhao, Yiqiang; Leider, Jonathon P

    2018-02-23

    Research replication, or repeating a study de novo, is the scientific standard for building evidence and identifying spurious results. While replication is ideal, it is often expensive and time-consuming. Reproducibility, or reanalysis of data to verify published findings, is one proposed minimum alternative standard. While a lack of research reproducibility has been identified as a serious and prevalent problem in biomedical research and a few other fields, little work has been done to examine the reproducibility of public health research. We examined reproducibility in 6 studies from the public health services and systems research subfield of public health research. Following the methods described in each of the 6 papers, we computed the descriptive and inferential statistics for each study. We compared our results with the original study results and examined the percentage differences in descriptive statistics and differences in effect size, significance, and precision of inferential statistics. All project work was completed in 2017. We found consistency between original and reproduced results for each paper in at least 1 of the 4 areas examined. However, we also found some inconsistency. We identified incorrect transcription of results and the omission of detail about data management and analyses as the primary contributors to the inconsistencies. Increasing reproducibility, or reanalysis of data to verify published results, can improve the quality of science. Researchers, journals, employers, and funders can all play a role in improving the reproducibility of science through several strategies including publishing data and statistical code, using guidelines to write clear and complete methods sections, conducting reproducibility reviews, and incentivizing reproducible science.
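
    The core check described above reduces to recomputing each published statistic and flagging large percentage differences. A minimal Python sketch (the 5% threshold and the example values are hypothetical, not from the study):

        def percent_difference(original: float, reproduced: float) -> float:
            """Percentage difference of a recomputed statistic vs. the published value."""
            return 100.0 * abs(reproduced - original) / abs(original)

        # Hypothetical published vs. recomputed descriptive statistics
        checks = {"mean_age": (45.2, 45.2), "pct_female": (61.0, 59.8)}
        for name, (orig, repro) in checks.items():
            flag = "ok" if percent_difference(orig, repro) < 5 else "inconsistent"
            print(f"{name}: {percent_difference(orig, repro):.1f}% ({flag})")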

  19. An International Ki67 Reproducibility Study

    Science.gov (United States)

    2013-01-01

    Background In breast cancer, immunohistochemical assessment of proliferation using the marker Ki67 has potential use in both research and clinical management. However, lack of consistency across laboratories has limited Ki67’s value. A working group was assembled to devise a strategy to harmonize Ki67 analysis and increase scoring concordance. Toward that goal, we conducted a Ki67 reproducibility study. Methods Eight laboratories received 100 breast cancer cases arranged into 1-mm core tissue microarrays—one set stained by the participating laboratory and one set stained by the central laboratory, both using antibody MIB-1. Each laboratory scored Ki67 as percentage of positively stained invasive tumor cells using its own method. Six laboratories repeated scoring of 50 locally stained cases on 3 different days. Sources of variation were analyzed using random effects models with log2-transformed measurements. Reproducibility was quantified by intraclass correlation coefficient (ICC), and the approximate two-sided 95% confidence intervals (CIs) for the true intraclass correlation coefficients in these experiments were provided. Results Intralaboratory reproducibility was high (ICC = 0.94; 95% CI = 0.93 to 0.97). Interlaboratory reproducibility was only moderate (central staining: ICC = 0.71, 95% CI = 0.47 to 0.78; local staining: ICC = 0.59, 95% CI = 0.37 to 0.68). Geometric mean of Ki67 values for each laboratory across the 100 cases ranged 7.1% to 23.9% with central staining and 6.1% to 30.1% with local staining. Factors contributing to interlaboratory discordance included tumor region selection, counting method, and subjective assessment of staining positivity. Formal counting methods gave more consistent results than visual estimation. Conclusions Substantial variability in Ki67 scoring was observed among some of the world’s most experienced laboratories. Ki67 values and cutoffs for clinical decision-making cannot be transferred between laboratories without
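
    The intraclass correlation coefficient reported above can be computed from one-way ANOVA mean squares. Below is a minimal Python sketch of ICC(1,1) on hypothetical paired scorings; the study itself fitted random effects models to log2-transformed measurements, which this simplified one-way version only approximates:

        import numpy as np

        def icc_oneway(scores: np.ndarray) -> float:
            """ICC(1,1) from one-way ANOVA mean squares;
            rows = cases, columns = repeated scorings."""
            n, k = scores.shape
            grand = scores.mean()
            ms_between = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_within = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
            return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

        # Hypothetical Ki67 percentages scored twice for four cases
        scores = np.log2([[12.0, 14.0], [7.0, 8.0], [25.0, 22.0], [15.0, 15.0]])
        print(round(icc_oneway(scores), 3))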

  20. Bad Behavior: Improving Reproducibility in Behavior Testing.

    Science.gov (United States)

    Andrews, Anne M; Cheng, Xinyi; Altieri, Stefanie C; Yang, Hongyan

    2018-01-24

    Systems neuroscience research is increasingly possible through the use of integrated molecular and circuit-level analyses. These studies depend on the use of animal models and, in many cases, molecular and circuit-level analyses associated with genetic, pharmacologic, epigenetic, and other types of environmental manipulations. We illustrate typical pitfalls resulting from poor validation of behavior tests. We describe experimental designs and enumerate controls needed to improve reproducibility in investigating and reporting of behavioral phenotypes.

  1. Reproducibility of 201Tl myocardial imaging

    International Nuclear Information System (INIS)

    McLaughlin, P.R.; Martin, R.P.; Doherty, P.; Daspit, S.; Goris, M.; Haskell, W.; Lewis, S.; Kriss, J.P.; Harrison, D.C.

    1977-01-01

    Seventy-six thallium-201 myocardial perfusion studies were performed on twenty-five patients to assess their reproducibility and the effect of varying the level of exercise on the results of imaging. Each patient had a thallium-201 study at rest. Fourteen patients had studies on two occasions at maximum exercise, and twelve patients had studies both at light and at maximum exercise. Of 70 segments in the 14 patients assessed on each of two maximum exercise tests, 64 (91 percent) were reproducible. Only 53 percent (16/30) of the ischemic defects present at maximum exercise were seen in the light exercise study in the 12 patients assessed at two levels of exercise. Correlation of perfusion defects with arteriographically proven significant coronary stenosis was good for the left anterior descending and right coronary arteries, but not as good for circumflex artery disease. Thallium-201 myocardial imaging at maximum exercise is reproducible within acceptable limits, but careful attention to exercise technique is essential for valid comparative studies

  2. A Framework for Reproducible Latent Fingerprint Enhancements.

    Science.gov (United States)

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow-motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information en route to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
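
    The smoothing step described above has a compact Fourier-domain expression: evolving u_t = -λ(-Δ)^β u multiplies each Fourier coefficient by exp(-λ t |ω|^(2β)), with 0 < β ≤ 1 the Lévy-stable exponent. The Python sketch below illustrates the idea and the stored audit-trail suite; the parameter values are illustrative, not those of the paper:

        import numpy as np

        def levy_smooth(image: np.ndarray, lam: float = 0.1, beta: float = 0.5, steps: int = 5):
            """Progressive smoothing via the FFT solution of a Levy-stable
            fractional diffusion equation; returns the suite of progressively
            smoother images (the audit trail)."""
            ny, nx = image.shape
            ky = np.fft.fftfreq(ny)[:, None]
            kx = np.fft.fftfreq(nx)[None, :]
            symbol = (kx ** 2 + ky ** 2) ** beta   # Fourier symbol of (-Laplacian)^beta
            u_hat = np.fft.fft2(image)
            return [np.fft.ifft2(u_hat * np.exp(-lam * t * symbol)).real
                    for t in range(1, steps + 1)]

        suite = levy_smooth(np.random.rand(64, 64))  # inspect suite[0] ... suite[-1]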

  3. A reproducible canine model of esophageal varices.

    Science.gov (United States)

    Jensen, D M; Machicado, G A; Tapia, J I; Kauffman, G; Franco, P; Beilin, D

    1983-03-01

    One of the most promising nonoperative techniques for control of variceal hemorrhage is sclerosis via the fiberoptic endoscope. Many questions remain, however, about sclerosing agents, guidelines for effective use, and limitations of endoscopic techniques. A reproducible large animal model of esophageal varices would facilitate the critical evaluation of techniques for variceal hemostasis or sclerosis. Our purpose was to develop a large animal model of esophageal varices. Studies in pigs and dogs are described which led to the development of a reproducible canine model of esophageal varices. For the final model, mongrel dogs had laparotomy, side-to-side portacaval shunt, inferior vena cava ligation, placement of an ameroid constrictor around the portal vein, and liver biopsy. The mean (+/- SE) pre- and postshunt portal pressure increased significantly from 12 +/- 0.4 to 23 +/- 1 cm saline. Weekly endoscopies were performed to grade the varix size. Two-thirds of animals developed medium or large sized esophageal varices after the first operation. Three to six weeks later, a second laparotomy with complete ligation of the portal vein and liver biopsy were performed in animals with varices (one-third of the animals). All dogs developed esophageal varices and abdominal wall collateral veins of variable size 3-6 wk after the first operation. After the second operation, the varices became larger. Shunting of blood through esophageal varices via splenic and gastric veins was demonstrated by angiography. Sequential liver biopsies were normal. There was no morbidity or mortality. Ascites, encephalopathy, or spontaneous variceal bleeding did not occur. We have documented the lack of size change and the persistence of medium to large esophageal varices and abdominal collateral veins in all animals followed for more than 6 mo. Variceal bleeding could be induced by venipuncture for testing endoscopic hemostatic and sclerosis methods. We suggest other potential uses of this

  4. Open and reproducible global land use classification

    Science.gov (United States)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental Research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and
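
    A self-organizing map classification of this kind can be reproduced with off-the-shelf tools. The sketch below uses the third-party Python package minisom and random stand-in indicator data (the original analysis was implemented in R and published through WPS4R, so this is an assumption-laden illustration, not the GLUES code):

        import numpy as np
        from minisom import MiniSom  # assumes the minisom package is installed

        # Stand-in indicator matrix: rows = grid cells, columns = normalized
        # land-use intensity, climate and socioeconomic indicators
        indicators = np.random.rand(1000, 30)

        # A 4x3 map yields 12 prototype vectors, one per land system archetype
        som = MiniSom(4, 3, input_len=indicators.shape[1], sigma=1.0, learning_rate=0.5)
        som.train_random(indicators, num_iteration=5000)

        # Each cell is assigned the archetype of its best-matching unit
        archetypes = [som.winner(row) for row in indicators]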

  5. Queer nuclear families? Reproducing and transgressing heteronormativity.

    Science.gov (United States)

    Folgerø, Tor

    2008-01-01

    During the past decade the public debate on gay and lesbian adoptive rights has been extensive in the Norwegian media. The debate illustrates how women and men planning to raise children in homosexual family constellations challenge prevailing cultural norms and existing concepts of kinship and family. The article discusses how lesbian mothers and gay fathers understand and redefine their own family practices. An essential point in this article is the fundamental ambiguity in these families' accounts of themselves: how they simultaneously transgress and reproduce heteronormative assumptions about childhood, fatherhood, motherhood, family and kinship.

  6. Traces of Drosophila Memory

    Science.gov (United States)

    Davis, Ronald L.

    2012-01-01

    Summary Studies using functional cellular imaging of living flies have identified six memory traces that form in the olfactory nervous system after conditioning with odors. These traces occur in distinct nodes of the olfactory nervous system, form and disappear across different windows of time, and are detected in the imaged neurons as increased calcium influx or synaptic release in response to the conditioned odor. Three traces form at, or near, acquisition and co-exist with short-term behavioral memory. One trace forms with a delay after learning and co-exists with intermediate-term behavioral memory. Two traces form many hours after acquisition and co-exist with long-term behavioral memory. The transient memory traces may support behavior across the time-windows of their existence. The experimental approaches for dissecting memory formation in the fly, ranging from the molecular to the systems, make it an ideal system for dissecting the logic by which the nervous system organizes and stores different temporal forms of memory. PMID:21482352

  7. Reproducibility in Research: Systems, Infrastructure, Culture

    Directory of Open Access Journals (Sweden)

    Tom Crick

    2017-11-01

    Full Text Available The reproduction and replication of research results has become a major issue for a number of scientific disciplines. In computer science and related computational disciplines such as systems biology, the challenges closely revolve around the ability to implement (and exploit) novel algorithms and models. Taking a new approach from the literature and applying it to a new codebase frequently requires local knowledge missing from the published manuscripts and transient project websites. Alongside this issue, benchmarking, and the lack of open, transparent and fair benchmark sets, presents another barrier to the verification and validation of claimed results. In this paper, we outline several recommendations to address these issues, driven by specific examples from a range of scientific domains. Based on these recommendations, we propose a high-level prototype open automated platform for scientific software development which effectively abstracts specific dependencies from the individual researcher and their workstation, allowing easy sharing and reproduction of results. This new e-infrastructure for reproducible computational science offers the potential to incentivise a culture change and drive the adoption of new techniques to improve the quality and efficiency – and thus reproducibility – of scientific exploration.

  8. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field: two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding; Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes; distance properties of convolutional codes; and a downloadable solutions manual.

  9. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    Science.gov (United States)

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams. Copyright © 2015, American Association for the Advancement of Science.

  10. Systematic Methodology for Reproducible Optimizing Batch Operation

    DEFF Research Database (Denmark)

    Bonné, Dennis; Jørgensen, Sten Bay

    2006-01-01

    This contribution presents a systematic methodology for rapid acquirement of discrete-time state space model representations of batch processes based on their historical operation data. These state space models are parsimoniously parameterized as a set of local, interdependent models. The present contribution furthermore presents how the asymptotic convergence of Iterative Learning Control is combined with the closed-loop performance of Model Predictive Control to form a robust and asymptotically stable optimal controller for ensuring reliable and reproducible operation of batch processes. This controller may also be used for optimizing control. The modeling and control performance is demonstrated on a fed-batch protein cultivation example. The presented methodologies lend themselves directly for application as Process Analytical Technologies (PAT).
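
    The batch-to-batch learning component can be illustrated with the simplest Iterative Learning Control update law, u_{k+1}(t) = u_k(t) + L e_k(t). The Python below is a toy example (the gain, the stand-in plant and the profiles are hypothetical; the actual methodology combines ILC with Model Predictive Control):

        import numpy as np

        def ilc_update(u_prev: np.ndarray, error_prev: np.ndarray, gain: float = 0.5):
            """Correct the next batch's input profile with the tracking
            error observed in the previous batch."""
            return u_prev + gain * error_prev

        # Hypothetical fed-batch feed-rate profile refined over batches
        reference = np.linspace(0.0, 1.0, 50)
        u = np.zeros(50)
        for batch in range(20):
            y = 0.8 * u                      # stand-in for the true batch response
            u = ilc_update(u, reference - y)
        # u converges toward reference / 0.8 as batches accumulate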

  11. The NANOGrav Observing Program: Automation and Reproducibility

    Science.gov (United States)

    Brazier, Adam; Cordes, James; Demorest, Paul; Dolch, Timothy; Ferdman, Robert; Garver-Daniels, Nathaniel; Hawkins, Steven; Lam, Michael Timothy; Lazio, T. Joseph W.

    2018-01-01

    The NANOGrav Observing Program is a decades-long search for gravitational waves using pulsar timing which relies, for its sensitivity, on large data sets from observations of many pulsars. These are constructed through an intensive, long-term observing campaign. The nature of the program requires automation in the transfer and archiving of the large volume of raw telescope data, the calibration of those data, and making these resulting data products—required for diagnostic and data exploration purposes—available to NANOGrav members. Reproducibility of results is a key goal in this project, and essential to its success; it requires treating the software itself as a data product of the research, while ensuring easy access by, and collaboration between, members of NANOGrav, the International Pulsar Timing Array consortium (of which NANOGrav is a key member), as well as the wider astronomy community and the public.

  12. Uniform and reproducible stirring in a microbioreactor

    DEFF Research Database (Denmark)

    Bolic, Andrijana; Eliasson Lantz, Anna; Rottwitt, Karsten

    At present, research in bioprocess science and engineering increasingly requires fast and accurate analytical data (rapid testing) that can be used for investigation of the interaction between bioprocess operation conditions and the performance of the bioprocess. Miniaturization is certainly...... microbioreactor application. In order to address some of these questions, we are currently investigating and developing a microbioreactor platform with a reactor volume up to 1ml, as we believe that this volume is of interest to many industrial applications. It is widely known that stirring plays a very important...... role in achieving successful cultivations by promoting uniform process conditions and – for aerobic cultivations – a high oxygen transfer rate. In this contribution, the development of a suitable, reliable and reproducible stirrer in a microbioreactor for batch and continuous cultivation of S...

  13. Is Grannum grading of the placenta reproducible?

    Science.gov (United States)

    Moran, Mary; Ryan, John; Brennan, Patrick C.; Higgins, Mary; McAuliffe, Fionnuala M.

    2009-02-01

    Current ultrasound assessment of placental calcification relies on Grannum grading. The aim of this study was to assess if this method is reproducible by measuring inter- and intra-observer variation in grading placental images, under strictly controlled viewing conditions. Thirty placental images were acquired and digitally saved. Five experienced sonographers independently graded the images on two separate occasions. In order to eliminate any technological factors which could affect data reliability and consistency all observers reviewed images at the same time. To optimise viewing conditions ambient lighting was maintained between 25-40 lux, with monitors calibrated to the GSDF standard to ensure consistent brightness and contrast. Kappa (κ) analysis of the grades assigned was used to measure inter- and intra-observer reliability. Intra-observer agreement had a moderate mean κ-value of 0.55, with individual comparisons ranging from 0.30 to 0.86. Two images saved from the same patient, during the same scan, were each graded as I, II and III by the same observer. A mean κ-value of 0.30 (range from 0.13 to 0.55) indicated fair inter-observer agreement over the two occasions and only one image was graded consistently the same by all five observers. The study findings confirmed the lack of reproducibility associated with Grannum grading of the placenta despite optimal viewing conditions and highlight the need for new methods of assessing placental health in order to improve neonatal outcomes. Alternative methods for quantifying placental calcification such as a software based technique and 3D ultrasound assessment need to be explored.

  14. Trace analysis by TXRF

    International Nuclear Information System (INIS)

    Hockett, R.S.

    1995-01-01

    Total reflection X-Ray Fluorescence (TXRF) originally was developed for trace analysis of small residues but has become a widespread method for measuring trace surface metal contamination on semiconductor substrates. It is estimated that approximately 100 TXRF instruments are in use in the semiconductor industry worldwide, and approximately half that number for residue analysis in analytical laboratories. TXRF instrumentation is available today for reaching detection limits of the order of 10⁹ atoms/cm². This review emphasizes some of the more recent developments in TXRF for trace analysis, in particular the use of synchrotron x-ray sources (SR-TXRF). There is some promise of reaching 10⁷ atoms/cm² detection limits for surface analysis of semiconductor substrates. 19 refs

  15. Goya - an MHD equilibrium code for toroidal plasmas

    International Nuclear Information System (INIS)

    Scheffel, J.

    1984-09-01

    A description of the GOYA free-boundary equilibrium code is given. The non-linear Grad-Shafranov equation of ideal MHD is solved in a toroidal geometry for plasmas with purely poloidal magnetic fields. The code is based on a field line-tracing procedure, making storage of a large amount of information on a grid unnecessary. Usage of the code is demonstrated by computations of equilibria for the EXTRAP-T1 device. (Author)
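
    The core of such a field line-tracing procedure is integrating dx/ds = B(x)/|B(x)| along arc length. A minimal sketch under assumed conditions (the illustrative field below is not GOYA's, and SciPy's solve_ivp stands in for the paper's integrator):

```python
import numpy as np
from scipy.integrate import solve_ivp

def B(x):
    """Illustrative poloidal field with circular field lines (not GOYA's)."""
    r2 = x[0]**2 + x[1]**2
    return np.array([-x[1], x[0], 0.0]) / r2

def rhs(s, x):
    b = B(x)
    return b / np.linalg.norm(b)  # unit tangent: dx/ds = B/|B|

# Trace one field line over arc length s in [0, 10] from (1, 0, 0)
sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0], max_step=0.01)
print(sol.y[:, -1])  # end point of the traced field line
```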

  16. International Workshop on Coding Theory and Algebraic Geometry

    CERN Document Server

    Tsfasman, Michael

    1992-01-01

    About ten years ago, V.D. Goppa found a surprising connection between the theory of algebraic curves over a finite field and error-correcting codes. The aim of the meeting "Algebraic Geometry and Coding Theory" was to give a survey on the present state of research in this field and related topics. The proceedings contain research papers on several aspects of the theory, among them: Codes constructed from special curves and from higher-dimensional varieties, Decoding of algebraic geometric codes, Trace codes, Exponential sums, Fast multiplication in finite fields, Asymptotic number of points on algebraic curves, Sphere packings.

  17. Atom trap trace analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Z.-T.; Bailey, K.; Chen, C.-Y.; Du, X.; Li, Y.-M.; O' Connor, T. P.; Young, L.

    2000-05-25

    A new method of ultrasensitive trace-isotope analysis has been developed based upon the technique of laser manipulation of neutral atoms. It has been used to count individual {sup 85}Kr and {sup 81}Kr atoms present in a natural krypton sample with isotopic abundances in the range of 10{sup {minus}11} and 10{sup {minus}13}, respectively. The atom counts are free of contamination from other isotopes, elements, or molecules. The method is applicable to other trace isotopes that can be efficiently captured with a magneto-optical trap, and has a broad range of potential applications.

  18. Controlling Fundamental Fluctuations for Reproducible Growth of Large Single-Crystal Graphene.

    Science.gov (United States)

    Guo, Wei; Wu, Bin; Wang, Shuai; Liu, Yunqi

    2018-02-27

    The controlled growth of graphene by the chemical vapor deposition method is vital for its various applications; however, the reproducibility remains a great challenge. Here, using single-crystal graphene growth on a Cu surface as a model system, we demonstrate that a trace amount of H2O and O2 impurity gases in the reaction chamber is key to the large fluctuation of graphene growth. By precisely controlling their parts-per-million-level concentrations, centimeter-sized single-crystal graphene is obtained in a reliable manner with a maximum growth rate up to 190 μm min⁻¹. The roles of the oxidants are elucidated as an effective modulator for both graphene nucleation density and growth rate. This control is more fundamental for reliable growth of graphene beyond previous findings and is expected to be useful for the growth of various 2D materials that are also sensitive to trace oxidant impurities.

  19. Oscilloscope trace photograph digitizing system (TRACE)

    International Nuclear Information System (INIS)

    Richards, M.; Dabbs, R.D.

    1977-10-01

    The digitizing system allows digitization of photographs or sketches of waveforms; the computer is then used to reduce and analyze the data. The software allows for alignment, calibration, removal of baselines, removal of unwanted points and addition of new points, which makes for a fairly versatile system as far as data reduction and manipulation are concerned. System considerations are introduced first to orient the potential user to the process of digitizing information. The start-up and actual commands for TRACE are discussed. Detailed descriptions of each subroutine and program section are also provided. Three general examples of typical photographs are included. A partial listing of FAWTEK is made available. Once suitable arrays containing the data are arranged, ''GO FA'' (activating FAWTEK) allows many mathematical operations to be performed to further analyze the data

  20. Response to Comment on "Estimating the reproducibility of psychological science"

    NARCIS (Netherlands)

    Anderson, Christopher J; Bahník, Štěpán; Barnett-Cowan, Michael; Bosco, Frank A; Chandler, Jesse; Chartier, Christopher R; Cheung, Felix; Christopherson, Cody D; Cordes, Andreas; Cremata, Edward J; Della Penna, Nicolas; Estel, Vivien; Fedor, Anna; Fitneva, Stanka A; Frank, Michael C; Grange, James A; Hartshorne, Joshua K; Hasselman, Fred; Henninger, Felix; van der Hulst, Marije; Jonas, Kai J; Lai, Calvin K; Levitan, Carmel A; Miller, Jeremy K; Moore, Katherine S; Meixner, Johannes M; Munafò, Marcus R; Neijenhuijs, Koen I; Nilsonne, Gustav; Nosek, Brian A; Plessow, Franziska; Prenoveau, Jason M; Ricker, Ashley A; Schmidt, Kathleen; Spies, Jeffrey R; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B; van Aert, Robbie C M; van Assen, Marcel A L M; Vanpaemel, Wolf; Vianello, Michelangelo; Voracek, Martin; Zuni, Kellylynn

    2016-01-01

    Gilbert et al. conclude that evidence from the Open Science Collaboration's Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively

  1. A Kepler Workflow Tool for Reproducible AMBER GPU Molecular Dynamics.

    Science.gov (United States)

    Purawat, Shweta; Ieong, Pek U; Malmstrom, Robert D; Chan, Garrett J; Yeung, Alan K; Walker, Ross C; Altintas, Ilkay; Amaro, Rommie E

    2017-06-20

    With the drive toward high throughput molecular dynamics (MD) simulations involving ever-greater numbers of simulation replicates run for longer, biologically relevant timescales (microseconds), the need for improved computational methods that facilitate fully automated MD workflows gains more importance. Here we report the development of an automated workflow tool to perform AMBER GPU MD simulations. Our workflow tool capitalizes on the capabilities of the Kepler platform to deliver a flexible, intuitive, and user-friendly environment and the AMBER GPU code for a robust and high-performance simulation engine. Additionally, the workflow tool reduces user input time by automating repetitive processes and facilitates access to GPU clusters, whose high-performance processing power makes simulations of large numerical scale possible. The presented workflow tool facilitates the management and deployment of large sets of MD simulations on heterogeneous computing resources. The workflow tool also performs systematic analysis on the simulation outputs and enhances simulation reproducibility, execution scalability, and MD method development including benchmarking and validation. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  2. Model Children's Code.

    Science.gov (United States)

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  3. Affine Grassmann codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Beelen, Peter; Ghorpade, Sudhir Ramakant

    2010-01-01

    We consider a new class of linear codes, called affine Grassmann codes. These can be viewed as a variant of generalized Reed-Muller codes and are closely related to Grassmann codes. We determine the length, dimension, and the minimum distance of any affine Grassmann code. Moreover, we show that af...

  4. The substorm cycle as reproduced by global MHD models

    Science.gov (United States)

    Gordeev, E.; Sergeev, V.; Tsyganenko, N.; Kuznetsova, M.; Rastätter, L.; Raeder, J.; Tóth, G.; Lyon, J.; Merkin, V.; Wiltberger, M.

    2017-01-01

    Recently, Gordeev et al. (2015) suggested a method to test global MHD models against statistical empirical data. They showed that four community-available global MHD models supported by the Community Coordinated Modeling Center (CCMC) produce a reasonable agreement with reality for those key parameters (the magnetospheric size, magnetic field, and pressure) that are directly related to the large-scale equilibria in the outer magnetosphere. Based on the same set of simulation runs, here we investigate how the models reproduce the global loading-unloading cycle. We found that in terms of global magnetic flux transport, the three examined CCMC models display systematically different responses to an idealized 2 h north then 2 h south interplanetary magnetic field (IMF) Bz variation. The LFM model shows a depressed return convection and high loading rate during the growth phase as well as enhanced return convection and high unloading rate during the expansion phase, with the amount of loaded/unloaded magnetotail flux and the growth phase duration being the closest to their observed empirical values during isolated substorms. Two other models exhibit drastically different behavior. In the BATS-R-US model the plasma sheet convection shows a smooth transition to the steady convection regime after the IMF southward turning. In the Open GGCM a weak plasma sheet convection has comparable intensities during both the growth phase and the following slow unloading phase. We also demonstrate a potential technical problem in the publicly available simulations that is related to postprocessing interpolation and could affect the accuracy of magnetic field tracing and of other related procedures.

  5. TRACING CULPABLE IGNORANCE

    NARCIS (Netherlands)

    Peels, Rik

    2011-01-01

    In this paper, I respond to the following argument which several authors have presented. If we are culpable for some action, we act either from akrasia or from culpable ignorance. However, akrasia is highly exceptional and it turns out that tracing culpable ignorance leads to a vicious regress.

  6. Third order trace formula

    Indian Academy of Sciences (India)

    adjoint operator is bounded and the perturbation is ... Keywords. Trace formula; spectral shift function; perturbations of self-adjoint operators. 1. Introduction. Notations. ... the expansion involves sums of the form $\sum_{j} \sum_{k=0}^{j-1} A^{r-j-1}\, Y\, (A+X)^{k}\, X\, A^{j-k-1}$, leading to the estimate ...

  7. Third order trace formula

    Indian Academy of Sciences (India)

    Home; Journals; Proceedings – Mathematical Sciences; Volume 123; Issue 4. Third Order Trace Formula. Arup Chattopadhyay Kalyan B Sinha. Volume 123 Issue 4 November 2013 pp 547-575. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/pmsc/123/04/0547-0575 ...

  8. Reliability: on the reproducibility of assessment data.

    Science.gov (United States)

    Downing, Steven M

    2004-09-01

    All assessment data, like other scientific experimental data, must be reproducible in order to be meaningfully interpreted. The purpose of this paper is to discuss applications of reliability to the most common assessment methods in medical education. Typical methods of estimating reliability are discussed intuitively and non-mathematically. Reliability refers to the consistency of assessment outcomes. The exact type of consistency of greatest interest depends on the type of assessment, its purpose and the consequential use of the data. Written tests of cognitive achievement look to internal test consistency, using estimation methods derived from the test-retest design. Rater-based assessment data, such as ratings of clinical performance on the wards, require interrater consistency or agreement. Objective structured clinical examinations, simulated patient examinations and other performance-type assessments generally require generalisability theory analysis to account for various sources of measurement error in complex designs and to estimate the consistency of the generalisations to a universe or domain of skills. Reliability is a major source of validity evidence for assessments. Low reliability indicates that large variations in scores can be expected upon retesting. Inconsistent assessment scores are difficult or impossible to interpret meaningfully and thus reduce validity evidence. Reliability coefficients allow the quantification and estimation of the random errors of measurement in assessments, such that overall assessment can be improved.

  9. Environment and industrial economy: Challenge of reproducibility

    International Nuclear Information System (INIS)

    Rullani, E.

    1992-01-01

    Historically and methodologically counterposed until now, the environmentalist and the economic approaches to environmental problems need to be integrated in a new approach that considers, on one side, the relevance of ecological equilibria for economic systems and, on the other side, the economic dimension (in terms of investments and transformations in the production system) of any attempt to achieve a better environment. In order to achieve this integration, both approaches are compelled to give up some cultural habits that have characterized them and have contributed to over-emphasizing the opposition between them. The article shows that both approaches can converge into a new one, in which environment is no longer only a holistic, non-negotiable, natural external limit to human activity (as in the environmentalist approach), nor simply a scarce and exhaustible resource (as economics tends to consider it); environment should instead become part of the reproducibility sphere, or, in other words, it must be regarded as part of the output that the economic system provides. Thanks to scientific and technological advances, this new approach is becoming possible for an increasing class of environmental problems. To this end, an evolution is required that can convert environmental goals into investment and technological innovation goals, and communicate to firms the value society assigns to environmental resources. This value, the author suggests, should correspond to the reproduction cost. Various examples of this new approach are analyzed and discussed

  10. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
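
    The single- versus double-precision sensitivity described above is easy to demonstrate. A minimal illustration (not from the paper) of how float32 arithmetic diverges from float64 in a long reduction, the kind of difference that accumulates across pipeline stages:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.random(10_000_000)

# Same values summed in single vs double precision
sum32 = np.sum(data.astype(np.float32), dtype=np.float32)
sum64 = np.sum(data)  # float64 by default

print(sum32, sum64, abs(sum64 - float(sum32)))  # nonzero discrepancy
```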

  11. Reproducibility of the water drinking test.

    Science.gov (United States)

    Muñoz, C R; Macias, J H; Hartleben, C

    2015-11-01

    To investigate the reproducibility of the water drinking test in determining intraocular pressure peaks and fluctuation. It has been suggested that there is limited agreement between the water drinking test and the diurnal tension curve. This may be because it has only been compared with a 10-hour modified diurnal tension curve, missing 70% of IOP peaks that occur during the night. This was a prospective, analytical and comparative study that assessed the correlation, agreement, sensitivity and specificity of the water drinking test. The correlation between the water drinking test and the diurnal tension curve was significant and strong (r=0.93, 95% confidence interval 0.79 to 0.96, p<.01). A moderate agreement was observed between these measurements (pc=0.93, 95% confidence interval 0.87 to 0.95, p<.01). The agreement was within ±2 mmHg in 89% of the tests. Our study found a moderate agreement between the water drinking test and the diurnal tension curve, in contrast with the poor agreement found in other studies, possibly due to the absence of nocturnal IOP peaks. These findings suggest that the water drinking test could be used to determine IOP peaks, as well as for determining baseline IOP. Copyright © 2014 Sociedad Española de Oftalmología. Published by Elsevier España, S.L.U. All rights reserved.

  12. Are classifications of proximal radius fractures reproducible?

    Directory of Open Access Journals (Sweden)

    dos Santos João BG

    2009-10-01

    Background: Fractures of the proximal radius need to be classified in an appropriate and reproducible manner. The aim of this study was to assess the reliability of the three most widely used classification systems. Methods: Elbow radiograph images of patients with proximal radius fractures were classified according to the Mason, Morrey, and Arbeitsgemeinschaft für Osteosynthesefragen/Association for the Study of Internal Fixation (AO/ASIF) classifications by four observers with different levels of experience with this subject, to assess their intra- and inter-observer agreement. Each observer analyzed the images on three different occasions on a computer, with the numerical sequence randomly altered. Results: We found that intra-observer agreement of the Mason and Morrey classifications was satisfactory (κ = 0.582 and 0.554, respectively), while the AO/ASIF classification had poor intra-observer agreement (κ = 0.483). Inter-observer agreement was higher in the Mason (κ = 0.429-0.560) and Morrey (κ = 0.319-0.487) classifications than in the AO/ASIF classification (κ = 0.250-0.478), which showed poor reliability. Conclusion: Inter- and intra-observer agreement of the Mason and Morrey classifications showed overall satisfactory reliability when compared to the AO/ASIF system. The Mason classification is the most reliable system.

  13. Modeling reproducibility of porescale multiphase flow experiments

    Science.gov (United States)

    Ling, B.; Tartakovsky, A. M.; Bao, J.; Oostrom, M.; Battiato, I.

    2017-12-01

    Multi-phase flow in porous media is widely encountered in geological systems. Understanding immiscible fluid displacement is crucial for processes including, but not limited to, CO2 sequestration, non-aqueous phase liquid contamination and oil recovery. Microfluidic devices and porescale numerical models are commonly used to study multiphase flow in biological, geological, and engineered porous materials. In this work, we perform a set of drainage and imbibition experiments in six identical microfluidic cells to study the reproducibility of multiphase flow experiments. We observe significant variations in the experimental results, which are smaller during the drainage stage and larger during the imbibition stage. We demonstrate that these variations are due to sub-porescale geometry differences in microcells (because of manufacturing defects) and variations in the boundary condition (i.e., fluctuations in the injection rate inherent to syringe pumps). Computational simulations are conducted using the commercial software STAR-CCM+, both with constant and randomly varying injection rates. Stochastic simulations are able to capture the variability in the experiments associated with the varying pump injection rate.

  14. Inexperienced clinicians can extract pathoanatomic information from MRI narrative reports with high reproducibility for use in research/quality assurance

    DEFF Research Database (Denmark)

    Kent, Peter; Briggs, Andrew M; Albert, Hanne Birgit

    2011-01-01

    and transforming that information into quantitative data. However, this process is frequently required in research and quality assurance contexts. The purpose of this study was to examine inter-rater reproducibility (agreement and reliability) among an inexperienced group of clinicians in extracting spinal...... of radiological training is not required in order to transform MRI-derived pathoanatomic information from a narrative format to a quantitative format with high reproducibility for research or quality assurance purposes....... a categorical electronic coding matrix. Decision rules were developed after initial coding in an effort to resolve ambiguities in narrative reports. This process was repeated a further three times using separate samples of 20 MRI reports until no further ambiguities were identified (total n=80). Reproducibility...

  15. Using Docker Containers to Extend Reproducibility Architecture for the NASA Earth Exchange (NEX)

    Science.gov (United States)

    Votava, Petr; Michaelis, Andrew; Spaulding, Ryan; Becker, Jeffrey C.

    2016-01-01

    NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data where a focused community can come together to address large-scale challenges in Earth sciences. As NEX has been growing into a petabyte-size platform for analysis, experiments and data production, it has been increasingly important to enable users to easily retrace their steps, identify what datasets were produced by which process chains, and give them the ability to readily reproduce their results. This can be a tedious and difficult task even for a small project, but is almost impossible on large processing pipelines. We have developed an initial reproducibility and knowledge capture solution for the NEX; however, if users want to move the code to another system, whether it is their home institution cluster, laptop or the cloud, they have to find, build and install all the required dependencies that would run their code. This can be a very tedious and tricky process and is a big impediment to moving code to data and reproducibility outside the original system. The NEX team has tried to assist users who wanted to move their code into OpenNEX on the Amazon cloud by creating custom virtual machines with all the software and dependencies installed, but this, while solving some of the issues, creates a new bottleneck that requires the NEX team to be involved with any new request, updates to virtual machines and general maintenance support. In this presentation, we will describe a solution that integrates NEX and Docker to bridge the gap in code-to-data migration. The core of the solution is semi-automatic conversion of science codes, tools and services that are already tracked and described in the NEX provenance system, to Docker - an open-source Linux container software. Docker is available on most computer platforms, easy to install and capable of seamlessly creating and/or executing any application packaged in the appropriate format. We

  16. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  17. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both...... the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding....
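
    The encoding step of an LT code is compact: each output symbol is the XOR of a randomly chosen set of source symbols, with the set size drawn from a degree distribution. A minimal sketch with a toy distribution (the paper's feedback-adapted distributions are not reproduced here):

```python
import random

def lt_encode_symbol(source, rng):
    degree = rng.choice([1, 2, 3, 4])               # toy degree distribution
    picks = rng.sample(range(len(source)), degree)  # which source symbols
    out = 0
    for i in picks:
        out ^= source[i]                            # XOR them together
    return picks, out

rng = random.Random(42)
source = [0x12, 0x34, 0x56, 0x78]                   # source symbols
for _ in range(6):                                  # rateless: emit as needed
    print(lt_encode_symbol(source, rng))
```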

  18. IonRayTrace: An HF Propagation Model for Communications and Radar Applications

    Science.gov (United States)

    2014-12-01

    unnecessary. 3 Note that some of these fixes (1 and 3), while necessary for correct estimation and reinsertion into the next AREPS distribution, are ... TRANSITIONS: Beyond reinsertion of the high-fidelity HF ray trace code into future AREPS distributions, there is considerable interest in adding IonRayTrace

  19. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  20. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  1. Improving performance of single-path code through a time-predictable memory hierarchy

    DEFF Research Database (Denmark)

    Cilku, Bekim; Puffitsch, Wolfgang; Prokesch, Daniel

    2017-01-01

    . The single-path code generation overcomes these problems by generating time-predictable code that has a single execution trace. However, the simplicity of this approach comes at the cost of longer execution times. This paper addresses performance improvements for single-path code. We propose a time......-predictable memory hierarchy with a prefetcher that exploits the predictability of execution traces in single-path code to speed up code execution. The new memory hierarchy reduces both the cache-miss penalty time and the cache-miss rate on the instruction cache. The benefit of the approach is demonstrated through...

  2. Experimental challenges to reproduce seismic fault motion

    Science.gov (United States)

    Shimamoto, T.

    2011-12-01

    This presentation briefly reviews scientific and technical development in the studies of intermediate to high-velocity frictional properties of faults and summarizes remaining technical challenges to reproduce nucleation to growth processes of large earthquakes in laboratory. Nearly 10 high-velocity or low to high-velocity friction apparatuses have been built in the last several years in the world and it has become possible now to produce sub-plate velocity to seismic slip rate in a single machine. Despite spreading of high-velocity friction studies, reproducing seismic fault motion at high P and T conditions to cover the entire seismogenic zone is still a big challenge. Previous studies focused on (1) frictional melting, (2) thermal pressurization, and (3) high-velocity gouge behavior without frictional melting. Frictional melting process was solved as a Stefan problem with very good agreement with experimental results. Thermal pressurization has been solved theoretically based on measured transport properties and has been included successfully in the modeling of earthquake generation. High-velocity gouge experiments in the last several years have revealed that a wide variety of gouges exhibit dramatic weakening at high velocities (e.g., Di Toro et al., 2011, Nature). Most gouge experiments were done under dry conditions partly to separate gouge friction from the involvement of thermal pressurization. However, recent studies demonstrated that dehydration or degassing due to mineral decomposition can occur during seismic fault motion. Those results not only provided a new view of looking at natural fault zones in search of geological evidence of seismic fault motion, but also indicated that thermal pressurization and gouge weakening can occur simultaneously even in initially dry gouge. Thus experiments with controlled pore pressure are needed. I have struggled to make a pressure vessel for wet high-velocity experiments in the last several years. A technical

  3. TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES

    Energy Technology Data Exchange (ETDEWEB)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver, E-mail: jasmina@physics.ucf.edu [Planetary Sciences Group, Department of Physics, University of Central Florida, Orlando, FL 32816-2385 (United States)

    2016-07-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.

  4. TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES

    International Nuclear Information System (INIS)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver

    2016-01-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
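
    The method underlying TEA, Gibbs free-energy minimization under element-balance constraints, can be sketched with off-the-shelf optimization. The species, free energies, and stoichiometry below are made up for illustration; this is not TEA's Lagrangian scheme, only the same minimization problem:

```python
import numpy as np
from scipy.optimize import minimize

g0 = np.array([-10.0, -5.0, -12.0])   # g/RT per species (hypothetical)
A = np.array([[1, 0, 1],              # element-balance matrix:
              [0, 1, 2]])             # rows = elements, columns = species
b = np.array([1.0, 2.0])              # total moles of each element

def gibbs(n):
    # Dimensionless Gibbs energy of an ideal-gas mixture at unit pressure
    return float(np.sum(n * (g0 + np.log(n / n.sum()))))

res = minimize(gibbs, x0=np.full(3, 0.5), method="SLSQP",
               constraints={"type": "eq", "fun": lambda n: A @ n - b},
               bounds=[(1e-10, None)] * 3)
print(res.x)  # equilibrium mole numbers
```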

  5. Reproducibility of pacing profiles in elite swimmers.

    Science.gov (United States)

    Skorski, Sabrina; Faude, Oliver; Caviezel, Seraina; Meyer, Tim

    2014-03-01

    To analyze the reproducibility of pacing in elite swimmers during competitions and to compare heats and finals within 1 event. Finals and heats of 158 male swimmers (age 22.8 ± 2.9 y) from 29 nations were analyzed in 2 competitions (downloaded from swimrankings.net). Of these, 134 were listed in the world's top 50 in 2010; the remaining 24 were finalists of the Pan Pacific Games or European Championships. The level of both competitions for the analysis had to be at least national championships (7.7 ± 5.4 wk apart). Standard error of measurement expressed as percentage of the subject's mean score (CV) with 90% confidence limits (CL) for each 50-m split time and for total times were calculated. In addition, mixed general modeling was used to determine standard deviations between and within swimmers. CV for total time in finals ranged between 0.8% and 1.3% (CL 0.6-2.2%). Regarding split times, 200-m freestyle showed a consistent pacing over all split times (CV 0.9-1.6%). During butterfly, backstroke, and 400-m freestyle, CVs were low in the first 3 and 7 sections, respectively (CV 0.9-1.7%), with greater variability in the last section (1.9-2.2%). In breaststroke, values were higher in all sections (CV 1.2-2.3%). Within-subject SDs for changes between laps were between 0.9% and 2.6% in all finals. Split-time variability for finals and heats ranged between 0.9% and 2.5% (CL 0.3-4.9%). Pacing profiles are consistent between different competitions. Variability of pacing seems to be a result of the within-subject variation rather than a result of different competitions.
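
    The CV used above, the typical error of measurement expressed as a percentage of the mean, can be computed from paired trials. A minimal sketch with made-up swim times (not the study's data):

```python
import numpy as np

# Hypothetical total times (s) for the same swimmers in two competitions
trial_1 = np.array([112.3, 118.9, 115.4, 120.1])
trial_2 = np.array([113.0, 118.2, 116.1, 119.5])

diff = trial_2 - trial_1
typical_error = diff.std(ddof=1) / np.sqrt(2)  # SD of differences / sqrt(2)
cv = 100 * typical_error / np.mean([trial_1.mean(), trial_2.mean()])
print(f"CV = {cv:.1f}%")
```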

  6. Singular traces theory and applications

    CERN Document Server

    Sukochev, Fedor; Zanin, Dmitriy

    2012-01-01

    This text is the first complete study and monograph dedicated to singular traces. For mathematical readers the text offers, due to Nigel Kalton's contribution, a complete theory of traces on symmetrically normed ideals of compact operators. For mathematical physicists and other users of Connes' noncommutative geometry the text offers a complete reference to Dixmier traces and the deeper mathematical features of singular traces. An application section explores the consequences of these features, which previously were not discussed in general texts on noncommutative geometry.

  7. Coding of Stimuli by Animals: Retrospection, Prospection, Episodic Memory and Future Planning

    Science.gov (United States)

    Zentall, Thomas R.

    2010-01-01

    When animals code stimuli for later retrieval they can either code them in terms of the stimulus presented (as a retrospective memory) or in terms of the response or outcome anticipated (as a prospective memory). Although retrospective memory is typically assumed (as in the form of a memory trace), evidence of prospective coding has been found…

  8. Tracing Actual Causes

    Science.gov (United States)

    2016-08-08

    was articulated by David Lewis in his work on causal explanations [Lewis, 1986a]. We address the problem by defining the causal history of the ... of actual causation involve counterfactuals. The counterfactual tradition goes back to Hume [Hume, 1748], whose position was that an event c is a ... work We present a new take on the old problem of tracing actual causes articulated by David Lewis in his work on causal explanations [Lewis, 1986a]. We

  9. Trace elements in adolescents

    OpenAIRE

    Barany, Ebba

    2002-01-01

    The major aim of the thesis was to monitor toxic and essential trace elements in a cohort of adolescents by blood and serum analyses, and describe the impact of different factors on the element concentrations. The adolescents were from the Swedish cities Uppsala and Trollhättan which represent different socioeconomic and environmental conditions, and were investigated at age 15 and 17. It was shown that an inductively coupled plasma mass spectrometry method was suitable for simultaneous deter...

  10. On Trace Zero Matrices

    Indian Academy of Sciences (India)

    In this note, we shall try to present an elementary proof of a couple of closely related results which have both proved quite useful, and also indicate possible generalisations. The results we have in mind are the following facts: (a) A complex n x n matrix A has trace 0 if and only if it is expressible in the form A = PQ - QP.
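
    The easy direction of fact (a) is immediate: any commutator has trace zero, since the trace is linear and invariant under cyclic permutation:

```latex
\[
\operatorname{tr}(PQ - QP)
  = \operatorname{tr}(PQ) - \operatorname{tr}(QP)
  = \sum_{i,j} P_{ij}Q_{ji} - \sum_{i,j} Q_{ij}P_{ji}
  = 0.
\]
```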

  11. An analysis of reproducibility and non-determinism in HEP software and ROOT data

    Science.gov (United States)

    Ivie, Peter; Zheng, Charles; Lannon, Kevin; Thain, Douglas

    2017-10-01

    Reproducibility is an essential component of the scientific method. In order to validate the correctness or facilitate the extension of a computational result, it should be possible to re-run a published result and verify that the same results are produced. However, reproducing a computational result is surprisingly difficult: non-determinism and other factors may make it impossible to get the same result, even when running the same code on the same machine on the same day. We explore this problem in the context of HEP codes and data, showing three high level methods for dealing with non-determinism in general: 1) Domain specific methods; 2) Domain specific comparisons; and 3) Virtualization adjustments. Using a CMS workflow with output data stored in ROOT files, we use these methods to prevent, detect, and eliminate some sources of non-determinism. We observe improved determinism using pre-determined random seeds, a predictable progression of system timestamps, and fixed process identifiers. Unfortunately, sources of non-determinism continue to exist despite the combination of all three methods. Hierarchical data comparisons also allow us to appropriately ignore some non-determinism when it is unavoidable. We conclude that there is still room for improvement, and identify directions that can be taken in each method to make an experiment more reproducible.
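
    The first of these controls, pre-determined random seeds, looks as follows in a Python-based workflow; the libraries seeded here are illustrative, not the CMS stack:

```python
import os
import random
import numpy as np

SEED = 20170101
# Note: PYTHONHASHSEED only takes effect if set before interpreter startup;
# it is shown here for completeness.
os.environ["PYTHONHASHSEED"] = str(SEED)
random.seed(SEED)       # Python's own RNG
np.random.seed(SEED)    # NumPy's legacy global RNG

print(random.random(), np.random.rand())  # identical on every run
```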

  12. Anisotropic ray trace

    Science.gov (United States)

    Lam, Wai Sze Tiffany

    Optical components made of anisotropic materials, such as crystal polarizers and crystal waveplates, are widely used in many complex optical systems, such as display systems, microlithography, and biomedical imaging, and induce more complex aberrations than optical components made of isotropic materials. The goal of this dissertation is to accurately simulate the performance of optical systems with anisotropic materials using polarization ray tracing. This work extends the polarization ray tracing calculus to incorporate ray tracing through anisotropic materials, including uniaxial, biaxial and optically active materials. The 3D polarization ray tracing calculus is an invaluable tool for analyzing the polarization properties of an optical system. The 3x3 polarization ray tracing P matrix developed for anisotropic ray tracing assists in tracking the 3D polarization transformations along a ray path through a series of surfaces in an optical system. To better represent anisotropic light-matter interactions, the definition of the P matrix is generalized to incorporate not only the polarization change at a refraction/reflection interface, but also the optical phase accumulation induced as light propagates through the anisotropic medium. This enables realistic modeling of crystalline polarization elements, such as crystal waveplates and crystal polarizers. The wavefront and polarization aberrations of these anisotropic components are more complex than those of isotropic optical components and can be evaluated from the resultant P matrix for each eigen-wavefront as well as for the overall image. One incident ray refracting or reflecting into an anisotropic medium produces two eigenpolarizations or eigenmodes propagating in different directions. The associated ray parameters of these modes necessary for the anisotropic ray trace are described in Chapter 2. The algorithms to calculate the P matrix from these ray parameters are described in Chapter 3 for
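
    The cumulative effect of a ray path in this calculus is the ordered product of the per-surface 3x3 P matrices. A minimal numerical sketch with hypothetical matrices (not values from the dissertation):

```python
import numpy as np

P1 = np.diag([1.0, 0.95, 0.90])        # hypothetical interface 1
P2 = np.array([[0.98, 0.02, 0.0],      # hypothetical interface 2
               [0.02, 0.97, 0.0],
               [0.00, 0.00, 1.0]])

P_total = P2 @ P1                      # later surfaces multiply on the left
E_in = np.array([1.0, 0.0, 0.0])       # incident polarization state
print(P_total @ E_in)                  # exiting polarization state
```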

  13. Tracers and tracing methods

    International Nuclear Information System (INIS)

    Leclerc, J.P.

    2001-01-01

    The first international congress on 'Tracers and tracing methods' took place in Nancy in May 2001. The objective of this second congress was to present the current status and trends on tracing methods and their applications. It has given the opportunity to people from different fields to exchange scientific information and knowledge about tracer methodologies and applications. The target participants were the researchers, engineers and technologists of various industrial and research sectors: chemical engineering, environment, food engineering, bio-engineering, geology, hydrology, civil engineering, iron and steel production... Two sessions have been planned to cover both fundamental and industrial aspects: 1) fundamental development (tomography, tracer camera visualization and particles tracking; validation of computational fluid dynamics simulations by tracer experiments and numerical residence time distribution; new tracers and detectors or improvement and development of existing tracing methods; data treatments and modeling; reactive tracer experiments and interpretation); 2) industrial applications (geology, hydrogeology and oil field applications; civil engineering, mineral engineering and metallurgy applications; chemical engineering; environment; food engineering and bio-engineering). The program included 5 plenary lectures, 23 oral communications and around 50 posters. Only 9 presentations are of interest for the INIS database

  14. Semiautomated, Reproducible Batch Processing of Soy

    Science.gov (United States)

    Thoerne, Mary; Byford, Ivan W.; Chastain, Jack W.; Swango, Beverly E.

    2005-01-01

    A computer-controlled apparatus processes batches of soybeans into one or more of a variety of food products, under conditions that can be chosen by the user and reproduced from batch to batch. Examples of products include soy milk, tofu, okara (an insoluble protein and fiber byproduct of soy milk), and whey. Most processing steps take place without intervention by the user. This apparatus was developed for use in research on the processing of soy. It is also a prototype of other soy-processing apparatuses for research, industrial, and home use. Prior soy-processing equipment includes household devices that automatically produce soy milk but do not automatically produce tofu. The designs of prior soy-processing equipment require users to manually transfer intermediate solid soy products and to press them manually and, hence, under conditions that are not consistent from batch to batch. Prior designs do not afford choices of processing conditions: users cannot use previously developed soy-processing equipment to investigate the effects of variations of techniques used to produce soy milk (e.g., cold grinding, hot grinding, and pre-cook blanching) and of such process parameters as cooking times and temperatures, grinding times, soaking times and temperatures, rinsing conditions, and sizes of particles generated by grinding. In contrast, the present apparatus is amenable to such investigations. The apparatus (see figure) includes a processing tank and a jacketed holding or coagulation tank. The processing tank can be capped by either of two different heads and can contain either of two different insertable mesh baskets. The first head includes a grinding blade and heating elements. The second head includes an automated press piston. One mesh basket, designated the okara basket, has oblong holes with a size equivalent to about 40 mesh [40 openings per inch (≈16 openings per centimeter)]. The second mesh basket, designated the tofu basket, has holes of 70 mesh [70 openings

  15. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M.A.; Fink, D.; Hua, Q.; Jacobsen, G.E.; Lawson, E. M.; Smith, A.M.; Tuniz, C. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as {sup 14}C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, {sup 14}C/{sup 13}C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by {sup 14}C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure {sup 14}C with 0.5% precision. In the two years since becoming operational, more than 1000 {sup 14}C samples have been measured. Recent improvements in precision for {sup 14}C have been achieved with the commissioning of a 59-sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in
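
    For scale, the quoted 0.5% ratio precision maps directly onto age uncertainty through the conventional radiocarbon age relation t = -8033 ln(F), with 8033 yr the Libby mean life. A minimal sketch with an illustrative fraction modern (not a measured value):

```python
import math

F = 0.85                        # illustrative fraction modern vs standard
sigma_F = 0.005 * F             # 0.5% measurement precision

age = -8033 * math.log(F)
sigma_age = 8033 * sigma_F / F  # error propagation through the logarithm
print(f"age = {age:.0f} +/- {sigma_age:.0f} yr BP")  # ~1306 +/- 40 yr
```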

  16. Evaluation of the reproducibility of two techniques used to determine and record centric relation in angle's class I patients

    Directory of Open Access Journals (Sweden)

    Fernanda Paixão

    2007-08-01

    The centric relation is a mandibular position that determines a balanced relation among the temporomandibular joints, the chewing muscles and the occlusion. This position makes it possible for the dentist to plan and execute oral rehabilitation respecting the physiological principles of the stomatognathic system. The aim of this study was to investigate the reproducibility of centric relation records obtained using two techniques: Dawson's Bilateral Manipulation and Gysi's Gothic Arch Tracing. Twenty volunteers (14 females and 6 males) with no dental loss, presenting occlusal contacts as described in Angle's class I classification and without signs and symptoms of temporomandibular disorders, were selected. All volunteers were submitted five times, with a 1-week interval, always at the same time of day, to Dawson's Bilateral Manipulation and to Gysi's Gothic Arch Tracing with the aid of an intraoral apparatus. The average standard error of each technique was calculated (Bilateral Manipulation 0.94 and Gothic Arch Tracing 0.27). The Shapiro-Wilk test was applied and the results allowed application of Student's t-test (sampling error of 5%). The techniques showed different degrees of variability. Gysi's Gothic Arch Tracing was found to be more accurate than Bilateral Manipulation in reproducing centric relation records.

  17. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks installed and is provided for 64 bit on Mac, Linux and Windows.

  18. Error Correcting Codes

    Indian Academy of Sciences (India)

    sound quality is, in essence, obtained by accurate waveform coding and decoding of the audio signals. In addition, the coded audio information is protected against disc errors by the use of a Cross Interleaved Reed-Solomon Code (CIRC). Reed-Solomon codes were discovered by Irving Reed and Gus Solomon in 1960.
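
    Reed-Solomon encoding and correction can be tried directly with the third-party reedsolo package (a sketch of plain Reed-Solomon, not the interleaved CIRC scheme itself; in reedsolo ≥ 1.0, decode returns a (message, message+ecc, errata positions) tuple):

```python
from reedsolo import RSCodec

rsc = RSCodec(10)                 # 10 ECC symbols: corrects up to 5 errors
encoded = rsc.encode(b"error correcting codes")

corrupted = bytearray(encoded)
corrupted[0] ^= 0xFF              # corrupt two symbols
corrupted[5] ^= 0xFF

decoded, _, _ = rsc.decode(bytes(corrupted))
print(decoded)                    # b'error correcting codes'
```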

  19. GUIdock: Using Docker Containers with a Common Graphics User Interface to Address the Reproducibility of Research.

    Directory of Open Access Journals (Sweden)

    Ling-Hong Hung

    Reproducibility is vital in science. For complex computational methods, it is often necessary, not just to recreate the code, but also the software and hardware environment to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface.

  20. GUIdock: Using Docker Containers with a Common Graphics User Interface to Address the Reproducibility of Research.

    Science.gov (United States)

    Hung, Ling-Hong; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee

    2016-01-01

    Reproducibility is vital in science. For complex computational methods, it is often necessary, not just to recreate the code, but also the software and hardware environment to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface.
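
    The X11-sharing arrangement GUIdock configures can be sketched with the docker-py SDK; the image name and DISPLAY value below are assumptions for illustration, not GUIdock's actual packaging:

```python
import docker

client = docker.from_env()
client.containers.run(
    "example/gui-app",                      # hypothetical image name
    environment={"DISPLAY": ":0"},          # X11 display to render on
    volumes={"/tmp/.X11-unix":              # share the host X11 socket
             {"bind": "/tmp/.X11-unix", "mode": "rw"}},
    detach=True,
)
```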

  1. Running an open experiment: transparency and reproducibility in soil and ecosystem science

    Science.gov (United States)

    Bond-Lamberty, Ben; Peyton Smith, A.; Bailey, Vanessa

    2016-08-01

    Researchers in soil and ecosystem science, and almost every other field, are being pushed—by funders, journals, governments, and their peers—to increase transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists however lack experience in, and may be unsure of the benefits of, making their data and fully-reproducible analyses publicly available. Here we describe a recent ‘open experiment’, in which we documented every aspect of a soil incubation online, making all raw data, scripts, diagnostics, final analyses, and manuscripts available in real time. We found that using tools such as version control, issue tracking, and open-source statistical software improved data integrity, accelerated our team’s communication and productivity, and ensured transparency. There are many avenues to improve scientific reproducibility and data availability, of which is this only one example, and it is not an approach suited for every experiment or situation. Nonetheless, we encourage the communities in our respective fields to consider its advantages, and to lead rather than follow with respect to scientific reproducibility, transparency, and data availability.

  2. Benchmarking the Multidimensional Stellar Implicit Code MUSIC

    Science.gov (United States)

    Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.

    2017-04-01

    We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton Krylov method. A physics based preconditioning technique which can be adjusted to target varying physics is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities, and the decay of the Taylor-Green vortex. Additionally we show a test of hydrostatic equilibrium, in a stellar environment which is dominated by radiative effects. In this setting the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed and analysed. Each of these test cases was analysed with a simple, scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes including ATHENA and the PENCIL code. MUSIC is able to both reproduce behaviour from established and widely-used codes as well as results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.
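
    The Jacobian-free Newton-Krylov approach at MUSIC's core is available off the shelf for experimentation. A minimal sketch on a toy nonlinear boundary-value problem (not a stellar model), using SciPy's newton_krylov:

```python
import numpy as np
from scipy.optimize import newton_krylov

def residual(u):
    """Discrete residual of u'' = 0.1*exp(u) with u = 0 at both ends."""
    r = np.empty_like(u)
    r[0], r[-1] = u[0], u[-1]  # Dirichlet boundary conditions
    r[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2] - 0.1 * np.exp(u[1:-1])
    return r

sol = newton_krylov(residual, np.zeros(50), method="lgmres")
print(np.abs(residual(sol)).max())  # residual norm after convergence
```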

  3. REPRODUCIBLE DRUG REPURPOSING: WHEN SIMILARITY DOES NOT SUFFICE.

    Science.gov (United States)

    Guney, Emre

    2017-01-01

    Repurposing existing drugs for new uses has attracted considerable attention over the past years. To identify potential candidates that could be repositioned for a new indication, many studies make use of chemical, target, and side effect similarity between drugs to train classifiers. Despite promising prediction accuracies of these supervised computational models, their use in practice, such as for rare diseases, is hindered by the assumption that there are already known and similar drugs for a given condition of interest. In this study, using publicly available data sets, we question the prediction accuracies of supervised approaches based on drug similarity when the drugs in the training and the test set are completely disjoint. We first build a Python platform to generate reproducible similarity-based drug repurposing models. Next, we show that, while a simple chemical, target, and side effect similarity based machine learning method can achieve good performance on the benchmark data set, the prediction performance drops sharply when the drugs in the folds of the cross-validation are not overlapping and the similarity information within the training and test sets is used independently. These intriguing results suggest revisiting the assumptions underlying the validation scenarios of similarity-based methods and underline the need for unsupervised approaches to identify novel drug uses inside the unexplored pharmacological space. We make the digital notebook containing the Python code to replicate our analysis, involving the drug repurposing platform based on machine learning models and the proposed disjoint cross-fold generation method, freely available at github.com/emreg00/repurpose.
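
    The disjoint-folds point is the crux: standard k-fold lets the same drug appear in both training and test sets. A minimal sketch of drug-disjoint splitting with scikit-learn's GroupKFold (made-up data, not the paper's platform):

```python
import numpy as np
from sklearn.model_selection import GroupKFold

X = np.random.rand(8, 4)                    # similarity features (made up)
y = np.random.randint(0, 2, size=8)         # known indication or not
drugs = np.array(["d1", "d1", "d2", "d2",
                  "d3", "d3", "d4", "d4"])  # drug behind each sample

for train, test in GroupKFold(n_splits=4).split(X, y, groups=drugs):
    assert not set(drugs[train]) & set(drugs[test])  # drugs never overlap
    print(sorted(set(drugs[test])))
```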

  4. Network Coding Taxonomy

    OpenAIRE

    Adamson, Brian; Adjih, Cédric; Bilbao, Josu; Firoiu, Victor; Fitzek, Frank; Ghanem, Samah; Lochin, Emmanuel; Masucci, Antonia; Montpetit, Marie-Jose; Pedersen, Morten V.; Peralta, Goiuri; Roca, Vincent; Saxena, Paresh; Sivakumar, Senthil

    2017-01-01

    Internet Research Task Force - Working document of the Network Coding Research Group (NWCRG), draft-irtf-nwcrg-network-coding-taxonomy-05 (work in progress), https://datatracker.ietf.org/doc/draft-irtf-nwcrg-network-coding-taxonomy/; This document summarizes a recommended terminology for Network Coding concepts and constructs. It provides a comprehensive set of terms with unique names in order to avoid ambiguities in future Network Coding IRTF and IETF documents. This document is intended to ...

  5. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…
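
    As a concrete illustration of the capacity difference described above, the sketch below generates a QR code holding a full URL. It assumes the third-party Python package qrcode (installed with Pillow support); the payload and file name are arbitrary examples, not anything from the article.

        import qrcode  # pip install "qrcode[pil]" (third-party package)

        # A traditional 1-D bar code tops out around 20 digits; a QR code
        # can carry a full URL (and up to 7,089 numeric characters at its
        # largest version).
        img = qrcode.make("https://www.example.org/much-longer-than-a-bar-code")
        img.save("example_qr.png")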

  6. On the Inclusion Relation of Reproducing Kernel Hilbert Spaces

    OpenAIRE

    Zhang, Haizhang; Zhao, Liang

    2011-01-01

    To help understand various reproducing kernels used in applied sciences, we investigate the inclusion relation of two reproducing kernel Hilbert spaces. Characterizations in terms of feature maps of the corresponding reproducing kernels are established. A full table of inclusion relations among widely-used translation invariant kernels is given. Concrete examples for Hilbert-Schmidt kernels are presented as well. We also discuss the preservation of such a relation under various operations of ...

  7. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software codes and data to computational results reported in publications. The package contains a collection of (a) computational modules, (b) data-processing scripts, and (c) research papers. Madagascar is distributed on SourceForge under a GPL v2 license (https://sourceforge.net/projects/rsf/). By October 2013, more than 70 people from different organizations around the world had contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.

  8. Participant Nonnaiveté and the reproducibility of cognitive psychology.

    Science.gov (United States)

    Zwaan, Rolf A; Pecher, Diane; Paolacci, Gabriele; Bouwmeester, Samantha; Verkoeijen, Peter; Dijkstra, Katinka; Zeelenberg, René

    2017-07-25

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature—three each from the domains of perception/action, memory, and language—and found that they are highly reproducible. Not only can they be reproduced in online environments, but they also can be reproduced with nonnaïve participants with no reduction of effect size. Apparently, some cognitive tasks are so constraining that they encapsulate behavior from external influences, such as the testing situation and recent prior experience with the experiment, yielding highly robust effects.

  9. Layout for Assessing Dynamic Posture: Development, Validation, and Reproducibility.

    Science.gov (United States)

    Noll, Matias; Candotti, Cláudia Tarragô; da Rosa, Bruna Nichele; Sedrez, Juliana Adami; Vieira, Adriane; Loss, Jefferson Fagundes

    2016-01-01

    To determine the psychometric properties of the layout for assessing dynamic posture (LADy). The study was divided into 2 phases: (1) development of the instrument and (2) determination of validity and reproducibility. The LADy was designed to evaluate the position adopted in 9 dynamic postures. The results confirmed the validity and reproducibility of the instrument. From a total of 51 criteria assessing 9 postures, 1 was rejected. The reproducibility for each of the criteria was classified as moderate to excellent. The LADy constitutes a valid and reproducible instrument for the evaluation of dynamic postures in children 11 to 17 years old. It is low cost and applicable in the school environment.

  10. Efficient Coding of Information: Huffman Coding ...

    Indian Academy of Sciences (India)

    1. Introduction. Shannon's landmark paper 'A Mathematical Theory of Communication' [1] laid the foundation for communication ... coding theory, codes over graphs and iterative techniques, and information theory. ... An important consequence of independence is that if {X1, X2, ..., Xn} are independent random variables, each ...
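
    Since the snippet introduces Huffman coding, a minimal sketch of the standard textbook construction may help. This is a generic illustration, not code from the article: build a min-heap of symbol frequencies, repeatedly merge the two lightest trees, and read the codes off the root-to-leaf paths.

        import heapq
        from collections import Counter

        def huffman_code(text):
            """Build a Huffman code: frequent symbols get shorter bit strings."""
            # Heap entries are (frequency, tie_breaker, tree); a tree is
            # either a symbol or a (left, right) pair of subtrees.
            heap = [(f, i, s) for i, (s, f) in enumerate(Counter(text).items())]
            heapq.heapify(heap)
            count = len(heap)
            while len(heap) > 1:
                f1, _, t1 = heapq.heappop(heap)
                f2, _, t2 = heapq.heappop(heap)
                heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
                count += 1
            codes = {}
            def walk(tree, prefix=""):
                if isinstance(tree, tuple):
                    walk(tree[0], prefix + "0")
                    walk(tree[1], prefix + "1")
                else:
                    codes[tree] = prefix or "0"  # degenerate one-symbol input
            walk(heap[0][2])
            return codes

        print(huffman_code("abracadabra"))  # frequent symbols get short codes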

  11. Turbo Codes Extended with Outer BCH Code

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl

    1996-01-01

    The "error floor" observed in several simulations with the turbo codes is verified by calculation of an upper bound to the bit error rate for the ensemble of all interleavers. Also an easy way to calculate the weight enumerator used in this bound is presented. An extended coding scheme is proposed...

  12. Queer Tracings of Genre

    DEFF Research Database (Denmark)

    Balle, Søren Hattesen

    A prominent feature of John Ashbery's debut collection Some Trees is the near ubiquity of classical and post-classical genre designations attached as titles to its poems. Among them appear titles such as "Eclogue", "Sonnet", "Meditations of a Parrot", and "A Pastoral". Neither does Ashbery hesitate...... as (re)tracings of genres that appear somehow residual or defunct in a post-modernist poetic context. On the other, they are made to "encode new [and queer, shb] meanings" (Anne Ferry) inasmuch as Ashbery, for instance, doubles and literalizes Dante's false etymology of the word ‘eclogue' (aig- and logos...

  13. Osteoporosis and trace elements

    DEFF Research Database (Denmark)

    Aaseth, J.; Boivin, G.; Andersen, Ole

    2012-01-01

    More than 200 million people are affected by osteoporosis worldwide, as estimated by 2 million annual hip fractures and other debilitating bone fractures (vertebrae compression and Colles' fractures). Osteoporosis is a multi-factorial disease with potential contributions from genetic, endocrine...... in new bone and results in a net gain in bone mass, but may be associated with a tissue of poor quality. Aluminum induces impairment of bone formation. Gallium and cadmium suppress bone turnover. However, the exact involvement of the trace elements in osteoporosis has not yet been fully clarified...

  14. Trace conditioning in insects—keep the trace!

    Science.gov (United States)

    Dylla, Kristina V.; Galili, Dana S.; Szyszka, Paul; Lüdke, Alja

    2013-01-01

    Trace conditioning is a form of associative learning that can be induced by presenting a conditioned stimulus (CS) and an unconditioned stimulus (US) following each other, but separated by a temporal gap. This gap distinguishes trace conditioning from classical delay conditioning, where the CS and US overlap. To bridge the temporal gap between both stimuli and to form an association between CS and US in trace conditioning, the brain must keep a neural representation of the CS after its termination—a stimulus trace. Behavioral and physiological studies on trace and delay conditioning revealed similarities between the two forms of learning, like similar memory decay and similar odor identity perception in invertebrates. On the other hand differences were reported also, like the requirement of distinct brain structures in vertebrates or disparities in molecular mechanisms in both vertebrates and invertebrates. For example, in commonly used vertebrate conditioning paradigms the hippocampus is necessary for trace but not for delay conditioning, and Drosophila delay conditioning requires the Rutabaga adenylyl cyclase (Rut-AC), which is dispensable in trace conditioning. It is still unknown how the brain encodes CS traces and how they are associated with a US in trace conditioning. Insects serve as powerful models to address the mechanisms underlying trace conditioning, due to their simple brain anatomy, behavioral accessibility and established methods of genetic interference. In this review we summarize the recent progress in insect trace conditioning on the behavioral and physiological level and emphasize similarities and differences compared to delay conditioning. Moreover, we examine proposed molecular and computational models and reassess different experimental approaches used for trace conditioning. PMID:23986710

  15. Trace conditioning in insects-keep the trace!

    Science.gov (United States)

    Dylla, Kristina V; Galili, Dana S; Szyszka, Paul; Lüdke, Alja

    2013-01-01

    Trace conditioning is a form of associative learning that can be induced by presenting a conditioned stimulus (CS) and an unconditioned stimulus (US) following each other, but separated by a temporal gap. This gap distinguishes trace conditioning from classical delay conditioning, where the CS and US overlap. To bridge the temporal gap between both stimuli and to form an association between CS and US in trace conditioning, the brain must keep a neural representation of the CS after its termination-a stimulus trace. Behavioral and physiological studies on trace and delay conditioning revealed similarities between the two forms of learning, like similar memory decay and similar odor identity perception in invertebrates. On the other hand differences were reported also, like the requirement of distinct brain structures in vertebrates or disparities in molecular mechanisms in both vertebrates and invertebrates. For example, in commonly used vertebrate conditioning paradigms the hippocampus is necessary for trace but not for delay conditioning, and Drosophila delay conditioning requires the Rutabaga adenylyl cyclase (Rut-AC), which is dispensable in trace conditioning. It is still unknown how the brain encodes CS traces and how they are associated with a US in trace conditioning. Insects serve as powerful models to address the mechanisms underlying trace conditioning, due to their simple brain anatomy, behavioral accessibility and established methods of genetic interference. In this review we summarize the recent progress in insect trace conditioning on the behavioral and physiological level and emphasize similarities and differences compared to delay conditioning. Moreover, we examine proposed molecular and computational models and reassess different experimental approaches used for trace conditioning.

  16. Trace conditioning in insects – Keep the trace!

    Directory of Open Access Journals (Sweden)

    Kristina V Dylla

    2013-08-01

    Full Text Available Trace conditioning is a form of associative learning that can be induced by presenting a conditioned stimulus (CS) and an unconditioned stimulus (US) following each other, but separated by a temporal gap. This gap distinguishes trace conditioning from classical delay conditioning, where the CS and US overlap. To bridge the temporal gap between both stimuli and to form an association between CS and US in trace conditioning, the brain must keep a neural representation of the CS after its termination – a stimulus trace. Behavioral and physiological studies on trace and delay conditioning revealed similarities between the two forms of learning, like similar memory decay and similar odor identity perception in invertebrates. On the other hand differences were reported also, like the requirement of distinct brain structures in vertebrates or disparities in molecular mechanisms in both vertebrates and invertebrates. For example, in commonly used vertebrate conditioning paradigms the hippocampus is necessary for trace but not for delay conditioning, and Drosophila delay conditioning requires the Rutabaga adenylyl cyclase, which is dispensable in trace conditioning. It is still unknown how the brain encodes CS traces and how they are associated with a US in trace conditioning. Insects serve as powerful models to address the mechanisms underlying trace conditioning, due to their simple brain anatomy, behavioral accessibility and established methods of genetic interference. In this review we summarize the recent progress in insect trace conditioning on the behavioral and physiological level and emphasize similarities and differences compared to delay conditioning. Moreover, we examine proposed molecular and computational models and reassess different experimental approaches used for trace conditioning.

  17. Homotopy deform method for reproducing kernel space for ...

    Indian Academy of Sciences (India)

    In this paper, the combination of homotopy deform method (HDM) and simplified reproducing kernel method (SRKM) is introduced for solving the boundary value problems (BVPs) of nonlinear differential equations. The solution methodology is based on Adomian decomposition and reproducing kernel method (RKM).

  18. Genotypic variability enhances the reproducibility of an ecological study.

    Science.gov (United States)

    Milcu, Alexandru; Puga-Freitas, Ruben; Ellison, Aaron M; Blouin, Manuel; Scheu, Stefan; Freschet, Grégoire T; Rose, Laura; Barot, Sebastien; Cesarz, Simone; Eisenhauer, Nico; Girin, Thomas; Assandri, Davide; Bonkowski, Michael; Buchmann, Nina; Butenschoen, Olaf; Devidal, Sebastien; Gleixner, Gerd; Gessler, Arthur; Gigon, Agnès; Greiner, Anna; Grignani, Carlo; Hansart, Amandine; Kayler, Zachary; Lange, Markus; Lata, Jean-Christophe; Le Galliard, Jean-François; Lukac, Martin; Mannerheim, Neringa; Müller, Marina E H; Pando, Anne; Rotter, Paula; Scherer-Lorenzen, Michael; Seyhun, Rahme; Urban-Mead, Katherine; Weigelt, Alexandra; Zavattaro, Laura; Roy, Jacques

    2018-02-01

    Many scientific disciplines are currently experiencing a 'reproducibility crisis' because numerous scientific findings cannot be repeated consistently. A novel but controversial hypothesis postulates that stringent levels of environmental and biotic standardization in experimental studies reduce reproducibility by amplifying the impacts of laboratory-specific environmental factors not accounted for in study designs. A corollary to this hypothesis is that a deliberate introduction of controlled systematic variability (CSV) in experimental designs may lead to increased reproducibility. To test this hypothesis, we had 14 European laboratories run a simple microcosm experiment using grass (Brachypodium distachyon L.) monocultures and grass and legume (Medicago truncatula Gaertn.) mixtures. Each laboratory introduced environmental and genotypic CSV within and among replicated microcosms established in either growth chambers (with stringent control of environmental conditions) or glasshouses (with more variable environmental conditions). The introduction of genotypic CSV led to 18% lower among-laboratory variability in growth chambers, indicating increased reproducibility, but had no significant effect in glasshouses where reproducibility was generally lower. Environmental CSV had little effect on reproducibility. Although there are multiple causes for the 'reproducibility crisis', deliberately including genetic variability may be a simple solution for increasing the reproducibility of ecological studies performed under stringently controlled environmental conditions.

  19. Completely reproducible description of digital sound data with cellular automata

    International Nuclear Information System (INIS)

    Wada, Masato; Kuroiwa, Jousuke; Nara, Shigetoshi

    2002-01-01

    A novel method for the compressive and completely reproducible description of digital sound data by means of the rule dynamics of CA (cellular automata) is proposed. The digital data of spoken words and music recorded in the standard compact disk format are reproduced completely by this method using only two rules of a one-dimensional CA, without loss of information
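
    As generic background on rule dynamics, the sketch below shows one update step of a one-dimensional binary cellular automaton under Wolfram's rule-number convention. It is purely illustrative and is not the authors' two-rule sound-encoding scheme.

        def ca_step(cells, rule):
            """One synchronous update of a 1-D binary CA with periodic
            boundaries, using Wolfram's rule-number convention."""
            n = len(cells)
            return [
                (rule >> (4 * cells[(i - 1) % n]
                          + 2 * cells[i]
                          + cells[(i + 1) % n])) & 1
                for i in range(n)
            ]

        # Evolve rule 110 from a single seed cell for a few steps.
        state = [0] * 15
        state[7] = 1
        for _ in range(5):
            print("".join("#" if c else "." for c in state))
            state = ca_step(state, 110)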

  20. Participant Nonnaiveté and the reproducibility of cognitive psychology

    NARCIS (Netherlands)

    R.A. Zwaan (Rolf); D. Pecher (Diane); G. Paolacci (Gabriele); S. Bouwmeester (Samantha); P.P.J.L. Verkoeijen (Peter); K. Dijkstra (Katinka); R. Zeelenberg (René)

    2017-01-01

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature—three each from the domains of perception/action, memory, and language, respectively—and found that they are highly reproducible. Not only can

  1. A Denotational Semantics for Communicating Unstructured Code

    Directory of Open Access Journals (Sweden)

    Nils Jähnig

    2015-03-01

    Full Text Available An important property of programming language semantics is that they should be compositional. However, unstructured low-level code contains goto-like commands, making it hard to define a semantics that is compositional. In this paper, we follow the ideas of Saabas and Uustalu to structure low-level code. This gives us the possibility to define a compositional denotational semantics based on least fixed points, allowing the use of inductive verification methods. We capture the semantics of communication using finite traces, similar to the denotations of CSP. In addition, we examine properties of this semantics and give an example that demonstrates reasoning about communication and jumps. With this semantics, we lay the foundations for a proof calculus that captures both the semantics of unstructured low-level code and communication.
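
    As rough background on the least-fixed-point construction mentioned here, the denotation of a loop is classically defined as the least fixed point of a functional on state transformers. The following is a standard textbook formulation, not necessarily the authors' exact definitions:

        % Textbook least-fixed-point semantics of a while loop (illustrative).
        \[
          [\![\texttt{while}\; b\; \texttt{do}\; c]\!] \;=\; \mathrm{lfp}(F),
          \qquad
          F(f)(\sigma) \;=\;
          \begin{cases}
            f\bigl([\![c]\!](\sigma)\bigr) & \text{if } [\![b]\!](\sigma) = \mathit{true},\\
            \sigma & \text{otherwise,}
          \end{cases}
        \]
        % where lfp(F) is taken over the CPO of partial functions on
        % states, ordered by definedness.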

  2. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

    This user's manual contains all the necessary information concerning the use of the SEVERO code. This computer code is related to the statistics of extremes: extreme winds, extreme precipitation, and flooding hazard risk analysis. (A.C.A.S.)

  3. Model and code development

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    Progress in model and code development for reactor physics calculations is summarized. The codes included CINDER-10, PHROG, RAFFLE GAPP, DCFMR, RELAP/4, PARET, and KENO. Kinetics models for the PBF were developed

  4. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  5. Development of code PRETOR for stellarator simulation

    International Nuclear Information System (INIS)

    Dies, J.; Fontanet, J.; Fontdecaba, J.M.; Castejon, F.; Alejandre, C.

    1998-01-01

    The Department de Fisica i Enginyeria Nuclear (DFEN) of the UPC has experience in the development of the transport code PRETOR. This code has been validated against shots from DIII-D, JET and TFTR, and it has also been used in the simulation of operational scenarios for ITER fast burn termination. Recently, the association EURATOM-CIEMAT has started the operation of the TJ-II stellarator. Given the need to validate the results of other transport codes applied to stellarators, all of which make approximations such as averaging magnitudes over each magnetic surface, it was considered appropriate to adapt the PRETOR code to devices without axial symmetry, like stellarators, making it well suited to the specific needs of the study of TJ-II. Several modifications are required in PRETOR; the main ones concern the models of magnetic equilibrium, geometry, and transport of energy and particles. In order to handle the complex magnetic equilibrium geometry, the powerful numerical code VMEC has been used. This code gives the magnetic surface shape as a Fourier series in terms of the harmonics (m,n). Most of the geometric magnitudes are also obtained from the VMEC results file. The energy and particle transport models will be replaced by phenomenological models that are better adapted to stellarator simulation. Using the proposed models, the aim is to reproduce experimental data available from present stellarators, paying special attention to the TJ-II of the association EURATOM-CIEMAT. (Author)

  6. What to do with a Dead Research Code

    Science.gov (United States)

    Nemiroff, Robert J.

    2016-01-01

    The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states to the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  7. Traces generating what was there

    CERN Document Server

    2017-01-01

    Traces keep time contained and make visible what was there. Going back to the art of trace-reading, they continue to be a fundamental resource for scientific knowledge production. The contributions study, from the biology laboratory to the large colliders of particle physics, techniques involved in the production of material traces. Following their changes over two centuries, this collection shows the continuities they have in the digital age.

  8. Lidar Detection of Explosives Traces

    Directory of Open Access Journals (Sweden)

    Bobrovnikov Sergei M.

    2016-01-01

    Full Text Available The possibility of remote detection of traces of explosives using laser fragmentation/laser-induced fluorescence (LF/LIF) is studied. Experimental data on the remote visualization of traces of trinitrotoluene (TNT), hexogen (RDX), trotyl-hexogen (Comp B), octogen (HMX), and tetryl with a scanning lidar detector of traces of nitrogen-containing explosives at a distance of 5 m are presented.

  9. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    Quick Response code opens the possibility to convey data in a unique way, yet insufficient prevention and protection might lead to QR codes being exploited on behalf of attackers. This thesis starts by presenting a general introduction of the background and stating two problems regarding QR code security, which is followed by comprehensive research on both the QR code itself and related issues. From the research, a solution taking advantage of cloud and cryptography together with an implementation come af...

  10. Stylize Aesthetic QR Code

    OpenAIRE

    Xu, Mingliang; Su, Hao; Li, Yafei; Li, Xi; Liao, Jing; Niu, Jianwei; Lv, Pei; Zhou, Bing

    2018-01-01

    With the continued proliferation of smart mobile devices, Quick Response (QR) code has become one of the most-used types of two-dimensional code in the world. Aiming at beautifying the appearance of QR codes, existing works have developed a series of techniques to make the QR code more visually pleasant. However, these works still leave much to be desired, such as visual diversity, aesthetic quality, flexibility, universal property, and robustness. To address these issues, in this paper, we pro...

  11. FormTracer. A mathematica tracing package using FORM

    Science.gov (United States)

    Cyrol, Anton K.; Mitter, Mario; Strodthoff, Nils

    2017-10-01

    We present FormTracer, a high-performance, general-purpose, easy-to-use Mathematica tracing package which uses FORM. It supports arbitrary space and spinor dimensions as well as an arbitrary number of simple compact Lie groups. While keeping the usability of the Mathematica interface, it relies on the efficiency of FORM. An additional performance gain is achieved by a decomposition algorithm that avoids redundant traces in the product tensor spaces. FormTracer supports a wide range of syntaxes, which endows it with high flexibility. Mathematica notebooks that automatically install the package and guide the user through performing standard traces in space-time, spinor and gauge-group spaces are provided. Program Files doi: http://dx.doi.org/10.17632/7rd29h4p3m.1 Licensing provisions: GPLv3 Programming language: Mathematica and FORM Nature of problem: Efficiently compute traces of large expressions Solution method: The expression to be traced is decomposed into its subspaces by a recursive Mathematica expansion algorithm. The result is subsequently translated to a FORM script that takes the traces. After FORM is executed, the final result is either imported into Mathematica or exported as optimized C/C++/Fortran code. Unusual features: The outstanding features of FormTracer are the simple interface, the capability to efficiently handle an arbitrary number of Lie groups in addition to Dirac and Lorentz tensors, and a customizable input syntax.

  12. Suspension of the NAB Code and Its Effect on Regulation of Advertising.

    Science.gov (United States)

    Maddox, Lynda M.; Zanot, Eric J.

    1984-01-01

    Traces events leading to the suspension of the Television Code of the National Association of Broadcasters in 1982 and looks at changes that have occurred in the informal and formal regulation of advertising as a result of that suspension. (FL)

  13. Trace analysis of semiconductor materials

    CERN Document Server

    Cali, J Paul; Gordon, L

    1964-01-01

    Trace Analysis of Semiconductor Materials is a guidebook concerned with procedures of ultra-trace analysis. This book discusses six distinct techniques of trace analysis. These techniques are the most common and can be applied to various problems compared to other methods. Each of the four chapters basically includes an introduction to the principles and general statements. The theoretical basis for the technique involved is then briefly discussed. Practical applications of the techniques and the different instrumentations are explained. Then, the applications to trace analysis as pertaining

  14. Trace Mineral Losses in Sweat

    National Research Council Canada - National Science Library

    Chinevere, Troy D; McClung, James P; Cheuvront, Samuel N

    2007-01-01

    Copper, iron and zinc are nutritionally essential trace minerals that confer vital biological roles including the maintenance of cell structure and integrity, regulation of metabolism, immune function...

  15. ARC Code TI: CODE Software Framework

    Data.gov (United States)

    National Aeronautics and Space Administration — CODE is a software framework for control and observation in distributed environments. The basic functionality of the framework allows a user to observe a distributed...

  16. ARC Code TI: ROC Curve Code Augmentation

    Data.gov (United States)

    National Aeronautics and Space Administration — ROC (Receiver Operating Characteristic) curve Code Augmentation was written by Rodney Martin and John Stutz at NASA Ames Research Center and is a modification of ROC...

  17. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    Two key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  18. Error Correcting Codes ...

    Indian Academy of Sciences (India)

    ... the reading of data from memory is the receiving process. Protecting data in computer memories was one of the earliest applications of Hamming codes. We now describe the clever scheme invented by Hamming in 1948. To keep things simple, we describe the binary length-7 Hamming code. Encoding in the Hamming Code ...
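
    The length-7 Hamming code that the snippet describes encodes 4 data bits with 3 parity bits and corrects any single-bit error. Below is a minimal textbook sketch (even parity, parity bits at positions 1, 2 and 4), not code from the article.

        def hamming74_encode(d):
            """Encode 4 data bits as the codeword [p1, p2, d1, p3, d2, d3, d4]."""
            d1, d2, d3, d4 = d
            p1 = d1 ^ d2 ^ d4
            p2 = d1 ^ d3 ^ d4
            p3 = d2 ^ d3 ^ d4
            return [p1, p2, d1, p3, d2, d3, d4]

        def hamming74_correct(c):
            """Locate and flip a single-bit error via the parity-check syndrome."""
            s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # checks positions 1, 3, 5, 7
            s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # checks positions 2, 3, 6, 7
            s3 = c[3] ^ c[4] ^ c[5] ^ c[6]  # checks positions 4, 5, 6, 7
            pos = s1 + 2 * s2 + 4 * s3      # binary position of the error (0 = none)
            if pos:
                c[pos - 1] ^= 1
            return c

        word = hamming74_encode([1, 0, 1, 1])
        word[4] ^= 1  # corrupt one bit in transit
        assert hamming74_correct(word) == hamming74_encode([1, 0, 1, 1])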

  19. Morse Code Activity Packet.

    Science.gov (United States)

    Clinton, Janeen S.

    This activity packet offers simple directions for setting up a Morse Code system appropriate to interfacing with any of several personal computer systems. Worksheets are also included to facilitate teaching Morse Code to persons with visual or other disabilities including blindness, as it is argued that the code is best learned auditorily. (PB)

  20. The Clawpack Community of Codes

    Science.gov (United States)

    Mandli, K. T.; LeVeque, R. J.; Ketcheson, D.; Ahmadia, A. J.

    2014-12-01

    Clawpack, the Conservation Laws Package, has long been one of the standards for solving hyperbolic conservation laws, but over the years has extended well beyond this role. Today a community of open-source codes has been developed that addresses a multitude of different needs, including non-conservative balance laws, high-order accurate methods, and parallelism, while remaining extensible and easy to use, largely by the judicious use of Python and the original Fortran codes that it wraps. This talk will present some of the recent developments in projects under the Clawpack umbrella, notably the GeoClaw and PyClaw projects. GeoClaw was originally developed as a tool for simulating tsunamis using adaptive mesh refinement but has since encompassed a large number of other geophysically relevant flows, including storm surge and debris flows. PyClaw originated as a Python version of the original Clawpack algorithms but has since been both a testing ground for new algorithmic advances in the Clawpack framework and an easily extensible framework for solving hyperbolic balance laws. Some of these extensions include the addition of WENO high-order methods, massively parallel capabilities, and adaptive mesh refinement technologies, made possible largely by the flexibility of the Python language and community libraries such as NumPy and PETSc. Because of the tight integration with Python technologies, both packages have also benefited from the focus on reproducibility in the Python community, notably IPython notebooks.

  1. ImageJS: Personalized, participated, pervasive, and reproducible image bioinformatics in the web browser.

    Science.gov (United States)

    Almeida, Jonas S; Iriabho, Egiebade E; Gorrepati, Vijaya L; Wilkinson, Sean R; Grüneberg, Alexander; Robbins, David E; Hackney, James R

    2012-01-01

    Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local "download and installation".

  2. Tracing Cultural Memory

    DEFF Research Database (Denmark)

    Wiegand, Frauke Katharina

    to Soweto’s Regina Mundi Church, this thesis analyses tourists’ snapshots at sites of memory and outlines their tracing activity in cultural memory. It draws on central concepts of actor - network theory and visual culture studies for a cross - disciplinary methodology to comprehend the collective...... of memory. They highlight the role of mundane uses of the past and indicate the need for cross - disciplinary research on the visual and on memory......We encounter, relate to and make use of our past and that of others in multifarious and increasingly mobile ways. Tourism is one of the main paths for encountering sites of memory. This thesis examines tourists’ creative appropriations of sites of memory – the objects and future memories inspired...

  3. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  4. Emulating Ray-Tracing Channels in Multi-probe Anechoic Chamber Setups for Virtual Drive Testing

    DEFF Research Database (Denmark)

    Fan, Wei; Llorente, Ines Carton; Kyösti, Pekka

    2016-01-01

    This paper discusses virtual drive testing (VDT) for multiple-input multiple-output (MIMO) capable terminals in multi-probe anechoic chamber (MPAC) setups. We propose to perform VDT by reproducing ray-tracing (RT) simulated channels with the field synthesis technique. Simulation results demonst...

  5. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  6. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  7. Unveiling Exception Handling Bug Hazards in Android Based on GitHub and Google Code Issues

    NARCIS (Netherlands)

    Coelho, R.; Almeida, L.; Gousios, G.; Van Deursen, A.

    2015-01-01

    This paper reports on a study mining the exception stack traces included in 159,048 issues reported on Android projects hosted in GitHub (482 projects) and Google Code (157 projects). The goal of this study is to investigate whether stack trace information can reveal bug hazards related to exception
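
    As a rough illustration of what mining stack traces from issue text can involve, the sketch below extracts Java/Android-style stack frames with a regular expression. The pattern and sample report are simplified, hypothetical stand-ins, not the study's actual extraction pipeline.

        import re

        # Frames of the form "at com.example.Foo.bar(Foo.java:42)".
        FRAME = re.compile(r"^\s*at\s+([\w.$]+)\.([\w$<>]+)\(([^)]*)\)",
                           re.MULTILINE)

        def extract_frames(issue_text):
            """Return (class, method, location) tuples for each stack frame."""
            return FRAME.findall(issue_text)

        report = """App crashes on rotate:
        java.lang.NullPointerException
            at com.example.ui.MainActivity.onCreate(MainActivity.java:42)
            at android.app.Activity.performCreate(Activity.java:5008)
        """
        for cls, method, loc in extract_frames(report):
            print(cls, method, loc)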

  8. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ''XSOR''. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms

  9. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
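
    The write-inputs, run, read-outputs cycle described above is a common wrapper pattern. The sketch below shows the idea in Python; the file names, input format, and executable name are hypothetical, and the real DLL is driven by an instructions file rather than hard-coded logic.

        import subprocess
        from pathlib import Path

        def run_external(inputs, workdir="run", exe="external_code"):
            """Write inputs to a file, run the external application, and
            read its outputs back (here: one number per line)."""
            work = Path(workdir)
            work.mkdir(exist_ok=True)
            # 1. Create the input file expected by the external application.
            (work / "input.txt").write_text(
                "\n".join(f"{name} = {value}" for name, value in inputs.items())
            )
            # 2. Run the external code and wait for it to finish.
            subprocess.run([exe, "input.txt"], cwd=work, check=True)
            # 3. Parse the output file and return the values to the caller.
            return [float(line)
                    for line in (work / "output.txt").read_text().splitlines()]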

  10. Digital Traces of Information Systems

    DEFF Research Database (Denmark)

    Hedman, Jonas; Srinivasan, Nikhil; Lindgren, Rikard

    2013-01-01

    In this paper, we point to the potential and implications of digital traces as a novel data source in the study of contemporary activities and behaviors. We do this to raise IS researchers' awareness of such traces in increasingly complex sociomaterial practices. We develop a two-dimensional fram...

  11. Reproducible Earth observation analytics: challenges, ideas, and a study case on containerized land use change detection

    Science.gov (United States)

    Appel, Marius; Nüst, Daniel; Pebesma, Edzer

    2017-04-01

    Geoscientific analyses of Earth observation data typically involve a long path from data acquisition to scientific results and conclusions. Before starting the actual processing, scenes must be downloaded from the providers' platforms and the computing infrastructure needs to be prepared. The computing environment often requires specialized software, which in turn might have many dependencies. The software is often highly customized and provided without commercial support, which leads to rather ad-hoc systems and irreproducible results. To let other scientists reproduce the analyses, the full workspace including data, code, the computing environment, and documentation must be bundled and shared. Technologies such as virtualization or containerization allow for the creation of identical computing environments with relatively little effort. Challenges, however, arise when the volume of the data is too large, when computations are done in a cluster environment, or when complex software components such as databases are used. We discuss these challenges using the example of scalable land use change detection on Landsat imagery. We present a reproducible implementation that runs R and the scalable data management and analytical system SciDB within a Docker container. Thanks to an explicit container recipe (the Dockerfile), this enables all-in-one reproduction, including the installation of software components, the ingestion of the data, and the execution of the analysis in a well-defined environment. We furthermore discuss how the implementation could be transferred to multi-container environments in order to support reproducibility on large cluster environments.

  12. Improvement of MARS code reflood model

    International Nuclear Information System (INIS)

    Hwang, Moonkyu; Chung, Bub-Dong

    2011-01-01

    A specifically designed heat transfer model for the reflood process, which normally occurs at low flow and low pressure, was originally incorporated in the MARS code. The model is essentially identical to that of the RELAP5/MOD3.3 code. The model, however, is known to have underestimated the peak cladding temperature (PCT) with an earlier turn-over. In this study, the original MARS code reflood model is improved. Based on extensive sensitivity studies for both the hydraulic and wall heat transfer models, it is found that the dispersed flow film boiling (DFFB) wall heat transfer is the most influential process determining the PCT, whereas the interfacial drag model most affects the quenching time through the liquid carryover phenomenon. The model proposed by Bajorek and Young is incorporated for the DFFB wall heat transfer. Both space grid and droplet enhancement models are incorporated. Inverted annular film boiling (IAFB) is modeled using the original PSI model of the code. The flow transition between the DFFB and IAFB is modeled using the TRACE code interpolation. A gas velocity threshold is also added to limit the top-down quenching effect. Assessment calculations are performed for the original and modified MARS codes for the Flecht-Seaset test and RBHT test. Improvements are observed in terms of the PCT and quenching time predictions in the Flecht-Seaset assessment. In the case of the RBHT assessment, the improvement over the original MARS code is marginal. A space grid effect, however, is clearly seen in the modified version of the MARS code. (author)

  13. RIA Fuel Codes Benchmark - Volume 1

    International Nuclear Information System (INIS)

    Marchand, Olivier; Georgenthum, Vincent; Petit, Marc; Udagawa, Yutaka; Nagase, Fumihisa; Sugiyama, Tomoyuki; Arffman, Asko; Cherubini, Marco; Dostal, Martin; Klouzal, Jan; Geelhood, Kenneth; Gorzel, Andreas; Holt, Lars; Jernkvist, Lars Olof; Khvostov, Grigori; Maertens, Dietmar; Spykman, Gerold; Nakajima, Tetsuo; Nechaeva, Olga; Panka, Istvan; Rey Gayo, Jose M.; Sagrado Garcia, Inmaculada C.; Shin, An-Dong; Sonnenburg, Heinz Guenther; Umidova, Zeynab; Zhang, Jinzhao; Voglewede, John

    2013-01-01

    Reactivity-initiated accident (RIA) fuel rod codes have been developed for a significant period of time and they all have shown their ability to reproduce some experimental results with a certain degree of adequacy. However, they sometimes rely on different specific modelling assumptions the influence of which on the final results of the calculations is difficult to evaluate. The NEA Working Group on Fuel Safety (WGFS) is tasked with advancing the understanding of fuel safety issues by assessing the technical basis for current safety criteria and their applicability to high burnup and to new fuel designs and materials. The group aims at facilitating international convergence in this area, including the review of experimental approaches as well as the interpretation and use of experimental data relevant for safety. As a contribution to this task, WGFS conducted a RIA code benchmark based on RIA tests performed in the Nuclear Safety Research Reactor in Tokai, Japan and tests performed or planned in CABRI reactor in Cadarache, France. Emphasis was on assessment of different modelling options for RIA fuel rod codes in terms of reproducing experimental results as well as extrapolating to typical reactor conditions. This report provides a summary of the results of this task. (authors)

  14. The reproducibility of random amplified polymorphic DNA (RAPD ...

    African Journals Online (AJOL)

    RAPD) profiles of Streptococcus thermophilus strains by using the polymerase chain reaction (PCR). Several factors can cause the amplification of false and non-reproducible bands in the RAPD profiles. We tested three primers, OPI-02 MOD, ...

  15. Homotopy deform method for reproducing kernel space for ...

    Indian Academy of Sciences (India)

    2016-09-23

    Sep 23, 2016 ... Nonlinear differential equations; the homotopy deform method; the simplified reproducing kernel ... an equivalent integro-differential equation. ... an algorithm for solving nonlinear multipoint BVPs by combining homotopy perturbation and variational iteration methods. Most recently, Duan and Rach [12].

  16. Homotopy deform method for reproducing kernel space for ...

    Indian Academy of Sciences (India)

    2016-09-23

    Homotopy deform method for reproducing kernel space for nonlinear boundary value problems. Min-Qiang Xu and Ying-Zhen Lin, School of Science, Zhuhai Campus, Beijing Institute of Technology, ...

  17. Transition questions in clinical practice - validity and reproducibility

    DEFF Research Database (Denmark)

    Lauridsen, Henrik Hein

    2008-01-01

    One way to determine the relevance of change scores is through the use of transition questions (TQs) that assess patients' retrospective perception of treatment effect. However, little is known about the validity and reproducibility of TQs. The objectives of this study were to explore aspects of construct validity and reproducibility of a TQ and make proposals for standardised use. One-hundred-and-ninety-one patients with low back pain and/or leg pain were followed over an 8-week period, receiving 3 disability and 2 pain questionnaires together with a 7-point TQ. Reproducibility was determined using...

  18. Description of SHARC: The Strategic High-Altitude Radiance Code

    Science.gov (United States)

    Sharma, R. D.; Ratkowski, A. J.; Sundberg, R. L.; Duff, J. W.; Bernstein, L. S.

    1989-08-01

    The Strategic High-Altitude Radiance Code (SHARC) is a new computer code that calculates atmospheric radiation and transmittance for paths from 60 to 300 km altitude in the 2 to 40 micron spectral region. It models radiation due to NLTE (Non-Local Thermodynamic Equilibrium) molecular emissions. This initial version of SHARC includes the five strongest IR radiators: NO, CO, H2O, O3, and CO2. This report describes the code and the models used to calculate the NLTE molecular populations and the resulting atmospheric radiance. The SHARC manual is reproduced in the appendix.

  19. Case Studies and Challenges in Reproducibility in the Computational Sciences

    OpenAIRE

    Arabas, Sylwester; Bareford, Michael R.; de Silva, Lakshitha R.; Gent, Ian P.; Gorman, Benjamin M.; Hajiarabderkani, Masih; Henderson, Tristan; Hutton, Luke; Konovalov, Alexander; Kotthoff, Lars; McCreesh, Ciaran; Nacenta, Miguel A.; Paul, Ruma R.; Petrie, Karen E. J.; Razaq, Abdul

    2014-01-01

    This paper investigates the reproducibility of computational science research and identifies key challenges facing the community today. It is the result of the First Summer School on Experimental Methodology in Computational Science Research (https://blogs.cs.st-andrews.ac.uk/emcsr2014/). First, we consider how to reproduce experiments that involve human subjects, and in particular how to deal with different ethics requirements at different institutions. Second, we look at whether parallel an...

  20. Tracing Geothermal Fluids

    Energy Technology Data Exchange (ETDEWEB)

    Michael C. Adams; Greg Nash

    2004-03-01

    Geothermal water must be injected back into the reservoir after it has been used for power production. Injection is critical in maximizing the power production and lifetime of the reservoir. To use injectate effectively, the direction and velocity of the injected water must be known or inferred. This information can be obtained by using chemical tracers to track the subsurface flow paths of the injected fluid. Tracers are chemical compounds that are added to the water as it is injected back into the reservoir. The hot production water is monitored for the presence of this tracer using the most sensitive analytic methods that are economically feasible. The amount and concentration pattern of the tracer revealed by this monitoring can be used to evaluate how effective the injection strategy is. However, the tracers must have properties that suit the environment they will be used in. This requires careful consideration and testing of the tracer properties. In previous and parallel investigations we have developed tracers that are suitable for tracing liquid water. In this investigation, we developed tracers that can be used for steam and mixed water/steam environments. This work will improve the efficiency of injection management in geothermal fields, lowering the cost of energy production and increasing the power output of these systems.

  1. Systematic heterogenization for better reproducibility in animal experimentation.

    Science.gov (United States)

    Richter, S Helene

    2017-08-31

    The scientific literature is full of articles discussing poor reproducibility of findings from animal experiments as well as failures to translate results from preclinical animal studies to clinical trials in humans. Critics even go so far as to talk about a "reproducibility crisis" in the life sciences, a novel headword that increasingly finds its way into numerous high-impact journals. Viewed from a cynical perspective, Fett's law of the lab "Never replicate a successful experiment" has thus taken on a completely new meaning. So far, poor reproducibility and translational failures in animal experimentation have mostly been attributed to biased animal data, methodological pitfalls, current publication ethics and animal welfare constraints. More recently, the concept of standardization has also been identified as a potential source of these problems. By reducing within-experiment variation, rigorous standardization regimes limit the inference to the specific experimental conditions. In this way, however, individual phenotypic plasticity is largely neglected, resulting in statistically significant but possibly irrelevant findings that are not reproducible under slightly different conditions. By contrast, systematic heterogenization has been proposed as a concept to improve representativeness of study populations, contributing to improved external validity and hence improved reproducibility. While some first heterogenization studies are indeed very promising, it is still not clear how this approach can be transferred into practice in a logistically feasible and effective way. Thus, further research is needed to explore different heterogenization strategies as well as alternative routes toward better reproducibility in animal experimentation.

  2. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated...... code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code...... avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  3. Modernized Approach for Generating Reproducible Heterogeneity Using Transmitted-Light for Flow Visualization Experiments

    Science.gov (United States)

    Jones, A. A.; Holt, R. M.

    2017-12-01

    Image capturing in flow experiments has been used for fluid mechanics research since the early 1970s. Interactions of fluid flow between the vadose zone and the permanent water table are of great interest because this zone is responsible for all recharge waters, pollutant transport and irrigation efficiency for agriculture. Griffith, et al. (2011) developed an approach in which reproducible "geologically realistic" sand configurations are constructed and deposited in sand-filled experimental chambers for light-transmitted flow visualization experiments. This method creates reproducible, reverse-graded, layered (stratified) thin-slab sand chambers for point source experiments visualizing multiphase flow through porous media. Reverse-graded stratification of sand chambers mimics many naturally occurring sedimentary deposits. Sand-filled chambers use light as a nonintrusive tool for measuring water saturation in two dimensions (2-D). Homogeneous and heterogeneous sand configurations can be produced to visualize the complex physics of the unsaturated zone. The experimental procedure developed by Griffith, et al. (2011) was designed using now outdated and obsolete equipment. We have modernized this approach with a new Parker Daedal linear actuator and programmed projects/code for multiple configurations. We have also updated the Roper CCD software and image processing software with the latest industry standards. Modernization of the transmitted-light source, robotic equipment, redesigned experimental chambers, and newly developed analytical procedures have greatly reduced the time and cost per experiment. We have verified the ability of the new equipment to generate reproducible heterogeneous sand-filled chambers and demonstrated the functionality of the new equipment and procedures by reproducing several gravity-driven fingering experiments conducted by Griffith (2008).

  4. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  5. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow the transversal implementation of a universal set of gates by gauge fixing, while error-detecting measurements involve only four or six qubits.

  6. Doubled Color Codes

    Science.gov (United States)

    Bravyi, Sergey

    Combining protection from noise and computational universality is one of the biggest challenges in fault-tolerant quantum computing. Topological stabilizer codes such as the 2D surface code can tolerate a high level of noise, but implementing logical gates, especially non-Clifford ones, requires a prohibitively large overhead due to the need for state distillation. In this talk I will describe a new family of 2D quantum error correcting codes that enable a transversal implementation of all logical gates required for universal quantum computing. Transversal logical gates (TLG) are encoded operations that can be realized by applying some single-qubit rotation to each physical qubit. TLG are highly desirable since they introduce no overhead and do not spread errors. It has been known that a quantum code can have only a finite number of TLGs, which rules out computational universality. Our scheme circumvents this no-go result by combining TLGs of two different quantum codes using the gauge-fixing method pioneered by Paetznick and Reichardt. The first code, closely related to the 2D color code, enables a transversal implementation of all single-qubit Clifford gates such as the Hadamard gate and the π/2 phase shift. The second code, which we call a doubled color code, provides a transversal T-gate, where T is the π/4 phase shift. The Clifford+T gate set is known to be computationally universal. The two codes can be laid out on the honeycomb lattice with two qubits per site such that the code conversion requires parity measurements for six-qubit Pauli operators supported on faces of the lattice. I will also describe numerical simulations of logical Clifford+T circuits encoded by the distance-3 doubled color code. Based on joint work with Andrew Cross.
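
    As a minimal illustration of the transversal-gate idea described above (a generic tensor-product construction in Python with numpy, not the doubled color code itself; the qubit count and gate choice are illustrative):

      # A transversal logical gate is a depth-1 tensor product of identical
      # single-qubit rotations; here the T = pi/4 phase shift applied to
      # each of n physical qubits. Illustrative sketch only.
      import numpy as np
      from functools import reduce

      T = np.diag([1.0, np.exp(1j * np.pi / 4)])   # pi/4 phase shift

      def transversal(gate, n):
          """Apply the same single-qubit gate to each of n qubits."""
          return reduce(np.kron, [gate] * n)

      U = transversal(T, 3)   # 8x8 unitary built from depth-1 local gates
      print(U.shape)

    Because the circuit has depth one and acts on each qubit independently, a fault on one physical qubit never propagates to another, which is the error-containment property the abstract emphasizes.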

  7. Phonological coding during reading

    Science.gov (United States)

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound-based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  8. Effective holographic models for QCD: Glueball spectrum and trace anomaly

    Science.gov (United States)

    Ballon-Bayona, Alfonso; Boschi-Filho, Henrique; Mamani, Luis A. H.; Miranda, Alex S.; Zanchin, Vilson T.

    2018-02-01

    We investigate effective holographic models for QCD arising from five-dimensional dilaton gravity. The models are characterized by a dilaton with a mass term in the UV, dual to a CFT deformation by a relevant operator, and quadratic in the IR. The UV constraint leads to the explicit breaking of conformal symmetry, whereas the IR constraint guarantees linear confinement. We propose semianalytic interpolations between the UV and the IR and obtain a spectrum for scalar and tensor glueballs consistent with lattice QCD data. We use the glueball spectrum as a physical constraint to find the evolution of the model parameters as the mass term goes to 0. Finally, we reproduce the universal result for the trace anomaly of deformed CFTs and propose a dictionary between this result and the QCD trace anomaly. A nontrivial consequence of this dictionary is the emergence of a β function similar to the two-loop perturbative QCD result.
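
    For orientation, the standard perturbative QCD relations that such a dictionary must connect to are the trace anomaly and the two-loop β function (quoted here as textbook results, not as equations from the paper):

      \[
      T^{\mu}{}_{\mu} \;=\; \frac{\beta(g)}{2g}\, G^{a}_{\mu\nu} G^{a\,\mu\nu}
      \;+\; \left(1+\gamma_m\right) \sum_q m_q\, \bar{q} q,
      \qquad
      \beta(g) \;=\; -\frac{g^{3}}{(4\pi)^{2}}\left(11-\tfrac{2}{3}N_f\right)
      \;-\; \frac{g^{5}}{(4\pi)^{4}}\left(102-\tfrac{38}{3}N_f\right)+\cdots
      \]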

  9. A clinical evaluation of visual feedback-guided breath-hold reproducibility of tumor location

    International Nuclear Information System (INIS)

    Yoshitake, Tadamasa; Shioyama, Yoshiyuki; Ohga, Saiji; Nonoshita, Takeshi; Ohnishi, Kayoko; Terashima, Kotaro; Honda, Hiroshi; Nakamura, Katsumasa; Arimura, Hidetaka; Hirata, Hideki

    2009-01-01

    The purpose of this study was to evaluate the reproducibility of visual feedback-guided breath-hold using a machine vision system with a charge-coupled device camera and a monocular head-mounted display. Sixteen patients with lung tumors who were treated with stereotactic radiotherapy were enrolled. A machine vision system with a charge-coupled device camera was used for monitoring respiration. A monocular head-mounted display was used to provide the patient with visual feedback about the breathing trace. The patients could control their breathing so that the breathing waveform would fall between the upper and lower threshold lines. Planning and treatment were performed under visual feedback-guided expiratory breath-hold. Electronic portal images were obtained during treatment. The range of cranial-caudal motion of the tumor location during each single breath-hold was calculated as the intra-breath-hold (intra-BH) variability. The maximum displacement between the two to five averaged tumor locations of each single breath-hold was calculated as the inter-breath-hold (inter-BH) variability. All 16 patients tolerated the visual feedback-guided breath-hold maneuvers well. The intra- and inter-BH variability of all patients was 1.5 ± 0.6 mm and 1.2 ± 0.5 mm, respectively. A visual feedback-guided breath-hold technique using the machine vision system is feasible with good breath-hold reproducibility.

  10. MORSE Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  11. Bar Code Labels

    Science.gov (United States)

    1988-01-01

    American Bar Codes, Inc. developed special bar code labels for inventory control of space shuttle parts and other space system components. ABC labels are made in a company-developed anodized-aluminum process and consecutively marked with bar code symbology and human-readable numbers. They offer extreme abrasion resistance and indefinite resistance to ultraviolet radiation, withstanding 700-degree temperatures without deterioration and up to 1400 degrees with special designs. They offer high resistance to salt spray, cleaning fluids and mild acids. ABC is now producing these bar code labels commercially for industrial customers who also need labels to resist harsh environments.

  12. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes. QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown in.

  13. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  14. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  15. ARC Code TI: ACCEPT

    Data.gov (United States)

    National Aeronautics and Space Administration — ACCEPT consists of an overall software infrastructure framework and two main software components. The software infrastructure framework consists of code written to...

  16. Analysis of measuring system parameters that influence reproducibility of morphometric assessments with a graphic tablet.

    Science.gov (United States)

    Fleege, J C; Baak, J P; Smeulders, A W

    1988-05-01

    The morphometric analysis of nuclear characteristics by means of a graphic tablet is, in principle, objective and highly reproducible. However, a recent study found considerable variation in the morphometric assessments, which was in contrast to the findings of others. The way in which measurements were performed differed in these studies. Therefore, measuring system factors that can potentially influence the quantitative results were analyzed systematically. One observer, experienced in microscopic analysis and working with a commercially available graphic tablet, conducted all the measurements, thus excluding interobserver variation. The tracing speed, localization (on the graphic tablet), magnification, pen and cursor usage, shape, and orientation on the graphic tablet were analyzed. A nomogram was developed for cursor application that indicates the relation between "projected" particle size, tracing speed, and required coefficient of variation (CV). When the influence of these factors is taken into account, a measuring system can be tuned optimally. With such a regimen, the CV can be kept below 1.5%. Our results show that in the assessment of morphometric features with the use of a graphic tablet, errors due to the measuring system can be virtually eliminated.

  17. Trace formulae for arithmetical systems

    International Nuclear Information System (INIS)

    Bogomolny, E.B.; Georgeot, B.; Giannoni, M.J.; Schmit, C.

    1992-09-01

    For quantum problems on the pseudo-sphere generated by arithmetic groups there exist special trace formulae, called trace formulae for Hecke operators, which permit the reconstruction of wave functions from the knowledge of periodic orbits. After a short discussion of this subject, the Hecke operators trace formulae are presented for the Dirichlet problem on the modular billiard, which is a prototype of arithmetical systems. The results of numerical computations for these semiclassical type relations are in good agreement with the directly computed eigenfunctions. (author) 23 refs.; 2 figs
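
    For readers unfamiliar with the objects involved, the n-th Hecke operator acting on automorphic functions on the modular surface takes the standard form below (a textbook definition in one common normalization, not a formula from the paper):

      \[
      (T_n f)(z) \;=\; \frac{1}{\sqrt{n}} \sum_{\substack{ad=n\\ d>0}}\; \sum_{b=0}^{d-1} f\!\left(\frac{az+b}{d}\right),
      \]

    and the trace formulae in question relate sums of Hecke eigenvalues to sums over periodic orbits, which is what allows the eigenfunctions to be reconstructed.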

  18. OPR1000 RCP Flow Coastdown Analysis using SPACE Code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong-Hyuk; Kim, Seyun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    The Korean nuclear industry developed a thermal-hydraulic analysis code for the safety analysis of PWRs, named SPACE (Safety and Performance Analysis Code for Nuclear Power Plant). Current loss-of-flow transient analysis of OPR1000 uses the COAST code to calculate transient RCS (Reactor Coolant System) flow. The COAST code calculates RCS loop flow using pump performance curves and RCP (Reactor Coolant Pump) inertia. In this paper, the SPACE code is used to reproduce the RCS flowrates calculated by the COAST code. The loss-of-flow transient is a transient initiated by a reduction of forced reactor coolant circulation. Typical loss-of-flow transients are complete loss of flow (CLOF) and locked rotor (LR). OPR1000 RCP flow coastdown analysis was performed with SPACE using a simplified nodalization. Complete loss of flow (4 RCP trip) was analyzed. The results show good agreement with those from the COAST code, which is the CE code for calculating RCS flow during loss-of-flow transients. Through this study, we confirmed that the SPACE code can be used instead of the COAST code for RCP flow coastdown analysis.
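
    The physics being reproduced here is essentially a rotor-inertia balance. A minimal sketch (a generic model in Python, not the actual COAST or SPACE treatment; J, k, and the initial speed are invented placeholder values) integrates J dω/dt = -kω², whose analytic solution ω(t) = ω₀/(1 + t/τ), with τ = J/(kω₀), gives the familiar hyperbolic coastdown of pump speed and, roughly proportionally, loop flow:

      # Generic pump coastdown sketch: rotor inertia J decelerating against
      # a hydraulic torque assumed proportional to speed squared.
      # All numbers are illustrative placeholders, not OPR1000 data.
      J = 3500.0              # rotating inertia [kg m^2]
      k = 2.0                 # torque coefficient [N m s^2]
      omega = omega0 = 125.0  # initial pump speed [rad/s]

      dt = 0.01
      for _ in range(int(20.0 / dt)):
          omega += dt * (-k * omega**2) / J   # J dw/dt = -k w^2

      tau = J / (k * omega0)  # analytic time constant, ~14 s here
      print(f"speed after 20 s: {omega:.1f} rad/s (tau = {tau:.1f} s)")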

  19. Reproducing Sea-Ice Deformation Distributions With Viscous-Plastic Sea-Ice Models

    Science.gov (United States)

    Bouchat, A.; Tremblay, B.

    2016-02-01

    High resolution sea-ice dynamic models offer the potential to discriminate between sea-ice rheologies based on their ability to reproduce the satellite-derived deformation fields. Recent studies have shown that sea-ice viscous-plastic (VP) models do not reproduce the observed statistical properties of the strain rate distributions of the RADARSAT Geophysical Processor System (RGPS) deformation fields [1][2]. We use the elliptical VP rheology and compute the probability density functions (PDFs) for simulated strain rate invariants (divergence and maximum shear) and compare against the deformations obtained with the 3-day gridded products from RGPS. We find that the large shear deformations are well reproduced by the elliptical VP model and that the deformations do not follow a Gaussian distribution, as reported in Girard et al. [1][2]. On the other hand, we do find an overestimation of the shear in the range of mid-magnitude deformations in all of our VP simulations, tested with different spatial resolutions and numerical parameters. Runs with no internal stress (free drift) or with constant viscosity coefficients (Newtonian fluid) also show this overestimation. We trace this discrepancy back to the elliptical yield curve aspect ratio (e = 2) having too little shear strength, hence not resisting enough the inherent shear in the wind forcing associated with synoptic weather systems. Experiments where we simply increase the shear resistance of the ice by modifying the ellipse ratio confirm the need for a rheology with increased shear strength. [1] Girard et al. (2009), Evaluation of high-resolution sea ice models [...], Journal of Geophysical Research, 114. [2] Girard et al. (2011), A new modeling framework for sea-ice mechanics [...], Annals of Glaciology, 57, 123-132.
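
    For context, in the standard Hibler-type VP formulation the elliptical yield curve with aspect ratio e enters through the bulk and shear viscosities (standard definitions, not equations from this abstract):

      \[
      \zeta = \frac{P}{2\Delta}, \qquad \eta = \frac{\zeta}{e^{2}}, \qquad
      \Delta = \left[\left(\dot\varepsilon_{11}^{2}+\dot\varepsilon_{22}^{2}\right)\left(1+e^{-2}\right)
      + 4e^{-2}\dot\varepsilon_{12}^{2}
      + 2\dot\varepsilon_{11}\dot\varepsilon_{22}\left(1-e^{-2}\right)\right]^{1/2},
      \]

    so decreasing e raises the shear viscosity η relative to the bulk viscosity ζ, which is the increased shear strength the experiments described above call for.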

  20. On σ-LCD codes

    OpenAIRE

    Carlet, Claude; Mesnager, Sihem; Tang, Chunming; Qi, Yanfeng

    2017-01-01

    Linear complementary pairs (LCP) of codes play an important role in armoring implementations against side-channel attacks and fault injection attacks. One of the most common ways to construct LCP of codes is to use Euclidean linear complementary dual (LCD) codes. In this paper, we first introduce the concept of linear codes with $\sigma$ complementary dual ($\sigma$-LCD), which includes known Euclidean LCD codes, Hermitian LCD codes, and Galois LCD codes. As Euclidean LCD codes, $\sigma$-LCD ...
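
    A hedged sketch of the Euclidean special case: by Massey's criterion, a linear code with generator matrix G is LCD if and only if GG^T is nonsingular. The [4,2] binary code below is an invented example, not one from the paper:

      # Check the Euclidean-LCD property over GF(2) via Massey's criterion.
      import numpy as np

      def gf2_rank(M):
          """Rank of a binary matrix over GF(2) by Gaussian elimination."""
          M = (np.array(M, dtype=np.uint8) % 2).copy()
          rank = 0
          rows, cols = M.shape
          for c in range(cols):
              pivot = next((r for r in range(rank, rows) if M[r, c]), None)
              if pivot is None:
                  continue
              M[[rank, pivot]] = M[[pivot, rank]]   # move pivot row up
              for r in range(rows):
                  if r != rank and M[r, c]:
                      M[r] ^= M[rank]               # eliminate column c
              rank += 1
          return rank

      G = np.array([[1, 0, 1, 1],
                    [0, 1, 0, 1]], dtype=np.uint8)  # example [4,2] code
      print("Euclidean LCD:", gf2_rank(G @ G.T) == G.shape[0])  # True here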

  1. ARTENOLIS: Automated Reproducibility and Testing Environment for Licensed Software

    OpenAIRE

    Heirendt, Laurent; Arreckx, Sylvain; Trefois, Christophe; Yarosz, Yohan; Vyas, Maharshi; Satagopam, Venkata P.; Schneider, Reinhard; Thiele, Ines; Fleming, Ronan M. T.

    2017-01-01

    Motivation: Automatically testing changes to code is an essential feature of continuous integration. For open-source code, without licensed dependencies, a variety of continuous integration services exist. The COnstraint-Based Reconstruction and Analysis (COBRA) Toolbox is a suite of open-source code for computational modelling with dependencies on licensed software. A novel automated framework of continuous integration in a semi-licensed environment is required for the development of the COB...

  2. Code of ethics and conduct for European nursing.

    Science.gov (United States)

    Sasso, Loredana; Stievano, Alessandro; González Jurado, Máximo; Rocco, Gennaro

    2008-11-01

    A main identifying factor of professions is professionals' willingness to comply with ethical and professional standards, often defined in a code of ethics and conduct. In a period of intense nursing mobility, if the public are aware that health professionals have committed themselves to the drawing up of a code of ethics and conduct, they will have more trust in the health professional they choose, especially if this person comes from another European Member State. The Code of Ethics and Conduct for European Nursing is a programmatic document for the nursing profession constructed by the FEPI (European Federation of Nursing Regulators) according to Directive 2005/36/EC on recognition of professional qualifications and Directive 2006/123/EC on services in the internal market, set out by the European Commission. This article describes the construction of the Code and gives an overview of some specific areas of importance. The main text of the Code is reproduced in Appendix 1.

  3. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    We welcome Tanya Stivers's discussion (Stivers, 2015/this issue) of coding social interaction and find that her descriptions of the processes of coding open up important avenues for discussion, among other things of the precise ad hoc considerations that researchers need to bear in mind, both when…

  4. Error Correcting Codes - The Hamming Codes

    Indian Academy of Sciences (India)

    Priti Shankar. In the first article of this series we showed how redundancy introduced into a message transmitted over a noisy channel could improve the reliability of transmission. In…
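
    In the spirit of the article, a brief sketch of the Hamming (7,4) code (the standard systematic construction; the message and error position are invented for illustration): three parity bits protect four data bits, and the three-bit syndrome points directly at any single flipped bit.

      # Hamming (7,4): encode 4 data bits, flip one bit in transit, and
      # let the syndrome locate and correct the error.
      import numpy as np

      P = np.array([[1,1,0], [1,0,1], [0,1,1], [1,1,1]])
      G = np.hstack([np.eye(4, dtype=int), P])    # generator [I | P]
      H = np.hstack([P.T, np.eye(3, dtype=int)])  # parity check [P^T | I]

      data = np.array([1, 0, 1, 1])
      sent = data @ G % 2
      received = sent.copy()
      received[2] ^= 1                            # single-bit channel error

      syndrome = H @ received % 2
      if syndrome.any():                          # nonzero => an error occurred
          bad = next(j for j in range(7) if (H[:, j] == syndrome).all())
          received[bad] ^= 1                      # columns of H are distinct
      assert (received == sent).all()
      print("recovered data bits:", received[:4])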

  5. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    …set up a well-defined goal - that of achieving a performance bound set by the noisy channel coding theorem, proved in the paper. Whereas the goal appeared elusive twenty-five years ago, today there are practical codes and decoding algorithms that come close to achieving it. It is interesting to note that all known…

  6. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 3. Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article Volume 2 Issue 3 March 1997 pp 33-47. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/002/03/0033-0047 ...

  7. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  8. Codes of Conduct

    Science.gov (United States)

    Million, June

    2004-01-01

    Most schools have a code of conduct, pledge, or behavioral standards, set by the district or school board with the school community. In this article, the author features some schools that created a new vision of instilling code of conducts to students based on work quality, respect, safety and courtesy. She suggests that communicating the code…

  9. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Department of Computer Science and Automation, IISc. Their research addresses ... The fifty-five-year-old history of error correcting codes began with Claude Shannon's path-breaking paper entitled 'A ... given the limited computing power available then, Gallager's codes were not considered practical. A landmark…

  10. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 3. Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article Volume 2 Issue 3 March ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  11. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 8; Issue 9. Decoding Codes on Graphs - Low Density Parity Check Codes. A S Madhu Aditya Nori. General Article Volume 8 Issue 9 September 2003 pp 49-59. Fulltext. Click here to view fulltext PDF.

  12. READING A NEURAL CODE

    NARCIS (Netherlands)

    BIALEK, W; RIEKE, F; VANSTEVENINCK, RRD; WARLAND, D

    1991-01-01

    Traditional approaches to neural coding characterize the encoding of known stimuli in average neural responses. Organisms face nearly the opposite task - extracting information about an unknown time-dependent stimulus from short segments of a spike train. Here the neural code was characterized from…

  13. The elimination of ray tracing in Monte Carlo shielding programs

    International Nuclear Information System (INIS)

    Bendall, D.E.

    1988-01-01

    The MONK6 code has clearly demonstrated the advantages of hole tracking, which was devised by Woodcock et al. for use in criticality codes from earlier work by Von Neumann. Hole tracking eliminates ray tracing by introducing, for all materials present in the problem, a pseudo scattering reaction that forward scatters without energy loss. The cross section for this reaction is chosen so that the total cross sections for all the materials are equal at a given energy. By this means, tracking takes place with a constant total cross section everywhere, so there is no need to ray trace. The present work extends hole tracking to shielding codes, where it functions in tandem with Russian roulette and splitting. An algorithm has been evolved and its performance is compared with the ray-tracing code McBEND. A disadvantage with hole tracking occurs when there is a wide variation in total cross section for the materials present. As the tracking uses the total cross section of the material that has the maximum cross section, there can be a large number of pseudo collisions in the materials with low total cross sections. In extreme cases, the advantages of hole tracking can be lost through the extra time taken in servicing these pseudo collisions; however, techniques for eliminating this problem are under consideration
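
    A minimal sketch of the idea (generic Woodcock/delta tracking in one dimension with invented cross sections, not MONK6 or McBEND): flight lengths are sampled everywhere with the maximum total cross section, and each tentative collision is accepted as real with probability σ_t(x)/σ_max, otherwise treated as a pseudo forward scatter with no energy loss.

      # Hole (delta) tracking in 1-D: no ray tracing to region boundaries
      # is ever needed. Cross sections are illustrative placeholders.
      import math
      import random

      def sigma_t(x):
          return 0.9 if x < 5.0 else 0.2   # total cross section [1/cm]

      SIGMA_MAX = 0.9                      # majorant over all materials

      def first_real_collision(x=0.0):
          while True:
              x += -math.log(random.random()) / SIGMA_MAX
              if random.random() < sigma_t(x) / SIGMA_MAX:
                  return x                 # accepted: a real collision
              # rejected: pseudo collision, continue straight ahead

      random.seed(1)
      depths = [first_real_collision() for _ in range(100000)]
      print(f"mean depth of first real collision: {sum(depths)/len(depths):.2f} cm")

    In the low-cross-section region the acceptance probability is only 0.2/0.9, so many pseudo collisions are serviced per real one, which is exactly the drawback for materials with widely varying cross sections noted above.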

  14. Using prediction markets to estimate the reproducibility of scientific research.

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.

  15. Validation and reproducibility of an Australian caffeine food frequency questionnaire.

    Science.gov (United States)

    Watson, E J; Kohler, M; Banks, S; Coates, A M

    2017-08-01

    The aim of this study was to measure the validity and reproducibility of a caffeine food frequency questionnaire (C-FFQ) developed for the Australian population. The C-FFQ was designed to assess average daily caffeine consumption using four categories of food and beverages: energy drinks; soft drinks/soda; coffee and tea; and chocolate (food and drink). Participants completed a seven-day food diary immediately followed by the C-FFQ on two consecutive days. The questionnaire was first piloted in 20 adults, and then a validity/reproducibility study was conducted (n = 90 adults). The C-FFQ showed moderate correlations (r = .60), fair agreement (mean difference 63 mg) and reasonable quintile rankings, indicating fair to moderate agreement with the seven-day food diary. To test reproducibility, responses from the two consecutive administrations of the C-FFQ were compared; these showed strong correlations (r = .90), good quintile rankings and strong kappa values (κ = 0.65), indicating strong reproducibility. The C-FFQ shows adequate validity and reproducibility and will aid researchers in Australia in quantifying caffeine consumption.

  16. Using prediction markets to estimate the reproducibility of scientific research

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988

  17. Reproducibility2020: Progress and priorities [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Leonard P. Freedman

    2017-05-01

    Full Text Available The preclinical research process is a cycle of idea generation, experimentation, and reporting of results. The biomedical research community relies on the reproducibility of published discoveries to create new lines of research and to translate research findings into therapeutic applications. Since 2012, when scientists from Amgen reported that they were able to reproduce only 6 of 53 "landmark" preclinical studies, the biomedical research community began discussing the scale of the reproducibility problem and developing initiatives to address critical challenges. The Global Biological Standards Institute (GBSI) released the "Case for Standards" in 2013, one of the first comprehensive reports to address the rising concern of irreproducible biomedical research. Further attention was drawn to issues that limit scientific self-correction, including reporting and publication bias, underpowered studies, lack of open access to methods and data, and lack of clearly defined standards and guidelines in areas such as reagent validation. To evaluate the progress made towards reproducibility since 2013, GBSI identified and examined initiatives designed to advance quality and reproducibility. Through this process, we identified key roles for funders, journals, researchers and other stakeholders and recommended actions for future progress. This paper describes our findings and conclusions.

  18. Trace elements and human fertility.

    OpenAIRE

    Stovell, Alex Gordon.

    1999-01-01

    Methods were developed and validated for the analysis of trace elements in human scalp hair, blood serum, ovarian follicular fluid and seminal plasma by inductively coupled plasma mass spectrometry (ICP-MS). An interlaboratory comparison was also undertaken to compare the analysis of biological materials by ICP-MS with instrumental neutron activation analysis (INAA). Preliminary trace element protein speciation experiments were carried out using size exclusion high performance liquid chromato...

  19. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex, and cannot be fully certified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  20. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to the incorrect estimation of the consequences of accidents, and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. ''agreement is within 10%''. A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)

  1. Traces Synchronization in Distributed Networks

    Directory of Open Access Journals (Sweden)

    Eric Clément

    2009-01-01

    Full Text Available This article proposes a novel approach to synchronize a posteriori the detailed execution traces from several networked computers. It can be used to debug and investigate complex performance problems in systems where several computers exchange information. When the distributed system is under study, detailed execution traces are generated locally on each system using an efficient and accurate system level tracer, LTTng. When the tracing is finished, the individual traces are collected and analysed together. The messaging events in all the traces are then identified and correlated in order to estimate the time offset over time between each node. The time offset computation imprecision, associated with asymmetric network delays and operating system latency in message sending and receiving, is amortized over a large time interval through a linear least square fit over several messages covering a large time span. The resulting accuracy is such that it is possible to estimate the clock offsets in a distributed system, even with a relatively low volume of messages exchanged, to within the order of a microsecond while having a very low impact on the system execution, which is sufficient to properly order the events traced on the individual computers in the distributed system.
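
    A toy version of the offset-estimation step (synthetic timestamps in Python, not the LTTng analysis itself; the drift, offset, and delay values are invented):

      # Estimate relative clock offset and drift between two traced nodes
      # by a linear least-squares fit over message send/receive pairs.
      import numpy as np

      rng = np.random.default_rng(0)
      t_send = np.sort(rng.uniform(0.0, 60.0, 200))   # sender clock [s]
      offset, drift = 1.5e-3, 2.0e-6                  # 1.5 ms, 2 ppm
      delay = rng.uniform(50e-6, 250e-6, t_send.size) # one-way network delay
      t_recv = t_send * (1 + drift) + offset + delay  # receiver clock

      # Fit t_recv = a*t_send + b over the whole span; per-message delay
      # jitter is amortized by the fit, though the mean one-way delay
      # still biases b (messages in both directions can bound that bias).
      A = np.vstack([t_send, np.ones_like(t_send)]).T
      (a, b), *_ = np.linalg.lstsq(A, t_recv, rcond=None)
      print(f"drift = {(a - 1)*1e6:.2f} ppm, offset = {b*1e3:.3f} ms")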

  2. Reproducible preclinical research-Is embracing variability the answer?

    Science.gov (United States)

    Karp, Natasha A

    2018-03-01

    Translational failures and replication issues of published research are undermining preclinical research and, if the outcomes are questionable, raise ethical implications over the continued use of animals. Standardization of procedures, environmental conditions, and genetic background has traditionally been proposed as the gold standard approach, as it reduces variability, thereby enhancing sensitivity and supporting reproducibility when the environment is defined precisely. An alternative view is that standardization can identify idiosyncratic effects and hence decrease reproducibility. In support of this alternative view, Voelkl and colleagues present evidence from resampling a large quantity of research data exploring a variety of treatments. They demonstrate that by implementing multi-laboratory experiments with as few as two sites, we can increase reproducibility by embracing variation without increasing the sample size.

  3. Dysplastic naevus: histological criteria and their inter-observer reproducibility.

    Science.gov (United States)

    Hastrup, N; Clemmensen, O J; Spaun, E; Søndergaard, K

    1994-06-01

    Forty melanocytic lesions were examined in a pilot study, which was followed by a final series of 100 consecutive melanocytic lesions, in order to evaluate the inter-observer reproducibility of the histological criteria proposed for the dysplastic naevus. The specimens were examined in a blind fashion by four observers. Analysis by kappa statistics showed poor reproducibility of nuclear features, while reproducibility of architectural features was acceptable, improving in the final series. Consequently, we cannot apply the combined criteria of cytological and architectural features with any confidence in the diagnosis of dysplastic naevus, and, until further studies have documented that architectural criteria alone will suffice in the diagnosis of dysplastic naevus, we, as pathologists, shall avoid this term.
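
    The kappa statistic referred to measures agreement beyond chance. A minimal sketch for two observers rating one binary feature (Python; the counts are invented for illustration):

      # Cohen's kappa from a 2x2 confusion matrix of two observers'
      # ratings: kappa = (p_observed - p_expected) / (1 - p_expected).
      import numpy as np

      def cohens_kappa(table):
          """table[i, j]: cases observer 1 rated i and observer 2 rated j."""
          table = np.asarray(table, dtype=float)
          n = table.sum()
          p_obs = np.trace(table) / n
          p_exp = (table.sum(axis=1) @ table.sum(axis=0)) / n**2
          return (p_obs - p_exp) / (1 - p_exp)

      # e.g. presence/absence of one nuclear feature in 100 lesions:
      print(f"kappa = {cohens_kappa([[40, 15], [20, 25]]):.2f}")  # 0.29, poor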

  4. Current Status of Post-combustor Trace Chemistry Modeling and Simulation at NASA Glenn Research Center

    Science.gov (United States)

    Wey, Thomas; Liu, Nan-Suey

    2003-01-01

    The overall objective of the current effort at NASA GRC is to evaluate, develop, and apply methodologies suitable for modeling intra-engine trace chemical changes over the post-combustor flow path relevant to the pollutant emissions from aircraft engines. At the present time, the focus is the high-pressure turbine environment. At first, the trace chemistry model of CNEWT was implemented into GLENN-HT as well as NCC. Then, CNEWT, CGLENN-HT, and NCC were applied to the trace species evolution in a cascade of Cambridge University's No. 2 rotor and in a turbine vane passage. In general, the results from these different codes provide similar features. However, the details of some of the quantities of interest can be sensitive to the differences among these codes. This report summarizes the implementation effort and presents the comparison of the No. 2 rotor results obtained from these different codes. The comparison of the turbine vane passage results is reported elsewhere. In addition to the implementation of the trace chemistry model into existing CFD codes, several pre/post-processing tools that can handle the manipulation of the geometry, the unstructured and structured grids, and the CFD solutions have also been enhanced and seamlessly tied with NCC, CGLENN-HT, and CNEWT. Thus, a complete CFD package consisting of pre/post-processing tools and flow solvers suitable for post-combustor intra-engine trace chemistry study is assembled.

  5. Reproducibility of clinical research in critical care: a scoping review.

    Science.gov (United States)

    Niven, Daniel J; McCormick, T Jared; Straus, Sharon E; Hemmelgarn, Brenda R; Jeffs, Lianne; Barnes, Tavish R M; Stelfox, Henry T

    2018-02-21

    The ability to reproduce experiments is a defining principle of science. Reproducibility of clinical research has received relatively little scientific attention. However, it is important as it may inform clinical practice, research agendas, and the design of future studies. We used scoping review methods to examine reproducibility within a cohort of randomized trials examining clinical critical care research and published in the top general medical and critical care journals. To identify relevant clinical practices, we searched the New England Journal of Medicine, The Lancet, and JAMA for randomized trials published up to April 2016. To identify a comprehensive set of studies for these practices, included articles informed secondary searches within other high-impact medical and specialty journals. We included late-phase randomized controlled trials examining therapeutic clinical practices in adults admitted to general medical-surgical or specialty intensive care units (ICUs). Included articles were classified using a reproducibility framework. An original study was the first to evaluate a clinical practice. A reproduction attempt re-evaluated that practice in a new set of participants. Overall, 158 practices were examined in 275 included articles. A reproduction attempt was identified for 66 practices (42%, 95% CI 33-50%). Original studies reported larger effects than reproduction attempts (primary endpoint, risk difference 16.0%, 95% CI 11.6-20.5% vs. 8.4%, 95% CI 6.0-10.8%, P = 0.003). More than half of clinical practices with a reproduction attempt demonstrated effects that were inconsistent with the original study (56%, 95% CI 42-68%), among which a large number were reported to be efficacious in the original study and to lack efficacy in the reproduction attempt (34%, 95% CI 19-52%). Two practices reported to be efficacious in the original study were found to be harmful in the reproduction attempt. A minority of critical care practices with research published

  6. Reserves of reproducibility and accuracy of spectrochemical methods of analysis

    International Nuclear Information System (INIS)

    Britske, M.Eh.; Slabodenyuk, I.V.

    1982-01-01

    Reproducibility and accuracy of analysis by absorption and emission flame spectroscopy are practically adequate when detection limits are comparable. The largest part of the error is contributed by fluctuations of the free-atom concentration in the flame torch. Instrumental error, and the part of the error contributed by random fluctuations of flame temperature, can be practically neglected. Further improvement of reproducibility can be achieved by stabilizing aerosol generation or by using the internal standard technique on two-channel spectrometers. Dispersions for both techniques are compared using indium determination as an example

  7. Reproducibility of esophageal scintigraphy using semi-solid yoghurt

    Energy Technology Data Exchange (ETDEWEB)

    Imai, Yukinori; Kinoshita, Manabu; Asakura, Yasushi; Kakinuma, Tohru; Shimoji, Katsunori; Fujiwara, Kenji; Suzuki, Kenji; Miyamae, Tatsuya [Saitama Medical School, Moroyama (Japan)

    1999-10-01

    Esophageal scintigraphy is a non-invasive method which evaluates esophageal function quantitatively. We applied a new technique using semi-solid yoghurt, which can evaluate esophageal function in a sitting position. To evaluate the reproducibility of this method, scintigraphy was performed in 16 healthy volunteers. From the results of four swallows, excluding the first, the mean coefficients of variation in esophageal transit time and esophageal emptying time were 12.8% and 13.4%, respectively (intraday variation). As regards the interday variation, this method also showed good reproducibility based on the results from 2 separate days. (author)

  8. Representativeness of laboratory sampling procedures for the analysis of trace metals in soil.

    Science.gov (United States)

    Dubé, Jean-Sébastien; Boudreault, Jean-Philippe; Bost, Régis; Sona, Mirela; Duhaime, François; Éthier, Yannic

    2015-08-01

    This study was conducted to assess the representativeness of laboratory sampling protocols for purposes of trace metal analysis in soil. Five laboratory protocols were compared, including conventional grab sampling, to assess the influence of sectorial splitting, sieving, and grinding on measured trace metal concentrations and their variability. It was concluded that grinding was the most important factor in controlling the variability of trace metal concentrations. Grinding increased the reproducibility of sample mass reduction by rotary sectorial splitting by up to two orders of magnitude. Combined with rotary sectorial splitting, grinding increased the reproducibility of trace metal concentrations by almost three orders of magnitude compared to grab sampling. Moreover, results showed that if grinding is used as part of a mass reduction protocol by sectorial splitting, the effect of sieving on reproducibility became insignificant. Gy's sampling theory and practice was also used to analyze the aforementioned sampling protocols. While the theoretical relative variances calculated for each sampling protocol qualitatively agreed with the experimental variances, their quantitative agreement was very poor. It was assumed that the parameters used in the calculation of theoretical sampling variances may not correctly estimate the constitutional heterogeneity of soils or soil-like materials. Finally, the results have highlighted the pitfalls of grab sampling, namely, the fact that it does not exert control over incorrect sampling errors and that it is strongly affected by distribution heterogeneity.
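
    The theoretical variances referred to are, in Gy's theory, usually computed from the fundamental sampling error relation (quoted here in its common textbook form, which may differ in detail from the parameterization used in the paper):

      \[
      \sigma_{\mathrm{FSE}}^{2} \;=\; c\, f\, g\, \ell\, d^{3}
      \left(\frac{1}{M_{S}}-\frac{1}{M_{L}}\right),
      \]

    where d is the top particle size, M_S and M_L are the sample and lot masses, and c, f, g, and ℓ are the mineralogical, shape, granulometric, and liberation factors. The cubic dependence on d is why grinding before splitting reduces the sampling variance so strongly.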

  9. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  10. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  11. CHEETAH: A next generation thermochemical code

    Energy Technology Data Exchange (ETDEWEB)

    Fried, L.; Souers, P.

    1994-11-01

    CHEETAH is an effort to bring the TIGER thermochemical code into the 1990s. A wide variety of improvements have been made in Version 1.0. We have improved the robustness and ease of use of TIGER. All of TIGER's solvers have been replaced by new algorithms. We find that CHEETAH solves a wider variety of problems with no user intervention (e.g. no guesses for the C-J state) than TIGER did. CHEETAH has been made simpler to use than TIGER; typical use of the code occurs with the new standard run command. CHEETAH will make the use of thermochemical codes more attractive to practical explosive formulators. We have also made an extensive effort to improve over the results of TIGER. CHEETAH's version of the BKW equation of state (BKWC) is able to accurately reproduce energies from cylinder tests; something that other BKW parameter sets have been unable to do. Calculations performed with BKWC execute very quickly; typical run times are under 10 seconds on a workstation. In the future we plan to improve the underlying science in CHEETAH. More accurate equations of state will be used in the gas and the condensed phase. A kinetics capability will be added to the code that will predict reaction zone thickness. Further ease of use features will eventually be added; an automatic formulator that adjusts concentrations to match desired properties is planned.

  12. Laser propagation code study

    OpenAIRE

    Rockower, Edward B.

    1985-01-01

    A number of laser propagation codes have been assessed as to their suitability for modeling Army High Energy Laser (HEL) weapons used in an anti-sensor mode. We identify a number of areas in which systems analysis HEL codes are deficient. Most notably, available HEL scaling law codes model the laser aperture as circular, possibly with a fixed (e.g. 10%) obscuration. However, most HELs have rectangular apertures with up to 30% obscuration. We present a beam-quality/aperture shape scaling rela...

  13. Transport theory and codes

    International Nuclear Information System (INIS)

    Clancy, B.E.

    1986-01-01

    This chapter begins with the neutron transport equation and covers one-dimensional plane geometry problems, one-dimensional spherical geometry problems, and numerical solutions. The section on the ANISN code and its look-alikes covers problems which can be solved; eigenvalue problems; the outer iteration loop; the inner iteration loop; and finite-difference solution procedures. The input and output data for ANISN are also discussed. Two-dimensional problems, such as those solved by the DOT code, are then treated. Finally, an overview of the Monte Carlo methods and codes is given

  14. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
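
    As a sketch of what stabilized linear inversion typically means in this setting (a generic Tikhonov-regularized least-squares step with a synthetic kernel, not the actual SEARCH/TREND/INVERT/AVERAGE implementation; all sizes and values are invented):

      # Tikhonov-stabilized inversion: m = (G^T G + lam I)^(-1) G^T d,
      # where G maps subsurface density cells to surface gravity data.
      import numpy as np

      rng = np.random.default_rng(2)
      n_data, n_cells = 40, 60
      G = rng.normal(size=(n_data, n_cells))      # stand-in sensitivity matrix
      m_true = np.zeros(n_cells)
      m_true[25:35] = 1.0                         # anomalous density block
      d = G @ m_true + rng.normal(scale=0.05, size=n_data)  # noisy data

      lam = 1.0                                   # stabilization parameter
      m_est = np.linalg.solve(G.T @ G + lam * np.eye(n_cells), G.T @ d)
      print(f"model misfit: {np.linalg.norm(m_est - m_true):.2f}")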

  15. Decoding the productivity code

    DEFF Research Database (Denmark)

    Hansen, David

    … that is, the productivity code of the 21st century, is dissolved. Today, organizations are pressured for operational efficiency, often in terms of productivity, due to increased global competition, demographic changes, and use of natural resources. Taylor's principles for rationalization founded … that swing between rationalization and employee development. The productivity code is the lack of alternatives to this ineffective approach. This thesis decodes the productivity code based on the results from a 3-year action research study at a medium-sized manufacturing facility. During the project period…

  16. CALIPSOS code report

    International Nuclear Information System (INIS)

    Fanselau, R.W.; Thakkar, J.G.; Hiestand, J.W.; Cassell, D.S.

    1980-04-01

    CALIPSOS is a steady-state three-dimensional flow distribution code which predicts the fluid dynamics and heat transfer interactions of the secondary two-phase flow in a steam generator. The mathematical formulation is sufficiently general to accommodate two fluid models described by separate gas and liquid momentum equations. However, if the user selects the homogeneous flow option, the code automatically equates the gas and liquid phase velocities (thereby reducing the number of momentum equations solved to three) and utilizes a homogeneous density mixture. This report presents the basic features of the CALIPSOS code and includes assumptions, equations solved, the finite-difference grid, and highlights of the solution procedure

  17. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption, essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real-world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  18. TRACK The New Beam Dynamics Code

    CERN Document Server

    Mustapha, Brahim; Ostroumov, Peter; Schnirman-Lessner, Eliane

    2005-01-01

    The new ray-tracing code TRACK was developed* to fulfill the special requirements of the RIA accelerator systems. The RIA lattice includes an ECR ion source, a LEBT containing a MHB and a RFQ followed by three SC linac sections separated by two stripping stations with appropriate magnetic transport systems. No available beam dynamics code meets all the necessary requirements for an end-to-end simulation of the RIA driver linac. The latest version of TRACK was used for end-to-end simulations of the RIA driver including errors and beam loss analysis.** In addition to the standard capabilities, the code includes the following new features: i) multiple charge states; ii) a realistic stripper model; iii) static and dynamic errors; iv) automatic steering to correct for misalignments; v) detailed beam-loss analysis; vi) parallel computing to perform large-scale simulations. Although primarily developed for simulations of the RIA machine, TRACK is a general beam dynamics code. Currently it is being used for the design and ...

  19. Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry

    Science.gov (United States)

    Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z ≤ 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity to an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.

  20. Language actively reproduces the socio-economic inequalities in ...

    African Journals Online (AJOL)

    Our relations in society as men and women are determined and expressed by the language we speak. However, language does not passively reflect society but rather actively reproduces the inequalities in society. Language is a cognitive process involving the production and understanding of linguistic communication as ...

  1. Intercenter reproducibility of binary typing for Staphylococcus aureus

    NARCIS (Netherlands)

    van Leeuwen, Willem B.; Snoeijers, Sandor; van der Werken-Libregts, Christel; Tuip, Anita; van der Zee, Anneke; Egberink, Diane; de Proost, Monique; Bik, Elisabeth; Lunter, Bjorn; Kluytmans, Jan; Gits, Etty; van Duyn, Inge; Heck, Max; van der Zwaluw, Kim; Wannet, Wim; Noordhoek, Gerda T.; Mulder, Sije; Renders, Nicole; Boers, Miranda; Zaat, Sebastiaan; van der Riet, Daniëlle; Kooistra, Mirjam; Talens, Adriaan; Dijkshoorn, Lenie; van der Reyden, Tanny; Veenendaal, Dick; Bakker, Nancy; Cookson, Barry; Lynch, Alisson; Witte, Wolfgang; Cuny, Christa; Blanc, Dominique; Vernez, Isabelle; Hryniewicz, Waleria; Fiett, Janusz; Struelens, Marc; Deplano, Ariane; Landegent, Jim; Verbrugh, Henri A.; van Belkum, Alex

    2002-01-01

    The reproducibility of the binary typing (BT) protocol developed for epidemiological typing of Staphylococcus aureus was analyzed in a biphasic multicenter study. In a Dutch multicenter pilot study, 10 genetically unique isolates of methicillin-resistant S. aureus (MRSA) were characterized by the BT

  2. Latin America Today: An Atlas of Reproducible Pages. Revised Edition.

    Science.gov (United States)

    World Eagle, Inc., Wellesley, MA.

    This document contains reproducible maps, charts and graphs of Latin America for use by teachers and students. The maps are divided into five categories (1) the land; (2) peoples, countries, cities, and governments; (3) the national economies, product, trade, agriculture, and resources; (4) energy, education, employment, illicit drugs, consumer…

  3. Reproducible cavitation activity in water-particle suspensions

    NARCIS (Netherlands)

    Borkent, B.M.; Arora, M.; Ohl, C.D.

    2007-01-01

    The study of cavitation inception in liquids rarely yields reproducible data, unless special control is taken on the cleanliness of the experimental environment. In this paper, an experimental technique is demonstrated which allows repeatable measurements of cavitation activity in liquid-particle

  4. Reproducibility of BOLD signal change induced by breath holding.

    Science.gov (United States)

    Magon, Stefano; Basso, Gianpaolo; Farace, Paolo; Ricciardi, Giuseppe Kenneth; Beltramello, Alberto; Sbarbati, Andrea

    2009-04-15

    Blood oxygen level dependent (BOLD) contrast is influenced by some physiological factors such as blood flow and blood volume that can be a source of variability in fMRI analysis. Previous studies proposed to use the cerebrovascular response data to normalize or calibrate BOLD maps in order to reduce variability of fMRI data both among brain areas in single subject analysis and across subjects. Breath holding is one of the most widely used methods to investigate vascular reactivity. However, little is known about the robustness and reproducibility of this procedure. In this study we investigated three different breath holding periods. Subjects were asked to hold their breath for 9, 15 or 21 s in three separate runs and the fMRI protocol was repeated after 15 to 20 days. Our data show that the BOLD response to breath holding after inspiration results in a complex shape due to physiological factors that influence the signal variation with a timing that is highly reproducible. Nevertheless, the reproducibility of the magnitude of the cerebrovascular response to CO2, expressed as amplitude of BOLD signal and number of responding voxels, strongly depends on the duration of breath holding periods. A breath holding period of 9 s results in high variability of the magnitude of the response, while longer breath holding durations produce more robust and reproducible BOLD responses.

  5. Reproducible and expedient rice regeneration system using in vitro ...

    African Journals Online (AJOL)

    An inevitable prerequisite for expedient regeneration in rice is the selection of a totipotent explant and the development of an apposite combination of growth hormones. Here, we report a reproducible regeneration protocol in which basal segments of the stem of in vitro grown rice plants were used as the explant. Using the protocol ...

  6. Reproducible positioning in chest X-ray radiography

    International Nuclear Information System (INIS)

    1974-01-01

    A device is described that can be used to ensure reproducibility in the positioning of the patient during X-ray radiography of the thorax. Signals are taken from an electrocardiographic monitor and from a device recording the respiratory cycle. Radiography is performed only when two preselected signals coincide
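
    The device logic reduces to a coincidence test on the two physiological signals; a schematic sketch (hypothetical phase conventions and window values, not the original electronics):

        # Expose only when both physiological signals are simultaneously
        # inside their preselected windows (phases normalized to [0, 1]).
        def should_expose(ecg_phase: float, resp_phase: float,
                          ecg_window=(0.40, 0.45), resp_window=(0.90, 1.00)) -> bool:
            in_ecg = ecg_window[0] <= ecg_phase <= ecg_window[1]
            in_resp = resp_window[0] <= resp_phase <= resp_window[1]
            return in_ecg and in_resp

        # poll the monitors and trigger radiography once both signals coincide
        print(should_expose(0.42, 0.95))  # True  -> expose
        print(should_expose(0.42, 0.50))  # False -> wait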

  7. ReproPhylo: An Environment for Reproducible Phylogenomics.

    Directory of Open Access Journals (Sweden)

    Amir Szitenberg

    2015-09-01

    Full Text Available The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built into the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with its Git repository, constitutes the primary reproducibility output of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform-independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution.
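
    Not the ReproPhylo API, but a minimal sketch of the 'single serialized Python object, versioned with Git' pattern the abstract describes (hypothetical field names; assumes it runs inside an existing Git repository):

        import pickle, subprocess

        # Serialize the whole experiment state to one file and let Git version it.
        experiment = {
            "alignment": "data/loci.fasta",        # hypothetical input path
            "tool_versions": {"mafft": "7.0"},     # provenance/environment info
            "parameters": {"bootstrap": 100},
        }

        with open("experiment.pkl", "wb") as fh:
            pickle.dump(experiment, fh)

        subprocess.run(["git", "add", "experiment.pkl"], check=True)
        subprocess.run(["git", "commit", "-m", "snapshot analysis state"], check=True)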

  8. Reproducing (and Disrupting) Heteronormativity: Gendered Sexual Socialization in Preschool Classrooms

    Science.gov (United States)

    Gansen, Heidi M.

    2017-01-01

    Using ethnographic data from 10 months of observations in nine preschool classrooms, I examine gendered sexual socialization children receive from teachers' practices and reproduce through peer interactions. I find heteronormativity permeates preschool classrooms, where teachers construct (and occasionally disrupt) gendered sexuality in a number…

  9. Reproducibility of the Pleth Variability Index in premature infants

    NARCIS (Netherlands)

    Den Boogert, W.J. (Wilhelmina J.); H.A. van Elteren (Hugo); T.G. Goos (Tom); I.K.M. Reiss (Irwin); R.C.J. de Jonge (Rogier); V.J. van den Berg (Victor J.)

    2017-01-01

    The aim was to assess the reproducibility of the Pleth Variability Index (PVI), developed for non-invasive monitoring of peripheral perfusion, in preterm neonates below 32 weeks of gestational age. Three PVI measurements were consecutively performed in stable, comfortable preterm

  10. Reproducibility of the Pleth Variability Index in premature infants

    NARCIS (Netherlands)

    Den Boogert, Wilhelmina J.; Van Elteren, Hugo A.; Goos, T.G.; Reiss, Irwin K.M.; De Jonge, Rogier C.J.; van Den Berg, Victor J.

    2017-01-01

    The aim was to assess the reproducibility of the Pleth Variability Index (PVI), developed for non-invasive monitoring of peripheral perfusion, in preterm neonates below 32 weeks of gestational age. Three PVI measurements were consecutively performed in stable, comfortable preterm neonates in the

  11. Exploring the Coming Repositories of Reproducible Experiments: Challenges and Opportunities

    DEFF Research Database (Denmark)

    Freire, Juliana; Bonnet, Philippe; Shasha, Dennis

    2011-01-01

    Computational reproducibility efforts in many communities will soon give rise to validated software and data repositories of high quality. A scientist in a field may want to query the components of such repositories to build new software workflows, perhaps after adding the scientist's own algorithm. This paper explores the research challenges that must be met to achieve this goal.

  12. Composting in small laboratory pilots: Performance and reproducibility

    International Nuclear Information System (INIS)

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-01-01

    Highlights: ► We design an innovative small-scale composting device including six 4-l reactors. ► We investigate the performance and reproducibility of composting on a small scale. ► Thermophilic conditions are established by self-heating in all replicates. ► Biochemical transformations, organic matter losses and stabilisation are realistic. ► The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O2 consumption and CO2 emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or on-site experiments, excluding the lignin degradation, which was less important than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures.

  13. Reproducibility of abdominal fat assessment by ultrasound and computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Mauad, Fernando Marum; Chagas-Neto, Francisco Abaete; Benedeti, Augusto Cesar Garcia Saab; Nogueira-Barbosa, Marcello Henrique; Muglia, Valdair Francisco; Carneiro, Antonio Adilton Oliveira; Muller, Enrico Mattana; Elias Junior, Jorge, E-mail: fernando@fatesa.edu.br [Faculdade de Tecnologia em Saude (FATESA), Ribeirao Preto, SP (Brazil); Universidade de Fortaleza (UNIFOR), Fortaleza, CE (Brazil). Departmento de Radiologia; Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Medicina. Departmento de Medicina Clinica; Universidade de Sao Paulo (FFCLRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Filosofia, Ciencias e Letras; Hospital Mae de Deus, Porto Alegre, RS (Brazil)

    2017-05-15

    Objective: To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat in correlation with the anthropometric, clinical, and biochemical assessments. Materials and Methods: Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects, of whom 39 (38.6%) were men and 62 (61.4%) were women, with a mean age of 66.3 years (range, 60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Results: Intra-abdominal thickness was the variable for which the correlation with the areas of abdominal fat was strongest (i.e., the correlation coefficient was highest). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, having higher intraobserver and interobserver reliability than had the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. Conclusion: In the assessment of abdominal fat, the intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility. (author)

  14. Reproducibility of corneal, macular and retinal nerve fiber layer ...

    African Journals Online (AJOL)

    Purpose: To determine the intra-session and inter-session reproducibility of corneal, macular and retinal nerve fiber layer thickness (RNFL) measurements with the iVue-100 optical coherence tomography in normal eyes. Methods: These parameters were measured in the right eyes of 50 healthy participants with normal ...

  15. Reproducibility of corneal, macular and retinal nerve fiber layer ...

    African Journals Online (AJOL)

    in determining the utility of a device used for clinical and research purposes.2 The aim of this study was therefore to determine the reproducibility of corneal, macular and RNFL thickness measurements in normal eyes using the iVue-100 SD-OCT. Subjects and methods: The study was approved by the University of KwaZulu-Natal ...

  16. The reproducibility of the Canadian Occupational Performance Measure

    NARCIS (Netherlands)

    Eyssen, Isaline C J M; Beelen, A.; Dedding, C; Cardol, M.; Dekker, J

    2005-01-01

    OBJECTIVE: To assess the reproducibility (reliability and inter-rater agreement) of the client-centred Canadian Occupational Performance Measure (COPM). DESIGN: The COPM was administered twice, with a mean interval of seven days (SD 1.6, range 4-14), by two different occupational therapists. Data

  17. The reproducibility of the Canadian occupational performance measure.

    NARCIS (Netherlands)

    Eyssen, I.C.J.M.; Beelen, A.; Dedding, C.; Cardol, M.; Dekker, J.

    2005-01-01

    OBJECTIVE: To assess the reproducibility (reliability and inter-rater agreement) of the client-centred Canadian Occupational Performance Measure (COPM). Design: The COPM was administered twice, with a mean interval of seven days (SD 1.6, range 4-14), by two different occupational therapists. Data

  18. The reproducibility of the Canadian Occupational Performance Measure

    NARCIS (Netherlands)

    Eyssen, I.C.; Beelen, A.; Dedding, C.; Cardol, M.; Dekker, J.

    2005-01-01

    Objective: To assess the reproducibility (reliability and inter-rater agreement) of the client-centred Canadian Occupational Performance Measure (COPM). Design: The COPM was administered twice, with a mean interval of seven days (SD 1.6, range 4-14), by two different occupational therapists. Data

  19. The reproducibility of the Canadian Occupational Performance Measure

    NARCIS (Netherlands)

    Eyssen, I. C. J. M.; Beelen, A.; Dedding, C.; Cardol, M.; Dekker, J.

    2005-01-01

    To assess the reproducibility (reliability and inter-rater agreement) of the client-centred Canadian Occupational Performance Measure (COPM). The COPM was administered twice, with a mean interval of seven days (SD 1.6, range 4-14), by two different occupational therapists. Data analysis was based on

  20. The reproducibility of the Canadian Occupational Performance Measure

    NARCIS (Netherlands)

    Eyssen, I.C.; Beelen, A.; Dedding, C.; Cardol, M.; Dekker, J.

    2005-01-01

    OBJECTIVE: To assess the reproducibility (reliability and inter-rater agreement) of the client-centred Canadian Occupational Performance Measure (COPM). Design: The COPM was administered twice, with a mean interval of seven days (SD 1.6, range 4-14), by two different occupational therapists. Data

  1. Multi-Parametric Neuroimaging Reproducibility: A 3T Resource Study

    Science.gov (United States)

    Landman, Bennett A.; Huang, Alan J.; Gifford, Aliya; Vikram, Deepti S.; Lim, Issel Anne L.; Farrell, Jonathan A.D.; Bogovic, John A.; Hua, Jun; Chen, Min; Jarso, Samson; Smith, Seth A.; Joel, Suresh; Mori, Susumu; Pekar, James J.; Barker, Peter B.; Prince, Jerry L.; van Zijl, Peter C.M.

    2010-01-01

    Modern MRI image processing methods have yielded quantitative, morphometric, functional, and structural assessments of the human brain. These analyses typically exploit carefully optimized protocols for specific imaging targets. Algorithm investigators have several excellent public data resources to use to test, develop, and optimize their methods. Recently, there has been an increasing focus on combining MRI protocols in multi-parametric studies. Notably, these have included innovative approaches for fusing connectivity inferences with functional and/or anatomical characterizations. Yet, validation of the reproducibility of these interesting and novel methods has been severely hampered by the limited availability of appropriate multi-parametric data. We present an imaging protocol optimized to include state-of-the-art assessment of brain function, structure, micro-architecture, and quantitative parameters within a clinically feasible 60 minute protocol on a 3T MRI scanner. We present scan-rescan reproducibility of these imaging contrasts based on 21 healthy volunteers (11 M/10 F, 22–61 y/o). The cortical gray matter, cortical white matter, ventricular cerebrospinal fluid, thalamus, putamen, caudate, cerebellar gray matter, cerebellar white matter, and brainstem were identified with mean volume-wise reproducibility of 3.5%. We tabulate the mean intensity, variability and reproducibility of each contrast in a region of interest approach, which is essential for prospective study planning and retrospective power analysis considerations. Anatomy was highly consistent on structural acquisition (~1–5% variability), while variation on diffusion and several other quantitative scans was higher ... PMID:21094686

  2. Reproducibility in the assessment of acute pancreatitis with computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Freire Filho, Edison de Oliveira; Vieira, Renata La Rocca; Yamada, Andre Fukunishi; Shigueoka, David Carlos; Bekhor, Daniel; Freire, Maxime Figueiredo de Oliveira; Ajzen, Sergio; D' Ippolito, Giuseppe [Universidade Federal de Sao Paulo (UNIFESP/EPM), SP (Brazil). Dept. of Imaging Diagnosis]. E-mail: eofilho@ig.com.br; eoffilho@uol.com.br

    2007-11-15

    Objective: To evaluate the reproducibility of unenhanced and contrast-enhanced computed tomography in the assessment of patients with acute pancreatitis. Materials and methods: Fifty-one unenhanced and contrast-enhanced abdominal computed tomography studies of patients with acute pancreatitis were blindly reviewed by two radiologists (observers 1 and 2). The morphological index was separately calculated for unenhanced and contrast-enhanced computed tomography and the disease severity index was established. Intraobserver and interobserver reproducibility of computed tomography was measured by means of the kappa index (κ). Results: Interobserver agreement was κ = 0.666, 0.705, 0.648, 0.547 and 0.631, respectively, for unenhanced and contrast-enhanced morphological index, presence of pancreatic necrosis, pancreatic necrosis extension, and disease severity index. Intraobserver agreement (observers 1 and 2, respectively) was κ = 0.796 and 0.732 for unenhanced morphological index; κ = 0.725 and 0.802 for contrast-enhanced morphological index; κ = 0.674 and 0.849 for presence of pancreatic necrosis; κ = 0.606 and 0.770 for pancreatic necrosis extension; and κ = 0.801 and 0.687 for disease severity index at computed tomography. Conclusion: Computed tomography for determination of morphological index and disease severity index in the staging of acute pancreatitis is a quite reproducible method. The absence of contrast enhancement does not affect the computed tomography morphological index reproducibility. (author)
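
    The κ values above are Cohen's kappa statistics; for reference, a minimal two-rater implementation (toy gradings, not the study data):

        from collections import Counter

        # Cohen's kappa: chance-corrected agreement between two raters.
        def cohens_kappa(rater1, rater2):
            n = len(rater1)
            observed = sum(a == b for a, b in zip(rater1, rater2)) / n
            freq1, freq2 = Counter(rater1), Counter(rater2)
            expected = sum(freq1[c] * freq2[c] for c in freq1) / n**2
            return (observed - expected) / (1 - expected)

        # toy severity gradings from two readers
        r1 = ["mild", "severe", "mild", "moderate", "severe", "mild"]
        r2 = ["mild", "severe", "moderate", "moderate", "severe", "mild"]
        print(round(cohens_kappa(r1, r2), 3))   # 0.75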

  3. Reproducibility of Tactile Assessments for Children with Unilateral Cerebral Palsy

    Science.gov (United States)

    Auld, Megan Louise; Ware, Robert S.; Boyd, Roslyn Nancy; Moseley, G. Lorimer; Johnston, Leanne Marie

    2012-01-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were…

  4. Reproducibility and Reliability of Repeated Quantitative Fluorescence Angiography

    DEFF Research Database (Denmark)

    Nerup, Nikolaj; Knudsen, Kristine Bach Korsholm; Ambrus, Rikard

    2017-01-01

    that the camera can detect. As the emission of fluorescence is dependent on the excitatory light intensity, reduction of this may solve the problem. The aim of the present study was to investigate the reproducibility and reliability of repeated quantitative FA during a reduction of excitatory light...

  5. Annotating with Propp's Morphology of the Folktale: Reproducibility and Trainability

    NARCIS (Netherlands)

    Fisseni, B.; Kurji, A.; Löwe, B.

    2014-01-01

    We continue the study of the reproducibility of Propp’s annotations from Bod et al. (2012). We present four experiments in which test subjects were taught Propp’s annotation system; we conclude that Propp’s system needs a significant amount of training, but that with sufficient time investment, it

  6. Statecraft and Study Abroad: Imagining, Narrating and Reproducing the State

    Science.gov (United States)

    Lansing, Jade; Farnum, Rebecca L.

    2017-01-01

    Study abroad in higher education is on the rise, marketed as an effective way to produce global citizens and undermine international boundaries. In practice, however, programmes frequently reify rather than challenge states: participants "study Morocco" rather than "exploring Marrakech." This framing reproduces real and…

  7. Reproducibility of abdominal fat assessment by ultrasound and computed tomography

    Directory of Open Access Journals (Sweden)

    Fernando Marum Mauad

    Full Text Available Abstract Objective: To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat in correlation with the anthropometric, clinical, and biochemical assessments. Materials and Methods: Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects, of whom 39 (38.6%) were men and 62 (61.4%) were women, with a mean age of 66.3 years (60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Results: Intra-abdominal thickness was the variable for which the correlation with the areas of abdominal fat was strongest (i.e., the correlation coefficient was highest). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, having higher intraobserver and interobserver reliability than had the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. Conclusion: In the assessment of abdominal fat, the intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility.

  8. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code, which is capable of determining structural loads on a flexible, teetering, horizontal-axis wind turbine, is described, and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included, as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)
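
    One of the comparison forms listed above, power spectral density, is straightforward to compute; a sketch on a synthetic load signal (assumed sample rate, not ESI-80 data):

        import numpy as np
        from scipy.signal import welch

        fs = 50.0                               # sample rate, Hz (assumed)
        t = np.arange(0, 60, 1 / fs)
        # toy bending load: a 1 Hz rotational component plus broadband noise
        load = np.sin(2 * np.pi * 1.0 * t) + 0.3 * np.random.randn(t.size)
        f, psd = welch(load, fs=fs, nperseg=1024)
        print(f[np.argmax(psd)])                # dominant frequency, ~1 Hz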

  9. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase the number of dimensions seen by the network using a linear mapping. Receivers can trade off computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof.
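
    A minimal sketch of the expansion idea (not the full Fulcrum framework): a GF(2^h) symbol is an h-bit vector, and addition in GF(2^h) is exactly bitwise XOR, which is why intermediate nodes can recode expanded packets using only GF(2) operations; the high-field multiplications remain at the sources and high-end receivers.

        h = 4
        a, b = 0b1011, 0b0110            # two GF(16) symbols as 4-bit vectors

        sum_high = a ^ b                 # addition in GF(16)
        sum_low = [(a >> i & 1) ^ (b >> i & 1) for i in range(h)]  # per-bit GF(2)

        # the high-field sum and the bitwise GF(2) sums are the same vector
        assert sum_high == sum(bit << i for i, bit in enumerate(sum_low))
        print(bin(sum_high))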

  10. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
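
    The levelization rule described here, that dependencies form a DAG and each package uses only lower-level packages, can be checked mechanically; a sketch with hypothetical package names:

        from graphlib import TopologicalSorter

        # package -> set of packages it uses; static_order() raises
        # CycleError if the relationships are not a DAG
        deps = {
            "utils": set(),
            "mesh": {"utils"},
            "physics": {"utils", "mesh"},
            "driver": {"physics"},
        }
        order = list(TopologicalSorter(deps).static_order())

        # assign each package the lowest level consistent with its dependencies
        level = {}
        for pkg in order:
            level[pkg] = max((level[d] + 1 for d in deps[pkg]), default=0)
        print(level)   # {'utils': 0, 'mesh': 1, 'physics': 2, 'driver': 3}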

  11. Coded Random Access

    DEFF Research Database (Denmark)

    Paolini, Enrico; Stefanovic, Cedomir; Liva, Gianluigi

    2015-01-01

    The rise of machine-to-machine communications has rekindled the interest in random access protocols as a support for a massive number of uncoordinatedly transmitting devices. The legacy ALOHA approach is developed under a collision model, where slots containing collided packets are considered as waste. However, if the common receiver (e.g., base station) is capable of storing the collision slots and using them in a transmission recovery process based on successive interference cancellation, the design space for access protocols is radically expanded. We present the paradigm of coded random access, in which the structure of the access protocol can be mapped to a structure of an erasure-correcting code defined on a graph. This opens the possibility to use coding theory and tools for designing efficient random access protocols, offering markedly better performance than ALOHA. Several instances of coded ...
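
    A toy simulation of the graph-code view (illustrative parameters): users repeat their packet in a few random slots, and the receiver peels collisions exactly like an iterative erasure decoder.

        import random

        def simulate(n_users=50, n_slots=75, repetitions=2, seed=1):
            rng = random.Random(seed)
            slots = [set() for _ in range(n_slots)]
            for user in range(n_users):
                for s in rng.sample(range(n_slots), repetitions):
                    slots[s].add(user)
            decoded = set()
            progress = True
            while progress:                        # successive interference cancellation
                progress = False
                for s in slots:
                    pending = s - decoded
                    if len(pending) == 1:          # singleton slot: decode it,
                        decoded |= pending         # then cancel its other replicas
                        progress = True
            return len(decoded) / n_users

        print(simulate())                          # fraction of users resolved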

  12. VT ZIP Code Areas

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  13. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs
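
    The described workflow, evaluating total system cost over a grid of the two design choices and picking an optimum, looks schematically like this (hypothetical cost model, not the actual code's):

        import itertools

        # toy cost model: faster rise times raise pulsed-power cost,
        # larger core aspect ratios raise core-material cost
        def system_cost(rise_time_ns: float, core_aspect_ratio: float) -> float:
            pulsed_power = 500.0 / rise_time_ns
            core_material = 20.0 * core_aspect_ratio**2
            return pulsed_power + core_material

        grid = itertools.product([10, 20, 40, 80], [0.5, 1.0, 1.5, 2.0])
        best = min(grid, key=lambda p: system_cost(*p))
        print(best, system_cost(*best))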

  14. A Theory of Network Tracing

    Science.gov (United States)

    Acharya, Hrishikesh B.; Gouda, Mohamed G.

    Traceroute is a widely used program for computing the topology of any network in the Internet. Using Traceroute, one starts from a node and chooses any other node in the network. Traceroute obtains the sequence of nodes that occur between these two nodes, as specified by the routing tables in these nodes. Each use of Traceroute in a network produces a trace of nodes that constitute a simple path in this network. In every trace that is produced by Traceroute, each node occurs either by its unique identifier, or by the anonymous identifier "*". In this paper, we introduce the first theory aimed at answering the following important question. Is there an algorithm to compute the topology of a network N from a trace set T that is produced by using Traceroute in network N, assuming that each edge in N occurs in at least one trace in T, and that each node in N occurs by its unique identifier in at least one trace in T? We prove that the answer to this question is "No" if N is an even ring or a general network. However, it is "Yes" if N is a tree or an odd ring. The answer is also "No" if N is mostly-regular, but "Yes" if N is a mostly-regular even ring.
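
    If every node were named in every trace, the topology would simply be the union of consecutive-pair edges, as in the sketch below; the difficulty the paper analyzes comes from nodes appearing as the anonymous "*" in some traces.

        # toy trace set: every node named, every edge covered
        traces = [["a", "b", "c"], ["c", "d"], ["b", "d", "a"]]

        edges = set()
        for trace in traces:
            for u, v in zip(trace, trace[1:]):   # consecutive nodes share an edge
                edges.add(frozenset((u, v)))
        print(sorted(tuple(sorted(e)) for e in edges))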

  15. CERN Analysis Preservation: A Novel Digital Library Service to Enable Reusable and Reproducible Research

    CERN Document Server

    AUTHOR|(CDS)2079501; Chen, Xiaoli; Dani, Anxhela; Dasler, Robin Lynnette; Delgado Fernandez, Javier; Fokianos, Pamfilos; Herterich, Patricia Sigrid; Simko, Tibor

    2016-01-01

    The latest policy developments require immediate action for data preservation, as well as reproducible and Open Science. To address this, an unprecedented digital library service is presented to enable the High-Energy Physics community to preserve and share their research objects (such as data, code, documentation, notes) throughout their research process. While facing the challenges of a “big data” community, the internal service builds on existing internal databases to make the process as easy and intrinsic as possible for researchers. Given the “work in progress” nature of the objects preserved, versioning is supported. It is expected that the service will not only facilitate better preservation techniques in the community, but will foremost make collaborative research easier as detailed metadata and novel retrieval functionality provide better access to ongoing works. This new type of e-infrastructure, fully integrated into the research workflow, could help in fostering Open Science practices acro...

  16. Dugong: a Docker image, based on Ubuntu Linux, focused on reproducibility and replicability for bioinformatics analyses.

    Science.gov (United States)

    Menegidio, Fabiano B; Jabes, Daniela L; Costa de Oliveira, Regina; Nunes, Luiz R

    2018-02-01

    This manuscript introduces and describes Dugong, a Docker image based on Ubuntu 16.04, which automates installation of more than 3500 bioinformatics tools (along with their respective libraries and dependencies), in alternative computational environments. The software operates through a user-friendly XFCE4 graphic interface that allows software management and installation by users not fully familiarized with the Linux command line and provides the Jupyter Notebook to assist in the delivery and exchange of consistent and reproducible protocols and results across laboratories, assisting in the development of open science projects. Source code and instructions for local installation are available at https://github.com/DugongBioinformatics, under the MIT open source license. Luiz.nunes@ufabc.edu.br. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  17. Code de conduite

    International Development Research Centre (IDRC) Digital Library (Canada)

    irocca

    respect for such standards. In doing so, we contribute to the good reputation and integrity of the Centre and act in keeping with the Government of Canada's Values and Ethics Code for the Public Sector. I invite you to familiarize yourself with this new version of the Code of Conduct and to apply its principles ...

  18. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  19. Aphasia for Morse code.

    Science.gov (United States)

    Wyler, A R; Ray, M W

    1986-03-01

    The ability to communicate by Morse code at high speed has, to our knowledge, not been localized within the cerebral cortex, but might be suspected as residing within the left (dominant) hemisphere. We report a case of a 54-year-old male who suffered a left temporal tip intracerebral hematoma and who temporarily lost his ability to communicate in Morse code, but who was minimally aphasic.

  20. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  1. Exploring the reproducibility of functional connectivity alterations in Parkinson's disease.

    Science.gov (United States)

    Badea, Liviu; Onu, Mihaela; Wu, Tao; Roceanu, Adina; Bajenaru, Ovidiu

    2017-01-01

    Since anatomic MRI is presently not able to directly discern neuronal loss in Parkinson's Disease (PD), studying the associated functional connectivity (FC) changes seems a promising approach toward developing non-invasive and non-radioactive neuroimaging markers for this disease. While several groups have reported such FC changes in PD, there are also significant discrepancies between studies. Investigating the reproducibility of PD-related FC changes on independent datasets is therefore of crucial importance. We acquired resting-state fMRI scans for 43 subjects (27 patients and 16 normal controls, with 2 replicate scans per subject) and compared the observed FC changes with those obtained in two independent datasets, one made available by the PPMI consortium (91 patients, 18 controls) and a second one by the group of Tao Wu (20 patients, 20 controls). Unfortunately, PD-related functional connectivity changes turned out to be non-reproducible across datasets. This could be due to disease heterogeneity, but also to technical differences. To distinguish between the two, we devised a method to directly check for disease heterogeneity using random splits of a single dataset. Since we still observe non-reproducibility in a large fraction of random splits of the same dataset, we conclude that functional heterogeneity may be a dominating factor behind the lack of reproducibility of FC alterations in different rs-fMRI studies of PD. While global PD-related functional connectivity changes were non-reproducible across datasets, we identified a few individual brain region pairs with marginally consistent FC changes across all three datasets. However, training classifiers on each one of the three datasets to discriminate PD scans from controls produced only low accuracies on the remaining two test datasets. Moreover, classifiers trained and tested on random splits of the same dataset (which are technically homogeneous) also had low test accuracies, directly substantiating
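
    The cross-dataset test described above has this schematic form (synthetic features stand in for the fMRI connectivity data):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import accuracy_score

        # train a patient-vs-control classifier on one dataset and
        # test it on an independent one
        rng = np.random.default_rng(0)
        X_a, y_a = rng.normal(size=(43, 20)), rng.integers(0, 2, 43)   # "dataset A"
        X_b, y_b = rng.normal(size=(40, 20)), rng.integers(0, 2, 40)   # "dataset B"

        clf = LogisticRegression(max_iter=1000).fit(X_a, y_a)
        print(accuracy_score(y_b, clf.predict(X_b)))   # chance level on random data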

  2. Personalized, Shareable Geoscience Dataspaces For Simplifying Data Management and Improving Reproducibility

    Science.gov (United States)

    Malik, T.; Foster, I.; Goodall, J. L.; Peckham, S. D.; Baker, J. B. H.; Gurnis, M.

    2015-12-01

    Research activities are iterative, collaborative, and now data- and compute-intensive. Such research activities mean that even the many researchers who work in small laboratories must often create, acquire, manage, and manipulate much diverse data and keep track of complex software. They face difficult data and software management challenges, and data sharing and reproducibility are neglected. There is significant federal investment in powerful cyberinfrastructure, in part to lessen the burden associated with modern data- and compute-intensive research. Similarly, geoscience communities are establishing research repositories to facilitate data preservation. Yet we observe a large fraction of the geoscience community continues to struggle with data and software management. The reason, studies suggest, is not lack of awareness but rather that tools do not adequately support time-consuming data life cycle activities. Through the NSF/EarthCube-funded GeoDataspace project, we are building personalized, shareable dataspaces that help scientists connect their individual or research group efforts with the community at large. The dataspaces provide a light-weight multiplatform research data management system with tools for recording research activities in what we call geounits, so that a geoscientist can at any time snapshot and preserve, both for their own use and to share with the community, all data and code required to understand and reproduce a study. A software-as-a-service (SaaS) deployment model enhances usability of core components, and integration with widely used software systems. In this talk we will present the open-source GeoDataspace project and demonstrate how it is enabling reproducibility across geoscience domains of hydrology, space science, and modeling toolkits.

  3. SUSHI: an exquisite recipe for fully documented, reproducible and reusable NGS data analysis.

    Science.gov (United States)

    Hatakeyama, Masaomi; Opitz, Lennart; Russo, Giancarlo; Qi, Weihong; Schlapbach, Ralph; Rehrauer, Hubert

    2016-06-02

    Next generation sequencing (NGS) produces massive datasets consisting of billions of reads and up to thousands of samples. Subsequent bioinformatic analysis is typically done with the help of open source tools, where each application performs a single step towards the final result. This situation leaves bioinformaticians with the tasks of combining the tools, managing the data files and meta-information, documenting the analysis, and ensuring reproducibility. We present SUSHI, an agile data analysis framework that relieves bioinformaticians from the administrative challenges of their data analysis. SUSHI lets users build reproducible data analysis workflows from individual applications and manages the input data, the parameters, meta-information with user-driven semantics, and the job scripts. As distinguishing features, SUSHI provides an expert command line interface as well as a convenient web interface to run bioinformatics tools. SUSHI datasets are self-contained and self-documented on the file system. This makes them fully reproducible and ready to be shared. With the associated meta-information being formatted as plain text tables, the datasets can be readily further analyzed and interpreted outside SUSHI. SUSHI provides an exquisite recipe for analysing NGS data. By following the SUSHI recipe, SUSHI makes data analysis straightforward and takes care of documentation and administration tasks. Thus, the user can fully dedicate his time to the analysis itself. SUSHI is suitable for use by bioinformaticians as well as life science researchers. It is targeted for, but by no means constrained to, NGS data analysis. Our SUSHI instance is in productive use and has served as the data analysis interface for more than 1000 data analysis projects. SUSHI source code as well as a demo server are freely available.

  4. SU-E-J-227: Breathing Pattern Consistency and Reproducibility: Comparative Analysis for Supine and Prone Body Positioning

    International Nuclear Information System (INIS)

    Laugeman, E; Weiss, E; Chen, S; Hugo, G; Rosu, M

    2014-01-01

    Purpose: Evaluate and compare the cycle-to-cycle consistency of breathing patterns and their reproducibility over the course of treatment, for supine and prone positioning. Methods: Respiratory traces from 25 patients were recorded for sequential supine/prone 4DCT scans acquired prior to treatment, and during the course of the treatment (weekly or bi-weekly). For each breathing cycle, the average(AVE), end-of-exhale(EoE) and end-of-inhale( EoI) locations were identified using in-house developed software. In addition, the mean values and variations for the above quantities were computed for each breathing trace. F-tests were used to compare the cycle-to-cycle consistency of all pairs of sequential supine and prone scans. Analysis of variances was also performed using population means for AVE, EoE and EoI to quantify differences between the reproducibility of prone and supine respiration traces over the treatment course. Results: Consistency: Cycle-to-cycle variations are less in prone than supine in the pre-treatment and during-treatment scans for AVE, EoE and EoI points, for the majority of patients (differences significant at p<0.05). The few cases where the respiratory pattern had more variability in prone appeared to be random events. Reproducibility: The reproducibility of breathing patterns (supine and prone) improved as treatment progressed, perhaps due to patients becoming more comfortable with the procedure. However, variability in supine position continued to remain significantly larger than in prone (p<0.05), as indicated by the variance analysis of population means for the pretreatment and subsequent during-treatment scans. Conclusions: Prone positioning stabilizes breathing patterns in most subjects investigated in this study. Importantly, a parallel analysis of the same group of patients revealed a tendency towards increasing motion amplitude of tumor targets in prone position regardless of their size or location; thus, the choice for body positioning

  5. Reproducibility of (n,γ) gamma ray spectrum in Pb under different ENDF/B releases

    Energy Technology Data Exchange (ETDEWEB)

    Kebwaro, J.M., E-mail: jeremiahkebwaro@gmail.com [Department of Physical Sciences, Karatina University, P.O. Box 1957-10101, Karatina (Kenya); He, C.H.; Zhao, Y.L. [School of Nuclear Science and Technology, Xian Jiaotong University, Xian, Shaanxi 710049 (China)

    2016-04-15

    Radiative capture reactions are of interest in shielding design and other fundamental research. In this study the reproducibility of (n,γ) reactions in Pb when cross-section data from different ENDF/B releases are used in the Monte-Carlo code, MCNP, was investigated. Pb was selected for this study because it is widely used in shielding applications where capture reactions are likely to occur. Four different neutron spectra were declared as source in the MCNP model, which consisted of a simple spherical geometry. The gamma ray spectra due to the capture reactions were recorded at 10 cm from the center of the sphere. The results reveal that the gamma ray spectrum produced by ENDF/B-V is in reasonable agreement with that produced when ENDF/B-VI.6 is used. However, the spectrum produced by ENDF/B-VII does not reveal any primary gamma rays in the higher energy region (E > 3 MeV). It is further observed that the intensities of the capture gamma rays produced when various releases are used differ by some margin, showing that the results are not reproducible. The generated spectra also vary with the spectrum of the source neutrons. The discrepancies observed among various ENDF/B releases could raise concerns among end users and need to be addressed properly during benchmarking calculations before the next release. The evaluation from ENDF to ACE format that is supplied with MCNP should also be examined, because errors might have arisen during the evaluation.

  6. Towards reproducible experimental studies for non-convex polyhedral shaped particles

    Science.gov (United States)

    Wilke, Daniel N.; Pizette, Patrick; Govender, Nicolin; Abriak, Nor-Edine

    2017-06-01

    The packing density and flat bottomed hopper discharge of non-convex polyhedral particles are investigated in a systematic experimental study. The motivation for this study is two-fold. Firstly, to establish an approach to deliver quality experimental particle packing data for non-convex polyhedral particles that can be used for characterization and validation purposes of discrete element codes. Secondly, to make the reproducibility of experimental setups as convenient and readily available as possible using affordable and accessible technology. The primary technology for this study is fused deposition modeling used to 3D print polylactic acid (PLA) particles using readily available 3D printer technology. A total of 8000 biodegradable particles were printed, 1000 white particles and 1000 black particles for each of the four particle types considered in this study. Reproducibility is one benefit of using fused deposition modeling to print particles, but an extremely important additional benefit is that specific particle properties can be explicitly controlled. As an example, in this study the volume fraction of each particle can be controlled, i.e. the effective particle density can be adjusted. In this study the particle volumes reduce drastically as the non-convexity is increased; however, all printed white particles in this study have the same mass within 2% of each other.

  7. Towards reproducible experimental studies for non-convex polyhedral shaped particles

    Directory of Open Access Journals (Sweden)

    Wilke Daniel N.

    2017-01-01

    Full Text Available The packing density and flat bottomed hopper discharge of non-convex polyhedral particles are investigated in a systematic experimental study. The motivation for this study is two-fold. Firstly, to establish an approach to deliver quality experimental particle packing data for non-convex polyhedral particles that can be used for characterization and validation purposes of discrete element codes. Secondly, to make the reproducibility of experimental setups as convenient and readily available as possible using affordable and accessible technology. The primary technology for this study is fused deposition modeling used to 3D print polylactic acid (PLA) particles using readily available 3D printer technology. A total of 8000 biodegradable particles were printed, 1000 white particles and 1000 black particles for each of the four particle types considered in this study. Reproducibility is one benefit of using fused deposition modeling to print particles, but an extremely important additional benefit is that specific particle properties can be explicitly controlled. As an example, in this study the volume fraction of each particle can be controlled, i.e. the effective particle density can be adjusted. In this study the particle volumes reduce drastically as the non-convexity is increased; however, all printed white particles in this study have the same mass within 2% of each other.

  8. [Trace elements of bone tumors].

    Science.gov (United States)

    Kalashnikov, V M; Zaĭchik, V E; Bizer, V A

    1983-01-01

    By means of activation analysis using neutrons from a nuclear reactor, the concentrations of 11 trace elements (scandium, iron, cobalt, mercury, rubidium, selenium, silver, antimony, chromium, zinc and terbium) in intact bone and skeletal tumors were measured. 76 specimens of bioptates and resected material from operations for bone tumors and 10 specimens of normal bone tissue obtained in autopsies of cases of sudden death were examined. The concentrations of trace elements and their dispersion patterns in tumor tissue were found to be significantly higher than those in normal bone tissue. Also, the concentrations of some trace elements in tumors differed significantly from those in normal tissue; moreover, they were found to depend on the type and histogenesis of the neoplasm.

  9. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking

  10. Trace Invariance for Quaternion Matrices

    Directory of Open Access Journals (Sweden)

    Ralph John de la Cruz

    2015-12-01

    Full Text Available Let F be a field. It is a classical result in linear algebra that for each A, P ∈ Mn(F) such that P is nonsingular, tr A = tr(PAP⁻¹). We show in this paper that the preceding property does not hold true if F is the division ring of real quaternions. We show that the only quaternion matrices that have their trace invariant under unitary similarity are Hermitian matrices, and that the only matrices that have their trace invariant under similarity are real scalar matrices.
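
    A one-by-one counterexample (ours, not the paper's) makes the failure concrete: conjugating j by i flips its sign, since ij = k and ki = j.

        \[
          P = (i), \quad A = (j), \quad P^{-1} = (-i):
          \qquad PAP^{-1} = i\,j\,(-i) = k\,(-i) = -j \neq j = A,
        \]

    so the single entry, and hence the trace, changes under similarity.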

  11. Polynomial weights and code constructions

    DEFF Research Database (Denmark)

    Massey, J; Costello, D; Justesen, Jørn

    1973-01-01

    polynomial included. This fundamental property is then used as the key to a variety of code constructions including 1) a simplified derivation of the binary Reed-Muller codes and, for any prime p greater than 2, a new extensive class of p-ary "Reed-Muller codes," 2) a new class of "repeated-root" cyclic codes that are subcodes of the binary Reed-Muller codes and can be very simply instrumented, 3) a new class of constacyclic codes that are subcodes of the p-ary "Reed-Muller codes," 4) two new classes of binary convolutional codes with large "free distance" derived from known binary cyclic codes, 5) two new classes of long constraint length binary convolutional codes derived from 2^r-ary Reed-Solomon codes, and 6) a new class of q-ary "repeated-root" constacyclic codes with an algebraic decoding algorithm.

  12. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is a variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. Both theoretical analysis and simulation show that the EDW code provides much better performance than existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes.
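
    The cross-correlation property claimed above is easy to check numerically for any candidate code set; a sketch with toy weight-2 codewords (not the actual EDW construction):

        import numpy as np

        # candidate spectral-amplitude codewords, one per user
        codes = np.array([
            [1, 1, 0, 0, 0, 0],
            [0, 0, 1, 1, 0, 0],
            [0, 1, 1, 0, 0, 0],   # overlaps each other codeword in <= 1 chip
        ])
        cross = codes @ codes.T          # pairwise in-phase cross-correlations
        np.fill_diagonal(cross, 0)       # ignore autocorrelations
        print(cross.max())               # ideal property: maximum <= 1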

  13. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Full Text Available Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
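
    For small parameters, the minimum distance being optimized here can be checked by brute force from a generator matrix; a toy [4,2] ternary example (not one of the paper's codes):

        import itertools

        G = [[1, 0, 1, 2],
             [0, 1, 2, 1]]
        q, k = 3, 2

        # encode a message vector over GF(3) with generator matrix G
        def codeword(msg):
            return tuple(sum(m * g for m, g in zip(msg, col)) % q
                         for col in zip(*G))

        # for a linear code, minimum distance = minimum nonzero codeword weight
        d = min(sum(c != 0 for c in codeword(m))
                for m in itertools.product(range(q), repeat=k)
                if any(m))
        print(d)   # 2 for this toy code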

  14. Composting in small laboratory pilots: Performance and reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Lashermes, G.; Barriuso, E. [INRA, UMR1091 Environment and Arable Crops (INRA, AgroParisTech), F-78850 Thiverval-Grignon (France); Le Villio-Poitrenaud, M. [VEOLIA Environment - Research and Innovation, F-78520 Limay (France); Houot, S., E-mail: sabine.houot@grignon.inra.fr [INRA, UMR1091 Environment and Arable Crops (INRA, AgroParisTech), F-78850 Thiverval-Grignon (France)

    2012-02-15

    Highlights: • We design an innovative small-scale composting device including six 4-l reactors. • We investigate the performance and reproducibility of composting on a small scale. • Thermophilic conditions are established by self-heating in all replicates. • Biochemical transformations, organic matter losses and stabilisation are realistic. • The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O₂ consumption and CO₂ emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost.

  15. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

    VHTR, one of the Generation IV reactor concepts, has a relatively high operating temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. Tritium is therefore a crucial safety issue in this fission reactor system: it is vital to trace tritium behavior in the VHTR and the potential permeation rate into the industrial process, so a tool that enables this analysis is needed. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, was developed using a chemical process code called gPROMS. The code has several distinctive features, including a non-diluted assumption, flexible application and the adoption of a distributed permeation model. Owing to these features, BOTANIC can analyze a wide range of tritium-level systems with higher accuracy, as it can solve distributed models. BOTANIC was verified using analytic solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL; the results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will focus on total system verification.
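
    The abstract does not spell out BOTANIC's equations. As a hedged illustration of the kind of relation a permeation model discretizes, here is Richardson's law for diffusion-limited hydrogen-isotope permeation through a metal wall (the function name and all parameter values below are hypothetical, for illustration only):

        import math

        R = 8.314  # gas constant, J/(mol*K)

        def permeation_flux(p_up, p_down, thickness, T, P0, Ea):
            """Diffusion-limited permeation flux (Richardson's law):
            J = (P0/d) * exp(-Ea/RT) * (sqrt(p_up) - sqrt(p_down)),
            valid when bulk diffusion, not surface recombination, limits the rate."""
            permeability = P0 * math.exp(-Ea / (R * T))
            return permeability / thickness * (math.sqrt(p_up) - math.sqrt(p_down))

        # Hypothetical numbers, not VHTR design data:
        print(permeation_flux(p_up=100.0, p_down=1e-3, thickness=2e-3,
                              T=1123.0, P0=1e-7, Ea=65e3))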

  16. ACE - Manufacturer Identification Code (MID)

    Data.gov (United States)

    Department of Homeland Security — The ACE Manufacturer Identification Code (MID) application is used to track and control identifications codes for manufacturers. A manufacturer is identified on an...

  17. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.

  18. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory, until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  19. Speech coding code- excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides a scientific understanding of the most central techniques used in speech coding, both for advanced students and for professionals with a background in speech, audio, and/or digital signal processing. It provides a clear connection between the whys, hows, and whats, thus enabling a clear view of the necessity, purpose, and solutions provided by various tools, as well as their strengths and weaknesses in each respect. Equivalently, this book sheds light on the following perspectives for each technology presented: Objective: What do we want to achieve, and especially why is this goal important? Resource/Information: What information is available, and how can it be useful? Resource/Platform: What kind of platforms are we working with, and what are their capabilities and restrictions? This includes computational, memory and acoustic properties, and the transmission capacity of devices used. The book goes on to address Solutions: Which solutions have been proposed, and how can they be used to reach the stated goals, and ...
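
    The short-term linear-prediction front end at the heart of CELP can be sketched in a few lines (our illustration, not the book's code): estimate the predictor coefficients of a frame by the autocorrelation method and the Levinson-Durbin recursion:

        import numpy as np

        def lpc(frame, order):
            """Autocorrelation-method LPC via the Levinson-Durbin recursion.
            Returns a[0..order] with a[0] = 1 and the residual energy."""
            r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
            a = np.zeros(order + 1)
            a[0], err = 1.0, r[0]
            for i in range(1, order + 1):
                acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
                k = -acc / err                    # reflection coefficient
                prev = a.copy()
                for j in range(1, i):
                    a[j] = prev[j] + k * prev[i - j]
                a[i] = k
                err *= 1.0 - k * k                # prediction error shrinks
            return a, err

        # Synthetic AR(2) signal: x[n] = 0.9 x[n-1] - 0.5 x[n-2] + noise
        rng = np.random.default_rng(0)
        x = np.zeros(2400)
        for n in range(2, 2400):
            x[n] = 0.9 * x[n - 1] - 0.5 * x[n - 2] + rng.standard_normal()
        a, _ = lpc(x, 2)
        print(np.round(a, 2))   # close to [1, -0.9, 0.5]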

  20. Variable Rate, Adaptive Transform Tree Coding Of Images

    Science.gov (United States)

    Pearlman, William A.

    1988-10-01

    A tree code, asymptotically optimal for stationary Gaussian sources and squared error distortion [2], is used to encode transforms of image sub-blocks. The variance spectrum of each sub-block is estimated and specified uniquely by a set of one-dimensional auto-regressive parameters. The expected distortion is set to a constant for each block and the rate is allowed to vary to meet the given level of distortion. Since the spectrum and rate are different for every block, the code tree differs for every block. Coding simulations for target block distortion of 15 and average block rate of 0.99 bits per pel (bpp) show that very good results can be obtained at high search intensities at the expense of high computational complexity. The results at the higher search intensities outperform a parallel simulation with quantization replacing tree coding. Comparative coding simulations also show that the reproduced image with variable block rate and average rate of 0.99 bpp has 2.5 dB less distortion than a similarly reproduced image with a constant block rate equal to 1.0 bpp.

  1. Reproducibility in protein profiling by MALDI-TOF mass spectrometry

    DEFF Research Database (Denmark)

    Albrethsen, Jakob

    2007-01-01

    BACKGROUND: Protein profiling with high-throughput sample preparation and MALDI-TOF MS analysis is a new potential tool for diagnosis of human diseases. However, analytical reproducibility is a significant challenge in MALDI protein profiling. This minireview summarizes studies of reproducibility, with the reported mean CV of the peak intensity varying among studies from 4% to 26%. There is additional interexperiment variation in peak intensity. Current approaches to improve the analytical performance of MALDI protein profiling include automated sample processing, extensive prefractionation strategies, immunocapture, prestructured target surfaces, standardized matrix (co)crystallization, improved MALDI-TOF MS instrument components, internal standard peptides, quality-control samples, replicate measurements, and algorithms for normalization and peak detection. CONCLUSIONS: Further evaluation and optimization...

  2. Towards a Reproducible Synthesis of High Aspect Ratio Gold Nanorods

    Directory of Open Access Journals (Sweden)

    Susanne Koeppl

    2011-01-01

    Full Text Available The seed-mediated method in the presence of high concentrations of CTAB is frequently implemented in the preparation of high aspect ratio gold nanorods (i.e., nanorods with aspect ratios of 5 or more); however, the reproducibility has still been limited. We rendered the synthesis procedure simpler, decreased the susceptibility to impurities, and improved the reproducibility of the product distribution. As a result of the high aspect ratios, longitudinal plasmon absorptions were shifted up to very high absorption maxima of 1955 nm in UV-vis-NIR spectra (since this band is completely covered in aqueous solution by the strong absorption of water, the gold species were embedded in poly(vinyl alcohol) films for UV-vis-NIR measurements). Finally, the directed particle growth in the (110) direction leads to the conclusion that the adsorption of CTAB molecules at specific crystal faces accounts for nanorod growth, and not cylindrical CTAB micelles, in agreement with other observations.

  3. Reproducible analyses of microbial food for advanced life support systems

    Science.gov (United States)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses were required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  4. Towards reproducibility of research by reuse of IT best practices

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Reproducibility of any research gives much higher credibility both to research results and to the researchers. This is true for any kind of research including computer science, where a lot of tools and approaches have been developed to ensure reproducibility. In this talk I will focus on basic and seemingly simple principles, which sometimes look too obvious to follow, but help researchers build beautiful and reliable systems that produce consistent, measurable results. My talk will cover, among other things, the problem of embedding machine learning techniques into analysis strategy. I will also speak about the most common pitfalls in this process and how to avoid them. In addition, I will demonstrate the research environment based on the principles that I will have outlined. About the speaker Andrey Ustyuzhanin (36) is Head of CERN partnership program at Yandex. He is involved in the development of event indexing and event filtering services which Yandex has been providing for the LHCb experiment sinc...

  5. Reproducibility of the Tronzo and AO classifications for transtrochanteric fractures.

    Science.gov (United States)

    Mattos, Carlos Augusto; Jesus, Alexandre Atsushi Koza; Floter, Michelle Dos Santos; Nunes, Luccas Franco Bettencourt; Sanches, Bárbara de Baptista; Zabeu, José Luís Amim

    2015-01-01

    To analyze the reproducibility of the Tronzo and AO classifications for transtrochanteric fractures. This was a cross-sectional study in which the intraobserver and interobserver concordance between two readings made by 11 observers was analyzed. The analysis of the variations used the kappa statistical method. Moderate concordance was found for the AO classification, while only slight concordance was found for the Tronzo classification. This study found that the AO/ASIF classification for transtrochanteric fractures presented greater intra- and interobserver reproducibility, and that greater concordance was correlated with greater experience of the observers. Without division into subgroups, the AO/ASIF classification was shown, as described in the literature, to be acceptable for clinical use for transtrochanteric fractures of the femur, although it did not show absolute concordance, given that its concordance level was only moderate. Nonetheless, its concordance was better than that of the Tronzo classification.

  6. Toric codes and quantum doubles from two-body Hamiltonians

    Energy Technology Data Exchange (ETDEWEB)

    Brell, Courtney G; Bartlett, Stephen D; Doherty, Andrew C [Centre for Engineered Quantum Systems, School of Physics, University of Sydney, Sydney (Australia); Flammia, Steven T, E-mail: cbrell@physics.usyd.edu.au [Perimeter Institute for Theoretical Physics, Waterloo (Canada)

    2011-05-15

    We present here a procedure to obtain the Hamiltonians of the toric code and Kitaev quantum double models as the low-energy limits of entirely two-body Hamiltonians. Our construction makes use of a new type of perturbation gadget based on error-detecting subsystem codes. The procedure is motivated by a projected entangled pair states (PEPS) description of the target models, and reproduces the target models' behavior using only couplings that are natural in terms of the original Hamiltonians. This allows our construction to capture the symmetries of the target models.
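
    For reference, the target Hamiltonian that the gadgets reproduce at low energy is the standard toric-code form (textbook definitions, not quoted from the paper), with spins on the edges of a square lattice:

    $$ H = -J_e \sum_{v} A_v - J_m \sum_{p} B_p, \qquad A_v = \prod_{e \in \mathrm{star}(v)} \sigma^x_e, \qquad B_p = \prod_{e \in \partial p} \sigma^z_e . $$

    The four-body star and plaquette operators commute, and the construction described above recovers them perturbatively from two-body couplings alone.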

  7. SPECTROPHOTOMETRIC DETERMINATION OF TRACE OXALIC ...

    African Journals Online (AJOL)

    Based on the ability of oxalic acid to displace dibromochloroarsenazo from the zirconium(IV)-dibromochloroarsenazo complex, producing hyperchromic effects in 1.26 M hydrochloric acid medium, a novel method for the determination of trace oxalic acid by spectrophotometry was developed.

  8. Trace elements in brazilian soils

    International Nuclear Information System (INIS)

    Rocha, Geraldo Cesar

    1995-01-01

    A literature review on trace elements (Zn, B, Mn, Mo, Cu, Fe, and Cl) in Brazilian soils was prepared, with special attention to the chemical forms and ranges in the soil, extraction methods, and the correlation of the amounts in soils with soil properties.

  9. Serous tubal intraepithelial carcinoma: diagnostic reproducibility and its implications.

    Science.gov (United States)

    Carlson, Joseph W; Jarboe, Elke A; Kindelberger, David; Nucci, Marisa R; Hirsch, Michelle S; Crum, Christopher P

    2010-07-01

    Serous tubal intraepithelial carcinoma (STIC) is detected in between 5% and 7% of women undergoing risk-reduction salpingo-oophorectomy for mutations in the BRCA1 or 2 genes (BRCA+), and seems to play a role in the pathogenesis of many ovarian and "primary peritoneal" serous carcinomas. The recognition of STIC is germane to the management of BRCA+ women; however, the diagnostic reproducibility of STIC is unknown. Twenty-one cases were selected and classified as STIC or benign, using both hematoxylin and eosin and immunohistochemical stains for p53 and MIB-1. Digital images of 30 hematoxylin and eosin-stained STICs (n=14) or benign tubal epithelium (n=16) were photographed and randomized for blind digital review in a PowerPoint format by 6 experienced gynecologic pathologists and 6 pathology trainees. A generalized kappa statistic for multiple raters was calculated for all groups. For all reviewers, the kappa was 0.333, indicating poor reproducibility; kappa was 0.453 for the experienced gynecologic pathologists (fair-to-good reproducibility), and kappa=0.253 for the pathology residents (poor reproducibility). In the experienced group, 3 of 14 STICs were diagnosed by all 6 reviewers, and 9 of 14 by a majority of the reviewers. These results show that interobserver concordance in the recognition of STIC in high-quality digital images is at best fair-to-good for even experienced gynecologic pathologists, and a proportion cannot be consistently identified even among experienced observers. In view of these findings, a diagnosis of STIC should be corroborated by a second pathologist, if feasible.
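
    The generalized kappa for multiple raters used above can be reproduced in a few lines. This sketch (ours, with invented toy counts, not the study's data) implements Fleiss' kappa:

        import numpy as np

        def fleiss_kappa(counts):
            """Fleiss' kappa for an (N subjects x k categories) matrix where
            counts[i, j] = number of raters assigning subject i to category j;
            every row must sum to the same number of raters n."""
            counts = np.asarray(counts, dtype=float)
            N, k = counts.shape
            n = counts[0].sum()
            p_j = counts.sum(axis=0) / (N * n)                       # category shares
            P_i = (np.square(counts).sum(axis=1) - n) / (n * (n - 1))
            P_bar, P_e = P_i.mean(), np.square(p_j).sum()
            return (P_bar - P_e) / (1 - P_e)

        # Toy data: 5 images rated STIC/benign by 6 raters (invented numbers)
        print(round(fleiss_kappa([[6, 0], [5, 1], [3, 3], [1, 5], [0, 6]]), 3))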

  10. Regulating Ultrasound Cavitation in order to Induce Reproducible Sonoporation

    Science.gov (United States)

    Mestas, J.-L.; Alberti, L.; El Maalouf, J.; Béra, J.-C.; Gilles, B.

    2010-03-01

    Sonoporation is thought to be linked to cavitation, which generally appears to be a non-reproducible and non-stationary phenomenon. In order to obtain an acceptable trade-off between cell mortality and transfection, a regulated cavitation generator based on an acoustical cavitation measurement was developed and tested. The medium to be sonicated is placed in a sample tray. This tray is immersed in degassed water and positioned above the face of a flat ultrasonic transducer (frequency: 445 kHz; intensity range: 0.08-1.09 W/cm2). This technical configuration was considered conducive to standing-wave generation through reflection at the air/medium interface in the well, thus enhancing the cavitation phenomenon. Laterally to the transducer, a homemade hydrophone was oriented to receive the acoustical signal from the bubbles. From this spectral signal, recorded at intervals of 5 ms, a cavitation index was calculated as the mean of the cavitation spectrum integration on a logarithmic scale, and the excitation power is automatically corrected. The device generates a stable and reproducible cavitation level for a wide range of cavitation setpoints, from stable cavitation up to fully developed inertial cavitation. For the ultrasound intensity range used, the time delay of the response is lower than 200 ms. The cavitation regulation device was evaluated in terms of the chemical bubble-collapse effect: hydroxyl radical production was measured on terephthalic acid solutions. In open loop, the results show great variability whatever the excitation power; in closed loop, by contrast, they are highly reproducible. The device was also implemented for the study of sonodynamic effects, where the regulation provides more reproducible results independent of cell medium and experimental conditions (temperature, pressure). Other applications of this regulated cavitation device concern the internalization of different particles (Quantum Dots), molecules (siRNA) or plasmids (GFP, DsRed) into different

  11. Composting in small laboratory pilots: performance and reproducibility.

    Science.gov (United States)

    Lashermes, G; Barriuso, E; Le Villio-Poitrenaud, M; Houot, S

    2012-02-01

    Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O(2) consumption and CO(2) emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or on-site experiments, excluding the lignin degradation, which was less important than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Reproducibility of Computer-Aided Detection Marks in Digital Mammography

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Kim, Sun Mi; Im, Jung Gi; Cha, Joo Hee

    2007-01-01

    To evaluate the performance and reproducibility of a computer-aided detection (CAD) system in mediolateral oblique (MLO) digital mammograms taken serially, without release of breast compression. A CAD system was applied preoperatively to full-field digital mammograms of two MLO views taken without release of breast compression in 82 patients (age range: 33-83 years; mean age: 49 years) with previously diagnosed breast cancers. The total number of visible lesion components in the 82 patients was 101: 66 masses and 35 microcalcifications. We analyzed the sensitivity and reproducibility of the CAD marks. The sensitivity of the CAD system for first MLO views was 71% (47/66) for masses and 80% (28/35) for microcalcifications; for second MLO views it was 68% (45/66) for masses and 17% (6/35) for microcalcifications. In the 84 ipsilateral serial MLO image sets (two patients had bilateral cancers), images identical regardless of the existence of CAD marks were obtained in 35% (29/84), and identical images with CAD marks in 29% (23/78). For contralateral MLO images, the corresponding figures were 65% (52/80) and 28% (11/39). The reproducibility of CAD marks for the true positive masses in serial MLO views was 84% (42/50), and that for the true positive microcalcifications was 0% (0/34). The CAD system in digital mammograms showed a high sensitivity for detecting masses and microcalcifications. However, the reproducibility of microcalcification marks was very low in MLO views taken serially without release of breast compression. Minute positional changes and patient movement can alter the images and have a significant effect on the algorithm utilized by the CAD system for detecting microcalcifications

  13. LHC Orbit Correction Reproducibility and Related Machine Protection

    OpenAIRE

    Baer, T; Fuchsberger, K; Schmidt, R; Wenninger, J

    2012-01-01

    The Large Hadron Collider (LHC) has an unprecedented nominal stored beam energy of up to 362 MJ per beam. In order to ensure an adequate machine protection by the collimation system, a high reproducibility of the beam position at collimators and special elements like the final focus quadrupoles is essential. This is realized by a combination of manual orbit corrections, feed forward and real time feedback. In order to protect the LHC against inconsistent orbit corrections, which could put the...

  14. Reproducibility of urinary biomarkers in multiple 24-h urine samples.

    Science.gov (United States)

    Sun, Qi; Bertrand, Kimberly A; Franke, Adrian A; Rosner, Bernard; Curhan, Gary C; Willett, Walter C

    2017-01-01

    Limited knowledge regarding the reproducibility of biomarkers in 24-h urine samples has hindered the collection and use of the samples in epidemiologic studies. We aimed to evaluate the reproducibility of various markers in repeat 24-h urine samples. We calculated intraclass correlation coefficients (ICCs) of biomarkers measured in 24-h urine samples that were collected in 3168 participants in the NHS (Nurses' Health Study), NHSII (Nurses' Health Study II), and Health Professionals Follow-Up Study. In 742 women with 4 samples each collected over the course of 1 y, ICCs for sodium were 0.32 in the NHS and 0.34 in the NHSII. In 2439 men and women with 2 samples each collected over 1 wk to ≥1 mo, the ICCs ranged from 0.33 to 0.68 for sodium at various intervals between collections. The urinary excretion of potassium, calcium, magnesium, phosphate, sulfate, and other urinary markers showed generally higher reproducibility (ICCs >0.4). In 47 women with two 24-h urine samples, ICCs ranged from 0.15 (catechin) to 0.75 (enterolactone) for polyphenol metabolites. For phthalates, ICCs were generally ≤0.26 except for monobenzyl phthalate (ICC: 0.55), whereas the ICC was 0.39 for bisphenol A (BPA). We further estimated that, for the large majority of the biomarkers, the mean of three 24-h urine samples could provide a correlation of ≥0.8 with true long-term urinary excretion. These data suggest that the urinary excretion of various biomarkers, such as minerals, electrolytes, most polyphenols, and BPA, is reasonably reproducible in 24-h urine samples that are collected within a few days or ≤1 y. Our findings show that three 24-h samples are sufficient for the measurement of long-term exposure status in epidemiologic studies. © 2017 American Society for Nutrition.
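
    ICCs of the kind reported here are commonly derived from a one-way random-effects ANOVA. A minimal version (our sketch of the ICC(1,1) form, with invented toy data) is:

        import numpy as np

        def icc_oneway(data):
            """One-way random-effects ICC(1,1) for an (n subjects x k repeats)
            array: ICC = (MSB - MSW) / (MSB + (k - 1) * MSW)."""
            data = np.asarray(data, dtype=float)
            n, k = data.shape
            grand = data.mean()
            msb = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            msw = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
            return (msb - msw) / (msb + (k - 1) * msw)

        # Invented toy data: 4 subjects, three 24-h urine sodium values each
        print(round(icc_oneway([[150, 160, 155], [90, 100, 95],
                                [200, 190, 210], [120, 130, 125]]), 2))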

  15. Multi-parametric neuroimaging reproducibility: a 3-T resource study.

    Science.gov (United States)

    Landman, Bennett A; Huang, Alan J; Gifford, Aliya; Vikram, Deepti S; Lim, Issel Anne L; Farrell, Jonathan A D; Bogovic, John A; Hua, Jun; Chen, Min; Jarso, Samson; Smith, Seth A; Joel, Suresh; Mori, Susumu; Pekar, James J; Barker, Peter B; Prince, Jerry L; van Zijl, Peter C M

    2011-02-14

    Modern MRI image processing methods have yielded quantitative, morphometric, functional, and structural assessments of the human brain. These analyses typically exploit carefully optimized protocols for specific imaging targets. Algorithm investigators have several excellent public data resources to use to test, develop, and optimize their methods. Recently, there has been an increasing focus on combining MRI protocols in multi-parametric studies. Notably, these have included innovative approaches for fusing connectivity inferences with functional and/or anatomical characterizations. Yet, validation of the reproducibility of these interesting and novel methods has been severely hampered by the limited availability of appropriate multi-parametric data. We present an imaging protocol optimized to include state-of-the-art assessment of brain function, structure, micro-architecture, and quantitative parameters within a clinically feasible 60-min protocol on a 3-T MRI scanner. We present scan-rescan reproducibility of these imaging contrasts based on 21 healthy volunteers (11 M/10 F, 22-61 years old). The cortical gray matter, cortical white matter, ventricular cerebrospinal fluid, thalamus, putamen, caudate, cerebellar gray matter, cerebellar white matter, and brainstem were identified with mean volume-wise reproducibility of 3.5%. We tabulate the mean intensity, variability, and reproducibility of each contrast in a region-of-interest approach, which is essential for prospective study planning and retrospective power analysis considerations. Anatomy was highly consistent on structural acquisition (~1-5% variability), while variation on diffusion and several other quantitative scans was higher (~...) ... multi-parametric imaging protocols. Copyright © 2010 Elsevier Inc. All rights reserved.

  16. Reproducibility of Quantitative Structural and Physiological MRI Measurements

    Science.gov (United States)

    2017-08-09

    Metabolites with percent standard deviation Cramér-Rao lower bounds ≤20% were included in statistical analyses. One subject's MRI #1 and one sub... relative to the mean, as it is calculated as the standard deviation normalized by the average between visits. MRD provides information about the... inherent technical and physiological consistency of these measurements. This longitudinal study examined the variance and reproducibility of commonly

  17. Intra- and inter-examiner reproducibility of manual probing depth

    OpenAIRE

    Andrade, Roberto; Espinoza, Manuel; Gómez, Elena Maria; Rolando Espinoza, José; Cruz, Elizabeth

    2012-01-01

    The periodontal probe remains the best clinical diagnostic tool for the collection of information regarding the health status and the attachment level of periodontal tissues. The aim of this study was to evaluate intra- and inter-examiner reproducibility of probing depth (PD) measurements made with a manual probe. With the approval of an Ethics Committee, 20 individuals without periodontal disease were selected if they presented at least 6 teeth per quadrant. Using a Williams periodontal probe...

  18. Spatially coded backscatter radiography

    International Nuclear Information System (INIS)

    Thangavelu, S.; Hussein, E.M.A.

    2007-01-01

    Conventional radiography requires access to two opposite sides of an object, which makes it unsuitable for the inspection of extended and/or thick structures (airframes, bridges, floors etc.). Backscatter imaging can overcome this problem, but the indications obtained are difficult to interpret. This paper applies the coded aperture technique to gamma-ray backscatter-radiography in order to enhance the detectability of flaws. This spatial coding method involves the positioning of a mask with closed and open holes to selectively permit or block the passage of radiation. The obtained coded-aperture indications are then mathematically decoded to detect the presence of anomalies. Indications obtained from Monte Carlo calculations were utilized in this work to simulate radiation scattering measurements. These simulated measurements were used to investigate the applicability of this technique to the detection of flaws by backscatter radiography
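
    The encode/decode principle can be illustrated in one dimension (a toy numerical sketch, not the paper's gamma-ray setup; regularized Fourier inversion stands in here for the mathematical decoding step):

        import numpy as np

        rng = np.random.default_rng(1)
        n = 64
        source = np.zeros(n)
        source[20], source[45] = 1.0, 0.6            # two hypothetical "flaws"
        mask = (rng.random(n) < 0.5).astype(float)   # open (1) / closed (0) holes

        # Encoding: the detector sees the source folded through the mask
        coded = np.real(np.fft.ifft(np.fft.fft(source) * np.fft.fft(mask)))

        # Decoding: invert the mask's transfer function with Tikhonov damping
        M = np.fft.fft(mask)
        decoded = np.real(np.fft.ifft(np.fft.fft(coded) * np.conj(M)
                                      / (np.abs(M) ** 2 + 1e-3)))

        print(np.sort(np.argsort(decoded)[-2:]))     # [20 45]: flaw positions recovered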

  19. Aztheca Code; Codigo Aztheca

    Energy Technology Data Exchange (ETDEWEB)

    Quezada G, S.; Espinosa P, G. [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Ciudad de Mexico (Mexico); Centeno P, J.; Sanchez M, H., E-mail: sequga@gmail.com [UNAM, Facultad de Ingenieria, Ciudad Universitaria, Circuito Exterior s/n, 04510 Ciudad de Mexico (Mexico)

    2017-09-15

    This paper presents the Aztheca code, which is formed by the mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models and control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  20. Code query by example

    Science.gov (United States)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to automatically detect potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but rather domain experts. Hence, they require a simple language to express custom rules.

  1. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each...... instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...... as possible. Evaluations show that the proposed protocol provides considerable gains over the standard tree splitting protocol applying SIC. The improvement comes at the expense of an increased feedback and receiver complexity....

  2. Revised SRAC code system

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Ishiguro, Yukio; Kaneko, Kunio; Ido, Masaru.

    1986-09-01

    Since the publication of JAERI-1285 in 1983 for the preliminary version of the SRAC code system, a number of additions and modifications to the functions have been made to establish an overall neutronics code system. Major points are (1) addition of JENDL-2 version of data library, (2) a direct treatment of doubly heterogeneous effect on resonance absorption, (3) a generalized Dancoff factor, (4) a cell calculation based on the fixed boundary source problem, (5) the corresponding edit required for experimental analysis and reactor design, (6) a perturbation theory calculation for reactivity change, (7) an auxiliary code for core burnup and fuel management, etc. This report is a revision of the user's manual, which consists of the general description, input data requirements and their explanation, detailed information on usage, mathematics, contents of libraries and sample I/O. (author)

  3. Validity and reproducibility of a Spanish dietary history.

    Directory of Open Access Journals (Sweden)

    Pilar Guallar-Castillón

    Full Text Available To assess the validity and reproducibility of food and nutrient intake estimated with the electronic diet history of ENRICA (DH-E), which collects information on numerous aspects of the Spanish diet. The validity of food and nutrient intake was estimated using Pearson correlation coefficients between the DH-E and the mean of seven 24-hour recalls collected every 2 months over the previous year. The reproducibility was estimated using intraclass correlation coefficients between two DH-E made one year apart. The correlation coefficients between the DH-E and the mean of seven 24-hour recalls for the main food groups were: cereals (r = 0.66), meat (r = 0.66), fish (r = 0.42), vegetables (r = 0.62) and fruits (r = 0.44). The mean correlation coefficient for all 15 food groups considered was 0.53. The correlations for macronutrients were: energy (r = 0.76), proteins (r = 0.58), lipids (r = 0.73), saturated fat (r = 0.73), monounsaturated fat (r = 0.59), polyunsaturated fat (r = 0.57), and carbohydrates (r = 0.66). The mean correlation coefficient for all 41 nutrients studied was 0.55. The intraclass correlation coefficient between the two DH-E was greater than 0.40 for most foods and nutrients. The DH-E shows good validity and reproducibility for estimating usual intake of foods and nutrients.

  4. The Reproducibility of Nuclear Morphometric Measurements in Invasive Breast Carcinoma

    Directory of Open Access Journals (Sweden)

    Pauliina Kronqvist

    1997-01-01

    Full Text Available The intraobserver and interobserver reproducibility of computerized nuclear morphometry was determined in repeated measurements of 212 samples of invasive breast cancer. The influence of biological variation and the selection of the measurement area was also tested. The morphometrically determined mean nuclear profile area (Pearson's r 0.89, grading efficiency (GE) 0.95) and standard deviation (SD) of the nuclear profile area (Pearson's r 0.84, GE 0.89) showed high reproducibility. In this respect, nuclear morphometry equals other established methods of quantitative pathology and exceeds the results of subjective grading of nuclear atypia in invasive breast cancer. A training period of eight days was sufficient to produce a clear improvement in the consistency of nuclear morphometry results. By estimating the sources of variation it could be shown that the variation associated with the measurement procedure itself is small; instead, sample-associated variation is responsible for the majority of variation in the measurements (82.9% in mean nuclear profile area and 65.9% in SD of nuclear profile area). This study points out that, when standardized methods are applied, computerized morphometry is a reproducible and reliable method of assessing nuclear atypia in invasive breast cancer. For further improvement, special emphasis should be placed on sampling rules for selecting the microscope fields and measurement areas.

  5. Aveiro method in reproducing kernel Hilbert spaces under complete dictionary

    Science.gov (United States)

    Mai, Weixiong; Qian, Tao

    2017-12-01

    Aveiro Method is a sparse representation method in reproducing kernel Hilbert spaces (RKHS) that gives orthogonal projections in linear combinations of reproducing kernels over uniqueness sets. It, however, suffers from the determination of uniqueness sets in the underlying RKHS. In fact, in general spaces, uniqueness sets are not easy to identify, let alone the convergence speed aspect of the Aveiro Method. To avoid those difficulties we propose a new Aveiro Method based on a dictionary and the matching pursuit idea. What we do is, in fact, more: the new Aveiro Method is related to the recently proposed Pre-Orthogonal Greedy Algorithm (P-OGA), involving completion of a given dictionary. The new method is called Aveiro Method Under Complete Dictionary (AMUCD). The complete dictionary consists of all directional derivatives of the underlying reproducing kernels. We show that, under the boundary vanishing condition, available for the classical Hardy and Paley-Wiener spaces, the complete dictionary enables an efficient expansion of any given element in the Hilbert space. The proposed method reveals new and advanced aspects of both the Aveiro Method and the greedy algorithm.
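
    For orientation, the reproducing property that underlies all of this (textbook RKHS material, not specific to the paper) says that point evaluation is an inner product against the kernel:

    $$ f(x) = \langle f, K(\cdot, x) \rangle_{\mathcal{H}} \quad \text{for all } f \in \mathcal{H}, $$

    so the complete dictionary of kernels $K(\cdot, a)$ and their directional derivatives evaluates functions and their derivatives, and each P-OGA step selects the dictionary element most correlated with the current residual, in the spirit of matching pursuit.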

  6. Sex is over-rated: on the right to reproduce.

    Science.gov (United States)

    Cutas, Daniela

    2009-03-01

    In this article, I will show that what is respected most in human reproduction and parenting is not a right to reproduce in the way in which this right is explicitly proposed. The only way in which people can become, and function as, parents without having to submit themselves to anyone else's judgements and decisions is by having reproductive sex. Whatever one's intentions, social status, standard of living, income, etc., so long as assistance is not required, that person's reproductive decisions will not be interfered with in any way, at least not until neglect or abuse of their offspring becomes known. Moreover, none of the features that are said to back the right to reproduce (such as bodily integrity or personal autonomy) can justify one's unquestioned access to a relationship with another who is unable to consent (the child). This indicates that the discourse in terms of the right to reproduce, as currently used to justify non-interference with natural reproduction and parenting coupled with the regulation of assisted forms of reproduction and parenting, is at best self-deluding, and that all it protects is people's freedom to have reproductive sex and handle the consequences.

  7. Reproducibility and validity of self-perceived oral health conditions.

    Science.gov (United States)

    Pinelli, Camila; de Castro Monteiro Loffredo, Leonor

    2007-12-01

    The reproducibility and validity of self-perceived periodontal, dental, and temporomandibular joint (TMJ) conditions were investigated. A questionnaire was applied in interview to 200 adults aged from 35 to 44, who were attending as casual patients at Araraquara School of Dentistry, São Paulo State University, São Paulo, Brazil. Clinical examination was based on the guidelines of the World Health Organization manual. The interview and the clinical examination were performed in two occasions, by a calibrated examiner. Reproducibility and validity were, respectively, verified by kappa statistics (kappa) and sensitivity (Sen) and specificity (Spec) values, having clinical examination as the validation criterion. The results showed an almost perfect agreement for self-perceived TMJ (kappa = 0.85) and periodontal conditions (kappa = 0.81), and it was substantial for dental condition (kappa = 0.69). Reproducibility according to clinical examination showed good results (kappa = 0.73 for CPI index, kappa = 0.96 for dental caries, and kappa = 0.74 for TMJ conditions). Sensitivity and specificity values were higher for self-perceived dental (Sen = 0.84, Spec = 1.0) and TMJ conditions (Sen = 1.0, Spec = 0.8). With regard to periodontal condition, specificity was low (0.43), although sensitivity was very high (1.0). Self-perceived oral health was reliable for the examined conditions. Validity was good to detect dental conditions and TMJ disorders, and it was more sensitive than specific to detect the presence of periodontal disease.

  8. Reproducibility of gene expression across generations of Affymetrix microarrays

    Directory of Open Access Journals (Sweden)

    Haslett Judith N

    2003-06-01

    Full Text Available Abstract Background The development of large-scale gene expression profiling technologies is rapidly changing the norms of biological investigation. But the rapid pace of change itself presents challenges. Commercial microarrays are regularly modified to incorporate new genes and improved target sequences. Although the ability to compare datasets across generations is crucial for any long-term research project, to date no means to allow such comparisons have been developed. In this study the reproducibility of gene expression levels across two generations of Affymetrix GeneChips® (HuGeneFL and HG-U95A) was measured. Results Correlation coefficients were computed for gene expression values across chip generations based on different measures of similarity. Comparing the absolute calls assigned to the individual probe sets across the generations found them to be largely unchanged. Conclusion We show that experimental replicates are highly reproducible, but that reproducibility across generations depends on the degree of similarity of the probe sets and the expression level of the corresponding transcript.

  9. Graph Codes with Reed-Solomon Component Codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Justesen, Jørn

    2006-01-01

    We treat a specific case of codes based on bipartite expander graphs coming from finite geometries. The code symbols are associated with the branches and the symbols connected to a given node are restricted to be codewords in a Reed-Solomon code. We give results on the parameters of the codes...

  10. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  11. Principles of speech coding

    CERN Document Server

    Ogunfunmi, Tokunbo

    2010-01-01

    It is becoming increasingly apparent that all forms of communication, including voice, will be transmitted through packet-switched networks based on the Internet Protocol (IP). Therefore, the design of modern devices that rely on speech interfaces, such as cell phones and PDAs, requires a complete and up-to-date understanding of the basics of speech coding. Outlines key signal processing algorithms used to mitigate impairments to speech quality in VoIP networks. Offering a detailed yet easily accessible introduction to the field, Principles of Speech Coding provides an in-depth examination of the

  12. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Jan; Pries-Heje, Lene; Dahlgaard, Bente

    2013-01-01

    A classic way to choose a supplier is through a bidding process where tenders from competing companies are evaluated in relation to the customer’s requirements. If the customer wants to hire an agile software developing team instead of buying a software product, a new approach for comparing tenders...... is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  13. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed

    2018-04-08

    Convolutional Sparse Coding (CSC) is a well-established image representation model especially suited for image restoration tasks. In this work, we extend the applicability of this model by proposing a supervised approach to convolutional sparse coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements to be discriminative. Experimental results show that using supervised convolutional learning results in two key advantages. First, we learn more semantically relevant filters in the dictionary and second, we achieve improved image reconstruction on unseen data.
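
    In generic notation (ours; the paper defines its own discriminative term), the supervised CSC objective has the shape

    $$ \min_{\{d_k\},\{z_k\}} \; \tfrac{1}{2} \Big\| x - \sum_k d_k * z_k \Big\|_2^2 + \lambda \sum_k \| z_k \|_1 + \gamma \, \mathcal{L}_{\mathrm{sup}}\big(\{z_k\}, y\big), $$

    where $*$ is convolution, the first two terms form the standard reconstructive CSC objective, and $\mathcal{L}_{\mathrm{sup}}$ couples the sparse codes $z_k$ to the labels $y$.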

  14. CONCEPT computer code

    International Nuclear Information System (INIS)

    Delene, J.

    1984-01-01

    CONCEPT is a computer code that will provide conceptual capital investment cost estimates for nuclear and coal-fired power plants. The code can develop an estimate for construction at any point in time. Any unit size within the range of about 400 to 1300 MW electric may be selected. Any of 23 reference site locations across the United States and Canada may be selected. PWR, BWR, and coal-fired plants burning high-sulfur and low-sulfur coal can be estimated. Multiple-unit plants can be estimated. Costs due to escalation/inflation and interest during construction are calculated

  15. Conversion of the Peach Bottom input from TRAC-BF1 to TRACE

    International Nuclear Information System (INIS)

    Jambrina, A.; Mesado, C.; Barrachina, T.; Miro, R.; Verdu, G.

    2013-01-01

    This work explains the methodology followed to translate a model from TRAC-BF1 to TRACE. The methodology is not exclusive to the PBTT (Peach Bottom Turbine Trip) model; it can also be used, with minor modifications, to translate other models, provided the source and target codes are the same.

  16. Cervical vertebrae maturation method morphologic criteria: poor reproducibility.

    Science.gov (United States)

    Nestman, Trenton S; Marshall, Steven D; Qian, Fang; Holton, Nathan; Franciscus, Robert G; Southard, Thomas E

    2011-08-01

    The cervical vertebrae maturation (CVM) method has been advocated as a predictor of peak mandibular growth. A careful review of the literature showed potential methodologic errors that might influence the high reported reproducibility of the CVM method, and we recently established that the reproducibility of the CVM method was poor when these potential errors were eliminated. The purpose of this study was to further investigate the reproducibility of the individual vertebral patterns. In other words, the purpose was to determine which of the individual CVM vertebral patterns could be classified reliably and which could not. Ten practicing orthodontists, trained in the CVM method, evaluated the morphology of cervical vertebrae C2 through C4 from 30 cephalometric radiographs using questions based on the CVM method. The Fleiss kappa statistic was used to assess interobserver agreement when evaluating each cervical vertebrae morphology question for each subject. The Kendall coefficient of concordance was used to assess the level of interobserver agreement when determining a "derived CVM stage" for each subject. Interobserver agreement was high for assessment of the lower borders of C2, C3, and C4 that were either flat or curved in the CVM method, but interobserver agreement was low for assessment of the vertebral bodies of C3 and C4 when they were either trapezoidal, rectangular horizontal, square, or rectangular vertical; this led to the overall poor reproducibility of the CVM method. These findings were reflected in the Fleiss kappa statistic. Furthermore, nearly 30% of the time, individual morphologic criteria could not be combined to generate a final CVM stage because of incompatible responses to the 5 questions. Intraobserver agreement in this study was only 62%, on average, when the inconclusive stagings were excluded as disagreements. Intraobserver agreement was worse (44%) when the inconclusive stagings were included as disagreements. For the group of subjects

  17. Inter-examiner reproducibility of tests for lumbar motor control

    Directory of Open Access Journals (Sweden)

    Elkjaer Arne

    2011-05-01

    Full Text Available Abstract Background Many studies show a relation between reduced lumbar motor control (LMC) and low back pain (LBP). However, test circumstances vary, and during test performance subjects may change position. In other words, the reliability - i.e. reproducibility and validity - of tests for LMC should be based on quantitative data. This has not been considered before. The aim was to analyse the reproducibility of five different quantitative tests for LMC commonly used in daily clinical practice. Methods The five tests for LMC were: repositioning (RPS), sitting forward lean (SFL), sitting knee extension (SKE), and bent knee fall out (BKFO), all measured in cm, and leg lowering (LL), measured in mm Hg. A total of 40 subjects (14 males, 26 females; 25 with and 15 without LBP), with a mean age of 46.5 years (SD 14.8), were examined independently and in random order by two examiners on the same day. LBP subjects were recruited from three physiotherapy clinics with a connection to the clinic's gym or back-school. Non-LBP subjects were recruited from the clinic's staff acquaintances, and from patients without LBP. Results The means and standard deviations for each of the tests were 0.36 (0.27) cm for RPS, 1.01 (0.62) cm for SFL, 0.40 (0.29) cm for SKE, 1.07 (0.52) cm for BKFO, and 32.9 (7.1) mm Hg for LL. All five tests for LMC had reproducibility with the following ICCs: 0.90 for RPS, 0.96 for SFL, 0.96 for SKE, 0.94 for BKFO, and 0.98 for LL. Bland and Altman plots showed that most of the differences between examiners A and B were less than 0.20 cm. Conclusion These five tests for LMC displayed excellent reproducibility. However, the diagnostic accuracy of these tests needs to be addressed in larger cohorts of subjects, establishing values for the normal population. Also, cut-points between subjects with and without LBP must be determined, taking into account age, level of activity, degree of impairment and participation in sports. Whether the reproducibility of these

  18. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    ...the following function is maximized. This kind of decoding strategy is called the maximum a posteriori probability (MAP) decoding strategy, as it attempts to estimate each symbol of the codeword that ... mitigate the effects of packet loss over digital networks. Undoubtedly other applications will use these codes in the years to come.
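
    In symbols (standard textbook notation, added for clarity): given the received word $y$, symbol-wise MAP decoding chooses, for each position $i$,

    $$ \hat{c}_i = \arg\max_{a \in \{0,1\}} \; P(c_i = a \mid y), $$

    maximizing the a posteriori probability of each code symbol rather than of the codeword as a whole.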

  19. New code of conduct

    CERN Multimedia

    Laëtitia Pedroso

    2010-01-01

    During his talk to the staff at the beginning of the year, the Director-General mentioned that a new code of conduct was being drawn up. What exactly is it and what is its purpose? Anne-Sylvie Catherin, Head of the Human Resources (HR) Department, talked to us about the whys and wherefores of the project.   Drawing by Georges Boixader from the cartoon strip “The World of Particles” by Brian Southworth. A code of conduct is a general framework laying down the behaviour expected of all members of an organisation's personnel. “CERN is one of the very few international organisations that don’t yet have one", explains Anne-Sylvie Catherin. “We have been thinking about introducing a code of conduct for a long time but lacked the necessary resources until now”. The call for a code of conduct has come from different sources within the Laboratory. “The Equal Opportunities Advisory Panel (read also the "Equal opportuni...

  20. CERN Code of Conduct

    CERN Document Server

    Department, HR

    2010-01-01

    The Code is intended as a guide in helping us, as CERN contributors, to understand how to conduct ourselves, treat others and expect to be treated. It is based around the five core values of the Organization. We should all become familiar with it and try to incorporate it into our daily life at CERN.

  1. Error Correcting Codes

    Indian Academy of Sciences (India)

    focused pictures of Triton, Neptune's largest moon. This great feat was in no small measure due to the fact that the sophisticated communication system on Voyager had an elaborate error correcting scheme built into it. At Jupiter and Saturn, a convolutional code was used to enhance the reliability of transmission, and at ...

  2. Nuclear safety code study

    Energy Technology Data Exchange (ETDEWEB)

    Hu, H.H.; Ford, D.; Le, H.; Park, S.; Cooke, K.L.; Bleakney, T.; Spanier, J.; Wilburn, N.P.; O' Reilly, B.; Carmichael, B.

    1981-01-01

    The objective is to analyze an overpower accident in an LMFBR. A simplified model of the primary coolant loop was developed in order to understand the instabilities encountered with the MELT III and SAS codes. The computer programs were translated for switching to the IBM 4331. Numerical methods were investigated for solving the neutron kinetics equations; the Adams and Gear methods were compared. (DLC)

  3. Student Dress Codes.

    Science.gov (United States)

    Uerling, Donald F.

    School officials see a need for regulations that prohibit disruptive and inappropriate forms of expression and attire; students see these regulations as unwanted restrictions on their freedom. This paper reviews court litigation involving constitutional limitations on school authority, dress and hair codes, state law constraints, and school…

  4. Differential pulse code modulation

    Science.gov (United States)

    Herman, C. F. (Inventor)

    1976-01-01

    A differential pulse code modulation (DPCM) encoding and decoding method is described along with an apparatus which is capable of transmission with minimum bandwidth. The apparatus is not affected by data transition density, requires no direct current (DC) response of the transmission link, and suffers from minimal ambiguity in resolution of the digital data.
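
    A generic textbook DPCM loop (our sketch, not the patented apparatus itself) quantizes each sample's deviation from a prediction formed from previously reconstructed samples, which keeps encoder and decoder in lockstep:

        import numpy as np

        def dpcm_encode(x, step):
            """First-order DPCM: quantize the deviation from the previously
            *reconstructed* sample so the decoder can track exactly."""
            pred, codes = 0.0, []
            for sample in x:
                q = int(round((sample - pred) / step))   # quantized prediction error
                codes.append(q)
                pred += q * step                         # decoder-side reconstruction
            return codes

        def dpcm_decode(codes, step):
            pred, out = 0.0, []
            for q in codes:
                pred += q * step
                out.append(pred)
            return np.array(out)

        x = np.sin(2 * np.pi * 3 * np.linspace(0, 1, 50))
        x_hat = dpcm_decode(dpcm_encode(x, step=0.05), step=0.05)
        print(float(np.abs(x - x_hat).max()))            # bounded by step/2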

  5. Error Correcting Codes

    Indian Academy of Sciences (India)

    syndrome is an indicator of underlying disease. Here too, a non-zero syndrome is an indication that something has gone wrong during transmission. The first matrix on the left hand side is called the parity check matrix H. Thus every codeword c satisfies the equation Hc^T = 0. Therefore the code can ...
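    The check Hc^T = 0 is a one-liner in practice. Below is a minimal sketch using the standard (7,4) Hamming parity-check matrix, whose columns are the binary numbers 1 through 7; the matrix and codeword are textbook examples, not taken from the article.

        # Minimal sketch: syndrome computation H c^T mod 2 for the (7,4)
        # Hamming code. Column i of H is i written in binary.
        import numpy as np

        H = np.array([[1, 0, 1, 0, 1, 0, 1],
                      [0, 1, 1, 0, 0, 1, 1],
                      [0, 0, 0, 1, 1, 1, 1]])

        codeword = np.array([1, 0, 1, 1, 0, 1, 0])  # a valid codeword
        received = codeword.copy()
        received[4] ^= 1                             # flip bit 5 in transit

        print(H @ codeword % 2)  # [0 0 0]: zero syndrome, no error detected
        print(H @ received % 2)  # [1 0 1]: binary 5, locating the flipped bit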

  6. Focusing Automatic Code Inspections

    NARCIS (Netherlands)

    Boogerd, C.J.

    2010-01-01

    Automatic Code Inspection tools help developers in early detection of defects in software. A well-known drawback of many automatic inspection approaches is that they yield too many warnings and require a clearer focus. In this thesis, we provide such focus by proposing two methods to prioritize

  7. Reliability and code level

    NARCIS (Netherlands)

    Kasperski, M.; Geurts, C.P.W.

    2005-01-01

    The paper describes the work of the IAWE Working Group WBG - Reliability and Code Level, one of the International Codification Working Groups set up at ICWE10 in Copenhagen. The following topics are covered: sources of uncertainties in the design wind load, appropriate design target values for the

  8. Broadcast Coded Slotted ALOHA

    DEFF Research Database (Denmark)

    Ivanov, Mikhail; Brännström, Frederik; Graell i Amat, Alexandre

    2016-01-01

    We propose an uncoordinated medium access control (MAC) protocol, called all-to-all broadcast coded slotted ALOHA (B-CSA) for reliable all-to-all broadcast with strict latency constraints. In B-CSA, each user acts as both transmitter and receiver in a half-duplex mode. The half-duplex mode gives...

  9. Ready, steady… Code!

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    This summer, CERN took part in the Google Summer of Code programme for the third year in succession. Open to students from all over the world, this programme leads to very successful collaborations for open source software projects.   Image: GSoC 2013. Google Summer of Code (GSoC) is a global programme that offers student developers grants to write code for open-source software projects. Since its creation in 2005, the programme has brought together some 6,000 students from over 100 countries worldwide. The students selected by Google are paired with a mentor from one of the participating projects, which can be led by institutes, organisations, companies, etc. This year, CERN PH Department’s SFT (Software Development for Experiments) Group took part in the GSoC programme for the third time, submitting 15 open-source projects. “Once published on the Google Summer of Code website (in April), the projects are open to applications,” says Jakob Blomer, one of the o...

  10. (Almost) practical tree codes

    KAUST Repository

    Khina, Anatoly

    2016-08-15

    We consider the problem of stabilizing an unstable plant driven by bounded noise over a digital noisy communication link, a scenario at the heart of networked control. To stabilize such a plant, one needs real-time encoding and decoding with an error probability profile that decays exponentially with the decoding delay. The works of Schulman and Sahai over the past two decades have developed the notions of tree codes and anytime capacity, and provided the theoretical framework for studying such problems. Nonetheless, there has been little practical progress in this area due to the absence of explicit constructions of tree codes with efficient encoding and decoding algorithms. Recently, linear time-invariant tree codes were proposed to achieve the desired result under maximum-likelihood decoding. In this work, we take one more step towards practicality, by showing that these codes can be efficiently decoded using sequential decoding algorithms, up to some loss in performance (and with some practical complexity caveats). We supplement our theoretical results with numerical simulations that demonstrate the effectiveness of the decoder in a control system setting.

  11. Physical layer network coding

    DEFF Research Database (Denmark)

    Fukui, Hironori; Popovski, Petar; Yomo, Hiroyuki

    2014-01-01

    Physical layer network coding (PLNC) has been proposed to improve throughput of the two-way relay channel, where two nodes communicate with each other, being assisted by a relay node. Most of the works related to PLNC are focused on a simple three-node model and they do not take into account...
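    The core trick is that the relay never needs the two messages separately. The sketch below shows the network-layer version of the idea, with a bit-level XOR standing in for the physical-layer superposition of signals, which this toy does not model.

        # Minimal sketch: two-way relaying with an XOR-combined broadcast.
        # Bit-level XOR stands in for physical-layer signal superposition.
        a = 0b10110100           # message from node A
        b = 0b01101110           # message from node B

        relayed = a ^ b          # relay broadcasts one combined packet

        print(bin(relayed ^ a))  # node A recovers B's message
        print(bin(relayed ^ b))  # node B recovers A's message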

  12. Corporate governance through codes

    NARCIS (Netherlands)

    Haxhi, I.; Aguilera, R.V.; Vodosek, M.; den Hartog, D.; McNett, J.M.

    2014-01-01

    The UK's 1992 Cadbury Report defines corporate governance (CG) as the system by which businesses are directed and controlled. CG codes are a set of best practices designed to address deficiencies in the formal contracts and institutions by suggesting prescriptions on the preferred role and

  13. Ptolemy Coding Style

    Science.gov (United States)

    2014-09-05

    because this would combine Ptolemy II with the GPL’d code and thus encumber Ptolemy II with the GPL. Another GNU license is the GNU Library General...permission on the source.eecs.berkeley.edu repositories, then use your local repository. bash-3.2$ svn co svn+ssh://source.eecs.berkeley.edu/chess

  14. Accumulate Repeat Accumulate Coded Modulation

    Science.gov (United States)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes that are combined with high level modulation. Thus at the decoder belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples to reliability of the bits.
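    The serial chain behind repeat-accumulate-type codes is short enough to sketch. The toy encoder below repeats, interleaves, and accumulates (a running XOR); the actual ARA construction adds precoding and puncturing, and the repetition factor and interleaver here are illustrative assumptions.

        # Minimal sketch: a repeat/interleave/accumulate encoder, the chain
        # underlying repeat-accumulate-type codes. Not the full ARA scheme.
        import random

        def ra_encode(bits, q=3, seed=7):
            repeated = [b for b in bits for _ in range(q)]  # repeat q times
            perm = list(range(len(repeated)))
            random.Random(seed).shuffle(perm)               # interleave
            out, acc = [], 0
            for i in perm:                                  # accumulate: running XOR
                acc ^= repeated[i]
                out.append(acc)
            return out

        print(ra_encode([1, 0, 1, 1]))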

  15. Causation, constructors and codes.

    Science.gov (United States)

    Hofmeyr, Jan-Hendrik S

    2018-02-01

    Relational biology relies heavily on the enriched understanding of causal entailment that Robert Rosen's formalisation of Aristotle's four causes has made possible, although to date efficient causes and the rehabilitation of final cause have been its main focus. Formal cause has been paid rather scant attention, but, as this paper demonstrates, is crucial to our understanding of many types of processes, not necessarily biological. The graph-theoretic relational diagram of a mapping has played a key role in relational biology, and the first part of the paper is devoted to developing an explicit representation of formal cause in the diagram and how it acts in combination with efficient cause to form a mapping. I then use these representations to show how Von Neumann's universal constructor can be cast into a relational diagram in a way that avoids the logical paradox that Rosen detected in his own representation of the constructor in terms of sets and mappings. One aspect that was absent from both Von Neumann's and Rosen's treatments was the necessity of a code to translate the description (the formal cause) of the automaton to be constructed into the construction process itself. A formal definition of codes in general, and organic codes in particular, allows the relational diagram to be extended so as to capture this translation of formal cause into process. The extended relational diagram is used to exemplify causal entailment in a diverse range of processes, such as enzyme action, construction of automata, communication through the Morse code, and ribosomal polypeptide synthesis through the genetic code. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Code of Practice on the International Transboundary Movement of Radioactive Waste

    International Nuclear Information System (INIS)

    1990-01-01

    On 21 September 1990, the General Conference, by resolution GC(XXXIV)/RES/530, adopted a Code of Practice on the International Transboundary Movement of Radioactive Waste and requested the Director General - inter alia - to take all necessary steps to ensure wide dissemination of the Code of Practice at both the national and the international level. The Code of Practice was elaborated by a Group of Experts established pursuant to resolution GC(XXXII)/RES/490 adopted by the General Conference in 1988. The text of the Code of Practice is reproduced herewith for the information of all Member States [fr

  17. Code of Practice on the International Transboundary Movement of Radioactive Waste

    International Nuclear Information System (INIS)

    1990-11-01

    On 21 September 1990, the General Conference, by resolution GC(XXXIV)/RES/530, adopted a Code of Practice on the International Transboundary Movement of Radioactive Waste and requested the Director General - inter alia - to take all necessary steps to ensure wide dissemination of the Code of Practice at both the national and the international level. The Code of Practice was elaborated by a Group of Experts established pursuant to resolution GC(XXXII)/RES/490 adopted by the General Conference in 1988. The text of the Code of Practice is reproduced herewith for the information of all Member States

  18. Natural wetland emissions of methylated trace elements

    NARCIS (Netherlands)

    Vriens, B.; Lenz, M.; Charlet, L.; Berg, M.; Winkel, L.H.E.

    2014-01-01

    Natural wetlands are well known for their significant methane emissions. However, trace element emissions via biomethylation and subsequent volatilization from pristine wetlands are virtually unstudied, even though wetlands constitute large reservoirs for trace elements. Here we show that the

  19. Measurement of temporal asymmetries of glucose consumption using linear profiles: reproducibility and comparison with visual analysis

    International Nuclear Information System (INIS)

    Matheja, P.; Kuwert, T.; Schaefers, M.; Schaefers, K.; Schober, O.; Diehl, B.; Stodieck, S.R.G.; Ringelstein, E.B.; Schuierer, G.

    1998-01-01

    The aim of our study was to test the reproducibility of this method and to compare its diagnostic performance to that of visual analysis in patients with complex partial seizures (CPS). Regional cerebral glucose consumption (rCMRGLc) was measured interictally in 25 CPS patients and 10 controls using F-18-deoxyglucose and the positron emission tomography (PET) camera ECAT EXACT 47. The PET scans were visually analyzed for the occurrence of unilateral temporal hypometabolism. Furthermore, rCMRGLc was quantified on six contiguous coronal planes by manually tracing maximal values of temporal glucose consumption, thus creating line profiles of temporal glucose consumption for each side. Indices of asymmetry (ASY) were then calculated from these line profiles in four temporal regions and compared to the corresponding 95% confidence intervals of the control data. All analyses were performed by two observers independently from each other and without knowledge of the clinical findings. The agreement between the two observers with regard to focus lateralization was 96% on visual analysis and 100% on quantitative analysis. There was an excellent agreement with regard to focus lateralization between visual and quantitative evaluation. (orig.) [de
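    The record does not spell out the asymmetry formula, so the sketch below assumes a common normalized-difference definition, expressed in percent and applied point-wise along the left and right line profiles; the profile values are made up.

        # Minimal sketch: per-point asymmetry index from left/right line
        # profiles. The formula is an assumed normalized difference (%).
        def asymmetry_index(left, right):
            return [200.0 * (l - r) / (l + r) for l, r in zip(left, right)]

        left_profile = [31.0, 33.5, 35.2, 34.1]    # illustrative rCMRGlc values
        right_profile = [30.5, 29.8, 30.1, 30.0]
        print(asymmetry_index(left_profile, right_profile))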

  20. Distributed trace using central performance counter memory

    Science.gov (United States)

    Satterfield, David L.; Sexton, James C.

    2013-01-22

    A plurality of processing cores and a central storage unit having at least one memory are connected in a daisy chain manner, forming a daisy chain ring layout on an integrated chip. At least one of the plurality of processing cores places trace data on the daisy chain connection for transmitting the trace data to the central storage unit, and the central storage unit detects the trace data and stores the trace data in the memory co-located with the central storage unit.

  1. Probabilistic Connections for Bidirectional Path Tracing

    OpenAIRE

    Popov, Stefan; Ramamoorthi, Ravi; Durand, Fredo; Drettakis, George

    2015-01-01

    Figure 1 caption: Our Probabilistic Connections for Bidirectional Path Tracing approach importance samples connections to an eye sub-path, and greatly reduces variance, by considering and reusing multiple light sub-paths at once. Our approach achieves much higher quality than bidirectional path tracing for the same computation time (~8.4 min). Abstract: Bidirectional path tr...

  2. Tracing Utopia in 'Utopia Station'

    DEFF Research Database (Denmark)

    Schwarzbart, Judith

    Art (Publication due in 2012)). But it seems to differ noticeably from the ideologically driven concept(s) of the 20th century avant-garde. The paper will suggest that we in experiments with openness and structure, with an ambivalent engagement in popular culture and everyday life, and complex double...... at the many layers of discourse, ‘thinking-in-process’ and collaboration. These processes led to particular presentational formats (display) and architectural frameworks for activities, and gave way to a variety of other material and situated performative modes of audience encounters. Here, we can trace avant......-garde ideas about radical democracy through open processes and active involvement of audience-participants, we can trace formal (architectural) structures back to Russian constructivism and many other links back in time, but the central question remains if these ethical-political and aesthetic gestures...

  3. Trace elements and bone health.

    Science.gov (United States)

    Zofková, Ivana; Nemcikova, Petra; Matucha, Petr

    2013-08-01

    The importance of nutritional factors such as calcium, vitamin D and vitamin K for the integrity of the skeleton is well known. Moreover, bone health is positively influenced by certain elements (e.g., zinc, copper, fluorine, manganese, magnesium, iron and boron). Deficiency of these elements slows down the increase of bone mass in childhood and/or in adolescence and accelerates bone loss after menopause or in old age. Deterioration of bone quality increases the risk of fractures. Monitoring of the homeostasis of these trace elements, together with measurement of bone density and biochemical markers of bone metabolism, should be used to identify and treat patients at risk of non-traumatic fractures. Factors determining the effectiveness of supplementation include dose, duration of treatment and serum concentrations, as well as interactions among individual elements. Here, we review the effects of the most important trace elements on the skeleton and evaluate their clinical importance.

  4. Using different approaches to assess the reproducibility of a ...

    African Journals Online (AJOL)

    Method: A previously developed and validated QFFQ was completed by trained fieldworkers. Portion sizes were estimated using different methods. Food intake was coded and analysed for nutrient intake per day for each subject. The first interview (n = 1 888) took place during the baseline data collection period. For the ...

  5. Performance Tuning of x86 OpenMP Codes with MAQAO

    Science.gov (United States)

    Barthou, Denis; Charif Rubial, Andres; Jalby, William; Koliai, Souad; Valensi, Cédric

    Failing to find the best optimization sequence for a given application can lead to compiler-generated code with poor performance or inappropriate code. It is necessary to analyze performance from the generated assembly code to improve the compilation process. This paper presents a tool for the performance analysis of multithreaded codes (OpenMP programs are supported at the moment). MAQAO relies on static performance evaluation to identify compiler optimizations and assess the performance of loops. It exploits static binary rewriting for reading and instrumenting object files or executables. Static binary instrumentation allows the insertion of probes at instruction level. Memory accesses can be captured to help tune the code, but such traces need to be compressed. MAQAO can analyze the results and provide hints for tuning the code. We show on some examples how this can help users improve their OpenMP applications.

  6. The Reliability and Reproducibility of Corneal Confocal Microscopy in Children.

    Science.gov (United States)

    Pacaud, Danièle; Romanchuk, Kenneth G; Tavakoli, Mitra; Gougeon, Claire; Virtanen, Heidi; Ferdousi, Maryam; Nettel-Aguirre, Alberto; Mah, Jean K; Malik, Rayaz A

    2015-08-01

    To assess the image- and patient-level interrater agreement and repeatability within 1 month for corneal nerve fiber length (CNFL) measured using in vivo corneal confocal microscopy (IVCCM) in children. Seventy-one subjects (mean [SD] age 14.3 [2.6] years, range 8-18 years; 44 with type 1 diabetes and 27 controls; 36 males and 35 females) were included. 547 images (∼6 images per subject) were analyzed manually by two independent and masked observers. One-month repeat visit images were analyzed by a single masked observer in 21 patients. Automated image analysis was then performed using specialized computerized software (ACCMetrics). For CNFL, the ICC (95% CI) were 0.94 (0.93-0.95) for image-level, 0.86 (0.78-0.91) for patient-level, and 0.88 (0.72-0.95) for the 1-month repeat assessment, and the Bland-Altman plots showed minimal bias between observers. Although there was excellent agreement between manual and automated analysis according to an ICC of 0.89 (0.82-0.93), the Bland-Altman plot showed a consistent bias with manual measurements providing higher readings. In vivo corneal confocal microscopy image analysis shows good reproducibility with excellent intraindividual and interindividual variability in pediatric subjects. Since the image-level reproducibility is stronger than the patient-level reproducibility, refinement of the method for image selection will likely further increase the robustness of this novel, rapid, and noninvasive approach to detect early neuropathy in children with diabetes. Further studies on the use of IVCCM to identify early subclinical neuropathy in children are indicated.

  7. Can global chemistry-climate models reproduce air quality extremes?

    Science.gov (United States)

    Schnell, J.; Prather, M. J.; Holmes, C. D.

    2013-12-01

    We identify and characterize extreme ozone pollution episodes over the USA and EU through a novel analysis of ten years (2000-2010) of surface ozone measurements. An optimal interpolation scheme is developed to create grid-cell averaged values of surface ozone that can be compared with gridded model simulations. It also allows a comparison of two non-coincident observational networks in the EU. The scheme incorporates techniques borrowed from inverse distance weighting and Kriging. It uses all representative observational site data while still recognizing the heterogeneity of surface ozone. Individual, grid-cell-level events are identified as exceedances of a historical percentile (the 10 worst days in a year, i.e. the 97.3rd percentile). A clustering algorithm is then used to construct the ozone episodes from the individual events. We then test the skill of the high-resolution (100 km) two-year (2005-2006) hindcast from the UCI global chemistry transport model in reproducing the events/episodes identified in the observations, using the same identification criteria. Although the UCI CTM has substantial biases in surface ozone, we find that it has considerable skill in reproducing both individual grid-cell-level extreme events and their connectedness in space and time, with an overall skill of 24% (32%) for the US (EU). The grid-cell-level extreme ozone events in both the observations and the UCI CTM are found to occur mostly (~75%) in coherent, multi-day, connected episodes covering areas greater than 1000 x 1000 km. In addition, the UCI CTM has greater skill in reproducing these larger episodes. We conclude that even at relatively coarse resolution, global chemistry-climate models can be used to project major synoptic pollution episodes driven by large-scale climate and chemistry changes, even with their known biases.
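    The event definition is simple to state in code. The sketch below flags a day in a grid cell as extreme when ozone exceeds that cell's 97.3rd percentile, matching roughly the 10 worst days per year; the data are random placeholders, not the study's observations.

        # Minimal sketch: percentile-based extreme-event flags per grid cell.
        # Random placeholder data; ~2.7% of days per cell come out extreme.
        import numpy as np

        rng = np.random.default_rng(0)
        ozone = rng.gamma(shape=4.0, scale=12.0, size=(3650, 10))  # days x cells

        threshold = np.percentile(ozone, 97.3, axis=0)  # per-cell threshold
        extreme = ozone > threshold                     # boolean event mask
        print(extreme.sum(axis=0))                      # ~98 events per cell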

  8. Repeatability and reproducibility of decisions by latent fingerprint examiners.

    Directory of Open Access Journals (Sweden)

    Bradford T Ulery

    The interpretation of forensic fingerprint evidence relies on the expertise of latent print examiners. We tested latent print examiners on the extent to which they reached consistent decisions. This study assessed intra-examiner repeatability by retesting 72 examiners on comparisons of latent and exemplar fingerprints, after an interval of approximately seven months; each examiner was reassigned 25 image pairs for comparison, out of a total pool of 744 image pairs. We compare these repeatability results with reproducibility (inter-examiner) results derived from our previous study. Examiners repeated 89.1% of their individualization decisions, and 90.1% of their exclusion decisions; most of the changed decisions resulted in inconclusive decisions. Repeatability of comparison decisions (individualization, exclusion, inconclusive) was 90.0% for mated pairs, and 85.9% for nonmated pairs. Repeatability and reproducibility were notably lower for comparisons assessed by the examiners as "difficult" than for "easy" or "moderate" comparisons, indicating that examiners' assessments of difficulty may be useful for quality assurance. No false positive errors were repeated (n = 4); 30% of false negative errors were repeated. One percent of latent value decisions were completely reversed (no value even for exclusion vs. of value for individualization). Most of the inter- and intra-examiner variability concerned whether the examiners considered the information available to be sufficient to reach a conclusion; this variability was concentrated on specific image pairs such that repeatability and reproducibility were very high on some comparisons and very low on others. Much of the variability appears to be due to making categorical decisions in borderline cases.

  9. REPRODUCIBILITY OF MASKED HYPERTENSION AMONG ADULTS 30 YEARS AND OLDER

    Science.gov (United States)

    Viera, Anthony J.; Lin, Feng-Chang; Tuttle, Laura A.; Olsson, Emily; Stankevitz, Kristin; Girdler, Susan S.; Klein, J. Larry; Hinderliter, Alan L.

    2015-01-01

    Objective: Masked hypertension (MH) refers to non-elevated office blood pressure (BP) with elevated out-of-office BP, but its reproducibility has not been conclusively established. We examined one-week reproducibility of MH by home BP monitoring (HBPM) and ambulatory BP monitoring (ABPM). Methods: We recruited 420 adults not on BP-lowering medication with recent clinic BP between 120/80 and 149/95 mm Hg. For main comparisons, participants with non-elevated office BP were considered to have MH by ABPM if the awake ABPM average was ≥135/85 mm Hg; they were considered to have MH by HBPM if the HBPM average was ≥135/85 mm Hg. Percent agreements were quantified using kappa. We also examined the prevalence of MH defined using an out-of-office average of ≥130/80 mm Hg. We conducted sensitivity analyses using different threshold BP levels for ABPM-office pairings and HBPM-office pairings for defining MH. Results: Prevalence rates of MH based on office-awake ABPM pairings were 44% and 43%, with agreement of 71% (kappa=0.40; 95% CI 0.31–0.49). MH was less prevalent (15% and 17%) using HBPM-office pairings, with agreement of 82% (kappa=0.30; 95% CI 0.16–0.44), and more prevalent when considering the 24-hour average (50% and 48%). MH was also less prevalent when more stringent diagnostic criteria were applied. Office-HBPM pairings and office-awake ABPM pairings had fair agreement on MH classification on both occasions, with kappas of 0.36 and 0.30. Conclusions: MH has fair short-term reproducibility, providing further evidence that for some people, out-of-office BP is systematically higher than when measured in the office setting. PMID:24842491
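    Agreement beyond chance in studies like this is typically quantified with Cohen's kappa. The sketch below computes it for two binary MH classifications; the visit vectors are illustrative, not study data.

        # Minimal sketch: Cohen's kappa for two binary classifications
        # (1 = masked hypertension). Illustrative data only.
        def cohens_kappa(x, y):
            n = len(x)
            po = sum(a == b for a, b in zip(x, y)) / n       # observed agreement
            pe = (sum(x) / n) * (sum(y) / n) \
                 + (1 - sum(x) / n) * (1 - sum(y) / n)       # chance agreement
            return (po - pe) / (1 - pe)

        visit1 = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
        visit2 = [1, 0, 0, 0, 1, 0, 1, 1, 0, 0]
        print(cohens_kappa(visit1, visit2))  # ~0.58: moderate agreement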

  10. Reproducibility of gallbladder ejection fraction measured by fatty meal cholescintigraphy

    International Nuclear Information System (INIS)

    Al-Muqbel, Kusai M.; Hani, M. N. Hani; Elheis, M. A.; Al-Omari, M. H.

    2010-01-01

    There are conflicting data in the literature regarding the reproducibility of the gallbladder ejection fraction (GBEF) measured by fatty meal cholescintigraphy (CS). We aimed to test the reproducibility of GBEF measured by fatty meal CS. Thirty-five subjects (25 healthy volunteers and 10 patients with chronic abdominal pain) underwent fatty meal CS twice in order to measure GBEF1 and GBEF2. The healthy volunteers underwent a repeat scan within 1-13 months from the first scan. The patients underwent a repeat scan within 1-4 years from the first scan and were not found to have chronic acalculous cholecystitis (CAC). Our standard fatty meal was composed of a 60-g Snickers chocolate bar and 200 ml full-fat yogurt. The mean ± SD values for GBEF1 and GBEF2 were 52±17% and 52±16%, respectively. There was a direct linear correlation between the values of GBEF1 and GBEF2 for the subjects, with a correlation coefficient of 0.509 (p=0.002). Subgroup data analysis of the volunteer group showed that there was significant linear correlation between volunteer values of GBEF1 and GBEF2, with a correlation coefficient of 0.473 (p=0.017). Subgroup data analysis of the non-CAC patient group showed no significant correlation between patient values of GBEF1 and GBEF2, likely due to limited sample size. This study showed that fatty meal CS is a reliable test in gallbladder motility evaluation and that GBEF measured by fatty meal CS is reproducible

  11. Reproducibility of gallbladder ejection fraction measured by fatty meal cholescintigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Al-Muqbel, Kusai M.; Hani, M. N. Hani; Elheis, M. A.; Al-Omari, M. H. [School of Medicine, Jordan University of Science and Technology, Irbid (Jordan)

    2010-12-15

    There are conflicting data in the literature regarding the reproducibility of the gallbladder ejection fraction (GBEF) measured by fatty meal cholescintigraphy (CS). We aimed to test the reproducibility of GBEF measured by fatty meal CS. Thirty-five subjects (25 healthy volunteers and 10 patients with chronic abdominal pain) underwent fatty meal CS twice in order to measure GBEF1 and GBEF2. The healthy volunteers underwent a repeat scan within 1-13 months from the first scan. The patients underwent a repeat scan within 1-4 years from the first scan and were not found to have chronic acalculous cholecystitis (CAC). Our standard fatty meal was composed of a 60-g Snickers chocolate bar and 200 ml full-fat yogurt. The mean ± SD values for GBEF1 and GBEF2 were 52±17% and 52±16%, respectively. There was a direct linear correlation between the values of GBEF1 and GBEF2 for the subjects, with a correlation coefficient of 0.509 (p=0.002). Subgroup data analysis of the volunteer group showed that there was significant linear correlation between volunteer values of GBEF1 and GBEF2, with a correlation coefficient of 0.473 (p=0.017). Subgroup data analysis of the non-CAC patient group showed no significant correlation between patient values of GBEF1 and GBEF2, likely due to limited sample size. This study showed that fatty meal CS is a reliable test in gallbladder motility evaluation and that GBEF measured by fatty meal CS is reproducible

  12. Assessment of precision and reproducibility of a new myograph

    Directory of Open Access Journals (Sweden)

    Piepenbrock Siegfried

    2007-12-01

    Background: The physiological characteristics of muscle activity and the assessment of muscle strength represent important diagnostic information. There are many devices that measure muscle force in humans, but some require voluntary contractions, which are difficult to assess in weak or unconscious patients who are unable to complete a full range of voluntary force assessment tasks. Other devices, which obtain standard muscle contractions by electric stimulation, do not have the technology required to induce and measure reproducible valid contractions at the optimum muscle length. Methods: In our study we used a newly developed diagnostic device which accurately measures the reproducibility and variability over time of the force of an individual muscle. A total of 500 in-vivo measurements of supra-maximal isometric single twitch contractions were carried out on the musculus adductor pollicis of 5 test subjects over 10 sessions, with ten repetitions per session. The same protocol was performed on 405 test subjects with two repetitions each to determine a reference interval for healthy subjects. Results: Using our test setting, we found a high reproducibility of the muscle contractions of each test subject. The precision of the measurements performed with our device was 98.74%. Only two consecutive measurements are needed in order to assess a real, representative individual value of muscle force. The mean value of the force of contraction was 9.51 N and the 95% reference interval was 4.77–14.25 N. Conclusion: The new myograph is a highly reliable measuring device with which the adductor pollicis can be investigated at the optimum length. It has the potential to become a reliable and valid tool for diagnosis in the clinical setting and for monitoring neuromuscular diseases.

  13. Requirement Tracing using Term Extraction

    OpenAIRE

    Al-Saati, Najla; Abdul-Jaleel, Raghda

    2015-01-01

    Requirements traceability is an essential step in ensuring the quality of software during the early stages of its development life cycle. Requirements tracing usually consists of document parsing, candidate link generation and evaluation and traceability analysis. This paper demonstrates the applicability of Statistical Term Extraction metrics to generate candidate links. It is applied and validated using two data sets and four types of filters two for each data set, 0.2 and 0.25 for MODIS, 0...

  14. Trace Amines and Cocaine Abuse

    OpenAIRE

    Li, Jun-Xu

    2014-01-01

    Cocaine addiction remains a clinical challenge with no effective pharmacotherapy available. Trace amine associated receptor (TAAR) 1 represents a promising drug target for the modulation of dopaminergic system and stimulant abuse. This Viewpoint discusses the emerging data which strongly suggest that TAAR 1 functions as a molecular “brake” that controls the addiction-related effects of cocaine and could be a novel drug target for the development of efficacious pharmacothe...

  15. Precambrian biota: protistan origin of trace fossils?

    Science.gov (United States)

    Pawlowski, Jan; Gooday, Andrew J

    2009-01-13

    Some Precambrian trace fossils have been presented as evidence for the early origin of bilaterians; the recent finding that large amoeboid protists leave macroscopic traces at the bottom of the deep ocean questions the metazoan nature of early trace fossils, stressing the importance of single-cell organisms in Precambrian biota.

  16. Reproducible diagnosis of Chronic Lymphocytic Leukemia by flow cytometry

    DEFF Research Database (Denmark)

    Rawstron, Andy C; Kreuzer, Karl-Anton; Soosapilla, Asha

    2018-01-01

    diagnosis were: CD43, CD79b, CD81, CD200, CD10, and ROR1. Reproducible criteria for component reagents were assessed retrospectively in 14,643 cases from 13 different centres and showed >97% concordance with current approaches. A pilot study to validate staining quality was completed in eleven centres...... identified. Finally, a consensus "recommended" panel of markers to refine diagnosis in borderline cases (CD43, CD79b, CD81, CD200, CD10, ROR1) has been defined and will be prospectively evaluated. This article is protected by copyright. All rights reserved.

  17. Timbral aspects of reproduced sound in small rooms. I

    DEFF Research Database (Denmark)

    Bech, Søren

    1995-01-01

    This paper reports some of the influences of individual reflections on the timbre of reproduced sound. A single loudspeaker with frequency-independent directivity characteristics, positioned in a listening room of normal size with frequency-independent absorption coefficients of the room surfaces......, has been simulated using an electroacoustic setup. The model included the direct sound, 17 individual reflections, and the reverberant field. The threshold of detection and just-noticeable differences for an increase in level were measured for individual reflections using eight subjects for noise...

  18. Predictions of bubbly flows in vertical pipes using two-fluid models in CFDS-FLOW3D code

    Energy Technology Data Exchange (ETDEWEB)

    Banas, A.O.; Carver, M.B. [Chalk River Laboratories (Canada); Unrau, D. [Univ. of Toronto (Canada)

    1995-09-01

    This paper reports the results of a preliminary study exploring the performance of two sets of two-fluid closure relationships applied to the simulation of turbulent air-water bubbly upflows through vertical pipes. Predictions obtained with the default CFDS-FLOW3D model for dispersed flows were compared with the predictions of a new model (based on the work of Lee), and with the experimental data of Liu. The new model, implemented in the CFDS-FLOW3D code, included additional source terms in the "standard" κ-ε transport equations for the liquid phase, as well as modified model coefficients and wall functions. All simulations were carried out in a 2-D axisymmetric format, collapsing the general multifluid framework of CFDS-FLOW3D to the two-fluid (air-water) case. The newly implemented model consistently improved predictions of radial-velocity profiles of both phases, but failed to accurately reproduce the experimental phase-distribution data. This shortcoming was traced to the neglect of anisotropic effects in the modelling of liquid-phase turbulence. In this sense, the present investigation should be considered as the first step toward the ultimate goal of developing a theoretically sound and universal CFD-type two-fluid model for bubbly flows in channels.

  19. Predictions of bubbly flows in vertical pipes using two-fluid models in CFDS-FLOW3D code

    International Nuclear Information System (INIS)

    Banas, A.O.; Carver, M.B.; Unrau, D.

    1995-01-01

    This paper reports the results of a preliminary study exploring the performance of two sets of two-fluid closure relationships applied to the simulation of turbulent air-water bubbly upflows through vertical pipes. Predictions obtained with the default CFDS-FLOW3D model for dispersed flows were compared with the predictions of a new model (based on the work of Lee), and with the experimental data of Liu. The new model, implemented in the CFDS-FLOW3D code, included additional source terms in the "standard" κ-ε transport equations for the liquid phase, as well as modified model coefficients and wall functions. All simulations were carried out in a 2-D axisymmetric format, collapsing the general multifluid framework of CFDS-FLOW3D to the two-fluid (air-water) case. The newly implemented model consistently improved predictions of radial-velocity profiles of both phases, but failed to accurately reproduce the experimental phase-distribution data. This shortcoming was traced to the neglect of anisotropic effects in the modelling of liquid-phase turbulence. In this sense, the present investigation should be considered as the first step toward the ultimate goal of developing a theoretically sound and universal CFD-type two-fluid model for bubbly flows in channels.

  20. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

    Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, how to determine the number of shared pairs required to construct entanglement-assisted quantum codes is not an easy task. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.

  1. High-adhesive superhydrophobic 3D nanostructured silver films applied as sensitive, long-lived, reproducible and recyclable SERS substrates

    Science.gov (United States)

    Wu, Yunwen; Hang, Tao; Komadina, Jason; Ling, Huiqin; Li, Ming

    2014-07-01

    Silver films with different morphologies were chemically deposited by controlling the bath composition. It is found that the wettability and surface enhanced Raman scattering (SERS) properties were closely connected with the surface morphology. Due to the perfect 3D morphology and the 3D electromagnetic field enhanced by three types of nanogaps distributed uniformly, the 3D microball/nanosheet (MN) silver film shows better SERS properties than those of 2D nanosheets (NSs) and nanoparticles (NPs). The MN silver film showed high adhesive superhydrophobic properties after an oxidation process without any functionalization. It can hold the liquid droplet and trace the target molecules in a rather small volume. The SERS properties of the oxidized MN substrate were enhanced remarkably compared to those of the freshly prepared substrate because of the concentrating effect of the superhydrophobicity. The as-prepared 3D MN silver substrate has also exhibited good performances in reproducibility and reutilization which makes it a promising substrate for molecule tracing.

  2. Physics behind the mechanical nucleosome positioning code

    Science.gov (United States)

    Zuiddam, Martijn; Everaers, Ralf; Schiessel, Helmut

    2017-11-01

    The positions along DNA molecules of nucleosomes, the most abundant DNA-protein complexes in cells, are influenced by the sequence-dependent DNA mechanics and geometry. This leads to the "nucleosome positioning code", a preference of nucleosomes for certain sequence motives. Here we introduce a simplified model of the nucleosome where a coarse-grained DNA molecule is frozen into an idealized superhelical shape. We calculate the exact sequence preferences of our nucleosome model and find it to reproduce qualitatively all the main features known to influence nucleosome positions. Moreover, using well-controlled approximations to this model allows us to come to a detailed understanding of the physics behind the sequence preferences of nucleosomes.

  3. Codes of Good Governance

    DEFF Research Database (Denmark)

    Beck Jørgensen, Torben; Sørensen, Ditte-Lene

    2013-01-01

    Good governance is a broad concept used by many international organizations to spell out how states or countries should be governed. Definitions vary, but there is a clear core of common public values, such as transparency, accountability, effectiveness, and the rule of law. It is quite likely......, however, that national views of good governance reflect different political cultures and institutional heritages. Fourteen national codes of conduct are analyzed. The findings suggest that public values converge and that they match model codes from the United Nations and the European Council as well...... as conceptions of good governance from other international organizations. While values converge, they are balanced and communicated differently, and seem to some extent to be translated into the national cultures. The set of global public values derived from this analysis include public interest, regime dignity...

  4. Coding isotropic images

    Science.gov (United States)

    Oneal, J. B., Jr.; Natarajan, T. R.

    1976-01-01

    Rate distortion functions for two-dimensional homogeneous isotropic images are compared with the performance of 5 source encoders designed for such images. Both unweighted and frequency weighted mean square error distortion measures are considered. The coders considered are differential PCM (DPCM) using six previous samples in the prediction, herein called 6 pel (picture element) DPCM; simple DPCM using single sample prediction; 6 pel DPCM followed by entropy coding; an 8 x 8 discrete cosine transform coder; and a 4 x 4 Hadamard transform coder. Other transform coders were studied and found to have about the same performance as the two transform coders above. With the mean square error distortion measure, DPCM with entropy coding performed best. The relative performance of the coders changes slightly when the distortion measure is frequency weighted mean square error. The performance of all the coders was separated by only about 4 dB.

  5. Efficient convolutional sparse coding

    Energy Technology Data Exchange (ETDEWEB)

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M³N) to O(MN log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
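    To illustrate the frequency-domain solve, the sketch below handles the single-filter ridge-regularized case, where the system diagonalizes under the FFT and is solved pointwise; the full ADMM iteration over M filters is not shown, and the filter and signal are random placeholders.

        # Minimal sketch: solving (conj(D)D + rho I) x = conj(D) y pointwise
        # in the Fourier domain for a single filter d (circular convolution).
        import numpy as np

        rng = np.random.default_rng(1)
        d = np.pad(rng.standard_normal(16), (0, 240))  # filter, zero-padded
        x_true = rng.standard_normal(256)
        y = np.real(np.fft.ifft(np.fft.fft(d) * np.fft.fft(x_true)))

        D, rho = np.fft.fft(d), 1e-3
        x_hat = np.conj(D) * np.fft.fft(y) / (np.abs(D) ** 2 + rho)
        x_rec = np.real(np.fft.ifft(x_hat))
        # relative error; small when the filter spectrum is well conditioned
        print(np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))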

  6. Computer code FIT

    International Nuclear Information System (INIS)

    Rohmann, D.; Koehler, T.

    1987-02-01

    This is a description of the computer code FIT, written in FORTRAN-77 for a PDP 11/34. FIT is an interactive program to deduce the position, width and intensity of lines in X-ray spectra (max. length of 4K channels). The lines (max. 30 lines per fit) may have a Gauss or Voigt profile, as well as exponential tails. Spectrum and fit can be displayed on a Tektronix terminal. (orig.) [de

  7. Status of MARS Code

    Energy Technology Data Exchange (ETDEWEB)

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, and histogramming, as well as the MAD-MARS Beam Line Builder and Graphical User Interface.

  8. Cracking the Gender Codes

    DEFF Research Database (Denmark)

    Rennison, Betina Wolfgang

    2016-01-01

    Why do men continue to fill most of the senior executive positions and seats in the board of directors in Western corporations? Almost everyone agrees that diversity is good, many women are coming down the pipeline, and companies, states and international organizations and institutions have done...... in leadership management, we must become more aware and take advantage of this complexity. We must crack the codes in order to crack the curve....

  9. Hydra Code Release

    OpenAIRE

    Couchman, H. M. P.; Pearce, F. R.; Thomas, P. A.

    1996-01-01

    A new version of the AP3M-SPH code, Hydra, is now available as a tar file from the following sites: http://coho.astro.uwo.ca/pub/hydra/hydra.html , http://star-www.maps.susx.ac.uk/~pat/hydra/hydra.html . The release now also contains a cosmological initial conditions generator, documentation, an installation guide and installation tests. A LaTeX version of the documentation is included here

  10. Tokamak simulation code manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Moon Kyoo; Oh, Byung Hoon; Hong, Bong Keun; Lee, Kwang Won [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-01-01

    The method to use TSC (Tokamak Simulation Code), developed by the Princeton Plasma Physics Laboratory, is illustrated. For the KT-2 tokamak, time-dependent simulation of axisymmetric toroidal plasma and vertical stability have to be taken into account in the design phase using TSC. In this report the physical modelling of TSC is described and examples of application in JAERI and SERI are illustrated, which will be useful when TSC is installed on the KAERI computer system. (Author) 15 refs., 6 figs., 3 tabs.

  11. Tokamak simulation code manual

    International Nuclear Information System (INIS)

    Chung, Moon Kyoo; Oh, Byung Hoon; Hong, Bong Keun; Lee, Kwang Won

    1995-01-01

    The method to use TSC (Tokamak Simulation Code), developed by the Princeton Plasma Physics Laboratory, is illustrated. For the KT-2 tokamak, time-dependent simulation of axisymmetric toroidal plasma and vertical stability have to be taken into account in the design phase using TSC. In this report the physical modelling of TSC is described and examples of application in JAERI and SERI are illustrated, which will be useful when TSC is installed on the KAERI computer system. (Author) 15 refs., 6 figs., 3 tabs.

  12. ER@CEBAF: Modeling code developments

    Energy Technology Data Exchange (ETDEWEB)

    Meot, F. [Brookhaven National Lab. (BNL), Upton, NY (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Roblin, Y. [Brookhaven National Lab. (BNL), Upton, NY (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-04-13

    A proposal for a multiple-pass, high-energy, energy-recovery experiment using CEBAF is under preparation in the frame of a JLab-BNL collaboration. In view of beam dynamics investigations regarding this project, in addition to the existing model in use in Elegant, a version of CEBAF is developed in the stepwise ray-tracing code Zgoubi. Beyond the ER experiment, it is also planned to use the latter for the study of polarization transport in the presence of synchrotron radiation, down to the Hall D line, where a 12 GeV polarized beam can be delivered. This Note briefly reports on the preliminary steps, and preliminary outcomes, based on an Elegant-to-Zgoubi translation.

  13. Reproducibility of suppression of Pythium wilt of cucumber by compost

    Directory of Open Access Journals (Sweden)

    Mauritz Vilhelm Vestberg

    2014-10-01

    There is increasing global interest in using compost to suppress soil-borne fungal and bacterial diseases and nematodes. We studied the reproducibility of compost suppressive capacity (SC) against Pythium wilt of cucumber using nine composts produced by the same composting plant in 2008 and 2009. A bioassay was set up in a greenhouse using cucumber inoculated with two strains of Pythium. The composts were used as 20% mixtures (v:v) of a basic steam-sterilized light Sphagnum peat and sand (3:1, v:v). Shoot height was measured weekly during the 5-week experiment. At harvest, the SC was calculated as the % difference in shoot dry weight (DW) between non-inoculated and inoculated cucumbers. The SC was not affected by year of production (2008 or 2009), indicating reproducibility of SC when the raw materials and the composting method are not changed. Differences in shoot height were not as pronounced as those for shoot DW. The results were encouraging, but further studies are still needed for producing compost with guaranteed suppressiveness properties.

  14. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns.

    Directory of Open Access Journals (Sweden)

    Andrea Maesani

    2015-11-01

    The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs (locomotor bouts) matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior.

  15. An occlusal plaque index. Measurements of repeatability, reproducibility, and sensitivity.

    Science.gov (United States)

    Splieth, Christian H; Nourallah, Abduhl W

    2006-06-01

    To evaluate a new, computerized method of measuring dental plaque on occlusal surfaces which exhibit the highest caries prevalence. In 16 patients (6-9 years of age), plaque on the occlusal surfaces of permanent molars was stained (Mira-2-Tone) and photographed with an intra-oral camera. In a conventional picture editing program (PC/Adobe PhotoShop 6.0), the occlusal surface and plaque were measured in pixels and the relative proportion of occlusal plaque was calculated (ANALYSIS 3.0). The repeatability and reproducibility of the method were analyzed by re-taking and analyzing four images by two examiners four times via intra- and inter-examiner correlation coefficients and by re-analyzing 10 images. Sensitivity was tested by re-taking and analyzing the images of the same occlusal surfaces in all patients after instructed brushing with an electric toothbrush. Intra- and inter-examiner correlation coefficients for repeatability and reproducibility of the analysis were excellent (ICC> 0.997 and ICC=0.98, resp.; 95% confidence interval: 0.955-0.995). The inter- and intra-examiner coefficients for the whole procedure including the re-taking of images were also high (ICC > 0.90). The method was also highly sensitive, proving a statistically significant plaque reduction after brushing (before: mean 29.2% plaque, after: 14.7% plaque; t-test, P= 0.025).
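    The index itself is a pixel-count ratio: stained-plaque pixels over occlusal-surface pixels. Below is a minimal sketch with toy boolean masks standing in for the segmented images.

        # Minimal sketch: occlusal plaque index as a pixel-count ratio.
        # The boolean masks are toy arrays, not real segmented images.
        import numpy as np

        surface_mask = np.zeros((8, 8), dtype=bool)
        surface_mask[1:7, 1:7] = True         # occlusal surface: 36 pixels
        plaque_mask = np.zeros_like(surface_mask)
        plaque_mask[2:5, 2:6] = True          # stained plaque: 12 pixels

        plaque_pct = 100.0 * (plaque_mask & surface_mask).sum() / surface_mask.sum()
        print(f"{plaque_pct:.1f}% plaque")    # 33.3%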

  16. Everware toolkit. Supporting reproducible science and challenge-driven education.

    Science.gov (United States)

    Ustyuzhanin, A.; Head, T.; Babuschkin, I.; Tiunov, A.

    2017-10-01

    Modern science clearly demands a higher level of reproducibility and collaboration. To make research fully reproducible one has to take care of several aspects: research protocol description, data access, environment preservation, workflow pipeline, and analysis script preservation. Version control systems like git help with the workflow and analysis script parts. Virtualization techniques like Docker or Vagrant can help deal with environments. Jupyter notebooks are a powerful platform for conducting research in a collaborative manner. We present the project Everware, which seamlessly integrates git repository management systems such as GitHub or GitLab, Docker and Jupyter, helping with a) sharing the results of real research and b) boosting education activities. With the help of Everware one can share not only the final artifacts of research but the full depth of the research process. This has been shown to be extremely helpful during the organization of several data analysis hackathons and machine learning schools. Using Everware, participants could start from an existing solution instead of starting from scratch. They could start contributing immediately. Everware allows its users to make use of their own computational resources to run the workflows they are interested in, which leads to higher scalability of the toolkit.

  17. A reproducible oral microcosm biofilm model for testing dental materials.

    Science.gov (United States)

    Rudney, J D; Chen, R; Lenton, P; Li, J; Li, Y; Jones, R S; Reilly, C; Fok, A S; Aparicio, C

    2012-12-01

    Most studies of biofilm effects on dental materials use single-species biofilms, or consortia. Microcosm biofilms grown directly from saliva or plaque are much more diverse, but difficult to characterize. We used the Human Oral Microbial Identification Microarray (HOMIM) to validate a reproducible oral microcosm model. Saliva and dental plaque were collected from adults and children. Hydroxyapatite and dental composite discs were inoculated with either saliva or plaque, and microcosm biofilms were grown in a CDC biofilm reactor. In later experiments, the reactor was pulsed with sucrose. DNA from inoculums and microcosms was analysed by HOMIM for 272 species. Microcosms included about 60% of species from the original inoculum. Biofilms grown on hydroxyapatite and composites were extremely similar. Sucrose pulsing decreased diversity and pH, but increased the abundance of Streptococcus and Veillonella. Biofilms from the same donor, grown at different times, clustered together. This model produced reproducible microcosm biofilms that were representative of the oral microbiota. Sucrose induced changes associated with dental caries. This is the first use of HOMIM to validate an oral microcosm model that can be used to study the effects of complex biofilms on dental materials. © 2012 The Society for Applied Microbiology.

  18. Reproducibility of Variant Calls in Replicate Next Generation Sequencing Experiments.

    Directory of Open Access Journals (Sweden)

    Yuan Qi

    Nucleotide alterations detected by next generation sequencing are not always true biological changes but could represent sequencing errors. Even highly accurate methods can yield substantial error rates when applied to millions of nucleotides. In this study, we examined the reproducibility of nucleotide variant calls in replicate sequencing experiments of the same genomic DNA. We performed targeted sequencing of all known human protein kinase genes (kinome, ~3.2 Mb) using the SOLiD v4 platform. Seventeen breast cancer samples were sequenced in duplicate (n=14) or triplicate (n=3) to assess concordance of all calls and single nucleotide variant (SNV) calls. The concordance rates over the entire sequenced region were >99.99%, while the concordance rates for SNVs were 54.3-75.5%. There was substantial variation in basic sequencing metrics from experiment to experiment. The type of nucleotide substitution and genomic location of the variant had little impact on concordance, but concordance increased with coverage level, variant allele count (VAC), variant allele frequency (VAF), variant allele quality and p-value of the SNV call. The most important determinants of concordance were VAC and VAF. Even using the highest stringency of QC metrics, the reproducibility of SNV calls was around 80%, suggesting that erroneous variant calling can be as high as 20-40% in a single experiment. The sequence data have been deposited into the European Genome-phenome Archive (EGA) with accession number EGAS00001000826.

  19. Reproducible Research Practices and Transparency across the Biomedical Literature.

    Directory of Open Access Journals (Sweden)

    Shareen A Iqbal

    2016-01-01

    There is a growing movement to encourage reproducibility and transparency practices in the scientific community, including public access to raw data and protocols, the conduct of replication studies, systematic integration of evidence in systematic reviews, and the documentation of funding and potential conflicts of interest. In this survey, we assessed the current status of reproducibility and transparency addressing these indicators in a random sample of 441 biomedical journal articles published in 2000-2014. Only one study provided a full protocol and none made all raw data directly available. Replication studies were rare (n = 4), and only 16 studies had their data included in a subsequent systematic review or meta-analysis. The majority of studies did not mention anything about funding or conflicts of interest. The percentage of articles with no statement of conflict decreased substantially between 2000 and 2014 (94.4% in 2000 to 34.6% in 2014); the percentage of articles reporting statements of conflicts (0% in 2000, 15.4% in 2014) or no conflicts (5.6% in 2000, 50.0% in 2014) increased. Articles published in journals in the clinical medicine category versus other fields were almost twice as likely to not include any information on funding and to have private funding. This study provides baseline data to compare future progress in improving these indicators in the scientific literature.

  20. Reproducibility of Variant Calls in Replicate Next Generation Sequencing Experiments

    Science.gov (United States)

    Qi, Yuan; Liu, Xiuping; Liu, Chang-gong; Wang, Bailing; Hess, Kenneth R.; Symmans, W. Fraser; Shi, Weiwei; Pusztai, Lajos

    2015-01-01

    Nucleotide alterations detected by next generation sequencing are not always true biological changes but could represent sequencing errors. Even highly accurate methods can yield substantial error rates when applied to millions of nucleotides. In this study, we examined the reproducibility of nucleotide variant calls in replicate sequencing experiments of the same genomic DNA. We performed targeted sequencing of all known human protein kinase genes (kinome) (~3.2 Mb) using the SOLiD v4 platform. Seventeen breast cancer samples were sequenced in duplicate (n=14) or triplicate (n=3) to assess concordance of all calls and single nucleotide variant (SNV) calls. The concordance rates over the entire sequenced region were >99.99%, while the concordance rates for SNVs were 54.3-75.5%. There was substantial variation in basic sequencing metrics from experiment to experiment. The type of nucleotide substitution and genomic location of the variant had little impact on concordance but concordance increased with coverage level, variant allele count (VAC), variant allele frequency (VAF), variant allele quality and p-value of SNV-call. The most important determinants of concordance were VAC and VAF. Even using the highest stringency of QC metrics the reproducibility of SNV calls was around 80% suggesting that erroneous variant calling can be as high as 20-40% in a single experiment. The sequence data have been deposited into the European Genome-phenome Archive (EGA) with accession number EGAS00001000826. PMID:26136146