WorldWideScience

Sample records for technology validation codes

  1. Polar Code Validation

    Science.gov (United States)

    1989-09-30

    [Recoverable front-matter fragments: Summary of POLAR Achievements (p. 3); 3. POLAR Code Physical Models (p. 5); 3.1 Plasma ...] 1.1. The Charge-2 Rocket. The Charge-2 payload was launched on a Black Brant VB from White Sands Missile Range in New Mexico in

  2. Validation of Advanced Computer Codes for VVER Technology: LB-LOCA Transient in PSB-VVER Facility

    Directory of Open Access Journals (Sweden)

    A. Del Nevo

    2012-01-01

    Full Text Available The OECD/NEA PSB-VVER project provided unique and useful experimental data for code validation from the PSB-VVER test facility. This facility represents the scaled-down layout of the Russian-designed pressurized water reactor VVER-1000. Five experiments were executed, dealing with loss-of-coolant scenarios (small, intermediate, and large break loss-of-coolant accidents, and a primary-to-secondary leak) and a parametric study (a natural circulation test aimed at characterizing the VVER system at reduced mass inventory conditions). The comparative analysis presented in the paper regards the large break loss-of-coolant accident experiment. Four participants from three different institutions were involved in the benchmark and applied their own models and setups for four different thermal-hydraulic system codes. The benchmark demonstrated the performance of these codes in predicting phenomena relevant to safety on the basis of fixed criteria.

  3. 45 CFR 162.1011 - Valid code sets.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Valid code sets. 162.1011 Section 162.1011 Public... ADMINISTRATIVE REQUIREMENTS Code Sets § 162.1011 Valid code sets. Each code set is valid within the dates specified by the organization responsible for maintaining that code set....

  4. The stellar atmosphere simulation code Bifrost. Code description and validation

    Science.gov (United States)

    Gudiksen, B. V.; Carlsson, M.; Hansteen, V. H.; Hayek, W.; Leenaarts, J.; Martínez-Sykora, J.

    2011-07-01

    Context. Numerical simulations of stellar convection and photospheres have been developed to the point where detailed shapes of observed spectral lines can be explained. Stellar atmospheres are very complex, and very different physical regimes are present in the convection zone, photosphere, chromosphere, transition region and corona. To understand the details of the atmosphere it is necessary to simulate the whole atmosphere since the different layers interact strongly. These physical regimes are very diverse and it takes a highly efficient massively parallel numerical code to solve the associated equations. Aims: The design, implementation and validation of the massively parallel numerical code Bifrost for simulating stellar atmospheres from the convection zone to the corona. Methods: The code is subjected to a number of validation tests, among them the Sod shock tube test, the Orszag-Tang colliding shock test, boundary condition tests and tests of how the code treats magnetic field advection, chromospheric radiation, radiative transfer in an isothermal scattering atmosphere, hydrogen ionization and thermal conduction. Results: Bifrost completes the tests with good results and shows near-linear efficiency scaling to thousands of computing cores.
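
The Sod shock tube named among the validation tests is a standard compressible-flow benchmark: two uniform gas states separated by a diaphragm evolve into a rarefaction, a contact discontinuity, and a shock. As a minimal illustration of the test setup (a first-order Lax-Friedrichs scheme for the 1-D Euler equations, not Bifrost's actual numerics), one might write:

```python
import numpy as np

def sod_shock_tube(nx=400, t_end=0.2, gamma=1.4, cfl=0.5):
    """First-order Lax-Friedrichs solver for the 1-D Euler equations,
    run on the Sod initial data (rho, u, p) = (1, 0, 1) | (0.125, 0, 0.1)."""
    dx = 1.0 / nx
    x = (np.arange(nx) + 0.5) * dx
    rho = np.where(x < 0.5, 1.0, 0.125)
    u = np.zeros(nx)
    p = np.where(x < 0.5, 1.0, 0.1)
    # conserved variables: density, momentum, total energy
    U = np.array([rho, rho * u, p / (gamma - 1) + 0.5 * rho * u**2])

    def flux(U):
        rho, mom, E = U
        u = mom / rho
        p = (gamma - 1) * (E - 0.5 * rho * u**2)
        return np.array([mom, mom * u + p, (E + p) * u])

    t = 0.0
    while t < t_end - 1e-12:
        rho, mom, E = U
        u = mom / rho
        p = (gamma - 1) * (E - 0.5 * rho * u**2)
        c = np.sqrt(gamma * p / rho)
        dt = min(cfl * dx / np.max(np.abs(u) + c), t_end - t)
        F = flux(U)
        # Lax-Friedrichs flux at interior interfaces; end cells stay fixed,
        # which is fine while the waves have not reached the boundaries
        Fh = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * (dx / dt) * (U[:, 1:] - U[:, :-1])
        U[:, 1:-1] -= (dt / dx) * (Fh[:, 1:] - Fh[:, :-1])
        t += dt
    return x, U[0]

x, rho = sod_shock_tube()
```

A real validation run compares the resulting profiles against the exact Riemann solution; the sketch above supports only coarse sanity checks such as bounds and mass conservation.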

  5. Validation of Ray Tracing Code Refraction Effects

    Science.gov (United States)

    Heath, Stephanie L.; McAninch, Gerry L.; Smith, Charles D.; Conner, David A.

    2008-01-01

    NASA's current predictive capabilities using the ray tracing program (RTP) are validated using helicopter noise data taken at Eglin Air Force Base in 2007. By including refractive propagation effects due to wind and temperature, the ray tracing code is able to explain large variations in the data observed during the flight test.

  6. FACTAR 2.0 code validation

    Energy Technology Data Exchange (ETDEWEB)

    Middleton, P.B.; Rock, R.C.K.; Wadsworth, S.L. [Ontario Hydro, Reactor Safety and Operational Analysis Dept., Toronto, Ontario (Canada)

    1997-07-01

    The FACTAR code models the thermal and mechanical behaviour of a CANDU fuel channel under degraded cooling conditions. FACTAR is currently undergoing a process of validation against various data sets in order to qualify its use in nuclear safety analysis. This paper outlines the methodology being followed in this effort. The BTF-104 and BTF-105A tests, conducted at Chalk River Laboratories, have been chosen as the first in-reactor tests to be used for FACTAR validation. The BTF experiments were designed to represent CANDU fuel behaviour under typical large-LOCA conditions. The two tests are summarized briefly, and the results of code comparisons to experimental data are outlined. The comparisons demonstrate that FACTAR is able to accurately predict the values of selected key parameters. As anticipated in the validation plan, further work is required to fully quantify simulation biases for all parameters of interest. (author)

  7. Building Technologies Program Multi-Year Program Plan Technology Validation and Market Introduction 2008

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2008-01-01

    Building Technologies Program Multi-Year Program Plan 2008 for technology validation and market introduction, including ENERGY STAR, building energy codes, technology transfer application centers, commercial lighting initiative, EnergySmart Schools, EnergySmar

  8. Construction of TH code development and validation environment

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyungjun; Kim, Hee-Kyung; Bae, Kyoo-Hwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, each component of the code development and validation system, i.e., IVS and Mercurial, is introduced, and Redmine, the integrated platform for IVS and Mercurial, is explained. An integrated TH code validation system (IVS) and a code development and management environment have been constructed. Code validation is achieved by comparing results with corresponding experiments. The development of a thermal-hydraulic (TH) system code for a nuclear reactor requires much time and effort, as does its validation and verification (V and V). Previously, the TASS/SMR-S code (hereafter TASS) for SMART was developed by KAERI through a V and V process. During code development, version control of the source code is of great importance. Also, during the V and V process, a way is needed to reduce the repeated, labor- and time-consuming work of running the code before releasing a new version of the TH code. Therefore, an integrated platform for the TH code development and validation environment was constructed. Redmine, a project management and issue tracking system, was selected as the platform; Mercurial (hg) is used for source version control, and IVS (Integrated Validation System) for TASS was constructed as a prototype for automated V and V. IVS is useful before releasing a new code version: the code developer can validate code results easily using IVS, and even during code development, IVS can be used to validate code modifications. Using Redmine and Mercurial, users and developers can use IVS results more effectively.
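
The core of an automated validation system of this kind is a tolerance check of new code output against stored reference or experimental data. A minimal sketch (the function, data, and tolerance below are illustrative, not part of IVS):

```python
import numpy as np

def validate(result, reference, rel_tol=0.05):
    """Compare a code result against reference data within a relative
    tolerance; returns (pass/fail, worst relative error)."""
    result, reference = np.asarray(result, float), np.asarray(reference, float)
    err = np.abs(result - reference) / np.maximum(np.abs(reference), 1e-30)
    return bool(np.all(err <= rel_tol)), float(err.max())

# example: a new code revision reproduces the reference within 2%
ok, worst = validate([1.02, 0.99, 1.01], [1.0, 1.0, 1.0])
```

A repository hook could run such checks on every committed revision and attach the worst-case error to the corresponding issue, which is the kind of workflow the Redmine/Mercurial/IVS integration automates.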

  9. Verification and Validation of Kinetic Codes

    Science.gov (United States)

    Christlieb, Andrew

    2014-10-01

    We review the last three workshops held on Validation and Verification of Kinetic Codes. The goal of the workshops was to highlight the need to develop benchmark test problems beyond traditional ones such as Landau damping and the two-stream instability. Those traditional problems provide only a limited understanding of how a code might perform and mask key issues in more complicated situations. Developing new test problems highlights the strengths and weaknesses of both mesh- and particle-based codes. One outcome is that designing test problems that clearly deliver a path forward for developing improved methods is complicated by the need to create a completely self-consistent model. For example, two cases proposed by the authors as simple test problems turn out to be ill-defined. The first is the modeling of sheath formation in a 1D-1V collisionless plasma. We found that losses to the wall lead to discontinuous distribution functions, a challenge for high-order mesh-based solvers. The semi-infinite case was problematic because the far-field boundary condition poses difficulty in computing on a finite domain. Our second case was the flow of a collisionless electron beam in a pipe. Here, numerical diffusion is a key property being tested; however, the two-stream instability at the beam edges introduces other issues in terms of finding convergent solutions, and before particle trapping takes place, mesh-based methods find themselves outside of the asymptotic regime. Another conclusion we draw from this exercise is that including collisional models in benchmark test problems for mesh-based plasma simulation tools is an important step in providing robust test problems for mesh-based kinetic solvers. In collaboration with Yaman Guclu, David Seal, and John Verboncoeur, Michigan State University.

  10. On the role of code comparisons in verification and validation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2003-08-01

    This report presents a perspective on the role of code comparison activities in verification and validation. We formally define the act of code comparison as the Code Comparison Principle (CCP) and investigate its application in both verification and validation. One of our primary conclusions is that the use of code comparisons for validation is improper and dangerous. We also conclude that while code comparisons may be argued to provide a beneficial component in code verification activities, there are higher quality code verification tasks that should take precedence. Finally, we provide a process for application of the CCP that we believe is minimal for achieving benefit in verification processes.

  11. Methodology for computational fluid dynamics code verification/validation

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.; Blottner, F.G.; Aeschliman, D.P.

    1995-07-01

    The issues of verification, calibration, and validation of computational fluid dynamics (CFD) codes have been receiving increasing levels of attention in the research literature and in engineering technology. Both CFD researchers and users of CFD codes are asking more critical and detailed questions concerning the accuracy, range of applicability, reliability and robustness of CFD codes and their predictions. This is a welcome trend because it demonstrates that CFD is maturing from a research tool into one that impacts engineering hardware and system design. In this environment, the broad issue of code quality assurance becomes paramount. However, the philosophy and methodology of building confidence in CFD code predictions have proven to be more difficult than many expected. A wide variety of physical modeling errors and discretization errors are discussed. Here, discretization errors refer to all errors caused by conversion of the original partial differential equations to algebraic equations, and their solution. Boundary conditions for both the partial differential equations and the discretized equations are discussed. Contrasts are drawn between the assumptions and actual use of numerical method consistency and stability. Comments are also made concerning the existence and uniqueness of solutions for both the partial differential equations and the discrete equations. Various techniques are suggested for the detection and estimation of errors caused by physical modeling and discretization of the partial differential equations.
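
One standard technique for estimating discretization error of the kind discussed above is Richardson extrapolation on a sequence of refined grids: three grid levels yield an observed order of accuracy and an extrapolated error estimate. A sketch with a fabricated model problem (the exact value 2.0 and the error terms are invented for illustration, not taken from the paper):

```python
import numpy as np

def solve(h):
    # surrogate "discrete solution": exact value 2.0 plus a leading O(h^2)
    # error term, standing in for a real CFD computation on grid spacing h
    return 2.0 + 3.0 * h**2 + 0.5 * h**3

h = np.array([0.1, 0.05, 0.025])        # grid sequence, refinement ratio r = 2
f1, f2, f3 = solve(h[0]), solve(h[1]), solve(h[2])
r = 2.0

p_obs = np.log((f1 - f2) / (f2 - f3)) / np.log(r)   # observed order of accuracy
f_exact_est = f3 + (f3 - f2) / (r**p_obs - 1)       # Richardson extrapolation
```

For this second-order model error the observed order comes out near 2 and the extrapolated value recovers the exact solution to about 1e-5, which is how grid-convergence studies separate discretization error from modeling error.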

  12. Validation of the G-PASS code : status report.

    Energy Technology Data Exchange (ETDEWEB)

    Vilim, R. B.; Nuclear Engineering Division

    2009-03-12

    Validation is the process of determining whether the models in a computer code can describe the important phenomena in applications of interest. This report describes past work and proposed future work for validating the Gas Plant Analyzer and System Simulator (G-PASS) code. The G-PASS code was developed for simulating gas reactor and chemical plant system behavior during operational transients and upset events. Results are presented comparing code properties, individual component models, and integrated system behavior against results from four other computer codes. Also identified are two experiment facilities nearing completion that will provide additional data for individual component and integrated system model validation. The main goal of the validation exercise is to ready a version of G-PASS for use as a tool in evaluating vendor designs and providing guidance to vendors on design directions in nuclear-hydrogen applications.

  13. Experimental methodology for computational fluid dynamics code validation

    Energy Technology Data Exchange (ETDEWEB)

    Aeschliman, D.P.; Oberkampf, W.L.

    1997-09-01

    Validation of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. Typically, CFD code validation is accomplished through comparison of computed results to previously published experimental data that were obtained for some other purpose, unrelated to code validation. As a result, it is a near certainty that not all of the information required by the code, particularly the boundary conditions, will be available. The common approach is therefore unsatisfactory, and a different method is required. This paper describes a methodology developed specifically for experimental validation of CFD codes. The methodology requires teamwork and cooperation between code developers and experimentalists throughout the validation process, and takes advantage of certain synergisms between CFD and experiment. The methodology employs a novel uncertainty analysis technique which helps to define the experimental plan for code validation wind tunnel experiments, and to distinguish between and quantify various types of experimental error. The methodology is demonstrated with an example of surface pressure measurements over a model of varying geometrical complexity in laminar, hypersonic, near perfect gas, 3-dimensional flow.

  14. Verification and validation of XSDRNPM code for tank waste calculations

    Energy Technology Data Exchange (ETDEWEB)

    ROGERS, C.A.

    1999-07-14

    This validation study demonstrates that the XSDRNPM computer code accurately calculates the infinite neutron multiplication for water-moderated systems of low enriched uranium, plutonium, and iron. Calculations are made on a 200 MHz Bravo MS 5200M personal

  15. Validation uncertainty of MATRA code for subchannel void distributions

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae-Hyun; Kim, S. J.; Kwon, H.; Seo, K. W. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    To extend code capability to whole-core subchannel analysis, preconditioned Krylov matrix solvers such as BiCGSTAB and GMRES are implemented in the MATRA code, as well as parallel computing algorithms using MPI and OpenMP. The code is written in Fortran 90 and has some user-friendly features such as a graphical user interface. MATRA was approved by the Korean regulatory body for design calculations of the integral-type PWR named SMART. The major role of a subchannel code is to evaluate core thermal margin through hot-channel analysis and uncertainty evaluation for CHF predictions. In addition, it is potentially used for best estimation of the core thermal-hydraulic field by incorporation into multi-physics and/or multi-scale code systems. In this study we examined a validation process for the subchannel code MATRA, specifically in the prediction of subchannel void distributions. The primary objective of validation is to estimate a range within which the simulation modeling error lies. The experimental data for subchannel void distributions at steady-state and transient conditions were provided in the framework of the OECD/NEA UAM benchmark program. The validation uncertainty of the MATRA code was evaluated for a specific experimental condition by comparing the simulation result and experimental data. A validation process should be preceded by code and solution verification; however, quantification of verification uncertainty was not addressed in this study. The validation uncertainty of the MATRA code for predicting subchannel void distribution was evaluated for a single data point of void fraction measurement at a 5x5 PWR test bundle in the framework of the OECD UAM benchmark program. The validation standard uncertainties were evaluated as 4.2%, 3.9%, and 2.8% with the Monte-Carlo approach at the axial levels of 2216 mm, 2669 mm, and 3177 mm, respectively. The sensitivity coefficient approach revealed similar uncertainties but did not account for the nonlinear effects on the
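
The Monte-Carlo approach to a validation standard uncertainty can be illustrated by propagating assumed input uncertainties through a surrogate model and combining the result with numerical and measurement terms in root-sum-square form, as in ASME V&V 20-style metrics. Every number and the model below are invented for illustration; none of it is MATRA's:

```python
import numpy as np

rng = np.random.default_rng(42)

def void_model(power, flow, pressure):
    # hypothetical surrogate for a subchannel-code void-fraction prediction
    return 0.4 * power / flow - 0.01 * (pressure - 15.0)

N = 10000
power    = rng.normal(1.00, 0.02, N)   # assumed 2% relative input uncertainty
flow     = rng.normal(1.00, 0.03, N)   # assumed 3%
pressure = rng.normal(15.0, 0.10, N)   # assumed 0.1 MPa

alpha = void_model(power, flow, pressure)
u_input = alpha.std(ddof=1)            # input-propagation uncertainty
u_num, u_exp = 0.005, 0.02             # assumed numerical / measurement terms
u_val = np.sqrt(u_num**2 + u_input**2 + u_exp**2)   # combined validation uncertainty
```

The comparison error between the mean prediction and the measured void fraction is then judged against `u_val`; a sensitivity-coefficient evaluation replaces the sampling step with linearized derivatives, which is why it misses nonlinear effects.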

  16. Criticality benchmarks validation of the Monte Carlo code TRIPOLI-2

    Energy Technology Data Exchange (ETDEWEB)

    Maubert, L. (Commissariat a l' Energie Atomique, Inst. de Protection et de Surete Nucleaire, Service d' Etudes de Criticite, 92 - Fontenay-aux-Roses (France)); Nouri, A. (Commissariat a l' Energie Atomique, Inst. de Protection et de Surete Nucleaire, Service d' Etudes de Criticite, 92 - Fontenay-aux-Roses (France)); Vergnaud, T. (Commissariat a l' Energie Atomique, Direction des Reacteurs Nucleaires, Service d' Etudes des Reacteurs et de Mathematique Appliquees, 91 - Gif-sur-Yvette (France))

    1993-04-01

    The validation benchmarks for the three-dimensional, energy-pointwise Monte Carlo code TRIPOLI-2 include metallic spheres of uranium and plutonium, plutonium nitrate solutions, and square- and triangular-pitch assemblies of uranium oxide. Results show good agreement between experiments and calculations and provide a partial validation of the code and its ENDF/B-IV library. (orig./DG)

  17. Validation of wind loading codes by experiments

    NARCIS (Netherlands)

    Geurts, C.P.W.

    1998-01-01

    Between 1994 and 1997, full scale measurements of the wind and wind induced pressures were carried out on the main building of Eindhoven University of Technology. Simultaneously, a comparative wind tunnel experiment was performed in an atmospheric boundary layer wind tunnel. In this paper, the

  19. Results from the First Validation Phase of CAP code

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul [FNC Tech., SNU, Seoul (Korea, Republic of); Ha, Sang Jun; Choi, Hoon [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    The second stage of the Safety Analysis Code Development for Nuclear Power Plants project was launched in April 2010 and is scheduled to run through 2012; its scope of work covers code validation through licensing preparation. As a part of this project, CAP (Containment Analysis Package) will follow the same procedures. CAP's validation works are organized hierarchically into four validation steps using: 1) fundamental phenomena; 2) principal phenomena (mixing and transport) and components in containment; 3) demonstration tests in small, medium, and large facilities and International Standard Problems; 4) comparison with other containment codes such as GOTHIC or CONTEMPT. In addition, collecting experimental data related to containment phenomena and constructing a database is one of the major works of the second stage of this project. From the validation of fundamental phenomena, the current capability and the needed future improvements of the CAP code can be revealed. For this purpose, simple but significant problems with exact analytical solutions were selected and calculated for validation of fundamental phenomena. In this paper, some results of the validation problems for the selected fundamental phenomena are summarized and discussed briefly.

  20. The Mistra experiment for field containment code validation first results

    Energy Technology Data Exchange (ETDEWEB)

    Caron-Charles, M.; Blumenfeld, L. [CEA Saclay, 91 - Gif sur Yvette (France)

    2001-07-01

    The MISTRA facility is a large-scale experiment designed for the validation of multi-dimensional thermal-hydraulics codes. A short description of the facility, the set-up of the instrumentation and the test program are presented. Then the first experimental results, studying helium injection into the containment, and their calculations are detailed. (author)

  1. Validation of numerical codes for the analysis of plasma discharges

    Energy Technology Data Exchange (ETDEWEB)

    Albanese, R. (Univ. di Salerno, Dipt. di Ingegneria Elettronica, Fisciano (Italy)); Bottura, L. (NET Team, Garching (Germany)); Chiocchio, S. (NET Team, Garching (Germany)); Coccorese, E. (Univ. di Reggio Calabria, Ist. di Ingegneria Elettronica (Italy)); Gernhardt, J. (Max Planck IPP, Garching (Germany)); Gruber, O. (Max Planck IPP, Garching (Germany)); Fresa, R. (Univ. di Salerno, Dipt. di Ingegneria Elettronica, Fisciano (Italy)); Martone, R. (Univ. di Salerno, Dipt. di Ingegneria Elettronica, Fisciano (Italy)); Portone, A. (NET Team, Garching (Germany)); Seidel, U. (Max Planck IPP, Garching (Germany))

    1994-01-01

    Electromagnetic aspects in the design of ITER-like reactors call for an extensive use of complex and advanced numerical codes. For this reason, strong attention has been paid within the NET Team to code development. In particular, through a cooperation with several Italian universities, a number of numerical procedures have been developed and integrated during recent years. In order to assess the reliability of the codes and to gain confidence in their predictions for next-generation ITER-like reactors, validation of the codes against experiments has to be considered a strict requirement. The aim of this paper is to give a comprehensive presentation of this problem in the light of the results of a campaign of validation runs. The main outcome of this work is that the computational procedures, which were developed for the NET project and then extensively used for ITER studies, can be considered experimentally validated in a sufficiently wide range of cases of interest. In particular, computed values are compared with experimental measurements made during some typical ASDEX-Upgrade discharges. From the electromagnetic point of view, many features of this machine are common to the ITER concept, so that the results of the validation can reasonably be extended to the ITER case. (orig.)

  2. Verification and Validation of The Tritium Transport Code TMAP7

    Energy Technology Data Exchange (ETDEWEB)

    Glen R. Longhurst; James Ambrosek

    2004-09-01

    The TMAP Code was written at the Idaho National Engineering and Environmental Laboratory in the late 1980s as a tool for safety analysis of systems involving tritium. Since then it has been upgraded several times and has been used in numerous applications, including experiments supporting fusion safety, predictions for advanced systems such as the International Thermonuclear Experimental Reactor (ITER), and estimates involving tritium production technologies. Its most recent upgrade, to TMAP7, was accomplished in response to several needs. Prior versions had the capacity to deal with only a single trap for diffusing gaseous species in solid structures; TMAP7 includes up to three separate traps and up to 10 diffusing species. The original code had difficulty dealing with heteronuclear molecule formation such as HD and DT; that limitation has been removed. Under pre-specified boundary enclosure conditions and solution-law-dependent diffusion boundary conditions, such as Sieverts' law, TMAP7 automatically generates heteronuclear molecular partial pressures when solubilities and partial pressures of the homonuclear molecular species are provided for law-dependent diffusion boundary conditions. A further sophistication is the addition of non-diffusing surface species. Atoms such as oxygen or nitrogen, or hydroxyl radicals formed on metal surfaces, are sometimes important in molecule formation with diffusing hydrogen isotopes but do not, themselves, diffuse appreciably in the material. TMAP7 will accommodate up to 30 such surface species, allowing the user to specify relationships between those surface concentrations and the partial pressures of gaseous species above the surfaces, or to form them dynamically by combining diffusing species or other surface species. Additionally, TMAP7 allows the user to include a surface binding energy and an adsorption barrier energy, and it includes asymmetrical diffusion between the surface sites and regular diffusion sites in the bulk. All of the
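
The solution-law-dependent boundary condition mentioned above can be made concrete: under Sieverts' law the dissolved atomic concentration scales with the square root of the molecular partial pressure, and in the classical (high-temperature) limit the equilibrium H2 + D2 <-> 2 HD has equilibrium constant 4, so p_HD = 2*sqrt(p_H2*p_D2). The solubility value below is invented for illustration and is not a TMAP7 input:

```python
import math

Ks = 1.0e-3                  # assumed Sieverts' solubility, mol m^-3 Pa^-0.5
p_H2, p_D2 = 100.0, 25.0     # homonuclear partial pressures, Pa

c_H = Ks * math.sqrt(p_H2)   # dissolved H concentration from Sieverts' law
c_D = Ks * math.sqrt(p_D2)   # dissolved D concentration
# heteronuclear pressure from H2 + D2 <-> 2 HD with classical-limit K = 4:
p_HD = 2.0 * math.sqrt(p_H2 * p_D2)
```

This is the relation that lets a code generate HD and DT partial pressures automatically once the homonuclear pressures and solubilities are specified.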

  3. VALIDATION OF THE JRC TSUNAMI PROPAGATION AND INUNDATION CODES

    Directory of Open Access Journals (Sweden)

    N. Zamora

    2014-07-01

    Full Text Available In recent years several numerical codes have been developed to analyse tsunami waves. Most of these codes use a finite-difference numerical approach, giving good results for tsunami wave propagation but with limitations in modelling inundation processes. The HyFlux2 model has been developed to simulate inundation scenarios due to dam breaks, flash floods and tsunami-wave run-up. The model solves the conservative form of the two-dimensional shallow water equations using a finite volume method. The implementation of a shoreline-tracking method provides reliable results. HyFlux2's robustness has been tested using several tsunami events. The main aim of this study is code validation by comparing different code results with available measurements. Another objective of the study is to evaluate how different fault models could generate different results that should be considered for coastal planning. Several simulations have been performed to compare the HyFlux2 code with the SWAN-JRC code and TUNAMI-N2. HyFlux2 has been validated taking advantage of the extensive seismic and geodetic measurements and post-tsunami field surveys performed after the Nias March 28th tsunami. Although more detailed shallow bathymetry is needed to assess the inundation, diverse results in the wave heights were revealed when comparing the different fault mechanisms. Many challenges still exist for tsunami researchers, especially where early warning systems are concerned, as shown by this Nias March 28th tsunami.
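
A finite-volume treatment of the shallow water equations can be sketched in one dimension with a wet-bed dam-break problem, the canonical test for this equation set. The scheme below is a first-order Lax-Friedrichs method chosen for brevity; HyFlux2 itself uses a more sophisticated solver with shoreline tracking:

```python
import numpy as np

def dam_break(nx=400, t_end=0.1, g=9.81, cfl=0.45):
    """First-order Lax-Friedrichs finite-volume solver for the 1-D shallow
    water equations on a wet-bed dam-break initial condition."""
    dx = 1.0 / nx
    x = (np.arange(nx) + 0.5) * dx
    h = np.where(x < 0.5, 1.0, 0.5)     # depth jump at the dam
    U = np.array([h, np.zeros(nx)])     # conserved variables: depth, discharge

    def flux(U):
        h, hu = U
        return np.array([hu, hu**2 / h + 0.5 * g * h**2])

    t = 0.0
    while t < t_end - 1e-12:
        h, hu = U
        c = np.abs(hu / h) + np.sqrt(g * h)          # wave-speed bound
        dt = min(cfl * dx / np.max(c), t_end - t)
        F = flux(U)
        # end cells stay fixed; valid while waves have not reached the ends
        Fh = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * (dx / dt) * (U[:, 1:] - U[:, :-1])
        U[:, 1:-1] -= (dt / dx) * (Fh[:, 1:] - Fh[:, :-1])
        t += dt
    return x, U[0]

x, h = dam_break()
```

Validation against a real event then replaces the idealized initial condition with a fault-model-derived sea-surface displacement and compares the computed run-up with field measurements.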

  4. A Comprehensive Validation Approach Using The RAVEN Code

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J; Rinaldi, Ivan; Giannetti, Fabio; Caruso, Gianfranco

    2015-06-01

    The RAVEN computer code, developed at the Idaho National Laboratory, is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. RAVEN is a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating with any system code. A natural extension of the RAVEN capabilities is the implementation of an integrated validation methodology, involving several different metrics, that represents an evolution of the methods currently used in the field. The state-of-the-art validation approaches use neither exploration of the input space through sampling strategies nor a comprehensive variety of metrics needed to interpret the code responses with respect to experimental data. The RAVEN code allows both of these gaps to be addressed. In the following sections, the employed methodology and its application to the newly developed thermal-hydraulic code RELAP-7 are reported. The validation approach has been applied to an integral effect experiment representing natural circulation, based on the activities performed by EG&G Idaho. Four different experiment configurations have been considered and nodalized.

  5. The Fast Scattering Code (FSC): Validation Studies and Program Guidelines

    Science.gov (United States)

    Tinetti, Ana F.; Dunn, Mark H.

    2011-01-01

    The Fast Scattering Code (FSC) is a frequency domain noise prediction program developed at the NASA Langley Research Center (LaRC) to simulate the acoustic field produced by the interaction of known, time harmonic incident sound with bodies of arbitrary shape and surface impedance immersed in a potential flow. The code uses the equivalent source method (ESM) to solve an exterior 3-D Helmholtz boundary value problem (BVP) by expanding the scattered acoustic pressure field into a series of point sources distributed on a fictitious surface placed inside the actual scatterer. This work provides additional code validation studies and illustrates the range of code parameters that produce accurate results with minimal computational costs. Systematic noise prediction studies are presented in which monopole generated incident sound is scattered by simple geometric shapes - spheres (acoustically hard and soft surfaces), oblate spheroids, flat disk, and flat plates with various edge topologies. Comparisons between FSC simulations and analytical results and experimental data are presented.
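
The equivalent source method the abstract describes can be illustrated in two dimensions: monopole sources on a fictitious interior circle are fitted by least squares so that the total pressure vanishes on an acoustically soft circular scatterer. This is a simplified 2-D analogue for illustration only; the FSC solves the 3-D problem with flow and surface-impedance effects:

```python
import numpy as np
from scipy.special import hankel1

k, a = 2.0, 1.0                    # wavenumber and scatterer radius
M, N = 64, 32                      # boundary collocation points, interior sources
th_b = 2 * np.pi * np.arange(M) / M
th_s = 2 * np.pi * np.arange(N) / N
xb = a * np.c_[np.cos(th_b), np.sin(th_b)]         # collocation points on scatterer
xs = 0.5 * a * np.c_[np.cos(th_s), np.sin(th_s)]   # sources on fictitious circle

r = np.linalg.norm(xb[:, None, :] - xs[None, :, :], axis=-1)
G = 0.25j * hankel1(0, k * r)      # 2-D free-space Green's function
p_inc = np.exp(1j * k * xb[:, 0])  # incident plane wave travelling along +x

# soft boundary condition: scattered + incident pressure = 0 on the surface
q, *_ = np.linalg.lstsq(G, -p_inc, rcond=None)
residual = np.linalg.norm(G @ q + p_inc) / np.linalg.norm(p_inc)
```

Once the source strengths `q` are known, the scattered field anywhere outside the body is a cheap sum over the point sources, which is what makes the method fast.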

  6. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data such as control rod withdrawal tests and loss-of-forced-convection tests.

  7. Validation of system codes for plant application on selected experiments

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Marco K.; Risken, Tobias; Agethen, Kathrin; Bratfisch, Christoph [Bochum Univ. (Germany). Reactor Simulation and Safety Group

    2016-05-15

For decades, the Reactor Simulation and Safety Group at Ruhr-Universitaet Bochum (RUB) has contributed to nuclear safety through computer code validation and model development for nuclear safety analysis. Severe accident analysis codes are relevant tools for understanding accidents and for developing accident management measures. The accidents at the Three Mile Island (USA, 1979) and Fukushima Daiichi (Japan, 2011) plants influenced these research activities significantly due to the observed phenomena, such as molten core concrete interaction and hydrogen combustion. This paper gives a brief outline of recent research activities at RUB in the named fields, contributing to code preparation for plant applications. Simulations of the molten core concrete interaction tests CCI-2 and CCI-3 with ASTEC and of the hydrogen combustion test Ix9 with COCOSYS are presented exemplarily. Additionally, the application to plants is demonstrated with chosen results of preliminary Fukushima calculations.

  8. A Radiation Shielding Code for Spacecraft and Its Validation

    Science.gov (United States)

    Shinn, J. L.; Cucinotta, F. A.; Singleterry, R. C.; Wilson, J. W.; Badavi, F. F.; Badhwar, G. D.; Miller, J.; Zeitlin, C.; Heilbronn, L.; Tripathi, R. K.

    2000-01-01

The HZETRN code, which uses a deterministic approach pioneered at NASA Langley Research Center, has been developed over the past decade to evaluate the local radiation fields within sensitive materials (electronic devices and human tissue) on spacecraft in the space environment. The code describes the interactions of shield materials with the incident galactic cosmic rays, trapped protons, or energetic protons from solar particle events in free space and low Earth orbit. The incident radiation is modified by atomic and nuclear reactions with the spacecraft and radiation shield materials. High-energy heavy ions are fragmented into less massive reaction products, and reaction products are produced by direct knockout of shield constituents or from de-excitation products. An overview of the computational procedures and database which describe these interactions is given. Validation of the code against recent Monte Carlo benchmarks and against laboratory and flight measurements is also included.

  9. Validation and Application of the Thermal Hydraulic System Code TRACE for Analysis of BWR Transients

    Directory of Open Access Journals (Sweden)

    V. H. Sánchez

    2012-01-01

Full Text Available The Karlsruhe Institute of Technology (KIT) participates in the Code Applications and Maintenance Program (CAMP) of the US Nuclear Regulatory Commission (NRC) to validate the TRACE code for LWR transient analysis. The application of TRACE to the safety assessment of BWRs requires thorough verification and validation using experimental data from separate-effect and integral tests as well as plant data. The validation process is normally focused on safety-relevant phenomena, for example, pressure drop, void fraction, heat transfer, and critical power models. The purpose of this paper is to validate selected BWR-relevant TRACE models using both data from bundle tests such as the Boiling Water Reactor Full-Size Fine-Mesh Bundle Test (BFBT) and plant data recorded during a turbine trip event (TUSA) in a Type-72 German BWR plant. For the validation, TRACE models of the BFBT bundle and of the BWR plant were developed. The performed investigations have shown that the TRACE code is appropriate for describing the main BWR-safety-relevant phenomena (pressure drop, void fraction, and critical power) with acceptable accuracy. The comparison of the predicted global BWR plant parameters for the TUSA event with the measured plant data indicates that the code predictions follow the main trends of the measured parameters, such as dome pressure and reactor power.

  10. Implementation of QR Code and Digital Signature to Determine the Validity of KRS and KHS Documents

    Directory of Open Access Journals (Sweden)

    Fatich Fazlur Rochman

    2017-05-01

Full Text Available Universitas Airlangga students often find it difficult to verify the marks reported in the Kartu Hasil Studi (KHS, Study Result Card) or the courses registered in the Kartu Rencana Studi (KRS, Study Plan Card) if the data in the university's system have changed. This complicated KRS and KHS verification process arises because the printed KRS and KHS documents held by students are easier to counterfeit than the data in the system. This work implements digital signature and QR Code technology as a solution that can prove the validity of a KRS or KHS document. The QR Code is a type of matrix code developed to allow its contents to be decoded at high speed, while the digital signature marks the data to guarantee that they are the original data. The verification process is divided into two parts: reading the digital signature and checking the printed document by scanning the data from the QR Code. Applying the system required adding a QR Code to the KRS and KHS documents as well as a corresponding readiness of human resources.
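The sign-and-embed workflow described in the abstract can be sketched as follows, with HMAC used as a simplified stand-in for a true public-key digital signature (a real deployment would use RSA or ECDSA so that verifiers need only a public key). All field names and the secret key are hypothetical.

```python
import hashlib
import hmac
import json

SECRET_KEY = b"registrar-secret"  # stand-in for the issuer's signing key

def sign_record(record: dict) -> str:
    """Canonicalize the KRS/KHS record and compute a signature tag.
    HMAC-SHA256 is a simplified stand-in for an asymmetric signature."""
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

def make_qr_payload(record: dict) -> str:
    """String to encode in the printed QR code: record plus signature."""
    return json.dumps({"record": record, "sig": sign_record(record)},
                      sort_keys=True)

def verify_qr_payload(payload: str) -> bool:
    """Decode a scanned QR payload and check that the embedded signature
    matches the record, i.e. the printed document was not altered."""
    data = json.loads(payload)
    expected = sign_record(data["record"])
    return hmac.compare_digest(expected, data["sig"])

# Hypothetical study-plan record.
krs = {"student": "0123456789", "term": "2017/1", "courses": ["IF101", "IF102"]}
payload = make_qr_payload(krs)
print(verify_qr_payload(payload))  # True for an unmodified record
```

Any edit to the printed record invalidates the embedded signature, which is the property the paper relies on to detect counterfeit documents.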

  11. The Initial Atmospheric Transport (IAT) Code: Description and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Morrow, Charles W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bartel, Timothy James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

The Initial Atmospheric Transport (IAT) computer code was developed at Sandia National Laboratories as part of its nuclear launch accident consequences analysis suite of computer codes. The purpose of IAT is to predict the initial puff/plume rise resulting from either a solid rocket propellant or a liquid rocket fuel fire, generating initial conditions for subsequent atmospheric transport calculations. The IAT code has been compared to two data sets appropriate to the design space of space launch accident analyses. The primary model uncertainties are the entrainment coefficients for the extended Taylor model. The Titan 34D accident (1986) was used to calibrate these entrainment settings for a prototypic liquid propellant accident, while the recent Johns Hopkins University Applied Physics Laboratory (JHU/APL, or simply APL) large propellant block tests (2012) were used to calibrate the entrainment settings for prototypic solid propellant accidents. North American Mesoscale (NAM)-formatted weather data profiles are used by IAT to determine the local buoyancy force balance. The IAT comparisons for the APL solid propellant tests illustrate the sensitivity of the plume elevation to the weather profiles; that is, the weather profile is a dominant factor in determining the plume elevation. The IAT code performed remarkably well and is considered validated for neutral weather conditions.
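The buoyancy/entrainment balance underlying such plume-rise models can be illustrated with a generic Morton-Taylor-Turner-style buoyant thermal rising through a stably stratified atmosphere, where the entrainment coefficient controls how quickly ambient air dilutes the puff. This is a textbook-style sketch, not the IAT model; all values are illustrative.

```python
def plume_rise(alpha, n2=1.0e-4, b0=10.0, w0=5.0, gp0=1.0, dt=0.05):
    """Integrate a buoyant thermal (puff) in a stably stratified
    atmosphere (squared Brunt-Vaisala frequency n2).  alpha is the
    entrainment coefficient; returns the rise height at which the
    vertical velocity first vanishes (capped at one hour)."""
    b, w, gp, z, t = b0, w0, gp0, 0.0, 0.0
    while w > 0.0 and t < 3600.0:
        vol = b ** 3                          # volume scale of the thermal
        mom = vol * w + vol * gp * dt         # momentum gained from buoyancy
        buoy = vol * gp - vol * w * n2 * dt   # buoyancy eroded by stratification
        b += alpha * w * dt                   # entrainment grows the radius
        vol = b ** 3
        w, gp = mom / vol, buoy / vol         # dilution by entrained air
        z += w * dt
        t += dt
    return z

# Stronger entrainment dilutes the puff faster and lowers the final rise,
# which is why the entrainment coefficient dominates model uncertainty.
z_weak, z_strong = plume_rise(0.10), plume_rise(0.25)
print(z_weak > z_strong)
```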

  12. Criticality Safety Code Validation with LWBR’s SB Cores

    Energy Technology Data Exchange (ETDEWEB)

    Putman, Valerie Lee

    2003-01-01

The first set of critical experiments from the Shippingport Light Water Breeder Reactor Program included eight, simple geometry critical cores built with 233UO2-ZrO2, 235UO2-ZrO2, ThO2, and ThO2-233UO2 nuclear materials. These cores are evaluated, described, and modeled to provide benchmarks and validation information for INEEL criticality safety calculation methodology. In addition to consistency with INEEL methodology, benchmark development and nuclear data are consistent with International Criticality Safety Benchmark Evaluation Project methodology. Section 1 of this report introduces the experiments and the reason they are useful for validating some INEEL criticality safety calculations. Section 2 provides detailed experiment descriptions based on currently available experiment reports. Section 3 identifies criticality safety validation requirement sources and summarizes requirements that most affect this report. Section 4 identifies relevant hand calculation and computer code calculation methodologies used in the experiment evaluation, benchmark development, and validation calculations. Section 5 provides a detailed experiment evaluation. This section identifies resolutions for currently unavailable and discrepant information. Section 5 also reports calculated experiment uncertainty effects. Section 6 describes the developed benchmarks. Section 6 includes calculated sensitivities to various benchmark features and parameters. Section 7 summarizes validation results. Appendices describe various assumptions and their bases, list experimenter calculations results for items that were independently calculated for this validation work, report other information gathered and developed by SCIENTEC personnel while evaluating these same experiments, and list benchmark sample input and miscellaneous supplementary data.

  13. System code improvements for modelling passive safety systems and their validation

    Energy Technology Data Exchange (ETDEWEB)

    Buchholz, Sebastian; Cron, Daniel von der; Schaffrath, Andreas [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany)

    2016-11-15

GRS has been developing the system code ATHLET for many years. Because ATHLET, among other codes, is widely used in nuclear licensing and supervisory procedures, it has to represent the current state of science and technology. New reactor concepts such as Generation III+ and IV reactors and SMRs make intensive use of passive safety systems. The simulation of passive safety systems with the GRS system code ATHLET is still a big challenge because of non-defined operation points and self-setting operation conditions. Additionally, the driving forces of passive safety systems are smaller, and uncertainties of parameters have a larger impact than for active systems. This paper addresses the code validation and qualification work on ATHLET using the example of slightly inclined horizontal heat exchangers, which are used, e.g., as emergency condensers (in the KERENA and CAREM designs) or as heat exchangers in the passive auxiliary feedwater system (PAFS) of the APR+.

  14. Validation of the Monte Carlo code MCNP-DSP

    Energy Technology Data Exchange (ETDEWEB)

    Valentine, T.E.; Mihalczo, J.T. [Oak Ridge National Lab., TN (United States)

    1996-09-12

    Several calculations were performed to validate MCNP-DSP, which is a Monte Carlo code that calculates all the time and frequency analysis parameters associated with the {sup 252}Cf-source-driven time and frequency analysis method. The frequency analysis parameters are obtained in two ways: directly by Fourier transforming the detector responses and indirectly by taking the Fourier transform of the autocorrelation and cross-correlation functions. The direct and indirect Fourier processing methods were shown to produce the same frequency spectra and convergence, thus verifying the way to obtain the frequency analysis parameters from the time sequences of detector pulses. (Author).
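The equivalence of the direct and indirect routes is an instance of the Wiener-Khinchin relation: the Fourier transform of a signal's (circular) autocorrelation equals its power spectrum. A minimal numerical demonstration, using a random sequence as a stand-in for a detector pulse train:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1024)  # stand-in for a detector pulse sequence

# Direct route: Fourier transform the signal, form the power spectrum.
X = np.fft.fft(x)
psd_direct = np.abs(X) ** 2

# Indirect route: circular autocorrelation first, then Fourier transform.
acf = np.array([np.dot(x, np.roll(x, k)) for k in range(len(x))])
psd_indirect = np.fft.fft(acf).real

print(np.allclose(psd_direct, psd_indirect))  # True: the two routes agree
```

The same identity holds for cross-correlation and cross-spectra, which is the basis for verifying that both processing paths converge to the same frequency analysis parameters.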

  15. Benchmark Solutions for Computational Aeroacoustics (CAA) Code Validation

    Science.gov (United States)

    Scott, James R.

    2004-01-01

NASA has conducted a series of Computational Aeroacoustics (CAA) Workshops on Benchmark Problems to develop a set of realistic CAA problems that can be used for code validation. In the Third (1999) and Fourth (2003) Workshops, the single-airfoil gust response problem, with real geometry effects, was included as one of the benchmark problems. Respondents were asked to calculate the airfoil RMS pressure and far-field acoustic intensity for different airfoil geometries and a wide range of gust frequencies. This paper presents the validated solutions that have been obtained for the benchmark problem and compares them with classical flat-plate results. It is seen that airfoil geometry has a strong effect on the airfoil unsteady pressure and a significant effect on the far-field acoustic intensity. Those parts of the benchmark problem that have not yet been adequately solved are identified and presented as a challenge to the CAA research community.

  16. Code-to-Code Validation and Application of a Building Dynamic Simulation Tool for the Building Energy Performance Analysis

    Directory of Open Access Journals (Sweden)

    Annamaria Buonomano

    2016-04-01

Full Text Available In this paper, details of the results of a code-to-code validation procedure for an in-house developed building simulation model, called DETECt, are reported. The tool was developed for research purposes in order to carry out dynamic building energy performance and parametric analyses that take into account new building-envelope integrated technologies, novel construction materials, and innovative energy saving strategies. The reliability and accuracy of DETECt were tested by means of the standard BESTEST validation procedure, and details of this validation process are described. A good agreement between the obtained results and all the reference data of the BESTEST qualification cases is achieved; in particular, the obtained results vs. standard BESTEST output are always within the provided ranges of confidence. In addition, output from several test cases obtained by DETECt (e.g., dynamic profiles of indoor air and building-surface temperatures and heat fluxes, and spatial trends of temperature across walls) is provided.
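In the simplest case, a dynamic building energy model of this kind reduces to a lumped resistance-capacitance network stepped in time. A one-node sketch (not the DETECt model; the envelope conductance `ua` and thermal capacitance `c` are illustrative values):

```python
def simulate_indoor_temp(t_out, q_gain, ua=120.0, c=8.0e6, t0=20.0, dt=3600.0):
    """Explicit-Euler march of a one-node lumped-capacitance building:
    C dT/dt = UA (T_out - T) + Q_gain, with ua in W/K and c in J/K."""
    t, history = t0, []
    for t_o, q in zip(t_out, q_gain):
        t += dt / c * (ua * (t_o - t) + q)
        history.append(t)
    return history

# 24 h at 0 degC outdoors with no internal gains: the indoor temperature
# relaxes exponentially toward the outdoor value (time constant C/UA).
hist = simulate_indoor_temp([0.0] * 24, [0.0] * 24)
print(hist[-1])
```

Detailed tools like DETECt replace this single node with many coupled nodes per wall layer and zone, but the validation logic is the same: check the predicted temperature and heat-flux trajectories against reference outputs.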

  17. NEAMS Experimental Support for Code Validation, INL FY2009

    Energy Technology Data Exchange (ETDEWEB)

    G. Youinou; G. Palmiotti; M. Salvatore; C. Rabiti

    2009-09-01

The goal is for all modeling and simulation tools to be demonstrated accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are to be used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Whereas the Verification part of the process does not rely on experiment, the Validation part, on the contrary, necessitates as many relevant and precise experimental data as possible to make sure the models reproduce reality as closely as possible. Hence, this report presents a limited selection of experimental data that could be used to validate the codes devoted mainly to Fast Neutron Reactor calculations in the US. Emphasis has been put on existing data for thermal-hydraulics, fuel, and reactor physics. The principles of a new “smart” experiment that could be used to improve our knowledge of neutron cross-sections are presented as well. In short, it consists in irradiating a few milligrams of actinides and analyzing the results with Accelerator Mass Spectrometry to infer the neutron cross-sections. Finally, the wealth of experimental data relevant to Fast Neutron Reactors in the US should not be taken for granted; efforts should be put into saving these 30- to 40-year-old data and into making sure they are validation-worthy, i.e., that the experimental conditions and uncertainties are well documented.

  18. Guide to Using the WIND Toolkit Validation Code

    Energy Technology Data Exchange (ETDEWEB)

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density at more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at a 2-km spatial and 5-minute temporal resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, so that corrective steps can be taken. Therefore, we provide validation code written in R that will be made public to give users tools to validate data at their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well the observational data compare to the model data. Ideally, validation will confirm beneficial locations for utilizing wind energy and encourage regional wind integration studies using the WIND Toolkit.
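The listed error metrics can be sketched as follows (in Python rather than the toolkit's R; the percent-error definition here, bias relative to the observed mean, is one common convention and an assumption, not taken from the toolkit):

```python
import numpy as np

def validation_metrics(model, obs):
    """Wind-speed validation metrics: bias, root-mean-square error,
    centered (bias-removed) RMSE, mean absolute error, and percent
    error (assumed here to be bias relative to the observed mean)."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    err = model - obs
    bias = err.mean()
    rmse = np.sqrt((err ** 2).mean())
    crmse = np.sqrt(((err - bias) ** 2).mean())
    mae = np.abs(err).mean()
    pct = 100.0 * bias / obs.mean()
    return {"bias": bias, "rmse": rmse, "crmse": crmse,
            "mae": mae, "percent_error": pct}

# Modeled vs. observed wind speeds (m/s) at one illustrative site.
m = validation_metrics([6.1, 7.4, 5.2, 8.0], [6.0, 7.0, 5.5, 7.5])
print({k: round(v, 3) for k, v in m.items()})
```

A useful consistency check is the identity rmse^2 = bias^2 + crmse^2, which is why the centered RMSE isolates random error from systematic offset.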

  19. Flight testing vehicles for verification and validation of hypersonics technology

    Science.gov (United States)

    Sacher, Peter W.

    1995-03-01

    Hypersonics technology has obtained renewed interest since various concepts for future completely reusable Space Transportation Systems (STS) using airbreathing propulsion for the parts of atmospheric flight have been proposed in different countries (e.g. US, CIS, Japan, France, Germany, and UK). To cover major developments in those countries, AGARD FDP has formed the Working Group 18 on 'Hypersonic Experimental and Computational Capabilities - Improvement and Validation'. Of major importance for the proof of feasibility for all these concepts is the definition of an overall convincing philosophy for a 'hypersonics technology development and verification concept' using ground simulation facilities (both experimental and numerical) and flight testing vehicles. Flying at hypersonic Mach numbers using airbreathing propulsion requires highly sophisticated design tools to provide reliable prediction of thrust minus aerodynamic drag to accelerate the vehicle during ascent. Using these design tools, existing uncertainties have to be minimized by a carefully performed code validation process. To a large degree the database required for this validation cannot be obtained on ground. In addition thermal loads due to hypersonic flow have to be predicted accurately by aerothermodynamic flow codes to provide the inputs needed to decide on materials and structures. Heat management for hypersonic flight vehicles is one of the key-issues for any kind of successful flight demonstration. This paper identifies and discusses the role of flight testing during the verification and validation process of advanced hypersonic technology needed for flight in the atmosphere with hypersonic Mach numbers using airbreathing propulsion systems both for weapons and space transportation systems.

  20. Testing the validity of the ray-tracing code GYOTO

    CERN Document Server

    Grould, Marion; Perrin, Guy

    2016-01-01

    In the next few years, the near-infrared interferometer GRAVITY will be able to observe the Galactic center. Astrometric data will be obtained with an anticipated accuracy of 10 $\\mu$as. To analyze these future data, we have developed a code called GYOTO to compute orbits and images. We want to assess the validity and accuracy of GYOTO in a variety of contexts, in particular for stellar astrometry in the Galactic center. Furthermore, we want to tackle and complete a study made on the astrometric displacements that are due to lensing effects of a star of the central parsec with GYOTO. We first validate GYOTO in the weak-deflection limit (WDL) by studying primary caustics and primary critical curves obtained for a Kerr black hole. We compare GYOTO results to available analytical approximations and estimate GYOTO errors using an intrinsic estimator. In the strong-deflection limit (SDL), we choose to compare null geodesics computed by GYOTO and the ray-tracing code named Geokerr. Finally, we use GYOTO to estimate...

  1. New pre-coded food record form validation

    Directory of Open Access Journals (Sweden)

    Víctor Manuel Rodríguez

    2014-09-01

Full Text Available Introduction: For some research fields, simple and accurate food intake quantification tools are needed. The aim of the present work was to design a new self-administered and pre-coded food intake record form and to assess its reliability and validity for quantifying the food intake of an adult population in terms of food or food-group portions. Material and Methods: First, a new food-record form was designed, which included foods usually consumed and which sought to be easy to use, short, and intuitive. The validation process consisted of analyzing both the reliability and validity of the tool's design in a representative population sample (n=330; age: 19-77). Reliability was checked by comparing (Spearman's CC, ICC) food intake (mean value of portions) between two series of five-day food records in a one-month period. Validity was checked by comparing the food intake mean value of the two records with results obtained from a gold standard (24-hour recall). Results: 73.7% of the foods from the record presented correlations higher than 0.5 for reliability (ICCs from 0.38 to 0.94), and 97.4% showed values higher than 0.7 and 68.4% higher than 0.8 for validity (ICCs from 0.28 to 0.97). Conclusions: The solid correlation coefficients and ICCs obtained indicate that this is a reliable tool for the quantification of food intake in adults in terms of food or food-group portions.
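The reliability check described above rests on rank correlation between repeated records. A minimal sketch of Spearman's coefficient as the Pearson correlation of average ranks (the portion counts are illustrative, not study data):

```python
import numpy as np

def rankdata(x):
    """Ranks 1..n, with ties assigned their average rank."""
    x = np.asarray(x, float)
    ranks = np.empty(len(x))
    ranks[np.argsort(x)] = np.arange(1, len(x) + 1)
    for v in np.unique(x):          # average ranks over tied values
        tie = x == v
        ranks[tie] = ranks[tie].mean()
    return ranks

def spearman(a, b):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    ra, rb = rankdata(a), rankdata(b)
    return float(np.corrcoef(ra, rb)[0, 1])

# Perfectly monotonic (though nonlinear) agreement between two
# five-day records of portions for one food item gives rho = 1.
portions_rec1 = [1, 2, 3, 5, 8]
portions_rec2 = [2, 4, 9, 25, 70]
print(spearman(portions_rec1, portions_rec2))
```

Because it depends only on ranks, Spearman's coefficient rewards consistent ordering of intakes across the two records even when absolute portion counts differ, which suits a test-retest reliability design.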

  2. An efficient adaptive arithmetic coding image compression technology

    Institute of Scientific and Technical Information of China (English)

    Wang Xing-Yuan; Yun Jiao-Jiao; Zhang Yong-Lei

    2011-01-01

This paper proposes an efficient lossless image compression scheme for still images based on an adaptive arithmetic coding algorithm. The scheme increases the image compression rate and preserves the quality of the decoded image by combining an adaptive probability model with predictive coding. Using an adaptive model for each encoded image block, the probability of the relevant image block is estimated dynamically, and the decoded image block can accurately recover the encoded image according to the code-book information. The results show that adopting adaptive arithmetic coding for image compression greatly improves the compression rate and constitutes an effective compression technology.
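The benefit of an adaptive probability model can be illustrated by tracking the ideal code length, -log2 p(s), of each symbol under Laplace-smoothed counts that are updated as coding proceeds; an arithmetic coder approaches this total to within a couple of bits. This is a generic order-0 sketch, not the paper's block-based algorithm:

```python
import math
from collections import Counter

def adaptive_code_length(data, alphabet_size=256):
    """Total ideal code length in bits under an order-0 adaptive model:
    each symbol is coded with its current Laplace-smoothed probability
    estimate, then the model's counts are updated (the decoder can
    mirror the same updates, so no table needs to be transmitted)."""
    counts, total, bits = Counter(), 0, 0.0
    for s in data:
        p = (counts[s] + 1) / (total + alphabet_size)
        bits += -math.log2(p)
        counts[s] += 1
        total += 1
    return bits

# The adaptive model spends far fewer bits on skewed data than on
# uniform data, which is where the compression gain comes from.
uniform = bytes(range(256)) * 4
skewed = bytes([0] * 1000 + [1] * 24)
print(adaptive_code_length(skewed) < adaptive_code_length(uniform))  # True
```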

  3. Software and codes for analysis of concentrating solar power technologies.

    Energy Technology Data Exchange (ETDEWEB)

    Ho, Clifford Kuofei

    2008-12-01

    This report presents a review and evaluation of software and codes that have been used to support Sandia National Laboratories concentrating solar power (CSP) program. Additional software packages developed by other institutions and companies that can potentially improve Sandia's analysis capabilities in the CSP program are also evaluated. The software and codes are grouped according to specific CSP technologies: power tower systems, linear concentrator systems, and dish/engine systems. A description of each code is presented with regard to each specific CSP technology, along with details regarding availability, maintenance, and references. A summary of all the codes is then presented with recommendations regarding the use and retention of the codes. A description of probabilistic methods for uncertainty and sensitivity analyses of concentrating solar power technologies is also provided.

  4. AVS 3D Video Coding Technology and System

    Institute of Scientific and Technical Information of China (English)

    Siwei Ma; Shiqi Wang; Wen Gao

    2012-01-01

    Following the success of the audio video standard (AVS) for 2D video coding, in 2008, the China AVS workgroup started developing 3D video (3DV) coding techniques. In this paper, we discuss the background, technical features, and applications of AVS 3DV coding technology. We introduce two core techniques used in AVS 3DV coding: inter-view prediction and enhanced stereo packing coding. We elaborate on these techniques, which are used in the AVS real-time 3DV encoder. An application of the AVS 3DV coding system is presented to show the great practical value of this system. Simulation results show that the advanced techniques used in AVS 3DV coding provide remarkable coding gain compared with techniques used in a simulcast scheme.

  5. INL Experimental Program Roadmap for Thermal Hydraulic Code Validation

    Energy Technology Data Exchange (ETDEWEB)

    Glenn McCreery; Hugh McIlroy

    2007-09-01

Advanced computer modeling and simulation tools and protocols will be heavily relied on for a wide variety of system studies, engineering design activities, and other aspects of the Next Generation Nuclear Plant (NGNP) Very High Temperature Reactor (VHTR), the DOE Global Nuclear Energy Partnership (GNEP), and light-water reactors. The goal is for all modeling and simulation tools to be demonstrated accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are to be used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Recent literature identifies specific experimental principles that must be followed in order to ensure that experimental data meet the standards required for a “benchmark” database. Even for well-conducted experiments, missing experimental details, such as geometrical definition, data reduction procedures, and manufacturing tolerances, have led to poor benchmark calculations. The INL has a long and deep history of research in thermal hydraulics, especially in the 1960s through 1980s when many programs such as LOFT and Semiscale were devoted to light-water reactor safety research, the EBR-II fast reactor was in operation, and a strong geothermal energy program was established. The past can serve as a partial guide for reinvigorating thermal hydraulic research at the laboratory. However, new research programs need to fully incorporate modern experimental methods such as measurement techniques using the latest instrumentation, computerized data reduction, and scaling methodology. The path forward for establishing experimental research for code model validation will require benchmark experiments conducted in suitable facilities located at the INL. This document describes thermal hydraulic facility requirements and candidate buildings and presents examples of suitable validation experiments related

  6. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

Molten salt based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel; ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort toward ER process optimization is made using computer models. A number of models have been developed for this purpose, but as validation of these models is incomplete and often problematic, the simulation results from these models are inherently uncertain. In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties.

  7. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    Directory of Open Access Journals (Sweden)

    David A Springate

Full Text Available Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists, and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time, and sharing clinical code information across platforms and data sources as research objects.

  8. Reactor Fuel Isotopics and Code Validation for Nuclear Applications

    Energy Technology Data Exchange (ETDEWEB)

    Francis, Matthew W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Weber, Charles F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pigni, Marco T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gauld, Ian C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-02-01

    Experimentally measured isotopic concentrations of well characterized spent nuclear fuel (SNF) samples have been collected and analyzed by previous researchers. These sets of experimental data have been used extensively to validate the accuracy of depletion code predictions for given sets of burnups, initial enrichments, and varying power histories for different reactor types. The purpose of this report is to present the diversity of data in a concise manner and summarize the current accuracy of depletion modeling. All calculations performed for this report were done using the Oak Ridge Isotope GENeration (ORIGEN) code, an internationally used irradiation and decay code solver within the SCALE comprehensive modeling and simulation code. The diversity of data given in this report includes key actinides, stable fission products, and radioactive fission products. In general, when using the current ENDF/B-VII.0 nuclear data libraries in SCALE, the major actinides are predicted to within 5% of the measured values. Large improvements were seen for several of the curium isotopes when using improved cross section data found in evaluated nuclear data file ENDF/B-VII.0 as compared to ENDF/B-V-based results. The impact of the flux spectrum on the plutonium isotope concentrations as a function of burnup was also shown. The general accuracy noted for the actinide samples for reactor types with burnups greater than 5,000 MWd/MTU was not observed for the low-burnup Hanford B samples. More work is needed in understanding these large discrepancies. The stable neodymium and samarium isotopes were predicted to within a few percent of the measured values. Large improvements were seen in prediction for a few of the samarium isotopes when using the ENDF/B-VII.0 libraries compared to results obtained with ENDF/B-V libraries. Very accurate predictions were obtained for 133Cs and 153Eu. However, the predicted values for the stable ruthenium and rhodium isotopes varied
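Comparisons of this kind are typically reported as calculated-to-experimental percent differences per nuclide. A minimal sketch with made-up concentrations (not the report's data):

```python
def percent_differences(calculated, measured):
    """Percent difference 100*(C-E)/E for each nuclide: the usual
    figure of merit when validating depletion-code isotopics against
    destructive-assay measurements."""
    return {nuc: 100.0 * (c - measured[nuc]) / measured[nuc]
            for nuc, c in calculated.items()}

# Illustrative (made-up) concentrations in mg per g of initial heavy metal.
calc = {"U-235": 8.1, "Pu-239": 5.6, "Nd-148": 0.92}
meas = {"U-235": 8.0, "Pu-239": 5.5, "Nd-148": 0.95}
diffs = percent_differences(calc, meas)
within_5pct = all(abs(d) <= 5.0 for d in diffs.values())
print(within_5pct)  # True: all nuclides agree within 5%
```

Statements such as "major actinides predicted to within 5% of measured values" summarize exactly this kind of per-nuclide tabulation across the sample set.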

  9. Real-World Hydrogen Technology Validation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Sprik, S.; Kurtz, J.; Wipke, K.; Ramsden, T.; Ainscough, C.; Eudy, L.; Saur, G.

    2012-03-01

    The Department of Energy, the Department of Defense's Defense Logistics Agency, and the Department of Transportation's Federal Transit Administration have funded learning demonstrations and early market deployments to provide insight into applications of hydrogen technologies on the road, in the warehouse, and as stationary power. NREL's analyses validate the technology in real-world applications, reveal the status of the technology, and facilitate the development of hydrogen and fuel cell technologies, manufacturing, and operations. This paper presents the maintenance, safety, and operation data of fuel cells in multiple applications with the reported incidents, near misses, and frequencies. NREL has analyzed records of more than 225,000 kilograms of hydrogen that have been dispensed through more than 108,000 hydrogen fills with an excellent safety record.

  10. Validation of the thermal-hydraulic system code ATHLET based on selected pressure drop and void fraction BFBT tests

    Energy Technology Data Exchange (ETDEWEB)

    Di Marcello, Valentino, E-mail: valentino.marcello@kit.edu; Escalante, Javier Jimenez; Espinoza, Victor Sanchez

    2015-07-15

    Highlights:
    • Simulation of BFBT-BWR steady-state and transient tests with ATHLET.
    • Validation of thermal-hydraulic models based on pressure drop and void fraction measurements.
    • The TRACE system code is used for the comparative study.
    • Predictions are in good agreement with the experiments.
    • Discrepancies are smaller than or comparable to the measurement uncertainty.

    Abstract: Validation and qualification of thermal-hydraulic system codes based on separate effect tests are essential for the reliability of numerical tools applied to nuclear power plant analyses. To this purpose, the Institute for Neutron Physics and Reactor Technology (INR) at the Karlsruhe Institute of Technology (KIT) is involved in validation and qualification activities for various CFD, sub-channel and system codes. In this paper, the capabilities of the thermal-hydraulic code ATHLET are assessed against the experimental results provided within the NUPEC BFBT benchmark, which covers key Boiling Water Reactor (BWR) phenomena. Void fraction and pressure drop measurements in the BFBT bundle, performed under steady-state and transient conditions representative of, e.g., turbine trip and recirculation pump trip events, are compared with the numerical results of ATHLET. The comparison of code predictions with the BFBT data shows good agreement given the experimental uncertainty, and the results are consistent with the trends obtained with similar thermal-hydraulic codes.

  11. On the use of the MISTRA coupled effect test facility for the validation of containment thermal-hydraulics codes

    Energy Technology Data Exchange (ETDEWEB)

    Studer, E.; Dabbene, F.; Magnaud, J. P.; Blumenfeld, L.; Quillico, J. J.; Paillere, H. [CEA Saclay, Gif-sur-Yvette (France)

    2003-07-01

    Twenty-four years after the Three Mile Island accident, hydrogen risk remains a safety issue for current and future Pressurized Water Reactors (PWRs). Predicting the formation of a combustible gas mixture in the complex geometry of a reactor containment depends on an understanding of hydrogen production, complex 3D flow due to gas/steam injection, natural convection, heat transfer by condensation on walls, and the effect of mitigation devices. Lumped-parameter safety codes, mainly developed for full containment analysis, are not able to accurately predict the local gas mixing within the containment. 3D CFD codes are required, but a thorough validation process against well-instrumented experimental data is necessary before they can be used with a high degree of confidence. The MISTRA coupled effect test facility has recently been built at CEA to fulfill these objectives: numerous measurement points in the gaseous volume (temperature and gas concentration) and the use of laser technology (L.D.V. and P.I.V.) provide suitable experimental data for code validation. The in-house CEA-IRSN CAST3M/TONUS code is developed and validated against experimental data provided by this facility. Some of these tests have been proposed to the international community for code benchmarking (MICOCO benchmark and OECD/ISP47 exercise). Finally, extrapolation to global containment scale requires the validation of the code on more complex flow patterns and a detailed investigation of scaling effects. These two items will be the guidelines for future MISTRA tests.

  12. Application and Preliminary Validation of Sub-node Technology in Core Code of COSINE Package

    Institute of Scientific and Technical Information of China (English)

    全国萍; 刘占权; 王苏; 王常辉; 胡啸宇; 许花; 陈义学

    2013-01-01

    The core code of the COSINE package uses the nodal expansion method (NEM) to solve the diffusion equation; NEM is based on node homogenization theory. In the AP1000 core design process, several factors cause material heterogeneity in the axial direction, such as fuel assemblies divided into several areas with different enrichments, the arrangement of spacer grids, and the insertion of control rods, so re-homogenization is needed. Because re-homogenization rests on certain assumptions, it introduces error into the NEM calculation results. The most direct manifestation of this error is the control rod cusping effect. Sub-node technology is proposed in this paper to handle the axial material heterogeneity problem, preliminary validation results are given, and the differential worth of the control rod is calculated. The results show that the sub-node technology greatly reduces the control rod cusping effect.

  13. Perspectives on Validation and Uncertainty Evaluation of SFR Nuclear Design Code

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Moohoon; Choi, Yong Won; Shin, Andong; Suh, Namduk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-05-15

    Fast reactors such as the PGSFR (Prototype Gen-IV Sodium-cooled Fast Reactor) developed by KAERI have fundamental differences from thermal reactors in core characteristics and the associated fuel cycle, which call for new, dedicated code-validation efforts. For current PWRs, nuclear design code systems have been validated using the numerous data accumulated through wide operating experience, and their uncertainty can be assessed by statistical methods. However, validating code systems for SFRs with little operating experience, and particularly for a prototype reactor, requires new approaches. In this study, the current procedure for validation and uncertainty evaluation of nuclear design code systems for PWRs is reviewed, and global approaches for the validation of SFR code systems are surveyed. Through these reviews, perspectives on nuclear design code validation and uncertainty evaluation for SFRs are identified. Further study will be implemented to obtain more insight into code validation.

  14. Unfolding code for neutron spectrometry based on neural nets technology

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J. M.; Vega C, H. R., E-mail: morvymm@yahoo.com.mx [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Apdo. Postal 336, 98000 Zacatecas (Mexico)

    2012-10-15

    The most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches. Novel methods based on Artificial Neural Networks have been widely investigated. In this work, a neutron spectrum unfolding code based on neural nets technology is presented. This unfolding code, called Neutron Spectrometry and Dosimetry by means of Artificial Neural Networks, was designed in a graphical interface under the LabVIEW programming environment. The core of the code is an embedded neural network architecture, previously optimized by the "Robust Design of Artificial Neural Networks Methodology". The code is easy to use, friendly, and intuitive. It was designed for a Bonner sphere system based on a {sup 6}LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. The main feature of the code is that, as input data, only seven count rates measured with a Bonner sphere spectrometer are required to simultaneously unfold the 60 energy bins of the neutron spectrum and to calculate 15 dosimetric quantities for radiation protection purposes. The code generates a full report in html format with all relevant information. (Author)
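The unfolding step described above maps a handful of measured count rates to a many-bin spectrum. A minimal sketch of that mapping as a feedforward net in NumPy; the 7-input/60-output shapes follow the abstract, but the hidden-layer size is arbitrary and the weights are random placeholders, not the trained network from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# 7 Bonner-sphere count rates in -> 60 spectrum energy bins out (shapes from the abstract)
W1, b1 = rng.normal(size=(7, 16)), np.zeros(16)   # hidden layer; size 16 is arbitrary
W2, b2 = rng.normal(size=(16, 60)), np.zeros(60)

def unfold(count_rates):
    """One forward pass: tanh hidden layer, softplus output so fluence stays non-negative."""
    h = np.tanh(count_rates @ W1 + b1)
    return np.logaddexp(0.0, h @ W2 + b2)  # softplus(x) = log(1 + exp(x))

spectrum = unfold(rng.uniform(size=7))
print(spectrum.shape)  # (60,)
```

In the real code the weights come from training against the IAEA response-matrix compilation; here they only illustrate the input/output geometry.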

  15. Validation of the Information/Communications Technology Literacy Test

    Science.gov (United States)

    2016-10-01

    Technical Report 1360: Validation of the Information/Communications Technology Literacy Test. D. Matthew Trippe, Human Resources Research... Contract W91WAS-09-D-0013. The purpose of this effort was to validate a measure of cyber aptitude, the Information/Communications Technology Literacy Test (ICTL), in predicting trainee performance in Information...

  16. Biometric iris image acquisition system with wavefront coding technology

    Science.gov (United States)

    Hsieh, Sheng-Hsun; Yang, Hsi-Wen; Huang, Shao-Hung; Li, Yung-Hui; Tien, Chung-Hao

    2013-09-01

    Biometric signatures for identity recognition have been practiced for centuries. The personal attributes used for a biometric identification system can be classified into two areas: one is based on physiological attributes, such as DNA, facial features, retinal vasculature, fingerprint, hand geometry, iris texture and so on; the other depends on individual behavioral attributes, such as signature, keystroke, voice and gait style. Among these features, iris recognition is one of the most attractive approaches due to its nature of randomness, texture stability over a lifetime, high entropy density and non-invasive acquisition. While the performance of iris recognition on high-quality images is well investigated, few studies have addressed how iris recognition performs on non-ideal image data, especially when the data are acquired in challenging conditions, such as long working distance, dynamic movement of subjects, uncontrolled illumination conditions and so on. There are three main contributions in this paper. Firstly, the optical system parameters, such as magnification and field of view, were optimally designed through first-order optics. Secondly, the irradiance constraints were derived from the optical conservation theorem. Through the relationship between the subject and the detector, we could estimate the limitation on working distance when the camera lens and CCD sensor were known. The working distance is set to 3 m in our system, with pupil diameter 86 mm and CCD irradiance 0.3 mW/cm2. Finally, we employed a hybrid scheme combining eye tracking with a pan-and-tilt system, wavefront coding technology, filter optimization and post signal recognition to implement a robust iris recognition system in dynamic operation. The blurred image was restored to ensure recognition accuracy over a 3 m working distance with 400 mm focal length and F/6.3 aperture optics. The simulation results as well as experiments validate the proposed scheme.

  17. Validation of the JRC Tsunami Propagation and Inundation Codes

    OpenAIRE

    2014-01-01

    In the last years several numerical codes have been developed to analyse tsunami waves. Most of these codes use a finite difference numerical approach giving good results for tsunami wave propagation, but with limitations in modelling inundation processes. The HyFlux2 model has been developed to simulate inundation scenario due to dam break, flash flood and tsunami-wave run-up. The model solves the conservative form of the two-dimensional shallow water equations using a finite volume method. ...
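The conservative shallow-water scheme mentioned above can be illustrated in one dimension. Below is a minimal finite-volume sketch using a Lax–Friedrichs flux on the conservative variables (h, hu) for a dam-break problem; this is a generic textbook scheme, not the actual HyFlux2 solver:

```python
import numpy as np

g = 9.81  # gravitational acceleration

def flux(h, hu):
    """Physical flux of the 1-D shallow water equations in conservative form."""
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h * h])

def lax_friedrichs_step(h, hu, dx, dt):
    """One finite-volume update with the (diffusive but robust) Lax-Friedrichs flux."""
    U = np.array([h, hu])
    F = flux(h, hu)
    a = dt / dx
    # interface fluxes between cells i and i+1
    Fi = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 / a * (U[:, 1:] - U[:, :-1])
    h_new, hu_new = h.copy(), hu.copy()
    h_new[1:-1] -= a * (Fi[0, 1:] - Fi[0, :-1])
    hu_new[1:-1] -= a * (Fi[1, 1:] - Fi[1, :-1])
    return h_new, hu_new

# Dam-break initial condition: deep water on the left, shallow on the right
x = np.linspace(0.0, 1.0, 201)
h = np.where(x < 0.5, 2.0, 1.0)
hu = np.zeros_like(h)
for _ in range(100):  # dt satisfies the CFL condition for these wave speeds
    h, hu = lax_friedrichs_step(h, hu, dx=x[1] - x[0], dt=0.001)
print(h.min(), h.max())  # an intermediate state develops between 1.0 and 2.0
```

Production tsunami/dam-break codes replace the Lax–Friedrichs flux with approximate Riemann solvers and add wetting/drying treatment for inundation, which is where codes like HyFlux2 differ from this sketch.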

  18. Innovative application of bar coding technology to breast milk administration.

    Science.gov (United States)

    Fleischman, Ellen K

    2013-01-01

    Hospitalized infants often receive expressed breast milk, either from their mother or from banked milk. Breast milk provides optimal nutrition for infants but, because it is a body fluid, it carries the risk of disease transmission. Therefore, administering the correct breast milk to hospitalized infants is essential. Bar coding technology, used in hospitals to prevent errors related to medication administration, can be proactively applied to prevent breast milk administration errors. Bar coding systems offer advantages over manual verification processes, including decreasing errors due to human factors and providing automated entry of feedings into the electronic health record. However, potential barriers to successful implementation must be addressed. These barriers include equipment and training costs, increased time to perform the additional steps with bar coding, and work-arounds.

  19. Convergent Validity of O*NET Holland Code Classifications

    Science.gov (United States)

    Eggerth, Donald E.; Bowles, Shannon M.; Tunick, Roy H.; Andrew, Michael E.

    2005-01-01

    The interpretive ease and intuitive appeal of the Holland RIASEC typology have made it nearly ubiquitous in vocational guidance settings. Its incorporation into the Occupational Information Network (O*NET) has moved it another step closer to reification. This research investigated the rates of agreement between Holland code classifications from…

  20. Applications of Bar Code Technology in the Construction Industry

    Science.gov (United States)

    1991-01-01

    …per week. This is an indication that we have much better control of our inventory" (Ryan 87). Producto Machine Company of Bridgeport, Connecticut… marketing and patenting advanced technologies to encourage firms to take on the risks involved. Our country, however, has no such policy. Until it does… assistance in marketing and patenting their achievements. 2) The Department of Defense, which already requires bar codes on all supplies accepted into…

  1. GUIDELESS SPATIAL COORDINATE MEASUREMENT TECHNOLOGY BASED ON CODING POLE

    Institute of Scientific and Technical Information of China (English)

    ZHAO Min; QIU Zongming; QU Jiamin; LIU Hongzhao

    2008-01-01

    A new method of guideless spatial coordinate measurement technology based on a coding pole and vision measurement is proposed. Unequal spacing of the bar code is adopted on the pole, so that the code combination of the pole image in the measuring field is unique. Holographic characteristics of the numeric coding pole are exploited to obtain the pole pose and the pole probe position from any section of bar code on the pole. Spatial coordinates of measuring points can then be obtained by coordinate transform. The contradiction between high resolution and large visual field of the image sensor is resolved, thereby providing a new concept for high-precision surface shape measurement of large objects. The measurement principles of the system are expounded and a mathematical model is established. The measurement equation is evaluated by simulation experiments and the measurement precision is analyzed. Theoretical analysis and simulation experiments prove that this system is characterized by simple structure and wide measurement range. Therefore it can be used in the 3-dimensional coordinate measurement of large objects.
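The coordinate transform mentioned above, from the pole's local frame to the world frame, is a standard rigid-body transform p_w = R·p_l + t once the pole pose (R, t) has been recovered from the image. A minimal sketch; the rotation angle and probe offset are illustrative, not values from the paper:

```python
import numpy as np

def rotation_z(theta):
    """Rotation matrix about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def world_coordinates(R, t, p_local):
    """Map a probe-tip point from the pole frame into the world frame: p_w = R p_l + t."""
    return R @ p_local + t

# Illustrative pose: pole rotated 90 deg about z, origin at (1, 2, 0);
# probe tip sits 0.5 units along the pole's local x-axis
p = world_coordinates(rotation_z(np.pi / 2),
                      np.array([1.0, 2.0, 0.0]),
                      np.array([0.5, 0.0, 0.0]))
print(p)  # -> approximately (1.0, 2.5, 0.0)
```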

  2. Development, Verification and Validation of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    Science.gov (United States)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating has become a requirement. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.
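View factors of the kind CHAR computes numerically have closed forms for simple geometries, which is exactly what such verification comparisons lean on. A sketch of the standard catalog formula for two identical, directly opposed parallel rectangles (a textbook relation, not necessarily one of CHAR's methods):

```python
import math

def view_factor_parallel_rectangles(a, b, c):
    """View factor between two a-by-b rectangles, directly opposed, separated by c."""
    X, Y = a / c, b / c
    A = (1 + X * X) * (1 + Y * Y) / (1 + X * X + Y * Y)
    term = (math.log(math.sqrt(A))
            + X * math.sqrt(1 + Y * Y) * math.atan(X / math.sqrt(1 + Y * Y))
            + Y * math.sqrt(1 + X * X) * math.atan(Y / math.sqrt(1 + X * X))
            - X * math.atan(X) - Y * math.atan(Y))
    return 2.0 / (math.pi * X * Y) * term

F = view_factor_parallel_rectangles(1.0, 1.0, 1.0)
print(round(F, 4))  # ~0.1998 for unit squares one unit apart
```

A numerical view-factor routine (e.g. Monte Carlo ray casting over a mesh) can be verified against closed forms like this one before being applied to general multi-surface enclosures.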

  3. First steps towards a validation of the new burnup and depletion code TNT

    Energy Technology Data Exchange (ETDEWEB)

    Herber, S.C.; Allelein, H.J. [RWTH Aachen (Germany). Inst. for Reactor Safety and Reactor Technology; Research Center Juelich (Germany). Inst. for Energy and Climate Research - Nuclear Waste Disposal and Reactor Safety (IEK-6); Friege, N. [RWTH Aachen (Germany). Inst. for Reactor Safety and Reactor Technology; Kasselmann, S. [Research Center Juelich (Germany). Inst. for Energy and Climate Research - Nuclear Waste Disposal and Reactor Safety (IEK-6)

    2012-11-01

    In the framework of merging the core design calculation capabilities, represented by V.S.O.P., with the accident calculation capabilities, represented by MGT(-3D), the successor of the TINTE code, difficulties were observed in defining an interface between a program backbone and the ORIGEN and ORIGENJUEL codes. Weighing the effort of refactoring the ORIGEN code against writing a new burnup code from scratch led to the decision that writing a new code would be more efficient: it could benefit from existing programming and software engineering tools on the computer code side, and it can use the latest knowledge of nuclear reactions, e.g., consider all documented reaction channels. Therefore a new code with an object-oriented approach was developed at IEK-6. Object-oriented programming is current state of the art and generally improves extensibility and maintainability. The new code was named TNT, which stands for Topological Nuclide Transformation, since the code makes use of the real topology of the nuclear reactions. Here we present first validation results from code-to-code benchmarks against ORIGEN V2.2 and FISPACT2005, supplemented wherever possible by analytical results. The two reference codes were chosen for their high reputation in the fields of fission reactor analysis (ORIGEN) and fusion facilities (FISPACT). (orig.)
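Burnup and depletion codes like TNT, ORIGEN, and FISPACT all integrate systems of nuclide balance (Bateman) equations, and analytical solutions of short chains are the classic cross-check. A minimal sketch for a two-member decay chain, comparing the analytic Bateman solution against a simple explicit Euler integration; the half-lives are illustrative and the numerical scheme is not TNT's actual solver:

```python
import math

def bateman_two_chain(n1_0, lam1, lam2, t):
    """Analytic solution for parent -> daughter -> (stable): returns (N1, N2)."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

def euler_two_chain(n1_0, lam1, lam2, t, steps=200000):
    """Same chain by explicit Euler integration, as a numerical cross-check."""
    dt = t / steps
    n1, n2 = n1_0, 0.0
    for _ in range(steps):
        n1, n2 = n1 - lam1 * n1 * dt, n2 + (lam1 * n1 - lam2 * n2) * dt
    return n1, n2

lam1, lam2 = math.log(2) / 5.0, math.log(2) / 1.0  # half-lives of 5 and 1 time units
a = bateman_two_chain(1.0, lam1, lam2, t=3.0)
b = euler_two_chain(1.0, lam1, lam2, t=3.0)
print(a, b)  # the analytic and numerical solutions agree closely
```

Real depletion solvers handle thousands of coupled nuclides, typically via matrix-exponential or stiff ODE methods rather than per-chain analytic formulas.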

  4. Are VHA administrative location codes valid indicators of specialty substance use disorder treatment?

    Science.gov (United States)

    Harris, Alex H S; Reeder, Rachelle N; Ellerbe, Laura; Bowe, Thomas

    2010-01-01

    Healthcare quality managers and researchers often need to identify specific healthcare events from administrative data. In this study, we examined whether Veterans Health Administration (VHA) clinic stop and bed section codes are reliable indicators of substance use disorder (SUD) treatment as documented in clinical progress notes. For outpatient records with a progress note, SUD clinic stop code, SUD diagnosis code, and mental health procedure code, we found chart documentation of SUD care in 92.0% of 601 records: 82.5% of 372 records with a SUD clinic stop code and SUD diagnosis code but no mental health procedure code, 21.9% of 379 records with a SUD clinic stop code and mental health procedure code but no SUD diagnosis code, and 55.3% of 318 records with a SUD clinic stop code but no SUD diagnosis or mental health procedure code. For inpatient stays with a SUD bed section code and a progress note, we found chart documentation of SUD care in 99.0% of 699 records accompanied by a SUD diagnosis but 0% of 39 records without a SUD diagnosis. These results provide validity evidence and caveats to researchers and VHA quality managers who might use SUD specialty location codes as indicators of SUD specialty care.
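The agreement rates above are positive predictive values: the share of records carrying a given code combination whose charts actually document SUD care. A minimal sketch that recomputes the outpatient percentages; the documented counts are back-computed from the abstract's percentages and totals, so they are approximate, and the helper name is ours:

```python
def documentation_rate(documented, total):
    """Positive predictive value of an administrative code combination, in percent."""
    return 100.0 * documented / total

# Outpatient strata: (chart-documented records, total records); counts back-computed
strata = {
    "stop + dx, no procedure": (307, 372),   # reported as 82.5%
    "stop + procedure, no dx": (83, 379),    # reported as 21.9%
    "stop code only": (176, 318),            # reported as 55.3%
}
for name, (doc, total) in strata.items():
    print(name, round(documentation_rate(doc, total), 1))
```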

  5. Validation of a plant dynamics code for 4S - Test analysis of natural circulation behavior

    Energy Technology Data Exchange (ETDEWEB)

    Sebe, F.; Horie, H.; Matsumiya, H. [Toshiba Corporation, 8 Shinsugita-Cho, Isogo-Ku, Yokohama, 235-8523 (Japan); Fanning, T. H. [Argonne National Laboratory, 9700 S Cass Ave, Argonne, IL 60439 (United States)

    2012-07-01

    A plant transient dynamics code for a sodium-cooled fast reactor was developed by Toshiba. The code is used to evaluate the safety performance of the Super-Safe, Small, and Simple reactor (4S) for Anticipated Operational Occurrences (AOOs), Design Basis Accidents (DBAs) and Beyond DBA (BDBA). The code is currently undergoing verification and validation (V and V). As part of this validation, a test analysis of the Shutdown Heat Removal Test (SHRT)-17, performed in the Experimental Breeder Reactor (EBR)-II, was conducted. SHRT-17 is a protected loss-of-flow test. The purpose of this validation is to confirm the capability of the code to simulate natural circulation behavior of the plant. As a result, good agreement is shown between the analytical results and the measured data, which were available from an instrumented subassembly. The detailed validation results for the natural circulation behavior are described in this paper. (authors)

  6. Validation Study of CODES Dragonfly Network Model with Theta Cray XC System

    Energy Technology Data Exchange (ETDEWEB)

    Mubarak, Misbah [Argonne National Lab. (ANL), Argonne, IL (United States); Ross, Robert B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-05-31

    This technical report describes the experiments performed to validate the MPI performance measurements reported by the CODES dragonfly network simulation with the Theta Cray XC system at the Argonne Leadership Computing Facility (ALCF).

  7. Validation of the THIRMAL-1 melt-water interaction code

    Energy Technology Data Exchange (ETDEWEB)

    Chu, C.C.; Sienicki, J.J.; Spencer, B.W. [Argonne National Lab., IL (United States)

    1995-09-01

    The THIRMAL-1 computer code has been used to calculate nonexplosive LWR melt-water interactions both in-vessel and ex-vessel. To support the application of the code and enhance its acceptability, THIRMAL-1 has been compared with available data from two of the ongoing FARO experiments at Ispra and two of the Corium Coolant Mixing (CCM) experiments performed at Argonne. THIRMAL-1 calculations for the FARO Scoping Test and Quenching Test 2, as well as the CCM-5 and -6 experiments, were found to be in excellent agreement with the experimental results. This lends confidence to the modeling incorporated in the code describing melt stream breakup due to the growth of both Kelvin-Helmholtz and large wave instabilities, the sizes of droplets formed, and multiphase flow and heat transfer in the mixing zone surrounding and below the metallic melt phase. As part of the analysis of the FARO tests, a mechanistic model was developed to calculate the prefragmentation that may have occurred when melt relocated from the release vessel to the water surface; the model was compared with the relevant data from FARO.

  8. Validation of InnoSPICE for technology transfer

    OpenAIRE

    Mitašiūnas, Antanas; Besson, Jeremy Daniel; Boronowsky, Michael; Woronowicz, Tanja

    2015-01-01

    Innovation and technology transfer consist mainly of process-oriented activities and can be described in process-oriented terms by an innovation and technology transfer process capability model such as InnoSPICE. To verify such a thesis, an extended validation of the InnoSPICE adequacy for different factual innovation and technology transfer activities is needed. The purpose of this paper is to validate the InnoSPICE model for technology transfer led by a technology developer based on capabil...

  9. Validation of the NCC Code for Staged Transverse Injection and Computations for a RBCC Combustor

    Science.gov (United States)

    Ajmani, Kumud; Liu, Nan-Suey

    2005-01-01

    The NCC code was validated for a case involving staged transverse injection into Mach 2 flow behind a rearward-facing step, with comparisons against experimental data and solutions from the FPVortex code. The NCC code was then used to perform computations to study fuel-air mixing for the combustor of a candidate rocket-based combined cycle engine geometry. Comparisons with a one-dimensional analysis and a three-dimensional code (VULCAN) were performed to assess the qualitative and quantitative performance of the NCC solver.

  10. Automated facial coding: validation of basic emotions and FACS AUs in FaceReader

    NARCIS (Netherlands)

    P. Lewinski; T.M. den Uyl; C. Butler

    2014-01-01

    In this study, we validated automated facial coding (AFC) software—FaceReader (Noldus, 2014)—on 2 publicly available and objective datasets of human expressions of basic emotions. We present the matching scores (accuracy) for recognition of facial expressions and the Facial Action Coding System (FAC

  11. Standards, building codes, and certification programs for solar technology applications

    Energy Technology Data Exchange (ETDEWEB)

    Riley, J. D.; Odland, R.; Barker, H.

    1979-07-01

    This report is a primer on solar standards development. It explains the development of standards, building code provisions, and certification programs and their relationship to the emerging solar technologies. These areas are important in the commercialization of solar technology because they lead to the attainment of two goals: the development of an industry infrastructure and consumer confidence. Standards activities in the four phases of the commercialization process (applied research, development, introduction, and diffusion) are discussed in relation to institutional issues. Federal policies have been in operation for a number of years to accelerate the development process for solar technology. These policies are discussed in light of the Office of Management and Budget (OMB) Circular on federal interaction with the voluntary consensus system, and in light of current activities of DOE, HUD, and other interested federal agencies. The appendices cover areas of specific interest to different audiences: activities on the state and local level; and standards, building codes, and certification programs for specific technologies. In addition, a contract for the development of a model solar document let by DOE to a model consortium is excerpted in the Appendix.

  12. Needs and opportunities for CFD-code validation

    Energy Technology Data Exchange (ETDEWEB)

    Smith, B.L. [Paul Scherrer Institute, Villigen (Switzerland)]|[Paul Scherrer Instiute, Wuerenlingen (Switzerland)

    1996-06-01

    The conceptual design for the ESS target consists of a horizontal cylinder containing a liquid metal - mercury is considered in the present study - which circulates by forced convection and carries away the waste heat generated by the spallation reactions. The protons enter the target via a beam window, which must withstand the thermal, mechanical and radiation loads to which it is subjected. For a beam power of 5 MW, it is estimated that about 3.3 MW of waste heat would be deposited in the target material and associated structures. It is intended to confirm, by detailed thermal-hydraulics calculations, that a convective flow of the liquid metal target material can effectively remove the waste heat. The present series of Computational Fluid Dynamics (CFD) calculations has indicated that a single-inlet target design leads to excessive local overheating, but a multiple-inlet design is coolable. With this option, inlet flow streams, two from the sides and one from below, merge over the target window, cooling the window itself in crossflow and carrying away the heat generated volumetrically in the mercury with a strong axial flow down the exit channel. The three intersecting streams form a complex, three-dimensional, swirling flow field in which critical heat transfer processes take place. In order to produce trustworthy code simulations, it is necessary that the mesh resolution is adequate for the thermal-hydraulic conditions encountered and that the physical models used by the code are appropriate to the fluid dynamic environment. The former relies on considerable user experience in the application of the code, and the latter assurance is best gained in the context of controlled benchmark activities where measured data are available. Such activities will serve to quantify the accuracy of given models and to identify potential problem areas for the numerical simulation that may not be obvious from global heat and mass balance considerations.

  13. Test Data for USEPR Severe Accident Code Validation

    Energy Technology Data Exchange (ETDEWEB)

    J. L. Rempe

    2007-05-01

    This document identifies data that can be used for assessing various models embodied in severe accident analysis codes. Phenomena considered in this document, which were limited to those anticipated to be of interest in assessing severe accidents in the USEPR developed by AREVA, include:
    • Fuel Heatup and Melt Progression
    • Reactor Coolant System (RCS) Thermal Hydraulics
    • In-Vessel Molten Pool Formation and Heat Transfer
    • Fuel/Coolant Interactions during Relocation
    • Debris Heat Loads to the Vessel
    • Vessel Failure
    • Molten Core Concrete Interaction (MCCI) and Reactor Cavity Plug Failure
    • Melt Spreading and Coolability
    • Hydrogen Control
    Each section of this report discusses one phenomenon of interest to the USEPR. Within each section, an effort is made to describe the phenomenon and identify what data are available for modeling it. As noted in this document, models in US accident analysis codes (MAAP, MELCOR, and SCDAP/RELAP5) differ. Where possible, this report identifies previous assessments that illustrate the impact of modeling differences on predicting various phenomena. Finally, recommendations regarding the status of data available for modeling USEPR severe accident phenomena are summarized.

  14. Improving radiopharmaceutical supply chain safety by implementing bar code technology.

    Science.gov (United States)

    Matanza, David; Hallouard, François; Rioufol, Catherine; Fessi, Hatem; Fraysse, Marc

    2014-11-01

    The aim of this study was to describe and evaluate an approach for improving radiopharmaceutical supply chain safety by implementing bar code technology. We first evaluated the current situation of our radiopharmaceutical supply chain and, by means of the ALARM protocol, analysed two dispensing errors that occurred in our department. Thereafter, we implemented a bar code system to secure selected key stages of the radiopharmaceutical supply chain. Finally, we evaluated the cost of this implementation, from overtime, to overheads, to additional radiation exposure to workers. An analysis of the events that occurred revealed a lack of identification of prepared or dispensed drugs. Moreover, the evaluation of the current radiopharmaceutical supply chain showed that the dispensation and injection steps needed to be further secured. The bar code system was used to reinforce product identification at three selected key stages: at usable stock entry; at preparation-dispensation; and during administration, allowing conformity between the labelling of the delivered product (identity and activity) and the prescription to be checked. The extra time needed for these steps had no impact on the number or successful conduct of examinations. The investment cost was kept low (2600 euros for new material and 30 euros a year for additional supplies) because of pre-existing computing equipment. With regard to radiation exposure, the labelling and scanning of radiolabelled preparation vials caused only an insignificant additional exposure of workers' hands under the new organization. Implementation of bar code technology is now an essential part of a global approach to securing optimum patient management.

  15. HD Photo: a new image coding technology for digital photography

    Science.gov (United States)

    Srinivasan, Sridhar; Tu, Chengjie; Regunathan, Shankar L.; Sullivan, Gary J.

    2007-09-01

    This paper introduces the HD Photo coding technology developed by Microsoft Corporation. The storage format for this technology is now under consideration in the ITU-T/ISO/IEC JPEG committee as a candidate for standardization under the name JPEG XR. The technology was developed to address end-to-end digital imaging application requirements, particularly including the needs of digital photography. HD Photo includes features such as good compression capability, high dynamic range support, high image quality capability, lossless coding support, full-format 4:4:4 color sampling, simple thumbnail extraction, embedded bitstream scalability of resolution and fidelity, and degradation-free compressed domain support of key manipulations such as cropping, flipping and rotation. HD Photo has been designed to optimize image quality and compression efficiency while also enabling low-complexity encoding and decoding implementations. To ensure low complexity for implementations, the design features have been incorporated in a way that not only minimizes the computational requirements of the individual components (including consideration of such aspects as memory footprint, cache effects, and parallelization opportunities) but results in a self-consistent design that maximizes the commonality of functional processing components.

  16. Validation of physics and thermalhydraulic computer codes for advanced Candu reactor applications

    Energy Technology Data Exchange (ETDEWEB)

    Wren, D.J.; Popov, N.; Snell, V.G. [Atomic Energy of Canada Ltd, (Canada)

    2004-07-01

    Atomic Energy of Canada Ltd. (AECL) is developing an Advanced Candu Reactor (ACR) that is an evolutionary advancement of the currently operating Candu 6 reactors. The ACR is being designed to produce electrical power for a capital cost and at a unit-energy cost significantly less than that of the current reactor designs. The ACR retains the modular Candu concept of horizontal fuel channels surrounded by a heavy water moderator. However, ACR uses slightly enriched uranium fuel compared to the natural uranium used in Candu 6. This achieves the twin goals of improved economics (via large reductions in the heavy water moderator volume and replacement of the heavy water coolant with light water coolant) and improved safety. AECL has developed and implemented a software quality assurance program to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. Since the basic design of the ACR is equivalent to that of the Candu 6, most of the key phenomena associated with the safety analyses of ACR are common, and the Candu industry standard tool-set of safety analysis codes can be applied to the analysis of the ACR. A systematic assessment of computer code applicability addressing the unique features of the ACR design was performed covering the important aspects of the computer code structure, models, constitutive correlations, and validation database. Arising from this assessment, limited additional requirements for code modifications and extensions to the validation databases have been identified. This paper provides an outline of the AECL software quality assurance program process for the validation of computer codes used to perform physics and thermal-hydraulics safety analyses of the ACR. It describes the additional validation work that has been identified for these codes and the planned, and ongoing, experimental programs to extend the code validation as required to address specific ACR design

  17. Automation of RELAP5 input calibration and code validation using genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Phung, Viet-Anh, E-mail: vaphung@kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Vorobyev, Yury, E-mail: yura3510@gmail.com [National Research Center “Kurchatov Institute”, Kurchatov square 1, Moscow 123182 (Russian Federation); Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden)

    2016-04-15

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the
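
    The calibration loop described in the abstract (a genetic algorithm searching uncertain input parameters to minimize a weighted discrepancy over multiple SRQs) can be illustrated with a toy surrogate standing in for the RELAP5 model; the surrogate function, parameter bounds, weights, and GA settings below are all hypothetical:

```python
import random

# Toy stand-in for a code run: maps two uncertain inputs to two SRQs.
# Purely illustrative; a real calibration would invoke RELAP5 here.
def simulate(params):
    a, b = params
    return (2.0 * a + b, a * b)

EXPERIMENT = (4.0, 2.0)   # "measured" SRQs (exactly matched by a=1, b=2)
WEIGHTS = (1.0, 1.0)      # per-SRQ weighting factors in the fitness function

def fitness(params):
    # Weighted, normalized discrepancy between simulated and measured SRQs
    sim = simulate(params)
    return sum(w * ((s - e) / e) ** 2
               for w, s, e in zip(WEIGHTS, sim, EXPERIMENT))

def calibrate(pop_size=40, generations=60, bounds=(0.0, 5.0)):
    random.seed(1)
    pop = [[random.uniform(*bounds) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 4]                  # selection
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = random.sample(elite, 2)
            child = [random.choice(pair) for pair in zip(p1, p2)]  # crossover
            if random.random() < 0.3:                              # mutation
                i = random.randrange(2)
                child[i] = min(max(child[i] + random.gauss(0.0, 0.2),
                                   bounds[0]), bounds[1])
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = calibrate()
print(best, fitness(best))
```

    The same structure scales to many parameters and SRQs; the abstract's point about normalization and weighting corresponds to choosing EXPERIMENT-relative scaling and WEIGHTS so that no single SRQ dominates the fitness.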

  18. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    Science.gov (United States)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open-source code to model wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation, and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at the Oregon State University's Directional Wave Basin at Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable Power-Take-Off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be
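
    The governing equation referred to above is the Cummins formulation; written in standard notation (not quoted from the WEC-Sim documentation) for the 6-DOF displacement vector x, it reads:

```latex
(M + A_\infty)\,\ddot{x}(t)
  + \int_0^{t} K(t-\tau)\,\dot{x}(\tau)\,\mathrm{d}\tau
  + C\,x(t)
  = F_{\mathrm{exc}}(t) + F_{\mathrm{PTO}}(t)
```

    where M is the mass matrix, A_inf the added mass at infinite frequency, K the radiation impulse-response kernel, C the hydrostatic stiffness, and the right-hand side collects the wave excitation and power-take-off forces.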

  19. NASA's First New Millennium Deep-Space Technology Validation Flight

    Science.gov (United States)

    Lehman, David H.; Rayman, Marc D.

    1996-01-01

    Planned for launch in 1998, the first flight of NASA's New Millennium Program will validate selected breakthrough technologies required for future low-cost, low-mass, space science missions. The principal objective is to validate these advanced technologies thoroughly enough that subsequent users may be confident of their performance, thus reducing the cost and risk of science missions in the 21st century.

  20. A proposed framework for computational fluid dynamics code calibration/validation

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1993-12-31

    The paper reviews the terminology and methodology that have been introduced during the last several years for building confidence in the predictions from Computational Fluid Dynamics (CFD) codes. Code validation terminology developed for nuclear reactor analyses and aerospace applications is reviewed and evaluated. Currently used terminology such as "calibrated code," "validated code," and "validation experiment" is discussed along with the shortcomings and criticisms of these terms. A new framework is proposed for building confidence in CFD code predictions that overcomes some of the difficulties of past procedures and delineates the causes of uncertainty in CFD predictions. Building on previous work, new definitions of code verification and calibration are proposed. These definitions provide more specific requirements for the knowledge level of the flow physics involved and the solution accuracy of the given partial differential equations. As part of the proposed framework, categories are also proposed for flow physics research, flow modeling research, and the application of numerical predictions. The contributions of physical experiments, analytical solutions, and other numerical solutions are discussed, showing that each should be designed to achieve a distinctively separate purpose in building confidence in the accuracy of CFD predictions. A number of examples are given for each approach to suggest methods for obtaining the highest value for CFD code quality assurance.

  1. How Developments in Psychology and Technology Challenge Validity Argumentation

    Science.gov (United States)

    Mislevy, Robert J.

    2016-01-01

    Validity is the sine qua non of properties of educational assessment. While a theory of validity and a practical framework for validation have emerged over the past decades, most of the discussion has addressed familiar forms of assessment and psychological framings. Advances in digital technologies and in cognitive and social psychology have…

  2. A Revised Validation Process for Ice Accretion Codes

    Science.gov (United States)

    Wright, William B.; Porter, Christopher E.

    2017-01-01

    A research project is underway at NASA Glenn to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. This report will present results from the latest LEWICE release, version 3.5. This program differs from previous releases in its ability to model mixed phase and ice crystal conditions such as those encountered inside an engine. It also has expanded capability to use structured grids and a new capability to use results from unstructured grid flow solvers. A quantitative comparison of the results against a database of ice shapes that have been generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. This paper will extend the comparison of ice shapes between LEWICE 3.5 and experimental data from a previous paper. Comparisons of lift and drag are made between data measured on experimentally obtained ice shapes and CFD results computed on simulated (LEWICE) ice shapes. Comparisons are also made between experimentally collected and simulated performance data on select experimental ice shapes to ensure that the CFD solver, FUN3D, is valid within the flight regime. The results show that the predicted results are within the accuracy limits of the experimental data for the majority of cases.

  3. Validation and comparison of two-phase flow modeling capabilities of CFD, sub channel and system codes by means of post-test calculations of BFBT transient tests

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Wadim; Manes, Jorge Perez; Imke, Uwe; Escalante, Javier Jimenez; Espinoza, Victor Sanchez, E-mail: victor.sanchez@kit.edu

    2013-10-15

    Highlights: • Simulation of BFBT turbine and pump transients at multiple scales. • CFD, sub-channel and system codes are used for the comparative study. • Heat transfer models are compared to identify differences between the code predictions. • All three scales predict results in good agreement with experiment. • Subcooled boiling models are identified as a field for future research. -- Abstract: The Institute for Neutron Physics and Reactor Technology (INR) at the Karlsruhe Institute of Technology (KIT) is involved in the validation and qualification of modern thermal-hydraulic simulation tools at various scales. In the present paper, the prediction capabilities of four codes from three different scales – NEPTUNE_CFD as a fine-mesh computational fluid dynamics code, SUBCHANFLOW and COBRA-TF as sub-channel codes and TRACE as a system code – are assessed with respect to their two-phase flow modeling capabilities. The subject of the investigations is the well-known and widely used database provided within the NUPEC BFBT benchmark related to BWRs. Void fraction measurements simulating a turbine trip and a re-circulation pump trip are provided at several axial levels of the bundle. The prediction capabilities of the codes for transient conditions with various combinations of boundary conditions are validated by comparing the code predictions with the experimental data. In addition, the physical models of the different codes are described and compared to each other in order to explain the different results and to identify areas for further improvements.

  4. Development of an MCNP-tally based burnup code and validation through PWR benchmark exercises

    Energy Technology Data Exchange (ETDEWEB)

    El Bakkari, B. [ERSN-LMR, Department of physics, Faculty of Sciences P.O.Box 2121, Tetuan (Morocco)], E-mail: bakkari@gmail.com; El Bardouni, T.; Merroun, O.; El Younoussi, Ch.; Boulaich, Y. [ERSN-LMR, Department of physics, Faculty of Sciences P.O.Box 2121, Tetuan (Morocco); Chakir, E. [EPTN-LPMR, Faculty of Sciences Kenitra (Morocco)

    2009-05-15

    The aim of this study is to evaluate the capabilities of a newly developed burnup code called BUCAL1. The code provides the full capabilities of the Monte Carlo code MCNP5, through the use of the MCNP tally information. BUCAL1 uses the fourth-order Runge-Kutta method with a predictor-corrector approach as the integration method to determine the fuel composition at a desired burnup step. Validation of BUCAL1 was done by code-to-code comparison. Results from two different kinds of codes are employed. The first one is CASMO-4, a deterministic multi-group two-dimensional transport code. The second kind comprises MCODE and MOCUP, two coupled MCNP-ORIGEN codes. These codes use different burnup algorithms to solve the depletion equation system. Eigenvalue and isotope concentrations were compared for two PWR uranium and thorium benchmark exercises at cold (300 K) and hot (900 K) conditions, respectively. The eigenvalue comparison between BUCAL1 and the aforementioned two kinds of codes shows a good prediction of the systems' k-inf values during the entire burnup history, and the maximum difference is within 2%. The differences between the BUCAL1 isotope concentrations and the predictions of CASMO-4, MCODE and MOCUP are generally small, and only for a few sets of isotopes do these differences exceed 10%.
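
    The integration scheme named in the abstract can be illustrated on a toy depletion system dN/dt = AN; this sketch shows only the classical fourth-order Runge-Kutta step (BUCAL1's predictor-corrector flux update between burnup steps is omitted), and the two-nuclide chain and rate constants are hypothetical:

```python
import numpy as np

# Toy two-nuclide chain: nuclide 0 transmutes into nuclide 1, which itself
# decays. The rate constants (1/s) are hypothetical one-group values.
lam0, lam1 = 1.0e-4, 2.0e-5
A = np.array([[-lam0, 0.0],
              [ lam0, -lam1]])

def rk4_step(N, dt):
    # Classical fourth-order Runge-Kutta step for dN/dt = A @ N
    k1 = A @ N
    k2 = A @ (N + 0.5 * dt * k1)
    k3 = A @ (N + 0.5 * dt * k2)
    k4 = A @ (N + dt * k3)
    return N + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

N = np.array([1.0, 0.0])        # initial concentrations (arbitrary units)
dt, t_end = 50.0, 1.0e4
for _ in range(int(t_end / dt)):
    N = rk4_step(N, dt)

# Bateman (analytic) solution of the same chain, for comparison
exact0 = np.exp(-lam0 * t_end)
exact1 = lam0 / (lam0 - lam1) * (np.exp(-lam1 * t_end) - np.exp(-lam0 * t_end))
print(N, [exact0, exact1])
```

    For this linear chain the RK4 result tracks the Bateman solution to high accuracy; in an actual burnup calculation the matrix A would be rebuilt from MCNP tally fluxes and cross sections at each step, which is where the predictor-corrector treatment enters.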

  5. A proposed methodology for computational fluid dynamics code verification, calibration, and validation

    Science.gov (United States)

    Aeschliman, D. P.; Oberkampf, W. L.; Blottner, F. G.

    Verification, calibration, and validation (VCV) of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. The exact manner in which code VCV activities are planned and conducted, however, is critically important. It is suggested that the way in which code validation, in particular, is often conducted--by comparison to published experimental data obtained for other purposes--is in general difficult and unsatisfactory, and that a different approach is required. This paper describes a proposed methodology for CFD code VCV that meets the technical requirements and is philosophically consistent with code development needs. The proposed methodology stresses teamwork and cooperation between code developers and experimentalists throughout the VCV process, and takes advantage of certain synergisms between CFD and experiment. A novel approach to uncertainty analysis is described which can both distinguish between and quantify various types of experimental error, and whose attributes are used to help define an appropriate experimental design for code VCV experiments. The methodology is demonstrated with an example of laminar, hypersonic, near perfect gas, 3-dimensional flow over a sliced sphere/cone of varying geometrical complexity.

  6. San Onofre PWR Data for Code Validation of MOX Fuel Depletion Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, O.W.

    1999-09-01

    The isotopic composition of mixed-oxide fuel (fabricated with both uranium and plutonium isotopes) discharged from reactors is of interest to the Fissile Material Disposition Program. The validation of depletion codes used to predict isotopic compositions of MOX fuel, as for uranium-only fueled reactors, is thus very important. The EEI-Westinghouse Plutonium Recycle Demonstration Program was conducted to examine the use of MOX fuel in the San Onofre PWR, Unit I, during cycles 2 and 3. The data usually required as input to depletion codes, either one-dimensional or lattice codes, were taken from various sources and compiled into this report. Where data were either lacking or determined inadequate, the appropriate data were supplied from other references. The reactor operations and design data, together with the isotopic analyses, were considered to be of sufficient quality for depletion code validation.

  7. Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code

    Science.gov (United States)

    Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William

    2006-01-01

    The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued, correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation processes will use new/improved low Earth orbit (LEO) environmental models with a recently improved International Space Station (ISS) shield model to validate computational models and procedures using measured data aboard ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.

  8. Automated face analysis by feature point tracking has high concurrent validity with manual FACS coding.

    Science.gov (United States)

    Cohn, J F; Zlochower, A J; Lien, J; Kanade, T

    1999-01-01

    The face is a rich source of information about human behavior. Available methods for coding facial displays, however, are human-observer dependent, labor intensive, and difficult to standardize. To enable rigorous and efficient quantitative measurement of facial displays, we have developed an automated method of facial display analysis. In this report, we compare the results of this automated system with those of manual FACS (Facial Action Coding System, Ekman & Friesen, 1978a) coding. One hundred university students were videotaped while performing a series of facial displays. The image sequences were coded from videotape by certified FACS coders. Fifteen action units and action unit combinations that occurred a minimum of 25 times were selected for automated analysis. Facial features were automatically tracked in digitized image sequences using a hierarchical algorithm for estimating optical flow. The measurements were normalized for variation in position, orientation, and scale. The image sequences were randomly divided into a training set and a cross-validation set, and discriminant function analyses were conducted on the feature point measurements. In the training set, average agreement with manual FACS coding was 92% or higher for action units in the brow, eye, and mouth regions. In the cross-validation set, average agreement was 91%, 88%, and 81% for action units in the brow, eye, and mouth regions, respectively. Automated face analysis by feature point tracking demonstrated high concurrent validity with manual FACS coding.
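
    The classification stage (discriminant function analysis on normalized feature-point measurements) can be sketched with a two-class Fisher linear discriminant; the synthetic "displacement" data below are purely illustrative and unrelated to the study's videotapes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for normalized feature-point displacements
# (e.g. "neutral" vs. "action unit present"). Values are illustrative only.
n = 100
class0 = rng.normal([0.0, 0.0], 0.3, size=(n, 2))   # neutral
class1 = rng.normal([1.0, 0.8], 0.3, size=(n, 2))   # action unit present
X = np.vstack([class0, class1])
y = np.array([0] * n + [1] * n)

# Fisher linear discriminant: w = Sw^-1 (mu1 - mu0)
mu0, mu1 = class0.mean(axis=0), class1.mean(axis=0)
Sw = np.cov(class0.T) + np.cov(class1.T)    # pooled within-class scatter
w = np.linalg.solve(Sw, mu1 - mu0)
threshold = w @ (mu0 + mu1) / 2.0           # midpoint decision boundary

pred = (X @ w > threshold).astype(int)
accuracy = (pred == y).mean()
print(f"training agreement: {accuracy:.0%}")
```

    A study-grade pipeline would add cross-validation and handle many action-unit classes, but the projection-and-threshold structure is the same.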

  9. Validation of system codes RELAP5 and SPECTRA for natural convection boiling in narrow channels

    Energy Technology Data Exchange (ETDEWEB)

    Stempniewicz, M.M., E-mail: stempniewicz@nrg.eu; Slootman, M.L.F.; Wiersema, H.T.

    2016-10-15

    Highlights: • Computer codes RELAP5/Mod3.3 and SPECTRA 3.61 validated for boiling in narrow channels. • Validated codes can be used for LOCA analyses in research reactors. • Code validation based on natural convection boiling in narrow channels experiments. - Abstract: Safety analyses of LOCA scenarios in nuclear power plants are performed with so-called thermal-hydraulic system codes, such as RELAP5. Such codes are validated for typical fuel geometries applied in nuclear power plants. The question considered by this article is whether the codes can be applied to LOCA analyses in research reactors, in particular when exceeding CHF in very narrow channels. In order to answer this question, validation calculations were performed with two thermal-hydraulic system codes: RELAP and SPECTRA. The validation was based on natural convection boiling in narrow channels experiments performed by Prof. Monde et al. in the years 1990–2000. In total 42 vertical tube and annulus experiments were simulated with both codes. Good agreement of the calculated values with the measured data was observed. The main conclusions are: • The computer codes RELAP5/Mod 3.3 (US NRC version) and SPECTRA 3.61 have been validated for natural convection boiling in narrow channels using the experiments of Monde. The experiments covered a range of dimensions that includes the values observed in typical research reactors. Therefore it is concluded that both codes are validated and can be used for LOCA analyses in research reactors, including natural convection boiling. The applicability range of the present validation is: hydraulic diameters of 1.1 ⩽ D_hyd ⩽ 9.0 mm, heated lengths of 0.1 ⩽ L ⩽ 1.0 m, pressures of 0.10 ⩽ P ⩽ 0.99 MPa. In most calculations the burnout was predicted to occur at lower power than that observed in the experiments. In several cases the burnout was observed at higher power. The overprediction was not larger than 16% in RELAP and 15% in

  10. Validation of the Subchannel Code SUBCHANFLOW Using the NUPEC PWR Tests (PSBT)

    Directory of Open Access Journals (Sweden)

    Uwe Imke

    2012-01-01

    SUBCHANFLOW is a computer code to analyze thermal-hydraulic phenomena in the core of pressurized water reactors, boiling water reactors, and innovative reactors operated with gas or liquid metal as coolant. As part of the ongoing assessment efforts, the code has been validated by using experimental data from the NUPEC PWR Subchannel and Bundle Tests (PSBT). The database includes single-phase flow bundle outlet temperature distributions, steady-state and transient void distributions, and critical power measurements. The validation work has demonstrated that the two-phase flow empirical knowledge base implemented in SUBCHANFLOW is appropriate to describe key mechanisms of the experimental investigations with acceptable accuracy.

  11. Some Examples of the Application and Validation of the NUFT Subsurface Flow and Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Nitao, J J

    2001-08-01

    This report was written as partial fulfillment of a subcontract from DOD/DOE Strategic Environmental Research and Development Program (SERDP) as part of a project directed by the U.S. Army Engineer Research and Development Center, Waterways Experiment Station (WES), Vicksburg, Mississippi. The report documents examples of field validation of the Non-isothermal Unsaturated-saturated Flow and Transport model (NUFT) code for environmental remediation, with emphasis on soil vapor extraction, and describes some of the modifications needed to integrate the code into the DOD Groundwater Modeling System (GMS, 2000). Note that this report highlights only a subset of the full capabilities of the NUFT code.

  12. Overview of NASA Multi-Dimensional Stirling Convertor Code Development and Validation Effort

    Science.gov (United States)

    Tew, Roy C.; Cairelli, James E.; Ibrahim, Mounir B.; Simon, Terrence W.; Gedeon, David

    2003-01-01

    A NASA grant has been awarded to Cleveland State University (CSU) to develop a multi-dimensional (multi-D) Stirling computer code with the goals of improving loss predictions and identifying component areas for improvements. The University of Minnesota (UMN) and Gedeon Associates are teamed with CSU. Development of test rigs at UMN and CSU and validation of the code against test data are part of the effort. The one-dimensional (1-D) Stirling codes used for design and performance prediction do not rigorously model regions of the working space where abrupt changes in flow area occur (such as manifolds and other transitions between components). Certain hardware experiences have demonstrated large performance gains by varying manifolds and heat exchanger designs to improve flow distributions in the heat exchangers. 1-D codes were not able to predict these performance gains. An accurate multi-D code should improve understanding of the effects of area changes along the main flow axis, sensitivity of performance to slight changes in internal geometry, and, in general, the understanding of various internal thermodynamic losses. The commercial CFD-ACE code has been chosen for development of the multi-D code. This 2-D/3-D code has highly developed pre- and post-processors, and moving boundary capability. Preliminary attempts at validation of CFD-ACE models of MIT gas spring and "two space" test rigs were encouraging. Also, CSU's simulations of the UMN oscillating-flow rig compare well with flow visualization results from UMN. A complementary Department of Energy (DOE) Regenerator Research effort is aiding in development of regenerator matrix models that will be used in the multi-D Stirling code. This paper reports on the progress and challenges of this multi-D code development effort.

  13. VULCAN: an Open-Source, Validated Chemical Kinetics Python Code for Exoplanetary Atmospheres

    OpenAIRE

    2016-01-01

    We present an open-source and validated chemical kinetics code for studying hot exoplanetary atmospheres, which we name VULCAN. It is constructed for gaseous chemistry from 500 to 2500 K using a reduced C-H-O chemical network with about 300 reactions. It uses eddy diffusion to mimic atmospheric dynamics and excludes photochemistry. We have provided a full description of the rate coefficients and thermodynamic data used. We validate VULCAN by reproducing chemical equilibrium and by comparing ...
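
    The transport treatment described (production/loss chemistry plus eddy diffusion mimicking dynamics) corresponds to the standard one-dimensional chemical-diffusion equation; the notation below follows common convention rather than being quoted from the paper:

```latex
\frac{\partial n_i}{\partial t} = P_i - L_i - \frac{\partial \phi_i}{\partial z},
\qquad
\phi_i = -K_{zz}\, n \,\frac{\partial}{\partial z}\!\left(\frac{n_i}{n}\right)
```

    where n_i is the number density of species i, P_i and L_i its chemical production and loss rates, phi_i its vertical flux, n the total number density, and K_zz the eddy diffusion coefficient. Photochemistry would add a photolysis term to P_i and L_i, which this code version excludes.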

  14. Validation and application of the system code ATHLET-CD for BWR severe accident analyses

    Energy Technology Data Exchange (ETDEWEB)

    Di Marcello, Valentino, E-mail: valentino.marcello@kit.edu; Imke, Uwe; Sanchez, Victor

    2016-10-15

    Highlights: • We present the application of the system code ATHLET-CD for BWR safety analyses. • Validation of core in-vessel models is performed based on KIT CORA experiments. • A SB-LOCA scenario is simulated on a generic German BWR plant up to vessel failure. • Different core reflooding possibilities are investigated to mitigate the accident consequences. • ATHLET-CD modelling features reflect the current state of the art of severe accident codes. - Abstract: This paper is aimed at the validation and application of the system code ATHLET-CD for the simulation of severe accident phenomena in Boiling Water Reactors (BWR). The corresponding models for core degradation behaviour, e.g., oxidation, melting and relocation of core structural components, are validated against experimental data available from the CORA-16 and -17 bundle tests. Model weaknesses are discussed along with needs for further code improvements. With the validated ATHLET-CD code, calculations are performed to assess the code capabilities for the prediction of in-vessel late phase core behaviour and reflooding of damaged fuel rods. For this purpose, a small break LOCA scenario for a generic German BWR with postulated multiple failures of the safety systems was selected. In the analysis, accident management measures represented by cold water injection into the damaged reactor core are addressed to investigate the efficacy in avoiding or delaying the failure of the reactor pressure vessel. Results show that ATHLET-CD is applicable to the description of BWR plant behaviour with reliable physical models and numerical methods adopted for the description of key in-vessel phenomena.

  15. Benchmark Problems of the Geothermal Technologies Office Code Comparison Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Mark D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Podgorney, Robert [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kelkar, Sharad M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McClure, Mark W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Danko, George [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ghassemi, Ahmad [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fu, Pengcheng [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bahrami, Davood [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Barbier, Charlotte [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cheng, Qinglu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chiu, Kit-Kwan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Detournay, Christine [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elsworth, Derek [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fang, Yi [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Furtney, Jason K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gan, Quan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gao, Qian [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Guo, Bin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hao, Yue [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Horne, Roland N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Huang, Kai [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Im, Kyungjae [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Norbeck, Jack [Pacific Northwest National Lab. 
(PNNL), Richland, WA (United States); Rutqvist, Jonny [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Safari, M. R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sesetty, Varahanaresh [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sonnenthal, Eric [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tao, Qingfeng [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); White, Signe K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wong, Yang [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xia, Yidong [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-12-02

A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office has sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study represented U.S. national laboratories, universities, and industry, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. Study participants submitted solutions to problems for which their simulation tools were deemed capable or nearly capable. Some participating codes were originally developed for EGS applications, whereas others were designed for different applications but can simulate processes similar to those in EGS. Solution submissions from both were encouraged. In some cases, participants made small incremental changes to their numerical simulation codes to address specific elements of the problem, and in other cases participants submitted solutions with existing simulation tools, acknowledging the limitations of the code. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. 
The problems

  16. Code it rite the first time : automated invoice processing solution designed to ensure validity to field ticket coding

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, G.

    2010-03-15

An entrepreneur who ran 55 rigs for a major oilfield operator in Calgary has developed a solution for the oil industry that reduces field ticketing errors from 40 per cent to almost none. Code-Rite not only simplifies field ticketing but can eliminate weeks spent trying to balance authorization for expenditure (AFE) numbers. A service provider who wants a field ticket signed for billing purposes following a service call to a well site receives all pertinent information on a barcode that includes the AFE number, location, routing, approval authority, and mailing address. Attaching the label to the field ticket provides all the invoicing information needed. This article described the job profile, education, life experiences, and opportunities that led the innovator to develop this technology, which solves an industry-wide problem. Code-Rite is currently being used by three large upstream oil and gas operators, and plans are underway to automate the entire invoice processing system. 1 fig.

  17. Overview of NASA Multi-dimensional Stirling Convertor Code Development and Validation Effort

    Science.gov (United States)

    Tew, Roy C.; Cairelli, James E.; Ibrahim, Mounir B.; Simon, Terrence W.; Gedeon, David

    2002-01-01

A NASA grant has been awarded to Cleveland State University (CSU) to develop a multi-dimensional (multi-D) Stirling computer code with the goals of improving loss predictions and identifying component areas for improvement. The University of Minnesota (UMN) and Gedeon Associates are teamed with CSU. Development of test rigs at UMN and CSU and validation of the code against test data are part of the effort. The one-dimensional (1-D) Stirling codes used for design and performance prediction do not rigorously model regions of the working space where abrupt changes in flow area occur (such as manifolds and other transitions between components). Certain hardware experiences have demonstrated large performance gains by varying manifold and heat exchanger designs to improve flow distributions in the heat exchangers. 1-D codes were not able to predict these performance gains. An accurate multi-D code should improve understanding of the effects of area changes along the main flow axis, sensitivity of performance to slight changes in internal geometry, and, in general, the understanding of various internal thermodynamic losses. The commercial CFD-ACE code has been chosen for development of the multi-D code. This 2-D/3-D code has highly developed pre- and post-processors, and moving boundary capability. Preliminary attempts at validation of CFD-ACE models of the MIT gas spring and "two space" test rigs were encouraging. Also, CSU's simulations of the UMN oscillating-flow rig compare well with flow visualization results from UMN. A complementary Department of Energy (DOE) Regenerator Research effort is aiding in development of regenerator matrix models that will be used in the multi-D Stirling code. This paper reports on the progress and challenges of this

  18. Preliminary validation of the MATRA-LMR-FB code for the flow blockage in a subassembly

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, H. Y.; Ha, K. S.; Kwon, Y. M.; Chang, W. P.; Lee, Y. B.; Heo, S

    2005-01-01

To analyze flow blockage in a subassembly of a Liquid Metal-cooled Reactor (LMR), the MATRA-LMR-FB code has been developed and validated against the existing experimental data. Compared to the MATRA-LMR code, which had been successfully applied to the core thermal-hydraulic design of KALIMER, the MATRA-LMR-FB code includes some advanced modeling features. First, the Distributed Resistance Model (DRM), which enables a very accurate description of the effects of wire-wrap and blockage in a flow path, is developed for the MATRA-LMR-FB code. Second, the hybrid difference method is used to minimize numerical diffusion, especially in low-flow regions such as the recirculating wakes downstream of a blockage. In addition, the code is equipped with various turbulent mixing models to describe the active mixing due to turbulent motions as accurately as possible. For the validation of the MATRA-LMR-FB code, the ORNL THORS test and the KOS 169-pin test are analyzed. Based on the analysis results for the temperature data, the accuracy of the code is evaluated quantitatively. The MATRA-LMR-FB code predicts very accurately the exit temperatures measured in the subassembly with wire-wrap. However, the predicted temperatures for the experiment with spacer grids show some deviations from the measured values. To enhance the accuracy of MATRA-LMR-FB for flow paths with grid spacers, it is suggested to improve the model for pressure loss due to the spacer grid and the modeling of the blockage itself. The MATRA-LMR-FB code is judged to be applicable to the flow blockage analysis of KALIMER-600, which adopts wire-wrapped subassemblies.

  19. Neonatal Facial Coding System for Assessing Postoperative Pain in Infants: Item Reduction is Valid and Feasible

    NARCIS (Netherlands)

    Peters, J.W.B.; Koot, H.M.; Grunau, R.E.; Boer, J. de; Druenen, M.J. van; Tibboel, D.; Duivenvoorden, H.J.

    2003-01-01

    Objective: The objectives of this study were to: (1) evaluate the validity of the Neonatal Facial Coding System (NFCS) for assessment of postoperative pain and (2) explore whether the number of NFCS facial actions could be reduced for assessing postoperative pain. Design: Prospective, observational

  20. Further validation of liquid metal MHD code for unstructured grid based on OpenFOAM

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jingchao; Chen, Hongli, E-mail: hlchen1@ustc.edu.cn; He, Qingyun; Ye, Minyou

    2015-11-15

Highlights: • A specific correction scheme has been adopted to revise the calculation results for non-orthogonal meshes. • The developed MHD code based on the OpenFOAM platform has been validated by benchmark cases under uniform and non-uniform magnetic fields in round and rectangular ducts. • ALEX experimental results have been used to validate the MHD code based on OpenFOAM. - Abstract: In fusion liquid metal blankets, complex geometries involving contractions, expansions, bends, and manifolds are very common. The characteristics of liquid metal flow in these geometries are significant. In order to extend the magnetohydrodynamic (MHD) solver developed on the OpenFOAM platform to complex geometries, an MHD solver based on unstructured meshes has been implemented. The adoption of non-orthogonal correction techniques in the solver makes it possible to process the non-orthogonal meshes in complex geometries. The present paper focuses on the validation of the code under critical conditions. One analytical benchmark case and two experimental benchmark cases were used to validate the code. Benchmark case I is MHD flow in a circular pipe with arbitrary electric conductivity of the walls in a uniform magnetic field. Benchmark cases II and III are experimental cases of 3D laminar steady MHD flow under a fringing magnetic field. In all these cases, the numerical results match the benchmark data well.

  1. Validation of a pre-coded food record for infants and young children

    DEFF Research Database (Denmark)

    Gondolf, Ulla Holmboe; Tetens, Inge; Hills, A. P.;

    2012-01-01

    Background/Objectives:To assess the validity of a 7-day pre-coded food record (PFR) method in 9-month-old infants against metabolizable energy intake (ME(DLW)) measured by doubly labeled water (DLW); additionally to compare PFR with a 7-day weighed food record (WFR) in 9-month-old infants and 36-...

  3. CFD Code Validation against Stratified Air-Water Flow Experimental Data

    Directory of Open Access Journals (Sweden)

    F. Terzuoli

    2008-01-01

Pressurized thermal shock (PTS) modelling has been identified as one of the most important industrial needs related to nuclear reactor safety. A severe PTS scenario limiting the reactor pressure vessel (RPV) lifetime is the cold water emergency core cooling (ECC) injection into the cold leg during a loss of coolant accident (LOCA). Since it represents a big challenge for numerical simulations, this scenario was selected within the European Platform for Nuclear Reactor Simulations (NURESIM) Integrated Project as a reference two-phase problem for computational fluid dynamics (CFD) code validation. This paper presents a CFD analysis of a stratified air-water flow experimental investigation performed at the Institut de Mécanique des Fluides de Toulouse in 1985, which shares some common physical features with the ECC injection in a PWR cold leg. Numerical simulations have been carried out with two commercial codes (Fluent and Ansys CFX) and a research code (NEPTUNE CFD). The aim of this work, carried out at the University of Pisa within the NURESIM IP, is to validate the free surface flow model implemented in the codes against experimental data, and to perform code-to-code benchmarking. The obtained results suggest the relevance of three-dimensional effects and stress the importance of suitable interface drag modelling.

  4. Research on verification and validation strategy of detonation fluid dynamics code of LAD2D

    Science.gov (United States)

    Wang, R. L.; Liang, X.; Liu, X. Z.

    2017-07-01

Verification and validation (V&V) is an important approach to software quality assurance for codes in complex engineering applications. A reasonable and efficient V&V strategy can achieve good results with modest effort. This article introduces the Lagrangian adaptive hydrodynamics code in 2D space (LAD2D), a self-developed detonation CFD code with an elastic-plastic structure model. The V&V strategy of this detonation CFD code is presented on the foundation of V&V methodology for scientific software. The basic framework of module verification and function validation is proposed, which together compose the V&V strategy for the LAD2D detonation fluid dynamics model.

  5. Validity of the coding for herpes simplex encephalitis in the Danish National Patient Registry

    DEFF Research Database (Denmark)

    Jørgensen, Laura Krogh; Dalgaard, Lars Skov; Østergaard, Lars Jørgen;

    2016-01-01

BACKGROUND: Large health care databases are a valuable source of infectious disease epidemiology if diagnoses are valid. The aim of this study was to investigate the accuracy of the recorded diagnosis coding of herpes simplex encephalitis (HSE) in the Danish National Patient Registry (DNPR). METHODS: The DNPR was used to identify all hospitalized patients, aged ≥15 years, with a first-time diagnosis of HSE according to the International Classification of Diseases, tenth revision (ICD-10), from 2004 to 2014. To validate the coding of HSE, we collected data from the Danish Microbiology Database, from departments of clinical microbiology, and from patient medical records. Cases were classified as confirmed, probable, or no evidence of HSE. We estimated the positive predictive value (PPV) of the HSE diagnosis coding stratified by diagnosis type, study period, and department type. Furthermore, we

  6. Validity of the coding for herpes simplex encephalitis in the Danish National Patient Registry

    Directory of Open Access Journals (Sweden)

    Jørgensen LK

    2016-05-01

Laura Krogh Jørgensen,1 Lars Skov Dalgaard,1 Lars Jørgen Østergaard,1 Nanna Skaarup Andersen,2 Mette Nørgaard,3 Trine Hyrup Mogensen1 1Department of Infectious Diseases, Aarhus University Hospital, Aarhus, 2Department of Clinical Microbiology, Odense University Hospital, Odense, 3Department of Clinical Epidemiology, Aarhus University Hospital, Aarhus, Denmark Background: Large health care databases are a valuable source of infectious disease epidemiology if diagnoses are valid. The aim of this study was to investigate the accuracy of the recorded diagnosis coding of herpes simplex encephalitis (HSE) in the Danish National Patient Registry (DNPR). Methods: The DNPR was used to identify all hospitalized patients, aged ≥15 years, with a first-time diagnosis of HSE according to the International Classification of Diseases, tenth revision (ICD-10), from 2004 to 2014. To validate the coding of HSE, we collected data from the Danish Microbiology Database, from departments of clinical microbiology, and from patient medical records. Cases were classified as confirmed, probable, or no evidence of HSE. We estimated the positive predictive value (PPV) of the HSE diagnosis coding stratified by diagnosis type, study period, and department type. Furthermore, we estimated the proportion of HSE cases coded with nonspecific ICD-10 codes of viral encephalitis and also the sensitivity of the HSE diagnosis coding. Results: We were able to validate 398 (94.3%) of the 422 HSE diagnoses identified via the DNPR. Hereof, 202 (50.8%) were classified as confirmed cases and 29 (7.3%) as probable cases, providing an overall PPV of 58.0% (95% confidence interval [CI]: 53.0–62.9). For "Encephalitis due to herpes simplex virus" (ICD-10 code B00.4), the PPV was 56.6% (95% CI: 51.1–62.0). Similarly, the PPV for "Meningoencephalitis due to herpes simplex virus" (ICD-10 code B00.4A) was 56.8% (95% CI: 39.5–72.9). "Herpes viral encephalitis" (ICD-10 code G05.1E) had a PPV
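The overall PPV in this record follows directly from the validated counts (202 confirmed + 29 probable of 398 validated diagnoses). A minimal sketch of the arithmetic, using a normal-approximation confidence interval; the abstract's interval was likely computed with an exact method, so the lower bound differs slightly:

```python
from math import sqrt

def ppv_with_ci(true_positives: int, total_flagged: int, z: float = 1.96):
    """Positive predictive value with a normal-approximation 95% CI."""
    p = true_positives / total_flagged
    half_width = z * sqrt(p * (1 - p) / total_flagged)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# Counts from the abstract: 202 confirmed + 29 probable of 398 validated diagnoses.
ppv, lo, hi = ppv_with_ci(202 + 29, 398)
print(f"PPV = {100 * ppv:.1f}% (95% CI: {100 * lo:.1f}-{100 * hi:.1f})")  # PPV = 58.0%
```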

  7. Trends in EFL Technology and Educational Coding: A Case Study of an Evaluation Application Developed on LiveCode

    Science.gov (United States)

    Uehara, Suwako; Noriega, Edgar Josafat Martinez

    2016-01-01

    The availability of user-friendly coding software is increasing, yet teachers might hesitate to use this technology to develop for educational needs. This paper discusses studies related to technology for educational uses and introduces an evaluation application being developed. Through questionnaires by student users and open-ended discussion by…

  8. Elaboration and validation of an assistive technology assessment questionnaire

    Directory of Open Access Journals (Sweden)

    Fernanda Jorge Guimarães

    2015-06-01

Assistive Technologies consist of resources, methods, and strategies favoring the autonomy and inclusion of the elderly and people with disabilities, yet instruments for assessing them are scarce in the literature. This methodological study, conducted with a panel of specialists and people with visual impairment, aimed to elaborate and validate a questionnaire to assess educational assistive technology. An agreement percentage of 80% was used to consider an item valid, and the validity and reliability of the questionnaire were calculated. Assistive Technology was characterized by six attributes: objectives, access, clarity, structure and presentation, relevance and efficacy, and interactivity; 19 items were elaborated to compose the questionnaire. Of those, 11 obtained agreement percentages higher than 80%, seven were modified, and one was excluded. The instrument's Cronbach's alpha was 0.822, ensuring the validity and reliability of the tool to assess health education Assistive Technology, and therefore its use is indicated.
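Cronbach's alpha, the reliability coefficient reported above (0.822), is computed from an item-by-respondent score matrix as α = k/(k−1) · (1 − Σ item variances / total-score variance). A self-contained sketch with made-up ratings, not the study's data:

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha; scores has rows = respondents, columns = items."""
    k = len(scores[0])                                    # number of items
    item_vars = [pvariance(col) for col in zip(*scores)]  # variance of each item
    total_var = pvariance([sum(row) for row in scores])   # variance of total scores
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical ratings from four respondents on three items (illustration only).
ratings = [
    [4, 5, 4],
    [3, 3, 2],
    [5, 5, 5],
    [2, 3, 3],
]
print(round(cronbach_alpha(ratings), 3))
```

Perfectly correlated items give α = 1.0, the coefficient's upper bound, which is a quick sanity check for any implementation.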

  9. Results from the Deep Space One Technology Validation Mission

    Science.gov (United States)

    Rayman, M.; Varghese, P.; Lehman, D.; Livesay, L.

    1999-01-01

    Launched on October 25, 1998, Deep Space 1 (DS1) is the first mission of NASA's New Millennium Program, chartered to flight validate high-risk, new technologies important for future space and Earth science programs.

  10. Initial verification and validation of RAZORBACK - A research reactor transient analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code development to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
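The point reactor kinetics equations named above take a compact one-delayed-group form: dn/dt = ((ρ − β)/Λ)n + λC and dC/dt = (β/Λ)n − λC, where n is the neutron population and C the delayed-neutron precursor concentration. A minimal explicit-Euler sketch of that coupled system, with illustrative constants that are not Razorback's numerics or ACRR parameters:

```python
def point_kinetics(rho, beta=0.0075, lam=0.08, Lambda=1e-4,
                   n0=1.0, dt=1e-5, steps=20000):
    """Integrate one-delayed-group point kinetics with explicit Euler.

    rho: reactivity; beta: delayed-neutron fraction; lam: precursor decay
    constant (1/s); Lambda: prompt-neutron generation time (s).
    Returns the neutron population after steps * dt seconds.
    """
    n = n0
    c = beta * n0 / (Lambda * lam)   # precursor concentration at equilibrium
    for _ in range(steps):
        dn = ((rho - beta) / Lambda) * n + lam * c
        dc = (beta / Lambda) * n - lam * c
        n += dt * dn
        c += dt * dc
    return n

print(point_kinetics(rho=0.0))    # ~1.0: zero reactivity holds power steady
print(point_kinetics(rho=0.001))  # >1.0: a small positive step raises power
```

Starting the precursors at equilibrium makes the zero-reactivity case stationary, a standard self-check before trusting transient results.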

  11. Validation and benchmarking of two particle-in-cell codes for a glow discharge

    Science.gov (United States)

    Carlsson, Johan; Khrabrov, Alexander; Kaganovich, Igor; Sommerer, Timothy; Keating, David

    2017-01-01

    The two particle-in-cell codes EDIPIC and LSP are benchmarked and validated for a parallel-plate glow discharge in helium, in which the axial electric field had been carefully measured, primarily to investigate and improve the fidelity of their collision models. The scattering anisotropy of electron-impact ionization, as well as the value of the secondary-electron emission yield, are not well known in this case. The experimental uncertainty for the emission yield corresponds to a factor of two variation in the cathode current. If the emission yield is tuned to make the cathode current computed by each code match the experiment, the computed electric fields are in excellent agreement with each other, and within about 10% of the experimental value. The non-monotonic variation of the width of the cathode fall with the applied voltage seen in the experiment is reproduced by both codes. The electron temperature in the negative glow is within experimental error bars for both codes, but the density of slow trapped electrons is underestimated. A more detailed code comparison done for several synthetic cases of electron-beam injection into helium gas shows that the codes are in excellent agreement for ionization rate, as well as for elastic and excitation collisions with isotropic scattering pattern. The remaining significant discrepancies between the two codes are due to differences in their electron binary-collision models, and for anisotropic scattering due to elastic and excitation collisions.

  12. Validating Advanced Supply-Chain Technology (VAST)

    Science.gov (United States)

    2004-06-01

Use philosophy that is so important in today’s procurement environment. Electronic Data Interchange (EDI) and eCommerce is proving to be a major...the STEPwise methodology are particularly encouraging. These new EDI and eCommerce technologies are becoming more important with the customers who...critical assumption is based upon the fact that eCommerce is growing throughout the commercial and military sector and those who are not

  13. Update on the Development and Validation of MERCURY: A Modern, Monte Carlo Particle Transport Code

    Energy Technology Data Exchange (ETDEWEB)

Procassini, R J; Taylor, J M; McKinley, M S; Greenman, G M; Cullen, D E; O'Brien, M J; Beck, B R; Hagmann, C A

    2005-06-06

An update on the development and validation of the MERCURY Monte Carlo particle transport code is presented. MERCURY is a modern, parallel, general-purpose Monte Carlo code being developed at the Lawrence Livermore National Laboratory. During the past year, several major algorithm enhancements have been completed. These include the addition of particle trackers for 3-D combinatorial geometry (CG), 1-D radial meshes, and 2-D quadrilateral unstructured meshes, as well as a feature known as templates for defining recursive, repeated structures in CG. New physics capabilities include an elastic-scattering neutron thermalization model, support for continuous energy cross sections, and S(α,β) molecular bound scattering. Each of these new physics features has been validated through code-to-code comparisons with another Monte Carlo transport code. Several important computer science features have been developed, including an extensible input-parameter parser based upon the XML data description language, and a dynamic load-balance methodology for efficient parallel calculations. This paper discusses the recent work in each of these areas, and describes a plan for future extensions that are required to meet the needs of our ever-expanding user base.

  14. Validation of a Subchannel Analysis Code MATRA Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae Hyun; Seo, Kyung Won; Kwon, Hyouk

    2008-10-15

A subchannel analysis code, MATRA, has been developed for the thermal-hydraulic analysis of the SMART core. The governing equations and important models were established, and validation calculations have been performed for subchannel flow and enthalpy distributions in rod bundles under steady-state conditions. The governing equations of MATRA are based on the integral balance equations of the two-phase mixture. The effects of non-homogeneous and non-equilibrium states were considered by employing a subcooled boiling model and a phasic slip model. The solution scheme and main structure of the MATRA code, as well as the differences between the MATRA and COBRA-IV-I codes, are summarized. Eight different test data sets were employed for the validation of the MATRA code. The collected data consisted of single-phase subchannel flow and temperature distribution data, single-phase inlet flow maldistribution data, single-phase partial flow blockage data, and two-phase subchannel flow and enthalpy distribution data. The prediction accuracy as well as the limitations of the MATRA code were evaluated from this analysis.

  15. Development of PIRT and Assessment Matrix for Verification and Validation of Sodium Fire Analysis Codes

    Science.gov (United States)

    Ohno, Shuji; Ohshima, Hiroyuki; Tajima, Yuji; Ohki, Hiroshi

The thermodynamic consequences of a liquid sodium leak and fire accident are among the important issues to be evaluated when considering the safety of a fast reactor plant building. The authors are therefore initiating a systematic verification and validation (V&V) activity to assure and demonstrate the reliability of numerical simulation tools for sodium fire analysis. The V&V activity is in progress, focused mainly on the already developed sodium fire analysis codes SPHINCS and AQUA-SF. The events to be evaluated are hypothetical sodium spray, pool, or combined fire accidents and the thermodynamic behaviors that follow in a plant building. The present paper describes, first, the 'Phenomena Identification and Ranking Table (PIRT)' developed to clarify the important validation points in the sodium fire analysis codes, and then an 'Assessment Matrix' that summarizes both the separate effect tests and the integral effect tests available for validating the computational models or whole codes for the important phenomena. Furthermore, the paper presents a practical validation against a separate effect test, in which the spray droplet combustion models of SPHINCS and AQUA-SF predict the burned amount of a falling sodium droplet with an error of mostly less than 30%.

  16. Use of the ETA-1 reactor for the validation of the multi-group APOLLO2-MORET 5 code and the Monte Carlo continuous energy MORET 5 code

    Science.gov (United States)

    Leclaire, N.; Cochet, B.; Le Dauphin, F. X.; Haeck, W.; Jacquet, O.

    2014-06-01

The present paper aims at providing experimental validation for the use of the MORET 5 code for advanced reactor concepts involving thorium and heavy water. It therefore constitutes an opportunity to test and improve the thermal-scattering data of heavy water and also to test the recent implementation of probability tables in the MORET 5 code.

  17. Validity of ICD-9-CM codes for the identification of complications related to central venous catheterization.

    Science.gov (United States)

    Tukey, Melissa H; Borzecki, Ann M; Wiener, Renda Soylemez

    2015-01-01

    Two complications of central venous catheterization (CVC), iatrogenic pneumothorax and central line-associated bloodstream infection (CLABSI), have dedicated International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes. Despite increasing use of ICD-9-CM codes for research and pay-for-performance purposes, their validity for detecting complications of CVC has not been established. Complications of CVCs placed between July 2010 and December 2011 were identified by ICD-9-CM codes in discharge records from a single hospital and compared with those revealed by medical record abstraction. The ICD-9-CM code for iatrogenic pneumothorax had a sensitivity of 66.7%, specificity of 100%, positive predictive value (PPV) of 100%, and negative predictive value (NPV) of 99.5%. The ICD-9-CM codes for CLABSI had a sensitivity of 33.3%, specificity of 99.0%, PPV of 28.6%, and NPV of 99.2%. The low sensitivity and variable PPV of ICD-9-CM codes for detection of complications of CVC raise concerns about their use for research or pay-for-performance purposes.
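All four figures reported here derive from one 2×2 confusion matrix comparing ICD-9-CM codes against chart abstraction. A minimal sketch of the metric definitions; the counts below are hypothetical, chosen only because they reproduce the reported pneumothorax percentages, not the study's actual tallies:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from 2x2 confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true complications that were coded
        "specificity": tn / (tn + fp),  # non-cases correctly left uncoded
        "ppv": tp / (tp + fp),          # coded cases that are real
        "npv": tn / (tn + fn),          # uncoded cases that are truly negative
    }

# Hypothetical counts that happen to match the reported pneumothorax figures:
# sensitivity 66.7%, specificity 100%, PPV 100%, NPV 99.5%.
m = diagnostic_metrics(tp=2, fp=0, fn=1, tn=199)
print({name: f"{100 * value:.1f}%" for name, value in m.items()})
```

Note how a tiny false-negative count (fn=1 against tp=2) is enough to drag sensitivity down to 66.7% while NPV stays near 100%, the pattern the abstract warns about.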

  18. Validation of the 3D finite element transport theory code EVENT for shielding applications

    Energy Technology Data Exchange (ETDEWEB)

    Warner, Paul [Rolls Royce Power Engineering Plc., Derby (United Kingdom); Oliveira, R.E. de [T. H. Huxley School of Environment, Earth Science and Engineering, Imperial College of Science Technology and Medicine, London (United Kingdom)

    2000-03-01

This paper is concerned with the validation of the 3D deterministic neutral-particle transport theory code EVENT for shielding applications. The code is based on the finite element-spherical harmonics (FE-PN) method, which has been extensively developed over the last decade. A general multi-group, anisotropic scattering formalism enables the code to address realistic steady state and time dependent, multi-dimensional coupled neutron/gamma radiation transport problems involving high scattering and deep penetration alike. The powerful geometrical flexibility and competitive computational effort make the code an attractive tool for shielding applications. In recognition of this, EVENT is currently in the process of being adopted by the UK nuclear industry. The theory behind EVENT is described and its numerical implementation is outlined. Numerical results obtained by the code are compared with predictions of the Monte Carlo code MCBEND and also with the results from benchmark shielding experiments. In particular, results are presented for the ASPIS experimental configuration for both neutron and gamma ray calculations using the BUGLE 96 nuclear data library. (author)

  19. Validation of deterministic and Monte Carlo codes for neutronics calculation of the IRT-type research reactor

    Science.gov (United States)

    Shchurovskaya, M. V.; Alferov, V. P.; Geraskin, N. I.; Radaev, A. I.

    2017-01-01

The results of the validation of a research reactor calculation using Monte Carlo and deterministic codes against experimental data and based on code-to-code comparison are presented. The continuous energy Monte Carlo code MCU-PTR and the nodal diffusion-based deterministic code TIGRIS were used for full 3-D calculation of the IRT MEPhI research reactor. The validation included investigations of the reactor with the existing high-enriched uranium (HEU, 90 w/o) fuel and with low-enriched uranium (LEU, 19.7 w/o, U-9%Mo) fuel.

  20. Understanding Student Teachers' Behavioural Intention to Use Technology: Technology Acceptance Model (TAM) Validation and Testing

    Science.gov (United States)

    Wong, Kung-Teck; Osman, Rosma bt; Goh, Pauline Swee Choo; Rahmat, Mohd Khairezan

    2013-01-01

    This study sets out to validate and test the Technology Acceptance Model (TAM) in the context of Malaysian student teachers' integration of their technology in teaching and learning. To establish factorial validity, data collected from 302 respondents were tested against the TAM using confirmatory factor analysis (CFA), and structural equation…

  1. A method for detecting code security vulnerability based on variables tracking with validated-tree

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

SQL injection poses a major threat to application-level security of databases, and there is no systematic solution to these attacks. Unlike traditional run-time security strategies such as IDS and firewalls, this paper focuses on a solution at the outset; it presents a method to find vulnerabilities by analyzing the source code. The concept of a validated tree is developed to track variables referenced by database operations in scripts. By checking whether these variables are influenced by outside inputs, the database operations are proved to be secure or not. This method has the advantages of high accuracy and efficiency as well as low cost, and it is universal to any type of web application platform. It is implemented in the Software Code Vulnerabilities of SQL Injection Detector (CVSID). The validity and efficiency are demonstrated with an example.
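The variable tracking described above — following assignments from external inputs to database operations and checking whether they pass through validation — is essentially a taint analysis. A highly simplified sketch of that propagation logic over a hypothetical mini-language, not the CVSID implementation:

```python
# Each statement is (target, sources, kind):
#   "input"    - target receives external (tainted) data
#   "assign"   - target copies taint from its sources
#   "sanitize" - target has been validated, so its taint is cleared
#   "query"    - a database operation referencing its sources
def find_injection_risks(statements):
    """Return the names of database operations that use unvalidated input."""
    tainted = set()
    risks = []
    for target, sources, kind in statements:
        if kind == "input":
            tainted.add(target)
        elif kind == "assign":
            if any(s in tainted for s in sources):
                tainted.add(target)
            else:
                tainted.discard(target)   # reassigned from clean data
        elif kind == "sanitize":
            tainted.discard(target)
        elif kind == "query" and any(s in tainted for s in sources):
            risks.append(target)
    return risks

script = [
    ("user_id", [], "input"),        # taken from an HTTP parameter
    ("name", [], "input"),
    ("name", ["name"], "sanitize"),  # validated against a whitelist
    ("sql1", ["name"], "query"),     # safe: only sanitized data reaches it
    ("sql2", ["user_id"], "query"),  # flagged: raw input reaches the query
]
print(find_injection_risks(script))  # -> ['sql2']
```

A real analysis must also handle control flow, string concatenation, and inter-procedural data flow; the validated-tree structure in the paper serves that bookkeeping role.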

  2. Systematic review of validated case definitions for diabetes in ICD-9-coded and ICD-10-coded data in adult populations.

    Science.gov (United States)

    Khokhar, Bushra; Jette, Nathalie; Metcalfe, Amy; Cunningham, Ceara Tess; Quan, Hude; Kaplan, Gilaad G; Butalia, Sonia; Rabi, Doreen

    2016-08-05

    With steady increases in 'big data' and data analytics over the past two decades, administrative health databases have become more accessible and are now used regularly for diabetes surveillance. The objective of this study is to systematically review validated International Classification of Diseases (ICD)-based case definitions for diabetes in the adult population. Electronic databases, MEDLINE and Embase, were searched for validation studies where an administrative case definition (using ICD codes) for diabetes in adults was validated against a reference and statistical measures of the performance reported. The search yielded 2895 abstracts, and of the 193 potentially relevant studies, 16 met criteria. Diabetes definition for adults varied by data source, including physician claims (sensitivity ranged from 26.9% to 97%, specificity ranged from 94.3% to 99.4%, positive predictive value (PPV) ranged from 71.4% to 96.2%, negative predictive value (NPV) ranged from 95% to 99.6% and κ ranged from 0.8 to 0.9), hospital discharge data (sensitivity ranged from 59.1% to 92.6%, specificity ranged from 95.5% to 99%, PPV ranged from 62.5% to 96%, NPV ranged from 90.8% to 99% and κ ranged from 0.6 to 0.9) and a combination of both (sensitivity ranged from 57% to 95.6%, specificity ranged from 88% to 98.5%, PPV ranged from 54% to 80%, NPV ranged from 98% to 99.6% and κ ranged from 0.7 to 0.8). Overall, administrative health databases are useful for undertaking diabetes surveillance, but an awareness of the variation in performance being affected by case definition is essential. The performance characteristics of these case definitions depend on the variations in the definition of primary diagnosis in ICD-coded discharge data and/or the methodology adopted by the healthcare facility to extract information from patient records. Published by the BMJ Publishing Group Limited. 

  3. Systematic review of validated case definitions for diabetes in ICD-9-coded and ICD-10-coded data in adult populations

    Science.gov (United States)

    Khokhar, Bushra; Jette, Nathalie; Metcalfe, Amy; Cunningham, Ceara Tess; Kaplan, Gilaad G; Butalia, Sonia; Rabi, Doreen

    2016-01-01

    Objectives With steady increases in ‘big data’ and data analytics over the past two decades, administrative health databases have become more accessible and are now used regularly for diabetes surveillance. The objective of this study is to systematically review validated International Classification of Diseases (ICD)-based case definitions for diabetes in the adult population. Setting, participants and outcome measures Electronic databases, MEDLINE and Embase, were searched for validation studies where an administrative case definition (using ICD codes) for diabetes in adults was validated against a reference and statistical measures of the performance reported. Results The search yielded 2895 abstracts, and of the 193 potentially relevant studies, 16 met criteria. Diabetes definition for adults varied by data source, including physician claims (sensitivity ranged from 26.9% to 97%, specificity ranged from 94.3% to 99.4%, positive predictive value (PPV) ranged from 71.4% to 96.2%, negative predictive value (NPV) ranged from 95% to 99.6% and κ ranged from 0.8 to 0.9), hospital discharge data (sensitivity ranged from 59.1% to 92.6%, specificity ranged from 95.5% to 99%, PPV ranged from 62.5% to 96%, NPV ranged from 90.8% to 99% and κ ranged from 0.6 to 0.9) and a combination of both (sensitivity ranged from 57% to 95.6%, specificity ranged from 88% to 98.5%, PPV ranged from 54% to 80%, NPV ranged from 98% to 99.6% and κ ranged from 0.7 to 0.8). Conclusions Overall, administrative health databases are useful for undertaking diabetes surveillance, but an awareness of the variation in performance being affected by case definition is essential. The performance characteristics of these case definitions depend on the variations in the definition of primary diagnosis in ICD-coded discharge data and/or the methodology adopted by the healthcare facility to extract information from patient records. PMID:27496226
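
    The performance measures quoted above (sensitivity, specificity, PPV, NPV) all follow from a 2x2 table comparing the administrative case definition against the reference standard. As a quick illustration, with made-up counts rather than data from the review:

```python
# Validation statistics from a 2x2 table of administrative case
# definition vs. reference standard. The counts are illustrative only.

def validation_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # true cases the definition catches
        "specificity": tn / (tn + fp),  # non-cases correctly excluded
        "ppv": tp / (tp + fp),          # flagged records that are true cases
        "npv": tn / (tn + fn),          # unflagged records truly disease-free
    }

# E.g. 90 true positives, 10 false positives, 10 false negatives,
# 890 true negatives out of 1000 reviewed records:
m = validation_metrics(tp=90, fp=10, fn=10, tn=890)
print({k: round(v, 3) for k, v in m.items()})
# -> {'sensitivity': 0.9, 'specificity': 0.989, 'ppv': 0.9, 'npv': 0.989}
```

    Note that PPV and NPV depend on disease prevalence in the data source, which is one reason the review finds such wide ranges across physician claims, hospital discharge data, and their combination.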

  4. Validation of the THIRST steam generator thermalhydraulic code against the CLOTAIRE phase II experimental data

    Energy Technology Data Exchange (ETDEWEB)

    Pietralik, J.M.; Campagna, A.O.; Frisina, V.C

    1999-04-01

    Steam generator thermalhydraulic codes are frequently used to calculate both global and local parameters inside a steam generator. The global parameters include heat transfer output, recirculation ratio, outlet temperatures, and pressure drops for operating and abnormal conditions. The local parameters are used in further analyses of flow-induced vibration, fretting wear, sludge deposition, and flow-accelerated corrosion. For these purposes, detailed, three-dimensional, two-phase flow and heat transfer parameters are needed. To make the predictions more accurate and reliable, the codes need to be validated in geometries representative of real conditions. One such study is an international co-operative experimental program called CLOTAIRE, which is based in France. The CANDU Owners Group (COG) participated in the first two phases of the program. The results of the validation of Phase I were presented at the 1994 Steam Generator and Heat Exchanger Conference, and the results of the validation of Phase II are the subject of this report. THIRST is a thermalhydraulic, finite-volume code used to predict flow and heat transfer in steam generators. The local results of CLOTAIRE Phase II were used to validate the code. The results consist of measurements of void fraction and axial gas-phase velocity in the U-bend region, made using bi-optical probes. A comparison of global results indicates that the THIRST predictions, with the Chisholm void fraction model, are within 2% to 3% of the experimental results. Using THIRST with the homogeneous void fraction model, the global results were less accurate but still very good; the greatest error was 10%, for the separator pressure drop. Comparisons of the local predictions for void fraction and axial gas-phase velocity show good agreement. The Chisholm void fraction model generally gives better agreement with the experimental data, whereas the homogeneous model tends to overpredict the void fraction.

  5. Validation of the THIRST steam generator thermalhydraulic code against the CLOTAIRE phase II experimental data

    Energy Technology Data Exchange (ETDEWEB)

    Pietralik, J.M.; Campagna, A.O.; Frisina, V.C. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    1998-07-01

    Steam generator thermalhydraulic codes are used frequently to calculate both global and local parameters inside the steam generator. The former include heat transfer output, recirculation ratio, outlet temperatures, and pressure drops for operating and abnormal conditions. The latter are used in further analyses of flow-induced vibration, fretting wear, sludge deposition, and flow-accelerated corrosion. For these purposes, detailed, three-dimensional two-phase flow and heat transfer parameters are needed. To make the predictions more accurate and reliable, the codes need to be validated in geometries representative of real conditions. One such study is an international cooperative experimental program called CLOTAIRE based in France. COG participated in the first two phases of the program; the results of the validation of Phase I were presented at the 1994 Steam Generator and Heat Exchanger Conference, and the results of the validation of Phase II are the subject of this paper. THIRST is a thermalhydraulic, finite volume code to predict the flow and heat transfer in steam generators. The local results of CLOTAIRE Phase II have been used to validate the code. These consist of the measurements of void fraction and axial gas-phase velocity in the U-bend region. The measurements were done using bi-optical probes. A comparison of global results indicates that the THIRST predictions, with the Chisholm void fraction model, are within 2 to 3% of the experimental results. Using THIRST with the homogeneous void fraction model, the global results were less accurate but still well predicted, with the greatest error of 10% for the separator pressure drop. Comparisons of the local predictions for void fraction and axial gas-phase velocity show good agreement. The Chisholm void fraction model generally gives better agreement with the experimental data while the homogeneous model tends to overpredict the void fraction and underpredict the gas velocity. (author)
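
    For background, both void fraction models compared above can be written in slip-ratio form, where the homogeneous model corresponds to a slip ratio of one. The sketch below uses one common form of the Chisholm slip correlation and illustrative quality and phase densities; it is an assumption for illustration, not the exact formulation used in THIRST.

```python
import math

# Void fraction from flow quality x and phase densities, in slip-ratio
# form: alpha = 1 / (1 + S * ((1 - x)/x) * (rho_g/rho_l)).
# Homogeneous model: S = 1. One common Chisholm form:
# S = sqrt(1 + x * (rho_l/rho_g - 1)). Inputs below are illustrative.

def void_fraction(x, rho_l, rho_g, slip=1.0):
    return 1.0 / (1.0 + slip * ((1.0 - x) / x) * (rho_g / rho_l))

def chisholm_slip(x, rho_l, rho_g):
    return math.sqrt(1.0 + x * (rho_l / rho_g - 1.0))

x, rho_l, rho_g = 0.1, 800.0, 20.0   # quality; liquid/gas density, kg/m^3
alpha_hom = void_fraction(x, rho_l, rho_g)                  # S = 1
alpha_chi = void_fraction(x, rho_l, rho_g,
                          slip=chisholm_slip(x, rho_l, rho_g))
print(round(alpha_hom, 3), round(alpha_chi, 3))   # -> 0.816 0.668
```

    Because the Chisholm slip ratio exceeds one, it always yields a lower void fraction than the homogeneous model at the same quality, consistent with the overprediction noted in the abstract.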

  6. Results and code predictions for ABCOVE (aerosol behavior code validation and evaluation) aerosol code validation with low concentration NaOH and NaI aerosol: CSTF test AB7

    Energy Technology Data Exchange (ETDEWEB)

    Hilliard, R.K.; McCormack, J.D.; Muhlestein, L.D.

    1985-10-01

    A program for aerosol behavior validation and evaluation (ABCOVE) has been developed in accordance with the LMFBR Safety Program Plan. The ABCOVE program is a cooperative effort between the USDOE, the USNRC, and their contractor organizations currently involved in aerosol code development, testing or application. The third large-scale test in the ABCOVE program, AB7, was performed in the 850-m/sup 3/ CSTF vessel with a two-species test aerosol. The test conditions involved the release of a simulated fission product aerosol, NaI, into the containment atmosphere after the end of a small sodium pool fire. Four organizations made pretest predictions of aerosol behavior using five computer codes. Two of the codes (QUICKM and CONTAIN) were discrete, multiple species codes, while three (HAA-3, HAA-4, and HAARM-3) were log-normal codes which assume uniform coagglomeration of different aerosol species. Detailed test results are presented and compared with the code predictions for eight key aerosol behavior parameters. 11 refs., 44 figs., 35 tabs.
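
    A defining property of the log-normal codes mentioned above (HAA-3, HAA-4, HAARM-3) is that the whole size distribution is carried by a median diameter and a geometric standard deviation. One standard consequence for such distributions is the Hatch-Choate conversion between count median and mass median diameters, sketched here with made-up numbers as illustrative background, not as output of any ABCOVE code.

```python
import math

# Hatch-Choate relation for a log-normal aerosol size distribution:
# mass median diameter MMD = CMD * exp(3 * ln(gsd)^2),
# where CMD is the count median diameter and gsd the geometric
# standard deviation. Values below are illustrative.

def mass_median_diameter(cmd_um, gsd):
    return cmd_um * math.exp(3.0 * math.log(gsd) ** 2)

cmd, gsd = 1.0, 2.0                 # microns; dimensionless spread
mmd = mass_median_diameter(cmd, gsd)
print(round(mmd, 3))                # MMD is well above the CMD
```

    The mass median sits far above the count median for a broad distribution, which is why mass-based behavior (settling, plating) is dominated by the large-particle tail.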

  7. Codes and Standards Requirements for Deployment of Emerging Fuel Cell Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Burgess, R.; Buttner, W.; Riykin, C.

    2011-12-01

    The objective of this NREL report is to provide information on codes and standards for two emerging hydrogen fuel cell technology markets, forklift trucks and backup power units, that would ease the implementation of emerging fuel cell technologies. This information should help project developers, project engineers, code officials, and other interested parties in developing and reviewing permit applications for regulatory compliance.

  8. Verification and Validation of the BISON Fuel Performance Code for PCMI Applications

    Energy Technology Data Exchange (ETDEWEB)

    Gamble, Kyle Allan Lawrence [Idaho National Laboratory; Novascone, Stephen Rhead [Idaho National Laboratory; Gardner, Russell James [Idaho National Laboratory; Perez, Danielle Marie [Idaho National Laboratory; Pastore, Giovanni [Idaho National Laboratory; Hales, Jason Dean [Idaho National Laboratory

    2016-06-01

    BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. A brief overview of BISON’s computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described. Validation for application to light water reactor (LWR) PCMI problems is assessed by comparing predicted and measured rod diameter following base irradiation and power ramps. Results indicate a tendency to overpredict clad diameter reduction early in life, when clad creepdown dominates, and more significantly overpredict the diameter increase late in life, when fuel expansion controls the mechanical response. Initial rod diameter comparisons have led to consideration of additional separate effects experiments to better understand and predict clad and fuel mechanical behavior. Results from this study are being used to define priorities for ongoing code development and validation activities.

  9. Verification & Validation Toolkit to Assess Codes: Is it Theory Limitation, Numerical Method Inadequacy, Bug in the Code or a Serious Flaw?

    Science.gov (United States)

    Bombardelli, F. A.; Zamani, K.

    2014-12-01

    We introduce and discuss an open-source, user-friendly numerical post-processing tool for assessing the reliability of modeling results from environmental fluid mechanics codes. Verification and Validation, Uncertainty Quantification (VAVUQ) is a toolkit developed in Matlab© for general V&V purposes. In this work, the VAVUQ implementation of V&V techniques and its user interfaces are discussed. VAVUQ is able to read Excel, Matlab, ASCII, and binary files, and it produces a log of the results in txt format. Each capability of the code is then illustrated through an example. The first example is code verification of a sediment transport code, developed with the Finite Volume Method, via MES. The second example is solution verification of a code for groundwater flow, developed with the Boundary Element Method, via MES. The third example is solution verification of a mixed-order, Compact Difference Method code for heat transfer via MMS. The fourth example is solution verification of a 2-D, Finite Difference Method code for floodplain analysis via Complete Richardson Extrapolation. In turn, the application of VAVUQ to quantitative model skill assessment (validation) of environmental codes is illustrated through two examples: validation of a two-phase flow computational model of air entrainment in a free-surface flow against lab measurements, and heat transfer modeling at the earth's surface against field measurements. At the end, we discuss practical considerations and common pitfalls in the interpretation of V&V results.
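
    The Richardson extrapolation step used for solution verification can be illustrated with synthetic grid-convergence data: from three solutions on systematically refined grids, one estimates the observed order of accuracy and an extrapolated value of the exact solution. The function names below are illustrative, not VAVUQ's API.

```python
import math

# Classic Richardson extrapolation on three grids with constant
# refinement ratio r (f_fine on the finest grid).

def observed_order(f_fine, f_med, f_coarse, r):
    # p = ln((f_coarse - f_med) / (f_med - f_fine)) / ln(r)
    return math.log((f_coarse - f_med) / (f_med - f_fine)) / math.log(r)

def extrapolate(f_fine, f_med, r, p):
    # Richardson-extrapolated estimate of the grid-independent solution
    return f_fine + (f_fine - f_med) / (r**p - 1.0)

# Synthetic data from a second-order scheme: f(h) = 3.0 + 0.5*h^2
# sampled on h = 0.1, 0.2, 0.4 (refinement ratio r = 2).
f1, f2, f3 = 3.0 + 0.5*0.1**2, 3.0 + 0.5*0.2**2, 3.0 + 0.5*0.4**2
p = observed_order(f1, f2, f3, r=2.0)
print(round(p, 6))                            # -> 2.0 (observed order)
print(round(extrapolate(f1, f2, 2.0, p), 6))  # -> 3.0 (exact value)
```

    With real data the observed order rarely matches the formal order exactly; a large mismatch is itself a diagnostic that the solutions are not yet in the asymptotic range.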

  10. Development of the Verification and Validation Matrix for Safety Analysis Code SPACE

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yo Han; Ha, Sang Jun; Yang, Chang Keun [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    2009-10-15

    Korea Electric Power Research Institute (KEPRI) has developed the safety analysis code SPACE (Safety and Performance Analysis CodE for Nuclear Power Plant) for typical pressurized water reactors (PWR). Current safety analysis codes were obtained from foreign vendors, such as Westinghouse Electric Corp., ABB Combustion Engineering Inc., Kraftwerk Union, etc. Considering the conservatism and inflexibility of the foreign code systems, it is difficult to expand their application areas and analysis scopes. To overcome these problems, KEPRI launched a project to develop a native safety analysis code with Korea Power Engineering Co. (KOPEC), Korea Atomic Energy Research Inst. (KAERI), Korea Nuclear Fuel (KNF), and Korea Hydro and Nuclear Power Co. (KHNP) under the funding of the Ministry of Knowledge Economy (MKE). As a result of the project, the demo version of SPACE was released in July 2009. In preparation for the next step, KEPRI and colleagues have developed the verification and validation (V and V) matrix for SPACE. To develop the matrix, the preceding studies and experiments were reviewed. After mature consideration, the V and V matrix was developed, and experiment plans were designed for the next step to compensate for the lack of data.

  11. Validity of the coding for herpes simplex encephalitis in the Danish National Patient Registry

    DEFF Research Database (Denmark)

    Jørgensen, Laura Krogh; Dalgaard, Lars Skov; Østergaard, Lars Jørgen

    2016-01-01

    BACKGROUND: Large health care databases are a valuable source of infectious disease epidemiology if diagnoses are valid. The aim of this study was to investigate the accuracy of the recorded diagnosis coding of herpes simplex encephalitis (HSE) in the Danish National Patient Registry (DNPR...... (7.3%) as probable cases providing an overall PPV of 58.0% (95% confidence interval [CI]: 53.0-62.9). For "Encephalitis due to herpes simplex virus" (ICD-10 code B00.4), the PPV was 56.6% (95% CI: 51.1-62.0). Similarly, the PPV for "Meningoencephalitis due to herpes simplex virus" (ICD-10 code B00.4A......) was 56.8% (95% CI: 39.5-72.9). "Herpes viral encephalitis" (ICD-10 code G05.1E) had a PPV of 75.9% (95% CI: 56.5-89.7), thereby representing the highest PPV. The estimated sensitivity was 95.5%. CONCLUSION: The PPVs of the ICD-10 diagnosis coding for adult HSE in the DNPR were relatively low. Hence...

  12. Verification and Validation of the PLTEMP/ANL Code for Thermal-Hydraulic Analysis of Experimental and Test Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kalimullah, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Olson, Arne P. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. E. [Argonne National Lab. (ANL), Argonne, IL (United States); Hanan, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-04-07

    The document compiles in a single volume several verification and validation works done for the PLTEMP/ANL code during the years of its development and improvement. Some works that are available in the open literature are simply referenced at the outset, and are not included in the document. PLTEMP has been used in conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted. A list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. The model verification is usually done by comparing the code with hand calculation, Microsoft spreadsheet calculation, or Mathematica calculation. The model validation is done by comparing the code with experimental data or a more validated code like the RELAP5 code.

  13. Radiant Energy Measurements from a Scaled Jet Engine Axisymmetric Exhaust Nozzle for a Baseline Code Validation Case

    Science.gov (United States)

    Baumeister, Joseph F.

    1994-01-01

    A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.

  14. Preliminary validation of RELAP5/Mod4.0 code for LBE cooled NACIE facility

    Energy Technology Data Exchange (ETDEWEB)

    Kumari, Indu; Khanna, Ashok, E-mail: akhanna@iitk.ac.in

    2017-04-01

    Highlights: • Detailed discussion of the thermophysical properties of lead-bismuth eutectic incorporated in the code RELAP5/Mod4.0. • Benchmarking of LBE properties in RELAP5/Mod4.0 against literature. • NACIE facility at three different power levels (10.8, 21.7 and 32.5 kW) under natural circulation considered for benchmarking. • Preliminary validation of the LBE properties against experimental data. • NACIE facility at a power level of 22.5 kW considered for validation. - Abstract: The one-dimensional thermal hydraulic computer code RELAP5 was developed for thermal hydraulic studies of light water reactors as well as nuclear research reactors. The purpose of this work is to evaluate the code RELAP5/Mod4.0 for the analysis of research reactors. This paper consists of three major sections. The first section presents a detailed discussion of the thermophysical properties of Lead Bismuth Eutectic (LBE) incorporated in the RELAP5/Mod4.0 code. In the second section, RELAP5/Mod4.0 is benchmarked against the Natural Circulation Experimental (NACIE) facility, in comparison with Barone's simulations using RELAP5/Mod3.3. Three different power levels (10.8 kW, 21.7 kW and 32.5 kW) under natural circulation conditions are considered. Results obtained for LBE temperatures, temperature differences across the heating section, pin surface temperatures, mass flow rates, and heat transfer coefficients in the heating-section heat exchanger agree with Barone's simulation results within 7% average relative error. The third section presents validation of RELAP5/Mod4.0 against experimental data from the NACIE facility obtained by Tarantino et al. (test number 21, at a power of 22.5 kW), comparing the profiles of temperature, mass flow rate, and velocity of LBE. Simulation and experimental results agree within 7% average relative error.
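
    The abstract does not reproduce the property correlations themselves. As illustrative background, the density fit below is one widely quoted correlation for molten LBE of the kind collected in the OECD/NEA LBE handbook; it is shown here as an assumption, not as the exact fit incorporated in RELAP5/Mod4.0.

```python
# Illustrative LBE density correlation (assumed, handbook-style fit):
# rho(T) = 11096 - 1.3236 * T, with rho in kg/m^3 and T in kelvin,
# valid for the molten range above roughly 398 K.

T_MELT_LBE = 398.0   # approximate melting point of LBE, K

def lbe_density(T_kelvin):
    """Density of molten lead-bismuth eutectic, kg/m^3."""
    if T_kelvin < T_MELT_LBE:
        raise ValueError("LBE is solid below ~398 K")
    return 11096.0 - 1.3236 * T_kelvin

for T in (500.0, 600.0, 700.0):
    print(T, round(lbe_density(T), 1))
```

    The near-linear decrease of density with temperature is what drives the buoyancy head in the natural-circulation tests, so getting this fit right matters directly for the predicted mass flow rates.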

  15. Engine Validation of Noise and Emission Reduction Technology Phase I

    Science.gov (United States)

    Weir, Don (Editor)

    2008-01-01

    This final report has been prepared by Honeywell Aerospace, Phoenix, Arizona, a unit of Honeywell International, Inc., documenting work performed during the period December 2004 through August 2007 for the NASA Glenn Research Center, Cleveland, Ohio, under the Revolutionary Aero-Space Engine Research (RASER) Program, Contract No. NAS3-01136, Task Order 8, Engine Validation of Noise and Emission Reduction Technology Phase I. The NASA Task Manager was Dr. Joe Grady of the NASA Glenn Research Center. The NASA Contract Officer was Mr. Albert Spence of the NASA Glenn Research Center. This report is for a test program in which NASA funded engine validations of integrated technologies that reduce aircraft engine noise. These technologies address the reduction of engine fan and jet noise, and noise associated with propulsion/airframe integration. The results of these tests will be used by NASA to identify the engineering tradeoffs associated with the technologies that are needed to enable advanced engine systems to meet stringent goals for the reduction of noise. The objectives of this program are to (1) conduct system engineering and integration efforts to define the engine test-bed configuration; (2) develop selected noise reduction technologies to a technical maturity sufficient to enable engine testing and validation of those technologies in the FY06-07 time frame; (3) conduct engine tests designed to gain insight into the sources, mechanisms and characteristics of noise in the engines; and (4) establish baseline engine noise measurements for subsequent use in the evaluation of noise reduction.

  16. A Hypervelocity Experimental Research Database (HERD): Support for the Wright Laboratory Armament Directorate Code Validation Program (COVAL)

    Science.gov (United States)

    Mullin, Scott A.; Anderson, Charles E., Jr.; Hertel, Eugene S., Jr.; Hunt, Ronald D.

    The Hypervelocity Experimental Research Database (HERD) described in this paper was developed to aid researchers with code validation for impacts that occur at velocities faster than the testable regime. Codes of concern include both hydrocodes and fast-running analytical or semi-empirical models used to predict the impact phenomenology and damage that results to projectiles and targets. There are several well documented experimental programs that can serve as benchmarks for code validation; these are identified and described. Recommendations for further experimentation (a canonical problem) to provide validation data are also discussed.

  17. Verification and validation of the simulated radar image (SRIM) code radar cross section predictions

    Science.gov (United States)

    Stanley, Dale A.

    1991-12-01

    The objectives of this study were to verify and validate the Simulated Radar Image (SRIM) Code Version 4.0 monostatic radar cross section (RCS) predictions. SRIM uses the theory of Physical Optics (PO) to predict backscatter for a user-specified aspect angle. Target obscuration and multiple reflections are taken into account by sampling the target with ray tracing. The software verification and validation technique followed in this study entailed comparing the code predictions to closed-form PO equations, other RCS prediction software packages, and measured data. The targets analyzed were a sphere, rectangular flat plate, circular flat plate, solid right circular cylinder, dihedral and trihedral corner reflectors, top hat, cone, prolate spheroid, and a generic missile. SRIM RCS predictions are shown for each target as a function of frequency, aspect angle, and ray density. Also presented is an automation technique that enables the user to run SRIM sequentially over a range of azimuth angles. The FORTRAN code written by the author for the PO equations is also provided.
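
    The closed-form PO equations used as a verification reference include well-known optical-region results such as the RCS of a conducting sphere and the broadside RCS of a flat plate. A minimal sketch, with illustrative dimensions and frequency rather than the study's actual test matrix:

```python
import math

# High-frequency closed-form RCS values commonly used to verify PO codes.

def rcs_sphere_optical(radius_m):
    # Optical-region RCS of a perfectly conducting sphere: pi * a^2
    return math.pi * radius_m**2

def rcs_plate_broadside(a_m, b_m, freq_hz, c=299792458.0):
    # PO broadside RCS of a flat rectangular plate: 4*pi*A^2 / lambda^2
    lam = c / freq_hz
    area = a_m * b_m
    return 4.0 * math.pi * area**2 / lam**2

def to_dbsm(sigma_m2):
    # RCS in decibels relative to one square meter
    return 10.0 * math.log10(sigma_m2)

print(round(rcs_sphere_optical(0.5), 4))                  # 0.5 m sphere
print(round(to_dbsm(rcs_plate_broadside(0.1, 0.1, 10e9)), 2))
```

    Note how strongly the plate return scales with frequency: a 10 cm plate at 10 GHz already exceeds 1 m^2 of RCS at broadside, while the sphere's optical-region return is frequency independent.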

  18. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.

    1987-10-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.
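
    The kind of verification documented in Volume III, comparing a finite difference solution with a problem of known solution, can be sketched on a minimal 1-D conduction case. The code below is an independent illustration, not part of HYDRA-II.

```python
# Verification sketch: steady 1-D conduction with a uniform volumetric
# source, -T'' = q on (0, 1) with T(0) = T(1) = 0. The analytic solution
# is T(x) = q*x*(1 - x)/2, a quadratic that a second-order central
# difference reproduces exactly, so the error should be at round-off.

def solve_tridiag(sub, diag, sup, rhs):
    # Thomas algorithm for a tridiagonal linear system.
    n = len(rhs)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = sup[0] / diag[0]
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / m
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

n, q = 9, 2.0                   # interior nodes, source strength
h = 1.0 / (n + 1)
# (-T[i-1] + 2*T[i] - T[i+1]) = q*h^2, with T = 0 at both boundaries.
T = solve_tridiag([-1.0] * n, [2.0] * n, [-1.0] * n, [q * h * h] * n)
err = max(abs(T[i] - q * x * (1.0 - x) / 2.0)
          for i, x in ((i, (i + 1) * h) for i in range(n)))
print("max |T_fd - T_exact| =", err)
```

    On problems without an exact discrete match, the same comparison is repeated on refined grids to confirm the expected convergence rate.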

  19. Validation of coupled neutronic / thermal-hydraulic codes for VVER reactors. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Mittag, S.; Grundmann, U.; Kliem, S.; Kozmenkov, Y.; Rindelhardt, U.; Rohde, U.; Weiss, F.-P.; Langenbuch, S.; Krzykacz-Hausmann, B.; Schmidt, K.-D.; Vanttola, T.; Haemaelaeinen, A.; Kaloinen, E.; Kereszturi, A.; Hegyi, G.; Panka, I.; Hadek, J.; Strmensky, C.; Darilek, P.; Petkov, P.; Stefanova, S.; Kuchin, A.; Khalimonchuk, V.; Hlbocky, P.; Sico, D.; Danilin, S.; Ionov, V.; Nikonov, S.; Powney, D.

    2004-08-01

    thermal-hydraulic feedback effects. Thus, in VALCO work package 3 (WP 3) stand-alone three-dimensional neutron-kinetic codes have been validated. Measurements carried out in an original-size VVER-1000 mock-up (V-1000 facility, Kurchatov Institute Moscow) were used for the validation of the codes DYN3D, HEXTRAN, KIKO3D and BIPR-8, which are chiefly designed for VVER safety calculations. The significant neutron flux tilt measured in the V-1000 core, which is caused only by radial-reflector asymmetries, was successfully modelled. A good agreement between calculated and measured steady-state powers has been achieved, for relative assembly powers and inner-assembly pin power distributions. Calculated effective multiplication factors exceed unity in all cases. (orig.)

  20. Validating a Monotonically-Integrated Large Eddy Simulation Code for Subsonic Jet Acoustics

    Science.gov (United States)

    Ingraham, Daniel; Bridges, James

    2017-01-01

    The results of subsonic jet validation cases for the Naval Research Lab's Jet Engine Noise REduction (JENRE) code are reported. Two set points from the Tanna matrix, set point 3 (Ma = 0.5, unheated) and set point 7 (Ma = 0.9, unheated), are attempted on three different meshes. After a brief discussion of the JENRE code and the meshes constructed for this work, the turbulent statistics for the axial velocity are presented and compared to experimental data, with favorable results. Preliminary simulations for set point 23 (Ma = 0.5, Tj=T1 = 1.764) on one of the meshes are also described. Finally, the proposed configuration for the far-field noise prediction with JENRE's Ffowcs Williams-Hawkings solver is detailed.

  1. Network Coding is the 5G Key Enabling Technology

    DEFF Research Database (Denmark)

    Compta, Pol Torres; Fitzek, Frank; Roetter, Daniel Enrique Lucani

    2015-01-01

    The exponential growth of the mobile devices market, not only smartphones, but also tablets, laptops or wearables, poses a serious challenge for 5G communications. Random Linear Network Coding (RLNC) is a promising solution for present and future networks as it has been shown to provide increased...... throughput, security, and robustness for the transmission of data through the network. Most of the analysis and the demonstrators have focused on the study of data packets with the same size (number of bytes). This constitutes a best case scenario as coded packets will incur little overhead to handle...
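
    The core RLNC mechanism, mixing source packets with random coefficients at the sender and decoding by Gaussian elimination once enough independent combinations arrive, can be sketched over GF(2). This toy example is invented here, not taken from the paper; it represents coefficient vectors as k-bit integers so that addition in GF(2) is XOR.

```python
import random

# Toy RLNC over GF(2): k source packets are mixed with random binary
# coefficients; the receiver recovers them by Gaussian elimination once
# it holds k linearly independent coded packets.

def add_coded(rows, coeff, payload, k):
    # Reduce (coeff, payload) against stored rows; keep it if innovative.
    for bit in range(k - 1, -1, -1):
        if not (coeff >> bit) & 1:
            continue
        if bit in rows:               # pivot already taken: eliminate it
            rc, rp = rows[bit]
            coeff ^= rc
            payload ^= rp
        else:                         # new pivot: store the reduced row
            rows[bit] = (coeff, payload)
            return True
    return False                      # linearly dependent, nothing new

def decode(rows, k):
    # Back-substitute so each coefficient vector becomes a unit vector.
    for bit in range(k):
        coeff, payload = rows[bit]
        for low in range(bit):
            if (coeff >> low) & 1:
                lc, lp = rows[low]
                coeff ^= lc
                payload ^= lp
        rows[bit] = (coeff, payload)
    return [rows[i][1] for i in range(k)]

packets = [0xDE, 0xAD, 0xBE, 0xEF]     # four one-byte source packets
k = len(packets)
rng = random.Random(7)
rows, sent = {}, 0
while len(rows) < k:                   # collect until full rank
    coeff = rng.randrange(1, 1 << k)   # random nonzero coefficient vector
    payload = 0
    for i in range(k):
        if (coeff >> i) & 1:
            payload ^= packets[i]
    sent += 1
    add_coded(rows, coeff, payload, k)
print(decode(rows, k) == packets, "coded packets used:", sent)
```

    Practical RLNC systems work over larger fields such as GF(2^8) to reduce the chance of dependent combinations, and prepend the coefficient vector to each coded packet as overhead, which is exactly the per-packet cost the unequal-packet-size analysis above is concerned with.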

  2. A Complex-Geometry Validation Experiment for Advanced Neutron Transport Codes

    Energy Technology Data Exchange (ETDEWEB)

    David W. Nigg; Anthony W. LaPorta; Joseph W. Nielsen; James Parry; Mark D. DeHart; Samuel E. Bays; William F. Skerjanc

    2013-11-01

    The Idaho National Laboratory (INL) has initiated a focused effort to upgrade legacy computational reactor physics software tools and protocols used for support of core fuel management and experiment management in the Advanced Test Reactor (ATR) and its companion critical facility (ATRC) at the INL. This will be accomplished through the introduction of modern high-fidelity computational software and protocols, with appropriate new Verification and Validation (V&V) protocols, over the next 12-18 months. Stochastic and deterministic transport theory based reactor physics codes and nuclear data packages that support this effort include MCNP5[1], SCALE/KENO6[2], HELIOS[3], SCALE/NEWT[2], and ATTILA[4]. Furthermore, a capability for sensitivity analysis and uncertainty quantification based on the TSUNAMI[5] system has also been implemented. Finally, we are also evaluating the Serpent[6] and MC21[7] codes, as additional verification tools in the near term as well as for possible applications to full three-dimensional Monte Carlo based fuel management modeling in the longer term. On the experimental side, several new benchmark-quality code validation measurements based on neutron activation spectrometry have been conducted using the ATRC. Results for the first four experiments, focused on neutron spectrum measurements within the Northwest Large In-Pile Tube (NW LIPT) and in the core fuel elements surrounding the NW LIPT and the diametrically opposite Southeast IPT, have been reported [8,9]. A fifth, very recent experiment focused on detailed measurements of the element-to-element core power distribution is summarized here, and examples of the use of the measured data for validation of corresponding MCNP5, HELIOS, NEWT, and Serpent computational models using modern least-squares adjustment methods are provided.

  3. The Impact of Bar Code Medication Administration Technology on Reported Medication Errors

    Science.gov (United States)

    Holecek, Andrea

    2011-01-01

The use of bar-code medication administration technology is on the rise in acute care facilities in the United States. The technology is purported to decrease medication errors that occur at the point of administration. How significantly this technology affects the actual rate and severity of errors is unknown. This descriptive, longitudinal research…

  4. Secret Codes: The Hidden Curriculum of Semantic Web Technologies

    Science.gov (United States)

    Edwards, Richard; Carmichael, Patrick

    2012-01-01

    There is a long tradition in education of examination of the hidden curriculum, those elements which are implicit or tacit to the formal goals of education. This article draws upon that tradition to open up for investigation the hidden curriculum and assumptions about students and knowledge that are embedded in the coding undertaken to facilitate…

  6. The TALL-3D facility design and commissioning tests for validation of coupled STH and CFD codes

    Energy Technology Data Exchange (ETDEWEB)

    Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se; Jeltsov, Marti, E-mail: marti@safety.sci.kth.se; Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se; Karbojian, Aram, E-mail: karbojan@kth.se; Villanueva, Walter, E-mail: walter@safety.sci.kth.se; Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se

    2015-08-15

Highlights: • Design of a heavy liquid metal thermal-hydraulic loop for CFD/STH code validation. • Description of the loop instrumentation and assessment of measurement error. • Experimental data from the forced to natural circulation transient. - Abstract: Application of coupled CFD (Computational Fluid Dynamics) and STH (System Thermal Hydraulics) codes is a prerequisite for computationally affordable and sufficiently accurate prediction of the thermal-hydraulics of complex systems. Coupled STH and CFD codes require validation for understanding and quantification of the sources of uncertainty in the code prediction. TALL-3D is a liquid Lead Bismuth Eutectic (LBE) loop developed according to the requirements for experimental data for validation of coupled STH and CFD codes. The goals of the facility design are to provide (i) mutual feedback between natural circulation in the loop and complex 3D mixing and stratification phenomena in the pool-type test section, (ii) a possibility to validate standalone STH and CFD codes for each subsection of the facility, and (iii) a sufficient amount of experimental data to separate the process of input model calibration from that of code validation. A description of the facility design and its main components, the approach to estimation of experimental uncertainty, and the calibration of model input parameters that are not directly measured in the experiment are discussed in the paper. First experimental data from the forced to natural circulation transient are also provided.

  7. Validating an infrared thermal switch as a novel access technology

    Directory of Open Access Journals (Sweden)

    Memarian Negar

    2010-08-01

Full Text Available Abstract Background Recently, a novel single-switch access technology based on infrared thermography was proposed. The technology exploits the temperature differences between the inside and surrounding areas of the mouth as a switch trigger, thereby allowing voluntary switch activation upon mouth opening. However, for this technology to be clinically viable, it must be validated against a gold standard switch, such as a chin switch, that taps into the same voluntary motion. Methods In this study, we report an experiment designed to gauge the concurrent validity of the infrared thermal switch. Ten able-bodied adults participated in a series of 3 test sessions where they simultaneously used both an infrared thermal and a conventional chin switch to perform multiple trials of a number identification task with visual, auditory and audiovisual stimuli. Participants also provided qualitative feedback about switch use. User performance with the two switches was quantified using an efficiency measure based on mutual information. Results User performance (p = 0.16) and response time (p = 0.25) with the infrared thermal switch were comparable to those of the gold standard. Users reported preference for the infrared thermal switch given its non-contact nature and robustness to changes in user posture. Conclusions Thermal infrared access technology appears to be a valid single-switch alternative for individuals with disabilities who retain voluntary mouth opening and closing.
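
    The mutual-information efficiency measure mentioned above can be computed from a stimulus/response confusion matrix. A sketch with a hypothetical confusion matrix (invented counts, not the study's data):

```python
import math

# Hypothetical confusion matrix for a binary task: rows = presented stimulus,
# columns = selected response, counts over all trials.
counts = [[45, 5],
          [4, 46]]

total = sum(sum(row) for row in counts)

def mutual_information_bits(counts, total):
    """I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) )."""
    row_sums = [sum(row) for row in counts]
    col_sums = [sum(col) for col in zip(*counts)]
    mi = 0.0
    for i, row in enumerate(counts):
        for j, n in enumerate(row):
            if n == 0:
                continue
            p_xy = n / total
            p_x = row_sums[i] / total
            p_y = col_sums[j] / total
            mi += p_xy * math.log2(p_xy / (p_x * p_y))
    return mi

mi = mutual_information_bits(counts, total)
# Efficiency: fraction of the 1 bit available per trial in a binary task.
efficiency = mi / 1.0
print(round(mi, 3), round(efficiency, 3))
```

    A perfectly reliable switch would transmit the full 1 bit per trial; errors reduce the mutual information and hence the efficiency.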

  8. The BOUT Project; Validation and Benchmark of BOUT Code and Experimental Diagnostic Tools for Fusion Boundary Turbulence

    Institute of Scientific and Technical Information of China (English)

    徐学桥

    2001-01-01

The boundary plasma turbulence code BOUT is presented. Preliminary encouraging results have been obtained in comparisons with probe measurements for a typical Ohmic discharge in the HT-7 tokamak. The validation and benchmarking of the BOUT code and of experimental diagnostic tools for fusion boundary plasma turbulence are proposed.

  9. 76 FR 66235 - Bar Code Technologies for Drugs and Biological Products; Retrospective Review Under Executive...

    Science.gov (United States)

    2011-10-26

    ... code such as another symbol, standard, or technology (Id. at 12510 and 12529). In response to the Bar... to FDA, they are not required to do so. In recognition of these challenges, in the Federal Register...

  10. Validation of diagnostic codes for Charcot–Marie–Tooth disease in the Danish National Patient Registry

    Science.gov (United States)

    Vaeth, Signe; Jensen, Uffe Birk; Christensen, Rikke; Andersen, Henning

    2016-01-01

    Purpose To validate the diagnostic codes for Charcot–Marie–Tooth disease (CMT) in the Danish National Patient Registry (DNPR) using positive predictive value (PPV) as a measure of validity. Patients and methods We used the DNPR to identify all patients diagnosed with at least one primary CMT diagnosis at a specialized department in the Central Denmark Region during the period 1977–2012. From this population, we randomly selected 123 patients for the validation study. Medical files were reviewed and used as reference standard. We estimated the PPV of the CMT diagnoses and stratified the analysis according to age at diagnosis, gender, and calendar time. Results In the DNPR, 275 patients were identified. We were able to retrieve 96 medical files from the random sample of 123 patients, and 85 CMT diagnoses were confirmed. The average age at diagnosis was 42.5 years, and 34% were female. The PPV was 88.5% (95% confidence interval: 80.4–94.1). Conclusion The CMT diagnoses in the DNPR have high validity. The DNPR can be used as a data source for epidemiologic research on CMT. PMID:27920579
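
    The reported PPV is simply confirmed diagnoses over reviewed files, 85/96. A sketch reproducing the point estimate with a 95% Wilson score interval (the paper's exact CI method is not stated here, so these bounds only approximate the published 80.4-94.1):

```python
import math

confirmed, reviewed = 85, 96

# Point estimate: positive predictive value.
ppv = confirmed / reviewed  # about 0.885, i.e. 88.5%

# 95% Wilson score interval; the paper may have used an exact binomial
# interval, which gives slightly different bounds.
z = 1.96
center = (ppv + z**2 / (2 * reviewed)) / (1 + z**2 / reviewed)
half = (z * math.sqrt(ppv * (1 - ppv) / reviewed + z**2 / (4 * reviewed**2))
        / (1 + z**2 / reviewed))
print(f"PPV = {100 * ppv:.1f}% "
      f"(95% CI {100 * (center - half):.1f}-{100 * (center + half):.1f})")
```

    With n = 96 the Wilson and exact intervals differ only in the last few tenths of a percentage point, which is consistent with the published figures.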

  11. Code division multiple access signaling for modulated reflector technology

    Science.gov (United States)

    Briles, Scott D [Los Alamos, NM

    2012-05-01

    A method and apparatus for utilizing code division multiple access in modulated reflectance transmissions comprises the steps of generating a phase-modulated reflectance data bit stream; modifying the modulated reflectance data bit stream; providing the modified modulated reflectance data bit stream to a switch that connects an antenna to an infinite impedance in the event a "+1" is to be sent, or connects the antenna to ground in the event a "0" or a "-1" is to be sent.
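
    The claim's encoding can be sketched as follows: each data bit is spread by a signature sequence, and each resulting chip drives the antenna switch. The 4-chip signature code below is invented for illustration and is not specified by the patent:

```python
# Hypothetical 4-chip signature code for one reflector (values in {+1, -1}).
SIGNATURE = [+1, -1, +1, -1]

def spread(bits):
    """Spread data bits (0/1 mapped to -1/+1 symbols) by the signature code."""
    chips = []
    for bit in bits:
        symbol = 1 if bit else -1
        chips.extend(symbol * c for c in SIGNATURE)
    return chips

def switch_state(chip):
    """Map a chip to the antenna switch position described in the abstract:
    '+1' -> open circuit (infinite impedance), '0'/'-1' -> ground."""
    return "open" if chip == +1 else "ground"

chips = spread([1, 0])
states = [switch_state(c) for c in chips]
print(chips)
print(states)
```

    Distinct signature codes let several reflectors share the channel, since a correlating receiver can separate the spread streams.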

  12. Emerging technologies for 3D video creation, coding, transmission and rendering

    CERN Document Server

    Dufaux, Frederic; Cagnazzo, Marco

    2013-01-01

With the expectation of greatly enhanced user experience, 3D video is widely perceived as the next major advancement in video technology. In order to fulfil the expectation of enhanced user experience, 3D video calls for new technologies addressing efficient content creation, representation/coding, transmission and display. Emerging Technologies for 3D Video will deal with all aspects involved in 3D video systems and services, including content acquisition and creation, data representation and coding, transmission, view synthesis, rendering, display technologies, and human perception.

  13. Experiences using IAEA Code of practice for radiation sterilization of tissue allografts: Validation and routine control

    Energy Technology Data Exchange (ETDEWEB)

    Hilmy, N. [Batan Research Tissue Bank (BRTB), Centre for Research and Development of Isotopes and Radiation Technology, P.O. Box 7002, JKSKL, Jakarta 12070 (Indonesia)], E-mail: nazly@batan.go.id; Febrida, A.; Basril, A. [Batan Research Tissue Bank (BRTB), Centre for Research and Development of Isotopes and Radiation Technology, P.O. Box 7002, JKSKL, Jakarta 12070 (Indonesia)

    2007-11-15

A particular problem in applying International Standard (ISO) 11137 to validation of the radiation sterilization dose (RSD) for tissue allografts is the limited and low number of uniform samples per production batch, i.e., products obtained from one donor. An allograft is a graft transplanted between two different individuals of the same species. The minimum number of uniform samples needed for the verification dose (VD) experiment at the selected sterility assurance level (SAL) per production batch according to the IAEA Code is 20, i.e., 10 for bio-burden determination and the remaining 10 for the sterility test. Three methods of the IAEA Code have been used for validation of the RSD, i.e., method A1, which is a modification of method 1 of ISO 11137:1995, method B (ISO 13409:1996), and method C (AAMI TIR 27:2001). This paper describes VD experiments using uniform products obtained from one cadaver donor, i.e., cancellous bones and demineralized bone powders, and amnion grafts from one live donor. Results of the verification dose experiments show that the RSD is 15.4 kGy for cancellous and demineralized bone grafts and 19.2 kGy for amnion grafts according to method A1, and 25 kGy according to methods B and C.

  14. RELAP5/MOD3.3 Code Validation with Plant Abnormal Event

    Directory of Open Access Journals (Sweden)

    Andrej Prošek

    2008-07-01

Full Text Available Measured plant data from various abnormal events are of great importance for code validation. The purpose of the study was to validate the RELAP5/MOD3.3 Patch 03 computer code against the abnormal event which occurred at Krško Nuclear Power Plant (NPP) on April 10, 2005. The event analyzed was a malfunction which occurred during a power reduction sequence when regular periodic testing of the turbine valves was performed. Unexpected turbine valve closing caused a safety injection signal, followed by a reactor trip. The RELAP5 input model delivered by Krško NPP was used. In the short term, the calculation agrees very well with the plant measured data. This also holds in the long term, provided that operator actions and special plant systems are modeled; otherwise, the transient would progress quite differently. Finally, the calculated data may supplement plant measured data when information is missing or a measurement is questionable.

  15. Decay heat experiment and validation of calculation code systems for fusion reactor

    Energy Technology Data Exchange (ETDEWEB)

    Maekawa, Fujio; Ikeda, Yujiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Wada, Masayuki

    1999-10-01

Although accurate estimation of decay heat values is essential for safety analyses of fusion reactors against loss of coolant accidents and similar events, no experimental work had been devoted to validating the estimation. Hence, a decay heat measurement experiment was performed as a task (T-339) of ITER/EDA. A new detector, the Whole Energy Absorption Spectrometer (WEAS), was developed for accurate and efficient measurements of decay heat. Decay heat produced in the thirty-two sample materials irradiated by 14-MeV neutrons at FNS/JAERI was measured with WEAS over a wide cooling time period from 1 min to 400 days. The data obtained were the first experimental decay heat data in the field of fusion. The validity of the decay heat calculation codes ACT4 and CINAC-V4, of the activation cross section libraries FENDL/A-2.0 and JENDL Activation File, and of the decay data was investigated through analyses of the experiment. As a result, several points that should be modified were found in the codes and data. After solving these problems, it was demonstrated that decay heat values calculated for most of the samples were in good agreement with the experimental data. Especially for stainless steel 316 and copper, which are important materials for ITER, decay heat could be predicted with an accuracy of ±10%. (author)

  16. PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility

    Energy Technology Data Exchange (ETDEWEB)

    Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States); Skifton, Richard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Stoots, Carl [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Eung Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Conder, Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-12-01

Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high quality multidimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This image displacement is indicated by the location of the largest correlation peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. This study is developing a reliable method to analyze the uncertainty and sensitivity of the measured data, together with a computer code that automates the analysis. The main objective is to establish a well-founded uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. In this study, the uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing). Each uncertainty source is then mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility with sample test results.
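
    The cross-correlation step described above can be illustrated with FFT-based correlation of two interrogation windows; the largest peak recovers the integer-pixel displacement (synthetic random-texture windows, not MIR data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic first interrogation window with random "particle" texture.
win_a = rng.random((32, 32))

# Second window: the same pattern shifted by a known displacement.
shift = (3, 5)  # (rows, cols)
win_b = np.roll(win_a, shift, axis=(0, 1))

# FFT-based circular cross-correlation of the two windows.
corr = np.fft.ifft2(np.fft.fft2(win_a).conj() * np.fft.fft2(win_b)).real

# The location of the largest peak indicates the displacement.
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(peak)  # recovers (3, 5)
```

    Real PIV processing adds sub-pixel peak fitting and window overlap, but the peak search above is the core of the displacement estimate.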

  17. Introduction of Code Overlay Technology%Code Overlay技术综述

    Institute of Scientific and Technical Information of China (English)

    吴佩华; 蔡亮

    2008-01-01

Programming models and compilation techniques for multi-core high-performance processors have become a research focus in recent years. How to use existing programming models so that large programs can also run freely within limited local memory is an important problem that current compilation technology must solve; Code Overlay is one such technique. This paper describes the basic theory of overlay technology in detail and analyzes two typical overlay structures.

  18. Network Coding is the 5G Key Enabling Technology

    DEFF Research Database (Denmark)

    Compta, Pol Torres; Fitzek, Frank; Roetter, Daniel Enrique Lucani

    2015-01-01

The exponential growth of the mobile devices market, not only smartphones, but also tablets, laptops or wearables, poses a serious challenge for 5G communications. Random Linear Network Coding (RLNC) is a promising solution for present and future networks as it has been shown to provide increased… such packets. However, packet lengths are quite heterogeneous in real networks, which can cause a high overhead or, alternatively, a high delay in the transmission of data packets. As we show, this can have a severe effect on a variety of applications. This paper proposes a series of mechanisms to manage…
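
    The overhead problem with heterogeneous packet lengths can be seen in a minimal RLNC sketch over GF(2): packets must be zero-padded to a common length before they can be combined linearly, and the padding is pure overhead on the wire (toy packets; this is not one of the paper's proposed mechanisms):

```python
import random

random.seed(1)

# Toy packets of heterogeneous lengths.
packets = [b"hello", b"5G", b"network coding"]

# RLNC combines packets of a generation linearly; with unequal lengths,
# every packet must first be zero-padded to the longest one.
max_len = max(len(p) for p in packets)
padded = [p.ljust(max_len, b"\x00") for p in packets]
overhead = sum(max_len - len(p) for p in packets)

def rlnc_encode(generation, length):
    """Produce one coded packet over GF(2): draw a random 0/1 coefficient
    per source packet and XOR the selected packets together."""
    coeffs = [random.randint(0, 1) for _ in generation]
    coded = bytes(length)
    for c, pkt in zip(coeffs, generation):
        if c:
            coded = bytes(a ^ b for a, b in zip(coded, pkt))
    return coeffs, coded

coeffs, coded = rlnc_encode(padded, max_len)
print("padding overhead:", overhead, "bytes")
print("coefficients:", coeffs)
```

    Here 21 of the 42 transmitted payload bytes would be padding, which illustrates why short packets mixed with long ones are costly for a naive coder.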

  19. Validity of the Child Facial Coding System for the Assessment of Acute Pain in Children With Cerebral Palsy.

    Science.gov (United States)

    Hadden, Kellie L; LeFort, Sandra; O'Brien, Michelle; Coyte, Peter C; Guerriere, Denise N

    2016-04-01

The purpose of the current study was to examine the concurrent and discriminant validity of the Child Facial Coding System for children with cerebral palsy. Eighty-five children (mean = 8.35 years, SD = 4.72 years) were videotaped during a passive joint stretch with their physiotherapist and during 3 time segments: baseline, passive joint stretch, and recovery. Children's pain responses were rated from videotape using the Numerical Rating Scale and Child Facial Coding System. Results indicated that Child Facial Coding System scores during the passive joint stretch significantly correlated with Numerical Rating Scale scores (r = .72). Child Facial Coding System scores were also significantly higher during the passive joint stretch than during the baseline and recovery segments. The Child Facial Coding System is a valid method of identifying pain in children with cerebral palsy.

  20. VULCAN: an Open-Source, Validated Chemical Kinetics Python Code for Exoplanetary Atmospheres

    CERN Document Server

    Tsai, Shang-Min; Grosheintz, Luc; Rimmer, Paul B; Kitzmann, Daniel; Heng, Kevin

    2016-01-01

We present an open-source and validated chemical kinetics code for studying hot exoplanetary atmospheres, which we name VULCAN. It is constructed for gaseous chemistry from 500 to 2500 K using a reduced C-H-O chemical network with about 300 reactions. It uses eddy diffusion to mimic atmospheric dynamics and excludes photochemistry. We have provided a full description of the rate coefficients and thermodynamic data used. We validate VULCAN by reproducing chemical equilibrium and by comparing its output against the disequilibrium-chemistry calculations of Moses et al. and Rimmer & Helling. It reproduces the models of HD 189733b and HD 209458b by Moses et al., which employ a network with nearly 1600 reactions. Further validation of VULCAN is made by examining the theoretical trends produced when the temperature-pressure profile and carbon-to-oxygen ratio are varied. Assisted by a sensitivity test designed to identify the key reactions responsible for producing a specific molecule, we revisit the quenching approximation.
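
    VULCAN is validated in part by reproducing chemical equilibrium; the idea can be shown in miniature with a single reversible reaction integrated to steady state (illustrative rate constants, far simpler than VULCAN's ~300-reaction network):

```python
import numpy as np

# Toy reversible reaction A <-> B with forward/reverse rate constants
# (invented values for illustration only).
k_f, k_r = 2.0, 1.0

def rhs(n):
    """Net rates for number densities n = [n_A, n_B]."""
    n_a, n_b = n
    flux = k_f * n_a - k_r * n_b
    return np.array([-flux, +flux])

# Explicit Euler integration from an out-of-equilibrium start.
n = np.array([1.0, 0.0])
dt = 1e-3
for _ in range(20000):
    n = n + dt * rhs(n)

# At equilibrium n_B / n_A = k_f / k_r, and the total density is conserved,
# which is exactly the kind of consistency check a kinetics code must pass.
print(n)  # approaches [1/3, 2/3]
```

    A production solver would use a stiff implicit integrator and an eddy-diffusion transport term, but the equilibrium and conservation checks are the same in spirit.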

  1. Steps towards verification and validation of the Fetch code for Level 2 analysis, design, and optimization of aqueous homogeneous reactors

    Energy Technology Data Exchange (ETDEWEB)

    Nygaard, E. T. [Babcock and Wilcox Technical Services Group, 800 Main Street, Lynchburg, VA 24504 (United States); Pain, C. C.; Eaton, M. D.; Gomes, J. L. M. A.; Goddard, A. J. H.; Gorman, G.; Tollit, B.; Buchan, A. G.; Cooling, C. M. [Applied Modelling and Computation Group, Dept. of Earth Science and Engineering, Imperial College London, SW7 2AZ (United Kingdom); Angelo, P. L. [Y-12 National Security Complex, Oak Ridge, TN 37831 (United States)

    2012-07-01

Babcock and Wilcox Technical Services Group (B&W) has identified aqueous homogeneous reactors (AHRs) as a technology well suited to produce the medical isotope molybdenum-99 (Mo-99). AHRs have never been specifically designed or built for this specialized purpose. However, AHRs have a proven history of being safe research reactors. In fact, in 1958, AHRs had 'a longer history of operation than any other type of research reactor using enriched fuel' and had been 'experimentally demonstrated to be among the safest of all the various types of research reactor now in use [1].' While AHRs have been modeled effectively using simplified 'Level 1' tools, the complex interactions between fluids, neutronics, and solid structures are important (but not necessarily safety significant). These interactions require a 'Level 2' modeling tool. Imperial College London (ICL) has developed such a tool: Finite Element Transient Criticality (FETCH). FETCH couples the radiation transport code EVENT with the computational fluid dynamics code Fluidity; the result is a code capable of modeling sub-critical, critical, and super-critical solutions in both two and three dimensions. Using FETCH, ICL researchers and B&W engineers have studied many fissioning solution systems, including the Tokaimura criticality accident, the Y12 accident, SILENE, TRACY, and SUPO. These modeling efforts will ultimately be incorporated into FETCH's extensive automated verification and validation (V&V) test suite, expanding FETCH's area of applicability to include all relevant physics associated with AHRs. These efforts parallel B&W's engineering effort to design and optimize an AHR to produce Mo-99. (authors)

  2. Phenomenological modeling of critical heat flux: The GRAMP code and its validation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, M. [Pakistan Institute of Engineering and Applied Sciences (PIEAS), Islamabad (Pakistan); Chandraker, D.K. [Bhabha Atomic Research Centre, Mumbai (India); Hewitt, G.F. [Imperial College, London SW7 2BX (United Kingdom); Vijayan, P.K. [Bhabha Atomic Research Centre, Mumbai (India); Walker, S.P., E-mail: s.p.walker@imperial.ac.uk [Imperial College, London SW7 2BX (United Kingdom)

    2013-01-15

Highlights: • Assessment of CHF limits is vital for LWR optimization and safety analysis. • Phenomenological modeling is a valuable adjunct to pure empiricism. • It is based on empirical representations of the (several, competing) phenomena. • Phenomenological modeling codes making 'aggregate' predictions need careful assessment against experiments. • The physical and mathematical basis of the phenomenological modeling code GRAMP is presented. • The GRAMP code is assessed against measurements from BARC (India) and Harwell (UK), and the Look Up Tables. - Abstract: Reliable knowledge of the critical heat flux is vital for the design of light water reactors, for both safety and optimization. The use of wholly empirical correlations, or equivalently 'Look Up Tables', can be very effective, but is generally less so in more complex cases, and in particular in cases where the heat flux is axially non-uniform. Phenomenological models are in principle better able to take account of a wider range of conditions with less comprehensive coverage of experimental measurements. These models are themselves in part based upon empirical correlations, albeit of the more fundamental individual phenomena occurring rather than of the aggregate behaviour, and as such they too require experimental validation. In this paper we present the basis of a general-purpose phenomenological code, GRAMP, and then use two independent 'direct' sets of measurements, from BARC in India and from Harwell in the United Kingdom, and the large dataset embodied in the Look Up Tables, to perform a validation exercise on it. Very good agreement between predictions and experimental measurements is observed, adding to the confidence with which the phenomenological model can be used. Remaining important uncertainties in the

  3. Development and validation of educational technology for venous ulcer care.

    Science.gov (United States)

    Benevides, Jéssica Lima; Coutinho, Janaina Fonseca Victor; Pascoal, Liliane Chagas; Joventino, Emanuella Silva; Martins, Mariana Cavalcante; Gubert, Fabiane do Amaral; Alves, Allana Mirella

    2016-04-01

To develop and validate an educational technology for venous ulcer care. Methodological study conducted in five steps: situational diagnosis; literature review; development of texts, illustrations and layout; face and content validation by the Content Validity Index with assessment of the Flesch Readability Index; and pilot testing. The developed technology was a booklet entitled Booklet for Venous Ulcers Care, consisting of seven topics: diet and food intake, walking and light exercise, resting with the leg elevated, bandage care, compression therapy, family support, and keeping healthy habits. Face validation revealed minimal agreement of 85.7% on clarity and comprehensibility. The total Content Validity Index was 0.97, and the Flesch Readability Index was 75%, corresponding to "fairly easy" reading. The pilot test showed that 100% of people with venous ulcers evaluated the text and the illustrations as understandable and appropriate. The educational technology proved valid in appearance and content, with potential for use in clinical practice.
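
    The two indices used above have standard definitions: the item-level Content Validity Index is the share of experts rating an item 3 or 4 on a 4-point relevance scale, and the Flesch Reading Ease score is a linear function of words per sentence and syllables per word. A sketch with hypothetical ratings and counts (the English-language Flesch coefficients are shown; the study's Portuguese text would use an adapted formula):

```python
# Hypothetical expert ratings for one booklet item (4-point relevance scale).
ratings = [4, 4, 3, 4, 3, 4, 4]

# Item-level CVI: proportion of experts rating the item 3 or 4.
cvi = sum(1 for r in ratings if r >= 3) / len(ratings)

# Flesch Reading Ease (classic English coefficients).
def flesch_reading_ease(words, sentences, syllables):
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

# Hypothetical counts for a short passage of the booklet.
score = flesch_reading_ease(words=120, sentences=10, syllables=150)
print(cvi, round(score, 1))
```

    Scores in the 70-90 band correspond to "fairly easy" to "easy" reading, which matches the band the study reports.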

  4. A verification and validation of the new implementation of subcooled flow boiling in a CFD code

    Energy Technology Data Exchange (ETDEWEB)

    Braz Filho, Francisco A.; Ribeiro, Guilherme B.; Caldeira, Alexandre D., E-mail: fbraz@ieav.cta.br, E-mail: gbribeiro@ieav.cta.br, E-mail: alexdc@ieav.cta.br [Instituto de Estudos Avancados (IEAv), Sao Jose dos Campos, SP (Brazil). Divisao de Energia Nuclear

    2015-07-01

Subcooled flow boiling in a heated channel occurs when the liquid bulk temperature is lower than the saturation temperature while the wall temperature is higher. The FLUENT computational fluid dynamics code uses the Eulerian multiphase model to analyze this phenomenon. In previous FLUENT versions, the heat transfer correlations and the source terms of the conservation equations had to be added to the model using User Defined Functions (UDFs). Currently, these models are among the built-in options of FLUENT, without the need for UDFs. Comparison of the FLUENT calculations with experimental data for the void fraction showed a wide range of variation, from satisfactory to poor results; the same problem existed in the previous versions. The fit factors in FLUENT that control condensation and boiling in the system can be used to improve the results. This study showed a strong need for verification and validation of these calculations, along with a sensitivity analysis of the main parameters. (author)

  5. Apar-T: code, validation, and physical interpretation of particle-in-cell results

    CERN Document Server

    Melzani, Mickaël; Walder, Rolf; Folini, Doris; Favre, Jean M; Krastanov, Stefan; Messmer, Peter

    2013-01-01

We present the parallel particle-in-cell (PIC) code Apar-T and, more importantly, address the fundamental question of the relations between the PIC model, the Vlasov-Maxwell theory, and real plasmas. First, we present four validation tests: spectra from simulations of thermal plasmas, linear growth rates of the relativistic tearing instability and of the filamentation instability, and the non-linear filamentation merging phase. For the filamentation instability we show that the effective growth rates measured on the total energy can differ by more than 50% from the linear cold predictions and from the fastest modes of the simulation. Second, we detail a new method for initial loading of Maxwell-Jüttner particle distributions with relativistic bulk velocity and relativistic temperature, and explain why the traditional method with individual particle boosting fails. Third, we scrutinize the question of what description of physical plasmas is obtained by PIC models. These models rely on two building blocks: coarse...

  6. ASTEC V2 severe accident integral code: Fission product modelling and validation

    Energy Technology Data Exchange (ETDEWEB)

    Cantrel, L., E-mail: laurent.cantrel@irsn.fr; Cousin, F.; Bosland, L.; Chevalier-Jabet, K.; Marchetto, C.

    2014-06-01

One main goal of the severe accident integral code ASTEC V2, jointly developed for more than 15 years by IRSN and GRS, is to simulate the overall behaviour of fission products (FP) in a damaged nuclear facility. ASTEC applications include source term determination, level 2 Probabilistic Safety Assessment (PSA2) studies including the determination of uncertainties, accident management studies, and physical analyses of FP experiments to improve the understanding of the phenomenology. ASTEC is a modular code, and the models for each part of the phenomenology are implemented in a dedicated module: the release of FPs and structural materials from degraded fuel in the ELSA module; the transport through the reactor coolant system, approximated as a sequence of control volumes, in the SOPHAEROS module; and the radiochemistry inside the containment building in the IODE module. Three other modules, CPA, ISODOP and DOSE, allow respectively computing the deposition rate of aerosols inside the containment, the activities of the isotopes as a function of time, and the gaseous dose rate which is needed to model radiochemistry in the gaseous phase. In ELSA, the release models are semi-mechanistic and have been validated against a wide range of experimental data, notably the VERCORS experiments. For SOPHAEROS, the models can be divided into two parts: vapour phase phenomena and aerosol phase phenomena. For IODE, iodine and ruthenium chemistry are modelled with a semi-mechanistic approach; these FPs can form volatile species and are particularly important in terms of potential radiological consequences. The models in these three modules are based on a wide experimental database, resulting in large part from international programmes, and they are considered to represent the state of the art of R&D knowledge. This paper illustrates some of the FP modelling capabilities of ASTEC, and computed values are compared with experimental results that form part of the validation matrix.

  7. Zero-point motion effect on the bandgap of diamond: validation of codes

    Science.gov (United States)

    Poncé, Samuel; Antonius, Gabriel; Boulanger, Paul; Cannuccia, Elena; Marini, Andrea; Côté, Michel; Gonze, Xavier

    2014-03-01

Verification and validation of codes, as well as of new theoretical methods, are of utmost importance if one wants to provide reliable results. In this work we present a rigorous and careful study of all the quantities that enter into the calculation of the zero-point motion renormalization of the direct band gap of diamond due to electron-phonon coupling. This study has been done within the Allen-Heine-Cardona (AHC) formalism as implemented in Abinit and in Yambo on top of Quantum Espresso. We aim at quantifying the agreement between the codes for the different quantities of interest. This study shows that one can get differences of less than 10^-5 Ha/atom on the total energy, 0.07 cm^-1 on the phonon frequencies, 0.5% on the electron-phonon matrix elements, and less than 4 meV on the zero-point motion renormalization. At the LDA level, the converged direct bandgap renormalization in diamond due to electron-phonon coupling in the AHC formalism is -409 meV (a reduction of the band gap). This work was supported by the FRS-FNRS through a FRIA grant (S.P.). A. M. acknowledges funding by MIUR FIRB Grant No. RBFR12SW0J.

  8. Location Based Service in Indoor Environment Using Quick Response Code Technology

    Science.gov (United States)

    Hakimpour, F.; Zare Zardiny, A.

    2014-10-01

    Today, with the extensive use of intelligent mobile phones, larger screens, and the enrichment of mobile phones with Global Positioning System (GPS) technology, location based services (LBS) are used by the public more than ever. Based on their position, users can receive the desired information from different LBS providers. Any LBS system generally includes five main parts: mobile devices, a communication network, a positioning system, a service provider and a data provider. Many advances have been made in each of these parts; however, user positioning, especially in indoor environments, remains an essential and critical issue in LBS. It is well known that GPS performs too poorly inside buildings to provide usable indoor positioning. On the other hand, current indoor positioning technologies such as RFID or WiFi networks need dedicated hardware and software infrastructures. In this paper, we propose a new method to overcome these challenges using Quick Response (QR) Code technology. A QR Code is a 2D barcode with a matrix structure which consists of black modules arranged in a square grid. Scanning and retrieving data from a QR Code is possible with any camera-enabled mobile phone simply by installing barcode reader software. This paper reviews the capabilities of QR Code technology and then discusses the advantages of using QR Codes in an indoor LBS (ILBS) system in comparison to other technologies. Finally, some prospects of using QR Codes are illustrated through the implementation of a scenario. The most important advantages of using this new technology in ILBS are easy implementation, lower cost, quick data retrieval, the possibility of printing the QR Code on different products, and no need for complicated hardware and software infrastructures.
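
    The core of the proposal is that indoor positioning reduces to a lookup: each printed QR Code carries a location identifier, and once the phone's barcode reader decodes the payload, no RF positioning is needed. A minimal sketch of that lookup follows; the identifiers and coordinates are hypothetical.

```python
# Hypothetical mapping from decoded QR payloads to indoor positions.
QR_LOCATIONS = {
    "bldg-A/floor-2/room-214": {"x": 12.5, "y": 7.0, "floor": 2},
    "bldg-A/floor-2/corridor-E": {"x": 20.0, "y": 7.0, "floor": 2},
}

def resolve_position(decoded_payload: str):
    """Map a decoded QR payload to indoor coordinates (None if unknown)."""
    return QR_LOCATIONS.get(decoded_payload)

print(resolve_position("bldg-A/floor-2/room-214"))
```

    Because the position is stored in (or keyed by) the code itself, the approach needs no dedicated positioning infrastructure, which is the advantage the abstract emphasizes over RFID and WiFi schemes.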

  9. Validity of ICD-9-CM Coding for Identifying Incident Methicillin-Resistant Staphylococcus aureus (MRSA) Infections: Is MRSA Infection Coded as a Chronic Disease?

    Science.gov (United States)

    Schweizer, Marin L.; Eber, Michael R.; Laxminarayan, Ramanan; Furuno, Jon P.; Popovich, Kyle J.; Hota, Bala; Rubin, Michael A.; Perencevich, Eli N.

    2013-01-01

    BACKGROUND AND OBJECTIVE Investigators and medical decision makers frequently rely on administrative databases to assess methicillin-resistant Staphylococcus aureus (MRSA) infection rates and outcomes. The validity of this approach remains unclear. We sought to assess the validity of the International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM) code for infection with drug-resistant microorganisms (V09) for identifying culture-proven MRSA infection. DESIGN Retrospective cohort study. METHODS All adults admitted to 3 geographically distinct hospitals between January 1, 2001, and December 31, 2007, were assessed for presence of incident MRSA infection, defined as an MRSA-positive clinical culture obtained during the index hospitalization, and presence of the V09 ICD-9-CM code. The κ statistic was calculated to measure the agreement between presence of MRSA infection and assignment of the V09 code. Sensitivities, specificities, positive predictive values, and negative predictive values were calculated. RESULTS There were 466,819 patients discharged during the study period. Of the 4,506 discharged patients (1.0%) who had the V09 code assigned, 31% had an incident MRSA infection, 20% had a prior history of MRSA colonization or infection but did not have an incident MRSA infection, and 49% had no record of MRSA infection during the index hospitalization or the previous hospitalization. The V09 code identified MRSA infection with a sensitivity of 24% (range, 21%–34%) and positive predictive value of 31% (range, 22%–53%). The agreement between assignment of the V09 code and presence of MRSA infection had a κ coefficient of 0.26 (95% confidence interval, 0.25–0.27). CONCLUSIONS In its current state, the ICD-9-CM code V09 is not an accurate predictor of MRSA infection and should not be used to measure rates of MRSA infection. PMID:21460469
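
    The metrics in this abstract (sensitivity, specificity, PPV and Cohen's κ) all derive from a 2x2 table of code assignment versus culture-proven infection. A sketch of those computations is shown below; the counts are hypothetical, since the abstract does not report the full table.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and Cohen's kappa from a 2x2 table."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    # Cohen's kappa: chance-corrected agreement between code and culture.
    po = (tp + tn) / n  # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return sens, spec, ppv, kappa

# Hypothetical counts: code-positive/culture-positive (tp), etc.
sens, spec, ppv, kappa = diagnostic_metrics(tp=40, fp=60, fn=100, tn=800)
print(round(sens, 2), round(ppv, 2), round(kappa, 3))
```

    With counts in this ballpark, a low sensitivity and PPV coexist with a high specificity, which is exactly the pattern that makes a code like V09 unsuitable for measuring infection rates.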

  10. Assessing Attachment in Psychotherapy: Validation of the Patient Attachment Coding System (PACS).

    Science.gov (United States)

    Talia, Alessandro; Miller-Bottome, Madeleine; Daniel, Sarah I F

    2017-01-01

    The authors present and validate the Patient Attachment Coding System (PACS), a transcript-based instrument that assesses clients' in-session attachment based on any session of psychotherapy, in multiple treatment modalities. One-hundred and sixty clients in different types of psychotherapy (cognitive-behavioural, cognitive-behavioural-enhanced, psychodynamic, relational, supportive) and from three different countries were administered the Adult Attachment Interview (AAI) prior to treatment, and one session for each client was rated with the PACS by independent coders. Results indicate strong inter-rater reliability, and high convergent validity of the PACS scales and classifications with the AAI. These results present the PACS as a practical alternative to the AAI in psychotherapy research and suggest that clinicians using the PACS can assess clients' attachment status on an ongoing basis by monitoring clients' verbal activity. These results also provide information regarding the ways in which differences in attachment status play out in therapy sessions and further the study of attachment in psychotherapy from a pre-treatment client factor to a process variable. Copyright © 2015 John Wiley & Sons, Ltd.

  11. Technology validation of the PLATO CCD at ESA

    Science.gov (United States)

    Prod'homme, Thibaut; Verhoeve, Peter; Beaufort, Thierry; Duvet, Ludovic; Lemmel, Frederic; Smit, Hans; Blommaert, Sander; Oosterbroek, Tim; van der Luijt, Cornelis; Visser, Ivo; Heijnen, Jerko; Butler, Bart

    2016-07-01

    PLATO (PLAnetary Transits and Oscillations of stars) is the third medium-class mission selected in the European Space Agency (ESA) Science and Robotic Exploration Cosmic Vision programme. Due for launch in 2025, the payload makes use of large-format (8 cm x 8 cm) Charge-Coupled Devices (CCDs), the e2v CCD270, operated at 4 MHz. The manufacture of such large devices in large quantities constitutes an unprecedented effort. To de-risk the PLATO CCD procurement and aid the mission definition process, ESA's Payload Technology Validation team is characterizing the electro-optical performance of a number of PLATO devices before and after proton irradiation.

  12. National Combustion Code Validated Against Lean Direct Injection Flow Field Data

    Science.gov (United States)

    Iannetti, Anthony C.

    2003-01-01

    Most combustion processes have, in some way or another, a recirculating flow field. This recirculation stabilizes the reaction zone, or flame, but an unnecessarily large recirculation zone can result in high nitrogen oxide (NOx) values for combustion systems. The size of this recirculation zone is crucial to the performance of state-of-the-art, low-emissions hardware. If this is a large-scale combustion process, the flow field will probably be turbulent and, therefore, three-dimensional. This research dealt primarily with flow fields resulting from lean direct injection (LDI) concepts, as described in Research & Technology 2001. LDI is a concept that depends heavily on the design of the swirler. The LDI concept has the potential to reduce NOx values from 50 to 70 percent of current values, with good flame stability characteristics. It is cost effective and (hopefully) beneficial to do most of the design work for an LDI swirler using computer-aided design (CAD) and computer-aided engineering (CAE) tools. Computational fluid dynamics (CFD) codes are CAE tools that can calculate three-dimensional flows in complex geometries. However, CFD codes are only beginning to correctly calculate the flow fields for complex devices, and the related combustion models usually remove a large portion of the flow physics.

  13. 2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation

    Science.gov (United States)

    Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, Johb C.

    2009-01-01

    A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. The project includes work by NASA research engineers on CFD validation and flow physics experiments, part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL-type aircraft is focusing on geometries that depend on advanced flow control technologies, including Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.

  14. Remote sensing validation through SOOP technology: implementation of Spectra system

    Science.gov (United States)

    Piermattei, Viviana; Madonia, Alice; Bonamano, Simone; Consalvi, Natalizia; Caligiore, Aurelio; Falcone, Daniela; Puri, Pio; Sarti, Fabio; Spaccavento, Giovanni; Lucarini, Diego; Pacci, Giacomo; Amitrano, Luigi; Iacullo, Salvatore; D'Andrea, Salvatore; Marcelli, Marco

    2017-04-01

    The development of low-cost instrumentation plays a key role in marine environmental studies and represents one of the most innovative aspects of marine research. The availability of low-cost technologies allows the realization of extended observatory networks for the study of marine phenomena through an integrated approach merging observations, remote sensing and operational oceanography. Marine services and practical applications depend critically on the availability of large amounts of data collected with sufficiently dense spatial and temporal sampling. This issue directly influences the robustness both of ocean forecasting models and of remote sensing observations through data assimilation and validation processes, particularly in the biological domain. For this reason, the development of cheap, small and integrated smart sensors is necessary; such sensors could serve both satellite data validation and forecasting-model data assimilation, as well as support early warning systems for environmental pollution control and prevention. This is particularly true in coastal areas, which are subject to multiple anthropic pressures. Moreover, coastal waters can be classified as Case 2 waters, where the optical properties of inorganic suspended matter and chromophoric dissolved organic matter must be considered and separated from the chlorophyll a contribution. Due to the high costs of mooring systems, research vessels, measurement platforms and instrumentation, a major effort was dedicated to the design, development and realization of a new low-cost mini-FerryBox system: Spectra. Thanks to the modularity and user-friendly operation of the system, Spectra allows continuous in situ measurements of temperature, conductivity, turbidity, and chlorophyll a and chromophoric dissolved organic matter (CDOM) fluorescences to be acquired from voluntary vessels, even by non-specialized operators (Marcelli et al., 2014; 2016). This work shows the preliminary application of this technology to

  15. Next generation video coding for mobile applications: industry requirements and technologies

    Science.gov (United States)

    Budagavi, Madhukar; Zhou, Minhua

    2007-01-01

    Handheld battery-operated consumer electronics devices such as camera phones, digital still cameras, digital camcorders, and personal media players have become very popular in recent years. Video codecs are extensively used in these devices for video capture and/or playback. The annual shipment of such devices already exceeds a hundred million units and is growing, which makes the requirements of mobile battery-operated video devices very important to address in video coding research and development. This paper highlights the unique set of requirements for video coding in these applications, namely low power consumption, high video quality at low complexity, and low cost, and motivates the need for a new video coding standard that enables better trade-offs of power consumption, complexity, and coding efficiency to meet the challenging requirements of portable video devices. This paper also provides a brief overview of some of the video coding technologies being presented in the ITU-T Video Coding Experts Group (VCEG) standardization body for computational complexity reduction and for coding efficiency improvement in a future video coding standard.

  16. Validation of VHTRC calculation benchmark of critical experiment using the MCB code

    Directory of Open Access Journals (Sweden)

    Stanisz Przemysław

    2016-01-01

    The calculation benchmark problem for the Very High Temperature Reactor Critical assembly (VHTRC), a pin-in-block-type core critical assembly, has been investigated with the Monte Carlo Burnup (MCB) code in order to validate the latest version of the Nuclear Data Library based on the ENDF format. The benchmark was executed on the basis of the VHTRC benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying the discrepancies in keff values between various libraries and experimental values, which makes it possible to improve the accuracy of neutron transport calculations and may help in designing high-performance commercial VHTRs. Almost all safety parameters depend on the accuracy of neutron transport calculation results, which in turn depend on the accuracy of the nuclear data libraries. Thus, evaluation of the libraries' applicability to VHTR modelling is an important subject. We compared the numerical results with experimental measurements using two versions of the available nuclear data (ENDF/B-VII.1 and JEFF-3.2) prepared for the required temperatures. Calculations have been performed with the MCB code, which allows a very precise representation of the complex VHTR geometry, including the double heterogeneity of the fuel element. In this paper, together with the impact of nuclear data, we also discuss the impact of different lattice modelling inside the fuel pins. The keff values show good agreement with each other and with the experimental data within the 1σ range of the experimental uncertainty. Because some propagated discrepancies were observed, we propose appropriate corrections to the experimental constants which can improve the reactivity coefficient dependency. The obtained results confirm the accuracy of the new Nuclear Data Libraries.
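
    The agreement criterion used here (calculated keff lying within the 1σ experimental uncertainty) can be sketched as a simple check. The values below are illustrative, not the benchmark's.

```python
def within_one_sigma(k_calc, k_exp, sigma_exp):
    """True if |k_calc - k_exp| lies within the 1-sigma experimental uncertainty."""
    return abs(k_calc - k_exp) <= sigma_exp

# Hypothetical library result vs. a hypothetical measured keff.
print(within_one_sigma(k_calc=1.00052, k_exp=1.00000, sigma_exp=0.0006))
print(within_one_sigma(k_calc=1.00190, k_exp=1.00000, sigma_exp=0.0006))
```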

  17. Preliminary design of a small air loop for system analysis and validation of Cathare code

    Energy Technology Data Exchange (ETDEWEB)

    Marchand, M.; Saez, M.; Tauveron, N.; Tenchine, D.; Germain, T.; Geffraye, G.; Ruby, G.P. [CEA Grenoble (DEN/DER/SSTH), 38 (France)

    2007-07-01

    The French Atomic Energy Commission (CEA) is carrying out the design of a Small Air Loop for System Analysis (SALSA), devoted to the study of gas-cooled nuclear reactor behaviour in normal and incidental/accidental operating conditions. The reduced size of the SALSA components compared to a full-scale reactor, and the use of air as the gaseous coolant instead of helium, will allow easy management of the loop. The main purpose of SALSA will be the validation of the associated thermal-hydraulic safety simulation codes, such as CATHARE. The main goal of this paper is to present the methodology used to define the characteristics of the loop. In a first step, the study focused on a direct-cycle system for the SALSA loop with few global constraints, using a similarity analysis to support the definition and design of the loop. Similarity requirements have been evaluated to determine the scale factors which have to be applied to the SALSA loop components. The preliminary conceptual design of the SALSA plant, with a definition of each component, has then been carried out. The whole plant has been modelled using the CATHARE code. Calculations of the SALSA steady state in nominal conditions and of different plant transients in direct cycle have been made. The first system results obtained on the global behaviour of the loop confirm that SALSA can be representative of a gas-cooled nuclear reactor with some minor design modifications. In a second step, the current prospects focus on the SALSA loop's capability to reproduce correctly the heat transfer occurring in specific incidental situations. Decay heat removal by natural convection is a crucial point of interest. The first results show that the behaviour and the efficiency of the loop are strongly influenced by the definition of the main parameters for each component. A complete definition of SALSA is in progress. (authors)

  18. Validating billing/encounter codes as indicators of lung, colorectal, breast, and prostate cancer recurrence using 2 large contemporary cohorts.

    Science.gov (United States)

    Hassett, Michael J; Ritzwoller, Debra P; Taback, Nathan; Carroll, Nikki; Cronin, Angel M; Ting, Gladys V; Schrag, Deb; Warren, Joan L; Hornbrook, Mark C; Weeks, Jane C

    2014-10-01

    A substantial proportion of cancer-related mortality is attributable to recurrent, not de novo metastatic disease, yet we know relatively little about these patients. To fill this gap, investigators often use administrative codes for secondary malignant neoplasm or chemotherapy to identify recurrent cases in population-based datasets. However, these algorithms have not been validated in large, contemporary, routine care cohorts. To evaluate the validity of secondary malignant neoplasm and chemotherapy codes as indicators of recurrence after definitive local therapy for stage I-III lung, colorectal, breast, and prostate cancer. We assessed the sensitivity, specificity, and positive predictive value (PPV) of these codes 14 and 60 months after diagnosis using 2 administrative datasets linked with gold-standard recurrence status information: CanCORS/Medicare (diagnoses 2003-2005) and HMO/Cancer Research Network (diagnoses 2000-2005). We identified 929 CanCORS/Medicare patients and 5298 HMO/CRN patients. Sensitivity, specificity, and PPV ranged widely depending on which codes were included and the type of cancer. For patients with lung, colorectal, and breast cancer, the combination of secondary malignant neoplasm and chemotherapy codes was the most sensitive (75%-85%); no code-set was highly sensitive and highly specific. For prostate cancer, no code-set offered even moderate sensitivity (≤ 19%). Secondary malignant neoplasm and chemotherapy codes could not identify recurrent cancer without some risk of misclassification. Findings based on existing algorithms should be interpreted with caution. More work is needed to develop a valid algorithm that can be used to characterize outcomes and define patient cohorts for comparative effectiveness research studies.
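
    The algorithm-building idea evaluated here (flagging recurrence by secondary malignant neoplasm codes, chemotherapy codes, or their union, and scoring each code-set against gold-standard recurrence status) can be sketched with set operations. All patient IDs and flags below are hypothetical.

```python
gold_recurrence = {1, 2, 3, 4, 5, 6, 7, 8}   # truly recurrent patients
non_recurrent = set(range(9, 21))            # patients without recurrence

smn_flagged = {1, 2, 3, 9, 10}               # flagged by SMN codes
chemo_flagged = {3, 4, 5, 6, 11}             # flagged by chemotherapy codes

def sensitivity(flagged, positives):
    """Fraction of true recurrences the code-set catches."""
    return len(flagged & positives) / len(positives)

def specificity(flagged, negatives):
    """Fraction of non-recurrent patients the code-set leaves unflagged."""
    return len(negatives - flagged) / len(negatives)

combined = smn_flagged | chemo_flagged       # union of the two code-sets

print(sensitivity(combined, gold_recurrence))
print(specificity(combined, non_recurrent))
```

    As in the study, the union of code-sets is the most sensitive option but pays for it in specificity, which is why no single code-set was both highly sensitive and highly specific.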

  19. Validating Billing/Encounter Codes as Indicators of Lung, Colorectal, Breast, and Prostate Cancer Recurrence using Two Large Contemporary Cohorts

    Science.gov (United States)

    Hassett, Michael J.; Ritzwoller, Debra P.; Taback, Nathan; Carroll, Nikki; Cronin, Angel M.; Ting, Gladys V.; Schrag, Deb; Warren, Joan L.; Hornbrook, Mark C.; Weeks, Jane C.

    2012-01-01

    Background A substantial proportion of cancer-related mortality is attributable to recurrent, not de novo metastatic disease, yet we know relatively little about these patients. To fill this gap, investigators often use administrative codes for secondary malignant neoplasm or chemotherapy to identify recurrent cases in population-based datasets. However, these algorithms have not been validated in large, contemporary, routine care cohorts. Objective To evaluate the validity of secondary malignant neoplasm and chemotherapy codes as indicators of recurrence after definitive local therapy for stage I-III lung, colorectal, breast, and prostate cancer. Research Design, Subjects & Measures We assessed the sensitivity, specificity, and positive predictive value (PPV) of these codes 14- and 60-months after diagnosis using two administrative datasets linked with gold-standard recurrence status information: CanCORS/Medicare (diagnoses 2003-2005) and HMO/Cancer Research Network (diagnoses 2000-2005). Results We identified 929 CanCORS/Medicare patients and 5298 HMO/CRN patients. Sensitivity, specificity, and PPV ranged widely depending on which codes were included and the type of cancer. For patients with lung, colorectal, and breast cancer, the combination of secondary malignant neoplasm and chemotherapy codes was the most sensitive (75%-85%); no code-set was highly sensitive and highly specific. For prostate cancer, no code-set offered even moderate sensitivity (≤19%). Conclusions Secondary malignant neoplasm and chemotherapy codes could not identify recurrent cancer without some risk of misclassification. Findings based on existing algorithms should be interpreted with caution. More work is needed to develop a valid algorithm that can be used to characterize outcomes and define patient cohorts for comparative effectiveness research studies. PMID:23222531

  20. VULCAN: An Open-source, Validated Chemical Kinetics Python Code for Exoplanetary Atmospheres

    Science.gov (United States)

    Tsai, Shang-Min; Lyons, James R.; Grosheintz, Luc; Rimmer, Paul B.; Kitzmann, Daniel; Heng, Kevin

    2017-02-01

    We present an open-source and validated chemical kinetics code for studying hot exoplanetary atmospheres, which we name VULCAN. It is constructed for gaseous chemistry from 500 to 2500 K, using a reduced C–H–O chemical network with about 300 reactions. It uses eddy diffusion to mimic atmospheric dynamics and excludes photochemistry. We have provided a full description of the rate coefficients and thermodynamic data used. We validate VULCAN by reproducing chemical equilibrium and by comparing its output versus the disequilibrium-chemistry calculations of Moses et al. and Rimmer & Helling. It reproduces the models of HD 189733b and HD 209458b by Moses et al., which employ a network with nearly 1600 reactions. We also use VULCAN to examine the theoretical trends produced when the temperature–pressure profile and carbon-to-oxygen ratio are varied. Assisted by a sensitivity test designed to identify the key reactions responsible for producing a specific molecule, we revisit the quenching approximation and find that it is accurate for methane but breaks down for acetylene, because the disequilibrium abundance of acetylene is not directly determined by transport-induced quenching, but is rather indirectly controlled by the disequilibrium abundance of methane. Therefore we suggest that the quenching approximation should be used with caution and must always be checked against a chemical kinetics calculation. A one-dimensional model atmosphere with 100 layers, computed using VULCAN, typically takes several minutes to complete. VULCAN is part of the Exoclimes Simulation Platform (ESP; exoclime.net) and publicly available at https://github.com/exoclime/VULCAN.
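
    The kind of calculation VULCAN performs can be illustrated with a toy version: integrating forward and reverse rates for a single reversible reaction A ⇌ B until the mixture relaxes to chemical equilibrium, where [B]/[A] = kf/kr. The rate constants and step sizes below are illustrative only; VULCAN's real C–H–O network has about 300 reactions plus eddy diffusion.

```python
kf, kr = 2.0, 1.0   # forward / reverse rate constants (arbitrary units)
A, B = 1.0, 0.0     # initial mixing ratios
dt = 1e-3

# Explicit Euler integration of dA/dt = -kf*A + kr*B to steady state.
for _ in range(20000):
    dA = (-kf * A + kr * B) * dt
    A += dA
    B -= dA         # mass conservation: A + B stays constant

print(round(A, 4), round(B, 4))  # relaxes toward A = 1/3, B = 2/3
```

    A real kinetics solver uses stiff implicit integrators rather than explicit Euler, but the fixed point is the same: the equilibrium ratio set by the rate constants.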

  1. Validation of CONTAIN-LMR code for accident analysis of sodium-cooled fast reactor containments

    Energy Technology Data Exchange (ETDEWEB)

    Gordeev, S.; Hering, W.; Schikorr, M.; Stieglitz, R. [Inst. for Neutron Physic and Reactor Technology, Karlsruhe Inst. of Technology, Campus Nord (Germany)

    2012-07-01

    CONTAIN-LMR is an analytical tool for assessing the containment performance of sodium-cooled fast reactors. The code includes models for sodium fires: an oxygen diffusion model for sodium pool fires and a liquid droplet model for sodium spray fires. CONTAIN-LMR is also able to model the interaction of liquid sodium with concrete structures and may be applied to different concrete compositions. Testing and validation of these models help to qualify the simulation results. Three sodium experiments performed in the FAUNA facility at FZK have been used for the validation of CONTAIN-LMR. For the pool fire tests, calculations have been performed with two models: the first consists of one gas cell representing the volume of the burn compartment, while in the second the volume is subdivided into 32 coupled gas cells. The agreement between calculations and experimental data is acceptable, and the detailed pool fire model shows less deviation from the experiments. In a spray fire, direct heating from the sodium burning in the medium is dominant; therefore, single-cell modelling is sufficient to describe the phenomena. Calculation results show reasonable agreement with experimental data, although limitations of the implemented spray model can cause overestimation of the predicted pressure and temperature in the cell atmosphere. The ability of CONTAIN-LMR to simulate a sodium pool fire accompanied by sodium-concrete reactions was tested using experimental studies of sodium-concrete interactions for construction concrete as well as for shielding concrete. The model provides a reasonably good representation of the chemical processes during sodium-concrete interaction. The comparison of time-temperature profiles of sodium and concrete shows that the model requires modifications to predict the test results. (authors)

  2. The use of QR Code as a learning technology: an exploratory study

    Directory of Open Access Journals (Sweden)

    Stefano Besana

    2010-12-01

    This paper discusses a pilot study on the potential benefits of QR (Quick Response) Codes as a tool for facilitating and enhancing learning processes. An analysis is given of the strengths and added value of QR technologies applied to museum visits, with precautions regarding the design of learning environments like the one presented. Some possible future scenarios are identified for implementing these technologies in contexts more strictly related to teaching and education.

  3. Understanding Student Teachers’ Behavioural Intention to Use Technology: Technology Acceptance Model (TAM) Validation and Testing

    Directory of Open Access Journals (Sweden)

    Kung-Teck, Wong

    2013-01-01

    This study sets out to validate and test the Technology Acceptance Model (TAM) in the context of Malaysian student teachers’ integration of technology in teaching and learning. To establish factorial validity, data collected from 302 respondents were tested against the TAM using confirmatory factor analysis (CFA), and structural equation modelling (SEM) was used for model comparison and hypothesis testing. The goodness-of-fit test of the analysis shows partial support for the applicability of the TAM in the Malaysian context. Overall, the TAM accounted for 37.3% of the variance in intention to use technology among student teachers, and of the five hypotheses formulated, four are supported. Perceived usefulness is a significant influence on attitude towards computer use and behavioural intention. Perceived ease of use significantly influences perceived usefulness, and behavioural intention is found to be influenced by attitude towards computer use. The findings of this research contribute to the literature by validating the TAM in the Malaysian context and provide several prominent implications for the research and practice of technology integration development.

  4. Characteristics of Academic Language Register Occurring in Caretaker-Child Interaction: Development and Validation of a Coding Scheme

    Science.gov (United States)

    Aarts, Rian; Demir, Serpil; Vallen, Ton

    2011-01-01

    This article aims at validating a coding scheme designed to investigate the precursors of academic language occurring in early caretaker-child interactions. Exposure to the academic dimensions of language is an important asset for children to be successful in academic settings. The proposed analytical framework, based on systemic functional…

  5. Design and implementation of H.264 based embedded video coding technology

    Science.gov (United States)

    Mao, Jian; Liu, Jinming; Zhang, Jiemin

    2016-03-01

    In this paper, an embedded system for remote online video monitoring was designed and developed to capture and record real-time conditions in an elevator. To improve the efficiency of video acquisition and processing, the system uses the Samsung S5PV210 chip, which integrates a graphics processing unit, as the core processor, and the video is encoded in the H.264 format for efficient storage and transmission. Based on the S5PV210 chip, hardware video coding technology was investigated, which is more efficient than software coding. Running tests proved that hardware video coding can markedly reduce the cost of the system and produce smoother video display. It can be widely applied for security supervision [1].

  6. A Method and Its Practice for Teaching the Fundamental Technology of Communication Protocols and Coding

    Science.gov (United States)

    Kobayashi, Tetsuji

    The education of information and communication technologies is important for engineering, and it covers terminals, communication media, transmission, switching, software, communication protocols, coding, etc. The proposed teaching method for protocols is based on the HDLC (High-level Data Link Control) procedures using our newly developed software, the “HDLC trainer”, and includes extensions for understanding other protocols such as TCP/IP. For teaching the coding theory that is applied for error control in protocols, we use both a mathematical programming language and a general-purpose programming language. We have practiced and evaluated the proposed teaching method in our college, and it is shown that the method has remarkable effects on understanding the fundamental technology of protocols and coding.
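
    A concrete piece of the error-control coding taught alongside HDLC is the frame check sequence, which is the CRC-16/X.25 checksum: polynomial 0x1021 processed in its bit-reversed form 0x8408, with initial value and final XOR of 0xFFFF. A short bitwise implementation (not the article's own teaching software) is sketched below.

```python
def crc16_x25(data: bytes) -> int:
    """CRC-16/X.25, the checksum used for the HDLC frame check sequence."""
    crc = 0xFFFF                    # initial value
    for byte in data:
        crc ^= byte
        for _ in range(8):          # process one bit at a time, LSB first
            crc = (crc >> 1) ^ 0x8408 if crc & 1 else crc >> 1
    return crc ^ 0xFFFF             # final XOR

print(hex(crc16_x25(b"123456789")))  # standard check value: 0x906e
```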

  7. NEPHTIS: 2D/3D validation elements using MCNP4c and TRIPOLI4 Monte-Carlo codes

    Energy Technology Data Exchange (ETDEWEB)

    Courau, T.; Girardi, E. [EDF R and D/SINETICS, 1av du General de Gaulle, F92141 Clamart CEDEX (France); Damian, F.; Moiron-Groizard, M. [DEN/DM2S/SERMA/LCA, CEA Saclay, F91191 Gif-sur-Yvette CEDEX (France)

    2006-07-01

    High Temperature Reactors (HTRs) appear to be a promising concept for the next generation of nuclear power applications. The CEA, in collaboration with AREVA-NP and EDF, is developing a core modelling tool dedicated to the prismatic block-type reactor. NEPHTIS (Neutronics Process for HTR Innovating System) is a deterministic code system based on a standard two-step Transport-Diffusion approach (APOLLO2/CRONOS2). Validation of such deterministic schemes usually relies on Monte-Carlo (MC) codes used as a reference. However, when dealing with large HTR cores, fission source stabilization is rather poor with MC codes. In spite of this, it is shown in this paper that MC simulations may be used as a reference for a wide range of configurations. The first part of the paper is devoted to 2D and 3D MC calculations of an HTR core with control devices. Comparisons between the MCNP4c and TRIPOLI4 MC codes are performed and show very consistent results. The last part of the paper is devoted to the code-to-code validation of the NEPHTIS deterministic scheme. (authors)

  8. Relative validity of the pre-coded food diary used in the Danish National Survey of Diet and Physical Activity

    DEFF Research Database (Denmark)

    Knudsen, Vibeke Kildegaard; Gille, Maj-Britt; Nielsen, Trine Holmgaard

    2011-01-01

    Objective: To determine the relative validity of the pre-coded food diary applied in the Danish National Survey of Dietary Habits and Physical Activity. Design: A cross-over study among seventy-two adults (aged 20 to 69 years) recording diet by means of a pre-coded food diary over 4 d and a 4 d weighed food record. Intakes of foods and drinks were estimated, and nutrient intakes were calculated. Means and medians of intake were compared, and cross-classification of individuals according to intake was performed. To assess agreement between the two methods, Pearson and Spearman's correlation coefficients and weighted kappa coefficients were calculated. Setting: Validation study of the pre-coded food diary against a 4 d weighed food record. Subjects: Seventy-two volunteer, healthy free-living adults (thirty-five males, thirty-seven females). Results: Intakes of cereals and vegetables were higher...

  9. Technology Infusion of CodeSonar into the Space Network Ground Segment (RII07)

    Science.gov (United States)

    Benson, Markland

    2008-01-01

    The NASA Software Assurance Research Program (in part) performs studies as to the feasibility of technologies for improving the safety, quality, reliability, cost, and performance of NASA software. This study considers the application of commercial automated source code analysis tools to mission critical ground software that is in the operations and sustainment portion of the product lifecycle.

  10. Use of wearable technology for performance assessment: a validation study.

    Science.gov (United States)

    Papi, Enrica; Osei-Kuffour, Denise; Chen, Yen-Ming A; McGregor, Alison H

    2015-07-01

    The prevalence of osteoarthritis is increasing globally but current compliance with rehabilitation remains poor. This study explores whether wearable sensors can be used to provide objective measures of performance with a view to using them as motivators to aid compliance to osteoarthritis rehabilitation. More specifically, the use of a novel attachable wearable sensor integrated into clothing and inertial measurement units located in two different positions, at the waist and thigh pocket, was investigated. Fourteen healthy volunteers were asked to complete exercises adapted from a knee osteoarthritis rehabilitation programme whilst wearing the three sensors including five times sit-to-stand test, treadmill walking at slow, preferred and fast speeds. The performances of the three sensors were validated against a motion capture system and an instrumented treadmill. The systems showed a high correlation (r(2) > 0.7) and agreement (mean difference range: -0.02-0.03 m, 0.005-0.68 s) with gold standards. The novel attachable wearable sensor was able to monitor exercise tasks as well as the inertial measurement units (ICC > 0.95). Results also suggested that a functional placement (e.g., situated in a pocket) is a valid position for performance monitoring. This study shows the potential use of wearable technologies for assessing subject performance during exercise and suggests functional solutions to enhance acceptance.
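    The agreement statistics reported above (r² against the gold standard and the mean difference between systems) reduce to a few lines of arithmetic. The sketch below uses invented sample data, not the study's measurements:

```python
def agreement_stats(sensor, reference):
    """Pearson r^2 and mean difference (sensor - reference): the two
    agreement measures quoted in validation studies of this kind."""
    n = len(sensor)
    mx = sum(sensor) / n
    my = sum(reference) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(sensor, reference))
    sxx = sum((x - mx) ** 2 for x in sensor)
    syy = sum((y - my) ** 2 for y in reference)
    r2 = sxy * sxy / (sxx * syy)          # squared Pearson correlation
    mean_diff = sum(x - y for x, y in zip(sensor, reference)) / n
    return r2, mean_diff

# Illustrative only: wearable-sensor readings vs. motion-capture reference
r2, bias = agreement_stats([2.0, 4.1, 5.9, 8.2], [1.0, 2.0, 3.0, 4.0])
```

The mean difference is the bias term of a Bland-Altman comparison; a high r² together with a near-zero mean difference is what justifies the paper's claim of agreement with the gold standards.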

  11. Validation of the RELAP5 code for the modeling of flashing-induced instabilities under natural-circulation conditions using experimental data from the CIRCUS test facility

    Energy Technology Data Exchange (ETDEWEB)

    Kozmenkov, Y. [Helmholtz-Zentrum Dresden-Rossendorf e.V. (FZD), Institute of Safety Research, P.O.B. 510119, D-01324 Dresden (Germany); Institute of Physics and Power Engineering, Obninsk (Russian Federation); Rohde, U., E-mail: U.Rohde@hzdr.de [Helmholtz-Zentrum Dresden-Rossendorf e.V. (FZD), Institute of Safety Research, P.O.B. 510119, D-01324 Dresden (Germany); Manera, A. [Paul Scherrer Institute (Switzerland)

    2012-02-15

    Highlights: • We report on the simulation of flashing-induced instabilities in natural circulation systems. • Flashing-induced instabilities are relevant to the operation of low-power pool-type reactors at low pressure. • The RELAP5 code is validated against measurement data from natural circulation experiments. • The magnitude and frequency of the oscillations were reproduced in good agreement with the measurement data. - Abstract: This paper reports on the use of the RELAP5 code for the simulation of flashing-induced instabilities in natural circulation systems. The RELAP5 code is intended to be used for the simulation of transient processes in the Russian RUTA reactor concept, which operates at atmospheric pressure with forced convection of the coolant; during transient processes, however, natural circulation with flashing-induced instabilities might occur. The RELAP5 code is validated against measurement data from natural circulation experiments performed within the framework of a European project (NACUSP) on the CIRCUS facility. The facility, built at the Delft University of Technology in The Netherlands, is a water/steam 1:1 height-scaled loop of a typical natural-circulation-cooled BWR. It was shown that the RELAP5 code is able to model all relevant phenomena related to flashing-induced instabilities. The magnitude and frequency of the oscillations were reproduced in good agreement with the measurement data. The close correspondence to the experiments was achieved by detailed modelling of all components of the CIRCUS facility, including the heat exchanger, the buffer vessel and the steam dome at the top of the facility.

  12. Simulation of plasma turbulence in scrape-off layer conditions: the GBS code, simulation results and code validation

    Science.gov (United States)

    Ricci, P.; Halpern, F. D.; Jolliet, S.; Loizu, J.; Mosetto, A.; Fasoli, A.; Furno, I.; Theiler, C.

    2012-12-01

    Based on the drift-reduced Braginskii equations, the Global Braginskii Solver, GBS, is able to model the scrape-off layer (SOL) plasma turbulence in terms of the interplay between the plasma outflow from the tokamak core, the turbulent transport, and the losses at the vessel. Model equations, the GBS numerical algorithm, and GBS simulation results are described. GBS has been first developed to model turbulence in basic plasma physics devices, such as linear and simple magnetized toroidal devices, which contain some of the main elements of SOL turbulence in a simplified setting. In this paper we summarize the findings obtained from the simulation carried out in these configurations and we report the first simulations of SOL turbulence. We also discuss the validation project that has been carried out together with the GBS development.

  13. Development of Safety Analysis Codes and Experimental Validation for a Very High Temperature Gas-Cooled Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chang, H. Oh, PhD; Cliff Davis; Richard Moore

    2004-11-01

    The very high temperature gas-cooled reactors (VHTGRs) are those concepts that have average coolant temperatures above 900 degrees C or operational fuel temperatures above 1250 degrees C. These concepts provide the potential for increased energy conversion efficiency and for high-temperature process heat application in addition to power generation and nuclear hydrogen generation. While all the High Temperature Gas Cooled Reactor (HTGR) concepts have sufficiently high temperatures to support process heat applications, such as desalination and cogeneration, the VHTGR's higher temperatures are suitable for particular applications such as thermochemical hydrogen production. However, the high-temperature operation can be detrimental to safety following a loss-of-coolant accident (LOCA) initiated by pipe breaks caused by seismic or other events. Following the loss of coolant through the break and coolant depressurization, air from the containment will enter the core by molecular diffusion and ultimately by natural convection, leading to oxidation of the in-core graphite structures and fuel. The oxidation will release heat and accelerate the heatup of the reactor core. Thus, without any effective countermeasures, a pipe break may lead to significant fuel damage and fission product release. The Idaho National Engineering and Environmental Laboratory (INEEL) has investigated this event for the past three years for the HTGR. However, neither the computer codes used nor, in fact, any of the world's computer codes have been sufficiently developed and validated to reliably predict this event. New code development, improvement of the existing codes, and experimental validation are imperative to narrow the uncertainty in the predictions of this type of accident. The objectives of this Korean/United States collaboration are to develop advanced computational methods for VHTGR safety analysis codes and to validate these computer codes.

  14. Experimental benchmark of non-local-thermodynamic-equilibrium plasma atomic physics codes; Validation experimentale des codes de physique atomique des plasmas hors equilibre thermodynamique local

    Energy Technology Data Exchange (ETDEWEB)

    Nagels-Silvert, V

    2004-09-15

    The main purpose of this thesis is to obtain experimental data for the testing and validation of atomic physics codes dealing with non-local-thermodynamic-equilibrium plasmas. The first part is dedicated to the spectroscopic study of xenon and krypton plasmas produced by a nanosecond laser pulse interacting with a gas jet. A Thomson scattering diagnostic has allowed us to measure independently plasma parameters such as the electron temperature, the electron density and the average ionisation state. We have obtained time-integrated spectra in the range between 5 and 10 angstroms. We have identified about one hundred xenon lines between 8.6 and 9.6 angstroms via the use of the Relac code. We have identified previously unreported krypton lines between 5.2 and 7.5 angstroms. In a second experiment we have extended the wavelength range to the XUV domain. The Averroes/Transpec code has been tested in the ranges from 9 to 15 angstroms and from 10 to 130 angstroms; the first range has been well reproduced, while the second requires a more complex data analysis. The second part is dedicated to the spectroscopic study of aluminium, selenium and samarium plasmas produced with femtosecond laser pulses. We have designed a frequency-domain interferometry diagnostic that has allowed us to measure the expansion speed of the rear side of the target. Via the use of an adequate isothermal model, this parameter has given us access to the plasma electron temperature. Spectra and emission times of various lines from the aluminium and selenium plasmas have been computed satisfactorily with the Averroes/Transpec code coupled with the Film and Multif hydrodynamic codes. (A.C.)

  15. The development of advanced instrumentation and control technology -The development of verification and validation technology for instrumentation and control in NPPs-

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Ham, Chang Sik; Lee, Byung Sun; Kim, Jung Taek; Park, Won Man; Park, Jae Chang; Lee, Jang Soo; Um, Heung Sub; Kim, Jang Yul; Ryoo, Chan Hoh; Joo, Jae Yoon; Song, Soon Ja [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    We collected and analyzed the domestic and international codes, standards and guidelines in order to develop a highly reliable software verification and validation methodology suited to our actual situation. The work comprises three major parts: construction of a framework for a highly reliable software development environment, establishment of a highly reliable software development methodology, and study of the basic technologies related to safety-critical software. These three parts are tightly coupled to each other so as to achieve self-reliant software verification and validation technology for digital I and C in NPPs. The configuration of the hardware and software is partly performed using the requirements developed in the first stage of the development of the I and C test facility. In the hardware part, an expanded interface using the VXI bus and its driving software have been completed. The main programs for mathematics and modelling and the supervisor program for instructions have been developed. 27 figs, 22 tabs, 69 refs. (Author).

  16. Development and Validation of Generalized Lifting Line Based Code for Wind Turbine Aerodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Grasso, F.; Garrel, A. van; Schepers, J.G. [ECN Wind Energy, Petten (Netherlands)

    2011-01-15

    In order to accurately model large, advanced and efficient wind turbines, more reliable and realistic aerodynamic simulation tools are necessary. Most of the available codes are based on the blade element momentum theory. These codes are fast but not well suited to properly describing the physics of wind turbines. On the other hand, computational fluid dynamics codes, in which the full Navier-Stokes equations are implemented, require strong expertise and a lot of computer time to perform analyses. A code based on a generalized form of Prandtl's lifting line in combination with a free vortex wake has been developed at the Energy research Centre of the Netherlands (ECN). In the present work, the development of this new code is presented, together with results from numerical-experimental comparisons. The final part of the work is dedicated to the analysis of innovative configurations such as winglets and curved blades.
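    As an illustration of the lifting-line foundation that the ECN code generalizes, the classical Prandtl monoplane equation can be solved by collocation. This is a deliberately simplified sketch (straight untwisted wing, prescribed planar wake) and not the actual code, which adds the generalized formulation and a free vortex wake on top:

```python
import math

def solve_lifting_line(alpha, a0, chord, span, n_modes=20):
    """Collocation solution of Prandtl's monoplane equation
        sum_n A_n sin(n*t) * (sin t + n*mu(t)) = mu(t) * alpha * sin t,
    with mu(t) = chord(t)*a0/(4*span).  Returns Fourier coefficients A_n
    of the circulation Gamma(t) = 2*span*V*sum_n A_n sin(n*t)."""
    thetas = [(k + 0.5) * math.pi / n_modes for k in range(n_modes)]
    A = [[0.0] * n_modes for _ in thetas]
    b = []
    for i, t in enumerate(thetas):
        mu = chord(t) * a0 / (4.0 * span)
        for n in range(1, n_modes + 1):
            A[i][n - 1] = math.sin(n * t) * (math.sin(t) + n * mu)
        b.append(mu * alpha * math.sin(t))
    return gauss_solve(A, b)

def gauss_solve(A, b):
    """Plain Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x
```

For an elliptic planform only A1 survives, and the lift coefficient CL = pi·AR·A1 reproduces the closed-form result 2·pi·alpha/(1 + 2/AR), which makes a convenient self-check of the collocation machinery.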

  17. Validation of coded aperture coherent scatter spectral imaging for normal and neoplastic breast tissues via surgical pathology

    Science.gov (United States)

    Morris, R. E.; Albanese, K. E.; Lakshmanan, M. N.; McCall, S. J.; Greenberg, J. A.; Kapadia, A. J.

    2016-03-01

    This study intends to validate the sensitivity and specificity of coded aperture coherent scatter spectral imaging (CACSSI) by comparison to standard histological preparation and pathologic analysis methods used to differentiate normal and neoplastic breast tissues. A composite overlay of the CACSSI rendered image and pathologist interpreted stained sections validate the ability of CACSSI to differentiate normal and neoplastic breast structures ex-vivo. Via comparison to pathologist annotated slides, the CACSSI system may be further optimized to maximize sensitivity and specificity for differentiation of breast carcinomas.

  18. Chiari malformation Type I surgery in pediatric patients. Part 1: validation of an ICD-9-CM code search algorithm.

    Science.gov (United States)

    Ladner, Travis R; Greenberg, Jacob K; Guerrero, Nicole; Olsen, Margaret A; Shannon, Chevis N; Yarbrough, Chester K; Piccirillo, Jay F; Anderson, Richard C E; Feldstein, Neil A; Wellons, John C; Smyth, Matthew D; Park, Tae Sung; Limbrick, David D

    2016-05-01

    OBJECTIVE Administrative billing data may facilitate large-scale assessments of treatment outcomes for pediatric Chiari malformation Type I (CM-I). Validated International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) code algorithms for identifying CM-I surgery are critical prerequisites for such studies but are currently only available for adults. The objective of this study was to validate two ICD-9-CM code algorithms using hospital billing data to identify pediatric patients undergoing CM-I decompression surgery. METHODS The authors retrospectively analyzed the validity of two ICD-9-CM code algorithms for identifying pediatric CM-I decompression surgery performed at 3 academic medical centers between 2001 and 2013. Algorithm 1 included any discharge diagnosis code of 348.4 (CM-I), as well as a procedure code of 01.24 (cranial decompression) or 03.09 (spinal decompression or laminectomy). Algorithm 2 restricted this group to the subset of patients with a primary discharge diagnosis of 348.4. The positive predictive value (PPV) and sensitivity of each algorithm were calculated. RESULTS Among 625 first-time admissions identified by Algorithm 1, the overall PPV for CM-I decompression was 92%. Among the 581 admissions identified by Algorithm 2, the PPV was 97%. The PPV for Algorithm 1 was lower in one center (84%) compared with the other centers (93%-94%), whereas the PPV of Algorithm 2 remained high (96%-98%) across all subgroups. The sensitivity of Algorithms 1 (91%) and 2 (89%) was very good and remained so across subgroups (82%-97%). CONCLUSIONS An ICD-9-CM algorithm requiring a primary diagnosis of CM-I has excellent PPV and very good sensitivity for identifying CM-I decompression surgery in pediatric patients. These results establish a basis for utilizing administrative billing data to assess pediatric CM-I treatment outcomes.
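    The two algorithms and their accuracy measures can be sketched directly from the abstract. The record layout below (a `diagnoses` list with the primary code first, plus a `procedures` list) is an assumption made for illustration, not the actual billing-data schema:

```python
def algorithm1(rec):
    """Any discharge diagnosis 348.4 (CM-I) plus a decompression procedure
    code (01.24 cranial, 03.09 spinal/laminectomy)."""
    return "348.4" in rec["diagnoses"] and bool({"01.24", "03.09"} & set(rec["procedures"]))

def algorithm2(rec):
    """Algorithm 1 restricted to a *primary* discharge diagnosis of 348.4
    (assumed here to be the first code in the list)."""
    return rec["diagnoses"][:1] == ["348.4"] and bool({"01.24", "03.09"} & set(rec["procedures"]))

def ppv_sensitivity(flags, truth):
    """PPV = TP/(TP+FP); sensitivity = TP/(TP+FN), truth from chart review."""
    tp = sum(1 for f, t in zip(flags, truth) if f and t)
    fp = sum(1 for f, t in zip(flags, truth) if f and not t)
    fn = sum(1 for f, t in zip(flags, truth) if not f and t)
    return tp / (tp + fp), tp / (tp + fn)
```

Tightening the diagnosis to primary position trades a few false negatives for fewer false positives, which is exactly the PPV/sensitivity trade-off the study reports between the two algorithms.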

  19. RAZORBACK - A Research Reactor Transient Analysis Code Version 1.0 - Volume 3: Verification and Validation Report.

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G.

    2017-04-01

    This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.
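    A drastically reduced sketch of the point-reactor-kinetics piece of such a coupled solution is given below, with a single delayed-neutron group and explicit Euler integration; the parameter values are generic illustrations, not Razorback's models or ACRR data:

```python
def point_kinetics(rho, n0=1.0, beta=0.0073, lam=0.08, Lambda=1e-4,
                   dt=1e-5, steps=20000):
    """One-delayed-group point reactor kinetics, explicit Euler.

    dn/dt = ((rho - beta)/Lambda) * n + lam * C
    dC/dt = (beta/Lambda) * n - lam * C

    rho: reactivity, beta: delayed fraction, lam: precursor decay constant,
    Lambda: neutron generation time.  Returns n after steps*dt seconds.
    """
    n = n0
    C = beta * n0 / (Lambda * lam)  # start from precursor equilibrium
    for _ in range(steps):
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n += dt * dn
        C += dt * dC
    return n
```

At zero reactivity the equilibrium is preserved, and a small positive (negative) reactivity drives the power up (down), which is the qualitative behaviour any point-kinetics solver must reproduce before coupling in thermal feedback.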

  20. Notes for a Genealogy of Dress Codes and Aestheticizing Technologies in the Colombian School

    OpenAIRE

    Alexánder Aldana Bautista

    2016-01-01

    This article analyzes the construction of the schoolchild through a series of aestheticizing technologies that constitute a child's body in which the aesthetic utopia of the modern school is inscribed. The paper, derived from archaeological–genealogical research on school uniforms and dress codes in the Colombian school during the late twentieth and early twenty-first centuries, revolves around the following questions: What enabled the emergence of some discourses about school ...

  1. DOWNFLOW code and LIDAR technology for lava flow analysis and hazard assessment at Mount Etna

    OpenAIRE

    Alessandro Fornaciai; Simone Tarquini; Massimiliano Favalli

    2011-01-01

    The DOWNFLOW probabilistic lava-flow simulation code and airborne light detection and ranging (LIDAR) technology are combined to analyze the emplacement of compound lava flow fields at Mount Etna (Sicily, Italy). The goal was to assess the hazard posed by lava flows. The LIDAR-derived time series acquired during the 2006 Mount Etna eruption records the changing topography of an active lava-flow field. These short-time-interval, high-resolution topographic surveys provide a detailed...

  2. The Mistral base case to validate kinetic and fluid turbulence transport codes of the edge and SOL plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Dif-Pradalier, G., E-mail: gdifpradalier@ucsd.edu [Center for Astrophysics and Space Sciences, UCSD, La Jolla, CA 92093 (United States); Gunn, J. [CEA, IRFM, F-13108 Saint Paul lez Durance (France); Ciraolo, G. [M2P2, UMR 6181-CNRS, 38 Rue F. Joliot-Curie, 13451 Marseille (France); Chang, C.S. [Courant Institute of Mathematical Sciences, N.Y. University, New York, NY 10012 (United States); Chiavassa, G. [M2P2, UMR 6181-CNRS, 38 Rue F. Joliot-Curie, 13451 Marseille (France); Diamond, P. [Center for Astrophysics and Space Sciences, UCSD, La Jolla, CA 92093 (United States); Fedorczak, N. [CEA, IRFM, F-13108 Saint Paul lez Durance (France); Ghendrih, Ph., E-mail: philippe.ghendrih@cea.fr [CEA, IRFM, F-13108 Saint Paul lez Durance (France); Isoardi, L. [M2P2, UMR 6181-CNRS, 38 Rue F. Joliot-Curie, 13451 Marseille (France); Kocan, M. [CEA, IRFM, F-13108 Saint Paul lez Durance (France); Ku, S. [Courant Institute of Mathematical Sciences, N.Y. University, New York, NY 10012 (United States); Serre, E. [M2P2, UMR 6181-CNRS, 38 Rue F. Joliot-Curie, 13451 Marseille (France); Tamain, P. [CEA, IRFM, F-13108 Saint Paul lez Durance (France)

    2011-08-01

    Experimental data from the Tore Supra experiments are extrapolated in the SOL and edge to investigate the Kelvin-Helmholtz instability. The linear analysis indicates that a large part of the SOL is rather unstable. The effort is part of the set-up of the Mistral base case that is organised to validate the codes and address new issues on turbulent edges, including the comparison of kinetic and fluid modelling in the edge plasma.

  3. SIGACE Code for Generating High-Temperature ACE Files; Validation and Benchmarking

    Science.gov (United States)

    Sharma, Amit R.; Ganesan, S.; Trkov, A.

    2005-05-01

    A code named SIGACE has been developed as a tool for MCNP users within the scope of a research contract awarded by the Nuclear Data Section of the International Atomic Energy Agency (IAEA) (Ref: 302-F4-IND-11566 B5-IND-29641). A new recipe has been evolved for generating high-temperature ACE files for use with the MCNP code. Under this scheme the low-temperature ACE file is first converted to an ENDF-formatted file using the ACELST code and then Doppler broadened, essentially limited to the data in the resolved resonance region, to any desired higher temperature using SIGMA1. The SIGACE code then generates a high-temperature ACE file for use with the MCNP code. A thinning routine has also been introduced in the SIGACE code for reducing the size of the ACE files. The SIGACE code and the recipe for generating ACE files at higher temperatures have been applied to the SEFOR fast reactor benchmark problem (the sodium-cooled fast reactor benchmark described in the ENDF-202/BNL-19302, 1974 document). The calculated Doppler coefficient is in good agreement with the experimental value. A similar calculation using ACE files generated directly with the NJOY system also agrees with our SIGACE-computed results. The SIGACE code and the recipe are further applied to the numerical benchmark configurations of selected idealized PWR pin cells with five different fuel enrichments as reported by Mosteller and Eisenhart. The SIGACE code, which has been tested with several FENDL/MC files, will be available, free of cost, upon request, from the Nuclear Data Section of the IAEA.
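    The temperature effect that the SIGMA1 step computes can be illustrated, though not reproduced exactly (SIGMA1 performs exact kernel broadening of pointwise cross sections), by convolving a 0 K resonance shape with a Gaussian of Doppler width: the peak drops and the resonance widens while the area under it is preserved. All numbers below are in arbitrary illustrative units:

```python
import math

def doppler_broaden(energies, xs, delta):
    """Convolve a 0 K cross-section shape with a normalized Gaussian kernel
    of Doppler width `delta` (a Voigt-profile picture of broadening, not the
    exact SIGMA1 algorithm).  Uniform energy grid assumed."""
    de = energies[1] - energies[0]
    norm = de / (delta * math.sqrt(math.pi))
    out = []
    for e in energies:
        acc = 0.0
        for ep, s in zip(energies, xs):
            acc += s * math.exp(-((e - ep) / delta) ** 2)
        out.append(acc * norm)
    return out
```

Running this on a Lorentzian resonance shows the two signatures of Doppler broadening that the SEFOR Doppler-coefficient benchmark ultimately probes: a lower, wider peak at higher temperature with the reaction-rate area conserved.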

  4. Validation of the BISON 3D Fuel Performance Code: Temperature Comparisons for Concentrically and Eccentrically Located Fuel Pellets

    Energy Technology Data Exchange (ETDEWEB)

    J. D. Hales; D. M. Perez; R. L. Williamson; S. R. Novascone; B. W. Spencer

    2013-03-01

    BISON is a modern finite-element-based nuclear fuel performance code that has been under development at the Idaho National Laboratory (USA) since 2009. The code is applicable to both steady and transient fuel behaviour and is used to analyse either 2D axisymmetric or 3D geometries. BISON has been applied to a variety of fuel forms including LWR fuel rods, TRISO-coated fuel particles, and metallic fuel in both rod and plate geometries. Code validation is currently in progress, principally by comparison to instrumented LWR fuel rods. Halden IFA experiments constitute a large percentage of the current BISON validation base. The validation emphasis here is on centreline temperatures at the beginning of fuel life, with comparisons made to seven rods from the IFA-431 and IFA-432 assemblies. The principal focus is IFA-431 Rod 4, which included concentrically and eccentrically located fuel pellets. This experiment provides an opportunity to explore 3D thermomechanical behaviour and assess the 3D simulation capabilities of BISON. Analysis results agree with experimental results, showing lower fuel centreline temperatures for eccentric fuel with the peak temperature shifted away from the centreline. The comparison confirms with modern 3D analysis tools that the measured temperature difference between concentric and eccentric pellets is not an artefact, and provides a quantitative explanation for the difference.
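    For the concentric case, predicted centreline temperatures can be sanity-checked against the textbook conduction result for a cylinder with uniform volumetric heating, ΔT = q'/(4πk), where q' is the linear power and k the fuel conductivity; eccentric pellets break this symmetry, which is precisely why 3D analysis is needed. The values below are illustrative, not Halden data:

```python
import math

def centreline_temp(t_surface, linear_power, k_fuel):
    """Steady-state centreline temperature of a concentric cylindrical pellet
    with uniform volumetric heating and constant conductivity:
        T_cl = T_s + q' / (4*pi*k)
    t_surface in K, linear_power in W/m, k_fuel in W/(m*K)."""
    return t_surface + linear_power / (4.0 * math.pi * k_fuel)

# Illustrative numbers: 20 kW/m rod power, k = 3 W/(m*K), 700 K pellet surface
t_cl = centreline_temp(700.0, 20e3, 3.0)
```

Note that the concentric ΔT is independent of pellet radius, so deviations from it in the eccentric measurements carry genuine geometric information rather than being a normalization artefact.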

  5. Validity of the International Classification of Diseases 10th revision code for hospitalisation with hyponatraemia in elderly patients

    Science.gov (United States)

    Gandhi, Sonja; Shariff, Salimah Z; Fleet, Jamie L; Weir, Matthew A; Jain, Arsh K; Garg, Amit X

    2012-01-01

    Objective To evaluate the validity of the International Classification of Diseases, 10th Revision (ICD-10) diagnosis code for hyponatraemia (E87.1) in two settings: at presentation to the emergency department and at hospital admission. Design Population-based retrospective validation study. Setting Twelve hospitals in Southwestern Ontario, Canada, from 2003 to 2010. Participants Patients aged 66 years and older with serum sodium laboratory measurements at presentation to the emergency department (n=64 581) and at hospital admission (n=64 499). Main outcome measures Sensitivity, specificity, positive predictive value and negative predictive value comparing various ICD-10 diagnostic coding algorithms for hyponatraemia to serum sodium laboratory measurements (reference standard). Median serum sodium values comparing patients who were code positive and code negative for hyponatraemia. Results The sensitivity of hyponatraemia (defined by a serum sodium ≤132 mmol/l) for the best-performing ICD-10 coding algorithm was 7.5% at presentation to the emergency department (95% CI 7.0% to 8.2%) and 10.6% at hospital admission (95% CI 9.9% to 11.2%). Both specificities were greater than 99%. In the two settings, the positive predictive values were 96.4% (95% CI 94.6% to 97.6%) and 82.3% (95% CI 80.0% to 84.4%), while the negative predictive values were 89.2% (95% CI 89.0% to 89.5%) and 87.1% (95% CI 86.8% to 87.4%). In patients who were code positive for hyponatraemia, the median (IQR) serum sodium measurements were 123 (119–126) mmol/l and 125 (120–130) mmol/l in the two settings. In code negative patients, the measurements were 138 (136–140) mmol/l and 137 (135–139) mmol/l. Conclusions The ICD-10 diagnostic code for hyponatraemia differentiates between two groups of patients with distinct serum sodium measurements at both presentation to the emergency department and at hospital admission. However, these codes underestimate the true incidence of hyponatraemia
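    The four validity measures above follow from a 2x2 comparison of code positivity against the serum-sodium reference standard. A sketch with synthetic data (the threshold of 132 mmol/l is the one used in the study; the patient values are invented):

```python
def code_validity(sodium, coded, threshold=132):
    """Sensitivity, specificity, PPV and NPV of an ICD-10 hyponatraemia code
    against the laboratory reference standard (sodium <= threshold mmol/l)."""
    tp = fp = fn = tn = 0
    for na, flag in zip(sodium, coded):
        truth = na <= threshold
        if flag and truth:
            tp += 1
        elif flag and not truth:
            fp += 1
        elif not flag and truth:
            fn += 1
        else:
            tn += 1
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }
```

The pattern the study reports, near-perfect specificity and PPV alongside single-digit sensitivity, arises when coders flag only the most severe cases: almost every coded patient truly has hyponatraemia, but most true cases go uncoded.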

  6. Development and Validation of Information Technology Mentor Teacher Attitude Scale: A Pilot Study

    Science.gov (United States)

    Saltan, Fatih

    2015-01-01

    The aim of this study was the development and validation of a teacher attitude scale toward Information Technology Mentor Teachers (ITMTs). ITMTs give technological support to other teachers for the integration of technology into their lessons. In the literature, many instruments have been developed to measure teachers' attitudes towards technological tools…

  7. Testing Technologies for Software Code

    Institute of Scientific and Technical Information of China (English)

    金大海; 宫云战; 王雅文; 黄俊飞

    2015-01-01

    Code defect mode detection and automatic test data generation are two typical white-box testing technologies. Although many tools implement them, their underlying principles and practical effectiveness differ considerably. Based on in-depth research into defect modes and test-case generation technology, this paper presents an effective software code testing technology. First, it introduces defect mode detection from four aspects: technical features, technical architecture, defect modes, and key technologies. Second, it describes automatic test data generation from three aspects: technical architecture, coverage criteria, and related technologies. Finally, the two technologies are applied to two domestic tools, the software defect testing system (DTS) and the automatic code testing system (CTS), respectively, and achieve good results.
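    The internals of DTS are not described in the abstract, but the defect-mode idea, statically matching source code against a catalogue of known fault patterns, can be illustrated with a tiny checker built on Python's `ast` module, here flagging the classic mutable-default-argument defect:

```python
import ast

def find_mutable_defaults(source):
    """Flag one well-known defect mode: a mutable literal ([], {}, set
    literal) used as a function default value.  Returns (name, line) pairs."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # Positional and keyword-only defaults; kw_defaults may hold None.
            for default in list(node.args.defaults) + list(node.args.kw_defaults):
                if isinstance(default, (ast.List, ast.Dict, ast.Set)):
                    findings.append((node.name, default.lineno))
    return findings
```

A production defect-mode engine differs mainly in scale: it carries hundreds of such patterns plus data-flow analysis, but each individual rule is a recognizer of this shape.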

  8. A passive optical network based on optical code division multiplexing and time division multiple access technology

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A passive optical network (PON) scheme based on optical code division multiplexing (OCDM) for the downstream traffic is proposed and analyzed in detail. In the PON, the downstream traffic is broadcast using OCDM technology to guarantee security, while the upstream traffic passes through the same optical fiber using the common time division multiple access (TDMA) technology to decrease the cost. This scheme, denoted OCDM/TDMA-PON, can be applied to an optical access network (OAN) with full services on demand, such as Internet protocol, video on demand, tele-presence and high-quality audio. The proposed OCDM/TDMA-PON scheme combines the advantages of PON, TDMA, and OCDM technology. Simulation results indicate that the designed scheme improves the OAN performance, and enhances the flexibility and scalability of the system.
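    The downstream OCDM broadcast can be illustrated with a chip-synchronous, on-off-keyed toy model: each receiver owns a sparse chip signature (hand-picked here with pairwise cross-correlation 1, in the spirit of optical orthogonal codes; not the paper's actual code set), optical intensities add on the shared fibre, and each receiver correlates against its own signature and thresholds at the code weight:

```python
def transmit(bits, signature):
    """On-off keyed spreading: send the signature chips for a 1, darkness for a 0."""
    return [chip if bit else 0 for bit in bits for chip in signature]

def receive(channel, signature, weight):
    """Correlate each chip slot against the local signature; decide 1 when the
    correlation reaches the code weight."""
    L = len(signature)
    out = []
    for i in range(0, len(channel), L):
        corr = sum(ch for ch, c in zip(channel[i:i + L], signature) if c)
        out.append(1 if corr >= weight else 0)
    return out
```

Because the signatures overlap in at most one chip, multi-user interference can raise a correlation by at most the number of other active users, and the weight-3 threshold still separates the users cleanly; security comes from a receiver without the signature seeing only a superposition of chips.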

  9. On the implementation of new technology modules for fusion reactor systems codes

    Energy Technology Data Exchange (ETDEWEB)

    Franza, F., E-mail: fabrizio.franza@kit.edu [Institute of Neutron Physics and Reactor Technology, Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen, 76344 (Germany); Boccaccini, L.V.; Fisher, U. [Institute of Neutron Physics and Reactor Technology, Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen, 76344 (Germany); Gade, P.V.; Heller, R. [Institute for Technical Physics, Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen, 76344 (Germany)

    2015-10-15

    Highlights: • At KIT, new technology modules for systems codes are under development. • A new algorithm for the definition of the main reactor components is defined. • A new blanket model based on 1D neutronics analysis is described. • A new TF coil stress model based on 3D electromagnetic analysis is described. • The models were successfully benchmarked against more detailed models. - Abstract: In the frame of the pre-conceptual design of the next-generation fusion power plant (DEMO), systems codes have been used for nearly 20 years. In such computational tools the main reactor components (e.g. plasma, blanket, magnets, etc.) are integrated into a unique computational algorithm and simulated by means of rather simplified mathematical models (e.g. steady-state and zero-dimensional models). The systems code tries to identify the main design parameters (e.g. major radius, net electrical power, toroidal field) and to ensure that the reactor's requirements and constraints are simultaneously satisfied. In fusion applications, requirements and constraints can be of either a physics or a technology kind. Concerning the latter category, a new modelling activity has recently been launched at the Karlsruhe Institute of Technology, aiming to develop improved models for the main technology areas, such as neutronics, thermal-hydraulics, electromagnetics, structural mechanics, the fuel cycle and vacuum systems. These activities started with the development of: (1) a geometry model for the definition of poloidal profiles of the main reactor components, (2) a blanket model based on neutronics analyses, and (3) a toroidal field coil model based on electromagnetic analysis, focusing first on stress calculations. The objective of this paper is therefore to give a short outline of these models.

  10. Use of an Accurate DNS Particulate Flow Method to Supply and Validate Boundary Conditions for the MFIX Code

    Energy Technology Data Exchange (ETDEWEB)

    Zhi-Gang Feng

    2012-05-31

    The simulation of particulate flows for industrial applications often requires the use of two-fluid models, where the solid particles are treated as a separate continuous phase. One of the underlying uncertainties in the use of two-fluid models in multiphase computations comes from the boundary condition of the solid phase. Typically, the gas or liquid fluid boundary condition at a solid wall is the so-called no-slip condition, which has been widely accepted as valid for single-phase fluid dynamics provided that the Knudsen number is low. However, the boundary condition for the solid phase is not well understood. The no-slip condition at a solid boundary is not a valid assumption for the solid phase. Instead, several researchers advocate a slip condition as a more appropriate boundary condition. However, the question of selecting an exact slip length or slip velocity coefficient is still unanswered. Experimental or numerical simulation data are needed in order to determine the slip boundary condition that is applicable to a two-fluid model. The goal of this project is to improve the performance and accuracy of the boundary conditions used in two-fluid models such as the MFIX code, which is frequently used in multiphase flow simulations. The specific objectives of the project are to use first principles embedded in a validated Direct Numerical Simulation particulate flow numerical program, which uses the Immersed Boundary method (DNS-IB) and the Direct Forcing scheme, in order to establish, modify and validate the needed energy and momentum boundary conditions for the MFIX code. To achieve these objectives, we have developed a highly efficient DNS code and conducted numerical simulations to investigate the particle-wall and particle-particle interactions in particulate flows. Most of our research findings have been reported in major conferences and archived journals, which are listed in Section 7 of this report. In this report, we will present a
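
    The slip condition discussed above can be made concrete with a toy example. The sketch below is illustrative only (it is not MFIX or DNS-IB code): it uses the closed-form plane Couette profile under a Navier partial-slip wall, u(y) = U (y + L_s)/(h + L_s), where slip length L_s = 0 recovers the classic no-slip wall.

```python
import numpy as np

def couette_profile(n, h, u_top, slip_length):
    # Steady plane Couette flow with a Navier partial-slip bottom wall:
    # u(0) = L_s * du/dy|0, giving u(y) = u_top * (y + L_s) / (h + L_s).
    y = np.linspace(0.0, h, n)
    return y, u_top * (y + slip_length) / (h + slip_length)

y, u_noslip = couette_profile(5, 1.0, 1.0, 0.0)   # L_s = 0: classic no-slip wall
y, u_slip = couette_profile(5, 1.0, 1.0, 0.5)     # L_s = 0.5: wall velocity u(0) = 1/3
```

    A positive slip length yields a finite wall velocity; that wall velocity (or equivalently the slip coefficient) is exactly the quantity a two-fluid model's solid-phase boundary condition must supply.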

  11. Validation of vortex code viscous models using lidar wake measurements and CFD

    DEFF Research Database (Denmark)

    Branlard, Emmanuel; Machefaux, Ewan; Gaunaa, Mac;

    2014-01-01

    The newly implemented vortex code Omnivor, coupled to the aero-servo-elastic tool hawc2, is described in this paper. Vortex wake improvements through the implementation of viscous effects are considered. Different viscous models are implemented and compared with each other. Turbulent flow fields with sheared inflow are used to compare the vortex code performance with CFD and lidar measurements. Laminar CFD computations are used to evaluate the performance of the viscous models. Consistent results between the vortex code and the CFD tool are obtained up to three diameters downstream. The modelling of viscous boundaries appears more important than the modelling of viscosity in the wake. External turbulence and shear appear sufficient, but their full potential-flow modelling would be preferred.

  12. TRACE code validation for BWR spray cooling injection based on GOTA facility experiments

    Energy Technology Data Exchange (ETDEWEB)

    Racca, S. [San Piero a Grado Nuclear Research Group (GRNSPG), Pisa (Italy); Kozlowski, T. [Royal Inst. of Tech., Stockholm (Sweden)

    2011-07-01

    Best estimate codes have been used over the past thirty years for the design, licensing and safety analysis of NPPs. Nevertheless, large efforts are necessary for the qualification and assessment of such codes. The aim of this work is to study the main phenomena involved in the emergency spray cooling injection in a Swedish-designed BWR. For this purpose, data from the Swedish separate effect test facility GOTA have been simulated using TRACE version 5.0 Patch 2. Furthermore, uncertainty calculations have been performed with the propagation of input errors method, and the input parameters that most influence the peak cladding temperature have been identified. (author)
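
    The propagation-of-input-errors method referred to above samples the uncertain inputs and bounds the output of interest with an order-statistics (Wilks) tolerance limit. A minimal sketch, with a made-up single-input stand-in for a TRACE run (all numbers illustrative):

```python
import random

def wilks_sample_size(gamma=0.95, beta=0.95):
    # Smallest n such that the sample maximum of n runs bounds the gamma-quantile
    # of the output with confidence beta: 1 - gamma**n >= beta (first-order Wilks).
    n = 1
    while 1.0 - gamma ** n < beta:
        n += 1
    return n

def toy_pct(u):
    # Stand-in for one code run: peak cladding temperature (K) as a function of
    # one normalized uncertain input in [-1, 1]. Entirely made up.
    return 900.0 + 150.0 * u + 30.0 * u * u

n_runs = wilks_sample_size()                     # 59 for a one-sided 95%/95% limit
rng = random.Random(0)
pcts = [toy_pct(rng.uniform(-1.0, 1.0)) for _ in range(n_runs)]
pct_95_95 = max(pcts)                            # upper tolerance limit estimate
```

    The attraction of the method is that the required number of runs (59 for a one-sided 95%/95% statement) is independent of how many uncertain inputs are sampled.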

  13. Validation of a personalized dosimetric evaluation tool (Oedipe) for targeted radiotherapy based on the Monte Carlo MCNPX code.

    Science.gov (United States)

    Chiavassa, S; Aubineau-Lanièce, I; Bitar, A; Lisbona, A; Barbet, J; Franck, D; Jourdain, J R; Bardiès, M

    2006-02-07

    Dosimetric studies are necessary for all patients treated with targeted radiotherapy. In order to attain the precision required, we have developed Oedipe, a dosimetric tool based on the MCNPX Monte Carlo code. The anatomy of each patient is considered in the form of a voxel-based geometry created using computed tomography (CT) images or magnetic resonance imaging (MRI). Oedipe enables dosimetry studies to be carried out at the voxel scale. Validation of the results obtained by comparison with existing methods is complex because there are multiple sources of variation: calculation methods (different Monte Carlo codes, point kernel), patient representations (model or specific) and geometry definitions (mathematical or voxel-based). In this paper, we validate Oedipe by taking each of these parameters into account independently. Monte Carlo methodology requires long calculation times, particularly in the case of voxel-based geometries, and this is one of the limits of personalized dosimetric methods. However, our results show that the use of voxel-based geometry as opposed to a mathematically defined geometry decreases the calculation time two-fold, due to an optimization of the MCNPX2.5e code. It is therefore possible to envisage the use of Oedipe for personalized dosimetry in the clinical context of targeted radiotherapy.
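
    The voxel-based geometry idea can be sketched as a binning of CT numbers (Hounsfield units) into coarse material indices, which is what a voxel-based Monte Carlo geometry consumes. The thresholds below are hypothetical, not Oedipe's:

```python
import numpy as np

def hu_to_material(ct_hu):
    # Map CT Hounsfield units to coarse material indices for a voxel geometry.
    # Thresholds are illustrative only: 0 = air/lung, 1 = soft tissue, 2 = bone.
    mat = np.full(ct_hu.shape, 1, dtype=np.int8)
    mat[ct_hu < -400] = 0
    mat[ct_hu > 300] = 2
    return mat

ct = np.array([[-800, 40],
               [500, -100]])
mats = hu_to_material(ct)        # [[0 1] [2 1]]
```

    A real tool would also assign a density to each voxel and write the lattice in the Monte Carlo code's input format.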

  14. Reliability and predictive validity of the Food Technology Neophobia Scale.

    Science.gov (United States)

    Evans, G; Kermarrec, C; Sable, T; Cox, D N

    2010-04-01

    The recently developed Food Technology Neophobia Scale (FTNS) was further tested to assess scale reliability. On 2 occasions, 131 consumers responded to the FTNS, technology descriptions and 'willingness to try' food technologies for 7 products. In the second session, they were offered foods to taste. 'Information seeking' was measured as a potential confounder of stability. The intra-class correlation was 0.86 and there was no difference between the FTNS scores (p>0.05). Correlations with 'willingness to try' novel technologies were -0.39 to -0.58. The FTNS is confirmed as a reliable and predictive measure of responses to novel food technologies.
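
    Test-retest stability of the kind reported above can be sketched numerically. The abstract reports an intra-class correlation; the plain Pearson correlation below is only a rough proxy for it (ICC additionally penalizes mean shifts between sessions), and the score data are fictional:

```python
import math

def pearson(x, y):
    # Plain Pearson correlation between two score vectors; a rough proxy for
    # the intra-class correlation used in the reliability study.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

session1 = [42, 55, 61, 48, 70]    # made-up FTNS totals, first administration
session2 = [44, 53, 63, 47, 69]    # same (fictional) respondents at retest
r = pearson(session1, session2)    # close to 1 indicates a stable scale
```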

  15. A Vision of Quantitative Imaging Technology for Validation of Advanced Flight Technologies

    Science.gov (United States)

    Horvath, Thomas J.; Kerns, Robert V.; Jones, Kenneth M.; Grinstead, Jay H.; Schwartz, Richard J.; Gibson, David M.; Taylor, Jeff C.; Tack, Steve; Dantowitz, Ronald F.

    2011-01-01

    Flight-testing is traditionally an expensive but critical element in the development and ultimate validation and certification of technologies destined for future operational capabilities. Measurements obtained in relevant flight environments also provide unique opportunities to observe flow phenomena that are often beyond the capabilities of ground testing facilities and computational tools to simulate or duplicate. However, the challenges of minimizing vehicle weight and internal complexity as well as instrumentation bandwidth limitations often restrict the ability to make high-density, in-situ measurements with discrete sensors. Remote imaging offers a potential opportunity to noninvasively obtain such flight data in a complementary fashion. The NASA Hypersonic Thermodynamic Infrared Measurements Project has demonstrated such a capability to obtain calibrated thermal imagery on a hypersonic vehicle in flight. Through the application of existing and accessible technologies, the acreage surface temperature of the Shuttle lower surface was measured during reentry. Future hypersonic cruise vehicles, launcher configurations and reentry vehicles will, however, challenge current remote imaging capability. As NASA embarks on the design and deployment of a new Space Launch System architecture for access beyond earth orbit (and the commercial sector focused on low earth orbit), an opportunity exists to implement an imagery system and its supporting infrastructure that provides sufficient flexibility to incorporate changing technology to address the future needs of the flight test community. A long term vision is offered that supports the application of advanced multi-waveband sensing technology to aid in the development of future aerospace systems and critical technologies to enable highly responsive vehicle operations across the aerospace continuum, spanning launch, reusable space access and global reach. Motivations for development of an Agency level imagery

  16. Experimental validation of the DPM Monte Carlo code using minimally scattered electron beams in heterogeneous media

    Science.gov (United States)

    Chetty, Indrin J.; Moran, Jean M.; Nurushev, Teamor S.; McShan, Daniel L.; Fraass, Benedick A.; Wilderman, Scott J.; Bielajew, Alex F.

    2002-06-01

    A comprehensive set of measurements and calculations has been conducted to investigate the accuracy of the Dose Planning Method (DPM) Monte Carlo code for electron beam dose calculations in heterogeneous media. Measurements were made using 10 MeV and 50 MeV minimally scattered, uncollimated electron beams from a racetrack microtron. Source distributions for the Monte Carlo calculations were reconstructed from in-air ion chamber scans and then benchmarked against measurements in a homogeneous water phantom. The in-air spatial distributions were found to have FWHM of 4.7 cm and 1.3 cm, at 100 cm from the source, for the 10 MeV and 50 MeV beams respectively. Energy spectra for the electron beams were determined by simulating the components of the microtron treatment head using the code MCNP4B. Profile measurements were made using an ion chamber in a water phantom with slabs of lung or bone-equivalent materials submerged at various depths. DPM calculations are, on average, within 2% agreement with measurement for all geometries except for the 50 MeV incident on a 6 cm lung-equivalent slab. Measurements using approximately monoenergetic, 50 MeV, 'pencil-beam'-type electrons in heterogeneous media provide conditions for maximum electronic disequilibrium and hence present a stringent test of the code's electron transport physics; the agreement noted between calculation and measurement illustrates that the DPM code is capable of accurate dose calculation even under such conditions.

  17. Validity of diagnostic codes and laboratory measurements to identify patients with idiopathic acute liver injury in a hospital database

    DEFF Research Database (Denmark)

    Udo, Renate; Maitland-van der Zee, Anke H; Egberts, Toine C G;

    2016-01-01

    PURPOSE: The development and validation of algorithms to identify cases of idiopathic acute liver injury (ALI) are essential to facilitate epidemiologic studies on drug-induced liver injury. The aim of this study is to determine the ability of diagnostic codes and laboratory measurements to identify ALI cases, using (I) algorithms combining diagnostic codes with liver enzyme values (ALT > 2× upper limit of normal (ULN); AST > 1ULN + AP > 1ULN + bilirubin > 1ULN; ALT > 3ULN; ALT > 3ULN + bilirubin > 2ULN; ALT > 10ULN) and (II) algorithms based solely on liver enzyme values (ALT > 3ULN + bilirubin > 2ULN; ALT > 10ULN). Hospital medical records were reviewed. The positive predictive value (PPV) of the (I) algorithms ranged from 32% (13/41) to 48% (43/90), with the highest PPV found with ALT > 2ULN. The PPV of the (II) algorithms based solely on liver test abnormalities was at most 26% (150/571). CONCLUSIONS: The algorithm based on ICD-9-CM codes indicative of ALI combined with abnormal liver-related laboratory tests is the most...

  18. Notes for a Genealogy of Dress Codes and Aestheticizing Technologies in the Colombian School

    Directory of Open Access Journals (Sweden)

    Alexánder Aldana Bautista

    2016-08-01

    Full Text Available This article presents an analysis of the schoolchild's construction through a series of aestheticizing technologies that constitute a child's body in which the aesthetic utopia of the modern school is inscribed. The paper, derived from archaeological–genealogical research on school uniforms and dress codes in the Colombian school during the late twentieth century and the early twenty-first century, revolves around the following questions: What enabled the emergence of certain discourses about school bodies, appropriate appearance and attire in the Colombian school? How did the school subject become a properly uniformed, seemly, neat, respectful and beautiful person?

  19. Offshore Code Comparison Collaboration (OC3) for IEA Wind Task 23 Offshore Wind Technology and Deployment

    Energy Technology Data Exchange (ETDEWEB)

    Jonkman, J.; Musial, W.

    2010-12-01

    This final report for IEA Wind Task 23, Offshore Wind Energy Technology and Deployment, is made up of two separate reports, Subtask 1: Experience with Critical Deployment Issues and Subtask 2: Offshore Code Comparison Collaborative (OC3). Subtask 1 discusses ecological issues and regulation, electrical system integration, external conditions, and key conclusions for Subtask 1. Subtask 2, included here, is the larger of the two volumes and contains five chapters that cover background information and objectives of Subtask 2 and results from each of the four phases of the project.

  20. TELEMEDICINE NETWORK BASED ON CODE-DIVISION MULTIPLE ACCESS WIRELESS TECHNOLOGY

    Institute of Scientific and Technical Information of China (English)

    同鸣; 卞正中; 张亮

    2003-01-01

    Objective To satisfy the need for reliable and efficient telemedicine in many mobile and ambulance situations. Methods A practical telemedicine system based on code-division multiple access (CDMA) wireless communication technology, not previously reported, has been developed. The design scheme for the proposed system is described and a detailed analysis of the network protocol stacks and the data flow is presented. Results Experiments on real-time transmission of medical images using the developed system have demonstrated that the system performance is satisfactory and acceptable. Conclusion The telemedicine system based on CDMA is easy to implement and has high quality of transmitted images. It would be a prospective solution for mobile telemedicine.

  1. Validation of lattice code 'EXCEL' with TIC experiments on uniform and regularly perturbed lattices

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishna, A., E-mail: anantatmula.ramakrishna@gmail.co [Atomic Energy Regulatory Board, Niyamak Bhavan, Anushaktinagar, Mumbai 400 094 (India); Jagannathan, V. [Light Water Reactors Physics Section, Reactor Physics Design Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Jain, R.P. [IIT Bombay, Mumbai (India)

    2010-12-15

    Temporary International Collective (TIC) was established in 1972 by an agreement among seven countries, namely, Bulgaria, Czechoslovakia, Germany, Hungary, Poland, Romania and the Union of Soviet Socialist Republics. The main objective of TIC was to provide the experimental data for the reactor physics analysis of water cooled and water moderated power reactors (WWER). Extensive experimental work for different core configurations was carried out by TIC countries to investigate the physics behaviour of WWER lattices and the results were published in TIC volumes. Two VVER-1000 MWe reactors are currently in an advanced stage of construction and due for commissioning in Kudankulam, Tamil Nadu, India. Indigenous development of in-core fuel management computer codes for the analysis of hexagonal lattice cores is also in an advanced stage to address various design, operation and safety issues of VVER type cores. The validation against the above TIC lattice experiments will help in the identification of deficiencies in reactor physics design computational codes and the associated nuclear data libraries. In this paper, TIC experiments on uniform and regularly perturbed lattices have been analyzed as part of the validation of the indigenous computer codes EXCEL, TRIHEX-FA and HEXPIN developed at the Light Water Reactors Physics Section, B.A.R.C. Neutron-nuclear multi-group cross-section libraries in WIMS/D format in 69/172 energy groups have been released by the IAEA at the conclusion of the WIMS library update project (WLUP). In the present study we have used libraries based on the ENDF/B-6, ENDF/B-7, JEFF3.1 and JENDL3.2 evaluated nuclear datasets. The results of the theoretical analyses bring out the performance of the code system and the various cross-section libraries.

  2. Validation of an Instrument to Measure Students' Motivation and Self-Regulation towards Technology Learning

    Science.gov (United States)

    Liou, Pey-Yan; Kuo, Pei-Jung

    2014-01-01

    Background: Few studies have examined students' attitudinal perceptions of technology. There is no appropriate instrument to measure senior high school students' motivation and self-regulation toward technology learning among the current existing instruments in the field of technology education. Purpose: The present study aims to validate an…

  3. Validation of a GPU-based Monte Carlo code (gPMC) for proton radiation therapy: clinical cases study

    Science.gov (United States)

    Giantsoudi, Drosoula; Schuemann, Jan; Jia, Xun; Dowdell, Stephen; Jiang, Steve; Paganetti, Harald

    2015-03-01

    Monte Carlo (MC) methods are recognized as the gold standard for dose calculation; however, they have not yet replaced analytical methods due to their lengthy calculation times. GPU-based applications allow MC dose calculations to be performed on time scales comparable to conventional analytical algorithms. This study focuses on validating our GPU-based MC code for proton dose calculation (gPMC) against an experimentally validated multi-purpose MC code (TOPAS) and comparing their performance for clinical patient cases. Clinical cases from five treatment sites were selected covering the full range from very homogeneous patient geometries (liver) to patients with high geometrical complexity (air cavities and density heterogeneities in head-and-neck and lung patients) and from short beam range (breast) to large beam range (prostate). Both gPMC and TOPAS were used to calculate 3D dose distributions for all patients. Comparisons were performed based on target coverage indices (mean dose, V95, D98, D50, D02) and gamma index distributions. Dosimetric indices differed less than 2% between TOPAS and gPMC dose distributions for most cases. Gamma index analysis with 1%/1 mm criterion resulted in a passing rate of more than 94% of all patient voxels receiving more than 10% of the mean target dose, for all patients except for prostate cases. Although clinically insignificant, gPMC resulted in systematic underestimation of target dose for prostate cases by 1-2% compared to TOPAS. Correspondingly the gamma index analysis with 1%/1 mm criterion failed for most beams for this site, while for 2%/1 mm criterion passing rates of more than 94.6% of all patient voxels were observed. For the same initial number of simulated particles, calculation time for a single beam for a typical head and neck patient plan decreased from 4 CPU hours per million particles (2.8-2.9 GHz Intel X5600) for TOPAS to 2.4 s per million particles (NVIDIA TESLA C2075) for gPMC. Excellent agreement was
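
    The gamma index used above combines a dose-difference tolerance with a distance-to-agreement tolerance. A simplified 1-D global version is sketched below (illustrative only; clinical tools work in 3-D, normalize to a prescription dose, and interpolate on a fine grid):

```python
import numpy as np

def gamma_1d(ref, ev, x, dose_tol, dist_tol):
    # Global 1-D gamma: for each reference point, the minimum over evaluated
    # points of sqrt((dose diff / dose_tol)^2 + (distance / dist_tol)^2).
    g = np.empty(len(ref))
    for i, (xi, di) in enumerate(zip(x, ref)):
        dd = (ev - di) / dose_tol
        dx = (x - xi) / dist_tol
        g[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return g

x = np.linspace(0.0, 10.0, 101)        # depth in cm
ref = np.exp(-0.1 * x)                 # reference depth-dose (arbitrary units)
ev = np.exp(-0.1 * (x - 0.05))         # evaluated dose, shifted by 0.5 mm
g = gamma_1d(ref, ev, x, dose_tol=0.01, dist_tol=0.1)   # 1% / 1 mm criterion
passing_rate = float((g <= 1.0).mean())
```

    A point passes when gamma is at most 1; here the 0.5 mm shift is within the 1 mm distance tolerance, so every point passes.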

  4. RELAP-7 Software Verification and Validation Plan - Requirements Traceability Matrix (RTM) Part 2: Code Assessment Strategy, Procedure, and RTM Update

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. Then, the Requirements Traceability Matrices (RTMs) proposed in previous document (INL-EXT-15-36684) are updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.

  5. TRIPOLI-4{sup ®} Monte Carlo code ITER A-lite neutronic model validation

    Energy Technology Data Exchange (ETDEWEB)

    Jaboulay, Jean-Charles, E-mail: jean-charles.jaboulay@cea.fr [CEA, DEN, Saclay, DM2S, SERMA, F-91191 Gif-sur-Yvette (France); Cayla, Pierre-Yves; Fausser, Clement [MILLENNIUM, 16 Av du Québec Silic 628, F-91945 Villebon sur Yvette (France); Damian, Frederic; Lee, Yi-Kang; Puma, Antonella Li; Trama, Jean-Christophe [CEA, DEN, Saclay, DM2S, SERMA, F-91191 Gif-sur-Yvette (France)

    2014-10-15

    3D Monte Carlo transport codes are extensively used in neutronic analysis, especially in radiation protection and shielding analyses for fission and fusion reactors. TRIPOLI-4{sup ®} is a Monte Carlo code developed by CEA. The aim of this paper is to show its capability to model a large-scale fusion reactor with a complex neutron source and geometry. A benchmark between MCNP5 and TRIPOLI-4{sup ®} on the ITER A-lite model was carried out; neutron flux, nuclear heating in the blankets and tritium production rate in the European TBMs were evaluated and compared. The methodology to build the TRIPOLI-4{sup ®} A-lite model is based on MCAM and the MCNP A-lite model. Simplified TBMs, from KIT, were integrated in the equatorial port. A good agreement between MCNP and TRIPOLI-4{sup ®} is shown; discrepancies are mainly within the statistical error.

  7. Validation of a multidimensional deterministic nuclear data sensitivity and uncertainty code system: an application needing supercomputing

    Energy Technology Data Exchange (ETDEWEB)

    Bidaud, A.; Mastrangelo, V. [Conservatoire National des Arts et Metiers, Laboratoire de Physique (CNAM), 75 - Paris (France); Institut de Physique Nucleaire (IN2P3/CNRS) 91 - Orsay (France); Kodeli, I.; Sartori, E. [OECD NEA Data Bank, 92 - Issy les Moulineaux (France)

    2003-07-01

    The quality of nuclear core modelling is linked to the quality of basic nuclear data, such as the probabilities of reaction (i.e. cross sections) between neutrons and the nuclei of the core materials. Perturbation theory, whose applications in nuclear science were largely developed in the sixties, provides tools for estimating the sensitivity of integral parameters such as k-eff, reaction rates, or breeding ratio to the cross sections. Computation with these tools requires approximations in the simulation of the space, angle and energy dependent neutron transport. To minimise the impact of geometry modelling approximations in the calculation, the use of 3-dimensional multigroup transport codes is recommended. Sensitivity and uncertainty analyses are the tools needed to estimate the accuracy that a code system with data libraries can achieve. They can guide users as to the specific need for improved data to carry out reliable simulations. However, as full-scale models in 3 dimensions with refined descriptions of the phase space are used, high performance computers and codes designed to run on parallel architectures are needed to obtain results within acceptable time limits.
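
    The notion of a sensitivity coefficient can be illustrated on a toy one-group model. The sketch below is a numerical stand-in (central finite differences) for the adjoint-based coefficients of perturbation theory, with made-up cross-section values:

```python
def k_inf(nu_sigma_f, sigma_a):
    # One-group, infinite-medium multiplication factor (toy model).
    return nu_sigma_f / sigma_a

def rel_sensitivity(f, args, name, rel=1e-6):
    # Relative sensitivity S = (x/k) dk/dx of output f to parameter `name`,
    # estimated with a central finite difference.
    x0 = args[name]
    k0 = f(**args)
    up = dict(args, **{name: x0 * (1.0 + rel)})
    dn = dict(args, **{name: x0 * (1.0 - rel)})
    dk_dx = (f(**up) - f(**dn)) / (2.0 * x0 * rel)
    return x0 * dk_dx / k0

xs = {"nu_sigma_f": 0.0035, "sigma_a": 0.0030}
s_fission = rel_sensitivity(k_inf, xs, "nu_sigma_f")   # analytically +1
s_capture = rel_sensitivity(k_inf, xs, "sigma_a")      # analytically -1
```

    For this model a 1% increase in the fission source raises k by 1%, and a 1% increase in absorption lowers it by 1%; real sensitivity codes produce such coefficients per nuclide, reaction and energy group.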

  8. Validation of Framework Code Approach to a Life Prediction System for Fiber Reinforced Composites

    Science.gov (United States)

    Gravett, Phillip

    1997-01-01

    The grant was conducted by the MMC Life Prediction Cooperative, an industry/government collaborative team; Ohio Aerospace Institute (OAI) acted as the prime contractor on behalf of the Cooperative for this grant effort. See Figure I for the organization and responsibilities of team members. The technical effort was conducted during the period August 7, 1995 to June 30, 1996 in cooperation with Erwin Zaretsky, the LERC Program Monitor. Phil Gravett of Pratt & Whitney was the principal technical investigator. Table I documents all meeting-related coordination memos during this period. The effort under this grant was closely coordinated with an existing USAF sponsored program focused on putting into practice a life prediction system for turbine engine components made of metal matrix composites (MMC). The overall architecture of the MMC life prediction system was defined in the USAF sponsored program (prior to this grant). The efforts of this grant were focused on implementing and tailoring the life prediction system, the framework code within it and the damage modules within it to meet the specific requirements of the Cooperative. The tailoring of the life prediction system provides the basis for pervasive and continued use of this capability by the industry/government cooperative. The outputs of this grant are: 1. Definition of the framework code to analysis modules interfaces, 2. Definition of the interface between the materials database and the finite element model, and 3. Definition of the integration of the framework code into an FEM design tool.

  9. Validation of the HZETRN code for laboratory exposures with 1A GeV iron ions in several targets.

    Science.gov (United States)

    Walker, S A; Tweed, J; Wilson, J W; Cucinotta, F A; Tripathi, R K; Blattnig, S; Zeitlin, C; Heilbronn, L; Miller, J

    2005-01-01

    A new version of the HZETRN code capable of validation with HZE ions in either the laboratory or the space environment is under development. The computational model consists of the lowest order asymptotic approximation followed by a Neumann series expansion with non-perturbative corrections. The physical description includes energy loss with straggling, nuclear attenuation, nuclear fragmentation with energy dispersion and downshift. Measurements to test the model were performed at the Alternating Gradient Synchrotron and the NASA Space Radiation Laboratory at Brookhaven National Laboratory with iron ions. Surviving beam particles and produced fragments were measured with solid-state detectors. Beam analysis software has been written to relate the computational results to the measured energy loss spectra of the incident ions for rapid validation of modeled target transmission functions.
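
    The nuclear attenuation term in such transport models has a simple closed form for the surviving primary fluence. An illustrative sketch with a made-up mean free path (not HZETRN's data):

```python
import math

def surviving_fraction(depth, mfp):
    # Fraction of primary ions that have not undergone a nuclear interaction
    # after traversing `depth` (g/cm2): N(x)/N0 = exp(-x / lambda).
    return math.exp(-depth / mfp)

# Illustrative only: a mean free path of ~10 g/cm2 for heavy ions in a light target.
f_5 = surviving_fraction(5.0, 10.0)
f_10 = surviving_fraction(10.0, 10.0)    # one mean free path: a 1/e fraction survives
```

    Validation against the measured surviving-beam spectra described above amounts to checking this attenuation (and the corresponding fragment build-up) against the solid-state detector data.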

  10. Euler Technology Assessment program for preliminary aircraft design employing SPLITFLOW code with Cartesian unstructured grid method

    Science.gov (United States)

    Finley, Dennis B.

    1995-01-01

    This report documents results from the Euler Technology Assessment program. The objective was to evaluate the efficacy of Euler computational fluid dynamics (CFD) codes for use in preliminary aircraft design. Both the accuracy of the predictions and the rapidity of calculations were to be assessed. This portion of the study was conducted by Lockheed Fort Worth Company, using a recently developed in-house Cartesian-grid code called SPLITFLOW. The Cartesian grid technique offers several advantages for this study, including ease of volume grid generation and reduced number of cells compared to other grid schemes. SPLITFLOW also includes grid adaptation of the volume grid during the solution convergence to resolve high-gradient flow regions. This proved beneficial in resolving the large vortical structures in the flow for several configurations examined in the present study. The SPLITFLOW code predictions of the configuration forces and moments are shown to be adequate for preliminary design analysis, including predictions of sideslip effects and the effects of geometry variations at low and high angles of attack. The time required to generate the results from initial surface definition is on the order of several hours, including grid generation, which is compatible with the needs of the design environment.

  11. Decay heat measurement on fusion reactor materials and validation of calculation code system

    Energy Technology Data Exchange (ETDEWEB)

    Maekawa, Fujio; Ikeda, Yujiro; Wada, Masayuki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-03-01

    Decay heat rates for 32 fusion reactor relevant materials irradiated with 14-MeV neutrons were measured for cooling times between 1 minute and 400 days. Using this experimental database, the validity of decay heat calculation systems for fusion reactors was investigated. (author)
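
    Decay heat calculation systems of the kind validated here rest on the summation method over the activation inventory. A minimal sketch with a fictional two-nuclide inventory (all values made up for illustration):

```python
import math

def decay_heat(t_s, inventory):
    # Summation method: P(t) = sum_i lambda_i * N_i(0) * E_i * exp(-lambda_i t).
    # inventory entries: (half_life_s, atoms_at_shutdown, energy_per_decay_J)
    p = 0.0
    for half_life, n0, e_decay in inventory:
        lam = math.log(2.0) / half_life
        p += lam * n0 * e_decay * math.exp(-lam * t_s)
    return p

# Two entirely fictional activation products (half-lives of 1 h and 1 day).
inventory = [(3600.0, 1.0e18, 1.6e-13), (86400.0, 5.0e17, 8.0e-14)]
p_shutdown = decay_heat(0.0, inventory)      # W at end of irradiation
p_1h = decay_heat(3600.0, inventory)         # W after one hour of cooling
```

    Measured decay heat curves over cooling times from minutes to hundreds of days, as in this experiment, test both the computed inventory N_i and the decay data lambda_i, E_i.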

  12. Improving reliability of non-volatile memory technologies through circuit level techniques and error control coding

    Science.gov (United States)

    Yang, Chengen; Emre, Yunus; Cao, Yu; Chakrabarti, Chaitali

    2012-12-01

    Non-volatile resistive memories, such as phase-change RAM (PRAM) and spin transfer torque RAM (STT-RAM), have emerged as promising candidates because of their fast read access, high storage density, and very low standby power. Unfortunately, in scaled technologies, high storage density comes at the price of lower reliability. In this article, we first study in detail the causes of errors for PRAM and STT-RAM. We see that while for multi-level cell (MLC) PRAM the errors are due to resistance drift, in STT-RAM they are due to process variations and variations in the device geometry. We develop error models to capture these effects and propose techniques based on tuning of circuit-level parameters to mitigate some of these errors. Unfortunately, circuit-level techniques alone are not sufficient for reliable memory operation, and so we propose error control coding (ECC) techniques that can be used on top of them. We show that for STT-RAM, a combination of voltage boosting and write pulse width adjustment at the circuit level followed by a BCH-based ECC scheme can reduce the block failure rate (BFR) to 10^-8. For MLC-PRAM, a combination of threshold resistance tuning and a BCH-based product code ECC scheme can achieve the same target BFR of 10^-8. The product code scheme is flexible; it allows migration to a stronger code to guarantee the same target BFR when the raw bit error rate increases with the number of programming cycles.
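
    The link between raw bit error rate, the correction power t of a BCH code, and block failure rate can be written down directly under an i.i.d. bit-error assumption. The block length and error rate below are illustrative, not the article's measured values:

```python
from math import comb

def block_failure_rate(n, t, p):
    # Probability that a length-n block suffers more than t bit errors when raw
    # bit errors are i.i.d. with rate p, i.e. exceeds a t-error-correcting code.
    p_ok = sum(comb(n, i) * p ** i * (1.0 - p) ** (n - i) for i in range(t + 1))
    return 1.0 - p_ok

# Illustrative only: a 511-bit BCH block at a raw bit error rate of 1e-4.
bfr_weak = block_failure_rate(511, 2, 1e-4)
bfr_strong = block_failure_rate(511, 5, 1e-4)   # stronger code, lower BFR
```

    This is also why the product-code scheme's migration path works: as the raw error rate grows with programming cycles, raising t restores the target block failure rate.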

  13. DOWNFLOW code and LIDAR technology for lava flow analysis and hazard assessment at Mount Etna

    Directory of Open Access Journals (Sweden)

    Alessandro Fornaciai

    2011-12-01

    The use of a lava-flow simulation (DOWNFLOW) probabilistic code and airborne light detection and ranging (LIDAR) technology are combined to analyze the emplacement of compound lava flow fields at Mount Etna (Sicily, Italy). The goal was to assess the hazard posed by lava flows. The LIDAR-derived time series acquired during the 2006 Mount Etna eruption records the changing topography of an active lava-flow field. These short-time-interval, high-resolution topographic surveys provide a detailed quantitative picture of the topographic changes. The results highlight how the flow field evolves as a number of narrow (5-15 m wide) disjointed flow units that are fed simultaneously by uneven lava pulses that advance within formed channels. These flow units have widely ranging advance velocities (3-90 m/h). Overflows, bifurcations and braiding are also clearly displayed. In such a complex scenario, the suitability of deterministic codes for lava-flow simulation can be hampered by the fundamental difficulty of measuring the flow parameters (e.g. the lava discharge rate, or the lava viscosity) of a single flow unit. However, the DOWNFLOW probabilistic code approaches this point statistically and needs no direct knowledge of flow parameters. DOWNFLOW intrinsically accounts for complexities and perturbations of lava flows by randomly varying the pre-eruption topography. This DOWNFLOW code is systematically applied here over Mount Etna to derive a lava-flow hazard map based on: (i) the topography of the volcano; (ii) the probability density function for vent opening; and (iii) a law for the expected lava-flow length for all of the computational vents considered. Changes in the hazard due to the recent morphological evolution of Mount Etna have also been addressed.
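The probabilistic idea behind DOWNFLOW can be sketched in miniature: perturb a digital elevation model randomly, follow the steepest-descent path from a vent, and count how often each cell is invaded. Everything below (the grid, vent location, and perturbation amplitude `dh`) is a toy assumption, not the actual code:

```python
import random

def steepest_descent_path(dem, start, max_steps=500):
    """Follow the locally steepest downhill neighbour until a pit is reached."""
    rows, cols = len(dem), len(dem[0])
    r, c = start
    path = [(r, c)]
    for _ in range(max_steps):
        nbrs = [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr or dc) and 0 <= r + dr < rows and 0 <= c + dc < cols]
        best = min(nbrs, key=lambda rc: dem[rc[0]][rc[1]])
        if dem[best[0]][best[1]] >= dem[r][c]:
            break                      # local pit: flow stops
        r, c = best
        path.append((r, c))
    return path

def hazard_map(dem, vent, n_runs=200, dh=1.0, seed=1):
    """Monte Carlo in the DOWNFLOW spirit: perturb the DEM randomly and
    count how often each cell is invaded.  dh is the perturbation amplitude."""
    rng = random.Random(seed)
    rows, cols = len(dem), len(dem[0])
    hits = [[0] * cols for _ in range(rows)]
    for _ in range(n_runs):
        noisy = [[z + rng.uniform(-dh, dh) for z in row] for row in dem]
        for r, c in steepest_descent_path(noisy, vent):
            hits[r][c] += 1
    return [[h / n_runs for h in row] for row in hits]

# Tiny synthetic slope dipping away from the vent (illustrative only).
dem = [[10.0 - r + 0.1 * c for c in range(8)] for r in range(8)]
hz = hazard_map(dem, vent=(0, 3))
print(hz[0][3], hz[7][0])
```

The real code additionally constrains runs by an expected flow-length law and a vent-opening probability density, as listed in the abstract.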

  14. Analysis of the technology acceptance model in examining hospital nurses' behavioral intentions toward the use of bar code medication administration.

    Science.gov (United States)

    Song, Lunar; Park, Byeonghwa; Oh, Kyeung Mi

    2015-04-01

    Serious medication errors continue to exist in hospitals, even though there is technology that could potentially eliminate them, such as bar code medication administration. Little is known about the degree to which the culture of patient safety is associated with behavioral intention to use bar code medication administration. Based on the Technology Acceptance Model, this study evaluated the relationships among patient safety culture, perceived usefulness, perceived ease of use, and behavioral intention to use bar code medication administration technology among nurses in hospitals. Cross-sectional surveys with a convenience sample of 163 nurses using bar code medication administration were conducted. Feedback and communication about errors had a positive impact in predicting perceived usefulness (β=.26, P<…). In the model predicting behavioral intention, age had a negative impact (β=-.17, P<…), and the model explained 24% (P<…) of the variance in intention to use the technology.

  15. Calibration/Validation Technology for the CO2 Satellite Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We are proposing to develop high altitude CO2 analyzer technology that can be deployed on the research aircraft of NASA's Airborne Science Program (ASP). The...

  16. Field Validation of Visual Cleaning Performance Indicator (VCPI) Technology

    Science.gov (United States)

    2007-08-31

    test panels. Panel sets included 2024-T3 aluminum alloy sheet (Air Force platform), and primer-coated HY80 steel alloy (Navy platform). 1. Weight...Cleaning Performance Indicator (VCPI) technology as a means to verify surface cleanliness on aluminum and painted steel alloys. The VCPI technology...the surface cleanliness of unpainted structures fabricated from aluminum and steel alloys. In concept, the VCPI technology represents an innovative

  17. Validation and Verification of MCNP6 Against Intermediate and High-Energy Experimental Data and Results by Other Codes

    CERN Document Server

    Mashnik, Stepan G

    2010-01-01

    MCNP6, the latest and most advanced LANL transport code, representing a recent merger of MCNP5 and MCNPX, has been validated and verified (V&V) against a variety of intermediate- and high-energy experimental data and against results by different versions of MCNPX and other codes. In the present work, we V&V MCNP6 using mainly the latest modifications of the Cascade-Exciton Model (CEM) and of the Los Alamos version of the Quark-Gluon String Model (LAQGSM) event generators, CEM03.02 and LAQGSM03.03. We found that MCNP6 describes reasonably well various reactions induced by particles and nuclei at incident energies from 18 MeV to about 1 TeV per nucleon, measured on thin and thick targets, and agrees very well with similar results obtained with MCNPX and with calculations by CEM03.02, LAQGSM03.01 (03.03), INCL4 + ABLA, Bertini INC + Dresner evaporation, EPAX, ABRABLA, HIPSE, and AMD, used as stand-alone codes. Most of several computational bugs and more serious physics problems observed in MCNP6/X during our V...

  18. Validation of Heat Transfer and Film Cooling Capabilities of the 3-D RANS Code TURBO

    Science.gov (United States)

    Shyam, Vikram; Ameri, Ali; Chen, Jen-Ping

    2010-01-01

    The capabilities of the 3-D unsteady RANS code TURBO have been extended to include heat transfer and film cooling applications. The results of simulations performed with the modified code are compared to experiment and to theory, where applicable. Wilcox's k-ω turbulence model has been implemented to close the RANS equations. Two simulations are conducted: (1) flow over a flat plate and (2) flow over an adiabatic flat plate cooled by one hole inclined at 35° to the free stream. For (1), agreement with theory is found to be excellent for heat transfer, represented by the local Nusselt number, and quite good for momentum, as represented by the local skin friction coefficient. This report compares the local skin friction coefficients and Nusselt numbers on a flat plate obtained using Wilcox's k-ω model with the theory of Blasius. The study looks at laminar and turbulent flows over an adiabatic flat plate and over an isothermal flat plate for two different wall temperatures. It is shown that TURBO is able to accurately predict heat transfer on a flat plate. For (2), TURBO shows good qualitative agreement with film cooling experiments performed on a flat plate with one cooling hole. Quantitatively, film effectiveness is underpredicted downstream of the hole.
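For case (1), the laminar flat-plate reference theory can be evaluated directly. The Blasius skin-friction result and the corresponding constant-wall-temperature Nusselt correlation below are textbook formulas; the Reynolds and Prandtl numbers are illustrative:

```python
import math

def cf_laminar(re_x):
    """Local skin friction coefficient, Blasius solution (laminar flat plate)."""
    return 0.664 / math.sqrt(re_x)

def nu_laminar(re_x, pr):
    """Local Nusselt number, laminar flat plate with constant wall temperature."""
    return 0.332 * math.sqrt(re_x) * pr ** (1.0 / 3.0)

re_x, pr = 1.0e5, 0.71          # illustrative air-like conditions
print(f"Cf = {cf_laminar(re_x):.4e},  Nu = {nu_laminar(re_x, pr):.1f}")
```

A code-validation check of the kind reported compares the CFD-computed local Cf and Nu at each x-station against these closed-form values.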

  19. Validation of Printed Circuit Heat Exchanger Design Code KAIST{sub H}XD

    Energy Technology Data Exchange (ETDEWEB)

    Baik, Seungjoon; Kim, Seong Gu; Lee, Jekyoung; Lee, Jeong Ik [KAIST, Daejeon (Korea, Republic of)

    2015-05-15

    A supercritical carbon dioxide (S-CO{sub 2}) Brayton cycle has been suggested for the SFR due to the relatively mild sodium-CO{sub 2} interaction. The S-CO{sub 2} power conversion cycle can achieve not only high safety but also high efficiency with the SFR core thermal condition. However, due to the dramatic property change near the critical point, the inlet pressure and temperature conditions of the compressor can have a significant effect on the overall cycle efficiency. To maintain the compressor inlet condition, a sensitive precooler control system is required for stable operation. Therefore, understanding the precooler performance is essential for the S-CO{sub 2} power conversion system. According to the experimental results, the designed PCHE showed high effectiveness in various operating regions. Comparing the experimental and the design data, the heat transfer performance estimation showed less than 6% error. On the other hand, the pressure drop estimation showed a large gap. The water-side pressure drop showed 50-70% underestimation. Because the form losses were not included in the design code, the water-side pressure drop estimation seems reliable. However, the CO{sub 2} side showed more than 70% overestimation of the pressure drop from the code. The authors suspect that the differences may have been caused by the channel corner shape: the real channel has round corners and smooth edges, but the correlation is based on a sharp-edged zig-zag channel. Further studies are required to understand and interpret the results correctly in the future.
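A minimal sketch of the kind of design-versus-experiment comparison reported: compute a hot-side effectiveness and the signed relative error of a code prediction against a measurement. The operating temperatures and duties are hypothetical, not the KAIST test data:

```python
def effectiveness(t_hot_in, t_hot_out, t_cold_in):
    """Hot-side effectiveness of a counter-flow PCHE, assuming the hot stream
    has the smaller capacity rate (a simplifying assumption for the sketch)."""
    return (t_hot_in - t_hot_out) / (t_hot_in - t_cold_in)

def rel_error(predicted, measured):
    """Signed relative error of a design-code prediction, in percent."""
    return 100.0 * (predicted - measured) / measured

# Illustrative, made-up operating point.
eps = effectiveness(t_hot_in=120.0, t_hot_out=35.0, t_cold_in=25.0)
print(f"effectiveness       = {eps:.3f}")
print(f"heat duty error     = {rel_error(98.0, 100.0):+.1f} %")
print(f"CO2-side dP error   = {rel_error(17.0, 10.0):+.1f} %")
```

In this vocabulary, the abstract's "less than 6% error" on heat transfer and ">70% overestimation" on the CO2-side pressure drop are exactly such relative-error figures.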

  20. Validation of PV-RPM Code in the System Advisor Model.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Geoffrey Taylor [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lavrova, Olga [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-04-01

    This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.
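The failure/repair estimation idea can be sketched for a single component whose time-to-failure and time-to-repair are exponentially distributed. The MTBF/MTTR values are illustrative assumptions; the actual PV-RPM library supports richer, data-derived distributions:

```python
import random

def simulate_failures(mtbf_h, mttr_h, horizon_h, n_runs=2000, seed=7):
    """Monte Carlo estimate of the expected number of failures of one
    component over a horizon, alternating exponential up-time (MTBF) and
    exponential repair time (MTTR) -- a simplification of PV-RPM."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_runs):
        t = 0.0
        while True:
            t += rng.expovariate(1.0 / mtbf_h)   # run until the next failure
            if t >= horizon_h:
                break
            total += 1
            t += rng.expovariate(1.0 / mttr_h)   # down for repair
    return total / n_runs

# Illustrative inverter-like numbers: MTBF 40,000 h, MTTR 48 h, 20-year horizon.
print(f"expected failures: {simulate_failures(4.0e4, 48.0, 175200.0):.2f}")
```

A validation of the kind described amounts to running two independent implementations of such a simulation with the same distributions and checking the estimated failure counts agree.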

  1. Design and implementation of safety traceability system for candied fruits based on two-dimension code technology

    Directory of Open Access Journals (Sweden)

    ZHAO Kun

    2014-12-01

    Traceability is a basic principle of food safety. A food safety traceability system for candied fruits based on QR code and cloud computing technology is introduced in this paper. First, we introduce QR code technology and the concept of traceability. Then, through a field investigation, we analyze the traceability process. We then present the design of the system and its database, and study the consumer-facing presentation technology. Finally, we describe how traceability information is collected, transmitted, and ultimately presented, and discuss the expected future development of the traceability system.

  2. The Sandia MEMS Passive Shock Sensor : FY08 testing for functionality, model validation, and technology readiness.

    Energy Technology Data Exchange (ETDEWEB)

    Walraven, Jeremy Allen; Blecke, Jill; Baker, Michael Sean; Clemens, Rebecca C.; Mitchell, John Anthony; Brake, Matthew Robert; Epp, David S.; Wittwer, Jonathan W.

    2008-10-01

    This report summarizes the functional, model validation, and technology readiness testing of the Sandia MEMS Passive Shock Sensor in FY08. Functional testing of a large number of revision 4 parts showed robust and consistent performance. Model validation testing helped tune the models to match data well and identified several areas for future investigation related to high frequency sensitivity and thermal effects. Finally, technology readiness testing demonstrated the integrated elements of the sensor under realistic environments.

  3. Validation of WIMS-AECL/(MULTICELL)/RFSP code system using wolsong units 3, 4 phase-B measurement data

    Energy Technology Data Exchange (ETDEWEB)

    Lyou, Seok Jean; Hong, In Seob; Min, Byung Joo; Suk, Ho Chun

    2000-07-01

    The object of this study is to evaluate and validate the WIMS-AECL/(MULTICELL)/RFSP code system for the nuclear design and analysis of advanced CANDU fuel cores. Wolsong units 3 and 4 phase-B physics test data were used for the assessment of the code system. The code system used in this study is the same as the one used in the previous post-simulation of Wolsong unit 2. As a result, the predicted boron worth was consistent with the measured data within 1% (0.08 mk/ppm). The predicted total worth of the reactivity control devices generally agreed with the measured data, but the individual worths tended to differ between the predicted and measured values. The moderator temperature coefficient was calculated to be lower than the measured value by about 0.13-0.23 mk. Synthesizing the results of the Wolsong units 2, 3 and 4 post-simulations, the difference between the predicted and measured critical boron concentrations was caused by the worth of the reactivity devices and structural materials, and the difference between the predicted and measured individual worths of strong absorbers, such as the shutoff rods and mechanical control absorbers, indicates that a detailed simulation of the core structure is needed. The difference between the predicted and measured moderator temperature coefficient is more likely attributable to the measured data than to calculation error. To improve the accuracy of the values predicted by the WIMS-AECL/(MULTICELL)/RFSP code system, a sensitivity test should be performed on the mesh structure surrounding the strong absorbers, and more accurate conditions during the moderator temperature coefficient measurement should be recorded.

  4. Integral and Separate Effects Tests for Thermal Hydraulics Code Validation for Liquid-Salt Cooled Nuclear Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, Per

    2012-10-30

    The objective of the 3-year project was to collect integral effects test (IET) data to validate the RELAP5-3D code and other thermal hydraulics codes for use in predicting the transient thermal hydraulics response of liquid salt cooled reactor systems, including integral transient response for forced and natural circulation operation. The reference system for the project is a modular, 900-MWth Pebble Bed Advanced High Temperature Reactor (PB-AHTR), a specific type of Fluoride salt-cooled High temperature Reactor (FHR). Two experimental facilities were developed for thermal-hydraulic integral effects tests (IETs) and separate effects tests (SETs). The facilities use simulant fluids for the liquid fluoride salts, with very little distortion to the heat transfer and fluid dynamics behavior. The CIET Test Bay facility was designed, built, and operated. IET data for steady state and transient natural circulation was collected. SET data for convective heat transfer in pebble beds and straight channel geometries was collected. The facility continues to be operational and will be used for future experiments, and for component development. The CIET 2 facility is larger in scope, and its construction and operation has a longer timeline than the duration of this grant. The design for the CIET 2 facility has drawn heavily on the experience and data collected on the CIET Test Bay, and it was completed in parallel with operation of the CIET Test Bay. CIET 2 will demonstrate start-up and shut-down transients and control logic, in addition to LOFC and LOHS transients, and buoyant shut down rod operation during transients. Design of the CIET 2 Facility is complete, and engineering drawings have been submitted to an external vendor for outsourced quality controlled construction. CIET 2 construction and operation continue under another NEUP grant. IET data from both CIET facilities is to be used for validation of system codes used for FHR modeling, such as RELAP5-3D. A set of

  5. Development and validation of burnup dependent computational schemes for the analysis of assemblies with advanced lattice codes

    Science.gov (United States)

    Ramamoorthy, Karthikeyan

    The main aim of this research is the development and validation of computational schemes for advanced lattice codes. The advanced lattice code which forms the primary part of this research is "DRAGON Version4". The code has unique features like self shielding calculation with capabilities to represent distributed and mutual resonance shielding effects, leakage models with space-dependent isotropic or anisotropic streaming effect, availability of the method of characteristics (MOC), burnup calculation with reaction-detailed energy production etc. Qualified reactor physics codes are essential for the study of all existing and envisaged designs of nuclear reactors. Any new design would require a thorough analysis of all the safety parameters and burnup dependent behaviour. Any reactor physics calculation requires the estimation of neutron fluxes in various regions of the problem domain. The calculation goes through several levels before the desired solution is obtained. Each level of the lattice calculation has its own significance and any compromise at any step will lead to poor final result. The various levels include choice of nuclear data library and energy group boundaries into which the multigroup library is cast; self shielding of nuclear data depending on the heterogeneous geometry and composition; tracking of geometry, keeping error in volume and surface to an acceptable minimum; generation of regionwise and groupwise collision probabilities or MOC-related information and their subsequent normalization thereof, solution of transport equation using the previously generated groupwise information and obtaining the fluxes and reaction rates in various regions of the lattice; depletion of fuel and of other materials based on normalization with constant power or constant flux. Of the above mentioned levels, the present research will mainly focus on two aspects, namely self shielding and depletion. 
The behaviour of the system is determined by composition of resonant

  6. 3D Measurement Technology by Structured Light Using Stripe-Edge-Based Gray Code

    Science.gov (United States)

    Wu, H. B.; Chen, Y.; Wu, M. Y.; Guan, C. R.; Yu, X. Y.

    2006-10-01

    The key problem of 3D vision measurement using the triangulation method based on structured light is to acquire the projection angle of the projected light accurately. In order to acquire the projection angle, and thereby determine the correspondence between sampling points and image points, a method for encoding and decoding structured light based on the stripe edges of Gray codes is presented. The method encodes with Gray code stripes and decodes with stripe edges acquired by sub-pixel technology instead of the pixel centre, so the last-bit decoding error is removed. The accuracy of image sampling point location, and the correspondence between image sampling points and object sampling points, reach sub-pixel level. In addition, the measurement error caused by dividing the projection angle irregularly by even-width encoding stripes was analysed and corrected. The encoding and decoding principle and the decoding equations are described. Finally, 3ds Max and Matlab software were used to simulate the measurement system and reconstruct the measured surface. As indicated by the experimental results, the measurement error is about 0.05%.
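The binary-reflected Gray code underlying such stripe patterns has the property that adjacent codewords differ in exactly one bit, so a boundary misread perturbs the decoded stripe index by at most one. A minimal encoder/decoder sketch:

```python
def to_gray(n: int) -> int:
    """Binary-reflected Gray code of index n."""
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    """Inverse transform: recover the binary index from a Gray codeword."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Consecutive stripe indices differ in exactly one bit (one projected pattern).
codes = [to_gray(i) for i in range(8)]
print([f"{c:03b}" for c in codes])
assert all(bin(a ^ b).count("1") == 1 for a, b in zip(codes, codes[1:]))
```

In the paper's scheme the decoded index comes from sub-pixel stripe-edge positions rather than pixel centres, but the index arithmetic is this same Gray transform.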

  7. The physics and technology basis entering European system code studies for DEMO

    Science.gov (United States)

    Wenninger, R.; Kembleton, R.; Bachmann, C.; Biel, W.; Bolzonella, T.; Ciattaglia, S.; Cismondi, F.; Coleman, M.; Donné, A. J. H.; Eich, T.; Fable, E.; Federici, G.; Franke, T.; Lux, H.; Maviglia, F.; Meszaros, B.; Pütterich, T.; Saarelma, S.; Snickers, A.; Villone, F.; Vincenzi, P.; Wolff, D.; Zohm, H.

    2017-01-01

    A large scale program to develop a conceptual design for a demonstration fusion power plant (DEMO) has been initiated in Europe. Central elements are the baseline design points, which are developed by system codes. The assessment of the credibility of these design points is often hampered by missing information. The main physics and technology content of the central European system codes has been published (Kovari et al 2014 Fusion Eng. Des. 89 3054-69, 2016 Fusion Eng. Des. 104 9-20, Reux et al 2015 Nucl. Fusion 55 073011). In addition, this publication discusses key input parameters for the pulsed and conservative design option EU DEMO1 2015 and provides justifications for the parameter choices. In this context several DEMO physics gaps are identified, which need to be addressed in the future to reduce the uncertainty in predicting the performance of the device. Also the sensitivities of net electric power and pulse duration to variations of the input parameters are investigated. The most extreme sensitivity is found for the elongation (Δκ95 = 10% corresponds to ΔP_el,net = 125%).

  8. Development of accident management technology and computer codes -A study for nuclear safety improvement-

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Kyu; Jae, Moo Sung; Jo, Young Gyun; Park, Rae Jun; Kim, Jae Hwan; Ha, Jae Ju; Kang, Dae Il; Choi, Sun Young; Kim, Si Hwan [Korea Atomic Energy Res. Inst., Taejon (Korea, Republic of)

    1994-07-01

    We have surveyed new technologies and research results for the accident management of nuclear power plants. Based on the concept of using existing plant capabilities for accident management, both in-vessel and ex-vessel strategies were identified and analyzed. When assessing accident management strategies, their effectiveness, adverse effects, and feasibility must be considered. We have developed a framework for assessing the strategies with these factors in mind, and have applied it to assessing the strategies, including the likelihood that the operator correctly diagnoses the situation and successfully implements the strategies. Finally, the cavity flooding strategy was assessed by applying it to the station blackout sequence, which has been identified as one of the major contributors to risk at the reference plant. The thermohydraulic analyses with sensitivity calculations have been performed using the MAAP4 computer code. (Author).

  9. VATE: VAlidation of high TEchnology based on large database analysis by learning machine

    NARCIS (Netherlands)

    Meldolesi, E; Van Soest, J; Alitto, A R; Autorino, R; Dinapoli, N; Dekker, A; Gambacorta, M A; Gatta, R; Tagliaferri, L; Damiani, A; Valentini, V

    2014-01-01

    The interaction between implementation of new technologies and different outcomes can allow a broad range of researches to be expanded. The purpose of this paper is to introduce the VAlidation of high TEchnology based on large database analysis by learning machine (VATE) project that aims to combine

  10. Heat removal (wetting, heat transfer, T/H, secondary circuit, code validation etc.)

    Energy Technology Data Exchange (ETDEWEB)

    Dury, T.; Siman-Tov, M.

    1996-06-01

    This working group provided a comprehensive list of feasibility and uncertainty issues. Most of the issues seem to fall into the `needed but can be worked out` category. They feel these can be worked out as the project develops. A few issues can be considered critical or feasibility issues (that must be proven to be feasible). Those include: (1) Thermal shock and its mitigation (>1 MW); how to inject the He bubbles (if used) - back pressure into He lines - mercury traces in He lines; how to maintain proper bubble distribution and size (static and dynamic; if used); vibrations and fatigue (dynamic); possibility of cavitation from thermal shock. (2) Wetting and/or non-wetting of mercury on containment walls with or without gases and its effect on heat transfer (and materials). (3) Prediction capabilities in the CFD code; bubbles behavior in mercury (if used) - cross stream turbulence (ESS only) - wetting/non-wetting effects. (4) Cooling of beam `windows`; concentration of local heat deposition at center, especially if beam is of parabolic profile.

  11. New BRC neutron evaluations of actinides with the TALYS code: Modelization and first validation tests

    Directory of Open Access Journals (Sweden)

    Romain P.

    2010-10-01

    The reader may refer to references [1–3,5] for more details. Over the last five years, new evaluations of plutonium and uranium have been performed at Bruyères-le-Châtel (BRC) from the resolved resonance region up to 30 MeV. Only nuclear reaction models have been used to build these evaluations. Total, shape elastic and direct inelastic cross sections are obtained from a coupled channel model using a dispersive optical potential (BRC, [13]) devoted to actinides. All the other cross sections are calculated using the Hauser-Feshbach theory (TALYS code [4]). We take particular care over the fission channel. For uranium isotopes, a triple-humped barrier [3] is required in order to reproduce accurately the variations of the experimental fission cross sections. Contrary to common expectation, we show [5] that the effect of the class II or class III states located in the wells of the aforementioned fission barrier sometimes provides an anti-resonant rather than a resonant transmission. With increasing incident neutron energy, many residual nuclei produced by nucleon emission also lead to fission. All available experimental data assigned to the various fission mechanisms of the same nucleus are used to define its fission barrier parameters. As a result of this approach, we are now able to provide consistent evaluations for a large series of isotopes. Of course, our new evaluations have been tested against integral data.

  12. The Motivational Interviewing Treatment Integrity Code (MITI 4): Rationale, Preliminary Reliability and Validity.

    Science.gov (United States)

    Moyers, Theresa B; Rowell, Lauren N; Manuel, Jennifer K; Ernst, Denise; Houck, Jon M

    2016-06-01

    The Motivational Interviewing Treatment Integrity code has been revised to address new evidence-based elements of motivational interviewing (MI). This new version (MITI 4) includes new global ratings to assess the clinician's attention to client language, increased rigor in assessing autonomy support and client choice, and items to evaluate the use of persuasion when giving information and advice. Four undergraduate, non-professional raters were trained in the MITI and used it to review 50 audiotapes of clinicians conducting MI in actual treatment sessions. Both kappa and intraclass correlation indices were calculated for all coders, for the best rater pair, and for a 20% randomly selected sample from the best rater pair. Reliability across raters, with the exception of Emphasize Autonomy and % Complex Reflections, was in the good-to-excellent range. Reliability estimates decrease when smaller samples are used and when fewer raters contribute. The advantages and drawbacks of this revision are discussed, including implications for research and clinical applications. The MITI 4.0 represents a reliable method for assessing the integrity of MI, including both the technical and relational components of the method. Copyright © 2016 Elsevier Inc. All rights reserved.
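Of the two agreement indices mentioned, Cohen's kappa is straightforward to compute from paired categorical codes: observed agreement corrected for the agreement expected by chance. The behaviour codes below are hypothetical, not MITI data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical codes."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / n**2
    return (observed - expected) / (1.0 - expected)

# Illustrative behaviour codes from two hypothetical raters.
a = ["Q", "R", "R", "Q", "GI", "R", "Q", "R", "GI", "Q"]
b = ["Q", "R", "Q", "Q", "GI", "R", "Q", "R", "GI", "GI"]
print(f"kappa = {cohens_kappa(a, b):.3f}")
```

The intraclass correlation used for the continuous global ratings follows a different (variance-components) formula, but the chance-correction logic is analogous.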

  13. A Global Approach to the Physics Validation of Simulation Codes for Future Nuclear Systems

    Energy Technology Data Exchange (ETDEWEB)

    Giuseppe Palmiotti; Massimo Salvatores; Gerardo Aliberti; Hikarui Hiruta; R. McKnight; P. Oblozinsky; W. S. Yang

    2008-09-01

    This paper presents a global approach to the validation of the parameters that enter into the neutronics simulation tools for advanced fast reactors with the objective to reduce the uncertainties associated to crucial design parameters. This global approach makes use of sensitivity/uncertainty methods; statistical data adjustments; integral experiment selection, analysis and “representativity” quantification with respect to a reference system; scientifically based cross section covariance data and appropriate methods for their use in multigroup calculations. This global approach has been applied to the uncertainty reduction on the criticality of the Advanced Burner Reactor, (both metal and oxide core versions) presently investigated in the frame of the GNEP initiative. The results obtained are very encouraging and allow to indicate some possible improvements of the ENDF/B-VII data file.

  14. Validation of the solar heating and cooling high speed performance (HISPER) computer code

    Science.gov (United States)

    Wallace, D. B.

    1980-10-01

    Developed to give quick and accurate predictions, HISPER, a simplification of the TRNSYS program, achieves its computational speed by not simulating detailed system operations or performing detailed load computations. In order to validate the HISPER code for air systems, the simulation was compared to the actual performance of an operational test site. Solar insolation, ambient temperature, water usage rate, and water main temperatures from the data tapes for an office building in Huntsville, Alabama were used as input. The HISPER program was found to predict the heating loads and the solar fraction of the loads with errors of less than ten percent. Good correlation was found on both a seasonal and a monthly basis. Several parameters (such as the infiltration rate and the outside ambient temperature above which heating is not required) were found to require careful selection for accurate simulation.
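The ten-percent validation criterion reduces to a simple percent-error check of predicted against measured values. The monthly load pairs below are hypothetical, not the Huntsville data:

```python
def percent_error(predicted, measured):
    """Absolute percent error of a prediction against a measurement."""
    return 100.0 * abs(predicted - measured) / measured

# Hypothetical monthly heating loads in GJ: (HISPER prediction, measured).
monthly = [(41.0, 43.2), (38.5, 36.9), (25.1, 26.7), (12.4, 13.0)]
errors = [percent_error(p, m) for p, m in monthly]
print([f"{e:.1f}%" for e in errors])
assert all(e < 10.0 for e in errors)   # the <10 % acceptance criterion
```

The same check applied to the seasonal totals and to the solar fraction gives the seasonal-basis and monthly-basis correlations the abstract refers to.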

  15. Initial validation of 4D-model for a clinical PET scanner using the Monte Carlo code gate

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Igor F.; Lima, Fernando R.A.; Gomes, Marcelo S., E-mail: falima@cnen.gov.b [Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil); Vieira, Jose W.; Pacheco, Ludimila M. [Instituto Federal de Educacao, Ciencia e Tecnologia (IFPE), Recife, PE (Brazil); Chaves, Rosa M. [Instituto de Radium e Supervoltagem Ivo Roesler, Recife, PE (Brazil)

    2011-07-01

    Building exposure computational models (ECM) of emission tomography (PET and SPECT) can currently draw on several dedicated computing tools based on Monte Carlo techniques (SimSET, SORTEO, SIMIND, GATE). This paper is divided into two steps: (1) using the dedicated code GATE (Geant4 Application for Tomographic Emission) to build a 4D model (where the fourth dimension is time) of a clinical PET scanner from General Electric, the GE ADVANCE, simulating the geometric and electronic structures suitable for this scanner, as well as some 4D phenomena, for example the rotating gantry; (2) evaluating the performance of the model built here in reproducing the noise equivalent count rate (NEC) test based on NEMA Standards Publication NU 2-2007 for this tomograph. The results for steps (1) and (2) will be compared with experimental and theoretical values from the literature, showing the actual state of the art of validation. (author)
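The NEC figure of merit from NEMA NU 2 is computed directly from the trues, scatters and randoms count rates. The count rates below are illustrative, not GE ADVANCE measurements, and the randoms factor k is a modelling choice:

```python
def nec_rate(trues, scatters, randoms, k=2):
    """Noise-equivalent count rate (counts/s): T^2 / (T + S + kR).
    k = 2 for delayed-window randoms subtraction, k = 1 for a
    noiseless randoms estimate (assumption: the NU 2 convention)."""
    return trues**2 / (trues + scatters + k * randoms)

# Illustrative count rates in counts/s.
print(f"NEC = {nec_rate(2.0e5, 8.0e4, 5.0e4):.3e} counts/s")
```

Validating a GATE scanner model against the NEC test then means comparing this computed curve, as a function of activity, between simulation and measurement.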

  16. Qualitative and quantitative validation of the SINBAD code on complex HPGe gamma-ray spectra

    Energy Technology Data Exchange (ETDEWEB)

    Rohee, E.; Coulon, R.; Normand, S.; Carrel, F. [CEA, LIST, Laboratoire Capteurs et Architectures electroniques, F-91191 Gif-sur-Yvette, (France); Dautremer, T.; Barat, E.; Montagu, T. [CEA, LIST, Laboratoire Modelisation, Simulation et Systemes, F-91191 Gif-sur-Yvette, (France); Jammes, C. [CEA/DEN/SPEx/LDCI, Centre de Cadarache, F-13109 Saint-Paul-lez-Durance, (France)

    2015-07-01

    Radionuclide identification and quantification is a serious concern for many applications, such as the safety or security of nuclear power plants or fuel cycle facilities, CBRN risk identification, environmental radioprotection and waste measurements. High resolution gamma-ray spectrometry based on HPGe detectors is a well-performing solution for all these topics. During the last decades, a great deal of software has been developed to improve gamma spectra analysis. However, some difficulties remain in the analysis when photoelectric peaks are folded together with a high ratio between their amplitudes, when the Compton background is much larger compared to the signal of a single peak, and when spectra are composed of a great number of peaks. This study deals with the comparison between conventional methods of radionuclide identification and quantification and the code called SINBAD ('Spectrometrie par Inference Non parametrique Bayesienne Deconvolutive'). For many years, SINBAD has been developed by CEA LIST for unfolding complex spectra from HPGe detectors. Contrary to conventional methods using fitting procedures, SINBAD uses a probabilistic approach with Bayesian inference to describe spectrum data. The conventional fitting method found, for example, in Genie 2000 is compared with the nonparametric SINBAD approach regarding some key figures of merit, such as peak centroid evaluation (identification) and peak surface evaluation (quantification). Difficult cases are studied for nuclide detection, with closely spaced gamma-ray energies and large differences in photoelectric peak intensity. Tests are performed with spectra from the International Atomic Energy Agency (IAEA) gamma spectra analysis software benchmark and with spectra acquired at the laboratory. Results show that SINBAD and Genie 2000 performances are quite similar, sometimes with better results for SINBAD, with the important difference that, to achieve the same performances, the nonparametric method is user-friendly compared

  17. Translation and pilot validation of the Danish version of the Everyday Technology Use Questionnaire (ETUQ)

    DEFF Research Database (Denmark)

    Helle, Tina; Kaptain, Rina Juel; Kottorp, Anders

    2016-01-01

    Background: The use of everyday technologies has grown rapidly during the last decades and has become an increasing part of people's everyday life, now also including e-health technologies that are used on a daily basis by persons living with chronic health conditions, e.g., chronic obstructive pulmonary disease (COPD). There are, however, no validated assessments targeting the competence to use everyday and e-health technology for these people. The aim of this pilot study was to investigate the validity of the Danish version of the Everyday Technology Use Questionnaire (ETUQ) in a sample... Results: After collapsing some scale step categories used in the ETUQ, a reduced number of items (n=40) demonstrated acceptable goodness-of-fit to the Rasch model. This reduced number of items demonstrated... evidence in relation to response processes, internal scale validity and precision in measures.

  18. Experimental validation of computational fluid dynamics (CFD) codes for liquid-solid risers in clean alkylation processes

    Directory of Open Access Journals (Sweden)

    Duduković Milorad P.

    2002-01-01

    Full Text Available This manuscript, based on the presentation given by one of the authors (M.P. Dudukovic) at the Technological and Engineering Forum in Pančevo, May 21 2002, summarizes the use of computer automated radioactive particle tracking (CARPT) and gamma computed tomography (CT) in obtaining the data needed to validate Euler-Euler based CFD simulations of solids distribution, flow pattern and mixing in a liquid-solid riser. The riser is one of the reactors considered for solid acid catalyst promoted alkylation. It is shown that CFD calculations, validated by CARPT-CT data, show promise for scale-up and design of this novel reactor type.

  19. Validation of the CASMO-4 code against SIMS-measured spatial gadolinium distributions inside a BWR pin

    Energy Technology Data Exchange (ETDEWEB)

    Holzgrewe, F.; Gavillet, D.; Restani, R.; Zimmermann, M.A.

    2000-07-01

    The purpose of the present study was to establish a database, useful for the assessment of the predictive capabilities of assembly burnup codes with respect to the depletion of the burnable absorber gadolinium (Gd). An SVEA-96 fuel assembly containing one unique Gd rod, with an initial Gd{sub 2}O{sub 3}-content of 9 wt%, was irradiated for one cycle in a Swiss Boiling Water Reactor (BWR), and then transported to the PSI hotcells for post-irradiation examination. Relative radial and azimuthal Gd distributions were obtained from Secondary Ion Mass Spectrometry (SIMS) at three axial positions. Two perpendicular line scans were performed at each position in order to capture the expected asymmetry in the Gd depletion. Since such high-spatial-resolution experimental data for individual fuel pins are quite rare, they form a valuable basis for the further validation of the calculational methods in reactor physics codes. The goal of this study was to contribute to the validation of the micro-region depletion model of CASMO-4 with respect to its standard application of generating two-group cross sections for the 3-D core simulator SIMULATE-3. The only notable difference to the standard application is a more detailed noding scheme for the Gd pin, required to obtain an improved resolution of the calculated distributions. The comparison of measurements with calculational results was found to be quite insensitive to the axial position, and the agreement was found to be very good for all isotopes investigated. The two important neutron-absorbing isotopes {sup 155} Gd and {sup 157} Gd, in particular, show excellent agreement. In conclusion, the CASMO-4 micro-region depletion model has been demonstrated to accurately predict the evolution of the radial distribution of the burnable absorber gadolinium. (authors)
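
    The physics being validated, depletion of a strong absorber under irradiation, can be illustrated by the simplest possible model: a single isotope burning out under a constant flux, dN/dt = -σa·φ·N. The cross section and flux below are round illustrative numbers, not CASMO-4 inputs, and the real code resolves this spatially over micro-regions.

```python
import numpy as np

# Burnout of a burnable absorber under constant flux:
#   dN/dt = -sigma_a * phi * N   ->   N(t) = N0 * exp(-sigma_a * phi * t)
SIGMA_A = 60000e-24   # illustrative Gd-157 thermal absorption cross section [cm^2]
PHI = 1e13            # illustrative thermal flux [n/cm^2/s]
N0 = 1.0              # relative initial number density

def deplete(n0, sigma, phi, dt, steps):
    """Explicit-Euler depletion; returns the density history."""
    n = n0
    hist = [n]
    for _ in range(steps):
        n -= sigma * phi * n * dt
        hist.append(n)
    return np.array(hist)

dt = 3600.0  # one-hour steps
steps = 24 * 100  # 100 days of irradiation
hist = deplete(N0, SIGMA_A, PHI, dt, steps)
analytic = N0 * np.exp(-SIGMA_A * PHI * dt * steps)
```

    The Euler history tracks the analytic exponential to well under a percent at this step size; in a lattice code the same burnout happens faster at the pin surface than at its centre, which is exactly the radial distribution the SIMS measurements probe.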

  20. Validation of CESAR Thermal-hydraulic Module of ASTEC V1.2 Code on BETHSY Experiments

    Science.gov (United States)

    Tregoures, Nicolas; Bandini, Giacomino; Foucher, Laurent; Fleurot, Joëlle; Meloni, Paride

    The ASTEC V1 system code is being jointly developed by the French Institut de Radioprotection et Sûreté Nucléaire (IRSN) and the German Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) to address severe accident sequences in a nuclear power plant. Thermal-hydraulics in the primary and secondary systems is addressed by the CESAR module. The aim of this paper is to present the validation of the CESAR module of the ASTEC V1.2 version on the basis of well-instrumented and qualified integral experiments carried out in the BETHSY facility (CEA, France), which simulates a French 900 MWe PWR reactor. Three tests have been thoroughly investigated with CESAR: the loss of coolant 9.1b test (OECD ISP N° 27), the loss of feedwater 5.2e test, and the multiple steam generator tube rupture 4.3b test. In the present paper, the results of the code for the three analyzed tests are presented in comparison with the experimental data. The thermal-hydraulic behavior of the BETHSY facility during the transient phase is well reproduced by CESAR: the occurrence of major events and the time evolution of the main thermal-hydraulic parameters of both primary and secondary circuits are well predicted.

  1. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Yidong [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andrs, David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Martineau, Richard Charles [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-01

    This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving compressible fluid flow problems is presented next. A Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration is also presented. Although the multi-fluid formulation is still under development, BIGHORN has been designed to handle multi-fluid problems; due to the flexibility of the underlying MOOSE framework, BIGHORN is quite extensible and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent of this suite is to provide baseline comparison data demonstrating the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and to suggest best practices when using BIGHORN.

  2. Experimental Definition and Validation of Protein Coding Transcripts in Chlamydomonas reinhardtii

    Energy Technology Data Exchange (ETDEWEB)

    Kourosh Salehi-Ashtiani; Jason A. Papin

    2012-01-13

    Algal fuel sources promise unsurpassed yields in a carbon neutral manner that minimizes resource competition between agriculture and fuel crops. Many challenges must be addressed before algal biofuels can be accepted as a component of the fossil fuel replacement strategy. One significant challenge is that the cost of algal fuel production must become competitive with existing fuel alternatives. Algal biofuel production presents the opportunity to fine-tune microbial metabolic machinery for an optimal blend of biomass constituents and desired fuel molecules. Genome-scale model-driven algal metabolic design promises to facilitate both goals by directing the utilization of metabolites in the complex, interconnected metabolic networks to optimize production of the compounds of interest. Using Chlamydomonas reinhardtii as a model, we developed a systems-level methodology bridging metabolic network reconstruction with annotation and experimental verification of enzyme encoding open reading frames. We reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. Our approach to generate a predictive metabolic model integrated with cloned open reading frames, provides a cost-effective platform to generate metabolic engineering resources. While the generated resources are specific to algal systems, the approach that we have developed is not specific to algae and
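
    The model-driven metabolic design described above is typically interrogated with flux balance analysis: maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds, solved as a linear program. The toy two-reaction network below is a hypothetical illustration of the technique, not the C. reinhardtii reconstruction.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network with one internal metabolite M:
#   v1: uptake -> M       (capped by availability, e.g. light or substrate)
#   v2: M -> biomass      (the objective flux)
# Steady state requires S @ v = 0, i.e. v1 - v2 = 0.
S = np.array([[1.0, -1.0]])
bounds = [(0.0, 10.0),    # v1: uptake capped at 10 units
          (0.0, 100.0)]   # v2: effectively unconstrained
c = [0.0, -1.0]           # linprog minimizes, so maximize v2 via -v2

res = linprog(c, A_eq=S, b_eq=[0.0], bounds=bounds)
v1, v2 = res.x
```

    The optimum pushes biomass production to the uptake cap (v2 = 10); in a genome-scale model the same linear program runs over thousands of reactions, with light input entering through the bounds, as in the wavelength-resolved light model described above.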

  3. NSDann2BS, a neutron spectrum unfolding code based on neural networks technology and two bonner spheres

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz-Rodriguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solis Sanches, L. O.; Miranda, R. Castaneda; Cervantes Viramontes, J. M. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica. Av. Ramon Lopez Velarde 801. Col. Centro Zacatecas, Zac (Mexico); Vega-Carrillo, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica. Av. Ramon Lopez Velarde 801. Col. Centro Zacatecas, Zac., Mexico. and Unidad Academica de Estudios Nucleares. C. Cip (Mexico)

    2013-07-03

    In this work a neutron spectrum unfolding code based on artificial intelligence technology is presented. The code, called 'Neutron Spectrometry and Dosimetry with Artificial Neural Networks and two Bonner spheres' (NSDann2BS), was designed with a graphical user interface under the LabVIEW programming environment. The main features of this code are an embedded artificial neural network architecture optimized with the 'robust design of artificial neural networks' methodology and the use of two Bonner spheres as the only piece of information. To build the code presented here, once the net topology was optimized and properly trained, the knowledge stored in the synaptic weights was extracted and, using a graphical framework built on the LabVIEW programming environment, the NSDann2BS code was designed. The code is friendly, intuitive and easy to use for the end user, and is freely available upon request to the authors. To demonstrate the use of the neural net embedded in the NSDann2BS code, the count rates of {sup 252}Cf, {sup 241}AmBe and {sup 239}PuBe neutron sources measured with a Bonner sphere system were used.
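
    The core idea, mapping only two Bonner-sphere count rates to a neutron spectrum through a trained feedforward network, can be sketched as follows. The topology, random weights, and softmax normalization are placeholders for illustration; they are not the optimized architecture embedded in NSDann2BS.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(counts, w1, b1, w2, b2):
    """2 inputs (sphere count rates) -> hidden layer -> spectrum bins.
    A softmax output yields a normalized relative spectrum."""
    h = np.tanh(counts @ w1 + b1)
    z = h @ w2 + b2
    e = np.exp(z - z.max())      # numerically stable softmax
    return e / e.sum()

n_hidden, n_bins = 8, 31         # 31 bins, a common unfolding energy grid
w1 = rng.normal(size=(2, n_hidden))   # in practice: trained, not random
b1 = np.zeros(n_hidden)
w2 = rng.normal(size=(n_hidden, n_bins))
b2 = np.zeros(n_bins)

counts = np.array([120.0, 45.0]) / 100.0  # scaled count rates from two spheres
spectrum = forward(counts, w1, b1, w2, b2)
```

    With trained weights, the same forward pass is all the deployed code needs at run time, which is why the authors could extract the synaptic weights and embed them in a LabVIEW GUI.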

  4. Clinical Validation of a New Tinnitus Assessment Technology

    Science.gov (United States)

    Hébert, Sylvie; Fournier, Philippe

    2017-01-01

    Current clinical assessment of tinnitus relies mainly on self-report. Psychoacoustic assessment of tinnitus pitch and loudness is recommended, but methods yield variable results. Herein, we investigated the proposition that a previously validated fixed laboratory-based method (Touchscreen) and a newly developed clinically relevant portable prototype (Stand-alone) yield comparable results in the assessment of psychoacoustic tinnitus pitch and loudness. Participants with tinnitus [N = 15, 7 with normal hearing and 8 with hearing loss (HL)] and participants simulating tinnitus (simulators, N = 15) were instructed to rate the likeness of pure tones (250 Hz to 16 kHz) to their tinnitus pitch and match their loudness using both methods presented in a counterbalanced order. Results indicate that simulators rated their "tinnitus" at lower frequencies and at louder levels (~10 dB) compared to tinnitus participants. Tinnitus subgroups (with vs. without HL) differed in their predominant tinnitus pitch (i.e., lower in the tinnitus with HL subgroup), but not in their loudness matching in decibels SL. Loudness at the predominant pitch was identified as a factor yielding significant sensitivity and specificity in discriminating between the two groups of participants. Importantly, despite differences in the devices' physical presentations, likeness and loudness ratings were globally consistent between the two methods and, moreover, highly reproducible from one method to the other in both groups. All in all, both methods yielded robust tinnitus data in less than 12 min, with the Stand-alone having the advantage of not being dependent on learning effects, being user-friendly, and being adapted to the audiogram of each patient to further reduce testing time. PMID:28270792

  5. Development of safety analysis codes and experimental validation for a very high temperature gas-cooled reactor: Final report

    Energy Technology Data Exchange (ETDEWEB)

    Chang Oh

    2006-03-01

    The very high-temperature gas-cooled reactor (VHTR) is envisioned as a single- or dual-purpose reactor for electricity and hydrogen generation. The concept has average coolant temperatures above 900°C and operational fuel temperatures above 1250°C. The concept provides the potential for increased energy conversion efficiency and for high-temperature process heat application in addition to power generation. While all the High Temperature Gas Cooled Reactor (HTGR) concepts have sufficiently high temperatures to support process heat applications, such as coal gasification, desalination or cogenerative processes, the VHTR's higher temperatures allow broader applications, including thermochemical hydrogen production. However, the very high temperatures of this reactor concept can be detrimental to safety if a loss-of-coolant accident (LOCA) occurs. Following the loss of coolant through the break and coolant depressurization, air will enter the core through the break by molecular diffusion and ultimately by natural convection, leading to oxidation of the in-core graphite structure and fuel. The oxidation will accelerate heatup of the reactor core and the release of toxic gases (CO and CO2) and fission products. Thus, without any effective countermeasures, a pipe break may lead to significant fuel damage and fission product release. Prior to the start of this Korean/United States collaboration, no computer codes were available that had been sufficiently developed and validated to reliably simulate a LOCA in the VHTR. Therefore, we have worked for the past three years on developing and validating advanced computational methods for simulating LOCAs in a VHTR. Research Objectives: As described above, a pipe break may lead to significant fuel damage and fission product release in the VHTR. The objectives of this Korean/United States collaboration were to develop and validate advanced computational methods for VHTR safety analysis.
The methods that have been developed are now

  6. Validation of coupled Relap5-3D code in the analysis of RBMK-1500 specific transients

    Energy Technology Data Exchange (ETDEWEB)

    Evaldas, Bubelis; Algirdas, Kaliatka; Eugenijus, Uspuras [Lithuanian Energy Institute, Kaunas (Lithuania)

    2003-07-01

    This paper deals with the modelling of RBMK-1500 specific transients taking place at the Ignalina NPP. These transients include: measurements of void and fast power reactivity coefficients, a change of graphite cooling conditions, and reactor power reduction transients. The simulation of these transients was performed using a RELAP5-3D code model of the RBMK-1500 reactor. At the Ignalina NPP, void and fast power reactivity coefficients are measured on a regular basis and, based on the total reactor power, reactivity, control and protection system control rod positions and the main circulation circuit parameter changes during the experiments, the actual values of these reactivity coefficients are determined. The graphite temperature reactivity coefficient at the plant is determined by changing the graphite cooling conditions in the reactor cavity. This type of transient is unique and important for validating the model of the gap between the fuel channel and the graphite bricks. The measurement results obtained during this transient allowed determination of the thermal conductivity coefficient for this gap and validation of the graphite temperature reactivity feedback model. Reactor power reduction is a regular operating procedure during the entire lifetime of the reactor. In all cases it starts with either a scram or a power reduction signal activated by the reactor control and protection system or by an operator. The obtained calculation results demonstrate reasonable agreement with Ignalina NPP measured data. The behaviour of the separate MCC thermal-hydraulic parameters, as well as the physical processes, is predicted reasonably close to the real processes occurring in the primary circuit of the RBMK-1500 reactor. Reasonable agreement of the measured and calculated total reactor power change in time demonstrates correct modelling of the neutronic processes taking place in the RBMK-1500 reactor core. And finally, the performed validation of the RELAP5-3D model of the Ignalina NPP RBMK-1500

  7. Validation of an instrument to measure students' motivation and self-regulation towards technology learning

    Science.gov (United States)

    Liou, Pey-Yan; Kuo, Pei-Jung

    2014-05-01

    Background: Few studies have examined students' attitudinal perceptions of technology. There is no appropriate instrument to measure senior high school students' motivation and self-regulation toward technology learning among the current existing instruments in the field of technology education. Purpose: The present study is to validate an instrument for assessing senior high school students' motivation and self-regulation towards technology learning. Sample: A total of 1822 Taiwanese senior high school students (1020 males and 802 females) responded to the newly developed instrument. Design and method: The Motivation and Self-regulation towards Technology Learning (MSRTL) instrument was developed based on previous instruments measuring students' motivation and self-regulation towards science learning. Exploratory and confirmatory factor analyses were utilized to investigate the structure of the items. Cronbach's alpha was applied for measuring the internal consistency of each scale. Furthermore, multivariate analysis of variance was used to examine gender differences. Results: Seven scales, including 'Technology learning self-efficacy,' 'Technology learning value,' 'Technology active learning strategies,' 'Technology learning environment stimulation,' 'Technology learning goal-orientation,' 'Technology learning self-regulation-triggering,' and 'Technology learning self-regulation-implementing' were confirmed for the MSRTL instrument. Moreover, the results also showed that male and female students did not present the same degree of preference in all of the scales. Conclusions: The MSRTL instrument, composed of seven scales corresponding to 39 items, was shown to be valid based on validity and reliability analyses. While male students tended to express more positive and active performance in the motivation scales, no gender differences were found in the self-regulation scales.
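
    The internal consistency reported for each scale is Cronbach's alpha, computed from the item variances and the variance of the summed scale: α = k/(k-1)·(1 - Σ var_i / var_total). The small response matrix below is invented for illustration.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (respondents, items) matrix of scores on one scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Perfectly consistent scale: every item gives the same score -> alpha = 1
perfect = np.array([[1, 1, 1],
                    [2, 2, 2],
                    [4, 4, 4],
                    [5, 5, 5]])
alpha = cronbach_alpha(perfect)
```

    Real scales fall below 1; a common rule of thumb treats alpha ≥ 0.7 as acceptable internal consistency.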

  8. Patient validation of cues and concerns identified according to Verona coding definitions of emotional sequences (VR-CoDES): a video- and interview-based approach.

    Science.gov (United States)

    Eide, Hilde; Eide, Tom; Rustøen, Tone; Finset, Arnstein

    2011-02-01

    A challenging but central task for clinicians is to identify patients' concerns related to their medical conditions. The study aim was to validate a new coding scheme for identifying patients' cues and concerns. Twelve videotaped consultations between nurses and pain patients were coded according to the Verona Coding Definitions of Emotional Sequences (VR-CoDES). During a meta-interview, each patient watched his/her own video interview with the researcher to confirm or disconfirm the identified cues and concerns. A directive or an open format was applied. Quantitative and qualitative data analyses were performed. Patients' confirmation in relation to the coding gave a sensitivity of 0.95 and specificity of 0.99 in the directive format, and a sensitivity of 0.99 and specificity of 0.70 in the open format. Through a qualitative analysis, 83% of researcher-identified cues and concerns were validated; 17% were not confirmed or were uncertain. The VR-CoDES seems to capture what are experienced as real concerns by patients, and proves to be a coding scheme with a high degree of ecological validity. The VR-CoDES provides a valid framework for detecting patients' cues and concerns, and should be explored as a training tool to develop clinicians' empathic accuracy. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
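
    The reported sensitivity and specificity follow the standard definitions over confirmed and disconfirmed codings. The confusion counts below are hypothetical, chosen only to reproduce the directive-format figures.

```python
def sensitivity(tp, fn):
    # fraction of true cues/concerns that the coding scheme detected
    return tp / (tp + fn)

def specificity(tn, fp):
    # fraction of non-cues correctly left uncoded
    return tn / (tn + fp)

# Hypothetical confusion counts for a directive-format validation
tp, fn, tn, fp = 19, 1, 99, 1
sens = sensitivity(tp, fn)   # 19 / 20 = 0.95
spec = specificity(tn, fp)   # 99 / 100 = 0.99
```

    The open format's higher sensitivity (0.99) at lower specificity (0.70) is the usual trade-off: a more permissive confirmation procedure catches more true cues but also endorses more false positives.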

  9. Experimental Space Shuttle Orbiter Studies to Acquire Data for Code and Flight Heating Model Validation

    Science.gov (United States)

    Wadhams, T. P.; Holden, M. S.; MacLean, M. G.; Campbell, Charles

    2010-01-01

    thin-film resolution in both the span and chord direction in the area of peak heating. Additional objectives of this first study included: obtaining natural or tripped turbulent wing leading edge heating levels, assessing the effectiveness of protuberances and cavities placed at specified locations on the orbiter over a range of Mach numbers and Reynolds numbers to evaluate and compare to existing engineering and computational tools, obtaining cavity floor heating to aid in the verification of cavity heating correlations, acquiring control surface deflection heating data on both the main body flap and elevons, and obtain high speed schlieren videos of the interaction of the orbiter nose bow shock with the wing leading edge. To support these objectives, the stainless steel 1.8% scale orbiter model in addition to the sensors on the wing leading edge was instrumented down the windward centerline, over the wing acreage on the port side, and painted with temperature sensitive paint on the starboard side wing acreage. In all, the stainless steel 1.8% scale Orbiter model was instrumented with over three-hundred highly sensitive thin-film heating sensors, two-hundred of which were located in the wing leading edge shock interaction region. Further experimental studies will also be performed following the successful acquisition of flight data during the Orbiter Entry Boundary Layer Flight Experiment and HYTHIRM on STS-119 at specific data points simulating flight conditions and geometries. Additional instrumentation and a protuberance matching the layout present during the STS-119 boundary layer transition flight experiment were added with testing performed at Mach number and Reynolds number conditions simulating conditions experienced in flight. In addition to the experimental studies, CUBRC also performed a large amount of CFD analysis to confirm and validate not only the tunnel freestream conditions, but also 3D flows over the orbiter acreage, wing leading edge, and

  10. WIAMan Technology Demonstrator Sensor Codes Conforming to International Organization for Standardization/Technical Standard (ISO/TS) 13499

    Science.gov (United States)

    2016-03-01

    WIAMan Technology Demonstrator Sensor Codes Conforming to International Organization for Standardization/Technical Standard (ISO/TS) 13499, by Michael Tegtmeyer, WIAMan Engineering Office, ARL. Approved for public release; distribution is unlimited.

  11. New Source Code: Spelman Women Transforming the Grid of Science and Technology

    Science.gov (United States)

    Okonkwo, Holly

    From a seminary for newly freedwomen in the 19th century "Deep South" of the United States to a "Model Institution for Excellence" in undergraduate science, technology, engineering, and math education, the narrative of Spelman College is a critical piece in understanding the overall history and socially constructed nature of science and higher education in the U.S. Making a place for science at Spelman College disrupts and redefines the presumed and acceptable roles of African American women in science and their social, political and economic engagements in U.S. society as a whole. Over the course of 16 months, I explore the narrative experiences of members of the Spelman campus community and immerse myself in the environment to experience becoming a member of a scientific community that asserts a place for women of African descent in science and technology and perceives this positionality as positive, powerful and the locus of agency. My intention is to offer this research as an in-depth ethnographic presentation of intentional science learning, knowledge production and practice as lived experiences at the multiple intersections of the constructs of race, gender, positionality and U.S. science itself. In this research, I am motivated to move the contemporary discourse on diversifying science, technology, engineering and mathematics fields in the U.S. academy beyond the chronicling of women of African descent as statistical rarities over time, and beyond the subjectivities and deficit frameworks that theoretically encapsulate their narratives. The findings of this research demonstrate that Spelman students, staff and alumni are themselves the cultural capital that validates Spelman's identity as a place and its institutional mission, and they are at the core of the institutional success of the college. It is a personal mission as much as it is an institutional mission, which is precisely what makes it powerful.

  12. Development of essential system technologies for advanced reactor - Development of natural circulation analysis code for integral reactor

    Energy Technology Data Exchange (ETDEWEB)

    Park, Goon Cherl; Park, Ik Gyu; Kim, Jae Hak; Lee, Sang Min; Kim, Tae Wan [Seoul National University, Seoul (Korea)

    1999-04-01

    The objective of this study is to understand the natural circulation characteristics of integral type reactors and to develop a natural circulation analysis code for integral type reactors. This study is focused on asymmetric 3-dimensional flow during natural circulation, such as 1/4 steam generator section isolation and inclination of the reactor systems. Natural circulation experiments were performed using small-scale facilities of the integral reactor SMART (System-integrated Modular Advanced ReacTor). The CFX4 code was used to investigate the flow patterns and thermal mixing phenomena in the upper pressure header and downcomer. Differences between normal operation of all steam generators and the 1/4 section isolation conditions were observed, and the results were used as data for RETRAN-03/INT code validation. The RETRAN-03 code was modified to develop a natural circulation analysis code for integral type reactors, named RETRAN-03/INT. 3-dimensional analysis models for asymmetric flow in integral type reactors were developed using the vector momentum equations in RETRAN-03. Analysis results using RETRAN-03/INT were compared with experimental and CFX4 analysis results and showed good agreement. The natural circulation characteristics obtained in this study will provide important and fundamental design features for future small and medium integral reactors. (author). 29 refs., 75 figs., 18 tabs.
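
    Steady single-phase natural circulation of the kind studied here can be estimated from a simple loop balance: the buoyancy head ρgβΔT·H drives the flow against a lumped friction loss K·m²/(2ρA²). The loop parameters below are illustrative placeholders, not SMART design values.

```python
import math

def natural_circulation_flow(rho, beta, dT, H, A, K, g=9.81):
    """Mass flow [kg/s] from buoyancy head = friction loss:
       rho*g*beta*dT*H = K * m**2 / (2 * rho * A**2)"""
    return A * math.sqrt(2.0 * rho**2 * g * beta * dT * H / K)

# Illustrative loop: hot water coolant, 2 m between thermal centres
m1 = natural_circulation_flow(rho=750.0, beta=2e-3, dT=30.0, H=2.0, A=0.05, K=20.0)
m2 = natural_circulation_flow(rho=750.0, beta=2e-3, dT=60.0, H=2.0, A=0.05, K=20.0)
```

    The closed-form result shows the characteristic square-root scaling: doubling the core-to-steam-generator temperature rise increases the loop flow by a factor of √2, which is the kind of trend a system code such as RETRAN-03/INT must reproduce in 3-D.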

  14. Validation and Comparison of 2D and 3D Codes for Nearshore Motion of Long Waves Using Benchmark Problems

    Science.gov (United States)

    Velioǧlu, Deniz; Cevdet Yalçıner, Ahmet; Zaytsev, Andrey

    2016-04-01

    Tsunamis are huge waves with long wave periods and wave lengths that can cause great devastation and loss of life when they strike a coast. Interest in experimental and numerical modeling of tsunami propagation and inundation increased considerably after the 2011 Great East Japan earthquake. In this study, two numerical codes, FLOW 3D and NAMI DANCE, that analyze tsunami propagation and inundation patterns are considered. FLOW 3D simulates linear and nonlinear propagating surface waves as well as long waves by solving the three-dimensional Navier-Stokes (3D-NS) equations. NAMI DANCE uses a finite difference computational method to solve the 2D depth-averaged linear and nonlinear forms of the shallow water equations (NSWE) for long wave problems, specifically tsunamis. In order to validate these two codes and analyze the differences between the 3D-NS and 2D depth-averaged NSWE equations, two benchmark problems are applied. One benchmark problem investigates the runup of long waves over a complex 3D beach. The experimental setup is a 1:400 scale model of Monai Valley, located on the west coast of Okushiri Island, Japan. The other benchmark problem was discussed at the 2015 National Tsunami Hazard Mitigation Program (NTHMP) Annual Meeting in Portland, USA; it is a field dataset recording the 2011 Japan tsunami in Hilo Harbor, Hawaii. The computed water surface elevation and velocity data are compared with the measured data. The comparisons showed that both codes are in fairly good agreement with each other and with the benchmark data. The differences between the 3D-NS and 2D depth-averaged NSWE equations are highlighted. All results are presented with discussions and comparisons. Acknowledgements: Partial support by the Japan-Turkey Joint Research Project by JICA on earthquakes and tsunamis in the Marmara Region (JICA SATREPS - MarDiM Project), 603839 ASTARTE Project of EU, UDAP-C-12-14 project of AFAD Turkey, 108Y227, 113M556 and 213M534 projects of TUBITAK Turkey, RAPSODI (CONCERT_Dis-021) of CONCERT
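
    In one dimension, linearized about still water of depth h, the depth-averaged NSWE solved by codes like NAMI DANCE reduce to η_t = -(h·u)_x and u_t = -g·η_x. The staggered-grid forward-backward update below is a generic textbook sketch, not the NAMI DANCE scheme.

```python
import numpy as np

g, h = 9.81, 4000.0           # gravity [m/s^2], open-ocean depth [m]
nx, dx = 200, 5000.0          # grid cells and spacing [m]
c = np.sqrt(g * h)            # long-wave speed, ~198 m/s
dt = 0.5 * dx / c             # CFL-limited time step

x = np.arange(nx) * dx
eta = np.exp(-((x - x.mean()) / (10 * dx)) ** 2)  # initial surface hump [m]
u = np.zeros(nx)                                   # depth-averaged velocity

mass0 = eta.sum() * dx
for _ in range(100):
    # Periodic staggered (C-grid) update in conservative flux form:
    # u[i] sits between eta[i] and eta[i+1]
    u -= g * dt / dx * (np.roll(eta, -1) - eta)
    eta -= h * dt / dx * (u - np.roll(u, 1))
mass = eta.sum() * dx
```

    Because the divergence is written as a difference of fluxes on a periodic grid, the scheme conserves the surface volume to round-off; the initial hump splits into two waves travelling at ±√(gh), the long-wave speed that sets tsunami travel times across ocean basins.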

  15. Validation of the everyday technology use questionnaire in a Japanese context.

    OpenAIRE

    Malinowsky, Camilla; Kottorp, Anders; Tanemura, Rumi; Asaba, Eric; Nagao, Toru; Noda, Kazue; Sagara, Jiro; Bontje, Peter; Rosenberg, Lena; Nygård, Louise

    2015-01-01

    Background/Objective The Everyday Technology Use Questionnaire (ETUQ), which evaluates the perceived relevance of and the perceived ability in everyday technology (ET) use, has demonstrated acceptable psychometric properties in Swedish studies of older adults. The aim of this study was to examine the reliability and validity of the ETUQ in a Japanese context in older Japanese adults. Methods A sample of older Japanese adults (n = 164) including persons with (n = 32) and without ...

  16. Technological change in the wine market? The role of QR codes and wine apps in consumer wine purchases

    Directory of Open Access Journals (Sweden)

    Lindsey M. Higgins

    2014-06-01

    Full Text Available As an experiential good, wine purchases in the absence of tastings are often challenging and information-laden decisions. Technology has shaped the way consumers negotiate this complex purchase process. Using a sample of 631 US wine consumers, this research aims to identify the role of mobile applications and QR codes in the wine purchase decision. Results suggest that wine consumers who consider themselves wine connoisseurs or experts, enjoy talking about wine, and are interested in wine that is produced locally, organically, or sustainably are more likely to employ technology in their wine purchase decision. While disruption appears to have occurred on the supply side (number of wine applications available and number of wine labels with a QR code), this research suggests that relatively little change is occurring on the demand side (a relatively small segment of the population, those already interested in wine, are employing the technology to aid in their purchase decision).

  17. A Logistics Code Identification Technology Based on an Improved BRIEF Algorithm

    Institute of Scientific and Technical Information of China (English)

    杜辉

    2015-01-01

    In this paper, we propose a QR code identification method based on an improved BRIEF algorithm. Using image rectification and rapid feature-point identification techniques, the method quickly identifies and matches QR codes and effectively solves problems met in the practical application of QR codes; a concluding experiment proves the validity of the method. Experimental results show that the method is computationally fast, performs well in real time, achieves a high recognition rate and strong robustness, overcomes recognition difficulties encountered in practice, and meets the needs of mobile logistics management.
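    The core of the BRIEF descriptor used here (independent of the paper's specific improvements, which the abstract does not detail) is a binary string built from intensity comparisons at random point pairs inside a patch, with candidates matched by Hamming distance. A minimal sketch, all names illustrative:

```python
import numpy as np

def brief_descriptor(patch, pairs):
    """Binary BRIEF descriptor: one bit per sampled point pair,
    set when the first sample point is darker than the second."""
    a, b = pairs[:, 0], pairs[:, 1]
    return (patch[a[:, 0], a[:, 1]] < patch[b[:, 0], b[:, 1]]).astype(np.uint8)

def hamming(d1, d2):
    """Hamming distance between two binary descriptors."""
    return int(np.count_nonzero(d1 != d2))

# 256 random test-point pairs inside a 31x31 patch (fixed per detector)
rng = np.random.default_rng(0)
S, nbits = 31, 256
pairs = rng.integers(0, S, size=(nbits, 2, 2))
patch = rng.random((S, S))       # stand-in for a smoothed image patch
desc = brief_descriptor(patch, pairs)
```

    In a full pipeline the patch is Gaussian-smoothed first, and matching picks the database descriptor with the smallest Hamming distance.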

  18. Validation of a modified PENELOPE Monte Carlo code for applications in digital and dual-energy mammography

    Science.gov (United States)

    Del Lama, L. S.; Cunha, D. M.; Poletti, M. E.

    2017-08-01

    The presence and morphology of microcalcification clusters provide the main early indications of breast carcinomas. However, the visualization of those structures may be jeopardized by overlapping tissues, even in digital mammography systems. Although digital mammography is the current standard for breast cancer diagnosis, further improvements are needed to address some of these physical limitations. One possible solution is the dual-energy (DE) technique, which can highlight specific lesions or cancel out the tissue background. In this sense, this work aimed to evaluate several quantities of interest in radiation applications and to compare those values with the literature in order to validate a modified PENELOPE code for digital mammography applications. For instance, the scatter-to-primary ratio (SPR), the scatter fraction (SF) and the normalized mean glandular dose (DgN) were evaluated by simulations, and the resulting values were compared to those found in earlier studies. Our results show good correlation for the evaluated quantities, with agreement of 5% or better for the scatter- and dosimetry-related quantities when compared to the literature. Finally, a DE imaging chain was simulated and the visualization of microcalcifications was investigated.
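    The tissue-cancellation step of the DE technique is commonly implemented as a weighted log subtraction of the low- and high-energy images, with the weight chosen so that the background material vanishes. A minimal monoenergetic Beer-Lambert sketch (the attenuation coefficients below are arbitrary illustrative values, not measured data):

```python
import numpy as np

def dual_energy_subtract(i_low, i_high, w):
    """Weighted log subtraction: cancels any material whose
    attenuation ratio mu_high / mu_low equals w."""
    return np.log(i_high) - w * np.log(i_low)

# illustrative linear attenuation coefficients (low energy, high energy)
mu_tissue = (0.80, 0.50)   # background to be cancelled
mu_calc = (2.00, 1.50)     # microcalcification to be kept

rng = np.random.default_rng(1)
t_tissue = 2.0 + rng.random((8, 8))   # varying background thickness
t_calc = np.zeros((8, 8))
t_calc[3:5, 3:5] = 0.05               # small calcification cluster

# monoenergetic Beer-Lambert transmission images
i_low = np.exp(-(mu_tissue[0] * t_tissue + mu_calc[0] * t_calc))
i_high = np.exp(-(mu_tissue[1] * t_tissue + mu_calc[1] * t_calc))

w = mu_tissue[1] / mu_tissue[0]       # weight that cancels the tissue
de = dual_energy_subtract(i_low, i_high, w)
```

    In `de` the varying tissue background collapses to a flat zero, and only the calcification signal, proportional to its thickness, survives.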

  19. Test and validation of the iterative code for the neutrons spectrometry and dosimetry: NSDUAZ; Prueba y validacion del codigo iterativo para la espectrometria y dosimetria de neutrones: NSDUAZ

    Energy Technology Data Exchange (ETDEWEB)

    Reyes H, A.; Ortiz R, J. M.; Reyes A, A.; Castaneda M, R.; Solis S, L. O.; Vega C, H. R., E-mail: alfredo_reyesh@hotmail.com [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Av. Lopez Velarde 801, Col. Centro, 98000 Zacatecas (Mexico)

    2014-08-15

    This work presents the test and validation of an iterative code for neutron spectrometry known as Neutron Spectrometry and Dosimetry of the Universidad Autonoma de Zacatecas (NSDUAZ). The code was designed with a friendly and intuitive graphical user interface in the LabVIEW programming environment, using the iterative algorithm known as SPUNIT. Its main features are the automatic selection of the initial spectrum from the neutron spectra catalog compiled by the International Atomic Energy Agency, and the ability to generate a report in HTML format that shows the neutron fluence graphically and numerically and the ambient dose equivalent calculated from it. To test the designed code, the count rates of a Bonner sphere spectrometer system with a {sup 6}LiI(Eu) detector and 7 polyethylene spheres of diameters 0 (bare detector), 2, 3, 5, 8, 10 and 12 were used. Count rates measured with two neutron sources, {sup 252}Cf and {sup 239}PuBe, were used to validate the code, and the results were compared against those obtained using the BUNKIUT code. We find that the reconstructed spectra present an error within the limit reported in the literature, which oscillates around 15%. It was therefore concluded that the designed code gives results similar to those of the techniques currently in use. (Author)
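    Iterative unfolding codes of this family recover a spectrum from few-channel Bonner sphere count rates by repeatedly rescaling a starting spectrum until the response-folded counts match the measurement. A generic multiplicative update of this kind (an illustrative sketch, not the exact SPUNIT formula) looks like:

```python
import numpy as np

def unfold(response, counts, phi0, n_iter=2000):
    """Multiplicative iterative spectrum unfolding.

    response : (n_detectors, n_bins) response matrix R
    counts   : measured count rates C_i
    phi0     : a priori spectrum (e.g. from a catalog of reference spectra)
    """
    phi = np.asarray(phi0, dtype=float).copy()
    col_sum = response.sum(axis=0)
    for _ in range(n_iter):
        predicted = response @ phi                  # folded count rates
        ratio = counts / np.maximum(predicted, 1e-30)
        phi *= (response.T @ ratio) / np.maximum(col_sum, 1e-30)
    return phi

# synthetic check: fold a known spectrum, then unfold from a flat guess
R = np.array([[1.0, 0.2, 0.1],
              [0.2, 1.0, 0.2],
              [0.1, 0.2, 1.0],
              [0.3, 0.3, 0.3]])
phi_true = np.array([2.0, 1.0, 3.0])
C = R @ phi_true
phi = unfold(R, C, np.ones(3))
```

    The update keeps the spectrum nonnegative by construction, which is why a physically plausible starting spectrum (as selected automatically by NSDUAZ) matters for under-determined real problems.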

  20. Review of the status of validation of the computer codes used in the severe accident source term reassessment study (BMI-2104). [PWR; BWR

    Energy Technology Data Exchange (ETDEWEB)

    Kress, T. S. [comp.

    1985-04-01

    The determination of severe accident source terms must, by necessity it seems, rely heavily on the use of complex computer codes. Source term acceptability, therefore, rests on the assessed validity of such codes. Consequently, one element of NRC's recent efforts to reassess LWR severe accident source terms is to provide a review of the status of validation of the computer codes used in the reassessment. The results of this review are the subject of this document. The separate review documents compiled in this report were used as a resource, along with the results of the BMI-2104 study by BCL and the QUEST study by SNL, to arrive at a more-or-less independent appraisal of the status of source term modeling at this time.

  1. Validation of the Proficiency Examination for Diagnostic Radiologic Technology. Final Report.

    Science.gov (United States)

    Educational Testing Service, Princeton, NJ.

    The validity of the Proficiency Examination for Diagnostic Radiologic Technology was investigated, using 140 radiologic technologists who took both the written Proficiency Examination and a performance test. As an additional criterion measure of job proficiency, supervisors' assessments were obtained for 128 of the technologists. The resulting…

  2. Faculty's Acceptance of Computer Based Technology: Cross-Validation of an Extended Model

    Science.gov (United States)

    Ahmad, Tunku Badariah Tunku; Madarsha, Kamal Basha; Zainuddin, Ahmad Marzuki; Ismail, Nik Ahmad Hisham; Nordin, Mohamad Sahari

    2010-01-01

    The first aim of the present study is to validate an extended technology acceptance model (TAME) on the data derived from the faculty members of a university in an ongoing, computer mediated work setting. The study extended the original TAM model by including an intrinsic motivation component--computer self efficacy. In so doing, the study…

  3. Validation of the Domains of Creativity Scale for Nigerian Preservice Science, Technology, and Mathematics Teachers

    Science.gov (United States)

    Awofala, Adeneye O. A.; Fatade, Alfred O.

    2015-01-01

    Introduction: Investigation into the factor structure of the Domains of Creativity Scale has been ongoing for some time now. The purpose of this study was to test the validity of the Kaufman Domains of Creativity Scale on Nigerian preservice science, technology, and mathematics teachers. Method: Exploratory and confirmatory factor analyses were performed…

  4. Assistive technology for visually impaired women for use of the female condom: a validation study

    Directory of Open Access Journals (Sweden)

    Luana Duarte Wanderley Cavalcante

    2015-02-01

    Full Text Available OBJECTIVE To validate assistive technology for visually impaired women to learn how to use the female condom. METHOD A methodological development study conducted on a web page, with data collection between May and October 2012. Participants were 14 judges: seven judges in sexual and reproductive health (1st stage) and seven in special education (2nd stage). RESULTS All items reached the adopted agreement threshold of 70%. In the 1st stage new materials were added to represent the cervix, and in the 2nd stage instructions that must be heard twice were included. CONCLUSION The technology has been validated and is appropriate in its objectives, structure/presentation and relevance. It is an innovative, low-cost and valid instrument for promoting health, one which may help women with visual disabilities to use the female condom.

  5. Development and Application of Subchannel Analysis Code Technology for Advanced Reactor Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae Hyun; Seo, K. W

    2006-01-15

    A study has been performed for the development and assessment of a subchannel analysis code intended for the analysis of advanced reactor conditions with various reactor core configurations and several kinds of reactor coolant fluids. The subchannel analysis code was developed on the basis of the MATRA code, which is being developed at KAERI. A GUI (Graphic User Interface) system was adopted in order to reduce input errors and to enhance user convenience. The subchannel code was complemented in the property calculation modules by including various fluids such as heavy liquid metal, gas, refrigerant, and supercritical water. The subchannel code was applied to calculate the local thermal-hydraulic conditions inside the non-square test bundles employed for the analysis of CHF. The applicability of the subchannel code was evaluated for a high-temperature gas-cooled reactor condition and for supercritical pressure conditions with water and Freon. A subchannel analysis has been conducted for the European ADS (Accelerator-Driven Subcritical System) with Pb-Bi coolant through international cooperation between KAERI and FZK, Germany. In addition, the prediction capability of the subchannel code was evaluated against subchannel void distribution data by participating in an international code benchmark program organized by OECD/NRC.

  6. ELLA-V and technology usage in an English Language and Literacy Acquisition validation randomized controlled trial study

    Directory of Open Access Journals (Sweden)

    Roisin P. Corcoran

    2014-12-01

    Full Text Available This paper describes the use of technology to provide virtual professional development (VPD) for teachers and to conduct classroom observations in a study of English Language Learner (ELL) instruction in grades K–3. The technology applications were part of a cluster randomized controlled trial (RCT) design for a federally funded longitudinal validation study of a particular program, English Language and Literacy Acquisition-Validation (ELLA-V), to determine its degree of impact on English oral language/literacy, reading, and science across 63 randomly assigned urban, suburban, and rural schools (first year of implementation). ELLA-V also examines the impact of bimonthly VPD for treatment teachers, compared to comparison-group teachers, on pedagogical skills, measured by sound observation instruments, and on student achievement, measured by state/national English language/literacy/reading tests and a national science test. This study features extensive technology use via virtual observations, bimonthly VPD, and randomly assigned treatment and control schools with students served in English as a second language (ESL) instructional time. The study design and methodology are discussed relative to the specialized uses of technology and issues involving the evaluation of technology's contribution to the intervention of interest and of the efficient, cost-effective execution of the study.

  7. NanoString, a novel digital color-coded barcode technology: current and future applications in molecular diagnostics.

    Science.gov (United States)

    Tsang, Hin-Fung; Xue, Vivian Weiwen; Koh, Su-Pin; Chiu, Ya-Ming; Ng, Lawrence Po-Wah; Wong, Sze-Chuen Cesar

    2017-01-01

    Formalin-fixed, paraffin-embedded (FFPE) tissue samples are a gold mine of resources for molecular diagnosis and retrospective clinical studies. Although molecular technologies have expanded the range of mutations identified in FFPE samples, the applications of existing technologies are limited by low nucleic acid yields and poor extraction quality. As a result, routine clinical application of molecular diagnosis using FFPE samples has been associated with many practical challenges. NanoString technologies utilize a novel digital color-coded barcode technology based on direct multiplexed measurement of gene expression and offer high levels of precision and sensitivity. Each color-coded barcode is attached to a single target-specific probe corresponding to a single gene, which can be individually counted without amplification. Therefore, NanoString is especially useful for measuring gene expression in degraded clinical specimens. Areas covered: This article describes the applications of NanoString technologies in molecular diagnostics, the challenges associated with those applications, and future development. Expert commentary: Although NanoString technology is still in the early stages of clinical use, it is expected that NanoString-based cancer expression panels will play more important roles in the future in classifying cancer patients and in predicting the response to therapy for better personal therapeutic care.

  8. Heat transfer to a heavy liquid metal in curved geometry: Code validation and CFD simulation for the MEGAPIE lower target

    Science.gov (United States)

    Dury, Trevor V.

    2006-06-01

    The ESS and SINQ Heat Emitting Temperature Sensing Surface (HETSS) mercury experiments have been used to validate the Computational Fluid Dynamics (CFD) code CFX-4 employed in designing the lower region of the international liquid metal cooled MEGAPIE target, to be installed at SINQ, PSI, in 2006. Conclusions were drawn on the best turbulence models and degrees of mesh refinement to apply, and a new CFD model of the MEGAPIE geometry was made, based on the CATIA CAD design of the exact geometry constructed. This model contained the fill and drain tubes as well as the bypass feed duct, with the differences in relative vertical length due to thermal expansion being considered between these tubes and the window. Results of the mercury experiments showed that CFD calculations can be trusted to give peak target window temperature under normal operational conditions to within about ±10%. The target nozzle actually constructed varied from the theoretical design model used for CFD due to the need to apply more generous separation distances between the nozzle and the window. In addition, the bypass duct contraction approaching the nozzle exit was less sharp compared with earlier designs. Both of these changes modified the bypass jet penetration and coverage of the heated window zone. Peak external window temperature with a 1.4 mA proton beam and steady-state operation is now predicted to be 375 °C, with internal temperature 354.0 °C (about 32 °C above earlier predictions). Increasing bypass flow from 2.5 to 3.0 kg/s lowers these peak temperatures by about 12 °C. Stress analysis still needs to be made, based on these thermal data.

  9. Targeting Non-Coding RNAs in Plants with the CRISPR-Cas Technology is a Challenge yet Worth Accepting.

    Science.gov (United States)

    Basak, Jolly; Nithin, Chandran

    2015-01-01

    Non-coding RNAs (ncRNAs) have emerged as versatile master regulators of biological functions in recent years. MicroRNAs (miRNAs) are small endogenous ncRNAs of 18-24 nucleotides in length that originate from long self-complementary precursors. Besides their direct involvement in developmental processes, plant miRNAs play key roles in gene regulatory networks and varied biological processes. Alternatively, long ncRNAs (lncRNAs) are a large and diverse class of transcribed ncRNAs whose length exceeds 200 nucleotides. Plant lncRNAs are transcribed by different RNA polymerases and show diverse structural features. Plant lncRNAs are also important regulators of gene expression in diverse biological processes. The last decade has seen a breakthrough in genome-editing technology: CRISPR-Cas9 (clustered regularly interspaced short palindromic repeats/CRISPR-associated protein 9). CRISPR loci are transcribed into ncRNA, which eventually forms a functional complex with Cas9 that guides the complex to cleave complementary invading DNA. The CRISPR-Cas technology has been successfully applied in model plants such as Arabidopsis and tobacco and in important crops like wheat, maize, and rice. However, all these studies have focused on protein-coding genes; information about targeting non-coding genes is scarce. Hitherto, the CRISPR-Cas technology has been used almost exclusively in vertebrate systems to engineer miRNAs/lncRNAs, and it remains relatively unexplored in plants. While briefing miRNAs, lncRNAs, and applications of the CRISPR-Cas technology in humans and animals, this review elaborates several strategies to overcome the challenges of applying the CRISPR-Cas technology to editing ncRNAs in plants, and discusses the future perspective of the field.

  10. Targeting non-coding RNAs in Plants with the CRISPR-Cas technology is a challenge yet worth accepting

    Directory of Open Access Journals (Sweden)

    Jolly eBasak

    2015-11-01

    Full Text Available Non-coding RNAs (ncRNAs) have emerged as versatile master regulators of biological functions in recent years. MicroRNAs (miRNAs) are small endogenous ncRNAs of 18-24 nucleotides in length that originate from long self-complementary precursors. Besides their direct involvement in developmental processes, plant miRNAs play key roles in gene regulatory networks and varied biological processes. Alternatively, long ncRNAs (lncRNAs) are a large and diverse class of transcribed ncRNAs whose length exceeds 200 nucleotides. Plant lncRNAs are transcribed by different RNA polymerases and show diverse structural features. Plant lncRNAs are also important regulators of gene expression in diverse biological processes. The last decade has seen a breakthrough in genome-editing technology: CRISPR-Cas9 (clustered regularly interspaced short palindromic repeats/CRISPR-associated protein 9). CRISPR loci are transcribed into ncRNA, which eventually forms a functional complex with Cas9 that guides the complex to cleave complementary invading DNA. The CRISPR-Cas technology has been successfully applied in model plants such as Arabidopsis and tobacco and in important crops like wheat, maize and rice. However, all these studies have focused on protein-coding genes; information about targeting non-coding genes is scarce. Hitherto, the CRISPR-Cas technology has been used almost exclusively in vertebrate systems to engineer miRNAs/lncRNAs, and it remains relatively unexplored in plants. While briefing miRNAs, lncRNAs, and applications of the CRISPR-Cas technology in humans and animals, this review elaborates several strategies to overcome the challenges of applying the CRISPR-Cas technology to editing ncRNAs in plants, and discusses the future perspective of the field.

  11. Development of an automatic validation system for simulation codes of the fusion research; Entwicklung eines automatischen Validierungssystems fuer Simulationscodes der Fusionsforschung

    Energy Technology Data Exchange (ETDEWEB)

    Galonska, Andreas

    2010-03-15

    In this master thesis the development of an automatic validation system for the simulation code ERO is documented. This 3D Monte Carlo code models the transport of impurities as well as plasma-wall interaction processes and is of great importance for fusion research. The validation system is based on JuBE (Julich Benchmarking Environment), whose flexibility allows the system to be easily extended to other codes, for instance those operated in the framework of the EU Task Force ITM (Integrated Tokamak Modelling). The chosen solution, JuBE together with a special program for the ''intelligent'' comparison of actual and reference output data of ERO, is described and justified. The use of this program and the configuration of JuBE are described in detail. Simulations of different plasma experiments, which serve as reference cases for the automatic validation, are explained. The operation of the system is illustrated by a test case, which treats the localization and correction of a fault in the parallelization of an important ERO module (tracking of physically eroded particles). It is demonstrated how the system reacts to a failed validation and how the subsequently performed error correction leads to a positive result. Finally, a speed-up curve of the parallelization is established from the output data of JuBE.

  12. Three-dimensional all-speed CFD code for safety analysis of nuclear reactor containment: Status of GASFLOW parallelization, model development, validation and application

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Jianjun, E-mail: jianjun.xiao@kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Travis, John R., E-mail: jack_travis@comcast.com [Engineering and Scientific Software Inc., 3010 Old Pecos Trail, Santa Fe, NM 87505 (United States); Royl, Peter, E-mail: peter.royl@partner.kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Necker, Gottfried, E-mail: gottfried.necker@partner.kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Svishchev, Anatoly, E-mail: anatoly.svishchev@kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Jordan, Thomas, E-mail: thomas.jordan@kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany)

    2016-05-15

    Highlights: • 3-D scalable semi-implicit pressure-based CFD code for containment safety analysis. • Robust solution algorithm valid for all-speed flows. • Well validated and widely used CFD code for hydrogen safety analysis. • Code applied in various types of nuclear reactor containments. • Parallelization enables high-fidelity models in large scale containment simulations. - Abstract: GASFLOW is a three dimensional semi-implicit all-speed CFD code which can be used to predict fluid dynamics, chemical kinetics, heat and mass transfer, aerosol transportation and other related phenomena involved in postulated accidents in nuclear reactor containments. The main purpose of the paper is to give a brief review on recent GASFLOW code development, validations and applications in the field of nuclear safety. GASFLOW code has been well validated by international experimental benchmarks, and has been widely applied to hydrogen safety analysis in various types of nuclear power plants in European and Asian countries, which have been summarized in this paper. Furthermore, four benchmark tests of a lid-driven cavity flow, low Mach number jet flow, 1-D shock tube and supersonic flow over a forward-facing step are presented in order to demonstrate the accuracy and wide-ranging capability of ICE’d ALE solution algorithm for all-speed flows. GASFLOW has been successfully parallelized using the paradigms of Message Passing Interface (MPI) and domain decomposition. The parallel version, GASFLOW-MPI, adds great value to large scale containment simulations by enabling high-fidelity models, including more geometric details and more complex physics. It will be helpful for the nuclear safety engineers to better understand the hydrogen safety related physical phenomena during the severe accident, to optimize the design of the hydrogen risk mitigation systems and to fulfill the licensing requirements by the nuclear regulatory authorities. GASFLOW-MPI is targeting a high

  13. Validation of the model of Critical Heat Flux COBRA-TF compared experiments of Post-Dryout performed by the Royal Institute of Technology (KTH); Validacion del Modelo de Critical Heat Flux de COBRA-TF frente a los Experimentos de Post-Dryout realizados por el Royal Institute of Technology (KTH)

    Energy Technology Data Exchange (ETDEWEB)

    Abarca, A.; Miro, R.; Barrachina, T.; Verdu, G.

    2014-07-01

    This work presents a validation of the results obtained with the different existing correlations in the CTF code for predicting the value and location of the CHF, applying them to the Post-Dryout experiments conducted by the Royal Institute of Technology (KTH) in Stockholm, Sweden. (Author)

  14. Validation of the Everyday Technology Use Questionnaire in a Japanese context

    OpenAIRE

    Camilla Malinowsky; Anders Kottorp; Rumi Tanemura; Eric Asaba; Toru Nagao; Kazue Noda; Jiro Sagara; Peter Bontje; Lena Rosenberg; Louise Nygård

    2015-01-01

    Background/Objective: The Everyday Technology Use Questionnaire (ETUQ), which evaluates the perceived relevance of and the perceived ability in everyday technology (ET) use, has demonstrated acceptable psychometric properties in Swedish studies of older adults. The aim of this study was to examine the reliability and validity of the ETUQ in a Japanese context in older Japanese adults. Methods: A sample of older Japanese adults (n = 164) including persons with (n = 32) and without (n = 132)...

  15. [Care with the child's health and validation of an educational technology for riverside families].

    Science.gov (United States)

    Teixeira, Elizabeth; de Almeida Siqueira, Aldo; da Silva, Joselice Pereira; Lavor, Lília Cunha

    2011-01-01

    This study aimed to assess the knowledge and ways of caring for the health of children aged 0-5 years among riverside families (Phase 1), and to validate an educational technology (Phase 2). A descriptive qualitative study was carried out. Focus groups and content analysis were used with the mothers, and forms were applied with specialist judges and the target public. The study revealed that concern with the care of children among the riverside families persists through daily adversity, with dedication and commitment of these families in maintaining the health of their children. The sensitive listening to the mothers indicated the need for a closer relationship between nursing professionals and families. The validation of the educational technology was convergent, within the parameters considered adequate.

  16. Autonomous rendezvous and docking: A commercial approach to on-orbit technology validation

    Science.gov (United States)

    Tchoryk, Peter, Jr.; Whitten, Raymond P.

    1991-01-01

    SpARC, in conjunction with its corporate affiliates, is planning an on-orbit validation of autonomous rendezvous and docking (ARD) technology. The emphasis in this program is to utilize existing technology and commercially available components wherever possible. The primary subsystems to be validated by this demonstration include GPS receivers for navigation, a video-based sensor for proximity operations, a fluid connector mechanism to demonstrate fluid resupply capability, and a compliant, single-point docking mechanism. The focus for this initial experiment will be ELV based and will make use of two residual Commercial Experiment Transporter (COMET) service modules. The first COMET spacecraft will be launched in late 1992 and will serve as the target vehicle. After the second COMET spacecraft has been launched in late 1994, the ARD demonstration will take place. The service module from the second COMET will serve as the chase vehicle.

  17. Validity of diagnostic codes and prevalence of physician-diagnosed psoriasis and psoriatic arthritis in southern Sweden--a population-based register study.

    Directory of Open Access Journals (Sweden)

    Sofia Löfvendahl

    Full Text Available OBJECTIVE: To validate diagnostic codes for psoriasis and psoriatic arthritis (PsA) and estimate the physician-diagnosed prevalence of psoriasis and PsA in the Skåne region, Sweden. METHODS: In the Skåne Healthcare Register (SHR), all healthcare consultations are continuously collected for all inhabitants of the Skåne region (population 1.2 million). During 2005-2010 we identified individuals with ≥1 physician consultation consistent with psoriasis (ICD-10). Within this group we also identified those diagnosed with PsA. We performed a validation by reviewing medical records in 100 randomly selected cases for psoriasis and for psoriasis with PsA, respectively. Further, we estimated the pre- and post-validation point prevalence by December 31, 2010. RESULTS: We identified 16 171 individuals (psoriasis alone: n = 13 185; psoriasis with PsA: n = 2 986). The proportion of ICD-10 codes that could be confirmed by review of medical records was 81% for psoriasis and 63% for psoriasis with PsA, with the highest percentage of confirmed codes for cases diagnosed on ≥2 occasions in specialized care. For 19% and 29% of the cases, respectively, it was not possible to determine the diagnosis due to insufficient information. Thus, the positive predictive value (PPV) of one ICD-10 code for psoriasis and for psoriasis with PsA ranged between 81-100% and 63-92%, respectively. Assuming the most conservative PPV, the post-validation prevalence was 1.23% (95% CI: 1.21-1.25) for psoriasis (with or without PsA), 1.02% (95% CI: 1.00-1.03) for psoriasis alone and 0.21% (95% CI: 0.20-0.22) for psoriasis with PsA. The post-validation prevalence of PsA in the psoriasis cohort was 17.3% (95% CI: 16.65-17.96). CONCLUSIONS: The proportion of diagnostic codes in SHR that could be verified varied with the frequency of diagnostic codes and the level of care, highlighting the importance of sensitivity analyses using different case ascertainment criteria. The prevalence of physician-diagnosed psoriasis and Ps
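    Post-validation prevalences of this kind follow from scaling register counts by the PPV obtained in chart review. A minimal sketch of that adjustment with a normal-approximation confidence interval (the PPV and population below are illustrative inputs, not the paper's exact calculation):

```python
import math

def adjusted_prevalence(n_register_cases, ppv, population, z=1.96):
    """Scale register-based case counts by the chart-review PPV and
    attach a normal-approximation 95% confidence interval."""
    p = n_register_cases * ppv / population
    half = z * math.sqrt(p * (1 - p) / population)
    return p, (p - half, p + half)

# illustrative: 16 171 register cases, a conservative PPV, population 1.2 M
p, (lo, hi) = adjusted_prevalence(16171, 0.81, 1_200_000)
```

    Running the same adjustment under several PPV assumptions (one per case-ascertainment criterion) gives the sensitivity analysis the authors recommend.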

  18. Temperature and heat flux datasets of a complex object in a fire plume for the validation of fire and thermal response codes.

    Energy Technology Data Exchange (ETDEWEB)

    Jernigan, Dann A.; Blanchat, Thomas K.

    2010-09-01

    It is necessary to improve understanding and develop temporally- and spatially-resolved integral scale validation data of the heat flux incident to a complex object in addition to measuring the thermal response of said object located within the fire plume for the validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparison between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.

  19. Instrument for assessing mobile technology acceptability in diabetes self-management: a validation and reliability study.

    Science.gov (United States)

    Frandes, Mirela; Deiac, Anca V; Timar, Bogdan; Lungeanu, Diana

    2017-01-01

    Nowadays, mobile technologies are part of everyday life, but the lack of instruments to assess their acceptability for the management of chronic diseases makes their actual adoption for this purpose slow. The objective of this study was to develop a survey instrument for assessing patients' attitude toward and intention to use mobile technology for diabetes mellitus (DM) self-management, as well as to identify sociodemographic characteristics and quality of life factors that affect them. We first conducted the documentation and instrument design phases, which were subsequently followed by the pilot study and instrument validation. Afterward, the instrument was administered to 103 patients (median age: 37 years; range: 18-65 years) diagnosed with type 1 or type 2 DM, who accepted to participate in the study. The reliability and construct validity were assessed by computing Cronbach's alpha and using factor analysis, respectively. The instrument included statements about the actual use of electronic devices for DM management, interaction between patient and physician, attitude toward using mobile technology, and quality of life evaluation. Cronbach's alpha was 0.9 for attitude toward using mobile technology and 0.97 for attitude toward using mobile device applications for DM self-management. Younger patients (Spearman's ρ=-0.429; P<0.001) with better glycemic control (Spearman's ρ=-0.322; P<0.001) and higher education level (Kendall's τ=0.51; P<0.001) had significantly more favorable attitudes toward using mobile assistive applications for DM self-management; among the qualities patients valued were truthfulness and easiness to use.
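
    The Cronbach's alpha figures reported in this record follow a standard formula; a minimal sketch of the computation, using made-up Likert responses rather than the study's data:

    ```python
    def cronbach_alpha(items):
        """Cronbach's alpha for a questionnaire scale.

        items: list of equal-length lists, one per item (each list holds
               every respondent's score on that question).
        alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
        """
        k, n = len(items), len(items[0])

        def var(xs):  # population variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / len(xs)

        totals = [sum(col[i] for col in items) for i in range(n)]
        return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

    # Hypothetical 5-point Likert answers from 4 respondents to 3 items
    items = [
        [4, 5, 3, 4],  # item 1
        [4, 4, 3, 5],  # item 2
        [5, 4, 2, 4],  # item 3
    ]
    print(round(cronbach_alpha(items), 3))  # → 0.645
    ```

    Values near 0.9, as reported in the record above, indicate high internal consistency of the scale.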

  20. Solid Warehouse Material Management System Based on ERP and Bar Code Technology

    Institute of Scientific and Technical Information of China (English)

    ZHANG Cheng; WANG Jie; YUAN Bing; WU Chao; HU Qiao-dan

    2004-01-01

    This paper presents a manufacturing material management system based on ERP that combines industrial bar code information collection with material management. The system structure and function model are investigated in depth, and a detailed application scheme is given.

  1. Workshop report - A validation study of Navier-Stokes codes for transverse injection into a Mach 2 flow

    Science.gov (United States)

    Eklund, Dean R.; Northam, G. B.; Mcdaniel, J. C.; Smith, Cliff

    1992-01-01

    A CFD (Computational Fluid Dynamics) competition was held at the Third Scramjet Combustor Modeling Workshop to assess the current state-of-the-art in CFD codes for the analysis of scramjet combustors. Solutions from six three-dimensional Navier-Stokes codes were compared for the case of staged injection of air behind a step into a Mach 2 flow. This case was investigated experimentally at the University of Virginia and extensive in-stream data was obtained. Code-to-code comparisons have been made with regard to both accuracy and efficiency. The turbulence models employed in the solutions are believed to be a major source of discrepancy between the six solutions.

  2. Potential prognostic long non-coding RNA identification and their validation in predicting survival of patients with multiple myeloma.

    Science.gov (United States)

    Hu, Ai-Xin; Huang, Zhi-Yong; Zhang, Lin; Shen, Jian

    2017-04-01

    Multiple myeloma, a typical hematological malignancy, is characterized by malignant proliferation of plasma cells. This study aimed to identify differentially expressed long non-coding RNAs to efficiently predict the survival of patients with multiple myeloma. Gene expression profiles of diagnosed patients with multiple myeloma, GSE24080 (559 samples) and GSE57317 (55 samples), were downloaded from the Gene Expression Omnibus database. After processing, survival-related long non-coding RNAs were identified by Cox regression analysis. The prognosis of multiple myeloma patients with differentially expressed long non-coding RNAs was predicted by Kaplan-Meier analysis. Meanwhile, stratified analysis was performed based on the concentrations of serum beta 2-microglobulin (S-beta 2m), albumin, and lactate dehydrogenase of multiple myeloma patients. Gene set enrichment analysis was performed to further explore the functions of identified long non-coding RNAs. A total of 176 long non-coding RNAs significantly related to the survival of multiple myeloma patients (p < 0.05) were identified. Gene set enrichment analysis-identified pathways of cell cycle, focal adhesion, and G2-M checkpoint were associated with these long non-coding RNAs. These 176 long non-coding RNAs, especially RP1-286D6.1, AC008875.2, MTMR9L, AC069360.2, and AL512791.1, were potential biomarkers to evaluate the prognosis of multiple myeloma patients. These long non-coding RNAs participated indispensably in many pathways associated with the development of multiple myeloma; however, the molecular mechanisms need to be further studied.
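
    The Kaplan-Meier analysis mentioned above reduces to a product-limit calculation; a minimal sketch with illustrative follow-up data (not the GSE24080/GSE57317 values):

    ```python
    def kaplan_meier(times, events):
        """Product-limit (Kaplan-Meier) survival estimate.

        times:  follow-up time per patient
        events: 1 if the event (e.g. death) was observed, 0 if censored
        Returns the survival curve as (time, probability) steps.
        """
        at_risk = len(times)
        surv, steps = 1.0, []
        for t in sorted(set(times)):
            deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
            if deaths:
                surv *= 1 - deaths / at_risk
                steps.append((t, surv))
            at_risk -= sum(1 for ti in times if ti == t)  # deaths and censored leave
        return steps

    # Hypothetical cohort: months of follow-up, 0 = censored observation
    times = [5, 8, 8, 12, 16, 20]
    events = [1, 1, 0, 1, 0, 1]
    print(kaplan_meier(times, events))  # survival ≈ 0.83, 0.67, 0.44, then 0.0
    ```

    In a study like the one above, high- and low-expression groups of a candidate long non-coding RNA would each receive such a curve, then be compared (e.g. with a log-rank test).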

  3. The relationship development assessment - research version: preliminary validation of a clinical tool and coding schemes to measure parent-child interaction in autism.

    Science.gov (United States)

    Larkin, Fionnuala; Guerin, Suzanne; Hobson, Jessica A; Gutstein, Steven E

    2015-04-01

    The aim of this project was to replicate and extend findings from two recent studies on parent-child relatedness in autism (Beurkens, Hobson, & Hobson, 2013; Hobson, Tarver, Beurkens, & Hobson, 2013, under review) by adapting an observational assessment and coding schemes of parent-child relatedness for the clinical context and examining their validity and reliability. The coding schemes focussed on three aspects of relatedness: joint attentional focus (Adamson, Bakeman, & Deckner, 2004), the capacity to co-regulate an interaction and the capacity to share emotional experiences. The participants were 40 children (20 with autism, 20 without autism) aged 6-14, and their parents. Parent-child dyads took part in the observational assessment and were coded on these schemes. Comparisons were made with standardised measures of autism severity (Autism Diagnostic Observation Schedule, ADOS: Lord, Rutter, DiLavore, & Risi, 2001; Social Responsiveness Scale, SRS: Constantino & Gruber, 2005), relationship quality (Parent Child Relationship Inventory, PCRI: Gerard, 1994) and quality of parent-child interaction (Dyadic Coding Scales, DCS: Humber & Moss, 2005). Inter-rater reliability was very good and, as predicted, codes both diverged from the measure of parent-child relationship and converged with a separate measure of parent-child interaction quality. A detailed profile review revealed nuanced areas of group and individual differences which may be specific to verbally-able school-age children. The results support the utility of the Relationship Development Assessment - Research Version for clinical practice. © The Author(s) 2013.

  4. Validation of the AZTRAN 1.1 code with problems Benchmark of LWR reactors; Validacion del codigo AZTRAN 1.1 con problemas Benchmark de reactores LWR

    Energy Technology Data Exchange (ETDEWEB)

    Vallejo Q, J. A.; Bastida O, G. E.; Francois L, J. L. [UNAM, Facultad de Ingenieria, Departamento de Sistemas Energeticos, Ciudad Universitaria, 04510 Ciudad de Mexico (Mexico); Xolocostli M, J. V.; Gomez T, A. M., E-mail: amhed.jvq@gmail.com [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2016-09-15

    The AZTRAN module is a computational program that is part of the AZTLAN platform (Mexican modeling platform for the analysis and design of nuclear reactors) and that solves the neutron transport equation in three dimensions using the discrete ordinates method S{sub N}, in steady state and Cartesian geometry. As part of the activities of Working Group 4 (users group) of the AZTLAN project, this work validates the AZTRAN code using the 2002 Yamamoto Benchmark for LWR reactors. For comparison, the commercial code CASMO-4 and the free code Serpent-2 are used; in addition, the results are compared with the data obtained from an article of the PHYSOR 2002 conference. The Benchmark consists of a fuel pin, two UO{sub 2} cells and two MOX cells; there is a problem for each cell for each reactor type, PWR and BWR. Although the AZTRAN code is at an early stage of development, the results obtained are encouraging and close to those reported with other internationally accepted codes and methodologies. (Author)

  5. Validity of the International Classification of Diseases 10th revision code for hyperkalaemia in elderly patients at presentation to an emergency department and at hospital admission

    Science.gov (United States)

    Fleet, Jamie L; Shariff, Salimah Z; Gandhi, Sonja; Weir, Matthew A; Jain, Arsh K; Garg, Amit X

    2012-01-01

    Objectives Evaluate the validity of the International Classification of Diseases, 10th revision (ICD-10) code for hyperkalaemia (E87.5) in two settings: at presentation to an emergency department and at hospital admission. Design Population-based validation study. Setting 12 hospitals in Southwestern Ontario, Canada, from 2003 to 2010. Participants Elderly patients with serum potassium values at presentation to an emergency department (n=64 579) and at hospital admission (n=64 497). Primary outcome Sensitivity, specificity, positive-predictive value and negative-predictive value. Serum potassium values in patients with and without a hyperkalaemia code (code positive and code negative, respectively). Results The sensitivity of the best-performing ICD-10 coding algorithm for hyperkalaemia (defined by serum potassium >5.5 mmol/l) was 14.1% (95% CI 12.5% to 15.9%) at presentation to an emergency department and 14.6% (95% CI 13.3% to 16.1%) at hospital admission. Both specificities were greater than 99%. In the two settings, the positive-predictive values were 83.2% (95% CI 78.4% to 87.1%) and 62.0% (95% CI 57.9% to 66.0%), while the negative-predictive values were 97.8% (95% CI 97.6% to 97.9%) and 96.9% (95% CI 96.8% to 97.1%). In patients who were code positive for hyperkalaemia, median (IQR) serum potassium values were 6.1 (5.7 to 6.8) mmol/l at presentation to an emergency department and 6.0 (5.1 to 6.7) mmol/l at hospital admission. For code-negative patients median (IQR) serum potassium values were 4.0 (3.7 to 4.4) mmol/l and 4.1 (3.8 to 4.5) mmol/l in each of the two settings, respectively. Conclusions Patients with hospital encounters who were ICD-10 E87.5 hyperkalaemia code positive and negative had distinct higher and lower serum potassium values, respectively. However, due to very low sensitivity, the incidence of hyperkalaemia is underestimated. PMID:23274674
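
    The four validity measures reported in this record all derive from a 2x2 confusion matrix; a small sketch with illustrative counts chosen to echo the emergency-department figures (not the study's actual cell counts):

    ```python
    def validity_measures(tp, fp, fn, tn):
        """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
        return {
            "sensitivity": tp / (tp + fn),  # code-positive among true hyperkalaemia
            "specificity": tn / (tn + fp),  # code-negative among unaffected
            "ppv": tp / (tp + fp),          # true hyperkalaemia among code-positives
            "npv": tn / (tn + fn),          # unaffected among code-negatives
        }

    # Illustrative counts only: 2 000 true cases, of which the code flags 282
    m = validity_measures(tp=282, fp=57, fn=1718, tn=62522)
    print({k: round(v, 3) for k, v in m.items()})
    # → {'sensitivity': 0.141, 'specificity': 0.999, 'ppv': 0.832, 'npv': 0.973}
    ```

    The pattern in the record — very low sensitivity with high specificity and PPV — means the code rarely fires, but when it does it is usually right.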

  6. PhpHMM Tool for Generating Speech Recogniser Source Codes Using Web Technologies

    Directory of Open Access Journals (Sweden)

    R. Krejčí

    2011-01-01

    Full Text Available This paper deals with the “phpHMM” software tool, which facilitates the development and optimisation of speech recognition algorithms. This tool is being developed in the Speech Processing Group at the Department of Circuit Theory, CTU in Prague, and it is used to generate the source code of a speech recogniser by means of the PHP scripting language and the MySQL database. The input of the system is a model of speech in a standard HTK format and a list of words to be recognised. The output consists of the source codes and data structures in C programming language, which are then compiled into an executable program. This tool is operated via a web interface.

  7. Recommendations for Technology Development and Validation Activities in Support of the Origins Program

    Science.gov (United States)

    Capps, Richard W. (Editor)

    1996-01-01

    The Office of Space Science (OSS) has initiated mission concept studies and associated technology roadmapping activities for future large space optical systems. The scientific motivation for these systems is the study of the origins of galaxies, stars, planetary systems and, ultimately, life. Collectively, these studies are part of the 'Astronomical Search for Origins and Planetary Systems Program' or 'Origins Program'. A series of at least three science missions and associated technology validation flights is currently envisioned in the time frame between the year 1999 and approximately 2020. These would be the Space Interferometry Mission (SIM), a 10-meter baseline Michelson stellar interferometer; the Next Generation Space Telescope (NGST), a space-based infrared optimized telescope with aperture diameter larger than four meters; and the Terrestrial Planet Finder (TPF), an 80-meter baseline-nulling Michelson interferometer described in the Exploration of Neighboring Planetary Systems (ExNPS) Study. While all of these missions include significant technological challenges, preliminary studies indicate that the technological requirements are achievable. However, immediate and aggressive technology development is needed. The Office of Space Access and Technology (OSAT) is the primary sponsor of NASA-unique technology for missions such as the Origins series. For some time, the OSAT Space Technology Program has been developing technologies for large space optical systems, including both interferometers and large-aperture telescopes. In addition, technology investments have been made by other NASA programs, including OSS; other government agencies, particularly the Department of Defense; and by the aerospace industrial community. This basis of prior technology investment provides much of the rationale for confidence in the feasibility of the advanced Origins missions. In response to the enhanced interest of both the user community and senior NASA management in large

  8. Passive Nosetip Technology (PANT) Program. Volume XVIII. Nosetip Analyses Using the EROS Computer Code

    Science.gov (United States)

    1975-06-01

    The film coefficient approach enables the modeling of heterogeneous reaction and sublimation kinetics and of unequal species diffusion coefficients. As an exercise of the shape change numerical procedures in the EROS computer code, two camphor shape change solutions were generated.

  9. Validation methodology for the evaluation of thermal-hydraulic sub-channel codes devoted to LOCA simulations

    Energy Technology Data Exchange (ETDEWEB)

    Seiler, N.; Ruyer, P.; Biton, B., E-mail: nathalie.seiler@irsn.fr, E-mail: pierre.ruyer@irsn.fr [IRSN/DPAM/SEMCA/LEMAR, CE Cadarache, Saint Paul lez Durance (France)

    2011-07-01

    This study focuses on thermal-hydraulic simulations, at sub-channel scale, of a damaged PWR reactor core during a Loss Of Coolant Accident (LOCA). The aim of this study is to accurately simulate the thermal-hydraulics to provide the thermal-mechanical code DRACCAR with an accurate wall heat transfer law. This latter code is developed by the French Safety Institute “Institut de Radioprotection et de Surete Nucleaire” (IRSN) to evaluate the thermics and deformations of fuel assemblies within the core. The present paper first describes the methodology considered to evaluate the capabilities of existing codes CATHARE-3 and CESAR to simulate dispersed droplet flows at a sub-channel scale and then provides some first evaluations of them. (author)

  10. CMMI congress. International codes, technology and sustainability for the minerals industry. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    Papers cover three main themes: international reporting standards; new technology and competition (seeing through rock, controlling fragmentation, asset utilisation); and sustainability for the minerals industry. Four papers have been abstracted separately for the Coal Abstracts database.

  11. Use of gadolinium burnable absorbers in VVER Type Reactors. Validation of WIMS-D/4 code; Empleo del gadolinio como absorbente quemable en los reactores nucleares VVER. Validacion del codigo WIMS-D/4

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez Cardona, Caridad M.; Guerra Valdes, Ramiro; Lopez Aldama, Daniel [Centro de Tecnologia Nuclear, La Habana (Cuba)

    1996-07-01

    Burnable absorbers are not used in currently operating WWERs, but in order to optimize the fuel cycle and enhance operational safety, gadolinium or a similar burnable absorber should also be introduced in these reactors. For this purpose, adequate tools for properly calculating local effects in hexagonal geometries should be developed and validated. The present paper gives the main results of validating the WIMS-D/4 lattice code for WWER lattices bearing Gd burnable absorbers. To validate the code, experimental and calculational benchmarks proposed in an IAEA Coordinated Research Program were solved. A code system for the optimization of the Gd axial distribution in a WWER reactor was developed and is also presented here. (author)

  12. Proposal to consistently apply the International Code of Nomenclature of Prokaryotes (ICNP) to names of the oxygenic photosynthetic bacteria (cyanobacteria), including those validly published under the International Code of Botanical Nomenclature (ICBN)/International Code of Nomenclature for algae, fungi and plants (ICN), and proposal to change Principle 2 of the ICNP.

    Science.gov (United States)

    Pinevich, Alexander V

    2015-03-01

    This taxonomic note was motivated by the recent proposal [Oren & Garrity (2014) Int J Syst Evol Microbiol 64, 309-310] to exclude the oxygenic photosynthetic bacteria (cyanobacteria) from the wording of General Consideration 5 of the International Code of Nomenclature of Prokaryotes (ICNP), which entails unilateral coverage of these prokaryotes by the International Code of Nomenclature for algae, fungi, and plants (ICN; formerly the International Code of Botanical Nomenclature, ICBN). On the basis of key viewpoints, approaches and rules in the systematics, taxonomy and nomenclature of prokaryotes it is reciprocally proposed to apply the ICNP to names of cyanobacteria including those validly published under the ICBN/ICN. For this purpose, a change to Principle 2 of the ICNP is proposed to enable validation of cyanobacterial names published under the ICBN/ICN rules. © 2015 IUMS.

  13. Cost-Effective ISS Space-Environment Technology Validation of Advanced Roll-Out Solar Array (ROSA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — DSS proposes to systematically mature, mitigate risk for; and perform hardware-based ground validations / demonstrations of a low-cost, high technology payoff,...

  14. Euler Technology Assessment for Preliminary Aircraft Design: Compressibility Predictions by Employing the Cartesian Unstructured Grid SPLITFLOW Code

    Science.gov (United States)

    Finley, Dennis B.; Karman, Steve L., Jr.

    1996-01-01

    The objective of the second phase of the Euler Technology Assessment program was to evaluate the ability of Euler computational fluid dynamics codes to predict compressible flow effects over a generic fighter wind tunnel model. This portion of the study was conducted by Lockheed Martin Tactical Aircraft Systems, using an in-house Cartesian-grid code called SPLITFLOW. The Cartesian grid technique offers several advantages, including ease of volume grid generation and reduced number of cells compared to other grid schemes. SPLITFLOW also includes grid adaption of the volume grid during the solution to resolve high-gradient regions. The SPLITFLOW code predictions of configuration forces and moments are shown to be adequate for preliminary design, including predictions of sideslip effects and the effects of geometry variations at low and high angles-of-attack. The transonic pressure prediction capabilities of SPLITFLOW are shown to be improved over subsonic comparisons. The time required to generate the results from initial surface data is on the order of several hours, including grid generation, which is compatible with the needs of the design environment.

  15. Transparent ICD and DRG coding using information technology: linking and associating information sources with the eXtensible Markup Language.

    Science.gov (United States)

    Hoelzer, Simon; Schweiger, Ralf K; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or "semantically associated" parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach.
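
    A hierarchical classification of the kind described can be represented and traversed in a few lines; the XML element and attribute names below are illustrative only, not the authors' actual schema:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical XML fragment of a hierarchical classification;
    # tag and attribute names are invented for illustration.
    doc = """
    <chapter code="IV" title="Endocrine, nutritional and metabolic diseases">
      <block code="E10-E14" title="Diabetes mellitus">
        <category code="E11" title="Type 2 diabetes mellitus">
          <subcategory code="E11.9"
                       title="Type 2 diabetes mellitus without complications"/>
        </category>
      </block>
    </chapter>
    """

    root = ET.fromstring(doc)

    def walk(elem, depth=0):
        """Print each code with indentation reflecting the hierarchy."""
        print("  " * depth + elem.get("code"), "-", elem.get("title"))
        for child in elem:
            walk(child, depth + 1)

    walk(root)
    ```

    Because the hierarchy is explicit in the markup, standard XML tooling (XPath queries, XSLT, schema validation) can operate on the classification without custom parsers — the interoperability point the abstract makes.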

  16. QR Code Watermark Technology with Strong Robustness

    Institute of Scientific and Technical Information of China (English)

    王子煜; 孙刘杰; 李孟涛

    2012-01-01

    A digital watermarking technique that enhances the robustness of a digital watermarking system was put forward. The method encodes the watermark information as a quick response code (QR code) and embeds the QR code as a digital watermark into the intermediate-frequency coefficients of the digital image obtained by a two-level discrete wavelet transform. The method increases robustness while preserving the invisibility of the watermark. A comparative experiment was carried out, and the results showed that using a QR code as a digital watermark increases security and robustness, and that the approach is applicable to any kind of watermarking algorithm.
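
    The embedding idea — hide watermark bits in mid-frequency coefficients of a two-level wavelet decomposition — can be sketched with a 1-D Haar transform; a toy illustration, not the paper's algorithm (which operates on full images and QR matrices):

    ```python
    def haar_step(signal):
        """One level of the 1-D Haar DWT: (approximation, detail) bands."""
        approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
        detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
        return approx, detail

    def embed_bits(signal, bits, strength=4.0):
        """Two-level Haar decomposition; shift level-2 detail coefficients
        (a stand-in for the mid-frequency band) by +/-strength, reconstruct."""
        a1, d1 = haar_step(signal)
        a2, d2 = haar_step(a1)
        d2 = [c + (strength if b else -strength) for c, b in zip(d2, bits)]
        a1 = [x for a, d in zip(a2, d2) for x in (a + d, a - d)]   # invert level 2
        return [x for a, d in zip(a1, d1) for x in (a + d, a - d)]  # invert level 1

    def extract_bits(marked, original, n_bits):
        """Recover bits by comparing level-2 detail bands of the two signals."""
        d2_marked = haar_step(haar_step(marked)[0])[1]
        d2_orig = haar_step(haar_step(original)[0])[1]
        return [1 if m > o else 0
                for m, o in zip(d2_marked[:n_bits], d2_orig[:n_bits])]

    host = [10, 12, 14, 13, 9, 8, 11, 15]  # stand-in for a row of image pixels
    bits = [1, 0]                          # stand-in for two QR-code modules
    print(extract_bits(embed_bits(host, bits), host, 2))  # → [1, 0]
    ```

    Embedding in detail coefficients rather than raw pixels is what trades robustness against invisibility: the `strength` parameter (an assumption here) spreads each bit across several samples.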

  17. Instrument for assessing mobile technology acceptability in diabetes self-management: a validation and reliability study

    Directory of Open Access Journals (Sweden)

    Frandes M

    2017-02-01

    Full Text Available Mirela Frandes,1 Anca V Deiac,2 Bogdan Timar,1,3 Diana Lungeanu1,2 1Department of Functional Sciences, “Victor Babes” University of Medicine and Pharmacy of Timisoara, 2Department of Mathematics, Polytechnic University of Timisoara, 3Third Medical Clinic, Emergency Hospital of Timisoara, Timisoara, Romania Background: Nowadays, mobile technologies are part of everyday life, but the lack of instruments to assess their acceptability for the management of chronic diseases makes their actual adoption for this purpose slow. Objective: The objective of this study was to develop a survey instrument for assessing patients’ attitude toward and intention to use mobile technology for diabetes mellitus (DM) self-management, as well as to identify sociodemographic characteristics and quality of life factors that affect them. Methods: We first conducted the documentation and instrument design phases, which were subsequently followed by the pilot study and instrument validation. Afterward, the instrument was administered to 103 patients (median age: 37 years; range: 18–65 years) diagnosed with type 1 or type 2 DM, who accepted to participate in the study. The reliability and construct validity were assessed by computing Cronbach’s alpha and using factor analysis, respectively. Results: The instrument included statements about the actual use of electronic devices for DM management, interaction between patient and physician, attitude toward using mobile technology, and quality of life evaluation. Cronbach’s alpha was 0.9 for attitude toward using mobile technology and 0.97 for attitude toward using mobile device applications for DM self-management. Younger patients (Spearman’s ρ=-0.429; P<0.001) with better glycemic control (Spearman’s ρ=-0.322; P<0.001) and higher education level (Kendall’s τ=0.51; P<0.001) had significantly more favorable attitude toward using mobile assistive applications for DM control. Moreover, patients with a higher quality of

  18. Validation of an electroseismic and seismoelectric modeling code, for layered earth models, by the explicit homogeneous space solutions

    NARCIS (Netherlands)

    Grobbe, N.; Slob, E.C.

    2013-01-01

    We have developed an analytically based, energy flux-normalized numerical modeling code (ESSEMOD), capable of modeling the wave propagation of all existing electroseismic and seismoelectric source-receiver combinations in horizontally layered configurations. We compare the results of several of these

  19. ANITA-2000 activation code package - updating of the decay data libraries and validation on the experimental data of the 14 MeV Frascati Neutron Generator

    Directory of Open Access Journals (Sweden)

    Frisoni Manuela

    2016-01-01

    Full Text Available ANITA-2000 is a code package for the activation characterization of materials exposed to neutron irradiation, released by ENEA to OECD-NEADB and ORNL-RSICC. The main component of the package is the activation code ANITA-4M, which computes the radioactive inventory of a material exposed to neutron irradiation. The code requires the decay data library (file fl1), containing the quantities describing the decay properties of the unstable nuclides, and the library (file fl2), containing the gamma ray spectra emitted by the radioactive nuclei. The fl1 and fl2 files of the ANITA-2000 code package, originally based on the evaluated nuclear data library FENDL/D-2.0, were recently updated on the basis of the JEFF-3.1.1 Radioactive Decay Data Library. This paper presents the results of the validation of the new fl1 decay data library through the comparison of the ANITA-4M calculated values with the measured electron and photon decay heats and activities of fusion material samples irradiated at the 14 MeV Frascati Neutron Generator (FNG) of the NEA-Frascati Research Centre. Twelve material samples were considered, namely: Mo, Cu, Hf, Mg, Ni, Cd, Sn, Re, Ti, W, Ag and Al. The ratios between calculated and experimental values (C/E) are shown and discussed in this paper.

  20. The Theory of Planned Behavior (TPB) and Pre-Service Teachers' Technology Acceptance: A Validation Study Using Structural Equation Modeling

    Science.gov (United States)

    Teo, Timothy; Tan, Lynde

    2012-01-01

    This study applies the theory of planned behavior (TPB), a theory that is commonly used in commercial settings, to the educational context to explain pre-service teachers' technology acceptance. It is also interested in examining its validity when used for this purpose. It has found evidence that the TPB is a valid model to explain pre-service…

  2. Educational Technology Acceptance across Cultures: A Validation of the Unified Theory of Acceptance and Use of Technology in the Context of Turkish National Culture

    Science.gov (United States)

    Gogus, Aytac; Nistor, Nicolae; Riley, Richard W.; Lerche, Thomas

    2012-01-01

    The Unified Theory of Acceptance and Use of Technology (UTAUT; Venkatesh et al., 2003, 2012) proposes a major model of educational technology acceptance (ETA) which has so far been validated in only a few languages and cultures. Therefore, this study aims at extending the applicability of the UTAUT to Turkish culture. Based on acceptance and cultural data…

  3. Validating a measure to assess factors that affect assistive technology use by students with disabilities in elementary and secondary education.

    Science.gov (United States)

    Zapf, Susan A; Scherer, Marcia J; Baxter, Mary F; H Rintala, Diana

    2016-01-01

    The purpose of this study was to measure the predictive validity, internal consistency and clinical utility of the Matching Assistive Technology to Child & Augmentative Communication Evaluation Simplified (MATCH-ACES) assessment. Twenty-three assistive technology team evaluators assessed 35 children using the MATCH-ACES assessment. This quasi-experimental study examined the internal consistency, predictive validity and clinical utility of the MATCH-ACES assessment. The MATCH-ACES predisposition scales had good internal consistency across all three scales. A significant relationship was found between (a) high student perseverance and need for assistive technology and (b) high teacher comfort and interest in technology use (p = 0.002). Study results indicate that the MATCH-ACES assessment has good internal consistency and validity. Predisposition characteristics of student and teacher combined can influence the level of assistive technology use; therefore, assistive technology teams should assess predisposition factors of the user when recommending assistive technology. Implications for Rehabilitation: Educational and medical professionals should be educated on evidence-based assistive technology assessments. Personal experience and psychosocial factors can influence the outcome use of assistive technology. Assistive technology assessments must include an intervention plan for assistive technology service delivery to measure effective outcome use.

  4. Euler technology assessment for preliminary aircraft design employing OVERFLOW code with multiblock structured-grid method

    Science.gov (United States)

    Treiber, David A.; Muilenburg, Dennis A.

    1995-01-01

    The viability of applying a state-of-the-art Euler code to calculate the aerodynamic forces and moments through maximum lift coefficient for a generic sharp-edge configuration is assessed. The OVERFLOW code, a method employing overset (Chimera) grids, was used to conduct mesh refinement studies, a wind-tunnel wall sensitivity study, and a 22-run computational matrix of flow conditions, including sideslip runs and geometry variations. The subject configuration was a generic wing-body-tail geometry with chined forebody, swept wing leading-edge, and deflected part-span leading-edge flap. The analysis showed that the Euler method is adequate for capturing some of the non-linear aerodynamic effects resulting from leading-edge and forebody vortices produced at high angle-of-attack through C(sub Lmax). Computed forces and moments, as well as surface pressures, match well enough that useful preliminary design information can be extracted. Vortex burst effects and vortex interactions with the configuration are also investigated.

  5. Color coded multiple access scheme for bidirectional multiuser visible light communications in smart home technologies

    Science.gov (United States)

    Tiwari, Samrat Vikramaditya; Sewaiwar, Atul; Chung, Yeon-Ho

    2015-10-01

    In optical wireless communications, multiple channel transmission is an attractive solution to enhancing capacity and system performance. A new modulation scheme called color coded multiple access (CCMA) for bidirectional multiuser visible light communications (VLC) is presented for smart home applications. The proposed scheme uses red, green and blue (RGB) light emitting diodes (LED) for downlink and phosphor based white LED (P-LED) for uplink to establish a bidirectional VLC and also employs orthogonal codes to support multiple users and devices. The downlink transmission for data user devices and smart home devices is provided using red and green colors from the RGB LEDs, respectively, while uplink transmission from both types of devices is performed using the blue color from P-LEDs. Simulations are conducted to verify the performance of the proposed scheme. It is found that the proposed bidirectional multiuser scheme is efficient in terms of data rate and performance. In addition, since the proposed scheme uses RGB signals for downlink data transmission, it provides flicker-free illumination that would lend itself to multiuser VLC system for smart home applications.
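The user separation that schemes like CCMA rely on rests on code orthogonality. A minimal sketch using Walsh-Hadamard codes, one standard orthogonal family, is shown below; the paper's actual codes, chip rates, and LED color mapping are not reproduced here.

```python
# Sketch of orthogonal-code multiple access, the principle behind CCMA.
# Walsh-Hadamard codes are one standard orthogonal family; the specific
# codes and RGB/P-LED mapping used in the paper are not reproduced here.

def hadamard(n):
    """Walsh-Hadamard matrix of order n (n a power of two), entries +/-1."""
    if n == 1:
        return [[1]]
    h = hadamard(n // 2)
    return ([row + row for row in h] +
            [row + [-x for x in row] for row in h])

def spread(bits, code):
    """Spread each data bit (+/-1) over the user's code chips."""
    return [b * c for b in bits for c in code]

def despread(signal, code):
    """Correlate the summed channel with one code to recover that user's bits."""
    n = len(code)
    out = []
    for i in range(0, len(signal), n):
        corr = sum(s * c for s, c in zip(signal[i:i + n], code))
        out.append(1 if corr > 0 else -1)
    return out

codes = hadamard(4)
user1, user2 = [1, -1, 1], [-1, -1, 1]          # two users' bit streams
tx = [a + b for a, b in zip(spread(user1, codes[1]), spread(user2, codes[2]))]
print(despread(tx, codes[1]), despread(tx, codes[2]))
```

Because the two codes are orthogonal, each correlation cancels the other user's contribution exactly, so both bit streams are recovered from the single summed channel.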

  6. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  7. Instrument for assessing mobile technology acceptability in diabetes self-management: a validation and reliability study

    Science.gov (United States)

    Frandes, Mirela; Deiac, Anca V; Timar, Bogdan; Lungeanu, Diana

    2017-01-01

    Background Nowadays, mobile technologies are part of everyday life, but the lack of instruments to assess their acceptability for the management of chronic diseases makes their actual adoption for this purpose slow. Objective The objective of this study was to develop a survey instrument for assessing patients’ attitude toward and intention to use mobile technology for diabetes mellitus (DM) self-management, as well as to identify sociodemographic characteristics and quality of life factors that affect them. Methods We first conducted the documentation and instrument design phases, which were subsequently followed by the pilot study and instrument validation. Afterward, the instrument was administered to 103 patients (median age: 37 years; range: 18–65 years) diagnosed with type 1 or type 2 DM, who agreed to participate in the study. The reliability and construct validity were assessed by computing Cronbach’s alpha and using factor analysis, respectively. Results The instrument included statements about the actual use of electronic devices for DM management, interaction between patient and physician, attitude toward using mobile technology, and quality of life evaluation. Cronbach’s alpha was 0.9 for attitude toward using mobile technology and 0.97 for attitude toward using mobile device applications for DM self-management. Younger patients (Spearman’s ρ=−0.429; P<0.001) with better glycemic control (Spearman’s ρ=−0.322; P<0.001) and higher education level (Kendall’s τ=0.51; P<0.001) had a significantly more favorable attitude toward using mobile assistive applications for DM control. Moreover, patients with a higher quality of life presented a significantly more positive attitude toward using modern technology (Spearman’s ρ=0.466; P<0.001). Conclusion The instrument showed good reliability and internal consistency, making it suitable for measuring the acceptability of mobile technology for DM self-management. Additionally, we found that even
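The correlations quoted above (Spearman's ρ) are Pearson correlations computed on ranks, with ties sharing their mean rank. A standard-library sketch on invented age/attitude data:

```python
# Spearman's rank correlation, the statistic reported in the study,
# computed with the standard library only on made-up illustrative data.

def ranks(xs):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # mean of the tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(xs, ys):
    """Pearson correlation of the two rank vectors."""
    rx, ry = ranks(xs), ranks(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

age = [22, 35, 47, 29, 58, 41]
attitude = [5, 4, 2, 4, 1, 3]          # higher = more favorable (hypothetical)
print(round(spearman_rho(age, attitude), 3))
```

The strongly negative ρ in this toy data mirrors the reported pattern of younger patients holding more favorable attitudes.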

  8. Creating Tomorrow's Technologists: Contrasting Information Technology Curriculum in North American Library and Information Science Graduate Programs against Code4lib Job Listings

    Science.gov (United States)

    Maceli, Monica

    2015-01-01

    This research study explores technology-related course offerings in ALA-accredited library and information science (LIS) graduate programs in North America. These data are juxtaposed against a text analysis of several thousand LIS-specific technology job listings from the Code4lib jobs website. Starting in 2003, as a popular library technology…

  10. Preliminary assessment of existing experimental data for validation of reactor physics codes and data for NGNP design and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Terry, W. K.; Jewell, J. K.; Briggs, J. B.; Taiwo, T. A.; Park, W.S.; Khalil, H. S.

    2005-10-25

    The Next Generation Nuclear Plant (NGNP), a demonstration reactor and hydrogen production facility proposed for construction at the INEEL, is expected to be a high-temperature gas-cooled reactor (HTGR). Computer codes used in design and safety analysis for the NGNP must be benchmarked against experimental data. The INEEL and ANL have examined information about several past and present experimental and prototypical facilities based on HTGR concepts to assess the potential of these facilities for use in this benchmarking effort. Both reactors and critical facilities applicable to pebble-bed and prismatic block-type cores have been considered. Four facilities--HTR-PROTEUS, HTR-10, ASTRA, and AVR--appear to have the greatest potential for use in benchmarking codes for pebble-bed reactors. Similarly, for the prismatic block-type reactor design, two experiments have been ranked as having the highest priority--HTTR and VHTRC.

  11. Trace Code Validation for BWR Spray Cooling Injection and CCFL Condition Based on GÖTA Facility Experiments

    Directory of Open Access Journals (Sweden)

    Stefano Racca

    2012-01-01

    Full Text Available Best estimate codes have been used for the past thirty years in the design, licensing, and safety analysis of NPPs. Nevertheless, large efforts are necessary for the qualification and assessment of such codes. The aim of this work is to study the main phenomena involved in emergency spray cooling injection in a Swedish-designed BWR. For this purpose, data from the Swedish separate-effect test facility GÖTA have been simulated using TRACE version 5.0 Patch 2. Furthermore, uncertainty calculations have been performed with the propagation-of-input-errors method, and the input parameters that most influence the peak cladding temperature have been identified.

  12. Development and Validation of the Computer Technology Literacy Self-Assessment Scale for Taiwanese Elementary School Students

    Science.gov (United States)

    Chang, Chiung-Sui

    2008-01-01

    The purpose of this study was to describe the development and validation of an instrument to identify various dimensions of the computer technology literacy self-assessment scale (CTLS) for elementary school students. The instrument included five CTLS dimensions (subscales): the technology operation skills, the computer usages concepts, the…

  13. A Radiation-Hydrodynamics Code Comparison for Laser-Produced Plasmas: FLASH versus HYDRA and the Results of Validation Experiments

    CERN Document Server

    Orban, Chris; Chawla, Sugreev; Wilks, Scott C; Lamb, Donald Q

    2013-01-01

    The potential for laser-produced plasmas to yield fundamental insights into high energy density physics (HEDP) and deliver other useful applications can sometimes be frustrated by uncertainties in modeling the properties and expansion of these plasmas using radiation-hydrodynamics codes. In an effort to overcome this and to corroborate the accuracy of the HEDP capabilities recently added to the publicly available FLASH radiation-hydrodynamics code, we present detailed comparisons of FLASH results to new and previously published results from the HYDRA code used extensively at Lawrence Livermore National Laboratory. We focus on two very different problems of interest: (1) an Aluminum slab irradiated by 15.3 and 76.7 mJ of "pre-pulse" laser energy and (2) a mm-long triangular groove cut in an Aluminum target irradiated by a rectangular laser beam. Because this latter problem bears a resemblance to astrophysical jets, Grava et al., Phys. Rev. E, 78, (2008) performed this experiment and compared detailed x-ray int...

  14. Verification Calculation Results to Validate the Procedures and Codes for Pin-by-Pin Power Computation in VVER Type Reactors with MOX Fuel Loading

    Energy Technology Data Exchange (ETDEWEB)

    Chizhikova, Z.N.; Kalashnikov, A.G.; Kapranova, E.N.; Korobitsyn, V.E.; Manturov, G.N.; Tsiboulia, A.A.

    1998-12-01

    One of the important problems in ensuring the safety of a VVER type reactor partially loaded with MOX fuel is the choice of appropriate physical zoning to achieve maximum flattening of the pin-by-pin power distribution. When uranium fuel is replaced by MOX fuel, provided that the reactivity of the fuel assemblies is kept constant, the fuel enrichment decreases slightly. However, the spectrum-averaged fission microscopic cross-section of {sup 239}Pu is approximately twice that of {sup 235}U. Therefore power peaks occur in the peripheral fuel assemblies containing MOX fuel, aggravated by the interassembly water. Physical zoning has to be applied to flatten the power peaks in fuel assemblies containing MOX fuel. Moreover, physical zoning cannot be confined to one row of fuel elements as is the case with a uniform lattice of uranium fuel assemblies. Both the water gap and the jump in neutron absorption macroscopic cross-sections at the interface of fuel assemblies with different fuels complicate the calculation of the space-energy neutron flux distribution, since they increase nondiffusion effects. To solve this problem it is necessary to update the current codes, to develop new codes, and to verify all the codes, including the nuclear-physics constants libraries employed. In so doing it is important to develop and validate codes of different levels--from design codes to benchmark ones. This paper presents the results of a burnup calculation for a multiassembly structure consisting of MOX fuel assemblies surrounded by uranium dioxide fuel assemblies. The structure concerned can be assumed to model a fuel assembly lattice symmetry element of a VVER-1000 type reactor in which 1/4 of all fuel assemblies contain MOX fuel.

  15. Validation of ICD-9-CM/ICD-10 coding algorithms for the identification of patients with acetaminophen overdose and hepatotoxicity using administrative data

    Directory of Open Access Journals (Sweden)

    Shaheen Abdel

    2007-10-01

    Full Text Available Abstract Background Acetaminophen overdose is the most common cause of acute liver failure (ALF). Our objective was to develop coding algorithms using administrative data for identifying patients with acetaminophen overdose and hepatic complications. Methods Patients hospitalized for acetaminophen overdose were identified using population-based administrative data (1995–2004). Coding algorithms for acetaminophen overdose, hepatotoxicity (alanine aminotransferase >1,000 U/L) and ALF (encephalopathy and international normalized ratio >1.5) were derived using chart abstraction data as the reference and logistic regression analyses. Results Of 1,776 potential acetaminophen overdose cases, the charts of 181 patients were reviewed; 139 (77%) had confirmed acetaminophen overdose. An algorithm including codes 965.4 (ICD-9-CM) and T39.1 (ICD-10) was highly accurate (sensitivity 90% [95% confidence interval 84–94%], specificity 83% [69–93%], positive predictive value 95% [89–98%], negative predictive value 71% [57–83%], c-statistic 0.87 [0.80–0.93]). Algorithms for hepatotoxicity (including codes for hepatic necrosis, toxic hepatitis and encephalopathy) and ALF (hepatic necrosis and encephalopathy) were also highly predictive (c-statistics = 0.88). The accuracy of the algorithms was not affected by age, gender, or ICD coding system, but the acetaminophen overdose algorithm varied between hospitals (c-statistics 0.84–0.98; P = 0.003). Conclusion Administrative databases can be used to identify patients with acetaminophen overdose and hepatic complications. If externally validated, these algorithms will facilitate investigations of the epidemiology and outcomes of acetaminophen overdose.
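The reported sensitivity, specificity, PPV, and NPV are simple ratios over the 2x2 table of algorithm results against chart review (the reference standard). The cell counts below are illustrative, chosen only to roughly reproduce the published values; the study's actual table is not given here.

```python
# Validation metrics for a coding algorithm against chart review.
# The 2x2 cell counts below are illustrative, not the study's data.

def accuracy_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),   # algorithm-positive among true cases
        "specificity": tn / (tn + fp),   # algorithm-negative among non-cases
        "ppv": tp / (tp + fp),           # true cases among algorithm positives
        "npv": tn / (tn + fn),           # non-cases among algorithm negatives
    }

m = accuracy_metrics(tp=125, fp=7, fn=14, tn=35)
print({k: round(v, 2) for k, v in m.items()})
```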

  16. Study of cold neutron sources: Implementation and validation of a complete computation scheme for research reactor using Monte Carlo codes TRIPOLI-4.4 and McStas

    Energy Technology Data Exchange (ETDEWEB)

    Campioni, Guillaume; Mounier, Claude [Commissariat a l' Energie Atomique, CEA, 31-33, rue de la Federation, 75752 Paris cedex (France)

    2006-07-01

    The main goal of this thesis on cold neutron sources (CNS) in research reactors was to create a complete set of tools for designing CNS efficiently. The work addresses the problem of running accurate simulations of experimental devices inside a reactor reflector that remain practical for parametric studies. On one hand, deterministic codes have reasonable computation times but pose problems for geometrical description. On the other hand, Monte Carlo codes can compute on precise geometry, but need computation times so long that parametric studies are impossible. To decrease this computation time, several developments were made in the Monte Carlo code TRIPOLI-4.4. An uncoupling technique is used to isolate a study zone in the complete reactor geometry. By recording boundary conditions (incoming flux), further simulations can be launched for parametric studies with a computation time reduced by a factor of 60 (case of the cold neutron source of the Orphee reactor). The short response time makes parametric studies with a Monte Carlo code feasible. Moreover, using biasing methods, the flux can be recorded on the surface of neutron guide entries (low solid angle) with a further gain in running time. Finally, the implementation of a coupling module between TRIPOLI-4.4 and the Monte Carlo code McStas, used in condensed matter research, makes it possible to obtain fluxes after transmission through neutron guides, and thus the neutron flux received by the samples studied by condensed matter scientists. This set of developments, involving TRIPOLI-4.4 and McStas, represents a complete computation scheme for research reactors: from the nuclear core, where neutrons are created, to the exit of the neutron guides, at the samples. This complete calculation scheme is tested against ILL4 measurements of flux in cold neutron guides. (authors)

  17. PyNeb: a new tool for analyzing emission lines. I. Code description and validation of results

    Science.gov (United States)

    Luridiana, V.; Morisset, C.; Shaw, R. A.

    2015-01-01

    Analysis of emission lines in gaseous nebulae yields direct measures of physical conditions and chemical abundances and is the cornerstone of nebular astrophysics. Although the physical problem is conceptually simple, its practical complexity can be overwhelming since the amount of data to be analyzed steadily increases; furthermore, results depend crucially on the input atomic data, whose determination also improves each year. To address these challenges we created PyNeb, an innovative code for analyzing emission lines. PyNeb computes physical conditions and ionic and elemental abundances and produces both theoretical and observational diagnostic plots. It is designed to be portable, modular, and largely customizable in aspects such as the atomic data used, the format of the observational data to be analyzed, and the graphical output. It gives full access to the intermediate quantities of the calculation, making it possible to write scripts tailored to the specific type of analysis one wants to carry out. In the case of collisionally excited lines, PyNeb works by solving the equilibrium equations for an n-level atom; in the case of recombination lines, it works by interpolation in emissivity tables. The code offers a choice of extinction laws and ionization correction factors, which can be complemented by user-provided recipes. It is entirely written in the python programming language and uses standard python libraries. It is fully vectorized, making it apt for analyzing huge amounts of data. The code is stable and has been benchmarked against IRAF/NEBULAR. It is public, fully documented, and has already been satisfactorily used in a number of published papers.
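The statistical-equilibrium calculation that PyNeb performs for collisionally excited lines can be illustrated with a toy three-level atom: inflow to each level balances outflow, with one equation replaced by the normalization condition. The rate coefficients below are invented for illustration, and this is not PyNeb's API.

```python
# Toy illustration of the n-level-atom statistical equilibrium that codes
# like PyNeb solve for collisionally excited lines.  The rate coefficients
# below are invented for illustration; they are not real atomic data.

def solve3(a, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    m = [row[:] + [v] for row, v in zip(a, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

# rate[i][j]: total transition rate from level i to level j (s^-1, invented).
rate = [[0.0, 2e-4, 1e-5],
        [0.02, 0.0, 5e-5],
        [0.5, 1.1, 0.0]]

# Balance equations: inflow to each level equals outflow; the third
# equation is replaced by the normalization sum(n) = 1.
a = [[-sum(rate[0]), rate[1][0], rate[2][0]],
     [rate[0][1], -sum(rate[1]), rate[2][1]],
     [1.0, 1.0, 1.0]]
pops = solve3(a, [0.0, 0.0, 1.0])
print([round(p, 6) for p in pops])
```

In a real calculation the rates combine collisional excitation and de-excitation (density and temperature dependent) with radiative decay, which is what ties the resulting level populations, and hence line emissivities, to the physical conditions being diagnosed.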

  18. Bar Code Technology in Interactive Teaching

    Institute of Scientific and Technical Information of China (English)

    林胜青

    2014-01-01

    Learning modern science and technology should be combined with practical applications in daily life; interactive teaching between teachers and students is advocated in order to promote modern education and improve its quality. This article introduces the use of interactive teaching methods to teach the principles of bar code generation and their practical application.

  19. Euler Technology Assessment - SPLITFLOW Code Applications for Stability and Control Analysis on an Advanced Fighter Model Employing Innovative Control Concepts

    Science.gov (United States)

    Jordan, Keith J.

    1998-01-01

    This report documents results from the NASA-Langley sponsored Euler Technology Assessment Study conducted by Lockheed-Martin Tactical Aircraft Systems (LMTAS). The purpose of the study was to evaluate the ability of the SPLITFLOW code using viscous and inviscid flow models to predict aerodynamic stability and control of an advanced fighter model. The inviscid flow model was found to perform well at incidence angles below approximately 15 deg, but not as well at higher angles of attack. The results using a turbulent, viscous flow model matched the trends of the wind tunnel data, but did not show significant improvement over the Euler solutions. Overall, the predictions were found to be useful for stability and control design purposes.

  20. The Cubesat Radiometer Radio Frequency Interference Technology Validation (CubeRRT) Mission

    Science.gov (United States)

    Misra, S.; Johnson, J. T.; Ball, C.; Chen, C. C.; Smith, G.; McKelvey, C.; Andrews, M.; O'Brien, A.; Kocz, J.; Jarnot, R.; Brown, S. T.; Piepmeier, J. R.; Lucey, J.; Miles, L. R.; Bradley, D.; Mohammed, P.

    2016-12-01

    Passive microwave measurements made below 40GHz have experienced increased amounts of man-made radio frequency interference (RFI) over the past couple of decades. Such RFI has had a degenerative impact on various important geophysical retrievals such as soil-moisture, sea-surface salinity, atmospheric water vapor, precipitation etc. The commercial demand for spectrum allocation has increased over the past couple of years - infringing on frequencies traditionally reserved for scientific uses such as Earth observation at passive microwave frequencies. With the current trend in shared spectrum allocations, future microwave radiometers will have to co-exist with terrestrial RFI sources. The CubeSat Radiometer Radio Frequency Interference Technology Validation (CubeRRT) mission is developing a 6U Cubesat system to demonstrate RFI detection and filtering technologies for future microwave radiometer remote sensing missions. CubeRRT will operate between 6-40GHz, and demonstrate on-board real-time RFI detection on Earth brightness temperatures tuned over 1GHz steps. The expected launch date for CubeRRT is early 2018. Digital subsystems for higher frequency microwave radiometry require a larger bandwidth, as well as more processing power and on-board operation capabilities for RFI filtering. Real-time and on-board RFI filtering technology development is critical for future missions to allow manageable downlink data volumes. The enabling CubeRRT technology is a digital FPGA-based spectrometer with a bandwidth of 1 GHz that is capable of implementing advanced RFI filtering algorithms that use the kurtosis and cross-frequency RFI detection methods in real-time on board the spacecraft. The CubeRRT payload consists of 3 subsystems: a wideband helical antenna, a tunable analog radiometer subsystem, and a digital backend. The following presentation will present an overview of the system and results from the latest integration and test.
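The kurtosis method mentioned above exploits the fact that thermal noise is Gaussian, with a kurtosis of 3, while most man-made signals pull that statistic away from 3. A minimal sketch with an assumed detection threshold follows; CubeRRT's actual on-board algorithm and thresholds are not reproduced here.

```python
# Kurtosis-based RFI detection, one of the methods CubeRRT implements
# on board.  Thermal noise is Gaussian (kurtosis = 3); a sinusoidal
# interferer pulls the kurtosis away from 3.  The threshold is illustrative.

import math
import random

def kurtosis(xs):
    """Fourth standardized moment (population definition)."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    return sum((x - m) ** 4 for x in xs) / (len(xs) * var ** 2)

def rfi_flag(xs, tol=0.3):
    """Flag a block whose kurtosis deviates from the Gaussian value of 3."""
    return abs(kurtosis(xs) - 3.0) > tol

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(8192)]           # clean channel
rfi = [n + 2.0 * math.sin(0.1 * i) for i, n in enumerate(noise)]  # + interferer
print(rfi_flag(noise), rfi_flag(rfi))
```

A sinusoid has kurtosis 1.5, so mixing one into Gaussian noise drags the block kurtosis below 3 and trips the flag, while the clean block stays near 3.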

  1. CERESVis: A QC Tool for CERES that Leverages Browser Technology for Data Validation

    Science.gov (United States)

    Chu, C.; Sun-Mack, S.; Heckert, E.; Chen, Y.; Doelling, D.

    2015-12-01

    In this poster, we are going to present three user interfaces that the CERES team uses to validate pixel-level data. Besides our home-grown tools, we will also present the browser technology that we use to provide interactive interfaces, such as jQuery, Highcharts and Google Earth. We pass data to the users' browsers and use the browsers to do some simple computations. The three user interfaces are: Thumbnails -- it displays hundreds of images to allow users to browse 24-hour data files in a few seconds. Multiple synchronized cursors -- it allows users to compare multiple images side by side. Bounding boxes and histograms -- it allows users to draw multiple bounding boxes on an image while the browser computes and displays the histograms.

  2. Development and validation of science, technology, engineering and mathematics (STEM) based instructional material

    Science.gov (United States)

    Gustiani, Ineu; Widodo, Ari; Suwarma, Irma Rahma

    2017-05-01

    This study examines the development and validation of simple-machines instructional material developed within the Science, Technology, Engineering and Mathematics (STEM) framework, which provides guidance to help students learn and practice for real life and enables individuals to use the knowledge and skills they need to be informed citizens. The study sample consisted of one class of 8th graders at a junior secondary school in Bandung, Indonesia. To measure student learning, a pre-test and post-test were given before and after implementation of the STEM-based instructional material. In addition, a readability questionnaire was given to examine the clarity and difficulty level of each page of the instructional material. A questionnaire on responses to the instructional material was given to students and teachers at the end of the reading session to measure the layout, content, and utility aspects of the material for use in the junior secondary school classroom. The results show that both the readability of and the students' response to the STEM-based instructional material are categorized as very high. Pre-test and post-test responses revealed that students retained significant amounts of information upon completion of the STEM instructional material. The overall student learning gain was 0.67, which is categorized as moderate. In summary, the STEM-based instructional material that was developed is valid enough to be used as educational material for conducting effective STEM education.
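The abstract does not state the gain formula; assuming it is Hake's normalized gain, the usual definition is g = (post − pre) / (max − pre), computed here on hypothetical class-average scores chosen to reproduce the reported 0.67.

```python
# Hake's normalized gain, a common way to summarize pre/post test scores.
# The abstract does not state its formula, so this definition is assumed,
# and the class-average scores below are hypothetical.

def normalized_gain(pre, post, max_score=100.0):
    """Fraction of the possible improvement actually achieved."""
    return (post - pre) / (max_score - pre)

print(round(normalized_gain(pre=40.0, post=80.0), 2))
```

Under Hake's conventional bands (low < 0.3, moderate 0.3-0.7, high > 0.7), a gain of 0.67 falls in the moderate range, consistent with the abstract's label.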

  3. Analysis/plot generation code with significance levels computed using Kolmogorov-Smirnov statistics valid for both large and small samples

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.E.; Fields, D.E.

    1983-10-01

    This report describes a version of the TERPED/P computer code that is very useful for small data sets. A new algorithm for determining the Kolmogorov-Smirnov (KS) statistics is used to extend program applicability. The TERPED/P code facilitates the analysis of experimental data and assists the user in determining its probability distribution function. Graphical and numerical tests are performed interactively in accordance with the user's assumption of normally or log-normally distributed data. Statistical analysis options include computation of the chi-square statistic and the KS one-sample test statistic and the corresponding significance levels. Cumulative probability plots of the user's data are generated either via a local graphics terminal, a local line printer or character-oriented terminal, or a remote high-resolution graphics device such as the FR80 film plotter or the Calcomp paper plotter. Several useful computer methodologies suffer from limitations of their implementations of the KS nonparametric test. This test is one of the more powerful analysis tools for examining the validity of an assumption about the probability distribution of a set of data. KS algorithms are found in other analysis codes, including the Statistical Analysis Subroutine (SAS) package and earlier versions of TERPED. The inability of these algorithms to generate significance levels for sample sizes less than 50 has limited their usefulness. The release of the TERPED code described herein contains algorithms to allow computation of the KS statistic and significance level for data sets of, if the user wishes, as few as three points. Values computed for the KS statistic are within 3% of the correct value for all data set sizes.
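The KS statistic at the heart of TERPED/P is the maximum gap between the empirical CDF and the assumed model CDF, and it is well defined even for the few-point samples the report emphasizes. A standard-library sketch against a normal model:

```python
# One-sample Kolmogorov-Smirnov statistic against a normal distribution,
# the kind of test TERPED/P applies; standard library only, small-n safe.

import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of the normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(sample, cdf):
    """D = maximum distance between the empirical CDF and the model CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, f - i / n, (i + 1) / n - f)   # gaps just before/after x
    return d

# A data set of only five points -- the small-sample case TERPED/P targets.
sample = [-1.2, -0.4, 0.1, 0.7, 1.5]
print(round(ks_statistic(sample, normal_cdf), 3))
```

Converting D into a significance level for small n is the part that requires the exact small-sample distribution of D, which is the extension the report describes.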

  4. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  5. What Technology Skills Do Developers Need? A Text Analysis of Job Listings in Library and Information Science (LIS) from Jobs.code4lib.org

    Directory of Open Access Journals (Sweden)

    Monica Maceli

    2015-09-01

    Full Text Available Technology plays an indisputably vital role in library and information science (LIS work; this rapidly moving landscape can create challenges for practitioners and educators seeking to keep pace with such change.  In pursuit of building our understanding of currently sought technology competencies in developer-oriented positions within LIS, this paper reports the results of a text analysis of a large collection of job listings culled from the Code4lib jobs website.  Beginning over a decade ago as a popular mailing list covering the intersection of technology and library work, the Code4lib organization's current offerings include a website that collects and organizes LIS-related technology job listings.  The results of the text analysis of this dataset suggest the currently vital technology skills and concepts that existing and aspiring practitioners may target in their continuing education as developers.

  6. PyNeb: a new tool for analyzing emission lines. I. Code description and validation of results

    CERN Document Server

    Luridiana, Valentina; Shaw, Richard A

    2014-01-01

    Analysis of emission lines in gaseous nebulae yields direct measures of physical conditions and chemical abundances and is the cornerstone of nebular astrophysics. Although the physical problem is conceptually simple, its practical complexity can be overwhelming since the amount of data to be analyzed steadily increases; furthermore, results depend crucially on the input atomic data, whose determination also improves each year. To address these challenges we created PyNeb, an innovative code for analyzing emission lines. PyNeb computes physical conditions and ionic and elemental abundances, and produces both theoretical and observational diagnostic plots. It is designed to be portable, modular, and largely customizable in aspects such as the atomic data used, the format of the observational data to be analyzed, and the graphical output. It gives full access to the intermediate quantities of the calculation, making it possible to write scripts tailored to the specific type of analysis one wants to carry out. I...

  7. Channel Coding and Modulation Technology for Satellite Communication

    Institute of Scientific and Technical Information of China (English)

    周珊; 沈永言

    2016-01-01

    This paper first studies channel coding and modulation technology for satellite communication, then presents a comparative analysis of the channel coding and modulation techniques in the DVB-S series of standards widely used in satellite communication, and finally argues that the adoption of more advanced channel coding and modulation techniques in satellite communication is inevitable.

  8. California Diploma Project Technical Report III: Validity Study--Validity Study of the Health Sciences and Medical Technology Standards

    Science.gov (United States)

    McGaughy, Charis; Bryck, Rick; de Gonzalez, Alicia

    2012-01-01

    This study is a validity study of the recently revised version of the Health Science Standards. The purpose of this study is to understand how the Health Science Standards relate to college and career readiness, as represented by survey ratings submitted by entry-level college instructors of health science courses and industry representatives. For…

  9. Aircraft Loss of Control: Problem Analysis for the Development and Validation of Technology Solutions

    Science.gov (United States)

    Belcastro, Christine M.; Newman, Richard L.; Crider, Dennis A.; Klyde, David H.; Foster, John V.; Groff, Loren

    2016-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes. LOC can result from a wide spectrum of precursors (or hazards), often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and the validation process must provide a means of assessing system effectiveness and coverage of these hazards. This paper provides a detailed description of a methodology for analyzing LOC as a dynamics and control problem for the purpose of developing effective technology solutions. The paper includes a definition of LOC based on several recent publications, a detailed description of a refined LOC accident analysis process that is illustrated via selected example cases, and a description of planned follow-on activities for identifying future potential LOC risks and the development of LOC test scenarios. Some preliminary considerations for LOC of Unmanned Aircraft Systems (UAS) and for their safe integration into the National Airspace System (NAS) are also discussed.

  10. Validation of the Intrapersonal Technology Integration Scale: Assessing the Influence of Intrapersonal Factors that Influence Technology Integration

    Science.gov (United States)

    Niederhauser, Dale S.; Perkmen, Serkan

    2008-01-01

    Teachers' beliefs about their self-efficacy for integrating technology, their outcome expectations for integrating technology, and their interest in using technology to support student learning influence their intentions for incorporating technology into their instructional practices. To date, instruments developed to examine the relationships…

  11. Implementation of a Transition Model in a NASA Code and Validation Using Heat Transfer Data on a Turbine Blade

    Science.gov (United States)

    Ameri, Ali A.

    2012-01-01

    The purpose of this report is to summarize and document the work done to enable a NASA CFD code to model the laminar-turbulent transition process on an isolated turbine blade. The ultimate purpose of the present work is to down-select a transition model that would allow the flow simulation of a variable speed power turbine to be accurately performed. The flow modeling in its final form will account for the blade row interactions and their effects on transition, which would lead to accurate accounting for losses. The present work only concerns itself with steady flows of variable inlet turbulence. The low Reynolds number k-ω model of Wilcox and a modified version of the same model were used for modeling of transition on experimentally measured blade pressure and heat transfer. It is shown that the k-ω model and its modified variant fail to simulate the transition with any degree of accuracy. A case is thus made for the adoption of more accurate transition models. Three-equation models based on the work of Mayle on laminar kinetic energy were explored. The three-equation model of Walters and Leylek was thought to be in a relatively mature state of development and was implemented in the Glenn-HT code. Two-dimensional heat transfer predictions of flat plate flow and two-dimensional and three-dimensional heat transfer predictions on a turbine blade were performed and are reported herein. The surface heat transfer rate serves as a sensitive indicator of transition. With the newly implemented model, it was shown that the simulation of the transition process is much improved over the baseline k-ω model for the single Reynolds number and pressure ratio attempted, while agreement with heat transfer data became more satisfactory. Armed with the new transition model, total-pressure losses of the computed three-dimensional flow of the E3 tip section cascade were compared to the experimental data for a range of incidence angles. The results obtained form a partial loss bucket for the chosen blade

  12. Validation of a Node-Centered Wall Function Model for the Unstructured Flow Code FUN3D

    Science.gov (United States)

    Carlson, Jan-Renee; Vasta, Veer N.; White, Jeffery

    2015-01-01

    In this paper, the implementation of two wall function models in the Reynolds-averaged Navier-Stokes (RANS) computational fluid dynamics (CFD) code FUN3D is described. FUN3D is a node-centered method for solving the three-dimensional Navier-Stokes equations on unstructured computational grids. The first wall function model, based on the work of Knopp et al., is used in conjunction with the one-equation turbulence model of Spalart-Allmaras. The second wall function model, also based on the work of Knopp, is used in conjunction with the two-equation k-ω turbulence model of Menter. The wall function models compute the wall momentum and energy flux, which are used to weakly enforce the wall velocity and pressure flux boundary conditions in the mean flow momentum and energy equations. These wall conditions are implemented in an implicit form where the contributions of the wall function model to the Jacobian are also included. The boundary conditions of the turbulence transport equations are enforced explicitly (strongly) on all solid boundaries. The use of the wall function models is demonstrated on four test cases: a flat plate boundary layer, a subsonic diffuser, a 2D airfoil, and a 3D semi-span wing. Where possible, different near-wall viscous spacing tactics are examined. Iterative residual convergence was obtained in most cases. Solution results are compared with theoretical and experimental data for several variations of grid spacing. In general, very good comparisons with data were achieved.
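    Wall function models of this type replace direct integration to the wall with an algebraic relation such as the logarithmic law of the wall. A minimal sketch of the core computation, solving the log law for the friction velocity by Newton iteration (illustrative only; the Knopp et al. formulation used in FUN3D also blends in the viscous sublayer):

```python
import math

def friction_velocity(u, y, nu, kappa=0.41, B=5.0, tol=1e-10):
    """Solve the log law  u/u_tau = (1/kappa)*ln(y*u_tau/nu) + B
    for the friction velocity u_tau by Newton iteration.
    Valid when the first grid point sits in the log layer
    (roughly 30 < y+ < 300)."""
    u_tau = math.sqrt(u * nu / y)  # rough initial guess
    for _ in range(100):
        y_plus = y * u_tau / nu
        f = u / u_tau - (math.log(y_plus) / kappa + B)
        df = -u / u_tau**2 - 1.0 / (kappa * u_tau)
        step = f / df
        u_tau -= step
        if abs(step) < tol:
            break
    return u_tau
```

    The recovered u_tau then sets the wall shear stress (momentum flux) that the solver imposes weakly at the boundary.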

  13. TECATE - a code for anisotropic thermoelasticity in high-average-power laser technology. Phase 1 final report

    Energy Technology Data Exchange (ETDEWEB)

    Gelinas, R.J.; Doss, S.K.; Carlson, N.N.

    1985-01-01

    This report describes a totally Eulerian code for anisotropic thermoelasticity (code name TECATE) which may be used in evaluations of prospective crystal media for high-average-power lasers. The present TECATE code version computes steady-state distributions of material temperatures, stresses, strains, and displacement fields in 2-D slab geometry. Numerous heat source and coolant boundary condition options are available in the TECATE code for laser design considerations. Anisotropic analogues of plane stress and plane strain evaluations can be executed for any and all crystal symmetry classes. As with all new and/or large physics codes, it is likely that some code imperfections will emerge at some point in time.

  14. Single Use Letter Report for the Verification and Validation of the RADNUC-2A and ORIGEN2 S.2 Computer Codes

    Energy Technology Data Exchange (ETDEWEB)

    PACKER, M.J.

    2000-06-20

    This report documents the verification and validation (V&V) activities undertaken to support the use of the RADNUC2-A and ORIGEN2 S.2 computer codes for the specific application of calculating isotopic inventories and decay heat loadings for Spent Nuclear Fuel Project (SNFP) activities as described herein. Two recent applications include the reports HNF-SD-SNF-TI-009, 105-K Basin Material Design Basis Feed Description for Spent Nuclear Fuel Project Facilities, Volume 1, Fuel (Praga, 1998), and HNF-3035, Rev. 0B, MCO Gas Composition for Low Reactive Surface Areas (Packer, 1998). Representative calculations documented in these two reports were repeated using RADNUC2-A, and the results were identical to the documented results. This serves as verification that version 2A of Radnuc was used for the applications noted above; the same version was tested herein, and perfect agreement was shown. Comprehensive V&V is demonstrated for RADNUC2-A in Appendix A.

  15. Measurement of neutron-induced activation cross-sections using spallation source at JINR and neutronic validation of the Dubna code

    Indian Academy of Sciences (India)

    Manish Sharma; V Kumar; H Kumawat; J Adam; V S Barashenkov; S Ganesan; S Golovatiouk; S K Gupta; S Kailas; M I Krivopustov; H S Palsania; V Pronskikh; V M Tsoupko-Sitnikov; N Vladimirova; H Westmeier; W Westmeier

    2007-02-01

    A 1 GeV proton beam from the Dubna Nuclotron, striking a lead target surrounded by 6 cm of paraffin, produces spallation neutrons. A Th foil was kept on the lead target (the neutron spallation source) in the direct stream of neutrons for activation, while samples of 197Au, 209Bi, 59Co, 115In and 181Ta were irradiated by the beam of neutrons moderated by passing through the 6 cm paraffin. The gamma spectra of the irradiated samples were analyzed using gamma spectrometry and the DEIMOS software to measure the neutron cross-sections. For this purpose the neutron fluence at the sample positions was also estimated using the PREPRO software. Cross-section results for the reactions 232Th(, ), 232Th(, 2), 197Au(, ), 197Au(, ), 197Au(, ), 59Co(, ), 59Co(, ), 181Ta(, ) and 181Ta(, ) are given in this paper. Neutronics validation of the Dubna Cascade Code is also performed using cross-section data from other experiments.
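    The cross-section extraction behind such activation measurements follows the standard activation equation: the end-of-irradiation activity of the product is A = φσN(1 − exp(−λ t_irr)). A sketch of inverting this relation (names and units are illustrative; a real analysis also corrects for decay during cooling and counting, self-shielding, and detector efficiency):

```python
import math

def activation_cross_section(end_activity_Bq, fluence_rate, n_target_atoms,
                             half_life_s, t_irr_s):
    """Infer an activation cross-section (cm^2) from the
    end-of-irradiation activity via
        A = phi * sigma * N * (1 - exp(-lambda * t_irr)),
    where phi is the neutron fluence rate in n/cm^2/s and
    N is the number of target atoms in the foil."""
    lam = math.log(2.0) / half_life_s
    saturation = 1.0 - math.exp(-lam * t_irr_s)
    return end_activity_Bq / (fluence_rate * n_target_atoms * saturation)
```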

  17. Creating, implementing, and validating a virtual learning model in web 2.0 technologies for higher education.

    OpenAIRE

    Zambrano, William Ricardo; Pontificia Universidad Javeriana; Medina, Victor Hugo; Universidad Pontificia de Salamanca

    2010-01-01

    In this article we examine different educational models worldwide that are supported by Information Technologies (ITs), and their impact, in order to produce a Virtual Learning Model in Web 2.0 Technologies for higher education in Colombia. We used applied and theoretical, qualitative and quantitative research methods, in the area usually known as descriptive and correlational studies. The model was implemented and then validated in two academic courses. The method basically comprised activiti...

  18. Validation of TITAN2D flow model code for pyroclastic flows and debris avalanches at Soufrière Hills Volcano, Montserrat, BWI

    Science.gov (United States)

    Widiwijayanti, C.; Voight, B.; Hidayat, D.; Patra, A.; Pitman, E.

    2004-12-01

    Soufrière Hills Volcano (SHV), Montserrat, has experienced numerous episodes of dome collapses since 1996. They range from relatively small rockfalls to major dome collapses, several >10×10⁶ m³, and one >100×10⁶ m³ (Calder, Luckett, Sparks and Voight 2002; Voight et al. 2002). The hazard implications for such events are significant at both local and regional scales, and include pyroclastic surges, explosions, and tsunami. Problems arise in forecasting and hazards mitigation, particularly in zoning for populated areas. Determining the likely extent of flow deposits is important for hazard zonation. For this, detailed mapping (topography of source areas and paths, material properties, structure, track roughness and erosion) has an important role, giving clues on locations of future collapse and runout paths. Here we present an application of a numerical computation model of geophysical mass flow using the TITAN2D code (Patra et al. 2004; Pitman et al. 2004), to simulate dome collapses at SHV. The majority of collapse-type pyroclastic flows at SHV are consistent with an initiation by gravitational collapse of oversteepened flanks of the dome. If gravity controls the energy for such processes, then the flow tracks can be predicted on the basis of topography, and friction influences runout. TITAN2D is written to simulate this type of volcanic flow, and the SHV database is used to validate the code and provide calibrated data on friction properties. The topographic DEM was successively updated by adding flow deposit thicknesses for previous collapses. Simulation results were compared to observed flow parameters, including flow path, deposit volume, duration, velocity, and runout distance of individual flows, providing calibration data on internal and bed friction, and demonstrating the validity and limitations of such modeling for practical volcanic hazard assessment.

  19. Model-Based Design for Embedded C Code Realization and Validation

    Institute of Scientific and Technical Information of China (English)

    徐超坤; 朱婷; 李威宣

    2011-01-01

    Taking the 51-series microcontroller as an example, this article describes model building, debugging, and validation; model-based automatic generation of embedded C code; and software- and hardware-in-the-loop testing. Practice shows that this model-based design method can significantly improve work efficiency, shorten the development cycle, reduce development costs, and increase code security and robustness, effectively reducing the risks of product software development.

  20. Internal dosimetry with the Monte Carlo code GATE: validation using the ICRP/ICRU female reference computational model

    Science.gov (United States)

    Villoing, Daphnée; Marcatili, Sara; Garcia, Marie-Paule; Bardiès, Manuel

    2017-03-01

    The purpose of this work was to validate GATE-based clinical scale absorbed dose calculations in nuclear medicine dosimetry. GATE (version 6.2) and MCNPX (version 2.7.a) were used to derive dosimetric parameters (absorbed fractions, specific absorbed fractions and S-values) for the reference female computational model proposed by the International Commission on Radiological Protection in ICRP report 110. Monoenergetic photons and electrons (from 50 keV to 2 MeV) and four isotopes currently used in nuclear medicine (fluorine-18, lutetium-177, iodine-131 and yttrium-90) were investigated. Absorbed fractions, specific absorbed fractions and S-values were generated with GATE and MCNPX for 12 regions of interest in the ICRP 110 female computational model, thereby leading to 144 source/target pair configurations. Relative differences between GATE and MCNPX obtained in specific configurations (self-irradiation or cross-irradiation) are presented. Relative differences in absorbed fractions, specific absorbed fractions or S-values are below 10%, and in most cases less than 5%. Dosimetric results generated with GATE for the 12 volumes of interest are available as supplemental data. GATE can be safely used for radiopharmaceutical dosimetry at the clinical scale. This makes GATE a viable option for Monte Carlo modelling of both imaging and absorbed dose in nuclear medicine.
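    The S-values compared between GATE and MCNPX follow the MIRD formalism: the absorbed dose per decay is the mean energy emitted per decay times the absorbed fraction, divided by the target mass. A minimal sketch of that relation (the function name and units are illustrative, not part of GATE's API):

```python
def s_value(mean_energy_MeV_per_decay, absorbed_fraction, target_mass_kg):
    """MIRD-style S-value in Gy per decay (equivalently Gy/(Bq*s)):
        S = Delta * phi(target <- source) / m_target,
    with Delta the mean energy emitted per decay, converted
    from MeV to joules."""
    MeV_to_J = 1.602176634e-13
    delta = mean_energy_MeV_per_decay * MeV_to_J
    return delta * absorbed_fraction / target_mass_kg
```

    Monte Carlo codes such as GATE and MCNPX supply the absorbed fraction phi for each source/target pair; the rest is bookkeeping.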

  1. Test and validation of CFD codes for the simulation of accident-typical phenomena in the reactor containment

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, Berthold; Stewering, Joern; Sonnenkalb, Martin

    2014-03-15

    CFD (Computational Fluid Dynamic) simulation techniques have a growing relevance for the simulation and assessment of accidents in nuclear reactor containments. Some fluid dynamic problems like the calculation of the flow resistances in a complex geometry, turbulence calculations or the calculation of deflagrations could only be solved exactly for very simple cases. These fluid dynamic problems could not be represented by lumped parameter models and must be approximated numerically. Therefore CFD techniques are discussed by a growing international community in conferences like the CFD4NRS-conference. Also the number of articles with a CFD topic is increasing in professional journals like Nuclear Engineering and Design. CFD tools like GASFLOW or GOTHIC are already in use in European nuclear site licensing processes for future nuclear power plants like EPR or AP1000 and the results of these CFD tools are accepted by the authorities. For these reasons it seems to be necessary to build up national competences in the field of CFD techniques and it is important to validate and assess the existing CFD tools. GRS continues the work for the validation and assessment of CFD codes for the simulation of accident scenarios in a nuclear reactor containment within the framework of the BMWi sponsored project RS1500. The focus of this report is on the following topics: - Further validation of condensation models from GRS, FZJ and ANSYS and development of a new condensate model. - Validation of a new turbulence model which was developed by the University of Stuttgart in cooperation with ANSYS. - The formation and dissolution of light gas stratifications are analyzed by large scale experiments. These experiments were simulated by GRS. - The AREVA correlations for hydrogen recombiners (PARs) could be improved by GRS after the analysis of experimental data. Relevant experiments were simulated with this improved recombiner correlation. - Analyses on the simulation of H{sub 2

  2. Validating the Technology Acceptance Model in the Context of the Laboratory Information System-Electronic Health Record Interface System

    Science.gov (United States)

    Aquino, Cesar A.

    2014-01-01

    This study represents a research validating the efficacy of Davis' Technology Acceptance Model (TAM) by pairing it with the Organizational Change Readiness Theory (OCRT) to develop another extension to the TAM, using the medical Laboratory Information Systems (LIS)--Electronic Health Records (EHR) interface as the medium. The TAM posits that it is…

  3. Validating a Measure of Teacher Intentions to Integrate Technology in Education in Turkey, Spain and the USA

    Science.gov (United States)

    Perkmen, Serkan; Antonenko, Pavlo; Caracuel, Alfonso

    2016-01-01

    The main purpose of this study was to examine the validity of the Teacher Intentions to Integrate Technology in Education Scale using pre-service teacher samples from three countries on three continents--Turkey, Spain and the United States. Study participants were 550 pre-service teachers from three universities in Turkey, Spain and the USA (219,…

  5. The Development and Validation of a Measure of Student Attitudes toward Science, Technology, Engineering, and Math (S-STEM)

    Science.gov (United States)

    Unfried, Alana; Faber, Malinda; Stanhope, Daniel S.; Wiebe, Eric

    2015-01-01

    Using an iterative design along with multiple methodological approaches and a large representative sample, this study presents reliability, validity, and fairness evidence for two surveys measuring student attitudes toward science, technology, engineering, and math (S-STEM) and interest in STEM careers for (a) 4th- through 5th-grade students…

  6. Real Virtuality: A Code of Ethical Conduct. Recommendations for Good Scientific Practice and the Consumers of VR-Technology

    Directory of Open Access Journals (Sweden)

    Michael Madary

    2016-02-01

    The goal of this article is to present a first list of ethical concerns that may arise from research and personal use of virtual reality (VR) and related technology, and to offer concrete recommendations for minimizing those risks. Many of the recommendations call for focused research initiatives. In the first part of the article, we discuss the relevant evidence from psychology that motivates our concerns. In section 1.1, we cover some of the main results suggesting that one's environment can influence one's psychological states, as well as recent work on inducing illusions of embodiment. Then, in section 1.2, we go on to discuss recent evidence indicating that immersion in VR can have psychological effects that last after leaving the virtual environment. In the second part of the article we turn to the risks and recommendations. We begin, in section 2.1, with the research ethics of VR, covering six main topics: the limits of experimental environments, informed consent, clinical risks, dual-use, online research, and a general point about the limitations of a code of conduct for research. Then, in section 2.2, we turn to the risks of VR for the general public, covering four main topics: long-term immersion, neglect of the social and physical environment, risky content, and privacy. We offer concrete recommendations for each of these ten topics, summarized in Table 1.

  7. Functional design and implementation with on-line programmable technology in optical fiber communication pulse code modulation test system

    Science.gov (United States)

    Xu, Yuan; Ding, Huan; Gao, Youtang

    2010-10-01

    To complete the functional design of a pulse code modulation (PCM) test system for optical fiber communication, CPLD/FPGA and SOPC technology were used to implement the system's hardware features and control functions in software. The logic control and the encoding/decoding functions on the motherboard were thereby optimized, enabling the system to cover capabilities ranging from simple digital transmission modulation to encoding and decoding for high-speed optical fiber communication networks. A logarithmic companding technique is applied in the PCM encoding and decoding system to improve the quantizing signal-to-noise ratio (SNR) for small signals; the TP3067 codec implements A-law thirteen-segment (broken-line) compression and expansion. Within a given segment the signal is quantized uniformly, so the quantizing SNR decreases as the signal amplitude decreases. Test results show ideal encoding and decoding waveforms for various signals and good performance parameters, achieving the design goals; the approach offers a new way to build and implement different information encoding and decoding models, saving development cost, improving design efficiency, and operating stably in practice.
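    The A-law companding mentioned above can be sketched directly from its defining equations (the continuous A-law curve with A = 87.6; codec hardware such as the TP3067 actually implements the thirteen-segment piecewise-linear approximation of this curve):

```python
import math

A = 87.6  # standard A-law parameter

def a_law_compress(x):
    """A-law companding of a normalized sample x in [-1, 1]:
    linear near zero, logarithmic for larger amplitudes."""
    ax = abs(x)
    if ax < 1.0 / A:
        y = A * ax / (1.0 + math.log(A))
    else:
        y = (1.0 + math.log(A * ax)) / (1.0 + math.log(A))
    return math.copysign(y, x)

def a_law_expand(y):
    """Exact inverse of a_law_compress."""
    ay = abs(y)
    if ay < 1.0 / (1.0 + math.log(A)):
        x = ay * (1.0 + math.log(A)) / A
    else:
        x = math.exp(ay * (1.0 + math.log(A)) - 1.0) / A
    return math.copysign(x, y)
```

    Compressing before uniform quantization, then expanding after decoding, is what keeps the quantizing SNR roughly constant over a wide range of signal amplitudes.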

  8. Independent Validation and Verification of Process Design and Optimization Technology Diagnostic and Control of Natural Gas Fired Furnaces via Flame Image Analysis Technology

    Energy Technology Data Exchange (ETDEWEB)

    Cox, Daryl [ORNL

    2009-05-01

    The United States Department of Energy, Industrial Technologies Program has invested in emerging Process Design and Optimization Technologies (PDOT) to encourage the development of new initiatives that might result in energy savings in industrial processes. Gas fired furnaces present a harsh environment, often making accurate determination of correct air/fuel ratios a challenge. Operation with the correct air/fuel ratio, and especially with balanced burners in multi-burner combustion equipment, can result in improved system efficiency, yielding lower operating costs and reduced emissions. Flame Image Analysis offers a way to improve individual burner performance by identifying and correcting fuel-rich burners. The anticipated benefit of this technology is improved furnace thermal efficiency and lower NOx emissions. Independent validation and verification (V&V) testing of the FIA technology was performed at Missouri Forge, Inc., in Doniphan, Missouri by Environ International Corporation (V&V contractor) and Enterprise Energy and Research (EE&R), the developer of the technology. The test site was selected by the technology developer and accepted by Environ after a meeting held at Missouri Forge. As stated in the solicitation for the V&V contractor: 'The objective of this activity is to provide independent verification and validation of the performance of this new technology when demonstrated in industrial applications. A primary goal for the V&V process will be to independently evaluate if this technology, when demonstrated in an industrial application, can be utilized to save a significant amount of the operating energy cost. The Seller will also independently evaluate the other benefits of the demonstrated technology that were previously identified by the developer, including those related to product quality, productivity, environmental impact, etc.' A test plan was provided by the technology developer and is included as an appendix to the summary report.

  9. Validation of the TRACE code for the system dynamic simulations of the molten salt reactor experiment and the preliminary study on the dual fluid molten salt reactor

    Energy Technology Data Exchange (ETDEWEB)

    He, Xun

    2016-06-14

    The Molten Salt Reactor (MSR), confirmed as one of the six Generation IV reactor types by the GIF (Generation IV International Forum) in 2008, has recently drawn attention worldwide. Owing to its use of liquid fuel, the MSR is in a sense the most distinctive of the six GEN-IV reactor types. A unique advantage of liquid nuclear fuel is that core melting accidents are thoroughly eliminated. Moreover, a molten salt reactor has several fuel options: the fuel cycle can be based on {sup 235}U, {sup 232}Th-{sup 233}U, {sup 238}U-{sup 239}Pu, or even spent nuclear fuel (SNF), so the reactor can be operated as a breeder or as an actinide burner with a fast, thermal, or epithermal neutron spectrum; it therefore has excellent features for fuel sustainability and non-proliferation. Furthermore, the low operating pressure not only reduces the risk of explosion and radioactive leakage but also allows the reactor vessel and its components to be lightweight, lowering equipment costs. So far no commercial MSR has been operated. However, the MSR concept and its technical validation date back to the 1960s and 1970s, when scientists and engineers at ORNL (Oak Ridge National Laboratory) in the United States built and ran the world's first civilian molten salt reactor, the MSRE (Molten Salt Reactor Experiment). The MSRE was an experimental liquid-fueled reactor with 10 MW thermal output using {sup 7}LiF-BeF{sub 2}-ZrF{sub 4}-UF{sub 4} as both the fuel carrier and the coolant itself. The MSRE is taken as an important reference case by many current research efforts to validate their codes and simulations, and it serves as the benchmark for this thesis as well. The thesis consists of two main parts. The first part is about the validation of the current code for the old MSRE concept, while the second

  10. Assessing the validity of using serious game technology to analyze physician decision making.

    Science.gov (United States)

    Mohan, Deepika; Angus, Derek C; Ricketts, Daniel; Farris, Coreen; Fischhoff, Baruch; Rosengart, Matthew R; Yealy, Donald M; Barnato, Amber E

    2014-01-01

    Physician non-compliance with clinical practice guidelines remains a critical barrier to high quality care. Serious games (using gaming technology for serious purposes) have emerged as a method of studying physician decision making. However, little is known about their validity. We created a serious game and evaluated its construct validity. We used the decision context of trauma triage in the Emergency Department of non-trauma centers, given widely accepted guidelines that recommend the transfer of severely injured patients to trauma centers. We designed cases with the premise that the representativeness heuristic influences triage (i.e. physicians make transfer decisions based on archetypes of severely injured patients rather than guidelines). We randomized a convenience sample of emergency medicine physicians to a control or cognitive load arm, and compared performance (disposition decisions, number of orders entered, time spent per case). We hypothesized that cognitive load would increase the use of heuristics, increasing the transfer of representative cases and decreasing the transfer of non-representative cases. We recruited 209 physicians, of whom 168 (79%) began and 142 (68%) completed the task. Physicians transferred 31% of severely injured patients during the game, consistent with rates of transfer for severely injured patients in practice. They entered the same average number of orders in both arms (control (C): 10.9 [SD 4.8] vs. cognitive load (CL):10.7 [SD 5.6], p = 0.74), despite spending less time per case in the control arm (C: 9.7 [SD 7.1] vs. CL: 11.7 [SD 6.7] minutes, p<0.01). Physicians were equally likely to transfer representative cases in the two arms (C: 45% vs. CL: 34%, p = 0.20), but were more likely to transfer non-representative cases in the control arm (C: 38% vs. CL: 26%, p = 0.03). We found that physicians made decisions consistent with actual practice, that we could manipulate cognitive load, and that load increased the

  11. Assessing the validity of using serious game technology to analyze physician decision making.

    Directory of Open Access Journals (Sweden)

    Deepika Mohan

    BACKGROUND: Physician non-compliance with clinical practice guidelines remains a critical barrier to high quality care. Serious games (using gaming technology for serious purposes) have emerged as a method of studying physician decision making. However, little is known about their validity. METHODS: We created a serious game and evaluated its construct validity. We used the decision context of trauma triage in the Emergency Department of non-trauma centers, given widely accepted guidelines that recommend the transfer of severely injured patients to trauma centers. We designed cases with the premise that the representativeness heuristic influences triage (i.e. physicians make transfer decisions based on archetypes of severely injured patients rather than guidelines). We randomized a convenience sample of emergency medicine physicians to a control or cognitive load arm, and compared performance (disposition decisions, number of orders entered, time spent per case). We hypothesized that cognitive load would increase the use of heuristics, increasing the transfer of representative cases and decreasing the transfer of non-representative cases. FINDINGS: We recruited 209 physicians, of whom 168 (79%) began and 142 (68%) completed the task. Physicians transferred 31% of severely injured patients during the game, consistent with rates of transfer for severely injured patients in practice. They entered the same average number of orders in both arms (control (C): 10.9 [SD 4.8] vs. cognitive load (CL): 10.7 [SD 5.6], p = 0.74), despite spending less time per case in the control arm (C: 9.7 [SD 7.1] vs. CL: 11.7 [SD 6.7] minutes, p<0.01). Physicians were equally likely to transfer representative cases in the two arms (C: 45% vs. CL: 34%, p = 0.20), but were more likely to transfer non-representative cases in the control arm (C: 38% vs. CL: 26%, p = 0.03). CONCLUSIONS: We found that physicians made decisions consistent with actual practice, that we could

  12. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  13. Validation of WIMS-SNAP code systems for calculations in TRIGA-MARK II type reactors; Validacion del sistema de codigos WIMS-SNAP para calculos en reactores nucleares tipo TRIGA-MARK II

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez Valle, S.; Lopez Aldama, D. [Centro de Investigaciones Nucleares, Tecnologicas y Ambientales, La Habana (Cuba). E-mail: svalle@ctn.isctn.edu.cu

    2000-07-01

The following paper contributes to validating the Nuclear Engineering Department's methods for carrying out calculations in TRIGA reactors by solving a benchmark. The benchmark is analyzed with the WIMS-D/4-SNAP/3D code system using the WIMS-TRIGA cross-section library. A brief description is given of the DSN method used in the WIMS-D/4 code, and the SNAP-3D code is also briefly explained. The results are presented and compared with the experimental values. In addition, the possible error sources are analyzed. (author)

  14. Application of Bar Code Technology in Aircraft Spare Parts Management

    Institute of Scientific and Technical Information of China (English)

    李卫灵; 郭星香; 刘臣宇; 周斌

    2011-01-01

To improve the level of automation in aircraft spare parts management, this paper proposes the use of two-dimensional QR bar-code technology, and explains the coding scheme and the reengineering of the working procedures. Application of bar-code technology will greatly enhance the level of automation of aircraft spare parts management.

  15. Identification of mRNA-like non-coding RNAs and validation of a mighty one named MAR in Panax ginseng

    Institute of Scientific and Technical Information of China (English)

    Meizhen Wang; Bin Wu; Chao Chen; Shanfa Lu

    2015-01-01

Increasing evidence suggests that long non-coding RNAs (lncRNAs) play significant roles in plants. However, little is known about lncRNAs in Panax ginseng C. A. Meyer, an economically significant medicinal plant species. A total of 3,688 mRNA-like non-coding RNAs (mlncRNAs), a class of lncRNAs, were identified in P. ginseng. Approximately 40% of the identified mlncRNAs were processed into small RNAs, implying their regulatory roles via small RNA-mediated mechanisms. Eleven miRNA-generating mlncRNAs also produced siRNAs, suggesting the coordinated production of miRNAs and siRNAs in P. ginseng. The mlncRNA-derived small RNAs might be 21-, 22-, or 24-nt phased and could be generated from both or only one strand of mlncRNAs, or from super long hairpin structures. A full-length mlncRNA, termed MAR (multiple-function-associated mlncRNA), was cloned. It generated the most abundant siRNAs. The MAR siRNAs were predominantly 24-nt and some of them were distributed in a phased pattern. A total of 228 targets were predicted for 71 MAR siRNAs. Degradome sequencing validated 68 predicted targets involved in diverse metabolic pathways, suggesting the significance of MAR in P. ginseng. Consistently, MAR was detected in all tissues analyzed and responded to methyl jasmonate (MeJA) treatment. It sheds light on the function of mlncRNAs in plants.

  16. Identification of mRNA-like non-coding RNAs and validation of a mighty one named MAR in Panax ginseng.

    Science.gov (United States)

    Wang, Meizhen; Wu, Bin; Chen, Chao; Lu, Shanfa

    2015-03-01

    Increasing evidence suggests that long non-coding RNAs (lncRNAs) play significant roles in plants. However, little is known about lncRNAs in Panax ginseng C. A. Meyer, an economically significant medicinal plant species. A total of 3,688 mRNA-like non-coding RNAs (mlncRNAs), a class of lncRNAs, were identified in P. ginseng. Approximately 40% of the identified mlncRNAs were processed into small RNAs, implying their regulatory roles via small RNA-mediated mechanisms. Eleven miRNA-generating mlncRNAs also produced siRNAs, suggesting the coordinated production of miRNAs and siRNAs in P. ginseng. The mlncRNA-derived small RNAs might be 21-, 22-, or 24-nt phased and could be generated from both or only one strand of mlncRNAs, or from super long hairpin structures. A full-length mlncRNA, termed MAR (multiple-function-associated mlncRNA), was cloned. It generated the most abundant siRNAs. The MAR siRNAs were predominantly 24-nt and some of them were distributed in a phased pattern. A total of 228 targets were predicted for 71 MAR siRNAs. Degradome sequencing validated 68 predicted targets involved in diverse metabolic pathways, suggesting the significance of MAR in P. ginseng. Consistently, MAR was detected in all tissues analyzed and responded to methyl jasmonate (MeJA) treatment. It sheds light on the function of mlncRNAs in plants. © 2014 Institute of Botany, Chinese Academy of Sciences.
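The 24-nt phasing described above can be illustrated with a toy register check, which is not the authors' pipeline: given the 5' start coordinates of mapped small RNAs, it measures how many fall into the dominant 24-nt register. The function name and the example coordinates are hypothetical.

```python
from collections import Counter

def phase_register_fraction(starts, phase=24):
    """Fraction of small-RNA 5' start positions that fall in the
    dominant `phase`-nt register; 1.0 means perfectly phased."""
    if not starts:
        return 0.0
    registers = Counter(s % phase for s in starts)
    return max(registers.values()) / len(starts)

phased = [i * 24 for i in range(10)]        # perfectly 24-nt phased
scattered = [0, 5, 11, 30, 47, 52, 70, 91]  # no common register
assert phase_register_fraction(phased) == 1.0
assert phase_register_fraction(scattered) == 0.125
```

Real phasing analyses use more elaborate statistics (phasing scores over sliding windows), but the register idea is the same.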

  17. Extension of radiative transfer code MOMO, matrix-operator model to the thermal infrared - Clear air validation by comparison to RTTOV and application to CALIPSO-IIR

    Science.gov (United States)

    Doppler, Lionel; Carbajal-Henken, Cintia; Pelon, Jacques; Ravetta, François; Fischer, Jürgen

    2014-09-01

1-D radiative transfer code Matrix-Operator Model (MOMO) has been extended from the [0.2-3.65 μm] band to the whole [0.2-100 μm] spectrum. MOMO can now be used for the computation of a full range of radiation budgets (shortwave and longwave). This extension to the longwave part of the electromagnetic spectrum required considering radiative transfer processes that are characteristic of the thermal infrared: the spectroscopy of the water vapor self- and foreign-continuum of absorption at 12 μm, and the emission of radiation by gases, aerosols, clouds and the surface. MOMO's spectroscopy module, Coefficient of Gas Absorption (CGASA), has been developed for computation of gas extinction coefficients, considering continua and spectral line absorption. The spectral dependences of gas emission/absorption coefficients and of Planck's function are treated using a k-distribution. The emission of radiation is implemented in the adding-doubling process of the matrix operator method using Schwarzschild's approach to the radiative transfer equation (a purely absorbing/emitting medium, namely without scattering). Within a layer, the Planck function is assumed to have an exponential dependence on the optical depth. In this paper, validation tests are presented for clear-air case studies: comparisons to the analytical solution of a monochromatic Schwarzschild case without scattering show an error of less than 0.07% for a realistic atmosphere with an optical depth and a blackbody temperature that decrease linearly with altitude. Comparisons to the radiative transfer code RTTOV are presented for simulations of top-of-atmosphere brightness temperature for channels of the space-borne instrument MODIS. Results show an agreement varying from 0.1 K to less than 1 K depending on the channel. Finally, MOMO results are compared to CALIPSO Infrared Imager Radiometer (IIR) measurements for clear-air cases. A good agreement was found between computed and observed radiance: biases are smaller than 0.5 K
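The Schwarzschild approach mentioned above applies to a purely absorbing/emitting medium with no scattering, for which the monochromatic radiative transfer equation takes its standard textbook form; the notation below is generic rather than MOMO's own:

```latex
% Radiative transfer without scattering (Schwarzschild form):
% the intensity I_nu relaxes toward the Planck source function B_nu(T)
% along the optical-depth coordinate tau_nu (mu = cosine of zenith angle).
\mu \frac{\mathrm{d}I_\nu}{\mathrm{d}\tau_\nu} = I_\nu - B_\nu(T)
```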

  18. 78 FR 23472 - Amendments to Existing Validated End-User Authorizations: CSMC Technologies Corporation in the...

    Science.gov (United States)

    2013-04-19

    ... include commodities, software, and technology, except those controlled for missile technology or crime... information against the VEU authorization criteria. Given the nature of the review, and in light of...

  19. Validating the Technology Proficiency Self-Assessment Questionnaire for 21st Century Learning (TPSA C-21)

    Science.gov (United States)

    Christensen, Rhonda; Knezek, Gerald

    2017-01-01

Accurately measuring levels of technology proficiency in current and future classroom teachers is an important first step toward enhancing comfort level and confidence in integrating technology into the educational environment. The original Technology Proficiency Self-Assessment (TPSA) survey has maintained respectable psychometric properties for…

  20. Integration of electronic nose technology with spirometry: validation of a new approach for exhaled breath analysis

    NARCIS (Netherlands)

    de Vries, R.; Brinkman, P.; van der Schee, M.P.; Fens, N.; Dijkers, E.; Bootsma, S.K.; de Jongh, Franciscus H.C.; Sterk, P.J.

    2015-01-01

    New 'omics'-technologies have the potential to better define airway disease in terms of pathophysiological and clinical phenotyping. The integration of electronic nose (eNose) technology with existing diagnostic tests, such as routine spirometry, can bring this technology to 'point-of-care'. We

  1. Validating the Technology Proficiency Self-Assessment Questionnaire for 21st Century Learning (TPSA C-21)

    Science.gov (United States)

    Christensen, Rhonda; Knezek, Gerald

    2017-01-01

Accurately measuring levels of technology proficiency in current and future classroom teachers is an important first step toward enhancing comfort level and confidence in integrating technology into the educational environment. The original Technology Proficiency Self-Assessment (TPSA) survey has maintained respectable psychometric properties for…

  2. Wake measurements for code validations

    DEFF Research Database (Denmark)

    Hansen, Kurt Schaldemose

    2009-01-01

    As part of the EU-TOPFARM project a large number of datasets have been identified for verification of wind farm climate models, aeroelastic load and production models of turbines subjected to three dimensional dynamic wake wind field and the aeroelastic production modeling of a whole wind farm de...

  3. On the safety and performance demonstration tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and validation and verification of computational codes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Bum; Jeong, Ji Young; Lee, Tae Ho; Kim, Sung Kyun; Euh, Dong Jin; Joo, Hyung Kook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V and V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer codes V and V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program with the integral effect tests facility, STELLA-2, is in the detailed design stage of the design process. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core and the dynamic characteristic test of upper internal structure has been performed for the seismic analysis model for the PGSFR. The performance tests for control rod assemblies (CRAs) have been conducted for control rod drive mechanism driving parts and drop tests of the CRA under scram condition were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR were explained with significant results.

  4. Achievement Emotions in Technology Enhanced Learning: Development and Validation of Self-Report Instruments in the Italian Context

    Directory of Open Access Journals (Sweden)

    Daniela Raccanello

    2015-02-01

Full Text Available The increased use of technology within the educational field gives rise to the need for developing valid instruments to measure key constructs associated with performance. We present some self-report instruments developed and/or validated in the Italian context that could be used to assess achievement emotions and correlates, within the theoretical framework of Pekrun's control-value model. First, we propose some data related to the construction of two instruments developed to assess ten achievement emotions: the Brief Achievement Emotions Questionnaire (BR-AEQ), used with college students, and the Graduated Achievement Emotions Set (GR-AES), used with primary school students. Second, we describe some data concerning the validation within the Italian context of two instruments assessing achievement goals as antecedents of achievement emotions: the Achievement Goal Questionnaire-Revised (AGQ-R), and its more recent version based on the 3 X 2 achievement goal model.

  5. Research on universal combinatorial coding.

    Science.gov (United States)

    Lu, Jun; Zhang, Zhuo; Mo, Juan

    2014-01-01

The concept of universal combinatorial coding is proposed. Relations exist, to varying degrees, among many coding methods, which suggests that a universal coding method objectively exists and can serve as a bridge connecting many coding methods. Universal combinatorial coding is lossless and is based on combinatorics. Its combinational and exhaustive properties make it closely related to existing coding methods. Universal combinatorial coding does not depend on the probability statistics of the information source, and it has characteristics spanning the three coding branches. The paper analyzes the relationship between universal combinatorial coding and a variety of coding methods and investigates several application technologies of this coding method. In addition, the efficiency of universal combinatorial coding is analyzed theoretically. The multi-characteristic and multi-application nature of universal combinatorial coding is unique among existing coding methods. Universal combinatorial coding has both theoretical research value and practical application value.
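The abstract gives no algorithm, but a classic instance of lossless combinatorial (enumerative) coding is ranking a k-subset in the combinatorial number system; the sketch below is illustrative only and is not the paper's method.

```python
from math import comb

def rank(combination):
    """Losslessly encode a k-subset of non-negative integers as one
    integer via the combinatorial number system."""
    return sum(comb(c, i + 1) for i, c in enumerate(sorted(combination)))

def unrank(r, k):
    """Decode the integer produced by rank() back into the k-subset."""
    subset = []
    for i in range(k, 0, -1):
        c = i - 1                   # smallest candidate element
        while comb(c + 1, i) <= r:  # find largest c with comb(c, i) <= r
            c += 1
        subset.append(c)
        r -= comb(c, i)
    return sorted(subset)

assert rank([1, 4, 6]) == 27       # comb(1,1) + comb(4,2) + comb(6,3)
assert unrank(27, 3) == [1, 4, 6]  # exact round trip, i.e. lossless
```

The ranks of all k-subsets of an n-set fill 0 … comb(n, k) − 1 with no gaps, which is what makes the encoding both compact and lossless.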

  6. [Translation and validation of the Quebec User Evaluation of Satisfaction with Assistive Technology (QUEST 2.0) into Portuguese].

    Science.gov (United States)

    de Carvalho, Karla Emanuelle Cotias; Gois Júnior, Miburge Bolívar; Sá, Katia Nunes

    2014-01-01

To translate and validate the Quebec User Evaluation of Satisfaction with Assistive Technology (QUEST 2.0) into Brazilian Portuguese. Certified translators translated and back-translated the QUEST. Content validity (CVI) was determined by 5 experts and, after the final version of B-Quest, a pre-test was applied to users of manual wheelchairs, walkers and crutches. The psychometric properties were tested to assure the validity of items and the reliability and stability of the scale. Data were obtained from 121 users of the above-mentioned devices. Our study showed a CVI of 91.66% and a satisfactory factor analysis with respect to the two-dimensional structure of the instrument that ensured the representativeness of the items. The Cronbach's alpha of the items device, service and total score of B-Quest were 0.862, 0.717 and 0.826, respectively. Test-retest stability conducted after a time interval of 2 months was analyzed using Spearman's correlation test, which showed high correlation (ρ >0.6) for most items. The study suggests that the B-Quest is a reliable, representative, and valid instrument to measure the satisfaction of users of assistive technology in Brazil. Copyright © 2014 Elsevier Editora Ltda. All rights reserved.
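The Cronbach's alpha values reported above come from the standard internal-consistency formula, which can be computed directly from item scores; this is a generic sketch, not the authors' analysis code.

```python
def cronbach_alpha(items):
    """Cronbach's alpha; items[i][j] is respondent j's score on item i.
    Population variances are used (alpha is unchanged as long as the
    same variance convention is used in numerator and denominator)."""
    k, n = len(items), len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    sum_item_vars = sum(var(col) for col in items)
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return k / (k - 1) * (1 - sum_item_vars / var(totals))

# Three perfectly consistent items give alpha of (approximately) 1.0.
alpha = cronbach_alpha([[1, 2, 3], [1, 2, 3], [1, 2, 3]])
```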

  7. Training program for energy conservation in new building construction. Volume III. Energy conservation technology for plan examiners and code administrators. Energy Conservation Technology Series 200

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

Under the sponsorship of the United States Department of Energy, a Model Code for Energy Conservation in New Building Construction has been developed by those national organizations primarily concerned with the development and promulgation of model codes. The technical provisions are based on ASHRAE Standard 90-75 and are intended for use by state and local officials. The subject of regulation of new building construction to assure energy conservation is recognized as one in which code officials have not had previous exposure. It was also determined that application of the model code would be made at varying levels by officials with both a specific requirement for knowledge and a differing degree of prior training in the state-of-the-art. Therefore, a training program and instructional materials were developed for code officials to assist them in the implementation and enforcement of energy efficient standards and codes. The training program for Energy Conservation Technology for Plan Examiners and Code Administrators (ECT Series 200) is presented.

  8. Anti-malicious Code Technology Based on SSDT Restoration

    Institute of Scientific and Technical Information of China (English)

    陈萌

    2009-01-01

By modifying the System Service Dispatch Table (SSDT), malicious code can avoid being removed by antivirus or anti-malware software. Aiming at SSDT hooking techniques, this paper implements an anti-malicious-code technique based on SSDT restoration through system file relocation. It describes the method of reloading the Ntoskrnl.exe file and the method of offset comparison against Ntoskrnl.exe. Experimental results show that this technique can invalidate malicious code and Trojan programs and safeguard system security.

  9. Validity of Business Strategy as Driver in Technology Management – A Critical Discussion

    DEFF Research Database (Denmark)

    Tambo, Torben; Østergaard, Klaus

    2015-01-01

Frameworks for technological development are increasingly requiring that technology must be developed in accordance with the corporate business strategy. It is an interesting tendency that technological development should reflect and interact with central change processes of the enterprise… in connecting technological design tightly to the business strategy. The purpose of this paper is to advance a research agenda, where long-term orientation of technology is connected to the necessary tools for obtaining insight in assessing adequacy, reliability and quality of business strategy and evaluation… This is however colliding with challenges in case of normative or prescriptive strategies: such strategies can be erroneous, misrepresenting, unsubstantiated, short-lived, and centered narrowly in internal top-level management processes. This paper discusses advantages, disadvantages and alternatives

  10. On the Safety and Performance Demonstration Tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and Validation and Verification of Computational Codes

    Directory of Open Access Journals (Sweden)

    Jong-Bum Kim

    2016-10-01

Full Text Available The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V&V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer codes V&V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program with the integral effect tests facility, STELLA-2, is in the detailed design stage of the design process. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core and the dynamic characteristic test of upper internal structure has been performed for the seismic analysis model for the PGSFR. The performance tests for control rod assemblies (CRAs) have been conducted for control rod drive mechanism driving parts and drop tests of the CRA under scram condition were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR were explained with significant results.

  11. Validation of a commercial TPS based on the VMC(++) Monte Carlo code for electron beams: commissioning and dosimetric comparison with EGSnrc in homogeneous and heterogeneous phantoms.

    Science.gov (United States)

    Ferretti, A; Martignano, A; Simonato, F; Paiusco, M

    2014-02-01

The aim of the present work was the validation of the VMC(++) Monte Carlo (MC) engine implemented in the Oncentra Masterplan (OMTPS) and used to calculate the dose distribution produced by the electron beams (energy 5-12 MeV) generated by the linear accelerator (linac) Primus (Siemens), shaped by a digital variable applicator (DEVA). The BEAMnrc/DOSXYZnrc (EGSnrc package) MC model of the linac head was used as a benchmark. Commissioning results for both MC codes were evaluated by means of 1D Gamma Analysis (2%, 2 mm), calculated with a home-made Matlab (The MathWorks) program, comparing the calculations with the measured profiles. The results of the commissioning of OMTPS were good [average gamma index (γ) > 97%]; some mismatches were found with large beams (size ≥ 15 cm). The optimization of the BEAMnrc model required to increase the beam exit window to match the calculated and measured profiles (final average γ > 98%). Then OMTPS dose distribution maps were compared with DOSXYZnrc with a 2D Gamma Analysis (3%, 3 mm), in 3 virtual water phantoms: (a) with an air step, (b) with an air insert, and (c) with a bone insert. The OMTPS and EGSnrc dose distributions with the air-water step phantom were in very high agreement (γ ∼ 99%), while for heterogeneous phantoms there were differences of about 9% in the air insert and of about 10-15% in the bone region. This is due to the Masterplan implementation of VMC(++), which reports the dose as "dose to water" instead of "dose to medium".
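A 1D gamma analysis like the home-made Matlab program described above can be sketched as follows. The defaults match the 2%/2 mm criteria from the abstract, but the function itself is a hypothetical, simplified global-normalization version, not the authors' implementation.

```python
import math

def gamma_pass_rate(ref, ev, spacing_mm, dose_tol=0.02, dta_mm=2.0):
    """1D global gamma analysis (default 2%, 2 mm): fraction of
    reference profile points whose gamma index is <= 1, searching
    every point of the evaluated profile."""
    d_max = max(ref)  # global dose normalization
    passed = 0
    for i, dr in enumerate(ref):
        gamma_sq = min(
            ((i - j) * spacing_mm / dta_mm) ** 2
            + ((de - dr) / (dose_tol * d_max)) ** 2
            for j, de in enumerate(ev)
        )
        if math.sqrt(gamma_sq) <= 1.0:
            passed += 1
    return passed / len(ref)
```

Identical profiles pass at 100%; a profile that disagrees everywhere in both dose and position fails everywhere.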

  12. Establishing and evaluating bar-code technology in blood sampling system: a model based on human-centered design method.

    Science.gov (United States)

    Chou, Shin-Shang; Yan, Hsiu-Fang; Huang, Hsiu-Ya; Tseng, Kuan-Jui; Kuo, Shu-Chen

    2012-01-01

This study used a human-centered design method to develop bar-code technology for the blood sampling process. Information was gathered through multilevel analysis, and the bar-code technology was constructed to identify patients, simplify the work process, and prevent medical errors. A Technology Acceptance Model questionnaire was developed to assess the effectiveness of the system, and data on patient identification and sample errors were collected daily. The average score on the 8-item perceived-ease-of-use scale was 25.21 (3.72), on the 9-item perceived-usefulness scale 28.53 (5.00), and on the 14-item task-technology-fit scale 52.24 (7.09). The rates of patient identification errors and of samples with cancelled orders fell to zero; however, a new type of error, concerning the position of barcode stickers on the sample tubes, emerged after the new system was deployed. Overall, more than half of the nurses (62.5%) were willing to use the new system.

  13. Global Analysis of Non-coding Small RNAs in Arabidopsis in Response to Jasmonate Treatment by Deep Sequencing Technology

    Institute of Scientific and Technical Information of China (English)

    Bosen Zhang; Zhiping Jin; Daoxin Xie

    2012-01-01

In plants, non-coding small RNAs play a vital role in development and stress responses. To explore the possible role of non-coding small RNAs in the regulation of the jasmonate (JA) pathway, we compared the non-coding small RNAs between the JA-deficient aos mutant and the JA-treated wild-type Arabidopsis via high-throughput sequencing. Thirty new miRNAs and 27 new miRNA candidates were identified through a bioinformatics approach. Forty-nine known miRNAs (belonging to 24 families), 15 new miRNAs and new miRNA candidates (belonging to 11 families) and 3 tasiRNA families were induced by JA, whereas 1 new miRNA, 1 tasiRNA family and 22 known miRNAs (belonging to 9 families) were repressed by JA.

  14. Clinical validation of NGS technology for HLA: An early adopter's perspective.

    Science.gov (United States)

    Weimer, Eric T

    2016-10-01

    Clinical validation of NGS for HLA typing has been a topic of interest with many laboratories investigating the merits. NGS has proven effective at reducing ambiguities and costs while providing more detailed information on HLA genes not previously sequenced. The ability of NGS to multiplex many patients within a single run presents unique challenges and sequencing new regions of HLA genes requires application of our knowledge of genetics to accurately determine HLA typing. This review represents my laboratory's experience in validation of NGS for HLA typing. It describes the obstacles faced with validation of NGS and is broken down into pre-analytic, analytic, and post-analytic challenges. Each section includes solutions to address them.

  15. Global Positioning System Technology (GPS) for Psychological Research: A Test of Convergent and Nomological Validity

    Directory of Open Access Journals (Sweden)

    Pedro Wolf

    2013-06-01

Full Text Available The purpose of this paper is to examine the convergent and nomological validity of a GPS-based measure of daily activity, operationalized as Number of Places Visited (NPV). Relations among the GPS-based measure and two self-report measures of NPV, as well as relations among NPV and two factors made up of self-reported individual differences, were examined. The first factor was composed of variables related to an Active Lifestyle (AL) (e.g. positive affect, extraversion…) and the second factor was composed of variables related to a Sedentary Lifestyle (SL) (e.g. depression, neuroticism…). NPV was measured over a four-day period. This timeframe was made up of two weekdays and two weekend days. A bi-variate analysis established one level of convergent validity, and a Split-Plot GLM examined convergent validity, nomological validity, and alternative hypotheses related to constraints on activity throughout the week simultaneously. The first analysis revealed significant correlations among NPV measures (weekday, weekend, and the entire four-day block), supporting the convergent validity of the Diary-, Google Maps-, and GPS-NPV measures. Results from the second analysis, indicating non-significant mean differences in NPV regardless of method, also support this conclusion. We also found that AL is a statistically significant predictor of NPV no matter how NPV was measured. We did not find a statistically significant relation between NPV and SL. These results permit us to infer that the GPS-based NPV measure has convergent and nomological validity.
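One simple way to operationalize Number of Places Visited (NPV) from raw GPS fixes, purely as an illustration and not the authors' procedure, is greedy distance-threshold clustering of (lat, lon) points; the 100 m radius is an assumed parameter.

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6371000.0  # mean Earth radius in metres
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def places_visited(fixes, radius_m=100.0):
    """Greedy count of distinct places: a fix farther than `radius_m`
    from every place seen so far opens a new place."""
    places = []
    for fix in fixes:
        if all(haversine_m(fix, p) > radius_m for p in places):
            places.append(fix)
    return len(places)

# Two fixes ~7 m apart collapse into one place; a third ~11 km away
# opens a second place.
assert places_visited([(48.0, 11.0), (48.0, 11.0001), (48.1, 11.0)]) == 2
```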

  16. Progress and status of the OpenMC Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Romano, P. K.; Herman, B. R.; Horelik, N. E.; Forget, B.; Smith, K. [Massachusetts Institute of Technology, Department of Nuclear Science and Engineering, 77 Massachusetts Avenue, Cambridge, MA 02139 (United States); Siegel, A. R. [Argonne National Laboratory, Theory and Computing Sciences and Nuclear Engineering Division (United States)

    2013-07-01

    The present work describes the latest advances and progress in the development of the OpenMC Monte Carlo code, an open-source code originating from the Massachusetts Institute of Technology. First, an overview of the development workflow of OpenMC is given. Various enhancements to the code such as real-time XML input validation, state points, plotting, OpenMP threading, and coarse mesh finite difference acceleration are described. (authors)
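A minimal sketch of the kind of real-time XML input validation mentioned above, using only the standard library: it checks well-formedness and the presence of required child elements. The required-element set and file shape here are hypothetical illustrations, not OpenMC's actual validation logic.

```python
import xml.etree.ElementTree as ET

# Hypothetical required children for a settings-style input file.
REQUIRED = {"particles", "batches"}

def validate_settings(xml_text):
    """Return a list of problems: parse errors or missing elements."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as exc:
        return [f"not well-formed: {exc}"]
    present = {child.tag for child in root}
    return [f"missing <{tag}>" for tag in sorted(REQUIRED - present)]

good = "<settings><particles>1000</particles><batches>50</batches></settings>"
assert validate_settings(good) == []  # a complete input passes cleanly
```

Production codes typically validate against a formal schema instead, but the failure modes (malformed XML, missing elements) are the same.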

  17. XML based tools for assessing potential impact of advanced technology space validation

    Science.gov (United States)

    Some, Raphael R.; Weisbin, Charles

    2004-01-01

    A hierarchical XML database and related analysis tools are being developed by the New Millennium Program to provide guidance on the relative impact, to future NASA missions, of advanced technologies under consideration for developmental funding.

  18. A survey on the high reliability software verification and validation technology for instrumentation and control in NPP.

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Lee, Chang Soo; Dong, In Sook [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-01-01

This document presents the technical status of the software verification and validation (V and V) efforts to support developing and licensing digital instrumentation and control (I and C) systems in nuclear power plants. We have reviewed codes and standards that serve as consensus criteria among vendor, licensee, and licenser. We have then described the software licensing procedures under 10 CFR 50 and 10 CFR 52 of the United States to cope with the licensing barrier. Finally, we have surveyed the technical issues related to developing and licensing high-integrity software for digital I and C systems. These technical issues indicate the direction for developing our own software V and V methodology. (Author) 13 refs., 2 figs.

  19. Rhodopsin in plasma from patients with diabetic retinopathy - development and validation of digital ELISA by Single Molecule Array (Simoa) technology

    DEFF Research Database (Denmark)

    Petersen, Eva Rabing Brix; Olsen, Dorte Aalund; Christensen, Henry

    2017-01-01

BACKGROUND: Diabetic retinopathy (DR) is the most frequent cause of blindness among younger adults in the western world. No blood biomarkers exist to detect DR. Hypothetically, the Rhodopsin concentration in blood has been suggested as an early marker for retinal damage. The aim of this study was therefore to develop and validate a Rhodopsin assay by employing digital ELISA technology, and to investigate whether Rhodopsin concentrations in diabetes patients with DR are elevated compared with diabetes patients without DR. METHODS: A digital ELISA assay using a Simoa HD-1 Analyzer (Quanterix©, Lexington, MA 02421, USA) was developed and validated and applied on a cohort of diabetes patients characterised with (n=466) and without (n=144) DR. RESULTS: The Rhodopsin assay demonstrated a LOD of 0.26 ng/l, a LLOQ of 3 ng/l and a linear measuring range from 3 to 2500 ng/l. Total CV% was 32%, 23%, 19...
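The assay metrics reported above (LOD, CV%) follow standard definitions; the sketch below uses one common convention (blank mean + 3 SD for the detection limit) and is generic, not the study's validation protocol.

```python
import statistics

def cv_percent(values):
    """Coefficient of variation in percent (sample SD / mean * 100)."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def lod(blank_values, k=3.0):
    """Classical limit-of-detection estimate: blank mean + k * blank SD.
    k = 3 is one common convention; guideline-based methods differ."""
    return statistics.mean(blank_values) + k * statistics.stdev(blank_values)
```

For example, replicate blank readings of 1, 2 and 3 ng/l give an LOD of 5 ng/l and a CV of 50%.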

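The validation metrics reported above (CV% and LOD) follow standard definitions. A minimal sketch of those definitions, assuming the common mean-blank-plus-3-SD convention and a linear calibration (the helper names and numbers below are illustrative, not the Simoa protocol):

```python
import statistics

def cv_percent(replicates):
    # coefficient of variation: standard deviation as a percentage of the mean
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

def limit_of_detection(blank_signals, slope, intercept):
    # a common convention: LOD is the concentration whose expected signal equals
    # mean(blank) + 3*SD(blank), read off a linear calibration signal = slope*c + intercept
    cutoff = statistics.mean(blank_signals) + 3.0 * statistics.stdev(blank_signals)
    return (cutoff - intercept) / slope
```

A total CV of 32% at low concentrations, as reported, simply means the replicate SD is about a third of the mean at that level.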
  20. Utilization of a Text and Translation Application for Communication With a Foreign Deaf Family: A Call for Validation of This Technology-A Case Report.

    Science.gov (United States)

    Fernandez, Patrick G; Brockel, Megan A; Lipscomb, Lisa L; Ing, Richard J; Tailounie, Muayyad

    2017-07-15

    Effective communication with patients is essential to quality care. Obviously, language barriers significantly impact this and can increase the risk of poor patient outcomes. Smartphones and mobile health technology are valuable resources that are beginning to break down language barriers in health care. We present a case of a challenging language barrier where successful perioperative communication was achieved using mobile technology. Although quite beneficial, use of technology that is not validated exposes providers to unnecessary medicolegal risk. We hope to highlight the need for validation of such technology to ensure that these tools are an effective way to accurately communicate with patients in the perioperative setting.

  1. On Asymmetric Quantum MDS Codes

    CERN Document Server

    Ezerman, Martianus Frederic; Ling, San

    2010-01-01

    Assuming the validity of the MDS Conjecture, the weight distribution of all MDS codes is known. Using a recently-established characterization of asymmetric quantum error-correcting codes, linear MDS codes can be used to construct asymmetric quantum MDS codes with $d_{z} \geq d_{x} \geq 2$ for all possible values of length $n$ for which linear MDS codes over $\mathbb{F}_{q}$ are known to exist.

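The known weight distribution referred to here is the classical MDS formula (a textbook fact, not stated in the abstract): for an $[n,k,d]$ MDS code over $\mathbb{F}_q$ with $d = n-k+1$,

```latex
A_w = \binom{n}{w} \sum_{j=0}^{w-d} (-1)^{j} \binom{w}{j} \left( q^{\,w-d+1-j} - 1 \right),
\qquad d \le w \le n .
```
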
  2. Validity of Eureka initiative: discourse by Italian Minister for University and Scientific and Technological Research

    Energy Technology Data Exchange (ETDEWEB)

    1990-07-01

    A broad review is given of the evolution of the aims and objectives of Eureka, a European-based, coordinated international research and development program. Whereas initial projects were concentrated on the use of technology to restore areas which have suffered environmental damage, present proposals are being geared towards the development of preventive techniques. Robotics research is also being strengthened. With the aim of optimizing conditions for a more dynamic, collaborative research effort by participating high-tech firms, research centers and universities, a data bank is being developed whose aim is to identify and classify areas of technological and scientific expertise among participants. Efforts are being made to complement Eureka activities with European Community technological development goals and to augment the involvement of Third World countries.

  3. Research on Index Coding Technology in Vehicular Network

    Institute of Scientific and Technical Information of China (English)

    夏彬; 王光浩; 吴越

    2015-01-01

    In vehicular networks, broadcast messages often fail to arrive and be received correctly due to the vulnerability of wireless channels and the high mobility of vehicles. To solve this problem, this paper proposes an index-coding-based message broadcasting scheme to improve message transmission efficiency. Index coding is a variant of source coding that exploits the side information held at different receivers, and this paper focuses on applying index coding to message broadcasting in vehicular networks. It proposes a distributed feedback-based side-information collection mechanism and an improved graph coloring algorithm to find the maximum clique, on which the index coding is then performed. Simulation results show that the scheme can reduce the number of transmissions, thus saving wireless channel bandwidth and improving broadcast efficiency.

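The clique-based encoding described above can be sketched as follows. This is a toy illustration of the general idea (receivers that mutually hold each other's wanted packet share one XOR-coded broadcast), under assumed data structures; it is not the paper's algorithm:

```python
from functools import reduce

def compatible(a, b):
    # two receivers can share one coded broadcast if each already holds
    # (or also wants) the packet the other is waiting for
    return a["want"] == b["want"] or (a["want"] in b["has"] and b["want"] in a["has"])

def greedy_clique_partition(receivers):
    # greedy clique cover of the side-information graph: each clique
    # will be served by a single XOR-coded transmission
    cliques = []
    for r in receivers:
        for c in cliques:
            if all(compatible(r, m) for m in c):
                c.append(r)
                break
        else:
            cliques.append([r])
    return cliques

def encode(clique, packets):
    # one broadcast per clique: XOR of the distinct wanted packets;
    # each member cancels the other packets using its side information
    wanted = sorted({m["want"] for m in clique})
    return reduce(lambda x, y: x ^ y, (packets[w] for w in wanted))
```

With three receivers where the first two hold each other's packet, this partition needs only two broadcasts instead of three, which is the transmission saving the abstract refers to.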
  4. A Trimodality Comparison of Volumetric Bone Imaging Technologies. Part I: Short-term Precision and Validity

    Science.gov (United States)

    Wong, Andy K. O.; Beattie, Karen A.; Min, Kevin K. H.; Webber, Colin E.; Gordon, Christopher L.; Papaioannou, Alexandra; Cheung, Angela M. W.; Adachi, Jonathan D.

    2016-01-01

    In vivo peripheral quantitative computed tomography (pQCT) and peripheral magnetic resonance imaging (pMRI) modalities can measure apparent bone microstructure at resolutions of 200 μm or finer. However, validity and in vivo test-retest reproducibility of apparent bone microstructure had yet to be determined for 1.0 T pMRI (196 μm) and pQCT (200 μm). This study examined 67 women with a mean age of 74 ± 9 yr and body mass index of 27.65 ± 5.74 kg/m2, demonstrating validity for trabecular separation from pMRI, and for cortical thickness and bone volume fraction from pQCT images, compared with high-resolution pQCT (hr-pQCT), with slopes close to unity. However, because of partial volume effects, cortical and trabecular thickness derived from pMRI and pQCT images matched hr-pQCT only when values were small. Short-term reproducibility of bone outcomes was highest for bone volume fraction (BV/TV) and densitometric variables and lowest for trabecular outcomes measuring microstructure. Measurements at the tibia from pQCT images were more precise than at the radius. In part I of this 3-part series focused on trimodality comparisons of precision and validity, it is shown that pQCT images can yield valid and reproducible apparent bone structural outcomes, but because of longer scan time and potential for more motion, the pMRI protocol examined here remains limited in achieving reliable values. PMID:25129405

  5. Digital Systems Validation Handbook. Volume 2. Chapter 18. Avionic Data Bus Integration Technology

    Science.gov (United States)

    1993-11-01

    U.S. Department of Transportation, Federal Aviation Administration, DOT/FAA/CT-88/10, Handbook Volume II, Digital Systems Validation, Chapter 18. ... improve identification, control, and auditing of software. SCM and SQA methods in RTCA/DO-178A are drawn directly from proven methods of hardware ... procedures, and practices; reviews and audits; configuration management; medium control; testing; supplier control; and appropriate records.

  6. Perceived Purchase Risk in the Technological Goods Purchase Context: An Instrument Development and Validation

    OpenAIRE

    Salehudin, Imam

    2010-01-01

    Each purchase decision is most likely to be a risky decision. Woodside and DeLozier (1976) proposed that consumer purchase-related behaviors correspond to the perceived level of risk in the purchase. Therefore, understanding consumers' perceived purchase risk is paramount for marketers, especially marketers of high-risk products. This study intends to develop a valid and reliable instrument for measuring consumers' perceived purchase risk using the concept of perceived risk by Peter and Ryan (...

  7. A new application and experimental validation of moulding technology for ferrite magnet assisted synchronous reluctance machine

    DEFF Research Database (Denmark)

    Wu, Qian; Lu, Kaiyuan; Rasmussen, Peter Omand

    2016-01-01

    This paper introduces a new application of moulding technology to the installation of ferrite magnet material into the rotor flux barriers of the Ferrite Magnet Assisted Synchronous Reluctance Machine (FASynRM). The feasibility of this application with respect to the manufacturing process and motor... performance has been demonstrated. In comparison to the conventional ferrite magnet installation approach, moulding technology has the obvious advantages of improved mechanical strength of the multi-flux-barrier rotor structure, a simplified installation process, and reduced processing cost, while at the same time...

  8. Translation and validation of the parent-adolescent communication scale: technology for DST/HIV prevention.

    Science.gov (United States)

    Gubert, Fabiane do Amaral; Vieira, Neiva Francenely Cunha; Pinheiro, Patrícia Neyva da Costa; Oriá, Mônica Oliveira Batista; de Almeida, Paulo César; de Araújo, Thábyta Silva

    2013-01-01

    This study accomplished the transcultural adaptation of the Parent-Adolescent Communication Scale, which evaluates the frequency of communication between parents and children on subjects related to sex, condoms, DST, HIV, and pregnancy. Methodological research with a quantitative approach was carried out with 313 female adolescent pupils aged 14 to 18 years in Fortaleza-CE. Content validity was assessed through initial translation, back translation, a pre-final version, and a final version, analyzed by a committee of specialists; reliability was verified by Cronbach's alpha and ascertained by hypothesis testing and test-retest within five weeks. The scale was applied via computer in the online modality from November 2010 to January 2011. The Portuguese version of the instrument presented an alpha of 0.86; construct validity was only partially verified, since the contrasted-group hypothesis testing was not confirmed. The version of the instrument adapted to Portuguese is considered valid and reliable in the study sample.

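The Cronbach's alpha statistic used for the reliability result above has a simple closed form. A minimal sketch with hypothetical item scores (pure Python, population variances; not the study's data):

```python
def cronbach_alpha(items):
    # items: one list of scores per scale item, aligned by respondent;
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    k = len(items)
    n = len(items[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(variance(it) for it in items) / variance(totals))
```

An alpha of 0.86, as reported, indicates that most of the variance in the total score is shared across items rather than item-specific noise.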
  9. Translation and validation of the Parent-adolescent Communication Scale: technology for DST/HIV prevention

    Directory of Open Access Journals (Sweden)

    Fabiane do Amaral Gubert

    2013-07-01

    Full Text Available OBJECTIVES: to accomplish the transcultural adaptation of the Parent-Adolescent Communication Scale, which evaluates the frequency of communication between parents and children on subjects related to sex, condoms, DST, HIV and pregnancy. METHOD: methodological research with a quantitative approach, carried out with 313 female adolescent pupils aged 14 to 18 years in Fortaleza-CE. Content validity was assessed by means of initial translation, back translation, a pre-final version and a final version, analyzed by a committee of specialists; reliability was verified by Cronbach's alpha and ascertained by hypothesis testing and test-retest within five weeks. The scale was applied via computer in the online modality from November 2010 to January 2011. RESULTS: the Portuguese version of the instrument presented an alpha of 0.86; construct validity was only partially verified, since the contrasted-group hypothesis testing was not confirmed. CONCLUSION: the version of the instrument adapted to Portuguese is considered valid and reliable in the study sample.

  10. OECD/NEA International Benchmark Exercises: Validation of CFD codes applied to the nuclear industry; OECD/NEA International Benchmark Exercises: La validacion de los codigos CFD aplicados a la industria nuclear

    Energy Technology Data Exchange (ETDEWEB)

    Pena-Monferrer, C.; Miquel veyrat, A.; Munoz-Cobo, J. L.; Chiva Vicent, S.

    2016-08-01

    In recent years, due among other factors to the slowing down of the nuclear industry, investment in the development and validation of CFD codes applied specifically to problems of the nuclear industry has been seriously hampered. Thus, the International Benchmark Exercises (IBE) sponsored by the OECD/NEA have been fundamental to analyzing the use of CFD codes in the nuclear industry, because although these codes are mature in many fields, doubts still exist about them in critical aspects of thermohydraulic calculations, even in single-phase scenarios. The Polytechnic University of Valencia (UPV) and the Universitat Jaume I (UJI), sponsored by the Nuclear Safety Council (CSN), have actively participated in all benchmarks proposed by the NEA, as well as in the expert meetings. In this paper, a summary of the participation in the various IBE is presented, describing each benchmark, the CFD model created for it, and the main conclusions. (Author)

  11. A genome-wide survey of highly expressed non-coding RNAs and biological validation of selected candidates in Agrobacterium tumefaciens.

    Directory of Open Access Journals (Sweden)

    Keunsub Lee

    Full Text Available Agrobacterium tumefaciens is a plant pathogen that has the natural ability of delivering and integrating a piece of its own DNA into the plant genome. Although bacterial non-coding RNAs (ncRNAs) have been shown to regulate various biological processes including virulence, we have limited knowledge of how Agrobacterium ncRNAs regulate this unique inter-kingdom gene transfer. Using whole transcriptome sequencing and an ncRNA search algorithm developed for this work, we identified 475 highly expressed candidate ncRNAs from A. tumefaciens C58, including 101 trans-encoded small RNAs (sRNAs), 354 antisense RNAs (asRNAs), 20 5' untranslated region (UTR) leaders including an RNA thermosensor, and 6 riboswitches. Moreover, transcription start site (TSS) mapping analysis revealed that about 51% of the mapped mRNAs have 5' UTRs longer than 60 nt, suggesting that numerous cis-acting regulatory elements might be encoded in the A. tumefaciens genome. Eighteen asRNAs were found on the complementary strands of the virA, virB, virC, virD, and virE operons. Fifteen ncRNAs were induced and 7 were suppressed by the Agrobacterium virulence (vir) gene inducer acetosyringone (AS), a phenolic compound secreted by the plants. Interestingly, fourteen of the AS-induced ncRNAs have putative vir box sequences in the upstream regions. We experimentally validated expression of 36 ncRNAs using Northern blot and Rapid Amplification of cDNA Ends analyses. We show functional relevance of two 5' UTR elements: an RNA thermosensor (C1_109596F) that may regulate translation of the major cold shock protein cspA, and a thi-box riboswitch (C1_2541934R) that may transcriptionally regulate a thiamine biosynthesis operon, thiCOGG. Further studies on ncRNA functions in this bacterium may provide insights and strategies that can be used to better manage pathogenic bacteria for plants and to improve Agrobacterium-mediated plant transformation.

  12. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory, until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

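A basic prime-code construction of the sort surveyed in such texts can be sketched as follows: for a prime p, the prime sequence (i·j mod p) is mapped to a binary codeword of length p² and weight p, one codeword per i. This is an illustrative construction of the classical prime-code family, not an excerpt from the book:

```python
def prime_codes(p):
    # one codeword per i in GF(p): length p*p, weight p, with the single '1'
    # of block j placed at chip offset (i*j) mod p within that block
    codes = []
    for i in range(p):
        word = [0] * (p * p)
        for j in range(p):
            word[j * p + (i * j) % p] = 1
        codes.append(word)
    return codes
```

Because i - i' is invertible mod p, two distinct codewords collide in exactly one chip at zero shift (the j = 0 block), which is what keeps the mutual interference between users low.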
  13. Energy and resolution calibration of NaI(Tl) and LaBr{sub 3}(Ce) scintillators and validation of an EGS5 Monte Carlo user code for efficiency calculations

    Energy Technology Data Exchange (ETDEWEB)

    Casanovas, R., E-mail: ramon.casanovas@urv.cat [Unitat de Fisica Medica, Facultat de Medicina i Ciencies de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain); Morant, J.J. [Servei de Proteccio Radiologica, Facultat de Medicina i Ciencies de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain); Salvado, M. [Unitat de Fisica Medica, Facultat de Medicina i Ciencies de la Salut, Universitat Rovira i Virgili, ES-43201 Reus (Tarragona) (Spain)

    2012-05-21

    The radiation detectors yield the optimal performance if they are accurately calibrated. This paper presents the energy, resolution and efficiency calibrations for two scintillation detectors, NaI(Tl) and LaBr{sub 3}(Ce). For the two former calibrations, several fitting functions were tested. To perform the efficiency calculations, a Monte Carlo user code for the EGS5 code system was developed with several important implementations. The correct performance of the simulations was validated by comparing the simulated spectra with the experimental spectra and reproducing a number of efficiency and activity calculations. - Highlights: Black-Right-Pointing-Pointer NaI(Tl) and LaBr{sub 3}(Ce) scintillation detectors are used for gamma-ray spectrometry. Black-Right-Pointing-Pointer Energy, resolution and efficiency calibrations are discussed for both detectors. Black-Right-Pointing-Pointer For the two former calibrations, several fitting functions are tested. Black-Right-Pointing-Pointer A Monte Carlo user code for EGS5 was developed for the efficiency calculations. Black-Right-Pointing-Pointer The code was validated reproducing some efficiency and activity calculations.

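The energy and resolution calibrations described above are typically low-order fits of peak centroid against known line energy, and of FWHM against a simple function of energy. A minimal numpy sketch with made-up calibration points (the channels, energies, and FWHM values below are illustrative assumptions, not the paper's data or fitting functions):

```python
import numpy as np

# hypothetical calibration points: peak centroid channel vs. known line energy (keV)
channels = np.array([185.0, 490.0, 1001.0])
energies = np.array([245.0, 662.0, 1360.0])

# linear energy calibration E(ch) = gain*ch + offset
gain, offset = np.polyfit(channels, energies, 1)

def channel_to_energy(ch):
    return gain * ch + offset

# resolution model FWHM(E) = a*sqrt(E) + b, one common fitting choice for scintillators
fwhm = np.array([22.0, 35.0, 50.0])  # illustrative FWHM values (keV)
a, b = np.polyfit(np.sqrt(energies), fwhm, 1)

def fwhm_at(E):
    return a * np.sqrt(E) + b
```

Trying several fitting functions, as the paper does, then amounts to comparing residuals of alternative models (linear vs. quadratic energy calibration, different resolution laws) on the same calibration points.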
  14. New Technology in Personnel Selection: The Validity and Acceptability of Multimedia Tests

    NARCIS (Netherlands)

    J.K. Oostrom (Janneke)

    2010-01-01

    textabstractThe advances in technology of the last fifty years, specifically the advent of the computer, its continuous improvements in functionality and capacity, and the growth of the internet, have affected almost every aspect of psychological testing in personnel selection practices. Since the 1

  15. Follow-On Cooperative Research and Development Agreement: MFIX to FLUENT Technology Transfer and Validation Studies Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Syamlal, Madhava [US Department of Energy, Washington, DC (United States); Guenther, Chris [US Department of Energy, Washington, DC (United States); O' Brien, Thomas J. [US Department of Energy, Washington, DC (United States); Benyahia, Sofiane [Fluent Inc., New York, NY (United States); Shi, Shaoping [Fluent Inc., New York, NY (United States)

    2005-03-01

    This report summarizes the effort by NETL and Fluent on the Cooperative Research and Development Agreement No. 00-F039 signed in May 2000. The objective of the CRADA was to transfer technology from NETL's MFIX code into the commercial software FLUENT so as to increase the computational speed, accuracy, and utility of FLUENT. During the period of this CRADA, MFIX was used to develop granular flow theories and to simulate gas-solids chemical reactors. The FLUENT and MFIX predictions were compared with each other and with experimental data generated at NETL. The granular kinetic theory in FLUENT was improved as a result of this work, and a gas-solids reaction (ozone decomposition) was used as a test case for the gas-solids chemical reaction capability in FLUENT. Also, under a separate project, work has begun to transfer the coal combustion and gasification model in MFIX to FLUENT.

  16. Verification and Validation of Monte Carlo n-Particle Code 6 (MCNP6) with Neutron Protection Factor Measurements of an Iron Box

    Science.gov (United States)

    2014-03-27

    The most widely utilized radiation transport code is MCNP. First created at Los Alamos National Laboratory (LANL) in 1957, the code simulated neutral... An explanation of the current capabilities of MCNP will occur within the next chapter of this document; however, it is important to note that MCNP...

  17. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    , Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...

  18. Open platform of physical experiment based on QR code technology

    Institute of Scientific and Technical Information of China (English)

    李娟妮; 周红; 门庭轩; 张心蕊

    2016-01-01

    In order to make full use of the popularity and convenience of mobile devices, QR code technology was applied to construct an open platform for physics experiments. The platform provides convenient access thanks to the characteristics of QR codes, such as good usability, high reliability, technical maturity, and low deployment cost. This method has the advantages of low cost, accurate and direct information retrieval, easy maintenance, good usability, and easy integration with existing technologies and platforms.

  19. Fast Detection Technology for WCDMA Uplink Scramble Code

    Institute of Scientific and Technical Information of China (English)

    牛慧莹

    2016-01-01

    To address uplink multi-user separation in the WCDMA system, this paper analyzes the structure of the uplink physical channel and the correlation properties of the scramble code, and proposes a rapid detection algorithm for the uplink scramble code. The technique takes advantage of the parallel structure of the FFT and the parallel computing capability of the GPU: it carries out fast calculation of the cross-correlation function by FFT and implements multi-channel FFTs in the GPU in parallel, greatly reducing the time required for uplink scramble code detection. Simulation results show that the technique enables fast uplink multi-user separation in the WCDMA system, is easy to implement, and has broad application prospects.

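The FFT-based cross-correlation at the heart of such a detection scheme can be sketched as follows. This is a generic numpy illustration of circular correlation via the convolution theorem, not the paper's GPU implementation:

```python
import numpy as np

def xcorr_fft(received, reference):
    # circular cross-correlation via the convolution theorem:
    # corr = IFFT( FFT(received) * conj(FFT(reference)) )
    return np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(reference)))

def detect_phase(received, reference):
    # the peak of |corr| marks the code phase offset of the received sequence
    return int(np.argmax(np.abs(xcorr_fft(received, reference))))
```

Computing the correlation this way costs O(N log N) per candidate code instead of O(N²), and the per-code FFTs are independent, which is why they parallelize well across GPU threads.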
  20. Wavefront coding with adaptive optics

    Science.gov (United States)

    Agbana, Temitope E.; Soloviev, Oleg; Bezzubik, Vitalii; Patlan, Vsevolod; Verhaegen, Michel; Vdovin, Gleb

    2015-03-01

    We have implemented an extended depth of field optical system by wavefront coding with a micromachined membrane deformable mirror. This approach provides a versatile extension to standard wavefront coding based on fixed phase mask. First experimental results validate the feasibility of the use of adaptive optics for variable depth wavefront coding in imaging optical systems.

  1. Using GTO-Velo to Facilitate Communication and Sharing of Simulation Results in Support of the Geothermal Technologies Office Code Comparison Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Signe K.; Purohit, Sumit; Boyd, Lauren W.

    2015-01-26

    The Geothermal Technologies Office Code Comparison Study (GTO-CCS) aims to support the DOE Geothermal Technologies Office in organizing and executing a model comparison activity. This project is directed at testing, diagnosing differences, and demonstrating modeling capabilities of a worldwide collection of numerical simulators for evaluating geothermal technologies. Teams of researchers are collaborating in this code comparison effort, and it is important to be able to share results in a forum where technical discussions can easily take place without requiring teams to travel to a common location. Pacific Northwest National Laboratory has developed an open-source, flexible framework called Velo that provides a knowledge management infrastructure and tools to support modeling and simulation for a variety of types of projects in a number of scientific domains. GTO-Velo is a customized version of the Velo Framework that is being used as the collaborative tool in support of the GTO-CCS project. Velo is designed around a novel integration of a collaborative Web-based environment and a scalable enterprise Content Management System (CMS). The underlying framework provides a flexible and unstructured data storage system that allows for easy upload of files that can be in any format. Data files are organized in hierarchical folders and each folder and each file has a corresponding wiki page for metadata. The user interacts with Velo through a web browser based wiki technology, providing the benefit of familiarity and ease of use. High-level folders have been defined in GTO-Velo for the benchmark problem descriptions, descriptions of simulator/code capabilities, a project notebook, and folders for participating teams. Each team has a subfolder with write access limited only to the team members, where they can upload their simulation results. The GTO-CCS participants are charged with defining the benchmark problems for the study, and as each GTO-CCS Benchmark problem is

  2. Two-dimensional simulation of hydrogen iodide decomposition reaction using fluent code for hydrogen production using nuclear technology

    Directory of Open Access Journals (Sweden)

    Jung-Sik Choi

    2015-06-01

    Full Text Available The operating characteristics of hydrogen iodide (HI) decomposition for hydrogen production were investigated using a commercial computational fluid dynamics code, and various factors, such as hydrogen production, heat of reaction, and temperature distribution, were studied to compare device performance with that expected for device development. Hydrogen production increased with an increase of the surface-to-volume (STV) ratio, and with increasing hydrogen production the reaction heat increased. The internal pressure and velocity of the HI decomposer were estimated through the pressure drop and the velocity reduction from the preheating zone. The mass of H2O was independent of the STV ratio, whereas that of HI decreased with increasing STV ratio.

  3. Sandia National Laboratories analysis code data base

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, C.W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code "ownership" and release status, and references describing the physical models and numerical implementation.

  4. Validation of a new library of nuclear constants of the WIMS code; Validacion de una nueva biblioteca de constantes nucleares del Codigo WIMS

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar H, F. [Departamento de Experimentacion, Gerencia del Reactor, ININ, 52045 Ocoyoacac, Estado de Mexico (Mexico)

    1991-10-15

    The objective of the present work is to reproduce with the WIMS code the experimental results of the thermal benchmark problems TRX-1, TRX-2 and BAPL-1 to BAPL-3. The work proceeded in two stages: the first used the original library of the code, while in the second a library was generated that contains only the elements present in the benchmarks: H{sup 1}, O{sup 16}, Al{sup 27}, U{sup 235} and U{sup 238}. To generate the nuclear data present in the WIMS library, the ENDF/B-IV database and the NJOY nuclear data processing system were used; the library was generated using the FIXER code. (Author)

  6. Point-of-care solution for osteoporosis management design, fabrication, and validation of new technology

    CERN Document Server

    Khashayar, Patricia

    2017-01-01

    This book addresses the important clinical problem of accurately diagnosing osteoporosis, and analyzes how Bone Turnover Markers (BTMs) can improve osteoporosis detection. In her research, the author integrated microfluidic technology with electrochemical sensing into a reaction/detection chamber to measure serum levels of different biomarkers, creating a microfluidic proteomic platform that can easily be translated into a biomarker diagnostic. The Osteokit System, a result of the integration of the electrochemical system and microfluidic chips, is a unique design that offers the potential for greater sensitivity. The implementation, feasibility, and specificity of the Osteokit platform are demonstrated in this book, which is appropriate for researchers working on bone biology and mechanics, as well as clinicians.

  7. Development of the ANL plant dynamics code and control strategies for the supercritical carbon dioxide Brayton cycle and code validation with data from the Sandia small-scale supercritical carbon dioxide Brayton cycle test loop.

    Energy Technology Data Exchange (ETDEWEB)

    Moisseytsev, A.; Sienicki, J. J. (Nuclear Engineering Division)

    2011-11-07

    Significant progress has been made in the ongoing development of the Argonne National Laboratory (ANL) Plant Dynamics Code (PDC), the ongoing investigation and development of control strategies, and the analysis of system transient behavior for supercritical carbon dioxide (S-CO{sub 2}) Brayton cycles. Several code modifications have been introduced during FY2011 to extend the range of applicability of the PDC and to improve its calculational stability and speed. A new and innovative approach was developed to couple the Plant Dynamics Code for S-CO{sub 2} cycle calculations with SAS4A/SASSYS-1 Liquid Metal Reactor Code System calculations for the transient system level behavior on the reactor side of a Sodium-Cooled Fast Reactor (SFR) or Lead-Cooled Fast Reactor (LFR). The new code system allows use of the full capabilities of both codes such that whole-plant transients can now be simulated without additional user interaction. Several other code modifications, including the introduction of compressor surge control, a new approach for determining the solution time step for efficient computational speed, an updated treatment of S-CO{sub 2} cycle flow mergers and splits, a modified enthalpy equation to improve the treatment of negative flow, and a revised solution of the reactor heat exchanger (RHX) equations coupling the S-CO{sub 2} cycle to the reactor, were introduced to the PDC in FY2011. All of these modifications have improved the code computational stability and computational speed, while not significantly affecting the results of transient calculations. The improved PDC was used to continue the investigation of S-CO{sub 2} cycle control and transient behavior. The coupled PDC-SAS4A/SASSYS-1 code capability was used to study the dynamic characteristics of a S-CO{sub 2} cycle coupled to a SFR plant. Cycle control was investigated in terms of the ability of the cycle to respond to a linear reduction in the electrical grid demand from 100% to 0% at a rate of 5

  8. Administration of neuropsychological tests using interactive voice response technology in the elderly: validation and limitations.

    Science.gov (United States)

    Miller, Delyana Ivanova; Talbot, Vincent; Gagnon, Michèle; Messier, Claude

    2013-01-01

    Interactive voice response (IVR) systems are computer programs that interact with people to provide a range of services, from business to health care. We examined the ability of an IVR system to administer and score a verbal fluency task (fruits) and the digit span forward and backward in 158 community-dwelling people aged 65 to 92 years (full-scale IQ of 68-134). Only six participants could not complete all tasks, mostly because of early technical problems in the study. Participants were also administered Wechsler Adult Intelligence Scale fourth edition (WAIS-IV) and Wechsler Memory Scale fourth edition subtests. The IVR system correctly recognized 90% of the fruits in the verbal fluency task and 93-95% of the number sequences in the digit span. The IVR system typically underestimated the performance of participants because of voice recognition errors. In the digit span, these errors led to the erroneous discontinuation of the test; however, the correlation between IVR scoring and clinical scoring was still high (0.93-0.95). The correlation between the IVR verbal fluency and the WAIS-IV Similarities subtest was 0.31. The correlation between the IVR digit span forward and backward and the in-person administration was 0.46. We discuss how valid and useful IVR systems are for neuropsychological testing in the elderly.
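
    The discontinuation effect described above is easy to see in a toy scorer. The sketch below is hypothetical (the study's exact scoring and stop rules are not given here): it awards one point per exactly recognized sequence and stops after two consecutive failures, so a single misrecognition by the voice recognizer can end the test early and underestimate the true span.

    ```python
    def score_digit_span(trials, recognized):
        """Score a digit-span run from voice-recognizer transcripts.

        trials: target digit sequences, e.g. ["58", "691", "4725"]
        recognized: sequences returned by the recognizer, same order.
        Hypothetical rule: 1 point per exact match; discontinue after
        two consecutive failures (so recognition errors, not the
        participant, can trigger the early stop).
        """
        score, consecutive_misses = 0, 0
        for target, heard in zip(trials, recognized):
            if heard == target:
                score += 1
                consecutive_misses = 0
            else:
                consecutive_misses += 1
                if consecutive_misses == 2:
                    break  # erroneous discontinuation if the misses were ASR errors
        return score
    ```

    With perfect recognition the score equals the number of correct sequences; two back-to-back recognizer substitutions truncate the run even when the participant kept answering correctly.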

  9. Administration of Neuropsychological Tests Using Interactive Voice Response Technology in the Elderly: Validation and Limitations

    Science.gov (United States)

    Miller, Delyana Ivanova; Talbot, Vincent; Gagnon, Michèle; Messier, Claude

    2013-01-01

    Interactive voice response (IVR) systems are computer programs that interact with people to provide a range of services, from business to health care. We examined the ability of an IVR system to administer and score a verbal fluency task (fruits) and the digit span forward and backward in 158 community-dwelling people aged 65 to 92 years (full-scale IQ of 68–134). Only six participants could not complete all tasks, mostly because of early technical problems in the study. Participants were also administered Wechsler Adult Intelligence Scale fourth edition (WAIS-IV) and Wechsler Memory Scale fourth edition subtests. The IVR system correctly recognized 90% of the fruits in the verbal fluency task and 93–95% of the number sequences in the digit span. The IVR system typically underestimated the performance of participants because of voice recognition errors. In the digit span, these errors led to the erroneous discontinuation of the test; however, the correlation between IVR scoring and clinical scoring was still high (0.93–0.95). The correlation between the IVR verbal fluency and the WAIS-IV Similarities subtest was 0.31. The correlation between the IVR digit span forward and backward and the in-person administration was 0.46. We discuss how valid and useful IVR systems are for neuropsychological testing in the elderly. PMID:23950755

  10. Administration of neuropsychological tests using interactive voice response technology in the elderly: validation and limitations

    Directory of Open Access Journals (Sweden)

    Delyana Ivanova Miller

    2013-08-01

    Full Text Available Interactive voice response (IVR) systems are computer programs, which interact with people to provide a number of services from business to health care. We examined the ability of an IVR system to administer and score a verbal fluency task (fruits) and the digit span forward and backward in 158 community dwelling people aged between 65 and 92 years of age (full scale IQ of 68 to 134). Only 6 participants could not complete all tasks mostly due to early technical problems in the study. Participants were also administered the WAIS-IV and WMS-IV sub-tests. The IVR system correctly recognized 90% of the fruits in the verbal fluency task and 93-95% of the number sequences in the digit span. The IVR system typically underestimated the performance of participants because of voice recognition errors. In the digit span, these errors led to the erroneous discontinuation of the test; however the correlation between IVR scoring and clinical scoring was still high (93-95%). The correlation between the IVR verbal fluency and the WAIS-IV Similarities sub-test was 0.31. The correlation between the IVR digit span forward and backward and the in-person administration was 0.46. We discuss how valid and useful IVR systems are for neuropsychological testing in the elderly.

  11. Technology for trauma: testing the validity of a smartphone app for pre-hospital clinicians.

    Science.gov (United States)

    Freshwater, Eleanor S; Crouch, Robert

    2015-01-01

    With the introduction of regional trauma networks in England, ambulance clinicians have been required to make triage decisions relating to severity of injury and the appropriate destination for the patient, which may require 'bypassing' the nearest Emergency Department. A 'Trauma Unit Bypass Tool' is utilised in this process. The Major Trauma Triage Tool smartphone application (app) is a digital representation of this tool, available for clinicians to use on their smartphones. Prior to disseminating the application, its validity and performance against the existing paper-based tool were explored. A case-based study using clinical scenarios was conducted. Scenarios, with appropriate triage decisions, were agreed by an expert panel. Ambulance clinicians were assigned to either the paper-based tool or the smartphone app group and asked to make a triage decision using the available information. The positive predictive value (PPV) of each tool was calculated. The PPV was 0.76 for the paper tool and 0.86 for the smartphone app. User comments were mainly positive for both tools, with no negative comments relating to the smartphone app. The smartphone app version of the Trauma Unit Bypass Tool performs at least as well as the paper version and can be utilised safely by pre-hospital clinicians in supporting triage decisions relating to potential major trauma. Copyright © 2014 Elsevier Ltd. All rights reserved.
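
    The PPV comparison above reduces to a one-line calculation: of the patients a triage tool sends to the major trauma centre, the fraction who truly have major trauma. The counts below are illustrative only (the study reports the PPVs, not the underlying tallies):

    ```python
    def positive_predictive_value(true_positives, false_positives):
        """PPV = TP / (TP + FP): the proportion of positive triage
        decisions that were correct."""
        return true_positives / (true_positives + false_positives)

    # Invented counts that happen to reproduce the reported PPVs:
    paper_ppv = positive_predictive_value(19, 6)  # 19 / 25 = 0.76
    app_ppv = positive_predictive_value(43, 7)    # 43 / 50 = 0.86
    ```

    Note that PPV alone says nothing about missed cases; a sensitivity (TP / (TP + FN)) comparison would need the false-negative counts, which are not given here.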

  12. Creating a Test Validated Structural Dynamic Finite Element Model of the Multi-Utility Technology Test Bed Aircraft

    Science.gov (United States)

    Pak, Chan-Gi; Truong, Samson S.

    2014-01-01

    Small modeling errors in a finite element model eventually induce errors in the structural flexibility and mass, which propagate into unpredictable errors in the unsteady aerodynamics and the control law design. One of the primary objectives of the Multi-Utility Technology Test Bed, X-56A, aircraft is the flight demonstration of active flutter suppression; therefore, this study identifies the primary and secondary modes for structural model tuning based on the flutter analysis of the X-56A. A ground vibration test validated structural dynamic finite element model of the X-56A is created, and the model is improved using a model tuning tool. Two different weight configurations of the X-56A have been improved in a single optimization run.

  13. Remarks on CFD validation: A Boeing Commercial Airplane Company perspective

    Science.gov (United States)

    Rubbert, Paul E.

    1987-01-01

    Requirements and meaning of validation of computational fluid dynamics codes are discussed. Topics covered include: validating a code, validating a user, and calibrating a code. All results are presented in viewgraph format.

  15. Modification and Validation of ATHLET Code for Sodium-cooled Fast Reactor Application

    Institute of Scientific and Technical Information of China (English)

    周翀; Klaus Huber; 程旭

    2013-01-01

    System analysis codes are important for the global simulation of the sodium-cooled fast reactor (SFR) system as well as for transient and accident safety analysis. In this paper, the best-estimate system code ATHLET for light water reactors, developed by Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) in Germany, was modified for SFR application. Thermodynamic and transport properties as well as heat transfer correlations for sodium were implemented into the ATHLET code. The modified code was then applied to simulate the Phenix reactor in France, and validation of the code was conducted against the Phenix reactor natural convection test. The calculation results were compared with the test data. The results show that the modified ATHLET code has good applicability in simulating SFR systems.

  16. Application Research on Color Bit Code Automatic Identification Technology in Libraries

    Institute of Scientific and Technical Information of China (English)

    李海华

    2012-01-01

    This paper introduces the basic identification principles and features of Color Bit Code automatic identification technology and surveys the application of the technology in various fields abroad. It analyzes the feasibility of applying Color Bit Code automatic identification technology in domestic libraries and, finally, proposes a library management protocol format based on Color Bit Code technology.

  17. Predictive validation of modeled health technology assessment claims: lessons from NICE.

    Science.gov (United States)

    Belsey, Jonathan

    2015-01-01

    The use of cost-effectiveness modeling to prioritize healthcare spending has become a key foundation of UK government policy. Although the preferred method of evaluation, cost-utility analysis, is not without its critics, it represents a standard approach that can arguably be used to assess relative value for money across a range of disease types and interventions. A key limitation of economic modeling, however, is that its conclusions hinge on the input assumptions, many of which are derived from randomized controlled trials or meta-analyses that cannot be reliably linked to real-world performance of treatments in a broader clinical context. This means that spending decisions are frequently based on artificial constructs that may project costs and benefits significantly at odds with those achievable in reality. There is a clear agenda to carry out some form of predictive validation of the modeled claims, in order to assess not only whether the spending decisions made can be justified post hoc, but also to ensure that budgetary expenditure continues to be allocated in the most rational way. To date, however, no timely, effective system to carry out this testing has been implemented, with the consequence that there is little objective evidence as to whether the prioritization decisions made are actually living up to expectations. This article reviews two unfulfilled initiatives carried out in the UK over the past 20 years, each of which had the potential to address this objective, and considers why they failed to deliver the expected outcomes.
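
    Cost-utility analysis, the evaluation method referenced above, compares interventions by incremental cost per quality-adjusted life year (QALY) gained. A minimal sketch of that summary statistic, with invented figures:

    ```python
    def icer(cost_new, cost_old, qalys_new, qalys_old):
        """Incremental cost-effectiveness ratio: extra cost divided by
        extra QALYs, i.e. cost per QALY gained. The standard cost-utility
        summary statistic; all numbers below are invented for illustration."""
        return (cost_new - cost_old) / (qalys_new - qalys_old)

    # A hypothetical new treatment costing 10,000 more and gaining 0.5 QALYs:
    cost_per_qaly = icer(30000, 20000, 6.0, 5.5)  # 20000.0 per QALY gained
    ```

    The predictive-validation problem discussed in the article is that the QALY and cost inputs feeding such a ratio come from trial-based models, so the computed ratio may diverge from what the treatment achieves in routine practice.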

  18. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    Science.gov (United States)

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting and introducing Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding and quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were evaluated as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. As the variability of the sampling method and the reference method are by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and suppression of a rather difficult sampling step and tedious off-line analyses.

  19. Hearing Protection Device Personal Attenuation Rating Validation Technology

    Institute of Scientific and Technical Information of China (English)

    刘玉飞

    2012-01-01

    This paper presents research data on the real-world performance of hearing protection devices (HPDs), methods used to validate the personal protection achieved by HPD users, and the 3M E-A-Rfit(TM) hearing protection validation system. Field research suggests that labeled noise reduction ratings provide a poor indication of the real-world performance of HPDs, and so far there is no reliable way to use the labeled attenuation data. The E-A-Rfit(TM) validation system uses field microphone-in-real-ear (F-MIRE) testing to quickly and quantitatively measure, at the actual worksite, the personal attenuation rating (PAR) achieved by an individual wearer, giving safety managers a clear understanding of the level of protection workers actually receive from an HPD.

  20. Development and Validation of a Three-Dimensional Diffusion Code Based on a High Order Nodal Expansion Method for Hexagonal-z Geometry

    Directory of Open Access Journals (Sweden)

    Daogang Lu

    2016-01-01

    Full Text Available A three-dimensional, multigroup diffusion code based on a high order nodal expansion method for hexagonal-z geometry (HNHEX) was developed to perform the neutronic analysis of hexagonal-z geometry. In this method, the one-dimensional radial and axial spatial flux of each node and energy group are defined as a quadratic polynomial expansion and a fourth-order polynomial expansion, respectively. The approximations for the one-dimensional radial and axial spatial flux both have second-order accuracy. Moment weighting is used to obtain the high order expansion coefficients of the polynomials of the one-dimensional radial and axial spatial flux. The partially integrated radial and axial leakages are both approximated by a quadratic polynomial. The coarse-mesh rebalance method with asymptotic source extrapolation is applied to accelerate the calculation. This code is used for calculation of the effective multiplication factor, neutron flux distribution, and power distribution. The numerical calculations in this paper for the three-dimensional SNR and VVER 440 benchmark problems demonstrate the accuracy of the code. In addition, the results show that the accuracy of the code is improved by applying a quadratic approximation for the partially integrated axial leakage and a fourth-order approximation for the one-dimensional axial spatial flux, in comparison to a flat approximation for the partially integrated axial leakage and a quadratic approximation for the one-dimensional axial spatial flux.
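
    In the spirit of the description above, the intra-node flux shapes can be sketched as truncated polynomial expansions. This is a generic form only; the code's actual basis polynomials \(f_k\) and normalization are not specified in the record:

    ```latex
    % Transverse-integrated 1-D flux in node n, group g:
    % quadratic in the radial direction, fourth order in the axial direction
    \phi_{r}^{n,g}(x) \approx \sum_{k=0}^{2} a_{k}^{n,g}\, f_{k}(x),
    \qquad
    \phi_{z}^{n,g}(z) \approx \sum_{k=0}^{4} b_{k}^{n,g}\, f_{k}(z),
    ```

    where, per the abstract, the high order coefficients \(a_k^{n,g}\) and \(b_k^{n,g}\) are fixed by moment weighting, i.e. by matching weighted integrals of the flux such as \(\int f_{m}(z)\,\phi_{z}^{n,g}(z)\,dz\) over the node.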

  1. Electromagnetic Water Treatment: is it a Validated Technology?

    Energy Technology Data Exchange (ETDEWEB)

    Tamari, Serge; Arroyo Correa, Victor M.; Garcia, Nahun [Instituto Mexicano de Tecnologia del Agua, Jiutepec, Morelos (Mexico); Paredes Vallejo, Mario [Baja California (Mexico); Castro Gonzalez, Carlos H [Sonora (Mexico)

    2001-09-01

    Since the start of the century, several kinds of devices (magnets, coils, electrodes, antennas) have been developed to treat water electromagnetically. Compared to traditional methods of water treatment, such devices are said to be low-cost, easy to use, and maintenance free. Suppliers commonly recommend their use for many applications, such as crop irrigation, water supply for livestock, and scale control in pipes. How could these devices solve such different problems? According to a review of the available literature, the efficiency of the electromagnetic devices that are sold to treat water is questionable. Until now, electromagnetic water treatment cannot be said to be a verified technology.

  2. Prospects of Bar Code Technology and RFID Technology Application in Military Logistics

    Institute of Scientific and Technical Information of China (English)

    苗卫华; 吴隽

    2011-01-01

    Bar code technology and RFID technology have matured in recent years, and their application in military logistics has become an important part of the informatization of China's military logistics. This article describes the basic principles of the two technologies, analyzes and compares their characteristics, proposes the feasibility of applying the two technologies in combination in military logistics, and discusses the conditions and prospects for their combined application.

  3. Rewriting the Genetic Code.

    Science.gov (United States)

    Mukai, Takahito; Lajoie, Marc J; Englert, Markus; Söll, Dieter

    2017-09-08

    The genetic code-the language used by cells to translate their genomes into proteins that perform many cellular functions-is highly conserved throughout natural life. Rewriting the genetic code could lead to new biological functions such as expanding protein chemistries with noncanonical amino acids (ncAAs) and genetically isolating synthetic organisms from natural organisms and viruses. It has long been possible to transiently produce proteins bearing ncAAs, but stabilizing an expanded genetic code for sustained function in vivo requires an integrated approach: creating recoded genomes and introducing new translation machinery that function together without compromising viability or clashing with endogenous pathways. In this review, we discuss design considerations and technologies for expanding the genetic code. The knowledge obtained by rewriting the genetic code will deepen our understanding of how genomes are designed and how the canonical genetic code evolved.

  4. Validation of Software Gating: A Practical Technology for Respiratory Motion Correction in PET.

    Science.gov (United States)

    Kesner, Adam Leon; Chung, Jonathan Hero; Lind, Kimberly Erin; Kwak, Jennifer Jihyang; Lynch, David; Burckhardt, Darrell; Koo, Phillip Jahhyung

    2016-10-01

    Purpose To assess the performance of hardware- and software-gating technologies in terms of qualitative and quantitative characteristics of respiratory motion in positron emission tomography (PET) imaging. Materials and Methods Between 2010 and 2013, 219 fluorine 18 fluorodeoxyglucose PET examinations were performed in 116 patients for assessment of pulmonary nodules. All patients provided informed consent in this institutional review board-approved study. Acquisitions were reconstructed as respiratory-gated images by using hardware-derived respiratory triggers and software-derived signal (via an automated postprocessing method). Asymmetry was evaluated in the joint distribution of reader preference, and linear mixed models were used to evaluate differences in outcomes according to gating type. Results In blind reviews of reconstructed gated images, software was selected as superior 16.9% of the time (111 of 657 image sets; 95% confidence interval [CI]: 14.0%, 19.8%), and hardware was selected as superior 6.2% of the time (41 of 657 image sets; 95% CI: 4.4%, 8.1%). Of the image sets, 76.9% (505 of 657; 95% CI: 73.6%, 80.1%) were judged as having indistinguishable motion quality. Quantitative analysis demonstrated that the two gating strategies exhibited similar performance, and the performance of both was significantly different from that of nongated images. The mean increase ± standard deviation in lesion maximum standardized uptake value was 42.2% ± 38.9 between nongated and software-gated images, and lesion full width at half maximum values decreased by 9.9% ± 9.6. Conclusion Compared with vendor-supplied respiratory-gating hardware methods, software gating performed favorably, both qualitatively and quantitatively. Fully automated gating is a feasible approach to motion correction of PET images. (©) RSNA, 2016 Online supplemental material is available for this article.
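
    As a toy picture of respiratory gating, the sketch below bins event timestamps into phase gates assuming a perfectly periodic breathing cycle of known period. This is a deliberate simplification: the software gating evaluated above derives the respiratory signal from the PET data itself rather than assuming a fixed period, and hardware gating uses external triggers.

    ```python
    def phase_gate(timestamps, period, n_gates):
        """Assign each event timestamp to a respiratory phase gate in
        [0, n_gates), assuming a fixed, known breathing period (a
        hypothetical simplification for illustration only)."""
        return [int(n_gates * ((t % period) / period)) for t in timestamps]

    # A 4 s breathing cycle split into 4 phase gates:
    gates = phase_gate([0.0, 1.0, 2.5, 3.9], period=4.0, n_gates=4)  # [0, 1, 2, 3]
    ```

    Reconstructing each gate separately freezes the lesion near one respiratory position, which is why gated images show higher maximum standardized uptake values and narrower lesion profiles than nongated images.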

  5. Construction and validation of a tool to Assess the Use of Light Technologies at Intensive Care Units.

    Science.gov (United States)

    Marinho, Pabliane Matias Lordelo; Campos, Maria Pontes de Aguiar; Rodrigues, Eliana Ofélia Llapa; Gois, Cristiane Franca Lisboa; Barreto, Ikaro Daniel de Carvalho

    2016-12-19

    To construct and validate a tool to assess the use of light technologies by the nursing team at Intensive Care Units. This methodological study elaborated the tool by means of the psychometric method, with construction based on the categorization of health technologies by Merhy and Franco from the National Humanization Policy, using the Nursing Intervention Classification taxonomy to categorize the domains of the tool. Agreement percentages and Content Validity Indices were used for validation. The Interrater Agreement Percentage exceeded the recommended level of 80%, with the relevance to the proposed theme standing out in the assessment at an agreement rate of 99%. The tool was validated with four domains (Bond, Autonomy, Welcoming, and Management) and nineteen items that assess the use of light technologies at Intensive Care Units.

  6. Field Evaluations Test Plan for Validation of Alternative Low-Emission Surface Preparation/Depainting Technologies for Structural Steel

    Science.gov (United States)

    Lewis, Pattie

    2005-01-01

    and approve alternative surface preparation technologies for use at NASA and AFSPC installations. Materials and processes will be evaluated with the goal of selecting those processes that will improve corrosion protection at critical systems, facilitate easier maintenance activity, extend maintenance cycles, eliminate flight hardware contamination and reduce the amount of hazardous waste generated. This Field Evaluations Test Plan defines the field evaluation and testing requirements for validating alternative surface preparation/depainting technologies and supplements the JTP. The field evaluations will be performed at Stennis Space Center, Mississippi, under the oversight of the Project Engineer. Additional field evaluations may be performed at other NASA centers or AFSPC facilities.

  7. Modernizing the MagIC Paleomagnetic and Rock Magnetic Database Technology Stack to Encourage Code Reuse and Reproducible Science

    Science.gov (United States)

    Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Jonestrask, L.; Tauxe, L.; Constable, C.

    2016-12-01

    The Magnetics Information Consortium (https://earthref.org/MagIC/) develops and maintains a database and web application for supporting the paleo-, geo-, and rock magnetic scientific community. Historically, this objective has been met with an Oracle database and a Perl web application at the San Diego Supercomputer Center (SDSC). The Oracle Enterprise Cluster at SDSC, however, was decommissioned in July of 2016 and the cost for MagIC to continue using Oracle became prohibitive. This provided MagIC with a unique opportunity to reexamine the entire technology stack and data model. MagIC has developed an open-source web application using the Meteor (http://meteor.com) framework and a MongoDB database. The simplicity of the open-source full-stack framework that Meteor provides has improved MagIC's development pace and the increased flexibility of the data schema in MongoDB encouraged the reorganization of the MagIC Data Model. As a result of incorporating actively developed open-source projects into the technology stack, MagIC has benefited from their vibrant software development communities. This has translated into a more modern web application that has significantly improved the user experience for the paleo-, geo-, and rock magnetic scientific community.

  8. Construction and execution of experiments at the multi-purpose thermal hydraulic test facility TOPFLOW for generic investigations of two-phase flows and the development and validation of CFD codes - Final report

    OpenAIRE

    2010-01-01

    The works aimed at the further development and validation of models for CFD codes. For this reason, the new thermal-hydraulic test facility TOPFLOW was erected and equipped with wire-mesh sensors with high spatial and time resolution. Vertical test sections with nominal diameters of DN50 and DN200 operating with air-water as well as steam-water two-phase flows provided results on the evaluation of flow patterns, on the behaviour of the interfacial area as well as on interfacial momentum and ...

  9. Validation of the Monte Carlo criticality program KENO IV and the Hansen-Roach sixteen-energy-group-cross sections for high-assay uranium systems. [KENO IV criticality code

    Energy Technology Data Exchange (ETDEWEB)

    Handley, G. R.; Masters, L. C.; Stachowiak, R. V.

    1981-04-10

    Validation of the Monte Carlo criticality code, KENO IV, and the Hansen-Roach sixteen-energy-group cross sections was accomplished by calculating the effective neutron multiplication constant, k-eff, of 29 experimentally critical assemblies which had uranium enrichments of 92.6% or higher in the uranium-235 isotope. The experiments were chosen so that a large variety of geometries and of neutron energy spectra were covered. Problems in calculating the k-eff of systems with high-uranium-concentration uranyl nitrate solution that were minimally reflected or unreflected resulted in the separate examination of five cases.
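
    Benchmarking against experimentally critical assemblies, as above, amounts to comparing calculated k-eff values with the known true value of 1.0. A minimal summary-statistics sketch; the assembly values below are invented for illustration, not the report's data:

    ```python
    import statistics

    def keff_bias(k_calcs):
        """For experimentally critical assemblies the true k-eff is 1.0,
        so the code bias is the mean calculated k-eff minus 1.0; the
        sample standard deviation indicates the spread across benchmarks."""
        return statistics.fmean(k_calcs) - 1.0, statistics.stdev(k_calcs)

    # Invented k-eff results for three hypothetical critical benchmarks:
    bias, spread = keff_bias([0.99, 1.00, 1.01])
    ```

    A validation report would typically quote such a bias with its uncertainty and then set a subcritical safety margin for production calculations accordingly.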

  10. Combustion chamber analysis code

    Science.gov (United States)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-05-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  11. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover the "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
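
    Unique decipherability itself, the baseline notion the paper weakens, is decidable for finite codes by the classical Sardinas-Patterson procedure: iterate the sets of "dangling suffixes" and check whether any iterate contains a full codeword. A compact sketch of that standard textbook algorithm (not code from the paper):

    ```python
    def is_uniquely_decipherable(code):
        """Sardinas-Patterson test for a finite code (a set of nonempty words):
        the code is UD iff no iterated dangling-suffix set contains a codeword."""
        code = set(code)

        def dangling(u, v):
            # Suffixes b[len(a):] left over when a word of u is a proper prefix
            # of a word of v.
            return {b[len(a):] for a in u for b in v
                    if b.startswith(a) and len(b) > len(a)}

        s = dangling(code, code)          # S_1
        seen = set()
        while s and not (s & code):
            if s <= seen:                 # no new suffixes can appear: UD
                return True
            seen |= s
            s = dangling(s, code) | dangling(code, s)   # S_{i+1}
        return not (s & code)
    ```

    For example, the prefix code {0, 10, 11} passes immediately (its first suffix set is empty), while {0, 01, 10} fails because the string 010 factors both as 0.10 and as 01.0.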

  12. The integrated sidelobe level improvement technology for phase-coded waveforms

    Institute of Scientific and Technical Information of China (English)

    曾祥能; 张永顺; 何峰; 董臻

    2012-01-01

    The unsatisfactory integrated sidelobe energy performance limits the application of phase-coded signals to the observation of extended targets. Starting from phase-coded waveforms with a uniform sidelobe structure, which favors the reduction of sidelobe energy, mismatched filter technology is used to process the echo and produce nearly uniform sidelobes. Moreover, the original transmitted signal component is extracted from the echo without interference, and a sidelobe canceller is constructed from its complex conjugate shifted by one bit. As a result of the sidelobe cancellation and other preprocessing, the integrated sidelobe level ratio of the phase-coded signal is improved remarkably, and a Doppler tolerance similar to that of a linear frequency modulated signal is obtained. The proposed method may enable wide radar application of phase-coded waveforms, such as space-to-ground observation.
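
The sidelobe metrics at issue, the peak sidelobe level and the integrated sidelobe level ratio of the matched-filter output, can be computed for any phase code. A small illustration using the Barker-13 code (an assumed example, unrelated to the paper's mismatched-filter design):

```python
import numpy as np

# Barker-13 phase code (+1/-1 chips); the matched-filter output equals its
# aperiodic autocorrelation.
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

acf = np.correlate(barker13, barker13, mode="full")
peak = acf[len(acf) // 2]                  # mainlobe (= code length, 13)
sidelobes = np.delete(acf, len(acf) // 2)  # everything except the mainlobe

psl_db = 20 * np.log10(np.abs(sidelobes).max() / peak)       # peak sidelobe level
islr_db = 10 * np.log10((sidelobes ** 2).sum() / peak ** 2)  # integrated sidelobe ratio
print(round(psl_db, 1), round(islr_db, 1))
```

Barker codes have the uniform unit-magnitude sidelobe structure the abstract refers to, which is exactly why their integrated sidelobe energy is amenable to further suppression.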

  13. Comparison and validation of the results of the AZNHEX v.1.0 code with the MCNP code simulating the core of a fast reactor cooled with sodium; Comparacion y validacion de los resultados del codigo AZNHEX v.1.0 con el codigo MCNP simulando el nucleo de un reactor rapido refrigerado con sodio

    Energy Technology Data Exchange (ETDEWEB)

    Galicia A, J.; Francois L, J. L.; Bastida O, G. E. [UNAM, Facultad de Ingenieria, Departamento de Sistemas Energeticos, Ciudad Universitaria, 04510 Ciudad de Mexico (Mexico); Esquivel E, J., E-mail: blink19871@hotmail.com [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2016-09-15

    The development of the AZTLAN platform for the analysis and design of nuclear reactors is led by the Instituto Nacional de Investigaciones Nucleares (ININ) and divided into four working groups, which have well-defined activities to achieve significant progress in this project individually and jointly. Among these working groups is the users group, whose main task is to use the codes that make up the AZTLAN platform and provide feedback to the developers, so that the final versions of the codes are efficient and at the same time reliable and easy to understand. In this paper we present the results provided by the AZNHEX v.1.0 code when simulating the core of a sodium-cooled fast reactor at steady state. The validation of these results is a fundamental part of the platform development and the responsibility of the users group, so in this research the results obtained with AZNHEX are compared with and analyzed against those provided by the Monte Carlo code MCNP-5, software used and recognized worldwide. A description of the methodology used with MCNP-5 is also presented for the calculation of the variables of interest and the differences obtained with respect to those calculated with AZNHEX. (Author)

  14. On the structure of Lattice code WIMSD-5B

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Won Young; Min, Byung Joo

    2004-03-15

    The WIMS-D code is a freely available thermal reactor physics lattice code used widely for thermal research and power reactor calculations. The code WIMS-AECL, developed on the basis of WIMS-D, has been used as one of the lattice codes for cell calculations in Canada, and in 1998 the latest version, WIMSD-5B, was released to the OECD/NEA Data Bank. In Korea, WIMS-KAERI was developed from WIMS-D and has been used for cell calculations of the research reactor HANARO, but it has not been adapted to CANDU reactors. Therefore, the development of a code applicable to cell calculations of CANDU reactors is necessary, not only for technological independence but also for the establishment of a CANDU safety analysis system. The lattice code WIMSD-5B was analyzed in order to set up the system of reactor physics computer codes to be used in the assessment of the void reactivity effect. In order to improve and validate the WIMSD-5B code, an analysis of its structure was carried out, and its structure, algorithms and subroutines are presented for the cluster geometry and the pij method modelling the CANDU-6 fuel

  15. Analysis of a tungsten sputtering experiment in DIII-D and code/data validation of high redeposition/reduced erosion

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, J.N., E-mail: brooksjn@purdue.edu [Purdue University, West Lafayette, IN (United States); Elder, J.D. [University of Toronto Institute for Aerospace Studies, Toronto (Canada); McLean, A.G. [Lawrence Livermore National Laboratory, Livermore, CA (United States); Rudakov, D.L. [University of California San Diego, San Diego, CA (United States); Stangeby, P.C. [University of Toronto Institute for Aerospace Studies, Toronto (Canada); Wampler, W.R. [Sandia National Laboratories, Albuquerque, NM (United States)

    2015-05-15

    We analyze a DIII-D tokamak experiment where two tungsten spots on the removable DiMES divertor probe were exposed to 12 s of attached plasma conditions, with moderate strike point temperature and density (∼20 eV, ∼4.5 × 10{sup 19} m{sup −3}), and 3% carbon impurity content. Both very small (1 mm diameter) and small (1 cm diameter) deposited samples were used for assessing gross and net tungsten sputtering erosion. The analysis uses a 3-D erosion/redeposition code package (REDEP/WBC), with input from a diagnostic-calibrated near-surface plasma code (OEDGE), and with focus on charge state resolved impinging carbon ion flux and energy. The tungsten surfaces are primarily sputtered by the carbon, in charge states +1 to +4. We predict high redeposition (∼75%) of sputtered tungsten on the 1 cm spot—with consequent reduced net erosion—and this agrees well with post-exposure DiMES probe RBS analysis data. This study and recent related work is encouraging for erosion lifetime and non-contamination performance of tokamak reactor high-Z plasma facing components.

  16. A Validity Study on Predictors of Success in Resident Master’s Degree Programs at the Air Force Institute of Technology.

    Science.gov (United States)

    1987-09-01

    Department of the Air Force, Air University, Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio. [Scanned-report fragment; only partially recoverable] ... 6 or more examinees is a useful and valid measurement, i.e., within reliability limits (10:3). The GRE and the GMAT are divided into various

  17. Distributed multiple description coding

    CERN Document Server

    Bai, Huihui; Zhao, Yao

    2011-01-01

    This book examines distributed video coding (DVC) and multiple description coding (MDC), two novel techniques designed to address the problems of conventional image and video compression coding. Covering all fundamental concepts and core technologies, the chapters can also be read as independent and self-sufficient, describing each methodology in sufficient detail to enable readers to repeat the corresponding experiments easily. Topics and features: provides a broad overview of DVC and MDC, from the basic principles to the latest research; covers sub-sampling based MDC, quantization based MDC,
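
As a toy illustration of the sub-sampling based MDC idea mentioned above (an assumed structure, not any specific scheme from the book): each description carries half the samples, the central decoder merges both losslessly, and a side decoder interpolates when one description is lost.

```python
import numpy as np

def mdc_split(x):
    # Two descriptions: even-indexed and odd-indexed samples.
    return x[0::2], x[1::2]

def mdc_merge(even, odd):
    # Central decoder: lossless reconstruction when both descriptions arrive.
    x = np.empty(len(even) + len(odd), dtype=float)
    x[0::2], x[1::2] = even, odd
    return x

def mdc_side_decode(desc, n):
    # Side decoder: crude sample-and-hold upsampling from one description.
    return np.repeat(desc, 2)[:n]

x = np.sin(np.linspace(0.0, np.pi, 16))
even, odd = mdc_split(x)
exact = mdc_merge(even, odd)
approx = mdc_side_decode(even, len(x))
print(np.max(np.abs(exact - x)), np.max(np.abs(approx - x)))
```

The design point of MDC is visible even in this toy: either description alone yields a usable (degraded) signal, and the two together incur no loss.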

  18. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption: essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real-world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  19. Holographic codes

    CERN Document Server

    Latorre, Jose I

    2015-01-01

    There exists a remarkable four-qutrit state that carries absolute maximal entanglement in all its partitions. Employing this state, we construct a tensor network that delivers a holographic many body state, the H-code, where the physical properties of the boundary determine those of the bulk. This H-code is made of an even superposition of states whose relative Hamming distances are exponentially large with the size of the boundary. This property makes H-codes natural states for a quantum memory. H-codes exist on tori of definite sizes and get classified in three different sectors characterized by the sum of their qutrits on cycles wrapped through the boundaries of the system. We construct a parent Hamiltonian for the H-code which is highly non local and finally we compute the topological entanglement entropy of the H-code.

  20. Sharing code

    OpenAIRE

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  1. The Mystery Behind the Code: Differentiated Instruction with Quick Response Codes in Secondary Physical Education

    Science.gov (United States)

    Adkins, Megan; Wajciechowski, Misti R.; Scantling, Ed

    2013-01-01

    Quick response codes, better known as QR codes, are small barcodes scanned to receive information about a specific topic. This article explains QR code technology and the utility of QR codes in the delivery of physical education instruction. Consideration is given to how QR codes can be used to accommodate learners of varying ability levels as…

  2. Convergence of an electromagnetic coilgun design code and its experimental validation

    Institute of Scientific and Technical Information of China (English)

    何勇; 高贵山; 宋盛义; 关永超; 程诚; 李业勋; 仇旭

    2014-01-01

    A coilgun design code was developed from the equations describing the inductive coupling between the thrust coils and the armature, the dynamic response of the armature, and the ohmic heating of the thrust coils and armature during launch. The algorithms and flow of the code, including the operation of the driving pulse-forming network, are presented. The convergence of the code is analyzed: both the maximum time step and the maximum armature mesh size depend on the rise time of the driving pulse current. The code is validated by comparing simulated results with measurements from a single-stage coilgun experiment. The simulated coil current waveform agrees well with the experimental one, while the simulated exit velocity is higher than the measured velocity by about 6.5%. The demonstrated convergence and validity indicate that the code can be used for the preliminary design of a coilgun system.

  3. Application of anti-counterfeiting technology based on two-dimensional bar codes in variable data printing

    Institute of Scientific and Technical Information of China (English)

    肖菲菲; 刘真

    2011-01-01

    The principle of two-dimensional bar code anti-counterfeiting technology was analyzed, and a plan for applying it in variable-data printing was put forward. On the basis of experiments on the generation and recognition of two-dimensional bar codes, the anti-counterfeiting properties of two different symbologies (PDF417 and QR code) were obtained and compared. The theoretical analysis and experimental results showed that anti-counterfeiting based on two-dimensional bar codes is feasible in variable data printing; moreover, different symbologies have different anti-counterfeiting properties, so the appropriate one should be selected according to the specific requirements of the application.

  4. Validation of 3D Code KATRIN For Fast Neutron Fluence Calculation of VVER-1000 Reactor Pressure Vessel by Ex-Vessel Measurements and Surveillance Specimens Results

    Science.gov (United States)

    Dzhalandinov, A.; Tsofin, V.; Kochkin, V.; Panferov, P.; Timofeev, A.; Reshetnikov, A.; Makhotin, D.; Erak, D.; Voloschenko, A.

    2016-02-01

    Usually the synthesis of two-dimensional and one-dimensional discrete ordinate calculations is used to evaluate neutron fluence on the VVER-1000 reactor pressure vessel (RPV) for prognosis of radiation embrittlement. But there are some cases when this approach is not applicable. For example, the latest projects of VVER-1000 have an upgraded surveillance program. Containers with surveillance specimens are located on the inner surface of the RPV, at the fast neutron flux maximum. Therefore, the synthesis approach is not suitable enough for calculation of the local disturbance of the neutron field at the RPV inner surface behind the surveillance specimens because of their complicated and heterogeneous structure. In some cases the VVER-1000 core loading consists of fuel assemblies with different fuel heights, and the applicability of the synthesis approach is also ambiguous for these fuel cycles. Also, the synthesis approach is not correct enough for the neutron fluence estimation in the RPV area above the core top. For these reasons only 3D neutron transport codes seem to be satisfactory for calculation of neutron fluence on the VVER-1000 RPV. Direct 3D calculations are also recommended by modern regulations.

  5. Validation of 3D Code KATRIN For Fast Neutron Fluence Calculation of VVER-1000 Reactor Pressure Vessel by Ex-Vessel Measurements and Surveillance Specimens Results

    Directory of Open Access Journals (Sweden)

    Dzhalandinov A.

    2016-01-01

    Full Text Available Usually the synthesis of two-dimensional and one-dimensional discrete ordinate calculations is used to evaluate neutron fluence on the VVER-1000 reactor pressure vessel (RPV) for prognosis of radiation embrittlement. But there are some cases when this approach is not applicable. For example, the latest projects of VVER-1000 have an upgraded surveillance program. Containers with surveillance specimens are located on the inner surface of the RPV, at the fast neutron flux maximum. Therefore, the synthesis approach is not suitable enough for calculation of the local disturbance of the neutron field at the RPV inner surface behind the surveillance specimens because of their complicated and heterogeneous structure. In some cases the VVER-1000 core loading consists of fuel assemblies with different fuel heights, and the applicability of the synthesis approach is also ambiguous for these fuel cycles. Also, the synthesis approach is not correct enough for the neutron fluence estimation in the RPV area above the core top. For these reasons only 3D neutron transport codes seem to be satisfactory for calculation of neutron fluence on the VVER-1000 RPV. Direct 3D calculations are also recommended by modern regulations.

  6. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  7. Polar Codes

    Science.gov (United States)

    2014-12-01

    [Report front-matter and contents fragments] QPSK Gaussian channels; capacity of BSC; capacity of AWGN channel. Forward error correction (FEC)... Polar codes were introduced by E. Arikan in [1]. Under authority of C. A. Wilgenbusch, Head, ISR Division. Executive summary: This report describes the results of the project "More reliable wireless
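
The polar transform introduced by E. Arikan is compact enough to sketch directly. The following implements x = u . F^(tensor n) with kernel F = [[1, 0], [1, 1]] over GF(2), omitting the bit-reversal permutation and frozen-bit selection that a full encoder would add:

```python
import numpy as np

def polar_encode(u):
    """Arikan polar transform over GF(2): x = u . F^(tensor n) with
    F = [[1, 0], [1, 1]]; len(u) must be a power of two."""
    u = np.asarray(u, dtype=int)
    n = len(u)
    if n == 1:
        return u
    # Recursion from the block structure of F^(tensor n):
    # the first half of x carries u1 XOR u2, the second half carries u2.
    top = polar_encode(u[: n // 2] ^ u[n // 2:])
    bottom = polar_encode(u[n // 2:])
    return np.concatenate([top, bottom])

print(polar_encode([1, 0, 0, 0]))  # first row of F^(tensor 2): [1 0 0 0]
print(polar_encode([0, 0, 0, 1]))  # last row: [1 1 1 1]
```

Channel polarization then dictates which positions of u carry data and which are frozen to zero; the transform above is the shared encoding core.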

  8. Computer code applicability assessment for the advanced Candu reactor

    Energy Technology Data Exchange (ETDEWEB)

    Wren, D.J.; Langman, V.J.; Popov, N.; Snell, V.G. [Atomic Energy of Canada Ltd (Canada)

    2004-07-01

    AECL Technologies, the 100%-owned US subsidiary of Atomic Energy of Canada Ltd. (AECL), is currently the proponent of a pre-licensing review of the Advanced Candu Reactor (ACR) with the United States Nuclear Regulatory Commission (NRC). A key focus topic for this pre-application review is the NRC acceptance of the computer codes used in the safety analysis of the ACR. These codes have been developed and their predictions compared against experimental results over extended periods of time in Canada. These codes also underwent formal validation in the 1990s. In support of this formal validation effort AECL has developed, implemented and currently maintains a Software Quality Assurance (SQA) program to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. This paper discusses the SQA program used to develop, qualify and maintain the computer codes used in ACR safety analysis, including the current program underway to confirm the applicability of these computer codes for use in ACR safety analyses. (authors)

  9. Robust Nonlinear Neural Codes

    Science.gov (United States)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.
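
The saturation effect described above can be reproduced with a generic linear-Fisher-information calculation (an illustrative sketch with assumed uniform tuning, not the authors' model): with noise covariance Sigma = I + eps f' f'^T, the information J = f'^T Sigma^-1 f' approaches 1/eps rather than growing with population size.

```python
import numpy as np

def linear_fisher(fprime, cov):
    # Linear Fisher information J = f'^T cov^{-1} f'.
    return float(fprime @ np.linalg.solve(cov, fprime))

eps = 0.01          # assumed strength of information-limiting correlations
J = {}
for n in (10, 100, 1000):
    fprime = np.ones(n)                               # assumed uniform tuning slopes
    cov = np.eye(n) + eps * np.outer(fprime, fprime)  # Sigma = I + eps f' f'^T
    J[n] = linear_fisher(fprime, cov)
print(J)  # grows toward, but never exceeds, 1/eps = 100
```

By the Sherman-Morrison identity J = N/(1 + eps*N), so the decodable information saturates at 1/eps no matter how many neurons are added, which is the data-processing-consistent behavior the abstract describes.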

  10. Technology of digital transmission based on the vernier method for the IRIG-B(DC) time code

    Institute of Scientific and Technical Information of China (English)

    王志林; 童斌; 王永岭

    2012-01-01

    This paper introduces the principle of the vernier method for code conversion, analyzes the influence of signal distortion on synchronization accuracy as well as the role of clock insertion and blocking, and gives a method for implementing digital transmission of the IRIG-B(DC) time code using an FPGA.

  11. Application of QR code technology in the design of a Chinese wolfberry product traceability system

    Institute of Scientific and Technical Information of China (English)

    王琛; 李剑蓓; 温淑萍; 李述成

    2016-01-01

    The main purpose of this thesis is to study the application of QR code technology in the design of a Chinese wolfberry product traceability system. QR code technology was used to identify agricultural products in the design of the traceability system, so as to improve the traceability of Chinese wolfberry products. The research results confirm that applying QR code technology in the design of the product traceability system improves traceability by 20% compared with the previous system. Applying QR code technology to identify agricultural products not only raises the design level of the Chinese wolfberry product traceability system but also delivers practical application value by improving the traceability of Chinese wolfberry products.

  12. Validation of the coupling of mesh models to GEANT4 Monte Carlo code for simulation of internal sources of photons; Validacao do acoplamento de modelos mesh ao codigo Monte Carlo GEANT4 para simulacao de fontes de fotons internas

    Energy Technology Data Exchange (ETDEWEB)

    Caribe, Paulo Rauli Rafeson Vasconcelos, E-mail: raulycaribe@hotmail.com [Universidade Federal Rural de Pernambuco (UFRPE), Recife, PE (Brazil). Fac. de Fisica; Cassola, Vagner Ferreira; Kramer, Richard; Khoury, Helen Jamil [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamento de Energia Nuclear

    2013-07-01

    The use of three-dimensional models described by polygonal meshes in numerical dosimetry enables more accurate modeling of complex objects than the use of simple solids. The objectives of this work were to validate the coupling of mesh models to the Monte Carlo code GEANT4 and to evaluate the influence of the number of vertices on the simulations used to obtain absorbed fractions of energy (AFEs). Validation of the coupling was performed for internal photon sources with energies between 10 keV and 1 MeV, for spherical geometries described by GEANT4 solids and for three-dimensional models with different numbers of vertices and triangular or quadrilateral faces modeled using the Blender program. As a result, it was found that there were no significant differences between AFEs for objects described by mesh models and objects described using solid volumes of GEANT4. Provided that the shape and the volume are maintained, decreasing the number of vertices used to describe an object does not significantly influence the dosimetric data, but it significantly decreases the time required for the dosimetric calculations, especially for energies below 100 keV.

  13. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  14. Validity of the recorded International Classification of Diseases, 10th edition diagnoses codes of bone metastases and skeletal-related events in breast and prostate cancer patients in the Danish National Registry of Patients

    Directory of Open Access Journals (Sweden)

    Annette Østergaard Jensen

    2009-07-01

    Full Text Available Annette Østergaard Jensen1, Mette Nørgaard1, Mellissa Yong2, Jon P Fryzek2, Henrik Toft Sørensen1. 1Department of Clinical Epidemiology, Aarhus University Hospital, Århus, Denmark; 2Global Epidemiology, Amgen Inc., Thousand Oaks, CA, USA. Objective: The clinical history of bone metastases and skeletal-related events (SREs) secondary to cancers is not well understood. In support of studies of the natural history of bone metastases and SREs in Danish prostate and breast cancer patients, we estimated the sensitivity and specificity of hospital diagnoses for bone metastases and SREs (i.e., radiation therapy to the bone, pathological or osteoporotic fractures, spinal cord compression and surgery to the bone) in a nationwide medical registry in Denmark. Study design and setting: In North Jutland County, Denmark, we randomly sampled 100 patients with primary prostate cancer and 100 patients with primary breast cancer diagnoses from the National Registry of Patients (NRP) during the period January 1st, 2000 to December 31st, 2000 and followed them for up to five years after their cancer diagnosis. We used information from medical chart reviews as the reference for estimating the sensitivity and specificity of the NRP International Classification of Diseases, 10th edition (ICD-10) coding for bone metastases and SRE diagnoses. Results: For prostate cancer, the overall sensitivity of bone metastases or SRE coding in the NRP was 0.54 (95% confidence interval [CI]: 0.39–0.69), and the specificity was 0.96 (95% CI: 0.87–1.00). For breast cancer, the overall sensitivity of bone metastases or SRE coding in the NRP was 0.58 (95% CI: 0.34–0.80), and the specificity was 0.95 (95% CI: 0.88–0.99). Conclusion: We measured the validity of ICD-10 coding in the Danish NRP for bone metastases and SREs in prostate and breast cancer patients and found it has adequate sensitivity and high specificity. The NRP remains a valuable tool for clinical epidemiological studies of bone
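
The reported point estimates follow from a 2x2 table of registry coding versus chart review. The counts below are hypothetical, chosen only to reproduce the prostate-cancer sensitivity (0.54) and specificity (0.96), and the Wald intervals are a simplification of whatever interval method the study actually used:

```python
import math

def prop_ci(k, n, z=1.96):
    # Proportion with a Wald 95% confidence interval, clipped to [0, 1].
    p = k / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

def sens_spec(tp, fn, tn, fp):
    """Sensitivity/specificity of registry coding against chart review:
    sensitivity = coded among chart-positive, specificity = uncoded among
    chart-negative."""
    return {"sensitivity": prop_ci(tp, tp + fn),
            "specificity": prop_ci(tn, tn + fp)}

# Hypothetical counts: 27 of 50 chart-confirmed cases were coded in the
# registry, and 48 of 50 patients without the condition had no code.
res = sens_spec(tp=27, fn=23, tn=48, fp=2)
print(res)
```

The same helper applied to the breast-cancer row would use counts yielding 0.58 and 0.95 respectively.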

  15. Research and Trends in the Field of Technology-Enhanced Learning from 2006 to 2011: A Content Analysis of Quick Response Code (QR-Code) and Its Application in Selected Studies

    Science.gov (United States)

    Hau, Goh Bak; Siraj, Saedah; Alias, Norlidah; Rauf, Rose Amnah Abd.; Zakaria, Abd. Razak; Darusalam, Ghazali

    2013-01-01

    This study provides a content analysis of selected articles in the field of QR code and its application in educational context that were published in journals and proceedings of international conferences and workshops from 2006 to 2011. These articles were cross analysed by published years, journal, and research topics. Further analysis was…

  16. Error correcting coding for OTN

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Pedersen, Lars A.

    2010-01-01

    Forward error correction codes for 100 Gb/s optical transmission are currently receiving much attention from transport network operators and technology providers. We discuss the performance of hard decision decoding using product-type codes that cover a single OTN frame or a small number of such frames. In particular we argue that a three-error-correcting BCH code is the best choice for the component code in such systems.
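
The component-code choice argued for above can be made concrete with the standard BCH parameter bound (a generic calculation, not the specific construction from the paper): a binary primitive BCH code has length n = 2^m - 1 and dimension k >= n - m*t when correcting t errors.

```python
def bch_params(m, t):
    """Binary primitive BCH code parameters: length n = 2**m - 1 and the
    standard dimension bound k >= n - m*t for a t-error-correcting code.
    The bound is tight for the modest m and t used here."""
    n = 2 ** m - 1
    k = n - m * t
    return n, k, k / n

# A three-error-correcting BCH over GF(2^10): the (1023, 993) code,
# with roughly 3% redundancy, of the kind suitable as a product-code component.
n, k, rate = bch_params(m=10, t=3)
print(n, k, round(1 - rate, 3))
```

A product code built from such components multiplies the per-dimension redundancy while allowing iterated hard-decision decoding of rows and columns.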

  17. The stellar atmosphere simulation code Bifrost. Code description and validation

    NARCIS (Netherlands)

    Gudiksen, B.V.; Carlsson, M.; Hansteen, V.H.; Hayek, W.; Leenaarts, J.|info:eu-repo/dai/nl/304837946; Martínez-Sykora, J.

    2011-01-01

    Context. Numerical simulations of stellar convection and photospheres have been developed to the point where detailed shapes of observed spectral lines can be explained. Stellar atmospheres are very complex, and very different physical regimes are present in the convection zone, photosphere,

  18. Zip Codes - MDC_WCSZipcode

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — The WCSZipcode polygon feature class was created by Miami-Dade Enterprise Technology Department to be used in the WCS batch jobs to assign the actual zip code of...

  19. Algorithm Design and Validation for Adaptive Nonlinear Control Enhancement (ADVANCE) Technology Development for Resilient Flight Control Project

    Data.gov (United States)

    National Aeronautics and Space Administration — SSCI proposes to develop and test a framework referred to as the ADVANCE (Algorithm Design and Validation for Adaptive Nonlinear Control Enhancement), within which...

  20. Validation of 2-mm tissue microarray technology in gastric cancer. Agreement of 2-mm TMAs and full sections for Glut-1 and Hif-1 alpha.

    Science.gov (United States)

    Berlth, Felix; Mönig, Stefan P; Schlösser, Hans A; Maus, Martin; Baltin, Christoph T H; Urbanski, Alexander; Drebber, Uta; Bollschweiler, Elfriede; Hölscher, Arnulf H; Alakus, Hakan

    2014-07-01

    Tissue Microarray (TMA) is a widely used method to perform high-throughput immunohistochemical analyses on different tissues by arraying small sample cores from paraffin-fixed tissues into a single paraffin block. TMA technology has been validated on numerous cancer tissues and has also been used for gastric cancer studies, although it has not been validated for this tumor tissue so far. The objective of this study was to assess whether the 2-mm TMA technology is able to provide representative samples of gastric cancer tissue. TMA paraffin blocks were constructed from 220 formalin-fixed and paraffin-embedded gastric cancer samples with a sample-core diameter of 2 mm. The agreement between immunohistochemical stainings of Glut-1 and Hif-1 alpha in TMA sections and the original full sections was calculated using kappa statistics and direct adjustment. The congruence was substantial for Glut-1 (kappa 0.64) and Hif-1 alpha (kappa 0.70), but with an agreement of only 71% and 52% within the marker-positive cases of the full-section slides. Primarily due to tumor heterogeneity, the TMA technology with a 2-mm sample core shows relevant limitations in gastric cancer tissue. Although helpful for tissue screening purposes, the 2-mm TMA technology cannot be recommended as a method equal to full-section investigation in gastric cancer.
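
The kappa statistic used above measures agreement beyond chance between the TMA and full-section ratings. A self-contained sketch with hypothetical counts (not the study's data):

```python
def cohens_kappa(table):
    """Cohen's kappa for a 2x2 agreement table:
    table[i][j] = number of cases rated i on TMA and j on the full section."""
    total = sum(sum(row) for row in table)
    p_obs = sum(table[i][i] for i in range(2)) / total  # observed agreement
    rows = [sum(table[i]) for i in range(2)]
    cols = [sum(table[i][j] for i in range(2)) for j in range(2)]
    p_exp = sum(rows[i] * cols[i] for i in range(2)) / total ** 2  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical counts for 220 samples: [[pos/pos, pos/neg], [neg/pos, neg/neg]]
kappa = cohens_kappa([[70, 10], [20, 120]])
print(round(kappa, 2))
```

A kappa near 0.7, as with these counts, falls in the "substantial agreement" band conventionally cited for the 0.61 to 0.80 range, matching the study's Hif-1 alpha result.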

  1. Generation and validation of PAX7 reporter lines from human iPS cells using CRISPR/Cas9 technology

    Directory of Open Access Journals (Sweden)

    Jianbo Wu

    2016-03-01

    Finally, by using a nuclease-dead Cas9 activator (the dCas9-VP160 system), the promoter region of PAX7 has been targeted for transient gene induction to validate the GFP reporter activity. This was confirmed by flow-cytometry analysis and immunostaining for PAX7 and GFP. This technical report provides a practical guideline for the generation and validation of knock-in reporters using the CRISPR/Cas9 system.

  2. Rapid Robot Design Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Energid Technologies will create a comprehensive software infrastructure for rapid validation of robot designs. The software will support push-button validation...

  3. Rapid Robot Design Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Energid Technologies will create a comprehensive software infrastructure for rapid validation of robotic designs. The software will support push-button validation...

  4. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage that the speech signal gets corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. From a transmission point of view, digital transmission has therefore been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
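Waveform coders of the kind the abstract describes often compand the signal before quantization so that quiet samples get finer resolution; G.711 μ-law is the classic telephony example. A minimal sketch of the continuous μ-law curve (the standard additionally quantizes the companded value to 8 bits, which is omitted here):

```python
import math

MU = 255  # mu-law constant used by G.711 in North American/Japanese telephony

def mu_law_compress(x, mu=MU):
    """Map a linear sample x in [-1, 1] onto the logarithmic mu-law curve."""
    return math.copysign(math.log1p(mu * abs(x)) / math.log1p(mu), x)

def mu_law_expand(y, mu=MU):
    """Invert the mu-law curve, recovering the linear sample."""
    return math.copysign(math.expm1(abs(y) * math.log1p(mu)) / mu, y)

# Quiet samples are boosted before quantization, loud ones compressed:
for x in (0.01, 0.1, 1.0):
    print(round(mu_law_compress(x), 3))
```

The logarithmic curve is why an 8-bit companded channel can sound comparable to roughly 13-bit linear PCM for speech: most of the code space is spent on the low amplitudes where speech energy concentrates.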

  5. QR CODE IN LIBRARY PRACTICE SOME EXAMPLES

    OpenAIRE

    Ajay Shanker Mishra*, Sachin Kumar Umre, Pavan Kumar Gupta

    2017-01-01

    Quick Response (QR) code is one such technology that can meet users' demand for access to resources through mobile devices. The main objective of this article is to review the concept of the Quick Response Code (QR code) and describe the practice of reading and generating QR codes. The paper covers the basic concept, structure, and technological pros and cons of the QR code. The literature is filled with potential uses for Quick Response (QR) codes in library practices like e-resour...

  6. Factors Affecting Acceptance & Use of ReWIND: Validating the Extended Unified Theory of Acceptance and Use of Technology

    Science.gov (United States)

    Nair, Pradeep Kumar; Ali, Faizan; Leong, Lim Chee

    2015-01-01

    Purpose: This study aims to explain the factors affecting students' acceptance and usage of a lecture capture system (LCS)--ReWIND--in a Malaysian university based on the extended unified theory of acceptance and use of technology (UTAUT2) model. Technological advances have become an important feature of universities' plans to improve the…

  7. Using Early Concept Narratives to Collect Valid Customer Input about Breakthrough Technologies: The Effect of Application Visualization on Transportation

    NARCIS (Netherlands)

    Van den Hende, E.A.; Schoormans, J.P.L.; Morel, K.P.N.; Lashina, T.; Van Loenen, E.; De Boevere, E.I.

    2007-01-01

    The value of early customer input has long been recognized by companies. However, especially when breakthrough technologies are involved, more insight in valuable methods for collecting early customer input is needed. In this paper, we propose a method to evaluate a breakthrough technology with cust

  8. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    …alternatives to mainstream development, from performances of the live-coding scene to the organizational forms of commons-based peer production; the democratic promise of social media and their paradoxical role in suppressing political expression; and the market's emptying out of possibilities for free expression in the public realm. The book's line of argument defends language against its invasion by economics, arguing that speech continues to underscore the human condition, however paradoxical this may seem in an era of pervasive computing. …Speaking Code unfolds an argument to undermine the distinctions between criticism and practice, and to emphasize the aesthetic and political aspects of software studies. Not reducible to its functional aspects, program code mirrors the instability inherent in the relationship of speech…

  9. Bringing Fenton Hill into the Digital Age: Data Conversion in Support of the Geothermal Technologies Office Code Comparison Study Challenge Problems

    Energy Technology Data Exchange (ETDEWEB)

    White, Signe K.; Kelkar, Sharad M.; Brown, Don W.

    2016-03-01

    The Geothermal Technologies Office Code Comparison Study (GTO-CCS) was established by the U.S. Department of Energy to facilitate collaboration among members of the geothermal modeling community and to evaluate and improve upon the ability of existing codes to simulate thermal, hydrological, mechanical, and chemical processes associated with complex enhanced geothermal systems (EGS). The first stage of the project, which has been completed, involved comparing simulations for seven benchmark problems that were primarily designed using well-prescribed, simplified data sets. In the second stage, the participating teams are tackling two challenge problems based on the EGS research conducted in hot dry rock (HDR) at Fenton Hill, near Los Alamos, New Mexico. The Fenton Hill project, conducted by Los Alamos National Laboratory (LANL) from 1970 to 1995, was the world's first HDR demonstration project. One of the criteria for selecting this experiment as the basis for the challenge problems was the amount and availability of data for generating model inputs. The Fenton Hill HDR system consisted of two reservoirs: an earlier Phase I reservoir tested from 1974 to 1981 and a deeper Phase II reservoir tested from 1980 to 1995. Detailed accounts of both phases of the HDR project have been presented in a number of books and reports, including a recently published summary of the lessons learned and a final report with a chronological description of the Fenton Hill project, prepared by LANL. Project documents and records have been archived and made public through the National Geothermal Data System (NGDS). Some of the data acquired from Phase II are available in electronic format readable on modern computers. These include the microseismic data from some of the important experiments (e.g., the massive hydraulic fracturing test conducted in 1983) and the injection/production wellhead data from the circulation tests conducted between 1992 and 1995. However, much of the data collected

  10. Exploring the concept of QR Code and the benefits of using QR Code for companies

    OpenAIRE

    Ji, Qianyu

    2014-01-01

    This research work concentrates on the concept of the QR Code and the benefits of using QR Codes for companies. The first objective of this research work is to study general information about the QR Code in order to help people understand it in detail. The second objective is to explore and analyze the essential and feasible technologies of the QR Code in order to clarify them. Additionally, this research work through QR Code best practices t...

  11. The Application of Bar Code Technology in the Clinical Laboratory Information Management System (LIS)

    Institute of Scientific and Technical Information of China (English)

    卢方建

    2011-01-01

    Applying bar code technology in the clinical laboratory information system improves the laboratory's degree of automation and work efficiency, reduces errors, and makes the process more convenient for patients.
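Part of why bar codes reduce errors in a laboratory workflow is that linear symbologies carry a check digit the scanner verifies on every read. A self-contained sketch of the EAN-13 rule (alternating 1/3 weights over the first twelve digits), independent of any particular LIS:

```python
def ean13_check_digit(first12: str) -> int:
    """Check digit for a 12-digit EAN-13 prefix. Digits in odd positions
    (1st, 3rd, ...) weigh 1 and even positions weigh 3; the check digit
    brings the weighted sum up to a multiple of 10."""
    digits = [int(ch) for ch in first12]
    weighted = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits))
    return (10 - weighted % 10) % 10

# ISBN-13 book bar codes are EAN-13 symbols:
print(ean13_check_digit("978030640615"))  # prints 7
```

A mis-scanned single digit changes the weighted sum by a non-multiple of 10, so the symbol is rejected rather than silently accepted, which is exactly the safety property a specimen-labeling workflow relies on.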

  12. Application of Wireless Network and Information Bar Code Technology in the Steel Warehouse Management System

    Institute of Scientific and Technical Information of China (English)

    田宏

    2011-01-01

    This paper briefly introduces the application of wireless networks and information bar code technology in modernizing traditional steel-warehouse management, focusing on the composition and functions of the system.

  13. Development and testing of mobile technology for community park improvements: validity and reliability of the eCPAT application with youth.

    Science.gov (United States)

    Besenyi, Gina M; Diehl, Paul; Schooley, Benjamin; Turner-McGrievy, Brie M; Wilcox, Sara; Stanis, Sonja A Wilhelm; Kaczynski, Andrew T

    2016-12-01

    Creation of mobile technology environmental audit tools can provide a more interactive way for youth to engage with communities and facilitate participation in health promotion efforts. This study describes the development and validity and reliability testing of an electronic version of the Community Park Audit Tool (eCPAT). eCPAT consists of 149 items and incorporates a variety of technology benefits. Criterion-related validity and inter-rater reliability were evaluated using data from 52 youth across 47 parks in Greenville County, SC. A large portion of items (>70 %) demonstrated either fair or moderate to perfect validity and reliability. All but six items demonstrated excellent percent agreement. The eCPAT app is a user-friendly tool that provides a comprehensive assessment of park environments. Given the proliferation of smartphones, tablets, and other electronic devices among both adolescents and adults, the eCPAT app has potential to be distributed and used widely for a variety of health promotion purposes.

  14. Research on the Application of Two-Dimensional Code Technology in the Settlement of Electric Power Materials

    Institute of Scientific and Technical Information of China (English)

    刘金元; 陆野; 汪天睿

    2016-01-01

    This paper first describes the current state of two-dimensional code technology in the settlement of electric power materials, then explores the specific application of two-dimensional code technology in the related business processes, and finally proposes the next directions for applying the technology in materials settlement, providing important guidance for further improving the electric power materials settlement system.

  15. Research on DSP Automatic Code Generation Technology on the Matlab Platform

    Institute of Scientific and Technical Information of China (English)

    王巧明; 李中健; 姜达郁

    2012-01-01

    Since programming for DSPs is difficult and time-consuming, a method is proposed that combines Matlab, Code Composer Studio (CCS), and their embedded tools and connection software to generate code automatically. The research focuses on the automatic code-generation method for the DM642 EVM board, and an edge-detection experiment is used to verify the executability and reliability of the generated code. The results show that the code-generation method is not only highly efficient but also flexible and easy to use; the generated executable code runs smoothly on the DSP board and achieves very good processing results.

  16. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic component codes realized as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe the codes succinctly using Gröbner bases.
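As a toy illustration of the cyclic component codes mentioned in the abstract (not the affine-variety construction itself): the binary (7,4) Hamming code is cyclic, generated by g(x) = 1 + x + x^3, and every cyclic shift of a codeword is again a codeword. A sketch over GF(2):

```python
from itertools import product

def gf2_poly_mul(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists (lowest degree first)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj
    return out

G = [1, 1, 0, 1]  # g(x) = 1 + x + x^3 divides x^7 + 1, so it generates a cyclic code

def encode(msg4):
    """Non-systematic encoding: c(x) = m(x) * g(x), padded to length 7."""
    c = gf2_poly_mul(msg4, G)
    return (c + [0] * 7)[:7]

def cyclic_shift(word):
    return [word[-1]] + word[:-1]

# All 2^4 codewords of the (7,4) cyclic Hamming code:
codebook = {tuple(encode(list(m))) for m in product([0, 1], repeat=4)}

# Closure under cyclic shifts -- the defining property of a cyclic code:
assert all(tuple(cyclic_shift(list(w))) in codebook for w in codebook)
```

The closure follows because shifting corresponds to multiplying c(x) by x modulo x^7 + 1, and g(x) divides x^7 + 1, so the shifted polynomial is still a multiple of g(x).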

  17. Cost-Effective ISS Space-Environment Technology Validation of Advanced Roll-Out Solar Array (ROSA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Effort proposed is for detailed planning, configuration and hardware definition of a low-cost, but high technology payoff, ISS-based flight experiment that will...

  18. Accelerator-based validation of shielding codes

    OpenAIRE

    Zeitlin, Cary; Heilbronn, Lawrence; Miller, Jack; Wilson, John W.

    2002-01-01

    The space radiation environment poses risks to astronaut health from a diverse set of sources, ranging from low-energy protons and electrons to highly-charged, high-energy atomic nuclei and their associated fragmentation products, including neutrons. The low-energy protons and electrons are the source of most of the radiation dose to Shuttle and ISS crews, while the more energetic particles that comprise the Galactic Cosmic Radiation (protons, He, and heavier nuclei up to Fe) will be th...

  19. RELAP-7 Code Assessment Plan and Requirement Traceability Matrix

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Junsoo; Choi, Yong-joon; Smith, Curtis L.

    2016-10-01

    RELAP-7, a safety analysis code for nuclear reactor systems, is under development at Idaho National Laboratory (INL). Overall, the code development is directed towards leveraging the advancements in computer science technology, numerical solution methods, and physical models over the last decades. Recently, INL has also been making an effort to establish the code assessment plan, which aims to ensure an improved final product quality through the RELAP-7 development process. The ultimate goal of this plan is to propose a suitable way to systematically assess the wide range of software requirements for RELAP-7, including the software design, user interface, and technical requirements, etc. To this end, we first survey the literature (i.e., international/domestic reports, research articles) addressing the desirable features generally required for advanced nuclear system safety analysis codes. In addition, the V&V (verification and validation) efforts as well as the legacy issues of several recently-developed codes (e.g., RELAP5-3D, TRACE V5.0) are investigated. Lastly, this paper outlines the Requirement Traceability Matrix (RTM) for RELAP-7, which can be used to systematically evaluate and identify the code development process and its present capability.

  20. Verification of ONED90 code

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Jong Hwa; Lee, Ki Bog; Zee, Sung Kyun; Lee, Chang Ho [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1993-12-01

    ONED90, developed by KAERI, is a one-dimensional, two-group diffusion theory code. For nuclear design and reactor simulation, the usage of ONED90 encompasses core-follow calculation, load-follow calculation, plant power-control simulation, xenon-oscillation simulation, control-rod maneuvering, etc. In order to verify the validity of the ONED90 code, two well-known benchmark problems were solved; ONED90 shows results very similar to the reference solutions. (Author) 11 refs., 5 figs., 13 tabs.
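The flavor of calculation a code like ONED90 performs can be sketched with a one-group, 1-D analogue: discretize -D φ'' + Σa φ = (1/k) νΣf φ with zero-flux boundaries and power-iterate for the multiplication factor k. The cross-section values below are illustrative, not taken from any benchmark, and a real two-group code couples two such equations through scattering:

```python
def solve_tridiag(sub, diag, sup, rhs):
    """Thomas algorithm for a tridiagonal system (sub[0] and sup[-1] unused)."""
    n = len(diag)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = sup[0] / diag[0]
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / m if i < n - 1 else 0.0
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def k_eff_1d(D=1.0, sig_a=0.07, nu_sig_f=0.08, L=100.0, n=200, iters=500):
    """Power iteration for -D*phi'' + sig_a*phi = (1/k)*nu_sig_f*phi, phi=0 at ends."""
    h = L / (n + 1)
    diag = [2.0 * D / h**2 + sig_a] * n     # central-difference loss operator
    off = [-D / h**2] * n
    phi, k = [1.0] * n, 1.0
    for _ in range(iters):
        src = [nu_sig_f * p / k for p in phi]            # scaled fission source
        phi_new = solve_tridiag(off, diag, off, src)     # invert the loss operator
        k *= sum(phi_new) / sum(phi)                     # eigenvalue update
        top = max(phi_new)
        phi = [p / top for p in phi_new]                 # renormalize the flux
    return k

print(k_eff_1d())
```

For a bare slab the result should approach the analytic one-group value k = νΣf / (Σa + D(π/L)²), which gives a quick sanity check on the iteration.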