WorldWideScience

Sample records for technology validation codes

  1. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to incorrect estimation of the consequences of accidents, and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. 'agreement is within 10%'. A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)
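
    As a rough illustration of the kind of quantitative ranking the abstract advocates (the author's actual figure of merit is not given here), the sketch below scores hypothetical codes by their root-mean-square relative deviation from measurement and orders them by that score:

      import math

      def rms_relative_deviation(predicted, measured):
          """Root-mean-square relative deviation between code predictions and data."""
          terms = [((p - m) / m) ** 2 for p, m in zip(predicted, measured)]
          return math.sqrt(sum(terms) / len(terms))

      # Hypothetical predictions from three competing codes against the same data set.
      measured = [10.2, 15.8, 21.1, 30.4]
      codes = {
          "code_A": [10.0, 16.5, 20.0, 31.0],
          "code_B": [11.5, 14.0, 23.0, 28.0],
          "code_C": [10.3, 15.9, 21.0, 30.1],
      }

      # Rank the codes in order of merit (smallest deviation first).
      ranking = sorted(codes, key=lambda name: rms_relative_deviation(codes[name], measured))
      for name in ranking:
          print(f"{name}: {rms_relative_deviation(codes[name], measured):.3%}")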

  2. Containment Code Validation Matrix

    International Nuclear Information System (INIS)

    Chin, Yu-Shan; Mathew, P.M.; Glowa, Glenn; Dickson, Ray; Liang, Zhe; Leitch, Brian; Barber, Duncan; Vasic, Aleks; Bentaib, Ahmed; Journeau, Christophe; Malet, Jeanne; Studer, Etienne; Meynet, Nicolas; Piluso, Pascal; Gelain, Thomas; Michielsen, Nathalie; Peillon, Samuel; Porcheron, Emmanuel; Albiol, Thierry; Clement, Bernard; Sonnenkalb, Martin; Klein-Hessling, Walter; Arndt, Siegfried; Weber, Gunter; Yanez, Jorge; Kotchourko, Alexei; Kuznetsov, Mike; Sangiorgi, Marco; Fontanet, Joan; Herranz, Luis; Garcia De La Rua, Carmen; Santiago, Aleza Enciso; Andreani, Michele; Paladino, Domenico; Dreier, Joerg; Lee, Richard; Amri, Abdallah

    2014-01-01

    The Committee on the Safety of Nuclear Installations (CSNI) formed the CCVM (Containment Code Validation Matrix) task group in 2002. The objective of this group was to define a basic set of available experiments for code validation, covering the range of containment (ex-vessel) phenomena expected in the course of light and heavy water reactor design basis accidents and beyond design basis accidents/severe accidents. It was to consider phenomena relevant to pressurised heavy water reactor (PHWR), pressurised water reactor (PWR) and boiling water reactor (BWR) designs of Western origin as well as of Eastern European VVER types. This work would complement the two existing CSNI validation matrices for thermal hydraulic code validation (NEA/CSNI/R(1993)14) and in-vessel core degradation (NEA/CSNI/R(2001)21). The report initially provides a brief overview of the main features of PWR, BWR, CANDU and VVER reactors. It also provides an overview of ex-vessel corium retention (core catcher). It then provides a general overview of the accident progression for light water and heavy water reactors. The main focus is to capture most of the phenomena and safety systems employed in these reactor types and to highlight the differences. This CCVM contains a description of 127 phenomena, broken down into 6 categories: - Containment Thermal-hydraulics Phenomena; - Hydrogen Behaviour (Combustion, Mitigation and Generation) Phenomena; - Aerosol and Fission Product Behaviour Phenomena; - Iodine Chemistry Phenomena; - Core Melt Distribution and Behaviour in Containment Phenomena; - Systems Phenomena. A synopsis is provided for each phenomenon, including a description, references for further information, significance for DBA and SA/BDBA, and a list of experiments that may be used for code validation. The report identified 213 experiments, broken down into the same six categories as the phenomena. An experiment synopsis, along with a test description, is provided for each test.

  3. Validation of International Classification of Diseases coding for bone metastases in electronic health records using technology-enabled abstraction.

    Science.gov (United States)

    Liede, Alexander; Hernandez, Rohini K; Roth, Maayan; Calkins, Geoffrey; Larrabee, Katherine; Nicacio, Leo

    2015-01-01

    The accuracy of bone metastases diagnostic coding based on the International Classification of Diseases, ninth revision (ICD-9) is unknown for most large databases used for epidemiologic research in the US. Electronic health records (EHR) are the preferred source of data, but often clinically relevant data occur only as unstructured free text. We examined the validity of bone metastases ICD-9 coding in structured EHR and administrative claims relative to the complete (structured and unstructured) patient chart obtained through technology-enabled chart abstraction. Female patients with breast cancer with ≥1 visit after November 2010 were identified from three community oncology practices in the US. We calculated sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of bone metastases ICD-9 code 198.5. The technology-enabled abstraction displays portions of the chart to clinically trained abstractors for targeted review, thereby maximizing efficiency. We evaluated effects of misclassification of patients developing skeletal complications or treated with bone-targeting agents (BTAs), and timing of BTA. Among 8,796 patients with breast cancer, 524 had confirmed bone metastases using chart abstraction. Sensitivity was 0.67 (95% confidence interval [CI] 0.63-0.71) based on structured EHR, and specificity was high at 0.98 (95% CI 0.98-0.99), with a corresponding PPV of 0.71 (95% CI 0.67-0.75) and NPV of 0.98 (95% CI 0.98-0.98). From claims, sensitivity was 0.78 (95% CI 0.74-0.81) and specificity was 0.98 (95% CI 0.98-0.98), with a PPV of 0.72 (95% CI 0.68-0.76) and NPV of 0.99 (95% CI 0.98-0.99). Structured data and claims missed 17% of bone metastases (89 of 524). False negatives were associated with measurable overestimation of the proportion treated with BTA or with a skeletal complication. The median date of diagnosis was delayed in structured data (32 days) and claims (43 days) compared with technology-assisted EHR.
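
    The reported sensitivity, specificity, PPV and NPV follow from a standard 2x2 comparison against the reference standard. The sketch below recomputes them from illustrative counts chosen only to be roughly consistent with the structured-EHR figures quoted above; the true cell counts are not given in the abstract.

      def diagnostic_metrics(tp, fp, fn, tn):
          """Standard 2x2 validation metrics for a diagnosis code against a reference standard."""
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "ppv": tp / (tp + fp),
              "npv": tn / (tn + fn),
          }

      # Illustrative counts: ICD-9 code 198.5 in structured EHR vs. abstracted chart (reference).
      print(diagnostic_metrics(tp=351, fp=143, fn=173, tn=8129))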

  4. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs
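
    A minimal sketch of the optimization step described above, with an entirely hypothetical cost model and sweep values standing in for the code's detailed costing of machined parts, raw materials and components:

      import itertools

      def total_system_cost(rise_time_ns, core_aspect_ratio):
          """Hypothetical stand-in for the code's cost model as a function of the two design choices."""
          pulsed_power_cost = 5.0e6 / rise_time_ns             # faster rise time -> costlier pulsers
          core_material_cost = 2.0e5 * core_aspect_ratio**1.5  # larger cores -> more ferrite
          structure_cost = 1.0e6                               # fixed mechanical assemblies
          return pulsed_power_cost + core_material_cost + structure_cost

      rise_times = [20, 40, 60, 80, 100]      # ns, assumed sweep values
      aspect_ratios = [1.5, 2.0, 2.5, 3.0]    # assumed sweep values

      # Scan the two design choices and pick the cheapest combination.
      best = min(itertools.product(rise_times, aspect_ratios),
                 key=lambda rc: total_system_cost(*rc))
      print("optimum (rise time, aspect ratio):", best, "cost:", total_system_cost(*best))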

  5. Development and validation of sodium fire codes

    International Nuclear Information System (INIS)

    Morii, Tadashi; Himeno, Yoshiaki; Miyake, Osamu

    1989-01-01

    Development, verification, and validation of the spray fire code, SPRAY-3M, the pool fire codes, SOFIRE-M2 and SPM, the aerosol behavior code, ABC-INTG, and the simultaneous spray and pool fires code, ASSCOPS, are presented. In addition, the state-of-the-art of development of the multi-dimensional natural convection code, SOLFAS, for the analysis of heat-mass transfer during a fire, is presented. (author)

  6. Validating accident consequence assessment codes

    International Nuclear Information System (INIS)

    Viktorsson, C.; Kelly, G.N.; Nixon, W.

    1993-01-01

    The main objective of the study was to compare the predictions of participating codes for a range of postulated accidental releases and to assess the significance of any differences observed. Seven codes from various countries participated in the exercise: ARANO (Finland), CONDOR (UK), COSYMA (CEC), LENA (Sweden), MACCS (USA), MECA2 (Spain) and OSCAAR (Japan). They calculated a wide range of consequences, for example: collective doses, early and late health effects, economic costs and the effect of countermeasures on people and agriculture. In each case, the probability distributions predicted by the codes were compared. (author)

  7. Validation of the reactor dynamics code HEXTRAN

    International Nuclear Information System (INIS)

    Kyrki-Rajamaeki, R.

    1994-05-01

    HEXTRAN is a new three-dimensional, hexagonal reactor dynamics code developed at the Technical Research Centre of Finland (VTT) for VVER type reactors. This report describes the validation work of HEXTRAN. The work was financed by the Finnish Centre for Radiation and Nuclear Safety (STUK). HEXTRAN is particularly intended for the calculation of accidents in which radially asymmetric phenomena occur and both good neutron dynamics and two-phase thermal hydraulics are important. HEXTRAN is based on already validated codes, and the models of these codes have been shown to function correctly within the HEXTRAN code as well. The main new model of HEXTRAN, the spatial neutron kinetics model, has been successfully validated against LR-0 test reactor and Loviisa plant measurements. Connected with SMABRE, HEXTRAN can be reliably used for the calculation of transients including effects of the whole cooling system of VVERs. Further validation plans are also introduced in the report. (orig.). (23 refs., 16 figs., 2 tabs.)

  8. 45 CFR 162.1011 - Valid code sets.

    Science.gov (United States)

    2010-10-01

    45 CFR Part 162, Administrative Requirements, Code Sets, § 162.1011 Valid code sets: Each code set is valid within the dates specified by the organization responsible for maintaining that code set.

  9. FACTAR 2.0 code validation

    International Nuclear Information System (INIS)

    Middleton, P.B.; Rock, R.C.K.; Wadsworth, S.L.

    1997-01-01

    The FACTAR code models the thermal and mechanical behaviour of a CANDU fuel channel under degraded cooling conditions. FACTAR is currently undergoing a process of validation against various data sets in order to qualify its use in nuclear safety analysis. This paper outlines the methodology being followed in this effort. The BTF-104 and BTF-105A tests, conducted at Chalk River Laboratories, have been chosen as the first in-reactor tests to be used for FACTAR validation. The BTF experiments were designed to represent CANDU fuel behaviour under typical large LOCA conditions. The two tests are summarized briefly, and the results of code comparisons to experimental data are outlined. The comparisons demonstrate that FACTAR is able to accurately predict the values of selected key parameters. As anticipated in the validation plan, further work is required to fully quantify simulation biases for all parameters of interest. (author)

  10. Validation Report for ISAAC Computer Code

    International Nuclear Information System (INIS)

    Kim, Dong Ha; Song, Yong Mann; Park, Soo Yong; Jin, Young Ho; Kim, See Darl; Kim, Sang Baik

    2008-12-01

    A fully integrated severe accident code, ISAAC, was developed to simulate accident scenarios that could lead to severe core damage and eventually to containment failure in CANDU reactors. Three approaches to validation were adopted in this report. The first approach is to show the ISAAC results for the typical severe core damage sequences. In general, the ISAAC computer code shows reasonable results in terms of the thermal hydraulic behavior as well as the fission product transport from the PHTS to the containment. As the second step, the ISAAC results are compared against those from CATHENA and MAAP4-CANDU. In spite of the modeling differences, the overall trends are similar. In particular, the major severe accident phenomena and the accident progression are similar to MAAP4-CANDU, though ISAAC predicts a faster accident progression. Finally, ISAAC results are compared with the experimental data, with which the ISAAC models show good agreement. Still more effort is needed to validate the code through code-to-code comparison and comparison against the available experimental data.

  11. German tests in support of code validation

    International Nuclear Information System (INIS)

    Clausmeyer, H.; Maile, K.

    1995-01-01

    This paper documents some of the most important code verification and validation trials related to the different nuclear reactor concepts, namely light water reactors (LWR), liquid metal fast breeder reactors (LMFBR) and high temperature reactors (HTR). Selected experimental and theoretical investigations demonstrate the effectiveness of those efforts, which contribute either to a better understanding of the nuclear codes or to initiating necessary changes. Resistance against crack initiation, crack propagation and crack arrest is treated by linear elastic fracture mechanics for pressure vessels, tubes, plates and pipes subjected to tensile load, internal pressure or bending moment. Different test parameters are taken into account, such as material composition, temperature, strain rate, surface roughness, notches, weldments, etc. Test results are compared to calculational predictions, and fracture mechanics parameters are calculated for feedback to the code. (J.S.). 21 refs., 28 figs., 5 tabs

  12. Measurement of reactivity coefficients for code validation

    International Nuclear Information System (INIS)

    Nuding, Matthias; Loetsch, Thomas

    2005-01-01

    In 2003, measurements in the cold reactor state were performed at the NPP KKI 2 in order to validate the codes that are used for reactor core calculations and especially for the proof of the shutdown margin, which is produced by calculations only. For full power states, code verification is quite easy because the calculations can be compared with different measured values, e.g. with the activation values determined by the aeroball system. For cold reactor states, however, the data base is smaller, especially for reactor cores that are quite 'inhomogeneous' and have rather high fissile plutonium and 235U contents. At the same time, the cold reactor state is important regarding the shutdown margin. For these reasons the measurements mentioned above were performed in order to check the accuracy of the codes that have been used by the operator and by our organization for many years. Basically, boron concentrations and control rod worths for different configurations were measured. The results of the calculations show very good agreement with the measured values. Therefore, it can be stated that the operator's code system as well as ours is suitable for routine use, e.g. during licensing procedures (Authors)

  13. Building Technologies Program Multi-Year Program Plan Technology Validation and Market Introduction 2008

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2008-01-01

    Building Technologies Program Multi-Year Program Plan 2008 for technology validation and market introduction, including ENERGY STAR, building energy codes, technology transfer application centers, the commercial lighting initiative, and EnergySmart Schools.

  14. Software verification and validation plan for the GWSCREEN code

    International Nuclear Information System (INIS)

    Rood, A.S.

    1993-05-01

    The purpose of this Software Verification and Validation Plan (SVVP) is to prescribe the steps necessary to verify and validate the GWSCREEN code, version 2.0, to Quality Level B standards. GWSCREEN output is to be verified and validated by comparison with hand calculations and with output from other Quality Level B computer codes. Verification and validation will also entail performing static and dynamic tests on the code using several analysis tools. This approach is consistent with guidance in ANSI/ANS-10.4-1987, "Guidelines for Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry."

  15. Validation and verification plan for safety and PRA codes

    International Nuclear Information System (INIS)

    Ades, M.J.; Crowe, R.D.; Toffer, H.

    1991-04-01

    This report discusses a verification and validation (V&V) plan for computer codes used for safety analysis and probabilistic risk assessment calculations. The present plan fulfills the commitments by Westinghouse Savannah River Company (WSRC) to the Department of Energy Savannah River Office (DOE-SRO) to bring the essential safety analysis and probabilistic risk assessment codes into compliance with verification and validation requirements.

  16. Validating converted java code via symbolic execution

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, C.

    2017-01-01

    The testing approach described here has grown out of migration projects aimed at converting procedural programs in COBOL or PL/1 to object-oriented Java code. The code conversion itself is now automated, but not completely: the human reengineer still has to make some adjustments to the automatically converted code.

  17. Planning guide for validation of fission product transport codes

    International Nuclear Information System (INIS)

    Jensen, D.D.; Haire, M.J.; Baldassare, J.E.; Hanson, D.L.

    1975-01-01

    The program for validating fission product transport codes utilized in the design of the high-temperature gas-cooled reactor (HTGR) is described herein. The importance of fission product code verification is discussed as it relates to achieving a competitive reactor system that fully complies with federal regulations. A brief description of the RAD, PAD, and FIPER codes and their validation status is given. Individual validation tests are described in detail, including test conditions and measurements to be evaluated, and accompanying test schedules. Also included are validation schedules for each code inclusive through fiscal year 1978. Codes will be appropriately validated and utilized for fission product predictions for the Delmarva Final Safety Analysis Report (FSAR) due for release in early 1978. (U.S.)

  18. Computer codes validation for conditions of core voiding

    International Nuclear Information System (INIS)

    Delja, A.; Hawley, P.

    2011-01-01

    Void generation during a Loss of Coolant Accident (LOCA) in the core of a CANDU reactor is of specific importance because of its strong coupling with reactor neutronics. The ability of computer codes to predict the dynamic behaviour of void generation accurately in the temporal and spatial domains of the reactor core is fundamental to the determination of CANDU safety. The Canadian industry has used the RD-14M test facilities for its code validation. The validation exercises for the Canadian computer codes TUF and CATHENA were performed some years ago. Recently, the CNSC has gained access to the USNRC computer code TRACE. This has provided an opportunity to explore the use of this code in CANDU related applications. As part of regulatory assessments and the resolution of identified Generic Issues (GI), and in an effort to build independent thermal hydraulic computer code assessment capability within the CNSC, preliminary validation exercises were performed using the TRACE computer code to evaluate void generation phenomena. The paper presents a preliminary assessment of the TRACE computer code for an RD-14M channel voiding test. It is also a validation exercise of void generation for the TRACE computer code. The accuracy of the obtained results is discussed and compared with previous validation assessments that were done using the CATHENA and TUF codes. (author)

  19. Gap Conductance model Validation in the TASS/SMR-S code using MARS code

    International Nuclear Information System (INIS)

    Ahn, Sang Jun; Yang, Soo Hyung; Chung, Young Jong; Lee, Won Jae

    2010-01-01

    Korea Atomic Energy Research Institute (KAERI) has been developing the TASS/SMR-S (Transient and Setpoint Simulation/Small and Medium Reactor) code, a thermal hydraulic code for the safety analysis of the advanced integral reactor. Appropriate work to validate the applicability of the thermal hydraulic models within the code is required. Among these models, the gap conductance model, which describes the thermal conductance of the gap between fuel and cladding, was validated through comparison with the MARS code. The validation of the gap conductance model was performed by evaluating the variation of the gap temperature and gap width as they changed with the power fraction. In this paper, a brief description of the gap conductance model in the TASS/SMR-S code is presented. In addition, the results calculated to validate the gap conductance model are demonstrated by comparison with the results of the MARS code for the test case
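
    For orientation, a much-simplified gap conductance relation (conduction across the gas gap only; this is not the TASS/SMR-S or MARS model, which also treat gas mixtures, temperature-jump distances, contact and radiation) can be sketched as follows, with assumed property and operating values:

      def gap_conductance(k_gas, gap_width_m):
          """Conduction-only gap conductance h = k/d in W/m2-K (simplified illustration)."""
          return k_gas / gap_width_m

      def gap_temperature_drop(heat_flux, h_gap):
          """Temperature difference between fuel outer surface and cladding inner surface."""
          return heat_flux / h_gap

      k_helium = 0.25    # W/m-K, rough helium conductivity at operating temperature (assumed)
      gap = 40e-6        # m, hypothetical hot gap width
      q_surface = 6.0e5  # W/m2, hypothetical rod surface heat flux

      h = gap_conductance(k_helium, gap)
      print(f"h_gap = {h:.0f} W/m2-K, gap dT = {gap_temperature_drop(q_surface, h):.1f} K")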

  20. European Validation of the Integral Code ASTEC (EVITA)

    International Nuclear Information System (INIS)

    Allelein, H.-J.; Neu, K.; Dorsselaere, J.P. Van

    2005-01-01

    The main objective of the European Validation of the Integral Code ASTEC (EVITA) project is to distribute the severe accident integral code ASTEC to European partners in order to apply the validation strategy issued from the VASA project (4th EC FWP). Partners evaluate the code capability through validation on reference experiments and plant applications accounting for severe accident management measures, and compare results with reference codes. The basis version V0 of ASTEC (Accident Source Term Evaluation Code), commonly developed and basically validated by GRS and IRSN, was made available in late 2000 for the EVITA partners on their individual platforms. Users' training was performed by IRSN and GRS. The portability of the code across different computers was checked and found to be correct. 'Hot line' assistance, continuously available to EVITA code users, was put in place. The current version V1 was released to the EVITA partners at the end of June 2002. It allows simulation of the front-end phase through two new modules: - for the reactor coolant system, 2-phase simplified thermal hydraulics (5-equation approach) during both the front-end and core degradation phases; - for core degradation, a module based on the structure and main models of the ICARE2 (IRSN) reference mechanistic code for core degradation and on other simplified models. The next priorities are clearly identified: code consolidation in order to increase robustness, extension of all plant applications beyond vessel lower head failure and coupling with fission product modules, and continuous improvement of users' tools. As EVITA has very successfully made the first step towards providing end-users (such as utilities, vendors and licensing authorities) with a well validated European integral code for the simulation of severe accidents in NPPs, the EVITA partners strongly recommend continuing the validation, benchmarking and application of ASTEC. This work will continue in the Severe Accident Research Network (SARNET) in the 6th Framework Programme.

  1. Vlasov Antenna Data for Electromagnetic Code Validation

    National Research Council Canada - National Science Library

    Greenwood, Andrew

    2003-01-01

    Measured antenna data is provided for validating computational electromagnetic (CEM) computer programs. The subject antenna is the Vlasov antenna, which is formed by cutting a hollow circular cylindrical waveguide at an oblique angle...

  2. Elicitation and validation of requirements-to-code traceability

    OpenAIRE

    Ghabi, Achraf

    2015-01-01

    Submitted by: Achraf Ghabi. Abstract in German. Parallel title [translated by the author]: Elicitation and validation of requirements-to-code traceability. Universität Linz, Dissertation, 2015. OeBB

  3. Improvements, verifications and validations of the BOW code

    International Nuclear Information System (INIS)

    Yu, S.D.; Tayal, M.; Singh, P.N.

    1995-01-01

    The BOW code calculates the lateral deflections of a fuel element, consisting of sheath and pellets, due to temperature gradients, hydraulic drag and gravity. The fuel element is subjected to restraint from endplates, neighbouring fuel elements and the pressure tube. Many new features have been added to the BOW code since its original release in 1985. This paper outlines the major improvements made to the code and the verification/validation results. (author)

  4. NDE reliability and advanced NDE technology validation

    International Nuclear Information System (INIS)

    Doctor, S.R.; Deffenbaugh, J.D.; Good, M.S.; Green, E.R.; Heasler, P.G.; Hutton, P.H.; Reid, L.D.; Simonen, F.A.; Spanner, J.C.; Vo, T.V.

    1989-01-01

    This paper reports on progress for three programs: (1) evaluation and improvement in nondestructive examination reliability for inservice inspection of light water reactors (LWR) (NDE Reliability Program), (2) field validation, acceptance, and training for advanced NDE technology, and (3) evaluation of computer-based NDE techniques and regional support of inspection activities. The NDE Reliability Program objectives are to quantify the reliability of inservice inspection techniques for LWR primary system components through independent research and to establish means for obtaining improvements in the reliability of inservice inspections. The areas of significant progress are described concerning ASME Code activities, re-analysis of the PISC-II data, the equipment interaction matrix study, new inspection criteria, and PISC-III. The objectives of the second program are to develop field procedures for the AE and SAFT-UT techniques, perform field validation testing of these techniques, provide training in the techniques for NRC headquarters and regional staff, and work with the ASME Code for the use of these advanced technologies. The final program's objective is to evaluate the reliability and accuracy of interpretation of results from computer-based ultrasonic inservice inspection systems, and to develop guidelines for NRC staff to monitor and evaluate the effectiveness of inservice inspections conducted on nuclear power reactors. This program started in the last quarter of FY89, and the extent of the program was to prepare a work plan for presentation to and approval from a technical advisory group of NRC staff

  5. Large-scale sodium spray fire code validation (SOFICOV) test

    International Nuclear Information System (INIS)

    Jeppson, D.W.; Muhlestein, L.D.

    1985-01-01

    A large-scale sodium spray fire code validation test was performed in the HEDL 850-m³ Containment System Test Facility (CSTF) as part of the Sodium Spray Fire Code Validation (SOFICOV) program. Six hundred fifty-eight kilograms of sodium was sprayed into an air atmosphere over a period of 2400 s. The sodium spray droplet sizes and spray pattern distribution were estimated. The containment atmosphere temperature and pressure response, containment wall temperature response and sodium reaction rate with oxygen were measured. These results are compared to post-test predictions using the SPRAY and NACOM computer codes

  6. Validation of wind loading codes by experiments

    NARCIS (Netherlands)

    Geurts, C.P.W.

    1998-01-01

    Between 1994 and 1997, full scale measurements of the wind and wind-induced pressures were carried out on the main building of Eindhoven University of Technology. Simultaneously, a comparative wind tunnel experiment was performed in an atmospheric boundary layer wind tunnel.

  7. Validity of vascular trauma codes at major trauma centres.

    Science.gov (United States)

    Altoijry, Abdulmajeed; Al-Omran, Mohammed; Lindsay, Thomas F; Johnston, K Wayne; Melo, Magda; Mamdani, Muhammad

    2013-12-01

    The use of administrative databases in vascular injury research has been increasing, but the validity of the diagnosis codes used in this research is uncertain. We assessed the positive predictive value (PPV) of International Classification of Diseases, tenth revision (ICD-10), vascular injury codes in administrative claims data in Ontario. We conducted a retrospective validation study using the Canadian Institute for Health Information Discharge Abstract Database, an administrative database that records all hospital admissions in Canada. We evaluated 380 randomly selected hospital discharge abstracts from the 2 main trauma centres in Toronto, Ont., St. Michael's Hospital and Sunnybrook Health Sciences Centre, between Apr. 1, 2002, and Mar. 31, 2010. We then compared these records with the corresponding patients' hospital charts to assess the level of agreement for procedure coding. We calculated the PPV and sensitivity to estimate the validity of vascular injury diagnosis coding. The overall PPV for vascular injury coding was estimated to be 95% (95% confidence interval [CI] 92.3-96.8). The PPV among code groups for neck, thorax, abdomen, upper extremity and lower extremity injuries ranged from 90.8% (95% CI 82.2-95.5) to 97.4% (95% CI 91.0-99.3), whereas sensitivity ranged from 90% (95% CI 81.5-94.8) to 98.7% (95% CI 92.9-99.8). Administrative claims hospital discharge data based on ICD-10 diagnosis codes have a high level of validity when identifying cases of vascular injury. Observational study, level III.

  8. Coding Conversation between Intimates: A Validation Study of the Intimate Negotiation Coding System (INCS).

    Science.gov (United States)

    Ting-Toomey, Stella

    A study was conducted to test the reliability and validity of the Intimate Negotiation Coding System (INCS), an instrument designed to code verbal conversation in intimate relationships. Subjects, 34 married couples, completed Spanier's Dyadic Adjustment Scale, which elicited information about relational adjustment and satisfaction in intimate couples in…

  9. Results from the First Validation Phase of CAP code

    International Nuclear Information System (INIS)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2010-01-01

    The second stage of the Safety Analysis Code Development for Nuclear Power Plants project was launched in April 2010 and is scheduled to run through 2012; its scope of work covers code validation through licensing preparation. As a part of this project, CAP (Containment Analysis Package) will follow the same procedures. CAP's validation work is organized hierarchically into four validation steps using: 1) fundamental phenomena; 2) principal phenomena (mixing and transport) and components in containment; 3) demonstration tests in small, medium and large facilities and International Standard Problems; and 4) comparison with other containment codes such as GOTHIC or CONTEMPT. In addition, collecting the experimental data related to containment phenomena and then constructing the corresponding database is one of the major tasks of the second stage of this project. From the validation of fundamental phenomena, the current capabilities and the future improvements of the CAP code can be expected to be revealed. For this purpose, simple but significant problems, which have exact analytical solutions, were selected and calculated for the validation of fundamental phenomena. In this paper, some results of the validation problems for the selected fundamental phenomena are summarized and discussed briefly

  10. Contributions to the validation of the ASTEC V1 code

    International Nuclear Information System (INIS)

    Constantin, Marin; Rizoiu, Andrei; Turcu, Ilie

    2004-01-01

    In the frame of the PHEBEN2 project (Validation of the severe accident codes for applications to nuclear power plants, based on the PHEBUS FP experiments), a project developed within the EU research Framework Programme 5 (FP5), the INR-Pitesti team received the task of determining the ASTEC code sensitivity. The PHEBEN2 project was initiated in 1998 and gathered 13 partners from 6 EU member states; 4 partners from 3 candidate states (Hungary, Bulgaria and Romania) joined the project later. The work was contracted with the European Commission (under contract FIKS-CT1999-00009), which financially supports the research effort up to about 50%. According to the contract provisions, INR's team participated in developing Working Package 1 (WP1), which refers to the validation of the integral computation codes against the PHEBUS experimental data, and Working Package 3 (WP3), which refers to the evaluation of the codes to be applied in nuclear power plants for risk evaluation, nuclear safety margin evaluation and the determination/evaluation of the measures to be adopted in case of severe accident. The present work continues the effort to preliminarily validate the ASTEC code. The focus is on stand-alone sensitivity analyses applied to two of the most important modules of the code, namely DIVA and SOPHAEROS

  11. Development and validation of a nodal code for core calculation

    International Nuclear Information System (INIS)

    Nowakowski, Pedro Mariano

    2004-01-01

    The code RHENO solves the multigroup three-dimensional diffusion equation using a nodal method of polynomial expansion. A comparative study has been made between this code and present international nodal diffusion codes, showing that RHENO is up to date. RHENO has been integrated into a calculation line and has been extended to perform burnup calculations. Two methods for pin power reconstruction were developed: modulation and imbedded. The modulation method has been implemented in a program, while the implementation of the imbedded method will be concluded shortly. The validation carried out (which includes experimental data from an MPR) shows very good results and calculation efficiency
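
    A minimal sketch of the modulation idea for pin power reconstruction, with made-up numbers; the actual RHENO algorithm and its lattice form functions are not reproduced here:

      import numpy as np

      def reconstruct_pin_powers(node_avg_power, form_factors, intranodal_shape):
          """Modulation-style reconstruction (illustrative): lattice pin form factors are
          modulated by the smooth intra-nodal flux shape from the nodal solution and then
          renormalized so that the node-average power is conserved."""
          modulated = form_factors * intranodal_shape
          modulated *= node_avg_power / modulated.mean()
          return modulated

      # Hypothetical 3x3 pin cluster inside one node.
      form_factors = np.array([[1.05, 1.00, 0.95],
                               [1.10, 1.02, 0.93],
                               [1.04, 0.99, 0.92]])     # from a single-assembly lattice calculation
      intranodal_shape = np.array([[0.96, 1.00, 1.04],
                                   [0.97, 1.01, 1.05],
                                   [0.98, 1.02, 1.06]])  # smooth shape from the nodal expansion
      print(reconstruct_pin_powers(1.0, form_factors, intranodal_shape))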

  12. Lawrence Livermore National Laboratory Probabilistic Seismic Hazard Codes Validation

    International Nuclear Information System (INIS)

    Savy, J B

    2003-01-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. LLNL has been developing the methodology and codes in support of the Nuclear Regulatory Commission (NRC) needs for reviews of site licensing of nuclear power plants since 1978. A number of existing computer codes have been validated, yet they can still lead to differing ranges of hazard estimates in some cases. Until now, the seismic hazard community had not agreed on any specific method for the evaluation of these codes. The Earthquake Engineering Research Institute (EERI) and the Pacific Earthquake Engineering Research (PEER) center organized an exercise in testing of existing codes with the aim of developing a series of standard tests that future developers could use to evaluate and calibrate their own codes. Seven code developers participated in the exercise on a voluntary basis. Lawrence Livermore National Laboratory participated with some support from the NRC. The final product of the study will include a series of criteria for judging the validity of the results provided by a computer code. This EERI/PEER project was first planned to be completed by June 2003. As the group neared completion of the tests, the managing team decided that new tests were necessary. As a result, the present report documents only the work performed to this point. It demonstrates that the computer codes developed by LLNL perform all calculations correctly and as intended. Differences exist between the results of the codes tested; these are attributed to a series of assumptions, on the parameters and models, that the developers had to make. The managing team is planning a new series of tests to help in reaching a consensus on these assumptions
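
    For context, the core calculation such codes perform, the annual rate at which a ground-motion level is exceeded, summed over seismic sources with lognormal ground-motion variability, can be sketched as follows; the source rates, median motions and sigma value are assumed for illustration only:

      import math

      def annual_exceedance_rate(target_g, sources, sigma_ln=0.6):
          """Rate at which ground motion exceeds target_g (in g), summed over point sources.
          Each source is (annual event rate, median ground motion in g at the site);
          lognormal ground-motion variability with log standard deviation sigma_ln (assumed)."""
          rate = 0.0
          for nu, median_g in sources:
              z = (math.log(target_g) - math.log(median_g)) / sigma_ln
              prob_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))
              rate += nu * prob_exceed
          return rate

      # Two hypothetical seismic sources affecting the site.
      sources = [(0.05, 0.08), (0.002, 0.35)]   # (events/yr, median PGA in g)
      for pga in (0.1, 0.2, 0.4):
          print(f"PGA > {pga:.1f} g: {annual_exceedance_rate(pga, sources):.2e} /yr")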

  13. A QR code identification technology in package auto-sorting system

    Science.gov (United States)

    di, Yi-Juan; Shi, Jian-Ping; Mao, Guo-Yong

    2017-07-01

    Traditional manual sorting is not suited to the development of Chinese logistics. To sort packages better, a QR code recognition technology is proposed to identify the QR code labels on packages in a package auto-sorting system. The experimental results, compared with other algorithms in the literature, demonstrate that the proposed method is valid and that its performance is superior to the other algorithms.
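
    A minimal sketch of the decoding step in such a sorting system, using OpenCV's built-in QR detector as one possible choice; the paper's own recognition algorithm is not reproduced here, and the file name is hypothetical:

      import cv2  # OpenCV; one common choice, not necessarily the library used by the authors

      def read_package_label(image_path):
          """Decode the QR code label on a package image and return the routing payload."""
          image = cv2.imread(image_path)
          if image is None:
              raise FileNotFoundError(image_path)
          detector = cv2.QRCodeDetector()
          payload, corners, _ = detector.detectAndDecode(image)
          return payload or None   # an empty string means no QR code was decoded

      # Hypothetical usage in a sorting loop: route the parcel by the decoded destination code.
      # destination = read_package_label("parcel_0042.png")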

  14. Experimental validation of the containment codes ASTARTE and SEURBNUK

    International Nuclear Information System (INIS)

    Kendall, K.C.; Arnold, L.A.; Broadhouse, B.J.; Jones, A.; Yerkess, A.; Benuzzi, A.

    1979-10-01

    The fast reactor containment codes ASTARTE and SEURBNUK are being validated against data from the COVA series of small scale experiments being performed jointly by the UKAEA and JRC Ispra. The experimental programme is nearly complete, and data are given. (U.K.)

  15. The Mistra experiment for field containment code validation first results

    International Nuclear Information System (INIS)

    Caron-Charles, M.; Blumenfeld, L.

    2001-01-01

    The MISTRA facility is a large scale experiment designed for the validation of multi-dimensional containment thermal-hydraulics codes. A short description of the facility, the setup of the instrumentation and the test program are presented. Then the first experimental results, studying helium injection into the containment, and their calculations are detailed. (author)

  16. In-core fuel management code package validation for PWRs

    International Nuclear Information System (INIS)

    1995-08-01

    In the framework of its reactor physics activities conducted within its nuclear power programme, the IAEA has long provided its Member States with a forum for the exchange of technical information on in-core fuel management. This TECDOC discusses in-core fuel management code package validation for PWRs. 43 refs, figs and tabs

  17. Intercomparison and validation of computer codes for thermalhydraulic safety analysis of heavy water reactors

    International Nuclear Information System (INIS)

    2004-08-01

    Activities within the frame of the IAEA's Technical Working Group on Advanced Technologies for HWRs (TWG-HWR) are conducted in a project within the IAEA's subprogramme on nuclear power reactor technology development. The objective of the activities on HWRs is to foster, within the frame of the TWG-HWR, information exchange and co-operative research on technology development for current and future HWRs, with an emphasis on safety, economics and fuel resource sustainability. One of the activities recommended by the TWG-HWR was an international standard problem exercise entitled: Intercomparison and validation of computer codes for thermalhydraulics safety analyses. Intercomparison and validation of the computer codes used in different countries for thermalhydraulics safety analyses will enhance confidence in the predictions made by these codes. However, the intercomparison and validation exercise needs a set of reliable experimental data. The RD-14M large loss-of-coolant accident (LOCA) test B9401, simulating HWR LOCA behaviour, which was conducted by Atomic Energy of Canada Ltd (AECL), was selected for this validation project. This report provides a comparison of the results obtained from six participating countries, utilizing four different computer codes. General conclusions are reached and recommendations made

  18. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analysis of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole reactor problem using the HTTR safety test data such as control rod withdrawal tests and loss-of-forced-convection tests.

  19. The WINCON programme - validation of fast reactor primary containment codes

    International Nuclear Information System (INIS)

    Sidoli, J.E.A.; Kendall, K.C.

    1988-01-01

    In the United Kingdom, safety studies for the Commercial Demonstration Fast Reactor (CDFR) include an assessment of the capability of the primary containment in providing adequate containment for defence against the hazards resulting from a hypothetical Whole Core Accident (WCA). The assessment is based on calculational estimates using computer codes supported by measured evidence from small-scale experiments. The hydrodynamic containment code SEURBNUK-EURDYN is capable of representing a prescribed energy release, the sodium coolant and cover gas, and the main containment and safety related internal structures. Containment loadings estimated using SEURBNUK-EURDYN are used in the structural dynamics code EURDYN-03 for the prediction of the containment response. The experiments serve two purposes: they demonstrate the response of the CDFR containment to accident loadings and provide data for the validation of the codes. This paper summarises the recently completed WINfrith CONtainment (WINCON) experiments that studied the response of specific features of current CDFR design options to WCA loadings. The codes have been applied to some of the experiments and a satisfactory prediction of the global response of the model containment is obtained. This provides confidence in the use of the codes in reactor assessments. (author)

  20. Validation of the TAC/BLOOST code (Contract research)

    International Nuclear Information System (INIS)

    Takamatsu, Kuniyoshi; Nakagawa, Shigeaki

    2005-06-01

    Safety demonstration tests using the High Temperature engineering Test Reactor (HTTR) are in progress to verify the inherent safety features of High Temperature Gas-cooled Reactors (HTGRs). The coolant flow reduction test by tripping the gas circulators is one of the safety demonstration tests: the reactor power settles safely at a stable level without a reactor scram, and the temperature transient of the reactor core is very slow. The TAC/BLOOST code was developed to analyze the reactor power and temperature transients during the coolant flow reduction test, taking account of reactor dynamics. This paper describes the validation of the TAC/BLOOST code against the measured values from gas circulator tripping tests at 30% power (9 MW). It was confirmed that the TAC/BLOOST code is able to analyze the reactor transient during the test. (author)

  1. The Initial Atmospheric Transport (IAT) Code: Description and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Morrow, Charles W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bartel, Timothy James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The Initial Atmospheric Transport (IAT) computer code was developed at Sandia National Laboratories as part of their nuclear launch accident consequences analysis suite of computer codes. The purpose of IAT is to predict the initial puff/plume rise resulting from either a solid rocket propellant or liquid rocket fuel fire. The code generates initial conditions for subsequent atmospheric transport calculations. The IAT code has been compared to two data sets which are appropriate to the design space of space launch accident analyses. The primary model uncertainties are the entrainment coefficients for the extended Taylor model. The Titan 34D accident (1986) was used to calibrate these entrainment settings for a prototypic liquid propellant accident, while the recent Johns Hopkins University Applied Physics Laboratory (JHU/APL, or simply APL) large propellant block tests (2012) were used to calibrate the entrainment settings for prototypic solid propellant accidents. North American Meteorology (NAM) formatted weather data profiles are used by IAT to determine the local buoyancy force balance. The IAT comparisons for the APL solid propellant tests illustrate the sensitivity of the plume elevation to the weather profiles; that is, the weather profile is a dominant factor in determining the plume elevation. The IAT code performed remarkably well and is considered validated for neutral weather conditions.

  2. Fluid/structure interaction and TUF code validation

    International Nuclear Information System (INIS)

    Toong, T.

    1997-01-01

    In the course of analysis of Steam Generator (SG) divider plate integrity under Loss-Of-Coolant-Accident (LOCA) conditions, it became clear that the pressure differential across the divider plate is strongly affected by the ensuing movement of the divider plate. The effect of the divider plate movement on the thermal-hydraulics has been studied in detail at Ontario Hydro. In this paper, emphasis is placed on the fundamental physics involved in the phenomena. Two physics problems that can be solved analytically, and that may be used for code validation, are presented: 1) the first relates to the fundamental physics of pressure wave propagation generated by the motion of a piston in a pipe; 2) the second deals with a lumped volume (or node in simulation codes) with inlet and outlet pipes, which is representative of the bowl of a SG with the tubes and the nozzle as inlet and outlet. Fluid/structure interaction modelling for the SG divider plate integrity study has been implemented in the TUF code at Ontario Hydro. The structure-to-fluid part of the coding is tested against these two physical problems. The results have demonstrated the code's capability for simulations of fluid/structure interaction problems. (author)

  3. Implementation of QR Code and Digital Signature to Determine the Validity of KRS and KHS Documents

    Directory of Open Access Journals (Sweden)

    Fatich Fazlur Rochman

    2017-05-01

    Universitas Airlangga students often find it difficult to verify the marks reported in the Kartu Hasil Studi (KHS, Study Result Card) or the courses taken in the Kartu Rencana Studi (KRS, Study Plan Card) if there are changes to the data in the system used by Universitas Airlangga. This complicated KRS and KHS verification process arises because the KRS and KHS documents held by students are easier to counterfeit than the data in the system. Digital signature and QR Code technology are implemented as a solution that can prove the validity of a KRS or KHS. The KRS and KHS validation system was developed using digital signatures and QR Codes. A QR Code is a type of matrix code developed to allow its contents to be decoded at high speed, while the digital signature serves as a marker on the data to ensure that the data are original. The verification process is divided into two types: reading the digital signature, and printing the document, which works by scanning the data from the QR Code. Applying the system requires the addition of the QR Code to the KRS and KHS, as well as a readiness of human resources.
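
    A minimal sketch of the sign-and-verify flow, using an HMAC purely as a stand-in for the asymmetric digital signature described in the paper; the field names and the secret key are hypothetical:

      import base64
      import hashlib
      import hmac
      import json

      SECRET_KEY = b"registrar-secret"   # stand-in; the paper uses a true (public-key) digital signature

      def sign_khs(record: dict) -> str:
          """Produce the payload to embed in the document's QR Code: record data plus a tag."""
          data = json.dumps(record, sort_keys=True).encode()
          tag = hmac.new(SECRET_KEY, data, hashlib.sha256).digest()
          return json.dumps({"record": record, "sig": base64.b64encode(tag).decode()})

      def verify_khs(qr_payload: str) -> bool:
          """Recompute the tag from the scanned QR payload and compare with the embedded one."""
          obj = json.loads(qr_payload)
          data = json.dumps(obj["record"], sort_keys=True).encode()
          expected = hmac.new(SECRET_KEY, data, hashlib.sha256).digest()
          return hmac.compare_digest(expected, base64.b64decode(obj["sig"]))

      payload = sign_khs({"nim": "0812345", "course": "IF1234", "grade": "A"})
      print(verify_khs(payload))   # True for an unmodified document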

  4. Gamma streaming experiments for validation of Monte Carlo code

    International Nuclear Information System (INIS)

    Thilagam, L.; Mohapatra, D.K.; Subbaiah, K.V.; Iliyas Lone, M.; Balasubramaniyan, V.

    2012-01-01

    Inhomogeneities in shield structures lead to a considerable amount of leakage radiation (streaming), increasing the radiation levels in accessible areas. Development work on experimental as well as computational methods for quantifying this streaming radiation is still continuing. The Monte Carlo based radiation transport code MCNP is the usual tool for modeling and analyzing such problems involving complex geometries. In order to validate this computational method for streaming analysis, it is necessary to carry out experimental measurements simulating inhomogeneities, such as the ducts and voids present in bulk shields, for typical cases. The data thus generated are analysed by simulating the experimental setup with the MCNP code, and optimized input parameters for the code for solving similar radiation streaming problems will be formulated. Comparison with experimental data obtained from radiation streaming experiments through ducts will give a set of thumb rules and analytical fits for total radiation dose rates within and outside the duct. The present study highlights the validation of the MCNP code through gamma streaming experiments carried out with ducts of various shapes and dimensions. Overall, the present study throws light on the suitability of the MCNP code for the analysis of gamma radiation streaming problems for all duct configurations considered. In the present study, only dose rate comparisons have been made; studies on the spectral comparison of streaming radiation are in progress. It is also planned to repeat the experiments with various shield materials. Since penetrations and ducts through bulk shields are unavoidable in an operating nuclear facility, the results of this kind of radiation streaming simulation and experiment will be very useful in shield structure optimization without compromising radiation safety

  5. Criticality Safety Code Validation with LWBR’s SB Cores

    Energy Technology Data Exchange (ETDEWEB)

    Putman, Valerie Lee

    2003-01-01

    The first set of critical experiments from the Shippingport Light Water Breeder Reactor Program included eight simple-geometry critical cores built with 233UO2-ZrO2, 235UO2-ZrO2, ThO2, and ThO2-233UO2 nuclear materials. These cores are evaluated, described, and modeled to provide benchmarks and validation information for INEEL criticality safety calculation methodology. In addition to consistency with INEEL methodology, benchmark development and nuclear data are consistent with International Criticality Safety Benchmark Evaluation Project methodology. Section 1 of this report introduces the experiments and the reason they are useful for validating some INEEL criticality safety calculations. Section 2 provides detailed experiment descriptions based on currently available experiment reports. Section 3 identifies criticality safety validation requirement sources and summarizes the requirements that most affect this report. Section 4 identifies relevant hand calculation and computer code calculation methodologies used in the experiment evaluation, benchmark development, and validation calculations. Section 5 provides a detailed experiment evaluation; this section identifies resolutions for currently unavailable and discrepant information and reports calculated experiment uncertainty effects. Section 6 describes the developed benchmarks, including calculated sensitivities to various benchmark features and parameters. Section 7 summarizes validation results. Appendices describe various assumptions and their bases, list experimenter calculation results for items that were independently calculated for this validation work, report other information gathered and developed by SCIENTEC personnel while evaluating these same experiments, and list benchmark sample input and miscellaneous supplementary data.
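
    For illustration, the basic validation statistic derived from such benchmark cores, the calculational bias and its scatter over the benchmark set, can be sketched as below with hypothetical keff values; a licensing-quality upper subcritical limit also involves trending analysis and administrative margins:

      import statistics

      def calculational_bias(k_calc, k_exp):
          """Mean bias and its scatter from a set of benchmark calculations (illustrative only)."""
          diffs = [c - e for c, e in zip(k_calc, k_exp)]
          return statistics.mean(diffs), statistics.stdev(diffs)

      # Hypothetical keff results for eight benchmark cores (calculated vs. benchmark-model keff).
      k_calc = [0.9982, 0.9975, 1.0011, 0.9968, 0.9990, 0.9961, 1.0003, 0.9979]
      k_exp  = [1.0000, 0.9998, 1.0002, 0.9995, 1.0001, 0.9993, 1.0004, 0.9999]
      bias, sigma = calculational_bias(k_calc, k_exp)
      print(f"bias = {bias:+.4f}, sigma = {sigma:.4f}")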

  6. Validation of infant immunization billing codes in administrative data.

    Science.gov (United States)

    Schwartz, Kevin L; Tu, Karen; Wing, Laura; Campitelli, Michael A; Crowcroft, Natasha S; Deeks, Shelley L; Wilson, Sarah E; Wilson, Kumanan; Gemmill, Ian; Kwong, Jeffrey C

    2015-01-01

    Ontario has a single payer provincial health insurance program. Administrative data may provide a potentially robust source of information for post-marketing vaccine studies. Vaccine-specific immunization billing codes were introduced in 2011. Our objective was to validate Ontario's universal health care administrative datasets to assess infant immunization status. Electronic medical record data from the Electronic Medical Record Administrative data Linked Database (EMRALD) was used as the reference standard to calculate performance characteristics of the Ontario Health Insurance Plan (OHIP) database vaccine-specific and general immunization codes for 4 primary infant immunizations: diphtheria, tetanus, acellular pertussis, inactivated polio, Haemophilus influenzae type B (DTaP-IPV-Hib) combination vaccine, pneumococcal conjugate vaccine, measles, mumps, rubella (MMR) vaccine, and meningococcal conjugate serogroup C vaccine. OHIP billing claims had specificity ranging from 81% to 92%, sensitivity 70% to 83%, positive predictive value (PPV) 97% to 99%, and negative predictive value (NPV) 13% to 46% for identifying the various specific vaccines in administrative data. For cohorts vaccinated in the new code introduction phase, using both the vaccine-specific and general codes had higher sensitivity than the vaccine-specific codes alone. In conclusion, immunization billing claims from administrative data in Ontario had high specificity and PPV, moderate sensitivity, and low NPV. This study identifies some of the applications of utilizing administrative data for post-marketing vaccine studies. However, limitations of these data decrease their utility for measuring vaccine coverage and effectiveness. Therefore, the establishment of a comprehensive and linkable immunization registry should be a provincial priority.

  7. Guide to Using the WIND Toolkit Validation Code

    Energy Technology Data Exchange (ETDEWEB)

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

    In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density at more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at 2-km and 5-minute resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, so that corrective steps can be taken. Therefore, we provide validation code written in R that will be made public to give users tools to validate data at their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well observational data compare to model data. Ideally, validation will confirm beneficial locations to utilize wind energy and encourage regional wind integration studies using the WIND Toolkit.
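
    The error metrics listed above can be reproduced in a few lines; the sketch below re-expresses them in Python with made-up wind speeds, whereas the released validation code itself is written in R:

      import numpy as np

      def validation_metrics(model, obs):
          """Wind-speed error metrics of the kind computed by the WIND Toolkit validation code."""
          err = model - obs
          return {
              "bias": err.mean(),
              "rmse": np.sqrt((err ** 2).mean()),
              "crmse": np.sqrt(((err - err.mean()) ** 2).mean()),   # centered RMSE
              "mae": np.abs(err).mean(),
              "percent_error": 100.0 * err.mean() / obs.mean(),
          }

      # Hypothetical hourly wind speeds (m/s) at one site.
      obs = np.array([5.1, 6.3, 7.8, 4.2, 9.5, 8.1])
      model = np.array([5.6, 6.0, 8.3, 4.8, 9.1, 8.9])
      print(validation_metrics(model, obs))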

  8. Software and codes for analysis of concentrating solar power technologies.

    Energy Technology Data Exchange (ETDEWEB)

    Ho, Clifford Kuofei

    2008-12-01

    This report presents a review and evaluation of software and codes that have been used to support Sandia National Laboratories concentrating solar power (CSP) program. Additional software packages developed by other institutions and companies that can potentially improve Sandia's analysis capabilities in the CSP program are also evaluated. The software and codes are grouped according to specific CSP technologies: power tower systems, linear concentrator systems, and dish/engine systems. A description of each code is presented with regard to each specific CSP technology, along with details regarding availability, maintenance, and references. A summary of all the codes is then presented with recommendations regarding the use and retention of the codes. A description of probabilistic methods for uncertainty and sensitivity analyses of concentrating solar power technologies is also provided.

  9. Validation of thermal hydraulic codes for fusion reactors safety

    International Nuclear Information System (INIS)

    Sardain, P.; Gulden, W.; Massaut, V.; Takase, K.; Merill, B.; Caruso, G.

    2006-01-01

    A significant effort has been made worldwide on the validation of thermal hydraulic codes that can be used for the safety assessment of fusion reactors. This work is an item of an implementing agreement under the umbrella of the International Energy Agency; the European part is supported by EFDA. Several programmes related to transient analysis in water-cooled fusion reactors were run in order to assess the capabilities of the codes to treat the main physical phenomena governing the accident sequences related to water/steam discharge into the vacuum vessel or the cryostat. The typical phenomena are the pressurization of a volume at low initial pressure, critical flow, flashing, relief into an expansion volume, condensation of vapor in a pressure suppression system, formation of ice on a cryogenic structure, and heat transfer between walls and fluid in various thermodynamic conditions. · A benchmark exercise was carried out involving different types of codes, from homogeneous equilibrium to six-equation non-equilibrium models; several cases were defined, each one focusing on a particular phenomenon. · The ICE (Ingress of Coolant Event) facility has been operated in Japan; it simulated an in-vessel LOCA and the discharge of steam into a pressure suppression system. · The EVITA (European Vacuum Impingement Test Apparatus) facility has been operated in France; it simulated ingress of coolant into the cryostat, i.e. into a volume at low initial pressure containing surfaces at cryogenic temperature. This paper gives the main lessons learned from these programmes, in particular the possibilities for improving the computer codes and extending their capabilities. For example, the water properties have been extended below the triple point, ice formation models have been implemented, and work has also been done on condensation models. The remaining needs for R&D are also highlighted. (author)

  10. Validation of the analytical methods in the LWR code BOXER for gadolinium-loaded fuel pins

    International Nuclear Information System (INIS)

    Paratte, J.M.; Arkuszewski, J.J.; Kamboj, B.K.; Kallfelz, J.M.; Abdel-Khalik, S.I.

    1990-01-01

    Due to the very high absorption occurring in gadolinium-loaded fuel pins, calculations of lattices with such pins present are a demanding test of the analysis methods in light water reactor (LWR) cell and assembly codes. Considerable effort has, therefore, been devoted to the validation of code methods for gadolinia fuel. The goal of the work reported in this paper is to check the analysis methods in the LWR cell/assembly code BOXER and its associated cross-section processing code ETOBOX, by comparison of BOXER results with those from a very accurate Monte Carlo calculation for a gadolinium benchmark problem. Initial results of such a comparison have been previously reported. However, the Monte Carlo calculations, done with the MCNP code, were performed at Los Alamos National Laboratory using ENDF/B-V data, while the BOXER calculations were performed at the Paul Scherrer Institute using JEF-1 nuclear data. This difference in the basic nuclear data used for the two calculations, caused by the restricted nature of these evaluated data files, led to associated uncertainties in a comparison of the results for methods validation. In the joint investigations at the Georgia Institute of Technology and PSI, such uncertainty in this comparison was eliminated by using ENDF/B-V data for BOXER calculations at Georgia Tech

  11. Evaluation and validation of criticality codes for fuel dissolver calculations

    International Nuclear Information System (INIS)

    Santamarina, A.; Smith, H.J.; Whitesides, G.E.

    1991-01-01

    During the past ten years an OECD/NEA Criticality Working Group has examined the validity of criticality safety computational methods. International calculation tools which were shown to be valid in systems for which experimental data existed were demonstrated to be inadequate when extrapolated to fuel dissolver media. A theoretical study of the main physical parameters involved in fuel dissolution calculations was performed, i.e. range of moderation, variation of pellet size and the fuel double heterogeneity effect. The APOLLO/PIC method developed to treat this latter effect permits us to supply the actual reactivity variation with pellet dissolution and to propose international reference values. The disagreement among contributors' calculations was analyzed through a neutron balance breakdown, based on three-group microscopic reaction rates. The results pointed out that fast and resonance nuclear data in criticality codes are not sufficiently reliable. Moreover the neutron balance analysis emphasized the inadequacy of the standard self-shielding formalism to account for 238U resonance mutual self-shielding in the pellet-fissile liquor interaction. The benchmark exercise has resolved a potentially dangerous inadequacy in dissolver calculations. (author)

  12. INL Experimental Program Roadmap for Thermal Hydraulic Code Validation

    Energy Technology Data Exchange (ETDEWEB)

    Glenn McCreery; Hugh McIlroy

    2007-09-01

    Advanced computer modeling and simulation tools and protocols will be heavily relied on for a wide variety of system studies, engineering design activities, and other aspects of the Next Generation Nuclear Plant (NGNP) Very High Temperature Reactor (VHTR), the DOE Global Nuclear Energy Partnership (GNEP), and light-water reactors. The goal is for all modeling and simulation tools to be demonstrated accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are to be used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Recent literature identifies specific experimental principles that must be followed in order to ensure that experimental data meet the standards required for a “benchmark” database. Even for well-conducted experiments, missing experimental details, such as geometrical definition, data reduction procedures, and manufacturing tolerances, have led to poor benchmark calculations. The INL has a long and deep history of research in thermal hydraulics, especially in the 1960s through 1980s when many programs such as LOFT and Semiscale were devoted to light-water reactor safety research, the EBR-II fast reactor was in operation, and a strong geothermal energy program was established. The past can serve as a partial guide for reinvigorating thermal hydraulic research at the laboratory. However, new research programs need to fully incorporate modern experimental methods such as measurement techniques using the latest instrumentation, computerized data reduction, and scaling methodology. The path forward for establishing experimental research for code model validation will require benchmark experiments conducted in suitable facilities located at the INL. This document describes thermal hydraulic facility requirements and candidate buildings and presents examples of suitable validation experiments related

  13. Validation of the TIARA code to tritium inventory data

    International Nuclear Information System (INIS)

    Billone, M.C.

    1994-03-01

    The TIARA code has been developed to predict tritium inventory in Li2O breeder ceramic and to predict purge exit flow rate and composition. Inventory predictions are based on models for bulk diffusion, surface desorption, solubility and precipitation. Parameters for these models are determined from the results of laboratory annealing studies on unirradiated and irradiated Li2O. Inventory data from in-reactor purge flow tests are used for model improvement, fine-tuning of model parameters and validation. In the current work, the inventory measurement near the purge inlet from the BEATRIX-II thin-ring sample is used to fine-tune the surface desorption model parameters for T > 470 degrees C, and the inventory measurement near the midplane from VOM-15H is used to fine-tune the moisture solubility model parameters. Predictions are then validated against the remaining inventory data from EXOTIC-2 (1 point), SIBELIUS (3 axial points), VOM-15H (2 axial points), CRITIC-1 (4 axial points), BEATRIX-II thin ring (3 axial points) and BEATRIX-II thick pellet (5 radial points). Thus, of the 20 data points, two were used for fine-tuning model parameters and 18 were used for validation. The inventory data span the range of 0.05--1.44 wppm with an average of 0.48 wppm. The data pertain to samples whose end-of-life temperatures were in the range of 490--1000 degrees C. On average, the TIARA predictions agree quite well with the data (< 0.02 wppm difference). However, the root-mean-square deviation is 0.44 wppm, mostly due to over-predictions for the SIBELIUS samples and the higher-temperature radial samples from the BEATRIX-II thick pellet

  14. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    Directory of Open Access Journals (Sweden)

    David A Springate

    Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.

  15. Validation of Monte Carlo Geant4 code for a 6 MV Varian linac

    Directory of Open Access Journals (Sweden)

    Jaafar EL Bakkali

    2017-01-01

    This study is aimed at validating the Monte Carlo Geant4.9.4 code for a 6 MV Varian linac configured with a 10 × 10 cm2 radiation field. For this purpose a user-friendly Geant4 code called G4Linac has been developed from scratch, allowing accurate modeling of a 6 MV Varian linac head and dose calculation in a homogeneous water phantom. Discarding the other accelerator parts where electrons are created, accelerated and deviated, a virtual source of 6 MeV electrons was considered. The parameters associated with this virtual source (mean energy, sigma, and full width at half maximum) are often unknown; they were adjusted by following our own methodology, developed so that the optimization phase is fast and efficient: a small number of Monte Carlo simulations were run simultaneously on a cluster of computers thanks to the Rocks cluster software. The calculated dosimetric functions in a 40 × 40 × 40 cm3 water phantom were compared to the measured ones using the gamma index method, with the gamma criterion fixed at 2%/1 mm. After optimization, it was observed that the proper mean energy, sigma, and full width at half maximum are 5.6 MeV, 0.42 MeV and 1.177 mm, respectively. Furthermore, we have made some changes to an existing bremsstrahlung splitting technique, through which we succeeded in reducing the CPU time spent on the treatment head simulation by about a factor of five.
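
    The gamma-index comparison mentioned above combines a dose-difference criterion (2%) with a distance-to-agreement criterion (1 mm). A minimal one-dimensional sketch of a global gamma calculation is shown below; the dose profiles and sampling are hypothetical placeholders, not the G4Linac data:

    ```python
    import numpy as np

    def gamma_1d(dose_ref, dose_eval, positions, dose_tol=0.02, dist_tol_mm=1.0):
        """1-D global gamma index: for each reference point, search the evaluated
        profile for the minimum combined dose-difference / distance metric."""
        d_max = dose_ref.max()                          # global normalization dose
        gammas = np.empty_like(dose_ref)
        for i, (x_r, d_r) in enumerate(zip(positions, dose_ref)):
            dd = (dose_eval - d_r) / (dose_tol * d_max)  # dose-difference term
            dx = (positions - x_r) / dist_tol_mm         # distance-to-agreement term
            gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()
        return gammas

    # Hypothetical depth-dose profiles sampled every 0.5 mm
    x = np.arange(0.0, 10.0, 0.5)
    ref = np.exp(-0.05 * x)
    ev = ref * (1.0 + 0.01 * np.sin(x))
    g = gamma_1d(ref, ev, x)
    print(f"pass rate (gamma <= 1): {100.0 * (g <= 1.0).mean():.1f}%")
    ```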

  16. Reactor Fuel Isotopics and Code Validation for Nuclear Applications

    Energy Technology Data Exchange (ETDEWEB)

    Francis, Matthew W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Weber, Charles F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pigni, Marco T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gauld, Ian C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-02-01

    Experimentally measured isotopic concentrations of well characterized spent nuclear fuel (SNF) samples have been collected and analyzed by previous researchers. These sets of experimental data have been used extensively to validate the accuracy of depletion code predictions for given sets of burnups, initial enrichments, and varying power histories for different reactor types. The purpose of this report is to present the diversity of data in a concise manner and summarize the current accuracy of depletion modeling. All calculations performed for this report were done using the Oak Ridge Isotope GENeration (ORIGEN) code, an internationally used irradiation and decay code solver within the SCALE comprehensive modeling and simulation code. The diversity of data given in this report includes key actinides, stable fission products, and radioactive fission products. In general, when using the current ENDF/B-VII.0 nuclear data libraries in SCALE, the major actinides are predicted to within 5% of the measured values. Large improvements were seen for several of the curium isotopes when using improved cross section data found in evaluated nuclear data file ENDF/B-VII.0 as compared to ENDF/B-V-based results. The impact of the flux spectrum on the plutonium isotope concentrations as a function of burnup was also shown. The general accuracy noted for the actinide samples for reactor types with burnups greater than 5,000 MWd/MTU was not observed for the low-burnup Hanford B samples. More work is needed in understanding these large discrepancies. The stable neodymium and samarium isotopes were predicted to within a few percent of the measured values. Large improvements were seen in prediction for a few of the samarium isotopes when using the ENDF/B-VII.0 libraries compared to results obtained with ENDF/B-V libraries. Very accurate predictions were obtained for 133Cs and 153Eu. However, the predicted values for the stable ruthenium and rhodium isotopes varied

  17. Analytical validation of the CACECO containment analysis code

    International Nuclear Information System (INIS)

    Peak, R.D.

    1979-08-01

    The CACECO containment analysis code was developed to predict the thermodynamic responses of LMFBR containment facilities to a variety of accidents. This report covers the verification of the CACECO code by problems that can be solved by hand calculations or by reference to textbook and literature examples. The verification concentrates on the accuracy of the material and energy balances maintained by the code and on the independence of the four cells analyzed by the code so that the user can be assured that the code analyses are numerically correct and independent of the organization of the input data submitted to the code

  18. Status of the GAMMA-FR code validation - TES pipe rupture accident of HCCR TBS

    International Nuclear Information System (INIS)

    Jin, Hyung Gon; Lee, Dong Won; Lee, Eo Hwak; Yoon, Jae Sung; Kim, Suk Kwon; Merrill, Brad J.; Ahn, Mu-Young; Cho, Seungyon

    2015-01-01

    A GAMMA-FR code-to-code validation has been conducted and shows reasonable agreement; however, the near-wall effect on the effective thermal conductivity needs to be investigated to obtain better results. The GAMMA-FR code is scheduled for validation during the next two years under a UCLA-NFRI collaboration. Through this research, GAMMA-FR will be validated against representative fusion experiments and reference accident cases. The GAMMA-FR (Gas Multicomponent Mixture Transient Analysis for Fusion Reactors) code is an in-house system analysis code to predict the thermal hydraulic and chemical reaction phenomena expected to occur during thermo-fluid transients in a nuclear fusion system. A safety analysis of the Korea TBS (Test Blanket System) for ITER (International Thermonuclear Experimental Reactor) is underway using this code. This paper describes the validation strategy of GAMMA-FR and the current status of the validation study with respect to the 'TES pipe rupture accident of ITER TBM'

  19. Misclassification of hypertrophic cardiomyopathy: validation of diagnostic codes

    Directory of Open Access Journals (Sweden)

    Magnusson P

    2017-08-01

    Purpose: To validate diagnostic codes for hypertrophic cardiomyopathy (HCM), analyze misclassifications, and estimate the prevalence of HCM in an unselected Swedish regional cohort. Patients and methods: Using the hospitals' electronic medical records (used for the Swedish National Patient Register), we identified 136 patients from 2006 to 2016 with the HCM-related codes I42.1 and I42.2 (International Classification of Diseases). Results: Of a total of 129 residents in the catchment area, 88 patients were correctly classified as HCM (positive predictive value 68.2%) and 41 patients (31.8%) were misclassified as HCM. Among the 88 HCM patients (52.2% males), 74 were alive and 14 were dead (15.9%). This yields an HCM prevalence of 74/183,337, that is, 4.0 diagnosed cases per 10,000 in the adult population aged ≥18 years. The underlying diagnoses of misclassified cases were mainly hypertension (31.7%) and aortic stenosis (22.0%). Other types of cardiomyopathies accounted for several cases of misclassification: dilated (nonischemic or ischemic), left ventricular noncompaction, and Takotsubo. Miscellaneous diagnoses were amyloidosis, pulmonary stenosis combined with ventricular septal defect, aortic insufficiency, athlete's heart, and atrioventricular conduction abnormality. The mean age was not significantly different between HCM and misclassified patients (65.8±15.8 vs 70.1±13.4 years; P=0.177). There were 47.8% females among HCM and 60.8% females among

  20. Unfolding code for neutron spectrometry based on neural nets technology

    International Nuclear Information System (INIS)

    Ortiz R, J. M.; Vega C, H. R.

    2012-10-01

    The most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches, and novel methods based on Artificial Neural Networks have been widely investigated. In this work, a neutron spectrum unfolding code based on neural nets technology is presented. This unfolding code, called Neutron Spectrometry and Dosimetry by means of Artificial Neural Networks, was designed in a graphical interface under the LabVIEW programming environment. The core of the code is an embedded neural network architecture, previously optimized by the Robust Design of Artificial Neural Networks methodology. The code is easy to use, friendly, and intuitive for the user. It was designed for a Bonner sphere system based on a 6LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. The main feature of the code is that, as input data, only seven count rates measured with a Bonner sphere spectrometer are required to simultaneously unfold the 60 energy bins of the neutron spectrum and to calculate 15 dosimetric quantities for radiation protection purposes. The code generates a full report in html format with all relevant information. (Author)
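
    The mapping performed by the embedded network, from seven Bonner-sphere count rates to 60 spectrum bins plus 15 dosimetric quantities, can be sketched as a small feed-forward regressor. The architecture, training data, and hyperparameters below are hypothetical placeholders, not the optimized network described in the abstract:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Hypothetical training set: rows are 7 Bonner-sphere count rates, targets
    # are 60 spectrum bins + 15 dosimetric quantities = 75 outputs.
    rng = np.random.default_rng(0)
    X_train = rng.uniform(0.1, 100.0, size=(500, 7))   # placeholder count rates
    Y_train = rng.uniform(0.0, 1.0, size=(500, 75))    # placeholder targets

    # Small feed-forward net standing in for the embedded architecture.
    net = MLPRegressor(hidden_layer_sizes=(30,), max_iter=2000, random_state=0)
    net.fit(X_train, Y_train)

    measured_rates = rng.uniform(0.1, 100.0, size=(1, 7))  # one measurement
    prediction = net.predict(measured_rates)
    spectrum, doses = prediction[0, :60], prediction[0, 60:]
    print(spectrum.shape, doses.shape)
    ```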

  1. Biometric iris image acquisition system with wavefront coding technology

    Science.gov (United States)

    Hsieh, Sheng-Hsun; Yang, Hsi-Wen; Huang, Shao-Hung; Li, Yung-Hui; Tien, Chung-Hao

    2013-09-01

    Biometric signatures for identity recognition have been practiced for centuries. The personal attributes used for a biometric identification system can be classified into two areas: one is based on physiological attributes, such as DNA, facial features, retinal vasculature, fingerprint, hand geometry and iris texture; the other depends on individual behavioral attributes, such as signature, keystroke, voice and gait style. Among these features, iris recognition is one of the most attractive approaches due to its inherent randomness, texture stability over a lifetime, high entropy density and non-invasive acquisition. While the performance of iris recognition on high-quality images is well investigated, few studies have addressed how iris recognition performs on non-ideal image data, especially when the data are acquired under challenging conditions, such as long working distance, dynamic movement of subjects and uncontrolled illumination. There are three main contributions in this paper. Firstly, the optical system parameters, such as magnification and field of view, were optimally designed through first-order optics. Secondly, the irradiance constraints were derived from the optical conservation theorem; through the relationship between the subject and the detector, we could estimate the working-distance limitation when the camera lens and CCD sensor are known. The working distance is set to 3 m in our system, with a pupil diameter of 86 mm and a CCD irradiance of 0.3 mW/cm2. Finally, we employed a hybrid scheme combining eye tracking with a pan-and-tilt system, wavefront coding technology, filter optimization and post-signal recognition to implement a robust iris recognition system in dynamic operation. The blurred image was restored to ensure recognition accuracy over a 3 m working distance with 400 mm focal length and F/6.3 optics. The simulation results as well as experiments validate the proposed code

  2. Validation and applicability of the 3D core kinetics and thermal hydraulics coupled code SPARKLE

    International Nuclear Information System (INIS)

    Miyata, Manabu; Maruyama, Manabu; Ogawa, Junto; Otake, Yukihiko; Miyake, Shuhei; Tabuse, Shigehiko; Tanaka, Hirohisa

    2009-01-01

    The SPARKLE code is a coupled code system based on three individual codes whose physical models have already been verified and validated. Mitsubishi Heavy Industries (MHI) confirmed the coupling calculation, including data transfer and the total reactor coolant system (RCS) behavior of the SPARKLE code. The confirmation uses the OECD/NEA MSLB benchmark problem, which is based on Three Mile Island Unit 1 (TMI-1) nuclear power plant data. This benchmark problem has been used to verify coupled codes developed and used by many organizations. Objectives of the benchmark program are as follows. Phase 1 is to compare the results of the system transient code using point kinetics. Phase 2 is to compare the results of the coupled three-dimensional (3D) core kinetics code and 3D core thermal-hydraulics (T/H) code, and Phase 3 is to compare the results of the combined coupled system transient code, 3D core kinetics code, and 3D core T/H code as a total validation of the coupled calculation. The calculation results of the SPARKLE code indicate good agreement with other benchmark participants' results. Therefore, the SPARKLE code is validated through these benchmark problems. In anticipation of applying the SPARKLE code to licensing analyses, MHI and Japanese PWR utilities have established a safety analysis method regarding the calculation conditions such as power distributions, reactivity coefficients, and event-specific features. (author)

  3. Reviews on Technology and Standard of Spatial Audio Coding

    Directory of Open Access Journals (Sweden)

    Ikhwana Elfitri

    2017-03-01

    Market demand for more impressive entertainment media has motivated the delivery of three-dimensional (3D) audio content to home consumers through Ultra High Definition TV (UHDTV), the next generation of TV broadcasting, where spatial audio coding plays a fundamental role. This paper reviews the fundamental concepts of spatial audio coding, including technology, standards, and applications. The basic principle of object-based audio reproduction systems is also elaborated and compared to the traditional channel-based approach, to provide a good understanding of this popular interactive audio reproduction system, which gives end users the flexibility to render their own preferred audio composition.

  4. Thermal-hydraulic codes validation for safety analysis of NPPs with RBMK

    International Nuclear Information System (INIS)

    Brus, N.A.; Ioussoupov, O.E.

    2001-01-01

    This work is devoted to the validation of western thermal-hydraulic codes (RELAP5/MOD3.2 and ATHLET 1.1 Cycle C) as applied to Russian-designed light water reactors of the RBMK type. Such validation is needed because of the features of the RBMK reactor design and thermal-hydraulics in comparison with the PWR and BWR reactors for which these codes were developed and validated. The validation studies consist of comparing calculation results obtained with the thermal-hydraulic codes against experimental data obtained earlier at thermal-hydraulic test facilities. (authors)

  5. Production ready feature recognition based automatic group technology part coding

    Energy Technology Data Exchange (ETDEWEB)

    Ames, A.L.

    1990-01-01

    During the past four years, a feature recognition based expert system for automatically performing group technology part coding from solid model data has been under development. The system has become a production-quality tool, capable of quickly generating the geometry-based portions of a part code with no human intervention. It has been tested on over 200 solid models, half of which are models of production Sandia designs. Its performance rivals that of humans performing the same task, often surpassing them in speed and uniformity. The feature recognition capability developed for part coding is being extended to support other applications, such as manufacturability analysis, automatic decomposition (for finite element meshing and machining), and assembly planning. Initial surveys of these applications indicate that the current capability will provide a strong basis for other applications and that extensions toward more global geometric reasoning and tighter coupling with solid modeler functionality will be necessary.

  6. Development and validation of a fuel performance analysis code

    International Nuclear Information System (INIS)

    Majalee, Aaditya V.; Chaturvedi, S.

    2015-01-01

    CAD has been developing a computer code, 'FRAVIZ', for the calculation of the steady-state thermomechanical behaviour of nuclear reactor fuel rods. It contains four major modules, viz. a Thermal module, a Fission Gas Release module, a Material Properties module and a Mechanical module. All four modules are coupled to each other, and the output of each module is fed back to the others to obtain a self-consistent evolution in time. The computer code has been checked against two FUMEX benchmarks. Modelling fuel performance in the Advanced Heavy Water Reactor would require additional inputs related to the fuel and some modifications to the code. (author)
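
    The feedback among the four modules amounts to a fixed-point iteration: thermal, fission-gas-release, material-property, and mechanical results are exchanged until the state stops changing. The sketch below illustrates such a self-consistent coupling loop with toy placeholder physics; none of the functions or numbers are FRAVIZ internals:

    ```python
    # Schematic self-consistent coupling of four fuel-rod modules for one step.
    # Every function below is a toy placeholder, not a FRAVIZ model.
    def material_properties(temp):
        return 4.0 / (1.0 + temp / 1000.0)                     # toy conductivity

    def thermal_module(power, conductivity):
        return 600.0 + power / conductivity                     # toy fuel temperature, K

    def fission_gas_release(temp):
        return min(1.0, max(0.0, (temp - 1000.0) / 2000.0))     # toy release fraction

    def mechanical_module(temp, gas_release):
        return 1e-5 * (temp - 600.0) * (1.0 + gas_release)      # toy clad strain

    def advance_step(power, temp_guess=900.0, tol=0.1, max_iter=50):
        temp = temp_guess
        for _ in range(max_iter):
            k = material_properties(temp)
            new_temp = thermal_module(power, k)
            gas = fission_gas_release(new_temp)
            strain = mechanical_module(new_temp, gas)
            if abs(new_temp - temp) < tol:                      # self-consistent state
                return new_temp, gas, strain
            temp = new_temp
        return temp, gas, strain

    print(advance_step(power=2000.0))
    ```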

  7. Validation of fuel rod performance analysis code COPERNIC

    International Nuclear Information System (INIS)

    Han Yebin; Wang Jun; Ren Qisen; Liu Tong; Zhou Yuemin

    2012-01-01

    The IAEA has sponsored the FUMEX III (FUel Modeling at Extended Burnup) coordinated research project to improve the computer codes used for fuel behaviour simulation. As one of over thirty international participants, CGNPC has been engaged in testing and developing the fuel modelling code COPERNIC against data and cases provided by the IAEA and OECD/NEA. Investigations focused on high-burnup and transient analysis, including dimensional change modelling. Data from 6 calculation cases have been compared with COPERNIC predictions so far. Because the tests had different purposes, these cases had different designs, including rod refabrication and annular pellets, and were run under different operating conditions, including normal operation and ramp tests. Comparisons and preliminary analyses between predicted and measured results, such as fuel temperature, cladding outer diameter, cladding corrosion layer thickness, and fission gas release, have been conducted; they demonstrate that the COPERNIC code is applicable to different rod designs under different operating conditions and provides accurate predictions. (authors)

  8. Standards, building codes, and certification programs for solar technology applications

    Energy Technology Data Exchange (ETDEWEB)

    Riley, J. D.; Odland, R.; Barker, H.

    1979-07-01

    This report is a primer on solar standards development. It explains the development of standards, building code provisions, and certification programs and their relationship to the emerging solar technologies. These areas are important in the commercialization of solar technology because they lead to the attainment of two goals: the development of an industry infrastructure and consumer confidence. Standards activities in the four phases of the commercialization process (applied research, development, introduction, and diffusion) are discussed in relation to institutional issues. Federal policies have been in operation for a number of years to accelerate the development process for solar technology. These policies are discussed in light of the Office of Management and Budget (OMB) Circular on federal interaction with the voluntary consensus system, and in light of current activities of DOE, HUD, and other interested federal agencies. The appendices cover areas of specific interest to different audiences: activities on the state and local level; and standards, building codes, and certification programs for specific technologies. In addition, a contract for the development of a model solar document let by DOE to a model consortium is excerpted in the Appendix.

  9. Photovoltaic and solar-thermal technologies in residential building codes, tackling building code requirements to overcome the impediments to applying new technologies

    Energy Technology Data Exchange (ETDEWEB)

    Wortman, D.; Echo-Hawk, L. [authors] and Wiechman, J.; Hayter, S.; Gwinner, D. [eds.]

    1999-10-04

    This report describes the building code requirements and impediments to applying photovoltaic (PV) and solar-thermal technologies in residential buildings (one- or two-family dwellings). It reviews six modern model building codes that represent the codes to be adopted by most locations in the coming years: International Residential Code, First Draft (IRC), International Energy Conservation Code (IECC), International Mechanical Code (IMC), International Plumbing Code (IPC), International Fuel Gas Code (IFGC), and National Electrical Code (NEC). The IRC may become the basis for many of the building codes in the United States after it is released in 2000, and it references the other codes that will also likely become applicable at that time. These codes are reviewed as they apply to photovoltaic systems in buildings and building-integrated photovoltaic systems and to active-solar domestic hot-water and space-heating systems. The first discussion is on general code issues that impact these technologies, for example, solar access and sustainability. Then, secondly, the discussion investigates the relationship of the technologies to the codes, providing examples, while keeping two major issues in mind: How do the codes treat these technologies as building components? and Do the IECC and other codes allow reasonable credit for the energy impacts of the technologies? The codes can impact the implementation of the above technologies in several ways: (1) The technology is not mentioned in the codes. It may be an obstacle to implementing the technology, and the solution is to develop appropriate explicit sections or language in the codes. (2) The technology is discussed by the codes, but the language is confusing or ambiguous. The solution is to clarify the language. (3) The technology is discussed in the codes, but the discussion is spread over several sections or different codes. Practitioners may not easily find all of the relevant material that should be considered. The

  10. Improving radiopharmaceutical supply chain safety by implementing bar code technology.

    Science.gov (United States)

    Matanza, David; Hallouard, François; Rioufol, Catherine; Fessi, Hatem; Fraysse, Marc

    2014-11-01

    The aim of this study was to describe and evaluate an approach for improving radiopharmaceutical supply chain safety by implementing bar code technology. We first evaluated the current situation of our radiopharmaceutical supply chain and, by means of the ALARM protocol, analysed two dispensing errors that occurred in our department. Thereafter, we implemented a bar code system to secure selected key stages of the radiopharmaceutical supply chain. Finally, we evaluated the cost of this implementation, from overtime, to overheads, to additional radiation exposure to workers. An analysis of the events that occurred revealed a lack of identification of prepared or dispensed drugs. Moreover, the evaluation of the current radiopharmaceutical supply chain showed that the dispensation and injection steps needed to be further secured. The bar code system was used to reinforce product identification at three selected key stages: at usable stock entry; at preparation-dispensation; and during administration, allowing conformity between the labelling of the delivered product (identity and activity) and the prescription to be checked. The extra time needed for all these steps had no impact on the number and successful conduct of examinations. The investment cost was reduced (2600 euros for new material and 30 euros a year for additional supplies) because of pre-existing computing equipment. With regard to radiation exposure to workers, there was a negligible additional hand exposure with this new organization, owing to the labelling and scanning of radiolabelled preparation vials. Implementation of bar code technology is now an essential part of a global approach to securing optimum patient management.

  11. Use of benchmark criticals in fast reactor code validation

    International Nuclear Information System (INIS)

    Curtis, R.; Kelber, C.; Luck, L.; Smith, L.R.

    1980-01-01

    The problem discussed is how to check the accuracy of SIMMER code used for the analysis of hypothetical core disruptive accidents. A three-step process is used for code checking: Benchmark criticals in ZPR-9; Monte Carlo analog calculations to isolate errors arising from cross-section data and to establish a secondary standard; and comparison between the secondary standard, SIMMER neutronics, and other transport approximations, for configurations of interest. The VIM Monte Carlo Code is used as such a secondary standard. The analysis with VIM of the experiments in ZPR-9 using ENDF-B/IV cross-section data yields the following conclusions: (1) A systematic change in bias exists in the analysis going from a reference configuration to a slumped configuration. This change is larger than β and must be attributed to errors in cross-section data, since the Monte Carlo simulation reproduces every significant detail of the experiment. (2) Transport (SN) calculations show the same trends in the bias as the Monte Carlo studies. Thus, the processes used in the construction of group cross-sections appear adequate. Further, the SN-VIM agreement appears to argue against gross errors in code or input. (3) Comparison with diffusion theory (using the same cross-section set) indicates that conventional diffusion theory has an opposite change in bias. (4) The change in bias in calculating the reactivity worth of slumped fuel is dramatic: transport theory overpredicts positive worths while diffusion theory underpredicts. Thus, reactivity ramp rates at prompt critical may be substantially underpredicted if there has been substantial fuel or coolant movement and diffusion theory has been used

  12. Study of experimental validation for combustion analysis of GOTHIC code

    International Nuclear Information System (INIS)

    Lee, J. Y.; Yang, S. Y.; Park, K. C.; Jeong, S. H.

    2001-01-01

    In this study, we present lumped and subdivided GOTHIC6 code analyses of the premixed hydrogen combustion experiment at Seoul National University and compare them with the experimental results. The experimental facility has a 16367 cc free volume and a rectangular shape. The tests were performed with a unit equivalence ratio of hydrogen and air and with various igniter positions. Using the lumped and mechanistic combustion models in the GOTHIC6 code, the experiments were simulated under the same conditions. In the comparison between the experimental and calculated results, the GOTHIC6 prediction of the combustion response does not agree well with the experimental data. In terms of combustion time, the lumped combustion model of the GOTHIC6 code does not simulate the physical phenomena of combustion appropriately. In the case of the mechanistic combustion model, the combustion time is predicted well, but the calculated induction time is considerably longer than the experimental one. Also, the laminar combustion model of GOTHIC6 is deficient in simulating combustion phenomena unless the user-defined values are controlled appropriately. Furthermore, pressure is not a proper variable to characterize the three-dimensional effects of combustion

  13. Validation of the THIRMAL-1 melt-water interaction code

    International Nuclear Information System (INIS)

    Chu, C.C.; Sienicki, J.J.; Spencer, B.W.

    1995-05-01

    The THIRMAL-1 computer code has been used to calculate nonexplosive LWR melt-water interactions both in-vessel and ex-vessel. To support the application of the code and enhance its acceptability, THIRMAL-1 has been compared with available data from two of the ongoing FARO experiments at Ispra and two of the Corium Coolant Mixing (CCM) experiments performed at Argonne. THIRMAL-1 calculations for the FARO Scoping Test and Quenching Test 2 as well as the CCM-5 and -6 experiments were found to be in excellent agreement with the experiment results. This lends confidence to the modeling that has been incorporated in the code describing melt stream breakup due to the growth of both Kelvin-Helmholtz and large wave instabilities, the sizes of droplets formed, multiphase flow and heat transfer in the mixing zone surrounding and below the melt stream, as well as hydrogen generation due to oxidation of the melt metallic phase. As part of the analysis of the FARO tests, a mechanistic model was developed to calculate the prefragmentation as it may have occurred when melt relocated from the release vessel to the water surface and the model was compared with the relevant data from FARO

  14. Validation of the THIRMAL-1 melt-water interaction code

    Energy Technology Data Exchange (ETDEWEB)

    Chu, C.C.; Sienicki, J.J.; Spencer, B.W. [Argonne National Lab., IL (United States)

    1995-09-01

    The THIRMAL-1 computer code has been used to calculate nonexplosive LWR melt-water interactions both in-vessel and ex-vessel. To support the application of the code and enhance its acceptability, THIRMAL-1 has been compared with available data from two of the ongoing FARO experiments at Ispra and two of the Corium Coolant Mixing (CCM) experiments performed at Argonne. THIRMAL-1 calculations for the FARO Scoping Test and Quenching Test 2 as well as the CCM-5 and -6 experiments were found to be in excellent agreement with the experiment results. This lends confidence to the modeling that has been incorporated in the code describing melt stream breakup due to the growth of both Kelvin-Helmholtz and large wave instabilities, the sizes of droplets formed, multiphase flow and heat transfer in the mixing zone surrounding and below the melt stream, as well as hydrogen generation due to oxidation of the melt metallic phase. As part of the analysis of the FARO tests, a mechanistic model was developed to calculate the prefragmentation as it may have occurred when melt relocated from the release vessel to the water surface and the model was compared with the relevant data from FARO.

  15. HD Photo: a new image coding technology for digital photography

    Science.gov (United States)

    Srinivasan, Sridhar; Tu, Chengjie; Regunathan, Shankar L.; Sullivan, Gary J.

    2007-09-01

    This paper introduces the HD Photo coding technology developed by Microsoft Corporation. The storage format for this technology is now under consideration in the ITU-T/ISO/IEC JPEG committee as a candidate for standardization under the name JPEG XR. The technology was developed to address end-to-end digital imaging application requirements, particularly including the needs of digital photography. HD Photo includes features such as good compression capability, high dynamic range support, high image quality capability, lossless coding support, full-format 4:4:4 color sampling, simple thumbnail extraction, embedded bitstream scalability of resolution and fidelity, and degradation-free compressed domain support of key manipulations such as cropping, flipping and rotation. HD Photo has been designed to optimize image quality and compression efficiency while also enabling low-complexity encoding and decoding implementations. To ensure low complexity for implementations, the design features have been incorporated in a way that not only minimizes the computational requirements of the individual components (including consideration of such aspects as memory footprint, cache effects, and parallelization opportunities) but results in a self-consistent design that maximizes the commonality of functional processing components.

  16. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    Energy Technology Data Exchange (ETDEWEB)

    Bravenec, Ronald [Fourth State Research, Austin, TX (United States)

    2017-11-14

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  17. Validation Study of CODES Dragonfly Network Model with Theta Cray XC System

    Energy Technology Data Exchange (ETDEWEB)

    Mubarak, Misbah [Argonne National Lab. (ANL), Argonne, IL (United States); Ross, Robert B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-05-31

    This technical report describes the experiments performed to validate the MPI performance measurements reported by the CODES dragonfly network simulation with the Theta Cray XC system at the Argonne Leadership Computing Facility (ALCF).

  18. Context discovery using attenuated Bloom codes: model description and validation

    NARCIS (Netherlands)

    Liu, F.; Heijenk, Geert

    A novel approach to performing context discovery in ad-hoc networks based on the use of attenuated Bloom filters is proposed in this report. In order to investigate the performance of this approach, a model has been developed. This document describes the model and its validation. The model has been
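
    An attenuated Bloom filter is essentially a stack of ordinary Bloom filters, one per hop distance, so that a node can advertise which context sources are reachable and roughly how far away they are. The sketch below uses hypothetical sizes and hash choices purely for illustration; it is not the model developed in the report:

    ```python
    import hashlib

    class AttenuatedBloomFilter:
        """Stack of Bloom filters indexed by hop distance (illustrative sizes)."""
        def __init__(self, depth=3, bits=256, hashes=4):
            self.depth, self.bits, self.hashes = depth, bits, hashes
            self.layers = [0] * depth            # each layer stored as a bit mask

        def _positions(self, item: str):
            for i in range(self.hashes):
                digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
                yield int.from_bytes(digest[:4], "big") % self.bits

        def add(self, item: str, hops: int) -> None:
            """Advertise an item reachable at a given hop distance."""
            for pos in self._positions(item):
                self.layers[hops] |= 1 << pos

        def query(self, item: str):
            """Return the smallest hop distance at which the item may be found."""
            for hops, layer in enumerate(self.layers):
                if all(layer >> pos & 1 for pos in self._positions(item)):
                    return hops                   # possible match; false positives allowed
            return None

    abf = AttenuatedBloomFilter()
    abf.add("printer-service", hops=1)
    print(abf.query("printer-service"), abf.query("coffee-machine"))
    ```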

  19. RELAP5-3D code validation for RBMK phenomena

    International Nuclear Information System (INIS)

    Fisher, J.E.

    1999-01-01

    The RELAP5-3D thermal-hydraulic code was assessed against Japanese Safety Experiment Loop (SEL) and Heat Transfer Loop (HTL) tests. These tests were chosen because the phenomena present are applicable to analyses of Russian RBMK reactor designs. The assessment cases included parallel channel flow fluctuation tests at reduced and normal water levels, a channel inlet pipe rupture test, and a high power, density wave oscillation test. The results showed that RELAP5-3D has the capability to adequately represent these RBMK-related phenomena

  20. Test Data for USEPR Severe Accident Code Validation

    Energy Technology Data Exchange (ETDEWEB)

    J. L. Rempe

    2007-05-01

    This document identifies data that can be used for assessing various models embodied in severe accident analysis codes. Phenomena considered in this document, which were limited to those anticipated to be of interest in assessing severe accidents in the USEPR developed by AREVA, include: • Fuel Heatup and Melt Progression • Reactor Coolant System (RCS) Thermal Hydraulics • In-Vessel Molten Pool Formation and Heat Transfer • Fuel/Coolant Interactions during Relocation • Debris Heat Loads to the Vessel • Vessel Failure • Molten Core Concrete Interaction (MCCI) and Reactor Cavity Plug Failure • Melt Spreading and Coolability • Hydrogen Control Each section of this report discusses one phenomenon of interest to the USEPR. Within each section, an effort is made to describe the phenomenon and identify what data are available modeling it. As noted in this document, models in US accident analysis codes (MAAP, MELCOR, and SCDAP/RELAP5) differ. Where possible, this report identifies previous assessments that illustrate the impact of modeling differences on predicting various phenomena. Finally, recommendations regarding the status of data available for modeling USEPR severe accident phenomena are summarized.

  1. Validation of the containment code Sirius: interpretation of an explosion experiment on a scale model

    International Nuclear Information System (INIS)

    Blanchet, Y.; Obry, P.; Louvet, J.; Deshayes, M.; Phalip, C.

    1979-01-01

    The explicit 2-D axisymmetric Lagrangian code SIRIUS, developed at the CEA/DRNR, Cadarache, deals with transient compressive flows in deformable primary tanks with more or less complex internal component geometries. This code has been subjected to a two-year intensive validation program on scale model experiments and a number of improvements have been incorporated. This paper presents a recent calculation of one of these experiments using the SIRIUS code, and the comparison with experimental results shows the encouraging possibilities of this Lagrangian code

  2. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    Science.gov (United States)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open source code to model wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation, and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable Power-Take-Off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be
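
    For reference, the Cummins time-domain impulse response formulation mentioned above is commonly written, in vector form for the six degrees of freedom, as sketched below; the symbols follow the usual conventions and this is a schematic of the general formulation, not WEC-Sim's exact implementation:

    ```latex
    % Schematic Cummins-type equation of motion (6-DOF vector form):
    %   m     - mass/inertia matrix of the floating body
    %   A_inf - added-mass matrix at infinite frequency
    %   K(t)  - radiation impulse-response (memory) kernel
    %   C     - linear hydrostatic restoring matrix
    %   F_exc - wave-excitation force, F_PTO - power-take-off force
    \[
      (m + A_{\infty})\,\ddot{X}(t)
      + \int_{0}^{t} K(t-\tau)\,\dot{X}(\tau)\,\mathrm{d}\tau
      + C\,X(t)
      = F_{\mathrm{exc}}(t) + F_{\mathrm{PTO}}(t)
    \]
    ```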

  3. An approach to validation of coupled CFD and system thermal-hydraulics codes

    International Nuclear Information System (INIS)

    Jeltsov, M.; Cadinu, F.; Villanueva, W.; Karbojian, A.; Koop, K.; Kudinov, P.

    2011-01-01

    This paper discusses the development of an approach and an experimental facility for the validation of coupled Computational Fluid Dynamics (CFD) and System Thermal Hydraulics (STH) codes. The validation of a coupled code requires experiments which feature two-way feedback between the component (CFD sub-domain) and the system (STH sub-domain). We present results of CFD analysis that are used in the development of a flexible design for the TALL-3D experimental facility. The facility consists of a lead-bismuth thermal-hydraulic loop operating in forced and natural circulation regimes with a heated pool-type 3D test section. The goal of the design is to achieve a feedback between mixing and stratification phenomena in the 3D test section and forced/natural circulation flow conditions in the loop. Finally, we discuss the development of an experimental validation matrix for validation of coupled STH and CFD codes that considers the key physical phenomena of interest. (author)

  4. Evaporation over sump surface in containment studies: code validation on TOSQAN tests

    International Nuclear Information System (INIS)

    Malet, J.; Gelain, T.; Degrees du Lou, O.; Daru, V.

    2011-01-01

    During the course of a severe accident in a Nuclear Power Plant, water can be collected in the containment sump through steam condensation on walls and spray system activation. The objective of this paper is to present code validation on evaporative sump tests performed on the TOSQAN facility. The ASTEC-CPA code is used as a lumped-parameter code and specific user-defined functions are developed for the TONUS-CFD code. The tests are air-steam tests, as well as tests with other non-condensable gases (He, CO2 and SF6), under steady and transient conditions. The results show a good agreement between codes and experiments, indicating a good behaviour of the sump models in both codes. (author)

  5. Validation of the transportation computer codes HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND

    Energy Technology Data Exchange (ETDEWEB)

    Maheras, S.J.; Pippen, H.K.

    1995-05-01

    The computer codes HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND were used to estimate radiation doses from the transportation of radioactive material in the Department of Energy Programmatic Spent Nuclear Fuel Management and Idaho National Engineering Laboratory Environmental Restoration and Waste Management Programs Environmental Impact Statement. HIGHWAY and INTERLINE were used to estimate transportation routes for truck and rail shipments, respectively. RADTRAN 4 was used to estimate collective doses from incident-free transportation and the risk (probability × consequence) from transportation accidents. RISKIND was used to estimate incident-free radiation doses for maximally exposed individuals and the consequences from reasonably foreseeable transportation accidents. The purpose of this analysis is to validate the estimates made by these computer codes; critiques of the conceptual models used in RADTRAN 4 are also discussed. Validation is defined as "the test and evaluation of the completed software to ensure compliance with software requirements." In this analysis, validation means that the differences between the estimates generated by these codes and independent observations are small (i.e., within the acceptance criterion established for the validation analysis). In some cases, the independent observations used in the validation were measurements; in other cases, the independent observations used in the validation analysis were generated using hand calculations. The results of the validation analyses performed for HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND show that the differences between the estimates generated using the computer codes and independent observations were small. Based on the acceptance criterion established for the validation analyses, the codes yielded acceptable results; in all cases the estimates met the requirements for successful validation.

  6. Validation of the transportation computer codes HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND

    International Nuclear Information System (INIS)

    Maheras, S.J.; Pippen, H.K.

    1995-05-01

    The computer codes HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND were used to estimate radiation doses from the transportation of radioactive material in the Department of Energy Programmatic Spent Nuclear Fuel Management and Idaho National Engineering Laboratory Environmental Restoration and Waste Management Programs Environmental Impact Statement. HIGHWAY and INTERLINE were used to estimate transportation routes for truck and rail shipments, respectively. RADTRAN 4 was used to estimate collective doses from incident-free transportation and the risk (probability x consequence) from transportation accidents. RISKIND was used to estimate incident-free radiation doses for maximally exposed individuals and the consequences from reasonably foreseeable transportation accidents. The purpose of this analysis is to validate the estimates made by these computer codes; critiques of the conceptual models used in RADTRAN 4 are also discussed. Validation is defined as ''the test and evaluation of the completed software to ensure compliance with software requirements.'' In this analysis, validation means that the differences between the estimates generated by these codes and independent observations are small (i.e., within the acceptance criterion established for the validation analysis). In some cases, the independent observations used in the validation were measurements; in other cases, the independent observations used in the validation analysis were generated using hand calculations. The results of the validation analyses performed for HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND show that the differences between the estimates generated using the computer codes and independent observations were small. Based on the acceptance criterion established for the validation analyses, the codes yielded acceptable results; in all cases the estimates met the requirements for successful validation

  7. Preliminary Validation of the MATRA-LMR Code Using Existing Sodium-Cooled Experimental Data

    International Nuclear Information System (INIS)

    Choi, Sun Rock; Kim, Sangji

    2014-01-01

    The main objective of the SFR prototype plant is to verify TRU metal fuel performance, reactor operation, and the transmutation ability of high-level wastes. The core thermal-hydraulic design is used to ensure safe fuel performance during the whole plant operation. The fuel design limit is highly dependent on both the maximum cladding temperature and the uncertainties of the design parameters. Therefore, an accurate temperature calculation in each subassembly is highly important to assure a safe and reliable operation of the reactor systems. The current core thermal-hydraulic design is mainly performed using the SLTHEN (Steady-State LMR Thermal-Hydraulic Analysis Code Based on ENERGY Model) code, which has already been validated using the existing sodium-cooled experimental data. In addition to the SLTHEN code, a detailed analysis is performed using the MATRA-LMR (Multichannel Analyzer for Transient and steady-state in Rod Array-Liquid Metal Reactor) code. In this work, the MATRA-LMR code is validated for a single subassembly evaluation using the previous experimental data. The MATRA-LMR code has been validated using existing sodium-cooled experimental data. The results demonstrate that the design code appropriately predicts the temperature distributions compared with the experimental values. Major differences are observed in the experiments with large pin numbers, due to differences in radial mixing.

  8. In-vessel core degradation code validation matrix

    International Nuclear Information System (INIS)

    Haste, T.J.; Adroguer, B.; Gauntt, R.O.; Martinez, J.A.; Ott, L.J.; Sugimoto, J.; Trambauer, K.

    1996-01-01

    The objective of the current Validation Matrix is to define a basic set of experiments, for which comparison of the measured and calculated parameters forms a basis for establishing the accuracy of test predictions, covering the full range of in-vessel core degradation phenomena expected in light water reactor severe accident transients. The scope of the review covers PWR and BWR designs of Western origin: the coverage of phenomena extends from the initial heat-up through to the introduction of melt into the lower plenum. Concerning fission product behaviour, the effect of core degradation on fission product release is considered. The report provides brief overviews of the main LWR severe accident sequences and of the dominant phenomena involved. The experimental database is summarised. These data are cross-referenced against a condensed set of the phenomena and test condition headings presented earlier, judging the results against a set of selection criteria and identifying key tests of particular value. The main conclusions and recommendations are listed. (K.A.)

  9. A proposed methodology for computational fluid dynamics code verification, calibration, and validation

    Science.gov (United States)

    Aeschliman, D. P.; Oberkampf, W. L.; Blottner, F. G.

    Verification, calibration, and validation (VCV) of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. The exact manner in which code VCV activities are planned and conducted, however, is critically important. It is suggested that the way in which code validation, in particular, is often conducted--by comparison to published experimental data obtained for other purposes--is in general difficult and unsatisfactory, and that a different approach is required. This paper describes a proposed methodology for CFD code VCV that meets the technical requirements and is philosophically consistent with code development needs. The proposed methodology stresses teamwork and cooperation between code developers and experimentalists throughout the VCV process, and takes advantage of certain synergisms between CFD and experiment. A novel approach to uncertainty analysis is described which can both distinguish between and quantify various types of experimental error, and whose attributes are used to help define an appropriate experimental design for code VCV experiments. The methodology is demonstrated with an example of laminar, hypersonic, near perfect gas, 3-dimensional flow over a sliced sphere/cone of varying geometrical complexity.

  10. Benchmark Problems of the Geothermal Technologies Office Code Comparison Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Mark D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Podgorney, Robert; Kelkar, Sharad M.; McClure, Mark W.; Danko, George; Ghassemi, Ahmad; Fu, Pengcheng; Bahrami, Davood; Barbier, Charlotte; Cheng, Qinglu; Chiu, Kit-Kwan; Detournay, Christine; Elsworth, Derek; Fang, Yi; Furtney, Jason K.; Gan, Quan; Gao, Qian; Guo, Bin; Hao, Yue; Horne, Roland N.; Huang, Kai; Im, Kyungjae; Norbeck, Jack; Rutqvist, Jonny; Safari, M. R.; Sesetty, Varahanaresh; Sonnenthal, Eric; Tao, Qingfeng; White, Signe K.; Wong, Yang; Xia, Yidong

    2016-12-02

    A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office has sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study were those representing U.S. national laboratories, universities, and industries, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study, benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. Study participants submitted solutions to problems for which their simulation tools were deemed capable or nearly capable. Some participating codes were originally developed for EGS applications whereas some others were designed for different applications but can simulate processes similar to those in EGS. Solution submissions from both were encouraged. In some cases, participants made small incremental changes to their numerical simulation codes to address specific elements of the problem, and in other cases participants submitted solutions with existing simulation tools, acknowledging the limitations of the code. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. The problems

  11. How Developments in Psychology and Technology Challenge Validity Argumentation

    Science.gov (United States)

    Mislevy, Robert J.

    2016-01-01

    Validity is the sine qua non of properties of educational assessment. While a theory of validity and a practical framework for validation have emerged over the past decades, most of the discussion has addressed familiar forms of assessment and psychological framings. Advances in digital technologies and in cognitive and social psychology have…

  12. San Onofre PWR Data for Code Validation of MOX Fuel Depletion Analyses - Revision 1

    International Nuclear Information System (INIS)

    Hermann, O.W.

    2000-01-01

    The isotopic composition of mixed-oxide fuel (fabricated with both uranium and plutonium isotopes) discharged from reactors is of interest to the Fissile Material Disposition Program. The validation of depletion codes used to predict isotopic compositions of MOX fuel, similar to studies concerning uranium-only fueled reactors, is thus very important. The EEI-Westinghouse Plutonium Recycle Demonstration Program was conducted to examine the use of MOX fuel in the San Onofre PWR, Unit I, during cycles 2 and 3. The data, usually required as input to depletion codes, either one-dimensional or lattice codes, were taken from various sources and compiled into this report. Where data were either lacking or determined inadequate, the appropriate data were supplied from other references. The scope of the reactor operations and design data, in addition to the isotopic analyses, was considered to be of sufficient quality for depletion code validation.

  13. Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code

    Science.gov (United States)

    Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William

    2006-01-01

    The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued, correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation processes will use new/improved low Earth orbit (LEO) environmental models with a recently improved International Space Station (ISS) shield model to validate computational models and procedures using measured data aboard the ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.

  14. Validation of a Monte Carlo code system for grid evaluation with interference effect on Rayleigh scattering

    Science.gov (United States)

    Zhou, Abel; White, Graeme L.; Davidson, Rob

    2018-02-01

    Anti-scatter grids are commonly used in x-ray imaging systems to reduce the scatter radiation reaching the image receptor. Anti-scatter grid performance and validation can be simulated through use of Monte Carlo (MC) methods. Our recently reported work has modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids from the recently reported new MC codes against experimental results and results previously reported in other literature. The results of this work show that the scatter-to-primary ratio (SPR) and the transmissions of primary (T_p), scatter (T_s), and total (T_t) radiation determined using this new MC code system have strong agreement with the experimental results and the results reported in the literature. T_p, T_s, T_t, and SPR determined in this new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating the designs of grids.
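
    The quantities compared above are the primary, scatter, and total transmissions and the scatter-to-primary ratio. As an illustration, a short Python sketch that forms these figures from tallied photon signals recorded with and without the grid; the input numbers are placeholders, and the ratio definitions follow the usual anti-scatter-grid conventions rather than the paper's exact formulation.

        # Sketch: grid performance figures from tallied primary/scatter signals,
        # with and without the grid. Values are placeholders, not results from the paper.
        def grid_metrics(primary_no_grid, scatter_no_grid, primary_with_grid, scatter_with_grid):
            T_p = primary_with_grid / primary_no_grid          # primary transmission
            T_s = scatter_with_grid / scatter_no_grid          # scatter transmission
            T_t = (primary_with_grid + scatter_with_grid) / (primary_no_grid + scatter_no_grid)
            SPR = scatter_with_grid / primary_with_grid        # scatter-to-primary ratio behind the grid
            return T_p, T_s, T_t, SPR

        print(grid_metrics(1.00, 0.80, 0.70, 0.12))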

  15. Validation of system codes RELAP5 and SPECTRA for natural convection boiling in narrow channels

    Energy Technology Data Exchange (ETDEWEB)

    Stempniewicz, M.M., E-mail: stempniewicz@nrg.eu; Slootman, M.L.F.; Wiersema, H.T.

    2016-10-15

    Highlights: • Computer codes RELAP5/Mod3.3 and SPECTRA 3.61 validated for boiling in narrow channels. • Validated codes can be used for LOCA analyses in research reactors. • Code validation based on natural convection boiling in narrow channels experiments. - Abstract: Safety analyses of LOCA scenarios in nuclear power plants are performed with so-called thermal–hydraulic system codes, such as RELAP5. Such codes are validated for typical fuel geometries applied in nuclear power plants. The question considered by this article is whether the codes can be applied to LOCA analyses in research reactors, in particular exceeding CHF in very narrow channels. In order to answer this question, validation calculations were performed with two thermal–hydraulic system codes: RELAP and SPECTRA. The validation was based on natural convection boiling in narrow channels experiments, performed by Prof. Monde et al. in the years 1990–2000. In total 42 vertical tube and annulus experiments were simulated with both codes. A good agreement of the calculated values with the measured data was observed. The main conclusions are: • The computer codes RELAP5/Mod 3.3 (US NRC version) and SPECTRA 3.61 have been validated for natural convection boiling in narrow channels using the experiments of Monde. The dimensions applied in the experiments cover the range of values observed in typical research reactors. Therefore it is concluded that both codes are validated and can be used for LOCA analyses in research reactors, including natural convection boiling. The applicability range of the present validation is: hydraulic diameters of 1.1 ⩽ D_hyd ⩽ 9.0 mm, heated lengths of 0.1 ⩽ L ⩽ 1.0 m, pressures of 0.10 ⩽ P ⩽ 0.99 MPa. In most calculations the burnout was predicted to occur at lower power than that observed in the experiments. In several cases the burnout was observed at higher power. The overprediction was not larger than 16% in RELAP and 15% in

  16. Validation of the Subchannel Code SUBCHANFLOW Using the NUPEC PWR Tests (PSBT)

    Directory of Open Access Journals (Sweden)

    Uwe Imke

    2012-01-01

    SUBCHANFLOW is a computer code to analyze thermal-hydraulic phenomena in the core of pressurized water reactors, boiling water reactors, and innovative reactors operated with gas or liquid metal as coolant. As part of the ongoing assessment efforts, the code has been validated by using experimental data from the NUPEC PWR Subchannel and Bundle Tests (PSBT). The database includes single-phase flow bundle outlet temperature distributions, steady state and transient void distributions, and critical power measurements. The performed validation work has demonstrated that the two-phase flow empirical knowledge base implemented in SUBCHANFLOW is appropriate to describe key mechanisms of the experimental investigations with acceptable accuracy.
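
    A validation exercise of this kind typically condenses the comparison of predicted and measured void fractions into a few summary statistics. A minimal Python sketch of such a comparison follows; the values below are placeholders, not PSBT data or SUBCHANFLOW results.

        # Sketch: summary error statistics for predicted vs. measured void fractions.
        # The numbers are placeholders, not PSBT measurements or SUBCHANFLOW results.
        import math

        measured  = [0.05, 0.20, 0.45, 0.60]
        predicted = [0.06, 0.18, 0.47, 0.63]

        errors = [p - m for p, m in zip(predicted, measured)]
        mean_error = sum(errors) / len(errors)
        rms_error = math.sqrt(sum(e * e for e in errors) / len(errors))
        print(f"mean error = {mean_error:+.3f}, RMS error = {rms_error:.3f}")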

  17. Validation of the Electromagnetic Code FACETS for Numerical Simulation of Radar Target Images

    Science.gov (United States)

    2009-12-01

    …their potential solutions. A reasonable and logical approach to sound CEM validation is to use a well-defined target model such as… …to validate the Frequency Asymptotic Code for Electromagnetic Target Scattering (FACETS

  18. Development and Validation of A Nuclear Fuel Cycle Analysis Tool: A FUTURE Code

    International Nuclear Information System (INIS)

    Kim, S. K.; Ko, W. I.; Lee, Yoon Hee

    2013-01-01

    This paper presents the development and validation methods of the FUTURE (FUel cycle analysis Tool for nUcleaR Energy) code, which was developed for a dynamic material flow evaluation and economic analysis of the nuclear fuel cycle. This code enables an evaluation of a nuclear material flow and its economy for diverse nuclear fuel cycles based on a predictable scenario. The most notable virtue of this FUTURE code, which was developed using C# and MICROSOFT SQL DBMS, is that a program user can design a nuclear fuel cycle process easily using a standard process on the canvas screen through a drag-and-drop method. From the user's point of view, this code is very easy to use thanks to its high flexibility. In addition, the new code also enables the maintenance of data integrity by constructing a database environment of the results of the nuclear fuel cycle analyses.

  19. DEVELOPMENT AND VALIDATION OF A NUCLEAR FUEL CYCLE ANALYSIS TOOL: A FUTURE CODE

    Directory of Open Access Journals (Sweden)

    S.K. KIM

    2013-10-01

    This paper presents the development and validation methods of the FUTURE (FUel cycle analysis Tool for nUcleaR Energy) code, which was developed for a dynamic material flow evaluation and economic analysis of the nuclear fuel cycle. This code enables an evaluation of a nuclear material flow and its economy for diverse nuclear fuel cycles based on a predictable scenario. The most notable virtue of this FUTURE code, which was developed using C# and MICROSOFT SQL DBMS, is that a program user can design a nuclear fuel cycle process easily using a standard process on the canvas screen through a drag-and-drop method. From the user's point of view, this code is very easy to use thanks to its high flexibility. In addition, the new code also enables the maintenance of data integrity by constructing a database environment of the results of the nuclear fuel cycle analyses.

  20. Trends in EFL Technology and Educational Coding: A Case Study of an Evaluation Application Developed on LiveCode

    Science.gov (United States)

    Uehara, Suwako; Noriega, Edgar Josafat Martinez

    2016-01-01

    The availability of user-friendly coding software is increasing, yet teachers might hesitate to use this technology to develop for educational needs. This paper discusses studies related to technology for educational uses and introduces an evaluation application being developed. Through questionnaires by student users and open-ended discussion by…

  1. Empirical research methods for technology validation: Scaling up to practice

    NARCIS (Netherlands)

    Wieringa, Roelf J.

    Before technology is transferred to the market, it must be validated empirically by simulating future practical use of the technology. Technology prototypes are first investigated in simplified contexts, and these simulations are scaled up to conditions of practice step by step as more becomes

  2. Validation and verification of the ORNL Monte Carlo codes for nuclear safety analysis

    International Nuclear Information System (INIS)

    Emmett, M.B.

    1993-01-01

    The process of ensuring the quality of computer codes can be very time consuming and expensive. The Oak Ridge National Laboratory (ORNL) Monte Carlo codes all predate the existence of quality assurance (QA) standards and configuration control. The number of person-years and the amount of money spent on code development make it impossible to adhere strictly to all the current requirements. At ORNL, the Nuclear Engineering Applications Section of the Computing Applications Division is responsible for the development, maintenance, and application of the Monte Carlo codes MORSE and KENO. The KENO code is used for doing criticality analyses; the MORSE code, which has two official versions, CGA and SGC, is used for radiation transport analyses. Because KENO and MORSE were very thoroughly checked out over the many years of extensive use both in the United States and in the international community, the existing codes were "baselined." This means that the versions existing at the time the original configuration plan was written are considered to be validated and verified code systems based on the established experience with them.

  3. Validation of International Classification of Diseases Codes for the Epidemiologic Study of Dermatomyositis.

    Science.gov (United States)

    Kwa, Michael C; Ardalan, Kaveh; Laumann, Anne E; Nardone, Beatrice; West, Dennis P; Silverberg, Jonathan I

    2017-05-01

    To assess the validity of using International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) code 710.3 to identify adult patients with dermatomyositis in outpatient and inpatient settings. Electronic medical records of adult patients with ICD-9 code 710.3 between January 2001 and November 2014 (n = 511) were examined. Physician diagnosis, clinical findings, and diagnostic testing results were recorded. A dermatomyositis rating scale was assigned based on classic cutaneous findings and at least 2 additional clinical and diagnostic findings from the Bohan criteria. Sensitivity and positive predictive values (PPVs) were determined. Sensitivity analyses were performed to evaluate the accuracy of multiple ICD-9 codes in the outpatient setting, as well as primary and secondary inpatient codes. The sensitivity and PPV for multiple 710.3 ICD-9 codes in the outpatient setting were 0.89 and 0.35, respectively. The PPV for primary and secondary 710.3 inpatient codes was 0.95 and as high as 0.8. However, the sensitivity of ICD-9 code 710.3 was poor in the inpatient setting (primary 0.23 and secondary 0.26). The most common reason for failure to meet appropriate dermatomyositis criteria was miscoding as diabetes mellitus (32%), followed by diagnosis at an outside institution (19%), dermatomyositis as a rule-out diagnosis (10%), cutaneous dermatomyositis (8%), and juvenile dermatomyositis (6%). One or more occurrences of ICD-9 code 710.3 is insufficient to support the diagnosis of dermatomyositis in the outpatient setting. However, ICD-9 710.3 codes appear to be valid in the inpatient setting. © 2016, American College of Rheumatology.
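
    The sensitivity and positive predictive value quoted above come from a chart-review classification of coded cases. For readers unfamiliar with the two measures, a minimal Python sketch of how they are computed follows; the counts are illustrative and are not the study's data.

        # Sketch: sensitivity and positive predictive value from chart-review counts.
        # true_pos: coded 710.3 and confirmed dermatomyositis
        # false_pos: coded 710.3 but not dermatomyositis
        # false_neg: dermatomyositis cases not captured by the code
        def sensitivity_ppv(true_pos, false_pos, false_neg):
            sensitivity = true_pos / (true_pos + false_neg)
            ppv = true_pos / (true_pos + false_pos)
            return sensitivity, ppv

        # Illustrative counts only.
        print(sensitivity_ppv(true_pos=160, false_pos=300, false_neg=20))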

  4. Neonatal Facial Coding System for Assessing Postoperative Pain in Infants: Item Reduction is Valid and Feasible

    NARCIS (Netherlands)

    Peters, J.W.B.; Koot, H.M.; Grunau, R.E.; Boer, J. de; Druenen, M.J. van; Tibboel, D.; Duivenvoorden, H.J.

    2003-01-01

    Objective: The objectives of this study were to: (1) evaluate the validity of the Neonatal Facial Coding System (NFCS) for assessment of postoperative pain and (2) explore whether the number of NFCS facial actions could be reduced for assessing postoperative pain. Design: Prospective, observational

  5. Test and validation of the iterative code for the neutrons spectrometry and dosimetry: NSDUAZ

    International Nuclear Information System (INIS)

    Reyes H, A.; Ortiz R, J. M.; Reyes A, A.; Castaneda M, R.; Solis S, L. O.; Vega C, H. R.

    2014-08-01

    This work presents the test and validation of an iterative code for neutron spectrometry known as Neutron Spectrometry and Dosimetry of the Universidad Autonoma de Zacatecas (NSDUAZ). The code was designed with a friendly and intuitive graphical user interface in the LabVIEW programming environment, using the iterative algorithm known as SPUNIT. The main characteristics of the program are: the automatic selection of the initial spectrum from the neutron spectra catalogue compiled by the International Atomic Energy Agency, and the possibility to generate a report in HTML format that shows the neutron fluence graphically and numerically and calculates the ambient dose equivalent from it. To test the designed code, the count rates of a Bonner sphere spectrometer system with a ⁶LiI(Eu) detector and 7 polyethylene spheres with diameters of 0, 2, 3, 5, 8, 10 and 12 were used. The count rates measured with two neutron sources, ²⁵²Cf and ²³⁹PuBe, were used to validate the code, and the obtained results were compared against those obtained using the BUNKIUT code. The reconstructed spectra present an error that is within the limit reported in the literature, which oscillates around 15%. Therefore, it was concluded that the designed code gives results similar to those of the techniques currently in use. (Author)
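
    SPUNIT belongs to the family of multiplicative iterative unfolding algorithms that rescale the spectrum until the folded response matches the measured count rates. A hedged Python sketch of a generic update of that family follows; it illustrates the idea only and is not the exact SPUNIT formulation used in NSDUAZ.

        # Sketch of a multiplicative iterative unfolding step (SPUNIT-like family).
        # R[i][j]: response of sphere i to energy bin j; C[i]: measured count rate of sphere i.
        # Generic illustration only, not the exact SPUNIT update.
        def unfold(R, C, phi0, iterations=100):
            phi = list(phi0)
            n_spheres, n_bins = len(R), len(phi)
            for _ in range(iterations):
                folded = [sum(R[i][j] * phi[j] for j in range(n_bins)) for i in range(n_spheres)]
                for j in range(n_bins):
                    num = sum(R[i][j] * C[i] / folded[i] for i in range(n_spheres))
                    den = sum(R[i][j] for i in range(n_spheres))
                    phi[j] *= num / den
            return phi

        # Tiny illustrative problem: 2 spheres, 2 energy bins.
        R = [[1.0, 0.2], [0.3, 1.0]]
        C = [1.2, 1.3]
        print(unfold(R, C, phi0=[1.0, 1.0]))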

  6. CFD Code Validation against Stratified Air-Water Flow Experimental Data

    Directory of Open Access Journals (Sweden)

    F. Terzuoli

    2008-01-01

    Pressurized thermal shock (PTS) modelling has been identified as one of the most important industrial needs related to nuclear reactor safety. A severe PTS scenario limiting the reactor pressure vessel (RPV) lifetime is the cold water emergency core cooling (ECC) injection into the cold leg during a loss of coolant accident (LOCA). Since it represents a big challenge for numerical simulations, this scenario was selected within the European Platform for Nuclear Reactor Simulations (NURESIM) Integrated Project as a reference two-phase problem for computational fluid dynamics (CFD) code validation. This paper presents a CFD analysis of a stratified air-water flow experimental investigation performed at the Institut de Mécanique des Fluides de Toulouse in 1985, which shares some common physical features with the ECC injection in PWR cold leg. Numerical simulations have been carried out with two commercial codes (Fluent and Ansys CFX) and a research code (NEPTUNE CFD). The aim of this work, carried out at the University of Pisa within the NURESIM IP, is to validate the free surface flow model implemented in the codes against experimental data, and to perform code-to-code benchmarking. Obtained results suggest the relevance of three-dimensional effects and stress the importance of a suitable interface drag modelling.

  7. Research on verification and validation strategy of detonation fluid dynamics code of LAD2D

    Science.gov (United States)

    Wang, R. L.; Liang, X.; Liu, X. Z.

    2017-07-01

    Verification and validation (V&V) is an important approach to software quality assurance for codes in complex engineering applications. A reasonable and efficient V&V strategy can achieve twice the result with half the effort. This article introduces the software LAD2D (Lagrangian adaptive hydrodynamics code in 2D space), self-developed software for detonation CFD with elastic-plastic structures. The V&V strategy of this detonation CFD code is presented based on the foundation of V&V methodology for scientific software. The basic framework of the module verification and the function validation is proposed, composing the detonation fluid dynamics model V&V strategy of LAD2D.

  8. The COVA programme for the validation of computer codes for fast reactor containment studies

    International Nuclear Information System (INIS)

    Hoskin, N.E.; Lancefield, M.J.

    1978-01-01

    The UKAEA and the Joint Research Centre, Euratom, Ispra are engaged in a collaborative experimental programme carrying out a series of small scale, well instrumented tests aimed at providing high quality data on the stresses, strains and loads occurring when a well characterized source is released within a fluid in a containment vessel. In the UK the data are being used to validate the computer codes ASTARTE and SEURBNUK, which are used in studies of the response of the fast reactor primary containment system in the event of a hypothetical reactor excursion. This paper describes the UK experimental programme and the development of the low density explosive which is used as the energy source; the rationale of the code development programme is presented together with a report on the progress that has been made in validating the two codes against the experimental data. (Auth.)

  9. Validity of the coding for herpes simplex encephalitis in the Danish National Patient Registry

    Directory of Open Access Journals (Sweden)

    Jørgensen LK

    2016-05-01

    Background: Large health care databases are a valuable source of infectious disease epidemiology if diagnoses are valid. The aim of this study was to investigate the accuracy of the recorded diagnosis coding of herpes simplex encephalitis (HSE) in the Danish National Patient Registry (DNPR). Methods: The DNPR was used to identify all hospitalized patients, aged ≥15 years, with a first-time diagnosis of HSE according to the International Classification of Diseases, tenth revision (ICD-10), from 2004 to 2014. To validate the coding of HSE, we collected data from the Danish Microbiology Database, from departments of clinical microbiology, and from patient medical records. Cases were classified as confirmed, probable, or no evidence of HSE. We estimated the positive predictive value (PPV) of the HSE diagnosis coding stratified by diagnosis type, study period, and department type. Furthermore, we estimated the proportion of HSE cases coded with nonspecific ICD-10 codes of viral encephalitis and also the sensitivity of the HSE diagnosis coding. Results: We were able to validate 398 (94.3%) of the 422 HSE diagnoses identified via the DNPR. Hereof, 202 (50.8%) were classified as confirmed cases and 29 (7.3%) as probable cases, providing an overall PPV of 58.0% (95% confidence interval [CI]: 53.0–62.9). For “Encephalitis due to herpes simplex virus” (ICD-10 code B00.4), the PPV was 56.6% (95% CI: 51.1–62.0). Similarly, the PPV for “Meningoencephalitis due to herpes simplex virus” (ICD-10 code B00.4A) was 56.8% (95% CI: 39.5–72.9). “Herpes viral encephalitis” (ICD-10 code G05.1E) had a PPV

  10. Validation logics for a reactive two-phase code and first results; Logique de validation d'un code diphasique reactif et premiers resultats

    Energy Technology Data Exchange (ETDEWEB)

    Dupays, J.

    1998-07-01

    The development of a reactive two-phase code devoted to the simulation of unsteady phenomena requires a pragmatic validation approach based on test cases of increasing complexity. To this end, the propagation of an acoustic wave inside a standing reactive two-phase medium is a first necessary step. The theoretical formalism on which this test case is based is explained first. Then a first theory-simulation comparison is presented. This study stresses the destabilizing effect of droplet evaporation and gives a way to estimate the critical frequency-diameter pairs. Finally, a validation logic centered on the specific problems raised by the simulation of reactive flows in solid propellant engines is proposed. (J.S.)

  11. Design validation of the ITER EC upper launcher according to codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Spaeh, Peter, E-mail: peter.spaeh@kit.edu [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); Aiello, Gaetano [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); Gagliardi, Mario [Karlsruhe Institute of Technology, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); F4E, Fusion for Energy, Joint Undertaking, Barcelona (Spain); Grossetti, Giovanni; Meier, Andreas; Scherer, Theo; Schreck, Sabine; Strauss, Dirk; Vaccaro, Alessandro [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); Weinhorst, Bastian [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany)

    2015-10-15

    Highlights: • A set of applicable codes and standards has been chosen for the ITER EC upper launcher. • For a particular component load combinations, failure modes and stress categorizations have been determined. • The design validation was performed in accordance with the “design by analysis”-approach of the ASME boiler and pressure vessel code section III. - Abstract: The ITER electron cyclotron (EC) upper launcher has passed the CDR (conceptual design review) in 2005 and the PDR (preliminary design review) in 2009 and is in its final design phase now. The final design will be elaborated by the European consortium ECHUL-CA with contributions from several research institutes in Germany, Italy, the Netherlands and Switzerland. Within this consortium KIT is responsible for the design of the structural components (the upper port plug, UPP) and also the design integration of the launcher. As the selection of applicable codes and standards was under discussion for the past decade, the conceptual and the preliminary design of the launcher structure were not elaborated in straight accordance with a particular code but with a variety of well-acknowledged engineering practices. For the final design it is compulsory to validate the design with respect to a typical engineering code in order to be compliant with the ITER quality and nuclear requirements and to get acceptance from the French regulator. This paper presents typical design validation of the closure plate, which is the vacuum and Tritium barrier and thus a safety relevant component of the upper port plug (UPP), performed with the ASME boiler and pressure vessel code. Rationales for choosing this code are given as well as a comparison between different design methods, like the “design by rule” and the “design by analysis” approach. Also the selections of proper load specifications and the identification of potential failure modes are covered. In addition to that stress categorizations, analyses

  12. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification tests and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test: correctness of the FORTRAN coding, computational accuracy, and suitability to simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed through evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability to simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.
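
    The testing relied on relative root mean square values to quantify agreement between code output and analytical or experimental references. A short Python sketch of one common form of that metric follows; the normalization choice is an assumption, since the report's exact formula is not given here.

        # Sketch: relative root-mean-square difference between code results and a
        # reference solution. Normalizing by the RMS of the reference is an assumption;
        # the FLASH evaluation protocol may define the metric differently.
        import math

        def relative_rms(code_values, reference_values):
            diff_sq = sum((c - r) ** 2 for c, r in zip(code_values, reference_values))
            ref_sq = sum(r ** 2 for r in reference_values)
            return math.sqrt(diff_sq / ref_sq)

        print(relative_rms([0.98, 2.03, 2.96], [1.0, 2.0, 3.0]))  # small value = good agreement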

  13. Validation of sonic boom propagation codes using SR-71 flight test data.

    Science.gov (United States)

    Ivanteyeva, Lyudmila G; Kovalenko, Victor V; Pavlyukov, Evgeny V; Teperin, Leonid L; Rackl, Robert G

    2002-01-01

    The results of two sonic boom propagation codes, ZEPHYRUS (NASA) and BOOM (TsAGI), are compared with SR-71 flight test data from 1995. Options available in the computational codes are described briefly. Special processing methods are described which were applied to the experimental data. A method to transform experimental data at close ranges to the supersonic aircraft into initial data required by the codes was developed; it is applicable at any flight regime. Data are compared in near-, mid-, and far fields. The far-field observation aircraft recorded both direct and reflected waves. Comparison of computed and measured results shows good agreement with peak pressure, duration, and wave shape for direct waves, thus validating the computational codes.

  14. Initial verification and validation of RAZORBACK - A research reactor transient analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
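
    The kinetics part of the coupled system solved by Razorback is the standard point reactor kinetics model. As a reminder of that piece only, a hedged Python sketch of the one-delayed-group equations integrated with a simple explicit Euler step follows; the parameters are illustrative, and the thermal feedback coupling and numerical scheme of Razorback are not represented.

        # Sketch: one-delayed-group point reactor kinetics, explicit Euler integration.
        #   dn/dt = ((rho - beta) / Lambda) * n + lam * c
        #   dc/dt = (beta / Lambda) * n - lam * c
        # Parameters are illustrative; Razorback solves a coupled thermal-hydraulic system.
        beta, Lambda, lam = 0.0073, 1.0e-4, 0.08   # delayed fraction, generation time (s), decay constant (1/s)
        rho = 0.001                                # constant reactivity insertion (illustrative)
        n, c = 1.0, beta / (Lambda * lam)          # start from steady state (equilibrium precursors)
        dt = 1.0e-5
        for _ in range(int(0.1 / dt)):             # integrate 0.1 s of transient
            dn = ((rho - beta) / Lambda) * n + lam * c
            dc = (beta / Lambda) * n - lam * c
            n, c = n + dt * dn, c + dt * dc
        print(f"relative power after 0.1 s: {n:.3f}")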

  15. Uncertainty propagation applied to multi-scale thermal-hydraulics coupled codes. A step towards validation

    Energy Technology Data Exchange (ETDEWEB)

    Geffray, Clotaire Clement

    2017-03-20

    The work presented here constitutes an important step towards the validation of the use of coupled system thermal-hydraulics and computational fluid dynamics codes for the simulation of complex flows in liquid metal cooled pool-type facilities. First, a set of methods suited for uncertainty and sensitivity analysis and validation activities with regard to the specific constraints of the work with coupled and expensive-to-run codes is proposed. Then, these methods are applied to the ATHLET - ANSYS CFX model of the TALL-3D facility. Several transients performed at this latter facility are investigated. The results are presented, discussed and compared to the experimental data. Finally, assessments of the validity of the selected methods and of the quality of the model are offered.

  16. Elaboration and validation of an assistive technology assessment questionnaire

    Directory of Open Access Journals (Sweden)

    Fernanda Jorge Guimarães

    2015-06-01

    Assistive Technology consists of resources, methods, and strategies favoring the autonomy and inclusion of the elderly and people with disabilities; instruments for assessing such technologies are scarce in the literature. A methodological study conducted with a panel of specialists and people with visual impairment aimed to elaborate and validate a questionnaire to assess an educational assistive technology. To consider an item valid, an agreement percentage of 80% was used, and the validity and reliability of the questionnaire were calculated. The Assistive Technology was characterized by six attributes (objectives, access, clarity, structure and presentation, relevance and efficacy, and interactivity), and 19 items were elaborated to compose the questionnaire. Of those, 11 obtained agreement percentages higher than 80%, seven were modified and one was excluded. The instrument's Cronbach's alpha was 0.822, supporting the validity and reliability of the tool for assessing health education Assistive Technology; its use is therefore indicated.
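
    The reliability figure reported above is a Cronbach's alpha of 0.822. For reference, a minimal Python sketch of the standard formula computed from item scores follows; the scores below are made up and serve only to show the calculation.

        # Sketch: Cronbach's alpha for a set of questionnaire items (standard formula).
        # Rows are respondents, columns are items; the scores are made up.
        def cronbach_alpha(scores):
            n_items = len(scores[0])
            def variance(values):
                mean = sum(values) / len(values)
                return sum((v - mean) ** 2 for v in values) / (len(values) - 1)
            item_vars = [variance([row[j] for row in scores]) for j in range(n_items)]
            total_var = variance([sum(row) for row in scores])
            return (n_items / (n_items - 1)) * (1.0 - sum(item_vars) / total_var)

        scores = [[4, 5, 4], [3, 4, 4], [5, 5, 5], [2, 3, 3]]
        print(round(cronbach_alpha(scores), 3))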

  17. Verification and validation of the THYTAN code for the graphite oxidation analysis in the HTGR systems

    International Nuclear Information System (INIS)

    Shimazaki, Yosuke; Isaka, Kazuyoshi; Nomoto, Yasunobu; Seki, Tomokazu; Ohashi, Hirofumi

    2014-12-01

    The analytical models for the evaluation of graphite oxidation were implemented into the THYTAN code, which employs the mass balance and a node-link computational scheme to evaluate tritium behavior in High Temperature Gas-cooled Reactor (HTGR) systems for hydrogen production, to analyze the graphite oxidation during air or water ingress accidents in HTGR systems. This report describes the analytical models of the THYTAN code in terms of the graphite oxidation analysis and its verification and validation (V and V) results. Mass transfer from the gas mixture in the coolant channel to the graphite surface, diffusion in the graphite, graphite oxidation by air or water, chemical reaction and release from the primary circuit to the containment vessel by a safety valve were modeled to calculate the mass balance in the graphite and the gas mixture in the coolant channel. The computed solutions obtained using the THYTAN code for simple problems were compared to analytical results from hand calculations to verify the algorithms for each implemented analytical model. A representative graphite oxidation experiment was analyzed using the THYTAN code, and the results were compared to the experimental data and to the computed solutions obtained using the GRACE code, which was used for the safety analysis of the High Temperature Engineering Test Reactor (HTTR), in regard to the corrosion depth of graphite and the oxygen concentration at the outlet of the test section, to validate the analytical models of the THYTAN code. The comparison of the THYTAN code results with the analytical solutions, experimental data and the GRACE code results showed good agreement. (author)
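
    THYTAN builds its graphite oxidation analysis on species mass balances over a node-link network, with mass transfer from the coolant gas to the graphite surface and an oxidation reaction at the wall. A heavily simplified Python sketch of a single-node oxygen balance of that general shape follows; the coefficients are placeholders, and the structure of the actual THYTAN models is not reproduced.

        # Heavily simplified sketch: oxygen balance in one coolant node adjacent to a
        # graphite surface, with through-flow and a gas-side-limited surface reaction.
        # All coefficients are illustrative placeholders, not THYTAN model parameters.
        V = 1.0       # node gas volume, m^3
        Q = 0.05      # volumetric flow through the node, m^3/s
        c_in = 8.0    # inlet O2 concentration, mol/m^3
        k_m = 0.01    # gas-side mass-transfer coefficient, m/s
        A = 2.0       # graphite surface area, m^2
        c = 0.0       # O2 concentration in the node, mol/m^3
        dt = 0.1
        for _ in range(int(600 / dt)):        # 10 minutes of problem time
            wall_sink = k_m * A * c           # mol/s consumed at the graphite surface
            c += dt * (Q * (c_in - c) - wall_sink) / V
        print(f"node O2 concentration approaches {c:.2f} mol/m^3")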

  18. Research on the coding and decoding technology of the OCDMA system

    Science.gov (United States)

    Li, Ping; Wang, Yuru; Lan, Zhenping; Wang, Jinpeng; Zou, Nianyu

    2015-12-01

    Optical Code Division Multiple Access (OCDMA) is a new technology that combines wireless CDMA technology with optical fiber communication technology. The address coding technology in the OCDMA system has been researched. Besides, the principle of the codec based on optical fiber delay lines and non-coherent spectral domain encoding and decoding has been introduced and analysed, and the results were verified by experiment.

  19. Progress in Technology Validation of the Next Ion Propulsion System

    Science.gov (United States)

    Benson, Scott W.; Patterson, Michael J.

    2007-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system has been in advanced technology development under the NASA In-Space Propulsion Technology project. The highest fidelity hardware planned has now been completed by the government/industry team, including a flight prototype model (PM) thruster, an engineering model (EM) power processing unit, EM propellant management assemblies, a breadboard gimbal, and control unit simulators. Subsystem and system level technology validation testing is in progress. To achieve the objective of Technology Readiness Level 6, environmental testing is being conducted to qualification levels in ground facilities simulating the space environment. Additional tests have been conducted to characterize the performance range and life capability of the NEXT thruster. This paper presents the status and results of technology validation testing accomplished to date, the validated subsystem and system capabilities, and the plans for completion of this phase of NEXT development.

  20. Integral large scale experiments on hydrogen combustion for severe accident code validation-HYCOM

    International Nuclear Information System (INIS)

    Breitung, W.; Dorofeev, S.; Kotchourko, A.; Redlinger, R.; Scholtyssek, W.; Bentaib, A.; L'Heriteau, J.-P.; Pailhories, P.; Eyink, J.; Movahed, M.; Petzold, K.-G.; Heitsch, M.; Alekseev, V.; Denkevits, A.; Kuznetsov, M.; Efimenko, A.; Okun, M.V.; Huld, T.; Baraldi, D.

    2005-01-01

    A joint research project was carried out in the EU Fifth Framework Programme, concerning hydrogen risk in a nuclear power plant. The goals were: Firstly, to create a new data base of results on hydrogen combustion experiments in the slow to turbulent combustion regimes. Secondly, to validate the partners CFD and lumped parameter codes on the experimental data, and to evaluate suitable parameter sets for application calculations. Thirdly, to conduct a benchmark exercise by applying the codes to the full scale analysis of a postulated hydrogen combustion scenario in a light water reactor containment after a core melt accident. The paper describes the work programme of the project and the partners activities. Significant progress has been made in the experimental area, where test series in medium and large scale facilities have been carried out with the focus on specific effects of scale, multi-compartent geometry, heat losses and venting. The data were used for the validation of the partners CFD and lumped parameter codes, which included blind predictive calculations and pre- and post-test intercomparison exercises. Finally, a benchmark exercise was conducted by applying the codes to the full scale analysis of a hydrogen combustion scenario. The comparison and assessment of the results of the validation phase and of the challenging containment calculation exercise allows a deep insight in the quality, capabilities and limits of the CFD and the lumped parameter tools which are currently in use at various research laboratories

  1. Validating Advanced Supply-Chain Technology (VAST)

    Science.gov (United States)

    2004-06-01

    Use philosophy that is so important in today's procurement environment. Electronic Data Interchange (EDI) and eCommerce are proving to be a major...the STEPwise methodology are particularly encouraging. These new EDI and eCommerce technologies are becoming more important with the customers who...critical assumption is based upon the fact that eCommerce is growing throughout the commercial and military sector and those who are not

  2. Validation of the WIMSD4M cross-section generation code with benchmark results

    Energy Technology Data Exchange (ETDEWEB)

    Deen, J.R.; Woodruff, W.L. [Argonne National Lab., IL (United States); Leal, L.E. [Oak Ridge National Lab., TN (United States)

    1995-01-01

    The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D₂O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.

  3. Validation of the WIMSD4M cross-section generation code with benchmark results

    International Nuclear Information System (INIS)

    Deen, J.R.; Woodruff, W.L.; Leal, L.E.

    1995-01-01

    The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D₂O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.
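
    Benchmark validation of this kind is usually summarized as calculated-over-experimental (C/E) ratios for the multiplication factor, with deviations quoted in pcm. A short Python sketch of that bookkeeping follows; the k-eff values below are placeholders, not WIMSD4M, TWODANT, or MCNP results.

        # Sketch: C/E comparison of calculated k-eff against critical benchmarks (k_exp = 1.0).
        # The calculated values are placeholders, not results from the paper.
        benchmarks = {"ORNL HEU sphere": 1.0021, "TRX LEU lattice": 0.9965, "LA HEU D2O": 1.0048}

        for name, k_calc in benchmarks.items():
            k_exp = 1.0
            dev_pcm = (k_calc - k_exp) / k_exp * 1.0e5
            print(f"{name:16s}  C/E = {k_calc / k_exp:.4f}  deviation = {dev_pcm:+.0f} pcm")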

  4. Large leak sodium-water reaction code SWACS and its validation

    International Nuclear Information System (INIS)

    Miyake, O.; Shindo, Y.; Hiroi, H.; Tanabe, H.; Sato, M.

    1982-01-01

    A computer code SWACS for analyzing the large leak accident of LMFBR steam generators has been developed and validated. Data from five tests obtained in the SWAT-3 test facility were compared with the code results. In each of the SWAT-3 tests, a double-ended guillotine rupture of one tube was simulated in a helical coil steam generator model with a test vessel scaled 1/2.5 relative to the prototype SG. The analytical results, including an initial pressure spike, a propagated pressure in the secondary system, and a quasi-steady pressure, indicate that the overall large-leak event could be predicted in reasonably good agreement.

  5. Validation of a new continuous Monte Carlo burnup code using a Mox fuel assembly

    International Nuclear Information System (INIS)

    El bakkari, B.; El Bardouni, T.; Merroun, O.; El Younoussi, C.; Boulaich, Y.; Boukhal, H.; Chakir, E.

    2009-01-01

    The reactivity of nuclear fuel decreases with irradiation (or burnup) due to the transformation of heavy nuclides and the formation of fission products. Burnup credit studies aim at accounting for fuel irradiation in criticality studies of the nuclear fuel cycle (transport, storage, etc.). The principal objective of this study is to evaluate the potential capabilities of a newly developed burnup code called 'BUCAL1'. BUCAL1 differs from other burnup codes in that it does not use the calculated neutron flux as input to other computer codes to generate the nuclide inventory for the next time step. Instead, BUCAL1 directly uses the neutron reaction tally information generated by MCNP for each nuclide of interest to determine the new nuclide inventory. This allows the full capabilities of MCNP to be incorporated into the calculation and a more accurate and robust analysis to be performed. Validation of BUCAL1 was performed by code-to-code comparisons using predictions of several codes from the NEA/OECD. Infinite multiplication factors (k∞) and important fission product and actinide concentrations were compared for a MOX core benchmark exercise. Results of calculations are analysed and discussed.
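
    The central idea described above is that tallied one-group reaction rates are used directly to update the nuclide inventory for the next burnup step. A hedged Python sketch of the simplest possible version of that update, for a single nuclide with no production terms, follows; real depletion, including BUCAL1, must of course handle full production and decay chains.

        # Sketch: one-nuclide, one-step depletion from a tallied one-group reaction rate.
        #   dN/dt = -sigma_a * phi * N   ->   N(t + dt) = N(t) * exp(-sigma_a * phi * dt)
        # Values are illustrative; production chains and decay are ignored here.
        import math

        N0 = 1.0e24          # initial number density, atoms/cm^3 (illustrative)
        sigma_a = 100.0e-24  # one-group absorption cross section, cm^2 (100 barns, illustrative)
        phi = 3.0e14         # tallied scalar flux, n/cm^2/s (illustrative)
        dt = 30 * 24 * 3600  # one 30-day burnup step, s

        N1 = N0 * math.exp(-sigma_a * phi * dt)
        print(f"fraction remaining after one step: {N1 / N0:.4f}")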

  6. Results and code predictions for ABCOVE [aerosol behavior code validation and evaluation] aerosol code validation with low concentration NaOH and NaI aerosol: CSTF test AB7

    International Nuclear Information System (INIS)

    Hilliard, R.K.; McCormack, J.D.; Muhlestein, L.D.

    1985-10-01

    A program for aerosol behavior validation and evaluation (ABCOVE) has been developed in accordance with the LMFBR Safety Program Plan. The ABCOVE program is a cooperative effort between the USDOE, the USNRC, and their contractor organizations currently involved in aerosol code development, testing or application. The third large-scale test in the ABCOVE program, AB7, was performed in the 850-m³ CSTF vessel with a two-species test aerosol. The test conditions involved the release of a simulated fission product aerosol, NaI, into the containment atmosphere after the end of a small sodium pool fire. Four organizations made pretest predictions of aerosol behavior using five computer codes. Two of the codes (QUICKM and CONTAIN) were discrete, multiple species codes, while three (HAA-3, HAA-4, and HAARM-3) were log-normal codes which assume uniform coagglomeration of different aerosol species. Detailed test results are presented and compared with the code predictions for eight key aerosol behavior parameters. 11 refs., 44 figs., 35 tabs

  7. Infrared imaging - A validation technique for computational fluid dynamics codes used in STOVL applications

    Science.gov (United States)

    Hardman, R. R.; Mahan, J. R.; Smith, M. H.; Gelhausen, P. A.; Van Dalsem, W. R.

    1991-01-01

    The need for a validation technique for computational fluid dynamics (CFD) codes in STOVL applications has led to research efforts to apply infrared thermal imaging techniques to visualize gaseous flow fields. Specifically, a heated, free-jet test facility was constructed. The gaseous flow field of the jet exhaust was characterized using an infrared imaging technique in the 2 to 5.6 micron wavelength band as well as conventional pitot tube and thermocouple methods. These infrared images are compared to computer-generated images using the equations of radiative exchange based on the temperature distribution in the jet exhaust measured with the thermocouple traverses. Temperature and velocity measurement techniques, infrared imaging, and the computer model of the infrared imaging technique are presented and discussed. From the study, it is concluded that infrared imaging techniques coupled with the radiative exchange equations applied to CFD models are a valid method to qualitatively verify CFD codes used in STOVL applications.

  8. AEEW comments on the NNC/CEGB LOCA code validation report RX 440-A

    International Nuclear Information System (INIS)

    Brittain, I.; Bryce, W.M.; O'Mahoney, R.; Richards, C.G.; Gibson, I.H.; Porter, W.H.L.; Fell, J.

    1984-03-01

    Comments are made on the NNC/CEGB report PWR/RX 440-A, Review of Validation for the ECCS Evaluation Model Codes, by K.T. Routledge et al., 1982. That report set out to review the methods and models used in the LOCA safety case for Sizewell B. These methods are embodied in the Evaluation Model computer codes SATAN-VI, WREFLOOD, WFLASH, LOCTA-IV and COCO. The main application of these codes is the determination of peak clad temperature and overall containment pressure. The comments represent the views of a group which has been involved for a number of years in the development and application of best-estimate methods for LOCA analysis. It is the judgement of this group that, overall, the EM methods can be used to make an acceptable safety case, but there are a number of points of detail still to be resolved. (U.K.)

  9. Validation and uncertainty analysis of the Athlet thermal-hydraulic computer code

    International Nuclear Information System (INIS)

    Glaeser, H.

    1995-01-01

    The computer code ATHLET is being developed by GRS as an advanced best-estimate code for the simulation of breaks and transients in Pressurized Water Reactors (PWRs) and Boiling Water Reactors (BWRs), including beyond design basis accidents. A systematic validation of ATHLET is based on a well-balanced set of integral and separate effects tests emphasizing the German combined Emergency Core Cooling (ECC) injection system. When using best-estimate codes for predictions of reactor plant states during assumed accidents, quantification of the uncertainty in these calculations is highly desirable. A method for uncertainty and sensitivity evaluation has been developed by GRS in which the computational effort is independent of the number of uncertain parameters. (author)
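
    The statement that the computational effort is independent of the number of uncertain parameters is the hallmark of order-statistics (Wilks-type) tolerance limits, on which GRS-type uncertainty methods are commonly based. The snippet below only reproduces that textbook sample-size rule for a one-sided limit and is not the GRS implementation itself.

        # Smallest number of code runs n such that the largest of n random runs
        # bounds the `coverage` quantile of the output with probability `confidence`
        # (first-order, one-sided Wilks rule; illustrative only).
        def wilks_one_sided(coverage=0.95, confidence=0.95):
            n = 1
            while 1.0 - coverage**n < confidence:
                n += 1
            return n

        print(wilks_one_sided(0.95, 0.95))   # 59 runs for a 95%/95% statement
        print(wilks_one_sided(0.95, 0.99))   # 90 runs for a 95%/99% statement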

  10. Validation of the ORIGEN-S code for predicting radionuclide inventories in used CANDU Fuel

    International Nuclear Information System (INIS)

    Tait, J.C.; Gauld, I.; Kerr, A.H.

    1994-10-01

    The safety assessment being conducted by AECL Research for the concept of deep geological disposal of used CANDU UO₂ fuel requires the calculation of radionuclide inventories in the fuel to provide source terms for radionuclide release. This report discusses the validation of selected actinide and fission-product inventories calculated using the ORIGEN-S code coupled with the WIMS-AECL lattice code, using data from analytical measurements of radioisotope inventories in Pickering CANDU reactor fuel. The recent processing of new ENDF/B-VI cross-section data has allowed the ORIGEN-S calculations to be performed using the most up-to-date nuclear data available. The results indicate that the code reliably predicts actinide inventories and the majority of fission-product inventories to within the analytical uncertainty. 38 refs., 4 figs., 5 tabs

  11. Verification and Validation of the BISON Fuel Performance Code for PCMI Applications

    Energy Technology Data Exchange (ETDEWEB)

    Gamble, Kyle Allan Lawrence [Idaho National Laboratory; Novascone, Stephen Rhead [Idaho National Laboratory; Gardner, Russell James [Idaho National Laboratory; Perez, Danielle Marie [Idaho National Laboratory; Pastore, Giovanni [Idaho National Laboratory; Hales, Jason Dean [Idaho National Laboratory

    2016-06-01

    BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. A brief overview of BISON’s computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described. Validation for application to light water reactor (LWR) PCMI problems is assessed by comparing predicted and measured rod diameter following base irradiation and power ramps. Results indicate a tendency to overpredict clad diameter reduction early in life, when clad creepdown dominates, and more significantly overpredict the diameter increase late in life, when fuel expansion controls the mechanical response. Initial rod diameter comparisons have led to consideration of additional separate effects experiments to better understand and predict clad and fuel mechanical behavior. Results from this study are being used to define priorities for ongoing code development and validation activities.

  12. Verification & Validation Toolkit to Assess Codes: Is it Theory Limitation, Numerical Method Inadequacy, Bug in the Code or a Serious Flaw?

    Science.gov (United States)

    Bombardelli, F. A.; Zamani, K.

    2014-12-01

    We introduce and discuss an open-source, user-friendly, numerical post-processing piece of software for assessing the reliability of modeling results from environmental fluid mechanics codes. Verification and Validation, Uncertainty Quantification (VAVUQ) is a toolkit developed in Matlab© for general V&V purposes. In this work, the VAVUQ implementation of V&V techniques and its user interface are discussed. VAVUQ is able to read Excel, Matlab, ASCII, and binary files, and it produces a log of the results in txt format. Each capability of the code is then discussed through an example: the first example is the code verification of a sediment transport code, developed with the Finite Volume Method, via MES. The second example is a solution verification of a code for groundwater flow, developed with the Boundary Element Method, via MES. The third example is a solution verification of a mixed-order, Compact Difference Method code for heat transfer via MMS. The fourth example is a solution verification of a 2-D, Finite Difference Method code for floodplain analysis via Complete Richardson Extrapolation. In turn, the application of VAVUQ in quantitative model skill assessment studies (validation) of environmental codes is given through two examples: validation of a two-phase flow computational model of air entrainment in a free surface flow versus lab measurements, and heat transfer modeling at the earth surface versus field measurements. At the end, we discuss practical considerations and common pitfalls in the interpretation of V&V results.
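
    For the solution-verification examples above, the central quantity is the observed order of accuracy obtained from systematically refined grids, which also yields the Richardson-extrapolated estimate of the exact solution. The lines below show the standard three-grid formulas with hypothetical values; they are generic and not extracted from VAVUQ.

        import math

        def observed_order(f_coarse, f_medium, f_fine, r):
            """Observed order p from three grids with constant refinement ratio r."""
            return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

        def richardson_extrapolate(f_medium, f_fine, r, p):
            """Grid-converged estimate from the two finest solutions."""
            return f_fine + (f_fine - f_medium) / (r**p - 1.0)

        p = observed_order(1.120, 1.030, 1.008, r=2.0)        # hypothetical values
        print(p, richardson_extrapolate(1.030, 1.008, 2.0, p))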

  13. Improvements and validation of the transient analysis code MOREL for molten salt reactors

    International Nuclear Information System (INIS)

    Zhuang Kun; Zheng Youqi; Cao Liangzhi; Hu Tianliang; Wu Hongchun

    2017-01-01

    The liquid fuel salt used in molten salt reactors (MSRs) serves as fuel and coolant simultaneously. On the one hand, the delayed neutron precursors circulate in the whole primary loop and part of them decay outside the core. On the other hand, the fission heat is carried off directly by the fuel flow. These two features require a new analysis method coupling fluid flow, heat transfer and neutronics. In this paper, the recent update of the MOREL code is presented. The update includes: (1) an improved quasi-static method for the kinetics equation with a convection term; (2) a multi-channel thermal-hydraulic model developed from the geometric features of MSRs; (3) the Variational Nodal Method, used to solve the neutron diffusion equation instead of the original analytic-basis-function expansion nodal method. The update brings a significant improvement in the efficiency of the MOREL code, and its capability is extended to real core simulation with feedback. Numerical results and experimental data from the Molten Salt Reactor Experiment (MSRE) are used to verify and validate the updated MOREL code. The results agree well with the experimental data, which indicates that the new developments in the MOREL code are correct and effective. (author)
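
    The precursor drift mentioned above is usually expressed as a balance equation with a convection term. A schematic one-dimensional form is given below; it only illustrates the type of equation involved and is not quoted from the MOREL documentation.

        % Schematic 1-D delayed-neutron precursor balance with fuel-salt convection
        % (illustrative form; not taken from the MOREL documentation)
        \[
          \frac{\partial C_i}{\partial t}
          + \frac{\partial}{\partial z}\bigl[u(z,t)\,C_i\bigr]
          = \beta_i\,\nu\Sigma_f\,\phi(z,t) - \lambda_i\,C_i ,
          \qquad i = 1,\dots,6,
        \]
        % u is the fuel-salt velocity, and beta_i, lambda_i are the yield and decay
        % constant of precursor group i; the convection term carries precursors out
        % of the core, where part of their delayed neutrons are lost.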

  14. Validity of the coding for herpes simplex encephalitis in the Danish National Patient Registry

    DEFF Research Database (Denmark)

    Jørgensen, Laura Krogh; Dalgaard, Lars Skov; Østergaard, Lars Jørgen

    2016-01-01

    BACKGROUND: Large health care databases are a valuable source of infectious disease epidemiology if diagnoses are valid. The aim of this study was to investigate the accuracy of the recorded diagnosis coding of herpes simplex encephalitis (HSE) in the Danish National Patient Registry (DNPR...... (7.3%) as probable cases providing an overall PPV of 58.0% (95% confidence interval [CI]: 53.0-62.9). For "Encephalitis due to herpes simplex virus" (ICD-10 code B00.4), the PPV was 56.6% (95% CI: 51.1-62.0). Similarly, the PPV for "Meningoencephalitis due to herpes simplex virus" (ICD-10 code B00.4A......) was 56.8% (95% CI: 39.5-72.9). "Herpes viral encephalitis" (ICD-10 code G05.1E) had a PPV of 75.9% (95% CI: 56.5-89.7), thereby representing the highest PPV. The estimated sensitivity was 95.5%. CONCLUSION: The PPVs of the ICD-10 diagnosis coding for adult HSE in the DNPR were relatively low. Hence...

  15. Validation of the Thermal-Hydraulic Model in the SACAP Code with the ISP Tests

    Energy Technology Data Exchange (ETDEWEB)

    Park, Soon-Ho; Kim, Dong-Min; Park, Chang-Hwan [FNC Technology Co., Yongin (Korea, Republic of)

    2016-10-15

    From a safety viewpoint, the pressure of the containment is an important parameter; the local hydrogen concentration is also a parameter of major concern because of its flammability and the risk of detonation. In Korea, there has been an extensive effort to develop a computer code which can analyze the severe accident behavior of pressurized water reactors. The development has been done in a modularized manner, and the SACAP (Severe Accident Containment Analysis Package) code is now in the final stage of development. The SACAP code adopts a lumped-parameter (LP) model and is applicable to the analysis of the overall behavior of the containment during severe accidents, covering thermal-hydraulic transients, combustible gas burn, direct containment heating by high pressure melt ejection, steam explosion and molten core-concrete interaction. Analyses of a number of ISP (International Standard Problem) experiments were done as part of the SACAP code V&V (verification and validation). In this paper, the SACAP analysis results for ISP-35 (NUPEC) and ISP-47 (TOSQAN), selected to confirm the computational performance of the code currently under development, are presented, including comparison with other existing NPP simulation codes. The multi-node analysis for ISP-47 is still in progress. As a result of the simulations, SACAP predicts the thermal-hydraulic variables such as temperature and pressure well, and we verify that the SACAP code is properly equipped to analyze gas distribution and condensation.

  16. The Validity of HCC Diagnosis Codes in Chronic Hepatitis B Patients in the Veterans Health Administration.

    Science.gov (United States)

    Omino, Ronald; Mittal, Sahil; Kramer, Jennifer R; Chayanupatkul, Maneerat; Richardson, Peter; Kanwal, Fasiha

    2017-05-01

    Administrative databases that include diagnostic codes are valuable sources of information for research purposes. The aim of this study was to validate diagnostic codes for hepatocellular carcinoma (HCC) in chronic hepatitis B patients. We conducted a retrospective study of patients with chronic HBV seen in the national Veterans Administration (VA). HCC cases were identified by the presence of ICD-9 code 155.0. We randomly selected 200 HBV patients without this code as controls. We manually reviewed the electronic medical record (EMR) of all cases and controls to determine HCC status. We calculated the positive predictive value (PPV), negative predictive value (NPV), sensitivity, and specificity for the HCC code. We conducted an implicit review of the false-positive cases to determine possible reasons for the miscoding. Of the 8350 patients with HBV, 416 had an ICD-9 code for HCC. Of these 416, 332 patients had confirmed HCC and 61 did not; HCC status was indeterminate for 23 patients. Of the 200 controls, none had HCC confirmed in the EMR. The PPV ranged from 85.3 to 80.0% and specificity ranged from 99.2 to 99.0%, based on classification of indeterminate cases as true versus false positives, respectively. The NPV and sensitivity were 100%. Two-thirds of false-positive cases were diagnosed with HCC prematurely during workup of a liver mass, and later imaging and/or biopsy were not diagnostic for HCC. The diagnostic code for HCC in chronic HBV patients in the VHA data is predictive of the presence of HCC in medical records and can be used for epidemiological and clinical research.
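
    The four indices reported above come straight from a two-by-two table of code status against chart-confirmed status. The short sketch below computes them from placeholder counts; the study's own counts and its handling of indeterminate cases are not reproduced here.

        def validation_metrics(tp, fp, fn, tn):
            """PPV, NPV, sensitivity and specificity from a 2x2 validation table."""
            return {
                "PPV":         tp / (tp + fp),
                "NPV":         tn / (tn + fn),
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
            }

        # Placeholder counts, not the study's data
        print(validation_metrics(tp=90, fp=15, fn=5, tn=190))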

  17. Verification and Validation of the PLTEMP/ANL Code for Thermal-Hydraulic Analysis of Experimental and Test Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kalimullah, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Olson, Arne P. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. E. [Argonne National Lab. (ANL), Argonne, IL (United States); Hanan, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-04-07

    The document compiles into a single volume several verification and validation studies done for the PLTEMP/ANL code during the years of its development and improvement. Some studies that are available in the open literature are simply referenced at the outset and are not included in the document. PLTEMP has been used in conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted. A list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. The model verification is usually done by comparing the code with a hand calculation, a Microsoft spreadsheet calculation, or a Mathematica calculation. The model validation is done by comparing the code with experimental data or with a more extensively validated code such as RELAP5.

  18. Validation of fast reactor thermomechanical and thermohydraulic codes. Final report of a co-ordinated research project. 1996-1999

    International Nuclear Information System (INIS)

    2002-11-01

    This report is a summary of the work performed under a co-ordinated research project (CRP) entitled Harmonization and Validation of Fast Reactor Thermomechanical and Thermo-Hydraulic Codes and Relations using Experimental Data. The project was organized by the IAEA on the recommendation of the IAEA's Technical Working Group on Fast Reactors (TWGFR) and carried out from 1996 to 1999. In certain conditions, temperature fluctuations in the coolant close to a structure caused by thermal striping can lead to thermomechanical damage to structures. Institutes from a number of Member States have an interest in improving engineering tools and prediction techniques concerning the characterization of the thermal striping effects, in which numerical models have a major role. Therefore, the IAEA through its advanced reactor technology development programme supports the activities of Member States in this area. Design analyses applied to thermal striping phenomena need to be firmly established, and the CRP provided a valuable tool in assessing their reliability. Eleven institutes from France, India, Italy, Japan, the Republic of Korea, the Russian Federation and the United Kingdom co-operated in this CRP. This report documents the CRP activities, provides the main results and recommendations and includes the work carried out by the research groups at the participating institutes within the CRP on harmonization and validation of fast reactor thermomechanical and thermohydraulic codes and relations

  19. Secret Codes: The Hidden Curriculum of Semantic Web Technologies

    Science.gov (United States)

    Edwards, Richard; Carmichael, Patrick

    2012-01-01

    There is a long tradition in education of examination of the hidden curriculum, those elements which are implicit or tacit to the formal goals of education. This article draws upon that tradition to open up for investigation the hidden curriculum and assumptions about students and knowledge that are embedded in the coding undertaken to facilitate…

  20. Network Coding is the 5G Key Enabling Technology

    DEFF Research Database (Denmark)

    Compta, Pol Torres; Fitzek, Frank; Roetter, Daniel Enrique Lucani

    2015-01-01

    The exponential growth of the mobile devices market, not only smartphones, but also tablets, laptops or wearables, poses a serious challenge for 5G communications. Random Linear Network Coding (RLNC) is a promising solution for present and future networks as it has been shown to provide increased...

  1. The Impact of Bar Code Medication Administration Technology on Reported Medication Errors

    Science.gov (United States)

    Holecek, Andrea

    2011-01-01

    The use of bar-code medication administration technology is on the rise in acute care facilities in the United States. The technology is purported to decrease medication errors that occur at the point of administration. How significantly this technology affects actual rate and severity of error is unknown. This descriptive, longitudinal research…

  2. Validation of ICD-9-CM coding algorithm for improved identification of hypoglycemia visits

    Directory of Open Access Journals (Sweden)

    Lieberman Rebecca M

    2008-04-01

    Background: Accurate identification of hypoglycemia cases by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes will help to describe epidemiology, monitor trends, and propose interventions for this important complication in patients with diabetes. Prior hypoglycemia studies utilized incomplete search strategies and may be methodologically flawed. We sought to validate a new ICD-9-CM coding algorithm for accurate identification of hypoglycemia visits. Methods: This was a multicenter, retrospective cohort study using a structured medical record review at three academic emergency departments from July 1, 2005 to June 30, 2006. We prospectively derived a coding algorithm to identify hypoglycemia visits using ICD-9-CM codes (250.3, 250.8, 251.0, 251.1, 251.2, 270.3, 775.0, 775.6, and 962.3). We confirmed hypoglycemia cases by chart review of visits identified by candidate ICD-9-CM codes during the study period. The case definition for hypoglycemia was a documented blood glucose of 3.9 mmol/l or less, or an emergency physician charted diagnosis of hypoglycemia. We evaluated individual components and calculated the positive predictive value. Results: We reviewed 636 charts identified by the candidate ICD-9-CM codes and confirmed 436 (64%) cases of hypoglycemia by chart review. Diabetes with other specified manifestations (250.8), often excluded in prior hypoglycemia analyses, identified 83% of hypoglycemia visits, and unspecified hypoglycemia (251.2) identified 13% of hypoglycemia visits. The absence of any predetermined co-diagnosis codes improved the positive predictive value of code 250.8 from 62% to 92%, while excluding only 10 (2%) true hypoglycemia visits. Although prior analyses included only the first-listed ICD-9 code, more than one-quarter of identified hypoglycemia visits were outside this primary diagnosis field. Overall, the proposed algorithm had 89% positive predictive value (95% confidence interval, 86–92) for ...
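
    The screen described above boils down to flagging a visit when any listed code is present, with code 250.8 counted only in the absence of predetermined co-diagnoses. The sketch below illustrates that logic; the exclusion list is a placeholder, since the paper's actual co-diagnosis list is not reproduced here.

        # Hypothetical illustration of the code-based screen; the code values follow
        # the abstract, while the exclusion entries are placeholders.
        HYPOGLYCEMIA_CODES = {"250.3", "251.0", "251.1", "251.2",
                              "270.3", "775.0", "775.6", "962.3"}
        CONDITIONAL_CODE = "250.8"
        EXCLUDING_CODIAGNOSES = {"250.1", "250.2"}   # placeholder entries

        def flag_hypoglycemia(visit_codes):
            codes = set(visit_codes)
            if codes & HYPOGLYCEMIA_CODES:
                return True
            return CONDITIONAL_CODE in codes and not (codes & EXCLUDING_CODIAGNOSES)

        print(flag_hypoglycemia(["250.8"]))            # True
        print(flag_hypoglycemia(["250.8", "250.1"]))   # False under this sketch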

  3. First validation of the new continuous energy version of the MORET5 Monte Carlo code

    International Nuclear Information System (INIS)

    Miss, Joachim; Bernard, Franck; Forestier, Benoit; Haeck, Wim; Richet, Yann; Jacquet, Olivier

    2008-01-01

    The 5.A.1 version is the next release of the MORET Monte Carlo code dedicated to criticality and reactor calculations. This new version combines all the capabilities that are already available in the multigroup version with many new and enhanced features. The main capabilities of the previous version are the powerful association of a deterministic and a Monte Carlo approach (as in APOLLO-MORET), the modular geometry, five source sampling techniques and two simulation strategies. The major advance in MORET5 is the ability to perform either a multigroup or a continuous-energy simulation. Thanks to these new developments, we now have better control over the whole process of criticality calculations, from reading the basic nuclear data to the Monte Carlo simulation itself. Moreover, this new capability enables us to better validate the deterministic-Monte Carlo multigroup calculations by performing continuous-energy calculations with the same code, using the same geometry and tracking algorithms. The aim of this paper is to describe the main options available in this new release, and to present the first results. Comparisons of the MORET5 continuous-energy results with experimental measurements and against another continuous-energy Monte Carlo code are provided in terms of validation and time performance. Finally, an analysis of the benefit of using a unified energy grid for continuous-energy Monte Carlo calculations is presented. (authors)

  4. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.

    1987-10-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in Cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the Cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.

  5. Thyc, a 3D thermal-hydraulic code for rod bundles. Recent developments and validation tests

    International Nuclear Information System (INIS)

    Caremoli, C.; Rascle, P.; Aubry, S.; Olive, J.

    1993-09-01

    PWR or LMFBR cores or fuel assemblies, PWR steam generators, condensers, tubular heat exchangers, are basic components of a nuclear power plant involving two-phase flows in tube or rod bundles. A deep knowledge of the detailed flow patterns on the shell side is necessary to evaluate DNB margins in reactor cores, singularity effects (grids, wire spacers, support plates, baffles), corrosion on steam generator tube sheet, bypass effects and vibration risks. For that purpose, Electricite de France has developed, since 1986, a general purpose code named THYC (Thermal HYdraulic Code) designed to study three-dimensional single and two phase flows in rod or tube bundles (pressurized water reactor cores, steam generators, condensers, heat exchangers). It considers the three-dimensional domain to contain two kinds of components: fluid and solids. The THYC model is obtained by space-time averaging of the instantaneous equations (mass, momentum and energy) of each phase over control volumes including fluid and solids. This paper briefly presents the physical model and the numerical method used in THYC. Then, validation tests (comparison with experiments) and applications (coupling with three-dimensional neutronics code and DNB predictions) are presented. They emphasize the last developments and new capabilities of the code. (authors). 10 figs., 3 tabs., 21 refs

  6. Independent validation testing of the FLAME computer code, Version 1.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-07-01

    Independent testing of the FLAME computer code, Version 1.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Validation tests (i.e., tests which compare field data to the computer-generated solutions) were used to determine the operational status of the FLAME computer code and were done on a qualitative basis through graphical comparisons of the experimental and numerical data. These tests were specifically designed to check: (1) correctness of the FORTRAN coding, (2) computational accuracy, and (3) suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: (1) independent applications, and (2) graduated difficulty of test cases. Three tests were used, ranging in complexity from simple one-dimensional steady-state flow field problems under near-saturated conditions to two-dimensional transient flow problems with very dry initial conditions

  7. Guidelines on Active Content and Mobile Code: Recommendations of the National Institute of Standards and Technology

    National Research Council Canada - National Science Library

    Jansen, Wayne

    2001-01-01

    .... One such category of technologies is active content. Broadly speaking, active content refers to electronic documents that, unlike past character documents based on the American Standard Code for Information Interchange (ASCII...

  8. 76 FR 66235 - Bar Code Technologies for Drugs and Biological Products; Retrospective Review Under Executive...

    Science.gov (United States)

    2011-10-26

    ... symbol, standard, or technology (Id. at 12510 and 12529). In response to the Bar Code Proposed Rule, FDA... to FDA, they are not required to do so. In recognition of these challenges, in the Federal Register...

  9. Validation of TEMP: A finite line heat transfer code for geologic repositories for nuclear waste

    International Nuclear Information System (INIS)

    Atterbury, W.G.; Hetteburg, J.R.; Wurm, K.J.

    1987-09-01

    TEMP is a FORTRAN computer code for calculating temperatures in a geologic repository for nuclear waste. A previous report discusses the structure, usage, verification, and benchmarking of TEMP V1.0 (Wurm et al., 1987). This report discusses modifications to the program in the development of TEMP V1.1 and documents the validation of TEMP. The development of TEMP V1.1 from TEMP V1.0 consisted of two major efforts. The first was to recode several of the subroutines to improve logic flow and to allow for geometry-independent temperature calculation routines which, in turn, allowed for the addition of the geometry-independent validation option. The validation option provides TEMP with the ability to model any geometry of temperature sources with any step-wise heat release rate. This capability allows TEMP to model the geometry and heat release characteristics of the validation problems. The validation of TEMP V1.1 consists of the comparison of TEMP to three in-ground heater tests. The three tests chosen were Avery Island, Louisiana, Site A; Avery Island, Louisiana, Site C; and Asse Mine, Federal Republic of Germany, Site 2. TEMP shows marginal agreement with the two Avery Island sites and good agreement with the Asse Mine site. 8 refs., 25 figs., 14 tabs
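
    A sense of the temperatures such a code computes can be had from the classical constant-strength line-source solution, the kind of analytic kernel a finite-line-source model superposes over heater segments. The snippet below is generic, with placeholder property values, and is not the TEMP model itself.

        import numpy as np
        from scipy.special import exp1

        def line_source_dT(r, t, q_per_len, k, alpha):
            """Temperature rise (K) at radius r (m) and time t (s) for an infinite
            line source of strength q_per_len (W/m) in a medium with conductivity k
            (W/m/K) and thermal diffusivity alpha (m^2/s)."""
            return q_per_len / (4.0 * np.pi * k) * exp1(r**2 / (4.0 * alpha * t))

        # e.g. a 200 W/m heater, salt-like placeholder properties, 1 m away, 30 days
        print(line_source_dT(r=1.0, t=30 * 86400.0, q_per_len=200.0, k=5.0, alpha=3.0e-6))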

  10. Validation of a tetrahedral spectral element code for solving the Navier Stokes equation

    International Nuclear Information System (INIS)

    Niewiadomski, C.; Paraschivoiu, M.

    2004-01-01

    The tetrahedral spectral element method is considered for solving the incompressible Navier-Stokes equations because it can capture complex geometries and obtain highly accurate solutions. This method allows accuracy improvements both by refining the spatial discretization and by increasing the expansion order. The method is presented herein as a modification of a standard finite element code. Some recent improvements to the baseline spectral element method for the tetrahedron described in References 3 and 2 are presented. These improvements include a continuity enforcement procedure that avoids the need to change the global assembly operation, and the removal of the reference coordinate system from the elemental evaluations, which greatly simplifies the method. A study is performed on the Stokes and Navier-Stokes equations to validate the method and the resulting code. (author)

  11. Off-take Model of the SPACE Code and Its Validation

    International Nuclear Information System (INIS)

    Oh, Myung Taek; Park, Chan Eok; Sohn, Jong Joo

    2011-01-01

    Liquid entrainment and vapor pull-through models for a horizontal pipe have been implemented in the SPACE code. The model accounts for the phase separation phenomena and computes the flux of mass and energy through an off-take attached to a horizontal pipe when stratified conditions occur in the pipe. This model is referred to as the off-take model. The importance of predicting the fluid conditions through an off-take in a small-break LOCA is well known: the occurrence of stratification can affect the break node void fraction and thus the break flow discharged from the primary system. In order to validate the off-take model newly developed for the SPACE code, a simulation of the HDU experiments has been performed. The main features of the off-take model and its application results are presented in this paper

  12. Results of a survey on accident and safety analysis codes, benchmarks, verification and validation methods

    International Nuclear Information System (INIS)

    Lee, A.G.; Wilkin, G.B.

    1996-03-01

    During the 'Workshop on R and D needs' at the 3rd Meeting of the International Group on Research Reactors (IGORR-III), the participants agreed that it would be useful to compile a survey of the computer codes and nuclear data libraries used in accident and safety analyses for research reactors and the methods various organizations use to verify and validate their codes and libraries. Five organizations, Atomic Energy of Canada Limited (AECL, Canada), the China Institute of Atomic Energy (CIAE, People's Republic of China), the Japan Atomic Energy Research Institute (JAERI, Japan), Oak Ridge National Laboratory (ORNL, USA), and Siemens (Germany), responded to the survey. The results of the survey are compiled in this report. (author) 36 refs., 3 tabs

  13. Emerging technologies for 3D video creation, coding, transmission and rendering

    CERN Document Server

    Dufaux, Frederic; Cagnazzo, Marco

    2013-01-01

    With the expectation of greatly enhanced user experience, 3D video is widely perceived as the next major advancement in video technology. In order to fulfil the expectation of enhanced user experience, 3D video calls for new technologies addressing efficient content creation, representation/coding, transmission and display. Emerging Technologies for 3D Video will deal with all aspects involved in 3D video systems and services, including content acquisition and creation, data representation and coding, transmission, view synthesis, rendering, display technologies, human percepti

  14. Validity of Diagnostic Codes for Acute Stroke in Administrative Databases: A Systematic Review.

    Directory of Open Access Journals (Sweden)

    Natalie McCormick

    The aim was to conduct a systematic review of studies reporting on the validity of International Classification of Diseases (ICD) codes for identifying stroke in administrative data. MEDLINE and EMBASE were searched (inception to February 2015) for studies: (a) using administrative data to identify stroke; or (b) evaluating the validity of stroke codes in administrative data; and (c) reporting validation statistics (sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), or Kappa scores) for stroke, or data sufficient for their calculation. Additional articles were located by hand search (up to February 2015) of original papers. Studies solely evaluating codes for transient ischaemic attack were excluded. Data were extracted by two independent reviewers; article quality was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool. Seventy-seven studies published from 1976-2015 were included. The sensitivity of ICD-9 430-438/ICD-10 I60-I69 for any cerebrovascular disease was ≥ 82% in most [≥ 50%] studies, and specificity and NPV were both ≥ 95%. The PPV of these codes for any cerebrovascular disease was ≥ 81% in most studies, while the PPV specifically for acute stroke was ≤ 68%. In at least 50% of studies, PPVs were ≥ 93% for subarachnoid haemorrhage (ICD-9 430/ICD-10 I60), 89% for intracerebral haemorrhage (ICD-9 431/ICD-10 I61), and 82% for ischaemic stroke (ICD-9 434/ICD-10 I63, or ICD-9 434&436). For in-hospital deaths, sensitivity was 55%. For cerebrovascular disease or acute stroke as a cause-of-death on death certificates, sensitivity was ≤ 71% in most studies while PPV was ≥ 87%. While most cases of prevalent cerebrovascular disease can be detected using 430-438/I60-I69 collectively, acute stroke must be defined using more specific codes. Most in-hospital deaths and death certificates with stroke as a cause-of-death correspond to true stroke deaths. Linking vital statistics and hospitalization ...

  15. Computational Fluid Dynamics Code Validation/Calibration: JANNAF Airbreathing Propulsion Subcommittee Workshop: High-Speed Inlet Forebody Interactions

    Science.gov (United States)

    Hudson, Camille T. (Editor)

    1991-01-01

    A summary, viewgraphs, and a transcript of discussions of a workshop on computational fluid dynamics code validation/calibration are presented. The workshop focused on inlet/forebody interactions in high-speed ramjets.

  16. The TALL-3D facility design and commissioning tests for validation of coupled STH and CFD codes

    International Nuclear Information System (INIS)

    Grishchenko, Dmitry; Jeltsov, Marti; Kööp, Kaspar; Karbojian, Aram; Villanueva, Walter; Kudinov, Pavel

    2015-01-01

    Highlights: • Design of a heavy liquid thermal-hydraulic loop for CFD/STH code validation. • Description of the loop instrumentation and assessment of measurement error. • Experimental data from forced to natural circulation transient. - Abstract: Application of coupled CFD (Computational Fluid Dynamics) and STH (System Thermal Hydraulics) codes is a prerequisite for computationally affordable and sufficiently accurate prediction of thermal-hydraulics of complex systems. Coupled STH and CFD codes require validation for understanding and quantification of the sources of uncertainties in the code prediction. TALL-3D is a liquid Lead Bismuth Eutectic (LBE) loop developed according to the requirements for the experimental data for validation of coupled STH and CFD codes. The goals of the facility design are to provide (i) mutual feedback between natural circulation in the loop and complex 3D mixing and stratification phenomena in the pool-type test section, (ii) a possibility to validate standalone STH and CFD codes for each subsection of the facility, and (iii) sufficient number of experimental data to separate the process of input model calibration and code validation. Description of the facility design and its main components, approach to estimation of experimental uncertainty and calibration of model input parameters that are not directly measured in the experiment are discussed in the paper. First experimental data from the forced to natural circulation transient is also provided in the paper

  17. The TALL-3D facility design and commissioning tests for validation of coupled STH and CFD codes

    Energy Technology Data Exchange (ETDEWEB)

    Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se; Jeltsov, Marti, E-mail: marti@safety.sci.kth.se; Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se; Karbojian, Aram, E-mail: karbojan@kth.se; Villanueva, Walter, E-mail: walter@safety.sci.kth.se; Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se

    2015-08-15

    Highlights: • Design of a heavy liquid thermal-hydraulic loop for CFD/STH code validation. • Description of the loop instrumentation and assessment of measurement error. • Experimental data from forced to natural circulation transient. - Abstract: Application of coupled CFD (Computational Fluid Dynamics) and STH (System Thermal Hydraulics) codes is a prerequisite for computationally affordable and sufficiently accurate prediction of thermal-hydraulics of complex systems. Coupled STH and CFD codes require validation for understanding and quantification of the sources of uncertainties in the code prediction. TALL-3D is a liquid Lead Bismuth Eutectic (LBE) loop developed according to the requirements for the experimental data for validation of coupled STH and CFD codes. The goals of the facility design are to provide (i) mutual feedback between natural circulation in the loop and complex 3D mixing and stratification phenomena in the pool-type test section, (ii) a possibility to validate standalone STH and CFD codes for each subsection of the facility, and (iii) sufficient number of experimental data to separate the process of input model calibration and code validation. Description of the facility design and its main components, approach to estimation of experimental uncertainty and calibration of model input parameters that are not directly measured in the experiment are discussed in the paper. First experimental data from the forced to natural circulation transient is also provided in the paper.

  18. Low Bit Rate Video Coding | Mishra | Nigerian Journal of Technology

    African Journals Online (AJOL)

    Nigerian Journal of Technology, Vol. 32, No. 3 (2013).

  19. Validating an infrared thermal switch as a novel access technology

    Directory of Open Access Journals (Sweden)

    Memarian Negar

    2010-08-01

    Background: Recently, a novel single-switch access technology based on infrared thermography was proposed. The technology exploits the temperature differences between the inside and surrounding areas of the mouth as a switch trigger, thereby allowing voluntary switch activation upon mouth opening. However, for this technology to be clinically viable, it must be validated against a gold standard switch, such as a chin switch, that taps into the same voluntary motion. Methods: In this study, we report an experiment designed to gauge the concurrent validity of the infrared thermal switch. Ten able-bodied adults participated in a series of 3 test sessions where they simultaneously used both an infrared thermal and a conventional chin switch to perform multiple trials of a number identification task with visual, auditory and audiovisual stimuli. Participants also provided qualitative feedback about switch use. User performance with the two switches was quantified using an efficiency measure based on mutual information. Results: User performance (p = 0.16) and response time (p = 0.25) with the infrared thermal switch were comparable to those of the gold standard. Users reported a preference for the infrared thermal switch given its non-contact nature and robustness to changes in user posture. Conclusions: Thermal infrared access technology appears to be a valid single-switch alternative for individuals with disabilities who retain voluntary mouth opening and closing.
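
    Mutual-information efficiency measures of this kind reduce to a calculation over the stimulus/response confusion matrix. The fragment below shows that generic calculation with a made-up matrix; it is not the specific metric implementation used in the study.

        import numpy as np

        def mutual_information_bits(confusion):
            """Mutual information (bits/trial) of a stimulus-by-response count matrix."""
            p = np.asarray(confusion, dtype=float)
            p /= p.sum()
            px = p.sum(axis=1, keepdims=True)
            py = p.sum(axis=0, keepdims=True)
            with np.errstate(divide="ignore", invalid="ignore"):
                terms = p * np.log2(p / (px * py))
            return np.nansum(terms)

        # Made-up 2x2 confusion matrix (rows: stimuli, columns: responses)
        print(mutual_information_bits([[18, 2],
                                       [3, 17]]))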

  20. Decay heat experiment and validation of calculation code systems for fusion reactor

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Ikeda, Yujiro; Wada, Masayuki

    1999-10-01

    Although an accurate estimate of the decay heat value is essential for safety analyses of fusion reactors against loss of coolant accidents and the like, no experimental work had been devoted to validating the estimation. Hence, a decay heat measurement experiment was performed as a task (T-339) of ITER/EDA. A new detector, the Whole Energy Absorption Spectrometer (WEAS), was developed for accurate and efficient measurements of decay heat. Decay heat produced in thirty-two sample materials that were irradiated by 14-MeV neutrons at FNS/JAERI was measured with WEAS over a wide range of cooling times, from 1 min to 400 days. The data obtained are the first experimental decay heat data in the field of fusion. The validity of the decay heat calculation codes ACT4 and CINAC-V4, the activation cross section libraries FENDL/A-2.0 and the JENDL Activation File, and decay data was investigated through analyses of the experiment. As a result, several points that should be modified were found in the codes and data. After solving these problems, it was demonstrated that the decay heat values calculated for most of the samples were in good agreement with the experimental data. Especially for stainless steel 316 and copper, which are important materials for ITER, decay heat could be predicted with an accuracy of ±10%. (author)
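
    Predictions of this type are, at bottom, a summation over the activated nuclide inventory of activity times mean energy released per decay. The sketch below shows that summation with a hypothetical two-nuclide inventory; it is not the ACT4 or CINAC-V4 algorithm.

        import math

        MEV_TO_J = 1.602176634e-13

        def decay_heat_watts(inventory):
            """inventory: iterable of (atoms, half_life_s, energy_per_decay_MeV)."""
            total = 0.0
            for atoms, half_life, e_mev in inventory:
                lam = math.log(2.0) / half_life      # decay constant (1/s)
                total += lam * atoms * e_mev * MEV_TO_J
            return total

        # Hypothetical inventory with roughly Co-60-like and Cs-137-like entries
        sample = [(1.0e18, 5.27 * 3.156e7, 2.6),
                  (5.0e17, 30.1 * 3.156e7, 1.0)]
        print(decay_heat_watts(sample))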

  1. Validation of two-phase flow code THYC on VATICAN experiment

    International Nuclear Information System (INIS)

    Maurel, F.; Portesse, A.; Rimbert, P.; Thomas, B.

    1997-01-01

    As part of a comprehensive program for THYC validation (THYC is a 3-dimensional two-phase flow computer code for PWR core configurations), an experimental project, VATICAN, has been initiated by the Direction des Etudes et Recherches of Electricite de France. Two mock-ups tested in Refrigerant-114, VATICAN-1 (with simple space grids) and VATICAN-2 (with mixing grids), were set up to investigate void fraction distributions using a single-beam gamma densitometer. First, experiments were conducted with the VATICAN-1 mock-up. A set of constitutive laws to be used in rod bundles was determined, but some doubts still remain about the friction loss closure laws for oblique flow over tubes. From the VATICAN-2 tests, calculations were performed using the standard set of correlations. Comparison with the experimental data shows an underprediction of void fraction by THYC in disturbed regions. Analyses highlight the poor treatment of the axial relative velocity in these regions. A fitting of the radial and axial relative velocity values in the disturbed region improves the prediction of void fraction by the code, but without any physical explanation. More analytical experiments should be carried out to validate friction loss closure laws for oblique flows and the relative velocity downstream of a mixing grid. (author)

  2. Validation of two-phase flow code THYC on VATICAN experiment

    Energy Technology Data Exchange (ETDEWEB)

    Maurel, F.; Portesse, A.; Rimbert, P.; Thomas, B. [EDF/DER, Dept. TTA, 78 - Chatou (France)

    1997-12-31

    As part of a comprehensive program for THYC validation (THYC is a 3-dimensional two-phase flow computer code for PWR core configurations), an experimental project, VATICAN, has been initiated by the Direction des Etudes et Recherches of Electricite de France. Two mock-ups tested in Refrigerant-114, VATICAN-1 (with simple space grids) and VATICAN-2 (with mixing grids), were set up to investigate void fraction distributions using a single-beam gamma densitometer. First, experiments were conducted with the VATICAN-1 mock-up. A set of constitutive laws to be used in rod bundles was determined, but some doubts still remain about the friction loss closure laws for oblique flow over tubes. From the VATICAN-2 tests, calculations were performed using the standard set of correlations. Comparison with the experimental data shows an underprediction of void fraction by THYC in disturbed regions. Analyses highlight the poor treatment of the axial relative velocity in these regions. A fitting of the radial and axial relative velocity values in the disturbed region improves the prediction of void fraction by the code, but without any physical explanation. More analytical experiments should be carried out to validate friction loss closure laws for oblique flows and the relative velocity downstream of a mixing grid. (author)

  3. Decay heat experiment and validation of calculation code systems for fusion reactor

    Energy Technology Data Exchange (ETDEWEB)

    Maekawa, Fujio; Ikeda, Yujiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Wada, Masayuki

    1999-10-01

    Although an accurate estimate of the decay heat value is essential for safety analyses of fusion reactors against loss of coolant accidents and the like, no experimental work had been devoted to validating the estimation. Hence, a decay heat measurement experiment was performed as a task (T-339) of ITER/EDA. A new detector, the Whole Energy Absorption Spectrometer (WEAS), was developed for accurate and efficient measurements of decay heat. Decay heat produced in thirty-two sample materials that were irradiated by 14-MeV neutrons at FNS/JAERI was measured with WEAS over a wide range of cooling times, from 1 min to 400 days. The data obtained are the first experimental decay heat data in the field of fusion. The validity of the decay heat calculation codes ACT4 and CINAC-V4, the activation cross section libraries FENDL/A-2.0 and the JENDL Activation File, and decay data was investigated through analyses of the experiment. As a result, several points that should be modified were found in the codes and data. After solving these problems, it was demonstrated that the decay heat values calculated for most of the samples were in good agreement with the experimental data. Especially for stainless steel 316 and copper, which are important materials for ITER, decay heat could be predicted with an accuracy of ±10%. (author)

  4. Validations of BWR nuclear design code using ABWR MOX numerical benchmark problems

    International Nuclear Information System (INIS)

    Takano, Shou; Sasagawa, Masaru; Yamana, Teppei; Ikehara, Tadashi; Yanagisawa, Naoki

    2017-01-01

    The BWR core design code package (the HINES assembly code and the PANACH core simulator), which is used for full-MOX ABWR core design, has been benchmarked against high-fidelity numerical solutions as references, for the purpose of validating its capability of predicting the BWR core design parameters systematically from UO₂ to 100% MOX cores. The reference solutions were created by whole-core criticality calculations using MCNP codes with precisely modeled ABWR cores, both in hot and cold conditions, at BOC and EOC of the equilibrium cycle. A Doppler-Broadening Rejection Correction (DBRC) implementation of MCNP5-1.4 with ENDF/B-VII.0 was mainly used to evaluate the core design parameters, except for the effective delayed neutron fraction (βeff) and prompt neutron lifetime (l), which were evaluated with MCNP6.1. The discrepancies between the design codes HINES-PANACH and the MCNP codes for core design parameters such as bundle powers, hot pin powers, control rod worth, boron worth, void reactivity, Doppler reactivity, βeff and l are almost all within the target accuracy, leading to the conclusion that HINES-PANACH has sufficient fidelity for application to full-MOX ABWR core design. (author)

  5. A validation of ATR LOCA thermal-hydraulic code with a statistical approach

    International Nuclear Information System (INIS)

    Mochizuki, Hiroyasu

    2000-01-01

    When cladding temperatures are measured in a blowdown experiment, the cladding temperatures at the same elevation in the fuel bundle usually show some differences due to eccentricity of the fuel bundle and other causes such as biased two-phase flow. In the present paper, manufacturing tolerances and thermal-hydraulic uncertainties are incorporated into a LOCA code analysis that applies a statistical method. The method was validated against the results of different blowdown experiments conducted using the 6 MW blowdown facility simulating the Advanced Thermal Reactor (ATR). In the present statistical method, the code was modified to run fast so that the blowdown thermal-hydraulics could be calculated many times using different sets of input data. The input data for dimensions and empirical correlations are prepared by an effective Monte-Carlo method based on distribution functions deduced from the measured manufacturing errors and the thermal-hydraulic uncertainties. The calculated curves express the uncertainties due to the different input decks. The uncertainty band and the tendency of the cladding temperature depended on the break sizes in the experiment. The measured results were reproduced by the present method. (author)
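
    The statistical treatment described above amounts to sampling the uncertain inputs from assumed distributions, running the fast-running blowdown model once per sample, and examining the spread of the result. The sketch below illustrates only that sampling loop; the distributions, bounds and the one-line stand-in "model" are placeholders, not the paper's data or code.

        import random

        def sample_input_deck(rng):
            # placeholder tolerances and correlation uncertainty
            return {
                "clad_thickness_mm": rng.gauss(0.90, 0.02),
                "gap_width_um":      rng.gauss(80.0, 5.0),
                "htc_multiplier":    rng.uniform(0.8, 1.2),
            }

        def fake_peak_clad_temp(deck):
            # stand-in for the fast-running blowdown thermal-hydraulics calculation
            return (600.0 + 40.0 / deck["htc_multiplier"]
                    + 0.5 * deck["gap_width_um"] - 20.0 * deck["clad_thickness_mm"])

        rng = random.Random(1)
        results = [fake_peak_clad_temp(sample_input_deck(rng)) for _ in range(200)]
        print(min(results), max(results))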

  6. The data requirements for the verification and validation of a fuel performance code - the transuranus perspective

    International Nuclear Information System (INIS)

    Schubert, A.; Di Marcello, V.; Rondinella, V.; Van De Laar, J.; Van Uffelen, P.

    2013-01-01

    In general, the verification and validation (V and V) of a fuel performance code like TRANSURANUS consists of three basic steps: a) verifying the correctness and numerical stability of the sub-models; b) comparing the sub-models with experimental data; c) comparing the results of the integral fuel performance code with experimental data. Only the second and third steps of the V and V rely on experimental information. This scheme can be further detailed according to the physical origin of the data: on the one hand, in-reactor ('in-pile') experimental data are generated in the course of the irradiation; on the other hand, ex-reactor ('out-of-pile') experimental data are obtained for instance from various post-irradiation examinations (PIE) or dedicated experiments with fresh samples. For both categories, we will first discuss the V and V of sub-models of TRANSURANUS related to separate aspects of the fuel behaviour: this includes the radial variation of the composition and fissile isotopes, the thermal properties of the fuel (e.g. thermal conductivity, melting temperature, etc.), the mechanical properties of fuel and cladding (e.g. elastic constants, creep properties), as well as the models for the fission product behaviour. Secondly, the integral code verification will be addressed as it treats various aspects of the fuel behaviour, including the geometrical changes in the fuel and the gas pressure and composition of the free volume in the rod. (authors)

  7. A Supersonic Argon/Air Coaxial Jet Experiment for Computational Fluid Dynamics Code Validation

    Science.gov (United States)

    Clifton, Chandler W.; Cutler, Andrew D.

    2007-01-01

    A non-reacting experiment is described in which data has been acquired for the validation of CFD codes used to design high-speed air-breathing engines. A coaxial jet-nozzle has been designed to produce pressure-matched exit flows of Mach 1.8 at 1 atm in both a center jet of argon and a coflow jet of air, creating a supersonic, incompressible mixing layer. The flowfield was surveyed using total temperature, gas composition, and Pitot probes. The data set was compared to CFD code predictions made using Vulcan, a structured grid Navier-Stokes code, as well as to data from a previous experiment in which a He-O2 mixture was used instead of argon in the center jet of the same coaxial jet assembly. Comparison of experimental data from the argon flowfield and its computational prediction shows that the CFD produces an accurate solution for most of the measured flowfield. However, the CFD prediction deviates from the experimental data in the region downstream of x/D = 4, underpredicting the mixing-layer growth rate.

  8. Validation of the metal fuel version of the SAS4A accident analysis code

    International Nuclear Information System (INIS)

    Tentner, A.M.

    1991-01-01

    This paper describes recent work directed towards the validation of the metal fuel version of the SAS4A accident analysis code. The SAS4A code system has been developed at Argonne National Laboratory for the simulation of hypothetical severe accidents in Liquid Metal-Cooled Reactors (LMR), designed to operate in a fast neutron spectrum. SAS4A was initially developed for the analysis of oxide-fueled liquid metal-cooled reactors and has played an important role in the simulation and assessment of the energetics potential for postulated severe accidents in these reactors. Due to the current interest in the metal-fueled liquid metal-cooled reactors, a metal fuel version of the SAS4A accident analysis code is being developed in the Integral Fast Reactor program at Argonne. During such postulated accident scenarios as the unprotected (i.e. without scram) loss-of-flow and transient overpower events, a large number of interrelated physical phenomena occur during a relatively short time. These phenomena include transient heat transfer and hydrodynamic events, coolant boiling, and fuel and cladding melting and relocation. Due to strong neutronic feedbacks these events can significantly influence the reactor power history in the accident progression. The paper presents the results of a recent SAS4A simulation of the M7 TREAT experiment. 6 refs., 5 figs

  9. Renewable Energy and Energy Efficiency Technologies in Residential Building Codes: June 15, 1998 to September 15, 1998

    Energy Technology Data Exchange (ETDEWEB)

    Wortman, D.; Echo-Hawk, L.

    2005-02-01

    This report is an attempt to describe the building code requirements and impediments to the application of EE and RE technologies in residential buildings. Several modern model building codes were reviewed. These are representative of the codes that will be adopted by most locations in the coming years. The codes reviewed for this report include: International Residential Code, First Draft, April 1998; International Energy Conservation Code, 1998; International Mechanical Code, 1998; International Plumbing Code, 1997; International Fuel Gas Code, 1997; National Electrical Code, 1996. These codes were reviewed as to their application to (1) PV systems in buildings and building-integrated PV systems and (2) active solar domestic hot water and space-heating systems. A discussion of general code issues that impact these technologies is also included. Examples of this are solar access and sustainability.

  10. COCOSYS: Status of development and validation of the German containment code system

    International Nuclear Information System (INIS)

    Allelein, H.-J.; Arndt, S.; Klein-Hessling, W.; Schwarz, S.; Spengler, C.; Weber, G.

    2006-01-01

    For the simulation of severe accident propagation in the containments of nuclear power plants it is necessary to assess the efficiency of severe accident measures under conditions as realistic as possible. Therefore the German containment code system COCOSYS is under development and validation at GRS. The main objective is to provide a code system on the basis of mostly mechanistic models for the comprehensive simulation of all relevant processes and plant states during severe accidents in the containment of light water reactors, covering design basis accidents, too. COCOSYS is being used for the identification of possible deficits in plant safety, qualification of the safety reserves of the entire system, assessment of damage-limiting or mitigating accident management measures, support of integral codes in PSA level 2 studies and safety evaluation of new plants. COCOSYS is composed of three main modules, which are separate executable files. The communication is realized via PVM (parallel virtual machine). The thermal-hydraulic main module (THY) contains several specific models relevant for the simulation of severe accidents. Besides the usual capabilities to calculate the gas distribution and thermal behavior inside the containment, there are special models for the simulation of hydrogen deflagration, pressure suppression systems, etc. Further detailed models exist for the simulation of safety systems, like catalytic recombiners (PARs), safety relief valves (used in WWER-440/V-230 type plants), an ice condenser model, and pump and spray system models for the complete simulation of cooling systems. The aerosol and fission product part (AFP) describes the behavior of insoluble as well as hygroscopic aerosols, iodine chemistry and fission product transport. Further, the decay of nuclides is considered using ORIGEN-like routines. The corium concrete interaction (CCI) main module is based on an improved version of WECHSL extended by the ChemApp module for the

  11. PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility

    Energy Technology Data Exchange (ETDEWEB)

    Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States); Skifton, Richard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Stoots, Carl [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Eung Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Conder, Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-12-01

    Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high-quality multi-dimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This displacement is indicated by the location of the largest correlation peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. This study is developing a reliable method, and an accompanying computer code, to analyze the uncertainty and sensitivity of the measured data. The main objective is to develop a well-established uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. In this study, the uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing). Each uncertainty source is then mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility with sample test results.
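
    To make the cross-correlation step concrete, the sketch below estimates the integer-pixel displacement between two PIV interrogation windows by locating the peak of their cross-correlation map. This is a minimal Python illustration using NumPy and SciPy, not the INL analysis code; the window arrays and the plain argmax peak search (no sub-pixel refinement) are assumptions made for the example.

    import numpy as np
    from scipy.signal import correlate

    def piv_displacement(window_a, window_b):
        """Estimate the particle displacement (dx, dy) from frame A to frame B:
        the peak of the cross-correlation map gives the most likely shift."""
        a = window_a - window_a.mean()
        b = window_b - window_b.mean()
        corr = correlate(b, a, mode="full", method="fft")
        peak_row, peak_col = np.unravel_index(np.argmax(corr), corr.shape)
        # Zero displacement corresponds to the centre of the 'full' correlation map.
        dy = peak_row - (window_a.shape[0] - 1)
        dx = peak_col - (window_a.shape[1] - 1)
        return dx, dy

    In practice a sub-pixel estimate is obtained by fitting a Gaussian through the peak and its neighbours; that refinement is omitted here.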

  12. Sound Synthesis and Bar-Code Technology to Develop Learning Environments for Blind Children.

    Science.gov (United States)

    Burger, D.; And Others

    1990-01-01

    An interactive, computerized sound machine was designed, incorporating bar-code technology in the user interface. The system was used in a classroom of nine blind elementary level children to teach sound awareness, logic, metalinguistics, and technological literacy and was found to have pedagogical relevance. (Author/JDD)

  13. Validation of integrated burnup code system SWAT2 by the analyses of isotopic composition of spent nuclear fuel

    International Nuclear Information System (INIS)

    Suyama, K.; Mochizuki, H.; Okuno, H.; Miyoshi, Y.

    2004-01-01

    This paper provides validation results for SWAT2, the revised version of SWAT, a code system combining the point burnup code ORIGEN2 and the continuous-energy Monte Carlo code MVP, through the analysis of post-irradiation examinations (PIEs). Some isotopes show differences in the calculated results between SWAT and SWAT2. Generally, however, the differences are smaller than the error of the PIE analysis reported in the previous SWAT validation activity, and improved results are obtained for several important fission product nuclides. This study also includes a comparison between assembly and single-pin-cell geometry models. (authors)

  14. The Effects of Bar-coding Technology on Medication Errors: A Systematic Literature Review.

    Science.gov (United States)

    Hutton, Kevin; Ding, Qian; Wellman, Gregory

    2017-02-24

    Adoption of bar-coding technology has risen drastically in U.S. health systems in the past decade. However, few studies have addressed the impact of bar-coding technology with strong prospective methodologies, and the research that has been conducted spans both in-pharmacy and bedside implementations. This systematic literature review examines the effectiveness of bar-coding technology in preventing medication errors, and which types of medication errors may be prevented, in the hospital setting. A systematic search of databases was performed from 1998 to December 2016. Studies measuring the effect of bar-coding technology on medication errors were included in a full-text review. Studies with outcomes other than medication errors, such as efficiency or workarounds, were excluded. The outcomes were measured and findings were summarized for each retained study. A total of 2603 articles were initially identified, and 10 studies, which used prospective before-and-after study designs, were fully reviewed in this article. Of the 10 included studies, 9 took place in the United States, whereas the remaining study was conducted in the United Kingdom. One article focused on bar-coding implementation in a pharmacy setting, whereas the other 9 focused on bar coding within patient care areas. All 10 studies showed overall positive effects associated with bar-coding implementation. The results of this review show that bar-coding technology may reduce medication errors in hospital settings, particularly targeted wrong dose, wrong drug, wrong patient, unauthorized drug, and wrong route errors.

  15. Validation of favor code linear elastic fracture solutions for finite-length flaw geometries

    International Nuclear Information System (INIS)

    Dickson, T.L.; Keeney, J.A.; Bryson, J.W.

    1995-01-01

    One of the current tasks within the US Nuclear Regulatory Commission (NRC)-funded Heavy Section Steel Technology (HSST) Program at Oak Ridge National Laboratory (ORNL) is the continuing development of the FAVOR (Fracture Analysis of Vessels: Oak Ridge) computer code. FAVOR performs structural integrity analyses of embrittled nuclear reactor pressure vessels (RPVs) with stainless steel cladding to evaluate compliance with the applicable regulatory criteria. Since the initial release of FAVOR, the HSST program has continued to enhance the capabilities of the code. ABAQUS, a nuclear quality assurance certified (NQA-1) general multidimensional finite element code with fracture mechanics capabilities, was used to generate a database of stress-intensity-factor influence coefficients (SIFICs) for a range of axially and circumferentially oriented semielliptical inner-surface flaw geometries applicable to RPVs with an internal radius (Ri) to wall thickness (w) ratio of 10. This database of SIFICs has been incorporated into a development version of FAVOR, providing it with the capability to perform deterministic and probabilistic fracture analyses of RPVs subjected to transients, such as pressurized thermal shock (PTS), for various flaw geometries. This paper discusses the SIFIC database, comparisons with other investigators, and some of the benchmark verification problem specifications and solutions

  16. Location Based Service in Indoor Environment Using Quick Response Code Technology

    Science.gov (United States)

    Hakimpour, F.; Zare Zardiny, A.

    2014-10-01

    Today, with the extensive use of smart mobile phones, larger screens and the enrichment of mobile phones with Global Positioning System (GPS) technology, location-based services (LBS) are used by the public more than ever. Based on their position, users can receive the desired information from different LBS providers. Any LBS system generally includes five main parts: mobile devices, a communication network, a positioning system, a service provider and a data provider. Many advances have been made in each of these parts; however, user positioning, especially in indoor environments, remains an essential and critical issue in LBS. It is well known that GPS performs too poorly inside buildings to provide usable indoor positioning. On the other hand, current indoor positioning technologies such as RFID or WiFi networks need different hardware and software infrastructures. In this paper, we propose a new method to overcome these challenges, using Quick Response (QR) Code technology. A QR Code is a 2D barcode with a matrix structure consisting of black modules arranged in a square grid. Scanning and retrieving data from a QR Code is possible with camera-enabled mobile phones simply by installing barcode reader software. This paper reviews the capabilities of QR Code technology and then discusses the advantages of using QR Codes in an indoor LBS (ILBS) system in comparison to other technologies. Finally, some prospects of using QR Codes are illustrated through the implementation of a scenario. The most important advantages of this technology in ILBS are easy implementation, low cost, quick data retrieval, the possibility of printing the QR Code on different products, and no need for complicated hardware and software infrastructures.
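
    As a small illustration of the data-carrying side of this approach, the Python sketch below generates a QR Code image encoding a hypothetical indoor-location payload (building, floor, room and local coordinates). It assumes the third-party qrcode package (installed with Pillow support); the payload format is invented for the example and is not the scheme used by the authors.

    import json
    import qrcode  # third-party package: pip install qrcode[pil]

    # Hypothetical indoor-location payload; a real ILBS would define its own scheme.
    payload = json.dumps({"building": "Library", "floor": 2, "room": "B12",
                          "x_m": 14.5, "y_m": 3.2})

    img = qrcode.make(payload)     # build the QR Code symbol
    img.save("room_B12_qr.png")    # print the image and post it at that location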

  17. COBRA-SFS [Spent Fuel Storage]: A thermal-hydraulic analysis computer code: Volume 3, Validation assessments

    International Nuclear Information System (INIS)

    Lombardo, N.J.; Cuta, J.M.; Michener, T.E.; Rector, D.R.; Wheeler, C.L.

    1986-12-01

    This report presents the results of the COBRA-SFS (Spent Fuel Storage) computer code validation effort. COBRA-SFS, while refined and specialized for spent fuel storage system analyses, is a lumped-volume thermal-hydraulic analysis computer code that predicts temperature and velocity distributions in a wide variety of systems. Through comparisons of code predictions with spent fuel storage system test data, the code's mathematical, physical, and mechanistic models are assessed, and empirical relations defined. The six test cases used to validate the code and code models include single-assembly and multiassembly storage systems with a variety of fill media and system orientations, covering both unconsolidated and consolidated spent fuel. In its entirety, the test matrix investigates the contributions of convection, conduction, and radiation heat transfer in spent fuel storage systems. To demonstrate the code's performance for a wide variety of storage systems and conditions, comparisons of code predictions with data are made for 14 runs from the experimental database. The cases selected exercise the important code models and code logic pathways and are representative of the types of simulations required for spent fuel storage system design and licensing safety analyses. For each test, a test description and a summary of the COBRA-SFS computational model, assumptions, and correlations employed are presented. For the cases selected, axial and radial temperature profile comparisons of code predictions with test data are provided, and conclusions are drawn concerning the code models and their ability to predict the data and data trends. Comparisons of code predictions with test data demonstrate the ability of COBRA-SFS to successfully predict temperature distributions in unconsolidated or consolidated single and multiassembly spent fuel storage systems

  18. Validation of an ICD code for accurately identifying emergency department patients who suffer an out-of-hospital cardiac arrest.

    Science.gov (United States)

    Shelton, Shelby K; Chukwulebe, Steve B; Gaieski, David F; Abella, Benjamin S; Carr, Brendan G; Perman, Sarah M

    2018-01-16

    The International Classification of Diseases, 9th revision (ICD-9) code 427.5 (cardiac arrest) is utilized to identify cohorts of patients who suffer out-of-hospital cardiac arrest (OHCA), though the use of ICD codes for this purpose has never been formally validated. We sought to validate the utility of ICD-9 code 427.5 in identifying patients admitted from the emergency department (ED) after OHCA. Adult visits to a single ED between January 2007 and July 2012 were retrospectively examined, and a keyword search of the electronic medical record (EMR) was used to identify patients. Cardiac arrest was confirmed, and ICD-9 information and the location of return of spontaneous circulation (ROSC) were collected. Separately, the EMR was searched for patients who received ICD-9 code 427.5. The kappa coefficient (κ) was calculated, as were the sensitivity and specificity of the code for identifying OHCA. The keyword search identified 1717 patients, of which 385 suffered OHCA and 333 were assigned the code 427.5. The agreement between ICD-9 code and cardiac arrest was excellent (κ = 0.895). The ICD-9 code 427.5 was both specific (99.4%) and sensitive (86.5%). Of the 52 cardiac arrests that were not identified by ICD-9 code, 33% had ROSC before arrival to the ED. When searching independently on ICD-9 code, 347 patients with ICD-9 code 427.5 were found, of which 320 were "true" arrests. This yielded a positive predictive value of 92% for ICD-9 code 427.5 in predicting OHCA. ICD-9 code 427.5 is sensitive and specific for identifying ED patients who suffer OHCA, with a positive predictive value of 92%.
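
    The agreement statistics reported here (sensitivity, specificity, positive and negative predictive values, and the kappa coefficient) all derive from a 2x2 table of code assignment versus confirmed cardiac arrest. The Python sketch below shows the standard calculations; the cell counts passed to it would come from a chart review and are not taken from this study.

    def diagnostic_agreement(tp, fp, fn, tn):
        """Agreement statistics for an ICD code versus a gold-standard diagnosis.
        tp: coded and true case; fp: coded, not a case;
        fn: not coded, true case; tn: not coded, not a case."""
        n = tp + fp + fn + tn
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)
        npv = tn / (tn + fn)
        observed = (tp + tn) / n
        expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
        kappa = (observed - expected) / (1 - expected)  # Cohen's kappa
        return {"sensitivity": sensitivity, "specificity": specificity,
                "ppv": ppv, "npv": npv, "kappa": kappa}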

  19. Basic data, computer codes and integral experiments: The tools for modelling in nuclear technology

    International Nuclear Information System (INIS)

    Sartori, E.

    2001-01-01

    When studying applications in nuclear technology we need to understand and be able to predict the behavior of systems manufactured by human enterprise. First, the underlying basic physical and chemical phenomena need to be understood. We then have to predict the macroscopic effects resulting from the interplay of a large number of different basic events. In order to build confidence in our modelling capability, we then need to compare these results against measurements carried out on such systems. The different levels of modelling require the solution of different types of equations using different types of parameters. The tools required for carrying out a complete validated analysis are: - the basic nuclear or chemical data; - the computer codes; and - the integral experiments. This article describes the role each component plays in a computational scheme designed for modelling purposes. It also describes which tools have been developed and are internationally available, and the roles the OECD/NEA Data Bank, the Radiation Shielding Information Computational Center (RSICC) and the IAEA Nuclear Data Section play in making these elements available to the community of scientists and engineers. (author)

  20. Validation of AMPX-KENO code for criticality analysis under various moderator density condition

    International Nuclear Information System (INIS)

    Ahn, Joon Gi; Hwang, Hae Ryang; Kim, Hyeong Heon; Lee, Seong Hee

    1992-01-01

    Nuclear criticality safety analysis shall be performed for the storage and handling facilities of fissionable materials, and the calculational method used to determine the effective multiplication factor shall also be validated by comparison with proper experimental data. Benchmark calculations were performed for the criticality analysis of a new fuel storage facility using the AMPX-KENO computer code system. The references for the benchmark calculations are the critical experiments performed by the Nuclear Safety Department of the French Atomic Energy Commission to study the problems raised by the accidental sprinkling of a mist into a fuel storage. The bias and statistical uncertainties of the calculational method that will be applied in the criticality analysis of the new fuel storage facility were also evaluated

  1. Current status of thermohydraulic validation studies at CEA-Grenoble for the SIMMER-III code

    International Nuclear Information System (INIS)

    Coste, P.; Pigny, S.; Meignen, R.

    2000-01-01

    SIMMER-III (SIII) is a two-dimensional, three-velocity-field, multiphase, multicomponent, Eulerian fluid-dynamics code coupled with a space- and energy-dependent neutron kinetics model, used to investigate postulated core disruptive accidents in LMFRs. It is developed by PNC, Japan. The paper synthesizes the SIII assessment performed at CEA-Grenoble since 1996, which covers a large variety of multiphase flows, from basic two-phase flow modelling to LMFR accident simulation experiments with real materials. The equilibrium radii and velocities of single bubbles and droplets, air/water experiments in tubes, and comparisons with the literature are used to qualify the interfacial area convection equation and the momentum exchange functions. Using the second-order differencing scheme of the Navier-Stokes equation, a turbulence model for two-phase recirculating flows is implemented. It is successfully validated on an adiabatic air/water experiment and on the Sebulon boiling pool simulation experiment, which is a box of water internally heated, with a cover gas, and cooled at the walls. The successful calculations of the SGI experiment and of a reactor-scale case contribute to the code validation for the LMFR expansion phase. Besides, the large-scale UO2/sodium interactions of the Termos T1 experiment and the UO2 boiling pool laterally cooled with sodium flow at the wall of the Scarabee BF2 experiment are also studied with SIII. Lastly, satisfying results are obtained with the calculation of the Scarabee APL3 slow pump run-down without scram. It is shown that SIII is a state-of-the-art tool to simulate transient multiphase phenomena. The paper also discusses those areas, identified through these assessment calculations, which require further research and development. (author)

  2. Validation of annual average air concentration predictions from the AIRDOS-EPA computer code

    International Nuclear Information System (INIS)

    Miller, C.W.; Fields, D.E.; Cotter, S.J.

    1981-01-01

    The AIRDOS-EPA computer code is used to assess the annual doses to the general public resulting from releases of radionuclides to the atmosphere by Oak Ridge National Laboratory (ORNL) facilities. This code uses a modified Gaussian plume equation to estimate air concentrations resulting from the release of a maximum of 36 radionuclides. Radionuclide concentrations in food products are estimated from the output of the atmospheric transport model using the terrestrial transport model described in US Nuclear Regulatory Commission Regulatory Guide 1.109. Doses to man at each distance and direction specified are estimated for up to eleven organs and five exposure modes. To properly use any environmental transport model, some estimate of the model's predictive accuracy must be obtained. Because of a lack of sufficient data for the ORNL site, one year of weekly average 85Kr concentrations observed at 13 stations located 30 to 150 km from an assumed-continuous point source at the Savannah River Plant, Aiken, South Carolina, has been used in a validation study of the atmospheric transport portion of AIRDOS-EPA. The predicted annual average concentration at each station exceeded the observed value in every case. The overprediction factor ranged from 1.4 to 3.4, with an average value of 2.4. Pearson's correlation between pairs of logarithms of observed and predicted values was r = 0.93. Based on a one-tailed Student's t-test, we can be 98% confident that for this site, under similar meteorological, release, and monitoring conditions, no annual average air concentrations will be observed at the sampling stations in excess of those predicted by the code. As the averaging time of the prediction decreases, however, the uncertainty in the prediction increases
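
    The two summary statistics used in this comparison, the per-station overprediction factor and Pearson's correlation between the logarithms of observed and predicted concentrations, can be computed as in the short Python sketch below. The input arrays are placeholders; the Savannah River monitoring data are not reproduced here.

    import numpy as np

    def bias_summary(observed, predicted):
        """Overprediction factors (predicted/observed) and Pearson's r between
        log-observed and log-predicted annual average air concentrations."""
        obs = np.asarray(observed, dtype=float)
        pred = np.asarray(predicted, dtype=float)
        factors = pred / obs  # > 1 means the code overpredicts
        r = np.corrcoef(np.log(obs), np.log(pred))[0, 1]
        return factors.min(), factors.max(), factors.mean(), r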

  3. Development of the SEAtrace(TM) barrier verification and validation technology. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, S.D.; Lowry, W.; Walsh, R.; Rao, D.V. [Science and Engineering Associates, Santa Fe, NM (United States); Williams, C. [Sandia National Labs., Albuquerque, NM (United States). Underground Storage Technology Dept.

    1998-08-01

    In-situ barrier emplacement techniques and materials for the containment of high-risk contaminants in soils are currently being developed by the Department of Energy (DOE). Because of their relatively high cost, the barriers are intended to be used in cases where the risk is too great to remove the contaminants, the contaminants are too difficult to remove with current technologies, or the potential movement of the contaminants to the water table is so high that immediate action needs to be taken to reduce health risks. Assessing the integrity of the barrier once it is emplaced, and during its anticipated life, is a very difficult but necessary requirement. Science and Engineering Associates, Inc. (SEA) and Sandia National Laboratories (SNL) have developed a quantitative subsurface barrier assessment system using gaseous tracers in support of the Subsurface Contaminants Focus Area barrier technology program. Called SEAtrace(TM), this system integrates an autonomous, multi-point soil vapor sampling and analysis system with a global optimization modeling methodology to locate and size barrier breaches in real time. The methodology for the global optimization code was completed and a prototype code written using simplifying assumptions. Preliminary modeling work to validate the code assumptions was performed using the T2VOC numerical code. A multi-point field sampling system was built to take soil gas samples and analyze them for tracer gas concentration. The tracer concentration histories were used in the global optimization code to locate and size barrier breaches. SEAtrace(TM) was consistently able to detect and locate leaks, even under very adverse conditions. The system was able to locate the leak to within 0.75 m of the actual value, and was able to determine the size of the leak to within 0.15 m.

  4. Validation of a Computational Fluid Dynamics (CFD) Code for Supersonic Axisymmetric Base Flow

    Science.gov (United States)

    Tucker, P. Kevin

    1993-01-01

    The ability to accurately and efficiently calculate the flow structure in the base region of bodies of revolution in supersonic flight is a significant step in CFD code validation for applications ranging from base heating for rockets to drag for projectiles. The FDNS code is used to compute such a flow, and the results are compared to benchmark-quality experimental data. Flowfield calculations are presented for a cylindrical afterbody at M = 2.46 and angle of attack α = 0. Grid-independent solutions are compared to mean velocity profiles in the separated wake area and downstream of the reattachment point. Additionally, quantities such as turbulent kinetic energy and shear layer growth rates are compared to the data. Finally, the computed base pressures are compared to the measured values. An effort is made to elucidate the role of turbulence models in the flowfield predictions. The level of turbulent eddy viscosity, and its origin, are used to contrast the various turbulence models and compare the results to the experimental data.

  5. Validation of Code ASTEC with LIVE-L1 Experimental Results

    International Nuclear Information System (INIS)

    Bachrata, Andrea

    2008-01-01

    Severe accidents with core melting are considered at the design stage of Generation 3+ nuclear power plants (NPPs). Moreover, there is an effort to apply severe accident management to currently operating NPPs. One of the main goals of severe accident mitigation is corium localization and stabilization. The two strategies that fulfil this requirement are in-vessel retention (e.g. AP-600, AP-1000) and ex-vessel retention (e.g. EPR). To study the scenario of in-vessel retention, a large experimental program and integrated codes have been developed. The LIVE-L1 experimental facility studied the formation of melt pools and the melt accumulation in the lower head using different cooling conditions. A new European computer code, ASTEC, is being developed jointly in France and Germany. One of the important steps in ASTEC development in the area of in-vessel retention of corium is its validation against the LIVE-L1 experimental results. Details of the experiment are reported. Results of applying ASTEC (module DIVA) to the analysis of the test are presented. (author)

  6. Development and validation of educational technology for venous ulcer care.

    Science.gov (United States)

    Benevides, Jéssica Lima; Coutinho, Janaina Fonseca Victor; Pascoal, Liliane Chagas; Joventino, Emanuella Silva; Martins, Mariana Cavalcante; Gubert, Fabiane do Amaral; Alves, Allana Mirella

    2016-04-01

    To develop and validate an educational technology for venous ulcer care. Methodological study conducted in five steps: situational diagnosis; literature review; development of texts, illustrations and layout; face and content validation by the Content Validity Index and assessment of the Flesch Readability Index; and pilot testing. The developed technology was a booklet entitled Booklet for Venous Ulcer Care, consisting of seven topics: diet and food intake, walking and light exercise, resting with the leg elevated, bandage care, compression therapy, family support, and keeping healthy habits. Face validation revealed a minimum agreement of 85.7% on clarity and comprehensibility. The total Content Validity Index was 0.97, and the Flesch Readability Index was 75%, corresponding to "fairly easy" reading. The pilot test showed that 100% of people with venous ulcers evaluated the text and the illustrations as understandable and appropriate. The educational technology proved valid in appearance and content, with potential for use in clinical practice.
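
    The Content Validity Index used above is, for each item, the proportion of expert judges rating the item as relevant (3 or 4 on a 4-point scale), averaged over items for the scale-level value. A minimal Python sketch follows; the rating data shown are hypothetical.

    def content_validity_index(ratings_by_item):
        """Item-level CVI = share of experts scoring the item 3 or 4 on a
        4-point relevance scale; scale-level CVI = mean of the item-level CVIs."""
        item_cvi = [sum(1 for r in item if r >= 3) / len(item)
                    for item in ratings_by_item]
        return item_cvi, sum(item_cvi) / len(item_cvi)

    # Hypothetical ratings: three booklet items judged by four experts each
    per_item, overall = content_validity_index([[4, 4, 3, 4], [3, 4, 4, 4], [4, 3, 4, 4]])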

  7. Validation matrix for the assessment of thermal-hydraulic codes for VVER LOCA and transients. A report by the OECD support group on the VVER thermal-hydraulic code validation matrix

    International Nuclear Information System (INIS)

    2001-06-01

    This report deals with an internationally agreed experimental test facility matrix for the validation of best estimate thermal-hydraulic computer codes applied to the analysis of VVER reactor primary systems in accident and transient conditions. Firstly, the main physical phenomena that occur during the considered accidents are identified, test types are specified, and test facilities that supplement the CSNI CCVMs and are suitable for reproducing these aspects are selected. Secondly, a list of selected experiments carried out in these facilities has been set down. The criteria to achieve the objectives are outlined. The construction of the VVER Thermal-Hydraulic Code Validation Matrix follows the logic of the CSNI Code Validation Matrices (CCVM). Similar to the CCVM, it is an attempt to collect together in a systematic way the best sets of available test data for VVER-specific code validation, assessment and improvement, including quantitative assessment of uncertainties in the modelling of phenomena by the codes. In addition to this objective, it is an attempt to record information which has been generated in countries operating VVER reactors over the last 20 years so that it is more accessible to present and future workers in that field than would otherwise be the case. (authors)

  8. Feasibility and validation of virtual autopsy for dental identification using the Interpol dental codes.

    Science.gov (United States)

    Franco, Ademir; Thevissen, Patrick; Coudyzer, Walter; Develter, Wim; Van de Voorde, Wim; Oyen, Raymond; Vandermeulen, Dirk; Jacobs, Reinhilde; Willems, Guy

    2013-05-01

    Virtual autopsy is a medical imaging technique using full-body computed tomography (CT) that allows a noninvasive and permanent observation of all body parts. For dental identification, clinically and radiologically observed ante-mortem (AM) and post-mortem (PM) oral identifiers are compared. This study aimed to verify whether PM dental charting can be performed on virtual reconstructions of full-body CTs using the Interpol dental codes. A sample of 103 PM full-body CTs was collected from the forensic autopsy files of the Department of Forensic Medicine, University Hospitals KU Leuven, Belgium. For validation purposes, 3 of these bodies underwent a complete dental autopsy, a dental radiological examination and a full-body CT examination. The bodies were scanned in a Siemens Definition Flash CT Scanner (Siemens Medical Solutions, Germany). The images were examined at 8- and 12-bit screen resolution as three-dimensional (3D) reconstructions and as axial, coronal and sagittal slices. InSpace(®) (Siemens Medical Solutions, Germany) software was used for 3D reconstruction. The dental identifiers were charted on pink PM Interpol forms (F1, F2), using the related dental codes. Optimal dental charting was obtained by combining observations on 3D reconstructions and CT slices. It was not feasible to differentiate between different kinds of dental restoration materials. The 12-bit resolution enabled more detailed evidence to be collected, mainly related to positions within a tooth. Oral identifiers not implemented in the Interpol dental coding were also observed. Amongst these, the observed 3D morphological features of dental and maxillofacial structures are important identifiers. The latter can become particularly more relevant in the future, not only because of the inherent spatial features, but also because of increasing preventive dental treatment and the decreasing application of dental restorations. In conclusion, PM full-body CT examinations need to be implemented in the

  9. Assessing the impact of automated coding & grouping technology at St Vincent's Hospital, Sydney.

    Science.gov (United States)

    Howes, M H

    1993-12-01

    In 1992 the Hospital recognised that the existing casemix data reporting systems were too removed from individual patients to have any meaning for clinicians, analysis of the data was difficult and the processes involved in the DRG assignment were subject to considerable error. Consequently, the Hospital approved the purchase of technology that would facilitate the coding and grouping process. The impact of automated coding and grouping technology is assessed by three methods. Firstly, by looking at by-product information systems, secondly, through subjective responses by coders to a satisfaction questionnaire and, thirdly, by objectively measuring hospital activity and identified coding elements before and after implementation of the 3M technology. It was concluded that while the 3M Coding and Grouping software should not be viewed as a panacea to all coding and documentation ills, objective evidence and subjective comment from the coders indicated an improvement in data quality and more accurate DRG assignment. Development of an in-house casemix information system and a feedback mechanism between coder and clinician had been effected. The product had been used as a training tool for coders and had also proven to be a useful auditing tool. Finally, linkage with other systems and the generation of timely reports had been realised.

  10. Validity of code based algorithms to identify primary open angle glaucoma (POAG) in Veterans Affairs (VA) administrative databases.

    Science.gov (United States)

    Biggerstaff, K S; Frankfort, B J; Orengo-Nania, S; Garcia, J; Chiao, E; Kramer, J R; White, D

    2018-04-01

    The validity of the International Classification of Diseases, 9th revision, Clinical Modification (ICD-9) code for primary open angle glaucoma (POAG) in the Department of Veterans Affairs (VA) electronic medical record has not been examined. We determined the accuracy of the ICD-9 code for POAG and developed diagnostic algorithms for the detection of POAG. We conducted a retrospective study of abstracted data from the Michael E. DeBakey VA Medical Center's medical records of 334 unique patients with at least one visit to the Eye Clinic between 1999 and 2013. Algorithms were developed to validly identify POAG using ICD-9 codes and pharmacy data. The positive predictive value (PPV), negative predictive value (NPV), sensitivity, specificity and percent agreement of the various algorithms were calculated. For the ICD-9 code 365.1x, the PPV was 65.9%, NPV was 95.2%, sensitivity was 100%, specificity was 82.6%, and percent agreement was 87.8%. The algorithm with the highest PPV was 76.3%, using pharmacy data in conjunction with two or more ICD-9 codes for POAG, but this algorithm also had the lowest NPV at 88.2%. Various algorithms for identifying POAG in the VA administrative databases have variable validity. Depending on the type of research being done, the ICD-9 code 365.1x can be used for epidemiologic or health services database research.

  11. The use of QR Code as a learning technology: an exploratory study

    Directory of Open Access Journals (Sweden)

    Stefano Besana

    2010-12-01

    This paper discusses a pilot study on the potential benefits of QR (Quick Response) Codes as a tool for facilitating and enhancing learning processes. An analysis is given of the strengths and added value of QR technologies applied to museum visits, with precautions regarding the design of learning environments like the one presented. Some possible future scenarios are identified for implementing these technologies in contexts more strictly related to teaching and education.

  12. Validity of the Child Facial Coding System for the Assessment of Acute Pain in Children With Cerebral Palsy.

    Science.gov (United States)

    Hadden, Kellie L; LeFort, Sandra; O'Brien, Michelle; Coyte, Peter C; Guerriere, Denise N

    2016-04-01

    The purpose of the current study was to examine the concurrent and discriminant validity of the Child Facial Coding System for children with cerebral palsy. Eighty-five children (mean = 8.35 years, SD = 4.72 years) were videotaped during a passive joint stretch with their physiotherapist and during 3 time segments: baseline, passive joint stretch, and recovery. Children's pain responses were rated from videotape using the Numerical Rating Scale and Child Facial Coding System. Results indicated that Child Facial Coding System scores during the passive joint stretch significantly correlated with Numerical Rating Scale scores (r = .72) in children with cerebral palsy.

  13. Validation of VHTRC calculation benchmark of critical experiment using the MCB code

    Directory of Open Access Journals (Sweden)

    Stanisz Przemysław

    2016-01-01

    The calculation benchmark problem Very High Temperature Reactor Critical (VHTRC), a pin-in-block type core critical assembly, has been investigated with the Monte Carlo Burnup (MCB) code in order to validate the latest versions of nuclear data libraries based on the ENDF format. The benchmark has been executed on the basis of the VHTRC benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying the discrepancies in keff values between various libraries and experimental values, which helps to improve the accuracy of neutron transport calculations and may assist in designing high-performance commercial VHTRs. Almost all safety parameters depend on the accuracy of neutron transport calculation results, which in turn depend on the accuracy of nuclear data libraries. Thus, evaluation of the libraries' applicability to VHTR modelling is one of the important subjects. We compared the numerical experiment results with experimental measurements using two versions of available nuclear data (ENDF/B-VII.1 and JEFF-3.2) prepared for the required temperatures. Calculations have been performed with the MCB code, which allows a very precise representation of the complex VHTR geometry, including the double heterogeneity of a fuel element. In this paper, together with the impact of nuclear data, we also discuss the impact of different lattice modelling inside the fuel pins. The calculated keff values show good agreement with each other and with the experimental data within the 1 σ range of the experimental uncertainty. Because some propagated discrepancies were observed, we propose appropriate corrections to the experimental constants which can improve the reactivity coefficient dependency. The obtained results confirm the accuracy of the new nuclear data libraries.

  14. Validity of Principal Diagnoses in Discharge Summaries and ICD-10 Coding Assessments Based on National Health Data of Thailand.

    Science.gov (United States)

    Sukanya, Chongthawonsatid

    2017-10-01

    This study examined the validity of the principal diagnoses on discharge summaries and coding assessments. Data were collected from the National Health Security Office (NHSO) of Thailand in 2015. In total, 118,971 medical records were audited. The sample was drawn from government hospitals and private hospitals covered by the Universal Coverage Scheme in Thailand. Hospitals and cases were selected using NHSO criteria. The validity of the principal diagnoses listed in the "Summary and Coding Assessment" forms was established by comparing data from the discharge summaries with data obtained from medical record reviews, and additionally, by comparing data from the coding assessments with data in the computerized ICD (the database used for reimbursement purposes). The summary assessments had low sensitivities (7.3%-37.9%), high specificities (97.2%-99.8%), low positive predictive values (9.2%-60.7%), and high negative predictive values (95.9%-99.3%). The coding assessments had low sensitivities (31.1%-69.4%), high specificities (99.0%-99.9%), moderate positive predictive values (43.8%-89.0%), and high negative predictive values (97.3%-99.5%). The discharge summaries and codings often contained mistakes, particularly in the categories "Endocrine, nutritional, and metabolic diseases", "Symptoms, signs, and abnormal clinical and laboratory findings not elsewhere classified", "Factors influencing health status and contact with health services", and "Injury, poisoning, and certain other consequences of external causes". The validity of the principal diagnoses on the summary and coding assessment forms was found to be low. The training of physicians and coders must be strengthened to improve the validity of discharge summaries and coding.

  15. Evaluation of measured LWR spent fuel composition data for use in code validation end-user manual

    International Nuclear Information System (INIS)

    Hermann, O.W.; DeHart, M.D.; Murphy, B.D.

    1998-02-01

    Burnup credit (BUC) is a concept applied in the criticality safety analysis of spent nuclear fuel in which credit or partial credit is taken for the reduced reactivity worth of the fuel due to both fissile depletion and the buildup of actinides and fission products that act as net neutron absorbers. Typically, a two-step process is applied in BUC analysis: first, depletion calculations are performed to estimate the isotopic content of spent fuel based on its burnup history; second, three-dimensional (3-D) criticality calculations are performed based on specific spent fuel packaging configurations. In seeking licensing approval of any BUC approach (e.g., disposal, transportation, or storage) both of these two computational procedures must be validated. This report was prepared in support of the validation process for depletion methods applied in the analysis of spent fuel from commercial light-water-reactor (LWR) designs. Such validation requires the comparison of computed isotopic compositions with those measured via radiochemical assay to assess the ability of a computer code to predict the contents of spent fuel samples. The purpose of this report is to address the availability and appropriateness of measured data for use in the validation of isotopic depletion methods. Although validation efforts to date at ORNL have been based on calculations using the SAS2H depletion sequence of the SCALE code system, this report has been prepared as an overview of potential sources of validation data independent of the code system used. However, data that are identified as in use in this report refer to earlier validation work performed using SAS2H in support of BUC. This report is the result of a study of available assay data, using the experience gained in spent fuel isotopic validation and with a consideration of the validation issues described earlier. This report recommends the suitability of each set of data for validation work similar in scope to the earlier work

  16. Remote sensing validation through SOOP technology: implementation of Spectra system

    Science.gov (United States)

    Piermattei, Viviana; Madonia, Alice; Bonamano, Simone; Consalvi, Natalizia; Caligiore, Aurelio; Falcone, Daniela; Puri, Pio; Sarti, Fabio; Spaccavento, Giovanni; Lucarini, Diego; Pacci, Giacomo; Amitrano, Luigi; Iacullo, Salvatore; D'Andrea, Salvatore; Marcelli, Marco

    2017-04-01

    The development of low-cost instrumentation plays a key role in marine environmental studies and represents one of the most innovative aspects of marine research. The availability of low-cost technologies allows the realization of extended observatory networks for the study of marine phenomena through an integrated approach merging observations, remote sensing and operational oceanography. Marine services and practical applications depend critically on the availability of large amounts of data collected with sufficiently dense spatial and temporal sampling. This issue directly influences the robustness both of ocean forecasting models and of remote sensing observations through data assimilation and validation processes, particularly in the biological domain. For this reason it is necessary to develop cheap, small, integrated smart sensors that can serve both satellite data validation and forecasting model data assimilation, as well as support early warning systems for environmental pollution control and prevention. This is particularly true in coastal areas, which are subject to multiple anthropogenic pressures. Moreover, coastal waters can be classified as Case 2 waters, where the optical properties of inorganic suspended matter and chromophoric dissolved organic matter must be considered and separated from the chlorophyll a contribution. Due to the high costs of mooring systems, research vessels, measurement platforms and instrumentation, a big effort was dedicated to the design, development and realization of a new low-cost mini-FerryBox system: Spectra. Thanks to the modularity and user-friendly operation of the system, Spectra allows continuous in situ measurements of temperature, conductivity, turbidity, and chlorophyll a and chromophoric dissolved organic matter (CDOM) fluorescences to be acquired from voluntary vessels, even by non-specialized operators (Marcelli et al., 2014; 2016). This work shows the preliminary application of this technology to

  17. Validation of CONTAIN-LMR code for accident analysis of sodium-cooled fast reactor containments

    Energy Technology Data Exchange (ETDEWEB)

    Gordeev, S.; Hering, W.; Schikorr, M.; Stieglitz, R. [Inst. for Neutron Physic and Reactor Technology, Karlsruhe Inst. of Technology, Campus Nord (Germany)

    2012-07-01

    CONTAIN-LMR is an analytical tool for the containment performance of sodium-cooled fast reactors. In this code, modelling for sodium fires is included: an oxygen diffusion model for the sodium pool fire, and a liquid droplet model for the sodium spray fire. CONTAIN-LMR is also able to model the interaction of liquid sodium with concrete structures and may be applied to different concrete compositions. Testing and validation of these models will help to qualify the simulation results. Three experiments with sodium performed in the FAUNA facility at FZK have been used for the validation of CONTAIN-LMR. For the pool fire tests, calculations have been performed with two models. The first model consists of one gas cell representing the volume of the burn compartment. The volume of the second model is subdivided into 32 coupled gas cells. The agreement between calculations and experimental data is acceptable. The detailed pool fire model shows less deviation from the experiments. In the spray fire, direct heating from the sodium burning in the medium is dominant. Therefore, single-cell modeling is sufficient to describe the phenomena. Calculation results show reasonable agreement with experimental data. Limitations of the implemented spray model can cause an overestimation of the predicted pressure and temperature in the cell atmosphere. The ability of CONTAIN-LMR to simulate a sodium pool fire accompanied by sodium-concrete reactions was tested using the experimental study of sodium-concrete interactions for construction concrete as well as for shielding concrete. The model provides a reasonably good representation of the chemical processes during sodium-concrete interaction. The comparison of time-temperature profiles of sodium and concrete shows that the model requires modifications to predict the test results. (authors)

  18. VULCAN: An Open-source, Validated Chemical Kinetics Python Code for Exoplanetary Atmospheres

    Science.gov (United States)

    Tsai, Shang-Min; Lyons, James R.; Grosheintz, Luc; Rimmer, Paul B.; Kitzmann, Daniel; Heng, Kevin

    2017-02-01

    We present an open-source and validated chemical kinetics code for studying hot exoplanetary atmospheres, which we name VULCAN. It is constructed for gaseous chemistry from 500 to 2500 K, using a reduced C-H-O chemical network with about 300 reactions. It uses eddy diffusion to mimic atmospheric dynamics and excludes photochemistry. We have provided a full description of the rate coefficients and thermodynamic data used. We validate VULCAN by reproducing chemical equilibrium and by comparing its output versus the disequilibrium-chemistry calculations of Moses et al. and Rimmer & Helling. It reproduces the models of HD 189733b and HD 209458b by Moses et al., which employ a network with nearly 1600 reactions. We also use VULCAN to examine the theoretical trends produced when the temperature-pressure profile and carbon-to-oxygen ratio are varied. Assisted by a sensitivity test designed to identify the key reactions responsible for producing a specific molecule, we revisit the quenching approximation and find that it is accurate for methane but breaks down for acetylene, because the disequilibrium abundance of acetylene is not directly determined by transport-induced quenching, but is rather indirectly controlled by the disequilibrium abundance of methane. Therefore we suggest that the quenching approximation should be used with caution and must always be checked against a chemical kinetics calculation. A one-dimensional model atmosphere with 100 layers, computed using VULCAN, typically takes several minutes to complete. VULCAN is part of the Exoclimes Simulation Platform (ESP; exoclime.net) and publicly available at https://github.com/exoclime/VULCAN.

  19. Validity of congenital malformation diagnostic codes recorded in Québec's administrative databases.

    Science.gov (United States)

    Blais, Lucie; Bérard, Anick; Kettani, Fatima-Zohra; Forget, Amélie

    2013-08-01

    To assess the validity of the diagnostic codes of congenital malformations (CMs) recorded in two of Québec's administrative databases. A cohort of pregnancies and infants born to asthmatic and non-asthmatic women in 1990-2002 was reconstructed using Québec's administrative databases. From this cohort, we selected 269 infants with a CM and 144 without a CM born to asthmatic women, together with 284 and 138 infants, respectively, born to non-asthmatic women. The diagnoses of CMs recorded in the databases were compared with the diagnoses written by the physicians in the infants' medical charts. The positive predictive values (PPV) and negative predictive values (NPV) for all, major, and several specific CMs were estimated. The PPVs for all CMs and major CMs were 82.2% (95% confidence interval (CI): 78.5%-85.9%) and 78.1% (74.1%-82.1%), respectively, in the asthmatic group, and 79.2% (75.4%-83.1%) and 69.0% (64.6%-73.4%), respectively, in the non-asthmatic group. PPVs >80% were found for several specific CMs, including cardiac, cleft, and limb CMs in both groups. The NPV for any CM was 88.2% (95% CI: 85.1%-91.3%) in the asthmatic group and 94.2% (92.2%-96.2%) in the non-asthmatic group. Québec's administrative databases are valid tools for epidemiological research on CMs. The results were similar between infants born to women with and without asthma.
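
    The confidence intervals quoted for the PPVs and NPVs are intervals for proportions. A normal-approximation version is sketched below in Python; the counts in the example call are placeholders rather than the study's actual cell counts.

    import math

    def proportion_ci(successes, n, z=1.96):
        """95% normal-approximation confidence interval for a proportion such as
        a PPV (successes = true positives, n = all code-positive infants)."""
        p = successes / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, max(0.0, p - half), min(1.0, p + half)

    # Placeholder example: 82 confirmed CMs among 100 coded infants
    ppv, lower, upper = proportion_ci(82, 100)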

  20. VULCAN: An Open-source, Validated Chemical Kinetics Python Code for Exoplanetary Atmospheres

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, Shang-Min; Grosheintz, Luc; Kitzmann, Daniel; Heng, Kevin [University of Bern, Center for Space and Habitability, Sidlerstrasse 5, CH-3012, Bern (Switzerland); Lyons, James R. [Arizona State University, School of Earth and Space Exploration, Bateman Physical Sciences, Tempe, AZ 85287-1404 (United States); Rimmer, Paul B., E-mail: shang-min.tsai@space.unibe.ch, E-mail: kevin.heng@csh.unibe.ch, E-mail: jimlyons@asu.edu [University of St. Andrews, School of Physics and Astronomy, St. Andrews, KY16 9SS (United Kingdom)

    2017-02-01

    We present an open-source and validated chemical kinetics code for studying hot exoplanetary atmospheres, which we name VULCAN. It is constructed for gaseous chemistry from 500 to 2500 K, using a reduced C–H–O chemical network with about 300 reactions. It uses eddy diffusion to mimic atmospheric dynamics and excludes photochemistry. We have provided a full description of the rate coefficients and thermodynamic data used. We validate VULCAN by reproducing chemical equilibrium and by comparing its output versus the disequilibrium-chemistry calculations of Moses et al. and Rimmer and Helling. It reproduces the models of HD 189733b and HD 209458b by Moses et al., which employ a network with nearly 1600 reactions. We also use VULCAN to examine the theoretical trends produced when the temperature–pressure profile and carbon-to-oxygen ratio are varied. Assisted by a sensitivity test designed to identify the key reactions responsible for producing a specific molecule, we revisit the quenching approximation and find that it is accurate for methane but breaks down for acetylene, because the disequilibrium abundance of acetylene is not directly determined by transport-induced quenching, but is rather indirectly controlled by the disequilibrium abundance of methane. Therefore we suggest that the quenching approximation should be used with caution and must always be checked against a chemical kinetics calculation. A one-dimensional model atmosphere with 100 layers, computed using VULCAN, typically takes several minutes to complete. VULCAN is part of the Exoclimes Simulation Platform (ESP; exoclime.net) and publicly available at https://github.com/exoclime/VULCAN.

  1. Understanding Student Teachers’ Behavioural Intention to Use Technology: Technology Acceptance Model (TAM) Validation and Testing

    Directory of Open Access Journals (Sweden)

    Kung-Teck, Wong

    2013-01-01

    This study sets out to validate and test the Technology Acceptance Model (TAM) in the context of Malaysian student teachers’ integration of technology in teaching and learning. To establish factorial validity, data collected from 302 respondents were tested against the TAM using confirmatory factor analysis (CFA), and structural equation modelling (SEM) was used for model comparison and hypothesis testing. The goodness-of-fit test of the analysis shows partial support for the applicability of the TAM in a Malaysian context. Overall, the TAM accounted for 37.3% of the variance in intention to use technology among student teachers, and of the five hypotheses formulated, four are supported. Perceived usefulness is a significant influence on attitude towards computer use and behavioural intention. Perceived ease of use significantly influences perceived usefulness, and finally, behavioural intention is found to be influenced by attitude towards computer use. The findings of this research contribute to the literature by validating the TAM in the Malaysian context and provide several prominent implications for the research and practice of technology integration development.

  2. Validity of diagnostic codes and laboratory measurements to identify patients with idiopathic acute liver injury in a hospital database

    NARCIS (Netherlands)

    Udo, Renate; Maitland-van der Zee, Anke H.; Egberts, Toine C. G.; den Breeijen, Johanna H.; Leufkens, Hubert G. M.; van Solinge, Wouter W.; De Bruin, Marie L.

    2016-01-01

    Purpose: The development and validation of algorithms to identify cases of idiopathic acute liver injury (ALI) are essential to facilitate epidemiologic studies on drug-induced liver injury. The aim of this study is to determine the ability of diagnostic codes and laboratory measurements to identify

  3. Neonatal Facial Coding System for assessing postoperative pain in infants: Item reduction is valid and feasible.

    NARCIS (Netherlands)

    Peters, J.W.B.; Koot, H.M.; Grunau, R.; de Boer, J.B.; van Druenen, M.J.; Tibboel, D.; Duivenvoorden, H.J.

    2003-01-01

    Objective: The objectives of this study were to: (1) evaluate the validity of the Neonatal Facial Coding System (NFCS) for assessment of postoperative pain and (2) explore whether the number of NFCS facial actions could be reduced for assessing postoperative pain. Design: Prospective, observational

  4. Validity of diagnostic codes and laboratory measurements to identify patients with idiopathic acute liver injury in a hospital database

    NARCIS (Netherlands)

    Udo, Renate; Maitland-van der Zee, Anke H.; Egberts, Toine C. G.; den Breeijen, Johanna H.; Leufkens, Hubert G. M.; van Solinge, Wouter W.; de Bruin, Marie L.

    2016-01-01

    PurposeThe development and validation of algorithms to identify cases of idiopathic acute liver injury (ALI) are essential to facilitate epidemiologic studies on drug-induced liver injury. The aim of this study is to determine the ability of diagnostic codes and laboratory measurements to identify

  5. Relative validity of the pre-coded food diary used in the Danish National Survey of Diet and Physical Activity

    DEFF Research Database (Denmark)

    Knudsen, Vibeke Kildegaard; Gille, Maj-Britt; Nielsen, Trine Holmgaard

    2011-01-01

    Objective: To determine the relative validity of the pre-coded food diary applied in the Danish National Survey of Dietary Habits and Physical Activity. Design: A cross-over study among seventy-two adults (aged 20 to 69 years) recording diet by means of a pre-coded food diary over 4 d and a 4 d weighed food record. Intakes of foods and drinks were estimated, and nutrient intakes were calculated. Means and medians of intake were compared, and cross-classification of individuals according to intake was performed. To assess agreement between the two methods, Pearson and Spearman's correlation coefficients and weighted kappa coefficients were calculated. Setting: Validation study of the pre-coded food diary against a 4 d weighed food record. Subjects: Seventy-two volunteer, healthy free-living adults (thirty-five males, thirty-seven females). Results: Intakes of cereals and vegetables were higher...

  6. Validation of the TRACR3D code for soil water flow under saturated/unsaturated conditions in three experiments

    International Nuclear Information System (INIS)

    Perkins, B.; Travis, B.; DePoorter, G.

    1985-01-01

    Validation of the TRACR3D code in a one-dimensional form was obtained for flow of soil water in three experiments. In the first experiment, a pulse of water entered a crushed-tuff soil and initially moved under conditions of saturated flow, quickly followed by unsaturated flow. In the second experiment, steady-state unsaturated flow took place. In the final experiment, two slugs of water entered crushed tuff under field conditions. In all three experiments, experimentally measured data for volumetric water content agreed, within experimental errors, with the volumetric water content predicted by the code simulations. The experiments and simulations indicated the need for accurate knowledge of boundary and initial conditions, amount and duration of moisture input, and relevant material properties as input into the computer code. During the validation experiments, limitations on monitoring of water movement in waste burial sites were also noted. 5 references, 34 figures, 9 tables

  7. Comparison and validation of the results of the AZNHEX v.1.0 code with the MCNP code simulating the core of a fast reactor cooled with sodium

    International Nuclear Information System (INIS)

    Galicia A, J.; Francois L, J. L.; Bastida O, G. E.; Esquivel E, J.

    2016-09-01

    The development of the AZTLAN platform for the analysis and design of nuclear reactors is led by the Instituto Nacional de Investigaciones Nucleares (ININ) and divided into four working groups, which have well-defined activities to achieve significant progress in this project individually and jointly. Among these working groups is the users group, whose main task is to use the codes that make up the AZTLAN platform to provide feedback to the developers and, in this way, ensure that the final versions of the codes are efficient and, at the same time, reliable and easy to understand. In this paper we present the results provided by the AZNHEX v.1.0 code when simulating the core of a sodium-cooled fast reactor at steady state. The validation of these results is a fundamental part of the platform development and a responsibility of the users group, so in this research the results obtained with AZNHEX are compared and analyzed against those provided by the Monte Carlo code MCNP-5, software used and recognized worldwide. A description of the methodology used with MCNP-5 for the calculation of the variables of interest is also presented, together with the differences obtained with respect to the values calculated with AZNHEX. (Author)

  8. GOTHIC-IST 6.1b code validation exercises relating to heat removal by dousing and air coolers in CANDU containment

    International Nuclear Information System (INIS)

    Ramachandran, S.; Krause, M.; Nguyen, T.

    2003-01-01

    This paper presents validation results relating to the use of the GOTHIC containment analysis code for CANDU safety analysis. The validation results indicate that GOTHIC predicts heat removal by dousing and air cooler heat transfer with reasonable accuracy. (author)

  9. Validation of GPS and accelerometer technology in swimming.

    Science.gov (United States)

    Beanland, Emma; Main, Luana C; Aisbett, Brad; Gastin, Paul; Netto, Kevin

    2014-03-01

    To evaluate the validity of an integrated accelerometer and Global Positioning System (GPS) device to quantify kinematic variables in swimming. Criterion validation study. Twenty-one sub-elite swimmers completed three 100 m efforts (one butterfly, breaststroke and freestyle) in an outdoor 50 m Olympic pool. A GPS device with an integrated tri-axial accelerometer was used to obtain mid-pool velocity and stroke count of each effort. These data were compared to velocity and stroke count data obtained from concurrently recorded digital video of the performance. A strong relationship was detected between the accelerometer stroke count and the video criterion measure for both breaststroke (r>0.98) and butterfly (r>0.99). Also, no significant differences were detected between the GPS velocity and video-obtained velocity for both freestyle and breaststroke. There was a significant difference between the GPS velocity and criterion measure for butterfly. Acceptable standard error and 95% limits of agreement were obtained for freestyle (0.13 m·s⁻¹, 0.36 m·s⁻¹) and breaststroke (0.12 m·s⁻¹, 0.33 m·s⁻¹) compared to butterfly (0.18 m·s⁻¹, 0.50 m·s⁻¹). Relative error measurements ranged between 10.2 and 13.4% across the three strokes. The integrated accelerometer and GPS device offers a valid and accurate tool for stroke count quantification in breaststroke and butterfly as well as measuring mid-pool swimming velocity in freestyle and breaststroke. The application of GPS technology in the outdoor training environment suggests advantageous practical benefits for swimmers, coaches and sports scientists. Copyright © 2013 Sports Medicine Australia. All rights reserved.
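
    The standard error and 95% limits of agreement quoted above follow the usual mean-difference (Bland-Altman style) construction; a minimal sketch with hypothetical GPS and video velocities for 21 swimmers is shown below.

```python
import numpy as np

rng = np.random.default_rng(2)
video = rng.normal(1.55, 0.10, 21)          # criterion mid-pool velocity (m/s), hypothetical
gps   = video + rng.normal(0.00, 0.06, 21)  # device velocity with random measurement error

diff = gps - video
bias = diff.mean()
loa  = 1.96 * diff.std(ddof=1)              # half-width of the 95% limits of agreement
rel_error = 100 * np.abs(diff).mean() / video.mean()

print(f"bias = {bias:+.3f} m/s, limits of agreement = ±{loa:.2f} m/s, "
      f"relative error = {rel_error:.1f}%")
```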

  10. Application of FPGA technology in packed coding of coal mine digital communication

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y.; Zhang, S.; Wu, Z.; Di, J. [China University of Mining and Technology (China)

    1996-12-01

    Given the narrow frequency band available for underground mine mobile communication, this paper presents an application of FPGA technology to realise an audio packing scheme with a low digital code rate; a fully digital DPCM-ADM encoding/decoding device was studied and designed. The scheme uses the probability distribution of the difference-signal values to pack the signals with the DPCM method. Then, exploiting the fact that the difference-signal energy is concentrated near DC, an ADM method is used for secondary packing of the audio signal from 64 kbit to 8 kbit. With advanced EPCM technology, computers can be used for circuit design, adjustment and simulation output. The design offers simple adjustment, small size, low energy consumption, high reliability and other advantages. 4 refs., 3 figs.
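
    The abstract does not give the codec's internal parameters, but its second stage can be illustrated with a minimal 1-bit adaptive delta modulation encoder and decoder; the step size and adaptation factors below are illustrative only and are not taken from the paper.

```python
import numpy as np

def adm_encode(x, step0=0.05, grow=1.5, shrink=0.66):
    """1-bit adaptive delta modulation: the step grows on consecutive equal bits."""
    bits, est, step, prev_bit = [], 0.0, step0, None
    for sample in x:
        bit = 1 if sample >= est else 0
        step = step * grow if bit == prev_bit else max(step * shrink, step0)
        est += step if bit else -step
        bits.append(bit)
        prev_bit = bit
    return np.array(bits, dtype=np.uint8)

def adm_decode(bits, step0=0.05, grow=1.5, shrink=0.66):
    """Mirror of the encoder: rebuild the staircase approximation."""
    out, est, step, prev_bit = [], 0.0, step0, None
    for bit in bits:
        step = step * grow if bit == prev_bit else max(step * shrink, step0)
        est += step if bit else -step
        out.append(est)
        prev_bit = bit
    return np.array(out)

t = np.linspace(0, 0.02, 160)                 # 20 ms of a 440 Hz tone sampled at 8 kHz
x = 0.8 * np.sin(2 * np.pi * 440 * t)
bits = adm_encode(x)
xr = adm_decode(bits)
print("bits per sample: 1, reconstruction RMS error:",
      np.sqrt(np.mean((x - xr) ** 2)).round(3))
```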

  11. MISTRA facility for containment lumped parameter and CFD codes validation. Example of the International Standard Problem ISP47

    International Nuclear Information System (INIS)

    Tkatschenko, I.; Studer, E.; Paillere, H.

    2005-01-01

    During a severe accident in a Pressurized Water Reactor (PWR), the formation of a combustible gas mixture in the complex geometry of the reactor depends on the understanding of hydrogen production, the complex 3D thermal-hydraulics flow due to gas/steam injection, natural convection, heat transfer by condensation on walls and effect of mitigation devices. Numerical simulation of such flows may be performed either by Lumped Parameter (LP) or by Computational Fluid Dynamics (CFD) codes. Advantages and drawbacks of LP and CFD codes are well-known. LP codes are mainly developed for full size containment analysis but they need improvements, especially since they are not able to accurately predict the local gas mixing within the containment. CFD codes require a process of validation on well-instrumented experimental data before they can be used with a high degree of confidence. The MISTRA coupled effect test facility has been built at CEA to fulfil this validation objective: with numerous measurement points in the gaseous volume - temperature, gas concentration, velocity and turbulence - and with well controlled boundary conditions. As illustration of both experimental and simulation areas of this topic, a recent example in the use of MISTRA test data is presented for the case of the International Standard Problem ISP47. The proposed experimental work in the MISTRA facility provides essential data to fill the gaps in the modelling/validation of computational tools. (author)

  12. Experimental benchmark of non-local-thermodynamic-equilibrium plasma atomic physics codes

    Energy Technology Data Exchange (ETDEWEB)

    Nagels-Silvert, V

    2004-09-15

    The main purpose of this thesis is to obtain experimental data for the testing and validation of atomic physics codes dealing with non-local-thermodynamic-equilibrium plasmas. The first part is dedicated to the spectroscopic study of xenon and krypton plasmas produced by a nanosecond laser pulse interacting with a gas jet. A Thomson scattering diagnostic has allowed us to measure independently plasma parameters such as electron temperature, electron density and the average ionisation state. We have obtained time-integrated spectra in the range between 5 and 10 angstroms. We have identified about one hundred xenon lines between 8.6 and 9.6 angstroms via the use of the Relac code. We have discovered previously unreported krypton lines between 5.2 and 7.5 angstroms. In a second experiment we have extended the wavelength range to the XUV domain. The Averroes/Transpec code has been tested in the ranges from 9 to 15 angstroms and from 10 to 130 angstroms; the first range has been well reproduced, while the second range requires a more complex data analysis. The second part is dedicated to the spectroscopic study of aluminium, selenium and samarium plasmas in the femtosecond regime. We have designed an interferometry diagnostic in the frequency domain that has allowed us to measure the expansion velocity of the target's rear surface. Via the use of an adequate isothermal model, this parameter has given us the plasma electron temperature. Spectra and emission times of various lines from the aluminium and selenium plasmas have been computed satisfactorily with the Averroes/Transpec code coupled with the Film and Multif hydrodynamic codes. (A.C.)

  13. Development and validation of GWHEAD, a three-dimensional groundwater head computer code

    International Nuclear Information System (INIS)

    Beckmeyer, R.R.; Root, R.W.; Routt, K.R.

    1980-03-01

    A computer code has been developed to solve the groundwater flow equation in three dimensions. The code has finite-difference approximations solved by the strongly implicit solution procedure. Input parameters to the code include hydraulic conductivity, specific storage, porosity, accretion (recharge), and initial hydralic head. These parameters may be input as varying spatially. The hydraulic conductivity may be input as isotropic or anisotropic. The boundaries either may permit flow across them or may be impermeable. The code has been used to model leaky confined groundwater conditions and spherical flow to a continuous point sink, both of which have exact analytical solutions. The results generated by the computer code compare well with those of the analytical solutions. The code was designed to be used to model groundwater flow beneath fuel reprocessing and waste storage areas at the Savannah River Plant
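
    The abstract describes a finite-difference solution of the groundwater flow equation; the 1D steady-state sketch below (confined aquifer, uniform transmissivity, fixed-head boundaries, uniform recharge) illustrates the discretisation idea with a direct solve rather than the strongly implicit procedure used by GWHEAD, and all parameter values are illustrative.

```python
import numpy as np

# Illustrative parameters (not from the report)
L, n = 1000.0, 101              # domain length (m) and number of nodes
T = 50.0                        # transmissivity (m^2/day)
R = 1e-3                        # recharge (m/day)
h_left, h_right = 10.0, 8.0     # fixed-head boundaries (m)

dx = L / (n - 1)
A = np.zeros((n, n))
b = np.zeros(n)

# Interior nodes: T*(h[i-1] - 2*h[i] + h[i+1])/dx^2 + R = 0
for i in range(1, n - 1):
    A[i, i - 1] = A[i, i + 1] = T / dx**2
    A[i, i] = -2 * T / dx**2
    b[i] = -R

# Dirichlet (fixed-head) boundaries
A[0, 0] = A[-1, -1] = 1.0
b[0], b[-1] = h_left, h_right

h = np.linalg.solve(A, b)
print(f"maximum head {h.max():.2f} m at x = {h.argmax() * dx:.0f} m")
```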

  14. Review and evaluation of technology, equipment, codes and standards for digitization of industrial radiographic film

    International Nuclear Information System (INIS)

    1992-05-01

    This report contains a review and evaluation of the technology, equipment, and codes and standards related to the digitization of industrial radiographic film. The report presents recommendations and equipment-performance specifications that will allow the digitization of radiographic film from nuclear power plant components in order to produce faithful reproductions of flaw images of interest on the films. Justification for the specifications selected is provided. Performance demonstration tests for the digitization process are required, and criteria for such tests are presented. Also, several comments related to implementation of the technology are presented and discussed.

  15. New technologies accelerate the exploration of non-coding RNAs in horticultural plants

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Degao; Mewalal, Ritesh; Hu, Rongbin; Tuskan, Gerald A.; Yang, Xiaohan

    2017-07-05

    Non-coding RNAs (ncRNAs), that is, RNAs not translated into proteins, are crucial regulators of a variety of biological processes in plants. While protein-encoding genes have been relatively well-annotated in sequenced genomes, accounting for a small portion of the genome space in plants, the universe of plant ncRNAs is rapidly expanding. Recent advances in experimental and computational technologies have generated a great momentum for discovery and functional characterization of ncRNAs. Here we summarize the classification and known biological functions of plant ncRNAs, review the application of next-generation sequencing (NGS) technology and ribosome profiling technology to ncRNA discovery in horticultural plants and discuss the application of new technologies, especially the new genome-editing tool clustered regularly interspaced short palindromic repeat (CRISPR)/CRISPR-associated protein 9 (Cas9) systems, to functional characterization of plant ncRNAs.

  16. CODE ACCEPTANCE OF A NEW JOINING TECHNOLOGY FOR STORAGE CONTAINMENTS (REISSUE)

    International Nuclear Information System (INIS)

    Cannel, G.R.; Grant, G.J.; Hill, B.E.

    2009-01-01

    perform the final closure in a single pass (GTAW requires multiple passes) resulting in increased productivity. The performance characteristics of FSW, i.e., high weld quality, simple machine-tool equipment and increased welding efficiency, suggest that this new technology should be considered for radioactive materials packaging campaigns. FSW technology will require some development, adaptation for this application, along with several activities needed for commercialization. One of these activities will be to obtain approval from the governing construction code to use the FSW technology. The American Society of Mechanical Engineers Boiler and Pressure Vessel Code (ASME B and PVC) will govern this work; however, rules for the use of FSW are not currently addressed. A code case will be required, defining appropriate process variables within prescribed limits, and submitted to the Code for review/approval and incorporation

  17. The development of advanced instrumentation and control technology -The development of verification and validation technology for instrumentation and control in NPPs-

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Ham, Chang Sik; Lee, Byung Sun; Kim, Jung Taek; Park, Won Man; Park, Jae Chang; Lee, Jang Soo; Um, Heung Sub; Kim, Jang Yul; Ryoo, Chan Hoh; Joo, Jae Yoon; Song, Soon Ja [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    We collected and analyzed domestic and international codes, standards and guidelines in order to develop a highly reliable software verification and validation methodology suited to our actual situation. Three major parts of the work were performed: construction of a framework for a highly reliable software development environment, establishment of a highly reliable software development methodology, and study of the basic technology related to safety-critical software. These three parts are tightly coupled to each other to achieve self-reliant software verification and validation technology for digital I and C in NPPs. The configuration of hardware and software was partly performed using the requirements developed in the first stage for the development of the I and C test facility. In the hardware part, an expanded interface using the VXI bus and its driving software were completed. The main program for mathematics and modelling and the supervisor program for instructions were developed. 27 figs, 22 tabs, 69 refs. (Author).

  18. RAZORBACK - A Research Reactor Transient Analysis Code Version 1.0 - Volume 3: Verification and Validation Report.

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G.

    2017-04-01

    This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.
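
    The neutronic core of the coupled system described above is the point reactor kinetics equations; a minimal one-delayed-group version with a constant reactivity step, illustrative constants and no thermal-hydraulic feedback (so not the ACRR model itself) can be integrated as follows.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative kinetics constants (not ACRR data)
beta, lam, Lambda = 0.0065, 0.08, 1e-4   # delayed fraction, precursor decay const (1/s), generation time (s)
rho = 0.002                              # constant reactivity step (dk/k)

def prke(t, y):
    n, c = y                             # neutron density and delayed-neutron precursor concentration
    dndt = (rho - beta) / Lambda * n + lam * c
    dcdt = beta / Lambda * n - lam * c
    return [dndt, dcdt]

y0 = [1.0, beta / (lam * Lambda)]        # steady-state initial condition for n = 1
sol = solve_ivp(prke, (0.0, 5.0), y0, method="LSODA", max_step=1e-3)
print(f"relative power after 5 s: {sol.y[0, -1]:.2f}")
```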

  19. VALIDATION OF FULL CORE GEOMETRY MODEL OF THE NODAL3 CODE IN THE PWR TRANSIENT BENCHMARK PROBLEMS

    Directory of Open Access Journals (Sweden)

    Tagor Malem Sembiring

    2015-10-01

    Full Text Available The coupled neutronic and thermal-hydraulic (T/H) code NODAL3 has been validated against several PWR static benchmarks and the NEACRP PWR transient benchmark cases. However, the NODAL3 code had not yet been validated against the transient benchmark cases of a control rod (CR) assembly ejection at the periphery of the core using a full core geometry model, the C1 and C2 cases. This work therefore assesses the accuracy of the NODAL3 code for a single CR ejection, i.e. an unsymmetrical group of CRs ejection, case. The calculations with the NODAL3 code were carried out using the adiabatic method (AM) and the improved quasistatic method (IQS). All transient parameters calculated by the NODAL3 code were compared with the reference results of the PANTHER code. The maximum relative difference, 16%, occurs in the calculated time of maximum power when using the IQS method, while the relative difference of the AM method is 4% for the C2 case. The calculation results show no systematic differences, indicating that the neutronic and T/H modules adopted in the code are correct. Therefore, all results obtained with the NODAL3 code are in very good agreement with the reference results. Keywords: nodal method, coupled neutronic and thermal-hydraulic code, PWR, transient case, control rod ejection.

  20. Three-Dimensional Terahertz Coded-Aperture Imaging Based on Single Input Multiple Output Technology.

    Science.gov (United States)

    Chen, Shuo; Luo, Chenggao; Deng, Bin; Wang, Hongqiang; Cheng, Yongqiang; Zhuang, Zhaowen

    2018-01-19

    As a promising radar imaging technique, terahertz coded-aperture imaging (TCAI) can achieve high-resolution, forward-looking, and staring imaging by producing spatiotemporal independent signals with coded apertures. In this paper, we propose a three-dimensional (3D) TCAI architecture based on single input multiple output (SIMO) technology, which can reduce the coding and sampling times sharply. The coded aperture applied in the proposed TCAI architecture loads either purposive or random phase modulation factor. In the transmitting process, the purposive phase modulation factor drives the terahertz beam to scan the divided 3D imaging cells. In the receiving process, the random phase modulation factor is adopted to modulate the terahertz wave to be spatiotemporally independent for high resolution. Considering human-scale targets, images of each 3D imaging cell are reconstructed one by one to decompose the global computational complexity, and then are synthesized together to obtain the complete high-resolution image. As for each imaging cell, the multi-resolution imaging method helps to reduce the computational burden on a large-scale reference-signal matrix. The experimental results demonstrate that the proposed architecture can achieve high-resolution imaging with much less time for 3D targets and has great potential in applications such as security screening, nondestructive detection, medical diagnosis, etc.

  1. Three-Dimensional Terahertz Coded-Aperture Imaging Based on Single Input Multiple Output Technology

    Directory of Open Access Journals (Sweden)

    Shuo Chen

    2018-01-01

    Full Text Available As a promising radar imaging technique, terahertz coded-aperture imaging (TCAI) can achieve high-resolution, forward-looking, and staring imaging by producing spatiotemporal independent signals with coded apertures. In this paper, we propose a three-dimensional (3D) TCAI architecture based on single input multiple output (SIMO) technology, which can reduce the coding and sampling times sharply. The coded aperture applied in the proposed TCAI architecture loads either purposive or random phase modulation factor. In the transmitting process, the purposive phase modulation factor drives the terahertz beam to scan the divided 3D imaging cells. In the receiving process, the random phase modulation factor is adopted to modulate the terahertz wave to be spatiotemporally independent for high resolution. Considering human-scale targets, images of each 3D imaging cell are reconstructed one by one to decompose the global computational complexity, and then are synthesized together to obtain the complete high-resolution image. As for each imaging cell, the multi-resolution imaging method helps to reduce the computational burden on a large-scale reference-signal matrix. The experimental results demonstrate that the proposed architecture can achieve high-resolution imaging with much less time for 3D targets and has great potential in applications such as security screening, nondestructive detection, medical diagnosis, etc.

  2. On the implementation of new technology modules for fusion reactor systems codes

    Energy Technology Data Exchange (ETDEWEB)

    Franza, F., E-mail: fabrizio.franza@kit.edu [Institute of Neutron Physics and Reactor Technology, Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen, 76344 (Germany); Boccaccini, L.V.; Fisher, U. [Institute of Neutron Physics and Reactor Technology, Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen, 76344 (Germany); Gade, P.V.; Heller, R. [Institute for Technical Physics, Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen, 76344 (Germany)

    2015-10-15

    Highlights: • At KIT, new technology modules for systems codes are under development. • A new algorithm for the definition of the main reactor components is defined. • A new blanket model based on 1D neutronics analysis is described. • A new TF coil stress model based on 3D electromagnetic analysis is described. • The models were successfully benchmarked against more detailed models. - Abstract: In the frame of the pre-conceptual design of the next-generation fusion power plant (DEMO), systems codes have been used for nearly 20 years. In such computational tools the main reactor components (e.g. plasma, blanket, magnets, etc.) are integrated into a single computational algorithm and simulated by means of rather simplified mathematical models (e.g. steady-state and zero-dimensional models). The systems code tries to identify the main design parameters (e.g. major radius, net electrical power, toroidal field) and to ensure that the reactor's requirements and constraints are simultaneously satisfied. In fusion applications, requirements and constraints can be either of a physics or a technology kind. Concerning the latter category, a new modelling activity has recently been launched at Karlsruhe Institute of Technology, aiming to develop improved models for the main technology areas, such as neutronics, thermal-hydraulics, electromagnetics, structural mechanics, the fuel cycle and vacuum systems. These activities started with the development of: (1) a geometry model for the definition of poloidal profiles of the main reactor components, (2) a blanket model based on neutronics analyses and (3) a toroidal field coil model based on electromagnetic analysis, focusing first on stress calculations. The objective of this paper is therefore to give a short outline of these models.

  3. On the implementation of new technology modules for fusion reactor systems codes

    International Nuclear Information System (INIS)

    Franza, F.; Boccaccini, L.V.; Fisher, U.; Gade, P.V.; Heller, R.

    2015-01-01

    Highlights: • At KIT, new technology modules for systems codes are under development. • A new algorithm for the definition of the main reactor components is defined. • A new blanket model based on 1D neutronics analysis is described. • A new TF coil stress model based on 3D electromagnetic analysis is described. • The models were successfully benchmarked against more detailed models. - Abstract: In the frame of the pre-conceptual design of the next-generation fusion power plant (DEMO), systems codes have been used for nearly 20 years. In such computational tools the main reactor components (e.g. plasma, blanket, magnets, etc.) are integrated into a single computational algorithm and simulated by means of rather simplified mathematical models (e.g. steady-state and zero-dimensional models). The systems code tries to identify the main design parameters (e.g. major radius, net electrical power, toroidal field) and to ensure that the reactor's requirements and constraints are simultaneously satisfied. In fusion applications, requirements and constraints can be either of a physics or a technology kind. Concerning the latter category, a new modelling activity has recently been launched at Karlsruhe Institute of Technology, aiming to develop improved models for the main technology areas, such as neutronics, thermal-hydraulics, electromagnetics, structural mechanics, the fuel cycle and vacuum systems. These activities started with the development of: (1) a geometry model for the definition of poloidal profiles of the main reactor components, (2) a blanket model based on neutronics analyses and (3) a toroidal field coil model based on electromagnetic analysis, focusing first on stress calculations. The objective of this paper is therefore to give a short outline of these models.

  4. Validation of thermohydraulic codes by comparison of experimental results with computer simulations

    International Nuclear Information System (INIS)

    Madeira, A.A.; Galetti, M.R.S.; Pontedeiro, A.C.

    1989-01-01

    The results obtained by simulating three cases from the CANON depressurization experiment, using the TRAC-PF1 computer code, version 7.6, implemented on the VAX-11/750 computer of the Brazilian CNEN, are presented. The CANON experiment was chosen as the first thermal-hydraulic standard problem to be discussed at ENFIR for comparing results from different computer codes with results obtained experimentally. The ability of the TRAC-PF1 code to predict the depressurization phase of a loss-of-primary-coolant accident in pressurized water reactors is evaluated. (M.C.K.) [pt

  5. Validation of the BISON 3D Fuel Performance Code: Temperature Comparisons for Concentrically and Eccentrically Located Fuel Pellets

    Energy Technology Data Exchange (ETDEWEB)

    J. D. Hales; D. M. Perez; R. L. Williamson; S. R. Novascone; B. W. Spencer

    2013-03-01

    BISON is a modern finite-element based nuclear fuel performance code that has been under development at the Idaho National Laboratory (USA) since 2009. The code is applicable to both steady and transient fuel behaviour and is used to analyse either 2D axisymmetric or 3D geometries. BISON has been applied to a variety of fuel forms including LWR fuel rods, TRISO-coated fuel particles, and metallic fuel in both rod and plate geometries. Code validation is currently in progress, principally by comparison to instrumented LWR fuel rods. Halden IFA experiments constitute a large percentage of the current BISON validation base. The validation emphasis here is centreline temperatures at the beginning of fuel life, with comparisons made to seven rods from the IFA-431 and 432 assemblies. The principal focus is IFA-431 Rod 4, which included concentric and eccentrically located fuel pellets. This experiment provides an opportunity to explore 3D thermomechanical behaviour and assess the 3D simulation capabilities of BISON. Analysis results agree with experimental results showing lower fuel centreline temperatures for eccentric fuel with the peak temperature shifted from the centreline. The comparison confirms with modern 3D analysis tools that the measured temperature difference between concentric and eccentric pellets is not an artefact and provides a quantitative explanation for the difference.
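
    For the concentric case, the classical steady-state conduction result T_centre = T_surface + q'/(4πk) gives a quick order-of-magnitude check on the centreline temperatures compared here; the values below are illustrative and are not Halden IFA-431/432 data. The eccentric case breaks the axial symmetry and requires a 2D/3D thermomechanical solve of the kind BISON performs.

```python
import math

# Illustrative values only (not IFA-431/432 data)
q_lin = 25e3        # linear heat rate, W/m
k_fuel = 3.0        # UO2 thermal conductivity, W/(m*K), roughly temperature-averaged
t_surface = 700.0   # pellet surface temperature, K

# Steady-state conduction in a solid cylinder with uniform heat generation
t_centre = t_surface + q_lin / (4.0 * math.pi * k_fuel)
print(f"concentric-pellet centreline temperature ~ {t_centre:.0f} K")
```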

  6. Determination of multi-GNSS pseudo-absolute code biases and verification of receiver tracking technology

    Science.gov (United States)

    Villiger, Arturo; Schaer, Stefan; Dach, Rolf; Prange, Lars; Jäggi, Adrian

    2017-04-01

    It is common to handle code biases in the Global Navigation Satellite System (GNSS) data analysis as conventional differential code biases (DCBs): P1-C1, P1-P2, and P2-C2. Due to the increasing number of signals and systems in conjunction with various tracking modes for the different signals (as defined in RINEX3 format), the number of DCBs would increase drastically and the bookkeeping becomes almost unbearable. The Center for Orbit Determination in Europe (CODE) has thus changed its processing scheme to observable-specific signal biases (OSB). This means that for each observation involved all related satellite and receiver biases are considered. The OSB contributions from various ionosphere analyses (geometry-free linear combination) using different observables and frequencies and from clock analyses (ionosphere-free linear combination) are then combined on normal equation level. By this, one consistent set of OSB values per satellite and receiver can be obtained that contains all information needed for GNSS-related processing. This advanced procedure of code bias handling is now also applied to the IGS (International GNSS Service) MGEX (Multi-GNSS Experiment) procedure at CODE. Results for the biases from the legacy IGS solution as well as the CODE MGEX processing (considering GPS, GLONASS, Galileo, BeiDou, and QZSS) are presented. The consistency with the traditional method is confirmed and the new results are discussed regarding the long-term stability. When processing code data, it is essential to know the true observable types in order to correct for the associated biases. CODE has been verifying the receiver tracking technologies for GPS based on estimated DCB multipliers (for the RINEX 2 case). With the change to OSB, the original verification approach was extended to search for the best fitting observable types based on known OSB values. In essence, a multiplier parameter is estimated for each involved GNSS observable type. This implies that we could
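
    The relation between observable-specific biases and the legacy differential code biases amounts to a simple difference; the sketch below reconstructs a P1-P2-type DCB from two per-signal OSB values, where the signal labels and numbers are placeholders rather than CODE products.

```python
# Hypothetical observable-specific biases (ns) for one satellite; placeholder values only
osb = {
    "GPS:G01:C1W": 4.21,   # P-code tracking on L1
    "GPS:G01:C2W": -2.37,  # P-code tracking on L2
}

# A legacy differential code bias is the difference of the two underlying OSBs
dcb_p1_p2 = osb["GPS:G01:C1W"] - osb["GPS:G01:C2W"]
print(f"reconstructed P1-P2 DCB: {dcb_p1_p2:.2f} ns")
```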

  7. Medication dispensing errors and potential adverse drug events before and after implementing bar code technology in the pharmacy.

    Science.gov (United States)

    Poon, Eric G; Cina, Jennifer L; Churchill, William; Patel, Nirali; Featherstone, Erica; Rothschild, Jeffrey M; Keohane, Carol A; Whittemore, Anthony D; Bates, David W; Gandhi, Tejal K

    2006-09-19

    Many dispensing errors made in hospital pharmacies can harm patients. Some hospitals are investing in bar code technology to reduce these errors, but data about its efficacy are limited. To evaluate whether implementation of bar code technology reduced dispensing errors and potential adverse drug events (ADEs). Before-and-after study using direct observations. Hospital pharmacy at a 735-bed tertiary care academic medical center. A bar code-assisted dispensing system was implemented in 3 configurations. In 2 configurations, all doses were scanned once during the dispensing process. In the third configuration, only 1 dose was scanned if several doses of the same medication were being dispensed. Target dispensing errors, defined as dispensing errors that bar code technology was designed to address, and target potential ADEs, defined as target dispensing errors that can harm patients. In the pre- and post-bar code implementation periods, the authors observed 115,164 and 253,984 dispensed medication doses, respectively. Overall, the rates of target potential ADEs and all potential ADEs decreased by 74% and 63%, respectively. Of the 3 configurations of bar code technology studied, the 2 configurations that required staff to scan all doses had a 93% to 96% relative reduction in the incidence of target dispensing errors. Dispensing errors and potential ADEs substantially decreased after implementing bar code technology. However, the technology should be configured to scan every dose during the dispensing process.

  8. Using clinician text notes in electronic medical record data to validate transgender-related diagnosis codes.

    Science.gov (United States)

    Blosnich, John R; Cashy, John; Gordon, Adam J; Shipherd, Jillian C; Kauth, Michael R; Brown, George R; Fine, Michael J

    2018-04-04

    Transgender individuals are vulnerable to negative health risks and outcomes, but research remains limited because data sources, such as electronic medical records (EMRs), lack standardized collection of gender identity information. Most EMRs do not include the gold standard of self-identified gender identity, but the International Classification of Diseases (ICD) includes diagnostic codes indicating transgender-related clinical services. However, it is unclear if these codes can indicate transgender status. The objective of this study was to determine the extent to which patients' clinician notes in the EMR contained transgender-related terms that could corroborate ICD-coded transgender identity. Data are from the US Department of Veterans Affairs Corporate Data Warehouse. Transgender patients were defined by the presence of ICD9 and ICD10 codes associated with transgender-related clinical services, and a 3:1 comparison group of nontransgender patients was drawn. Patients' clinician text notes were extracted and searched for transgender-related words and phrases. Among 7560 patients defined as transgender based on ICD codes, the search algorithm identified 6753 (89.3%) with transgender-related terms. Among 22,072 patients defined as nontransgender without ICD codes, 246 (1.1%) had transgender-related terms; after review, 11 patients were identified as transgender, suggesting a 0.05% false negative rate. Using ICD-defined transgender status can facilitate health services research when self-identified gender identity data are not available in EMRs.
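
    The note-search step can be pictured as a keyword scan over clinician text followed by a comparison against the ICD-based label; the term list, notes and patients below are entirely hypothetical toy data, not the study's algorithm.

```python
import re

# Hypothetical, abbreviated term list; the study's full list is not reproduced here
terms = re.compile(
    r"\b(transgender|gender dysphoria|gender identity disorder|male-to-female|female-to-male)\b",
    re.IGNORECASE,
)

notes = {  # patient_id -> concatenated clinician notes (toy examples)
    1: "Patient identifies as transgender; discussed hormone therapy.",
    2: "Follow-up for hypertension. No other issues.",
    3: "History of gender dysphoria documented in prior encounter.",
}
icd_flag = {1: True, 2: False, 3: True}   # ICD-defined transgender status

text_flag = {pid: bool(terms.search(txt)) for pid, txt in notes.items()}
tp = sum(text_flag[p] and icd_flag[p] for p in notes)
fn = sum((not text_flag[p]) and icd_flag[p] for p in notes)
print(f"sensitivity of note search vs ICD definition: {tp / (tp + fn):.2f}")
```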

  9. Deterministic sensitivity and uncertainty methodology for best estimate system codes applied in nuclear technology

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Cacuci, D.G.

    2009-01-01

    Nuclear Power Plant (NPP) technology has been developed based on the traditional defense in depth philosophy supported by deterministic and overly conservative methods for safety analysis. In the 1970s [1], conservative hypotheses were introduced for safety analyses to address existing uncertainties. Since then, intensive thermal-hydraulic experimental research has resulted in a considerable increase in knowledge and consequently in the development of best-estimate codes able to provide more realistic information about the physical behaviour and to identify the most relevant safety issues allowing the evaluation of the existing actual margins between the results of the calculations and the acceptance criteria. However, the best-estimate calculation results from complex thermal-hydraulic system codes (like Relap5, Cathare, Athlet, Trace, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore, the use of best-estimate codes (BE) within the reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and the deficiencies of those codes. Taking into consideration the above framework, a comprehensive approach for utilizing quantified uncertainties arising from Integral Test Facilities (ITFs, [2]) and Separate Effect Test Facilities (SETFs, [3]) in the process of calibrating complex computer models for the application to NPP transient scenarios has been developed. The methodology proposed is capable of accommodating multiple SETFs and ITFs to learn as much as possible about uncertain parameters, allowing for the improvement of the computer model predictions based on the available experimental evidence. The proposed methodology constitutes a major step forward with respect to the generally used expert judgment and statistical methods as it permits a) to establish the uncertainties of any parameter

  10. Development and Validation of Information Technology Mentor Teacher Attitude Scale: A Pilot Study

    Science.gov (United States)

    Saltan, Fatih

    2015-01-01

    The aim of this study is the development and validation of a teacher attitude scale toward Information Technology Mentor Teachers (ITMTs). ITMTs give technological support to other teachers for the integration of technology in their lessons. In the literature, many instruments have been developed to measure teachers' attitudes towards the technological tools…

  11. The Validity of the Sales Purchase Contract with Redemption Pact in Light of the Provisions New Civil Code

    Directory of Open Access Journals (Sweden)

    Emilia Mateescu

    2010-06-01

    Full Text Available The sales contract with buyback agreement has existed in the Romanian legislation since it was instituted by means of the original form of the Civil Code in force. The legislative evolution of the last century has abrogated the provisions referring to this field and has eventually led to a legislative void in this matter. This situation has entailed the validation of all legal deeds in the form of a sales contract with buyback agreement, a form which used to be prohibited in the past in certain situations. Noticing such a situation and understanding the need to reinstate the legal framework for the regulation of social relationships with respect to sales contracts with buyback agreements, the Romanian lawgiver has dedicated a subsection to it in the new Civil Code. The future civil regulation retains a major part of the contents and meanings of the provisions of articles 1371-1387 of the Civil Code, currently abrogated. The novel element lies in the institution of an express prohibition of sales with a buyback option where the difference between the price received and the price paid exceeds the level of interest set by the specific legislation. In addition, sales where the seller has the obligation to buy back the good sold without setting the price of the good at the time of undertaking such obligation are also prohibited. Following the entry into force of the new Civil Code, the sale with buyback option shall fit the category of legal deeds affected by a resolutive condition, which shall also affect possible rights transmitted by concluding the contract. Such agreements shall be fully valid as long as the general validity conditions of legal deeds are complied with and the legal norms enacted in the matter of interest are not infringed.

  12. ANITA-IEAF activation code package - updating of the decay and cross section data libraries and validation on the experimental data from the Karlsruhe Isochronous Cyclotron

    Science.gov (United States)

    Frisoni, Manuela

    2017-09-01

    ANITA-IEAF is an activation package (code and libraries) developed in the past in ENEA-Bologna in order to assess the activation of materials exposed to neutrons with energies greater than 20 MeV. An updated version of the ANITA-IEAF activation code package has been developed. It is suitable to be applied to the study of the irradiation effects on materials in facilities like the International Fusion Materials Irradiation Facility (IFMIF) and the DEMO Oriented Neutron Source (DONES), in which a considerable amount of neutrons with energies above 20 MeV is produced. The present paper summarizes the main characteristics of the updated version of ANITA-IEAF, able to use decay and cross section data based on more recent evaluated nuclear data libraries, i.e. the JEFF-3.1.1 Radioactive Decay Data Library and the EAF-2010 neutron activation cross section library. In this paper the validation effort related to the comparison between the code predictions and the activity measurements obtained from the Karlsruhe Isochronous Cyclotron is presented. In this integral experiment samples of two different steels, SS-316 and F82H, pure vanadium and a vanadium alloy, structural materials of interest in fusion technology, were activated in a neutron spectrum similar to the IFMIF neutron field.
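
    In its simplest single-reaction form, the quantity such activation calculations predict is the saturation-and-decay activity A = N·σ·φ·(1 − e^(−λ·t_irr))·e^(−λ·t_cool); the sketch below evaluates it for one illustrative channel with placeholder values, whereas ANITA-IEAF of course follows full reaction and decay chains over the whole neutron spectrum.

```python
import math

# Illustrative single-channel activation estimate (placeholder values, not IFMIF/DONES data)
N_target = 1.0e22               # number of parent atoms in the sample
sigma = 50e-27                  # spectrum-averaged cross section, cm^2 (50 mb)
phi = 1.0e14                    # neutron flux, n/cm^2/s
half_life = 312.0 * 24 * 3600   # product half-life, s
t_irr, t_cool = 7 * 24 * 3600, 24 * 3600   # irradiation and cooling times, s

lam = math.log(2.0) / half_life
activity = N_target * sigma * phi * (1 - math.exp(-lam * t_irr)) * math.exp(-lam * t_cool)
print(f"activity after cooling: {activity:.3e} Bq")
```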

  13. Notes for a Genealogy of Dress Codes and Aestheticizing Technologies in the Colombian School

    Directory of Open Access Journals (Sweden)

    Alexánder Aldana Bautista

    2016-08-01

    Full Text Available This article shows an analysis of the schoolchild's construction from a series of aestheticizing technologies that constitute a child's body in which the aesthetic utopia of the modern school is inscribed. The paper, derived from archaeological-genealogical research on school uniforms and dress codes in the Colombian school during the late twentieth century and the early twenty-first century, revolves around the following questions: What enabled the emergence of some discourses about school bodies, appropriate appearance and attire in the Colombian school? How did the school subject become a properly uniformed, seemly, neat, respectful and beautiful person?

  14. Offshore Code Comparison Collaboration (OC3) for IEA Wind Task 23 Offshore Wind Technology and Deployment

    Energy Technology Data Exchange (ETDEWEB)

    Jonkman, J.; Musial, W.

    2010-12-01

    This final report for IEA Wind Task 23, Offshore Wind Energy Technology and Deployment, is made up of two separate reports, Subtask 1: Experience with Critical Deployment Issues and Subtask 2: Offshore Code Comparison Collaborative (OC3). Subtask 1 discusses ecological issues and regulation, electrical system integration, external conditions, and key conclusions for Subtask 1. Subtask 2 included here, is the larger of the two volumes and contains five chapters that cover background information and objectives of Subtask 2 and results from each of the four phases of the project.

  15. Validation Calculations for the Application of MARS Code to the Safety Analysis of Research Reactors

    International Nuclear Information System (INIS)

    Park, Cheol; Kim, H.; Chae, H. T.; Lim, I. C.

    2006-10-01

    In order to investigate the applicability of the MARS code to the accident analysis of the HANARO and other research reactors (RRs), the following test data were simulated: test data from the HANARO design and operation; flow instability and void fraction test data from published documents; IAEA RR transient data in TECDOC-643; and Brazilian IEA-R1 experimental data. For the simulation of the HANARO data, with finned rod-type fuels at low-pressure and low-temperature conditions, the MARS code, developed for the transient analysis of power reactors, was modified. Its prediction capability was assessed against the experimental data for the HANARO. From the assessment results, it can be said that the modified MARS code could be used for analyzing the thermal-hydraulic transients of the HANARO. Some other simulations, such as flow instability tests and reactor transients, were also performed for the application of the MARS code to RRs with plate-type fuels. In these cases, no modification was made. The results of the simulated cases show that the MARS code can be applied to the transient analysis of RRs with careful consideration. In particular, it seems that an improvement of the void model may be necessary for dealing with phenomena in high-void conditions.

  16. Use of an Accurate DNS Particulate Flow Method to Supply and Validate Boundary Conditions for the MFIX Code

    Energy Technology Data Exchange (ETDEWEB)

    Zhi-Gang Feng

    2012-05-31

    The simulation of particulate flows for industrial applications often requires the use of two-fluid models, where the solid particles are considered as a separate continuous phase. One of the underlying uncertainties in the use of two-fluid models in multiphase computations comes from the boundary condition of the solid phase. Typically, the gas or liquid fluid boundary condition at a solid wall is the so-called no-slip condition, which has been widely accepted to be valid for single-phase fluid dynamics provided that the Knudsen number is low. However, the boundary condition for the solid phase is not well understood. The no-slip condition at a solid boundary is not a valid assumption for the solid phase. Instead, several researchers advocate a slip condition as a more appropriate boundary condition. However, the question of the selection of an exact slip length or a slip velocity coefficient is still unanswered. Experimental or numerical simulation data are needed in order to determine the slip boundary condition that is applicable to a two-fluid model. The goal of this project is to improve the performance and accuracy of the boundary conditions used in two-fluid models such as the MFIX code, which is frequently used in multiphase flow simulations. The specific objectives of the project are to use first principles embedded in a validated Direct Numerical Simulation particulate flow numerical program, which uses the Immersed Boundary method (DNS-IB) and the Direct Forcing scheme, in order to establish, modify and validate needed energy and momentum boundary conditions for the MFIX code. To achieve these objectives, we have developed a highly efficient DNS code and conducted numerical simulations to investigate the particle-wall and particle-particle interactions in particulate flows. Most of our research findings have been reported in major conferences and archived journals, which are listed in Section 7 of this report. In this report, we will present a
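
    The difference between no-slip and Navier-slip wall conditions discussed above can be illustrated on a 1D pressure-driven channel flow, where the analytic profile with slip length Ls on both walls is u(y) = G/(2μ)·[y(H − y) + Ls·H]; all values in the sketch are illustrative and unrelated to the MFIX or DNS-IB setups.

```python
import numpy as np

# Illustrative 1D pressure-driven channel with Navier slip at both walls
H = 0.01          # channel height, m
mu = 1.0e-3       # dynamic viscosity, Pa*s
G = 10.0          # pressure gradient magnitude -dp/dx, Pa/m
Ls = 1.0e-3       # slip length, m (Ls = 0 recovers the no-slip profile)

y = np.linspace(0.0, H, 101)
u_noslip = G / (2 * mu) * y * (H - y)
u_slip   = G / (2 * mu) * (y * (H - y) + Ls * H)

print(f"wall velocity with slip: {u_slip[0]:.4f} m/s (no-slip gives {u_noslip[0]:.1f})")
print(f"velocity increase due to slip: {(u_slip - u_noslip).max():.4f} m/s")
```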

  17. Analysis of LWR-MOX fuel steady behavior and code validation

    International Nuclear Information System (INIS)

    He Xiaojun; Chen Peng; Huang Yucai

    2010-01-01

    In this study, the fuel performance code METEOR v1.9 developed by CEA (France) was used to analyze the behavior of light water reactor MOX fuel from the OECD-Halden project IFA597.4-7 test under steady-state irradiation. The code predictions were compared with the experimental data. The research showed that: 1) light water reactor MOX fuel has stable performance under steady-state irradiation conditions, and its performance shows no significant difference from UO2 fuel, except that its fission gas release (FGR) is slightly higher; 2) the METEOR v1.9 code can predict the behavior of MOX fuel very well. The predictions of temperature distribution, rod inner pressure and fission gas release fraction were in good agreement with the experimental data. (authors)

  18. Validation of a new library of nuclear constants of the WIMS code

    International Nuclear Information System (INIS)

    Aguilar H, F.

    1991-10-01

    The objective of the present work is to reproduce the experimental results of the thermal reference problems (benchmarks) TRX-1, TRX-2 and BAPL-1 to BAPL-3 with the WIMS code. The work proceeded in two stages: the first consisted of using the original library of the code, while in the second a library containing only the elements present in the benchmarks (H-1, O-16, Al-27, U-235 and U-238) was generated. To generate the nuclear data present in the WIMS library, the ENDF/B-IV database and the nuclear data processing system NJOY were used; the library was assembled using the FIXER code. (Author)

  19. Bar Code Medication Administration Technology: Characterization of High-Alert Medication Triggers and Clinician Workarounds.

    Science.gov (United States)

    Miller, Daniel F; Fortier, Christopher R; Garrison, Kelli L

    2011-02-01

    Bar code medication administration (BCMA) technology is gaining acceptance for its ability to prevent medication administration errors. However, studies suggest that improper use of BCMA technology can yield unsatisfactory error prevention and introduction of new potential medication errors. To evaluate the incidence of high-alert medication BCMA triggers and alert types and discuss the type of nursing and pharmacy workarounds occurring with the use of BCMA technology and the electronic medication administration record (eMAR). Medication scanning and override reports from January 1, 2008, through November 30, 2008, for all adult medical/surgical units were retrospectively evaluated for high-alert medication system triggers, alert types, and override reason documentation. An observational study of nursing workarounds on an adult medicine step-down unit was performed and an analysis of potential pharmacy workarounds affecting BCMA and the eMAR was also conducted. Seventeen percent of scanned medications triggered an error alert of which 55% were for high-alert medications. Insulin aspart, NPH insulin, hydromorphone, potassium chloride, and morphine were the top 5 high-alert medications that generated alert messages. Clinician override reasons for alerts were documented in only 23% of administrations. Observational studies assessing for nursing workarounds revealed a median of 3 clinician workarounds per administration. Specific nursing workarounds included a failure to scan medications/patient armband and scanning the bar code once the dosage has been removed from the unit-dose packaging. Analysis of pharmacy order entry process workarounds revealed the potential for missed doses, duplicate doses, and doses being scheduled at the wrong time. BCMA has the potential to prevent high-alert medication errors by alerting clinicians through alert messages. Nursing and pharmacy workarounds can limit the recognition of optimal safety outcomes and therefore workflow processes

  20. Development and validation of the fast doppler broadening module coupled within RMC code

    International Nuclear Information System (INIS)

    Yu Jiankai; Liang Jin'gang; Yu Ganglin; Wang Kan

    2015-01-01

    On-the-fly Doppler broadening of temperature-dependent nuclear cross sections is an efficient approach to reducing memory consumption in Monte Carlo based reactor physics simulations. RXSP is a nuclear cross section processing code being developed by the REAL team in the Department of Engineering Physics at Tsinghua University, which performs very well in Doppler broadening temperature-dependent continuous-energy neutron cross sections. To meet the dual requirements of accuracy and efficiency in Monte Carlo simulations with many materials and many temperatures, this work enables on-the-fly pre-Doppler broadening of cross sections during neutron transport by coupling the fast Doppler broadening module of RXSP into the RMC code, also being developed by the REAL team at Tsinghua University. Additionally, the original OpenMP-based parallelism has been converted to an MPI-based framework fully compatible with neutron transport in the RMC code, achieving a large improvement in parallel efficiency. This work also provides a flexible approach for Monte Carlo based full-core depletion calculations with temperature feedback in many isotopes. (author)

  1. Validation of an Administrative Definition of ICU Admission Using Revenue Center Codes.

    Science.gov (United States)

    Weissman, Gary E; Hubbard, Rebecca A; Kohn, Rachel; Anesi, George L; Manaker, Scott; Kerlin, Meeta Prasad; Halpern, Scott D

    2017-08-01

    Describe the operating characteristics of a proposed set of revenue center codes to correctly identify ICU stays among hospitalized patients. Retrospective cohort study. We report the operating characteristics of all ICU-related revenue center codes for intensive and coronary care, excluding nursery, intermediate, and incremental care, to identify ICU stays. We use a classification and regression tree model to further refine identification of ICU stays using administrative data. The gold standard for classifying ICU admission was an electronic patient location tracking system. The University of Pennsylvania Health System in Philadelphia, PA, United States. All adult inpatient hospital admissions between July 1, 2013, and June 30, 2015. None. Among 127,680 hospital admissions, the proposed combination of revenue center codes had 94.6% sensitivity (95% CI, 94.3-94.9%) and 96.1% specificity (95% CI, 96.0-96.3%) for correctly identifying hospital admissions with an ICU stay. The classification and regression tree algorithm had 92.3% sensitivity (95% CI, 91.6-93.1%) and 97.4% specificity (95% CI, 97.2-97.6%), with an overall improved accuracy (χ² = 398). The proposed combination of revenue center codes has excellent sensitivity and specificity for identifying true ICU admission. A classification and regression tree algorithm with additional administrative variables offers further improvements to accuracy.
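
    A classification and regression tree of the kind described can be sketched with scikit-learn; the features, labels and admissions below are synthetic placeholders, and the point is only the fit-then-evaluate pattern of computing sensitivity and specificity against a gold-standard ICU label.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)
n = 5000
# Synthetic admissions: [has ICU revenue code, number of ICU-coded days, ventilation flag]
X = np.column_stack([
    rng.integers(0, 2, n),
    rng.poisson(1.0, n),
    rng.integers(0, 2, n),
])
# Synthetic gold-standard ICU-stay label, loosely tied to the features
y = ((X[:, 0] == 1) & (rng.random(n) < 0.95)) | ((X[:, 2] == 1) & (rng.random(n) < 0.3))
y = y.astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
pred = tree.predict(X)

tp = np.sum((pred == 1) & (y == 1)); fn = np.sum((pred == 0) & (y == 1))
tn = np.sum((pred == 0) & (y == 0)); fp = np.sum((pred == 1) & (y == 0))
print(f"sensitivity = {tp / (tp + fn):.3f}, specificity = {tn / (tn + fp):.3f}")
```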

  2. Validation of Mean Drift Forces Computed with the BEM Code NEMOH

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg

    This report covers a simple investigation of mean drift forces found by use of the boundary element method code NEMOH. The results from NEMOH are compared to analytical results from literature and to numerical values found from the commercial software package WADAM by DNV-GL. The work was conducted...

  3. Experimental validation of the DPM Monte Carlo code using minimally scattered electron beams in heterogeneous media

    International Nuclear Information System (INIS)

    Chetty, Indrin J.; Moran, Jean M.; Nurushev, Teamor S.; McShan, Daniel L.; Fraass, Benedick A.; Wilderman, Scott J.; Bielajew, Alex F.

    2002-01-01

    A comprehensive set of measurements and calculations has been conducted to investigate the accuracy of the Dose Planning Method (DPM) Monte Carlo code for electron beam dose calculations in heterogeneous media. Measurements were made using 10 MeV and 50 MeV minimally scattered, uncollimated electron beams from a racetrack microtron. Source distributions for the Monte Carlo calculations were reconstructed from in-air ion chamber scans and then benchmarked against measurements in a homogeneous water phantom. The in-air spatial distributions were found to have FWHM of 4.7 cm and 1.3 cm, at 100 cm from the source, for the 10 MeV and 50 MeV beams respectively. Energy spectra for the electron beams were determined by simulating the components of the microtron treatment head using the code MCNP4B. Profile measurements were made using an ion chamber in a water phantom with slabs of lung or bone-equivalent materials submerged at various depths. DPM calculations are, on average, within 2% agreement with measurement for all geometries except for the 50 MeV incident on a 6 cm lung-equivalent slab. Measurements using approximately monoenergetic, 50 MeV, 'pencil-beam'-type electrons in heterogeneous media provide conditions for maximum electronic disequilibrium and hence present a stringent test of the code's electron transport physics; the agreement noted between calculation and measurement illustrates that the DPM code is capable of accurate dose calculation even under such conditions. (author)

  4. A Vision of Quantitative Imaging Technology for Validation of Advanced Flight Technologies

    Science.gov (United States)

    Horvath, Thomas J.; Kerns, Robert V.; Jones, Kenneth M.; Grinstead, Jay H.; Schwartz, Richard J.; Gibson, David M.; Taylor, Jeff C.; Tack, Steve; Dantowitz, Ronald F.

    2011-01-01

    Flight-testing is traditionally an expensive but critical element in the development and ultimate validation and certification of technologies destined for future operational capabilities. Measurements obtained in relevant flight environments also provide unique opportunities to observe flow phenomena that are often beyond the capabilities of ground testing facilities and computational tools to simulate or duplicate. However, the challenges of minimizing vehicle weight and internal complexity as well as instrumentation bandwidth limitations often restrict the ability to make high-density, in-situ measurements with discrete sensors. Remote imaging offers a potential opportunity to noninvasively obtain such flight data in a complementary fashion. The NASA Hypersonic Thermodynamic Infrared Measurements Project has demonstrated such a capability to obtain calibrated thermal imagery on a hypersonic vehicle in flight. Through the application of existing and accessible technologies, the acreage surface temperature of the Shuttle lower surface was measured during reentry. Future hypersonic cruise vehicles, launcher configurations and reentry vehicles will, however, challenge current remote imaging capability. As NASA embarks on the design and deployment of a new Space Launch System architecture for access beyond earth orbit (and the commercial sector focused on low earth orbit), an opportunity exists to implement an imagery system and its supporting infrastructure that provides sufficient flexibility to incorporate changing technology to address the future needs of the flight test community. A long-term vision is offered that supports the application of advanced multi-waveband sensing technology to aid in the development of future aerospace systems and critical technologies to enable highly responsive vehicle operations across the aerospace continuum, spanning launch, reusable space access and global reach. Motivations for development of an Agency level imagery

  5. Improving reliability of non-volatile memory technologies through circuit level techniques and error control coding

    Science.gov (United States)

    Yang, Chengen; Emre, Yunus; Cao, Yu; Chakrabarti, Chaitali

    2012-12-01

    Non-volatile resistive memories, such as phase-change RAM (PRAM) and spin transfer torque RAM (STT-RAM), have emerged as promising candidates because of their fast read access, high storage density, and very low standby power. Unfortunately, in scaled technologies, high storage density comes at the price of lower reliability. In this article, we first study in detail the causes of errors for PRAM and STT-RAM. We see that while for multi-level cell (MLC) PRAM the errors are due to resistance drift, in STT-RAM they are due to process variations and variations in the device geometry. We develop error models to capture these effects and propose techniques based on tuning of circuit-level parameters to mitigate some of these errors. Unfortunately, circuit-level techniques alone are not sufficient for reliable memory operation, and so we propose error control coding (ECC) techniques that can be used on top of them. We show that for STT-RAM, a combination of voltage boosting and write pulse width adjustment at the circuit level, followed by a BCH-based ECC scheme, can reduce the block failure rate (BFR) to 10^-8. For MLC-PRAM, a combination of threshold resistance tuning and a BCH-based product code ECC scheme can achieve the same target BFR of 10^-8. The product code scheme is flexible; it allows migration to a stronger code to guarantee the same target BFR as the raw bit error rate increases with the number of programming cycles.
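
    As a rough illustration of the figure of merit quoted above: for a t-error-correcting BCH code over an n-bit block with independent raw bit errors of probability p, the block fails when more than t bits are in error. The block length, correction capability and raw bit error rate in the sketch below are illustrative placeholders, not the parameters used by the authors.

```python
from math import comb

def block_failure_rate(n: int, t: int, p: float) -> float:
    """P(more than t bit errors in an n-bit block), assuming i.i.d. bit errors."""
    return 1.0 - sum(comb(n, i) * p**i * (1.0 - p)**(n - i) for i in range(t + 1))

# Hypothetical example: 512-bit block, 4-error-correcting BCH, raw BER of 1e-4.
print(block_failure_rate(512, 4, 1e-4))
```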

  6. Code Validation of CFD Heat Transfer Models for Liquid Rocket Engine Combustion Devices

    National Research Council Canada - National Science Library

    Coy, E. B

    2007-01-01

    .... The design of the rig and its capabilities are described. A second objective of the test rig is to provide CFD validation data under conditions relevant to liquid rocket engine thrust chambers...

  7. ELLA-V and technology usage in an English language and literacy acquisition validation randomized controlled trial study

    OpenAIRE

    Roisin P. Corcoran; Steven M. Ross; Beverly J. Irby; Fuhui Tong; Rafael Lara-Alecio; Cindy Guerrero

    2014-01-01

    This paper describes the use of technology to provide virtual professional development (VPD) for teachers and to conduct classroom observations in a study of English Language Learner (ELL) instruction in grades K–3. The technology applications were part of a cluster randomized control trial (RCT) design for a federally funded longitudinal validation study of a particular program, English Language and Literacy Acquisition-Validation, ELLA- V, to determine its degree of impact on English oral l...

  8. TRIPOLI-4® Monte Carlo code ITER A-lite neutronic model validation

    Energy Technology Data Exchange (ETDEWEB)

    Jaboulay, Jean-Charles, E-mail: jean-charles.jaboulay@cea.fr [CEA, DEN, Saclay, DM2S, SERMA, F-91191 Gif-sur-Yvette (France); Cayla, Pierre-Yves; Fausser, Clement [MILLENNIUM, 16 Av du Québec Silic 628, F-91945 Villebon sur Yvette (France); Damian, Frederic; Lee, Yi-Kang; Puma, Antonella Li; Trama, Jean-Christophe [CEA, DEN, Saclay, DM2S, SERMA, F-91191 Gif-sur-Yvette (France)

    2014-10-15

    3D Monte Carlo transport codes are extensively used in neutronic analysis, especially in radiation protection and shielding analyses for fission and fusion reactors. TRIPOLI-4® is a Monte Carlo code developed by CEA. The aim of this paper is to show its capability to model a large-scale fusion reactor with a complex neutron source and geometry. A benchmark between MCNP5 and TRIPOLI-4® on the ITER A-lite model was carried out; neutron flux, nuclear heating in the blankets and tritium production rate in the European TBMs were evaluated and compared. The methodology to build the TRIPOLI-4® A-lite model is based on MCAM and the MCNP A-lite model. Simplified TBMs, from KIT, were integrated in the equatorial port. A good agreement between MCNP and TRIPOLI-4® is shown; the discrepancies mostly fall within the statistical error.
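
    As an illustration of the kind of code-to-code comparison described above, the sketch below shows a simple consistency check between two Monte Carlo tallies and their statistical uncertainties; the flux values and uncertainties are hypothetical, not taken from the benchmark.

```python
def within_statistics(x1: float, s1: float, x2: float, s2: float, k: float = 2.0) -> bool:
    """True if two tallies agree within k combined standard deviations."""
    return abs(x1 - x2) <= k * (s1**2 + s2**2) ** 0.5

# Hypothetical neutron flux tallies (value, 1-sigma) from two codes:
print(within_statistics(1.02e14, 1.5e12, 1.00e14, 1.2e12))
```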

  9. RELAP-7 Software Verification and Validation Plan - Requirements Traceability Matrix (RTM) Part 2: Code Assessment Strategy, Procedure, and RTM Update

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in the previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.

  10. Validation of the COBRA code for dry out power calculation in CANDU type advanced fuels

    International Nuclear Information System (INIS)

    Daverio, Hernando J.

    2003-01-01

    Stern Laboratories performed full-scale CHF testing of the CANFLEX bundle at AECL's request. This experiment is modeled with the COBRA IV HW code to verify its capability for dryout power calculation. Good results were obtained: errors below 10% with respect to all measured data, and 1% for the range of standard CANDU operating conditions. These calculations were repeated for the CNEA advanced fuel CARA, obtaining the same performance as for the CANFLEX fuel. (author)

  11. Design and implementation of safety traceability system for candied fruits based on two-dimension code technology

    Directory of Open Access Journals (Sweden)

    ZHAO Kun

    2014-12-01

    Full Text Available Traceability is a basic principle of food safety. A food safety traceability system based on QR code and cloud computing technology is introduced in this paper. First, QR code technology and the concept of traceability are introduced. Then, through field investigation, the traceability process is analyzed, the system and its database are designed, and the consumer experience technology is studied. Finally, the collection, transmission and final presentation of traceability information are described, and the future development of traceability systems is discussed.

  12. Student Interest in Technology and Science (SITS) Survey: Development, Validation, and Use of a New Instrument

    Science.gov (United States)

    Romine, William; Sadler, Troy D.; Presley, Morgan; Klosterman, Michelle L.

    2014-01-01

    This study presents the systematic development, validation, and use of a new instrument for measuring student interest in science and technology. The Student Interest in Technology and Science (SITS) survey is composed of 5 sub-sections assessing the following dimensions: interest in learning science, using technology to learn science, science…

  13. Validation of an Instrument to Measure Students' Motivation and Self-Regulation towards Technology Learning

    Science.gov (United States)

    Liou, Pey-Yan; Kuo, Pei-Jung

    2014-01-01

    Background: Few studies have examined students' attitudinal perceptions of technology. There is no appropriate instrument to measure senior high school students' motivation and self-regulation toward technology learning among the current existing instruments in the field of technology education. Purpose: The present study is to validate an…

  14. Decay heat measurement on fusion reactor materials and validation of calculation code system

    Energy Technology Data Exchange (ETDEWEB)

    Maekawa, Fujio; Ikeda, Yujiro; Wada, Masayuki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-03-01

    Decay heat rates for 32 fusion reactor relevant materials irradiated with 14-MeV neutrons were measured for cooling times between 1 minute and 400 days. Using this experimental database, the validity of decay heat calculation systems for fusion reactors was investigated. (author)

  15. Validation study of automatically generated codes in colonoscopy using the endoscopic report system Endobase

    NARCIS (Netherlands)

    Groenen, Marcel J. M.; van Buuren, Henk R.; van Berge Henegouwen, Gerard P.; Fockens, Paul; van der Lei, Johan; Stuifbergen, Wouter N. H. M.; van der Schaar, Peter J.; Kuipers, Ernst J.; Ouwendijk, Rob J. Th

    2010-01-01

    OBJECTIVE: Gastrointestinal endoscopy databases are important for surveillance, epidemiology, quality control and research. Good quality of automatically generated databases is of key importance for drawing justified conclusions from the data. The aim of this study is to validate the

  16. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Reznik, L.

    1994-01-01

    Various computer codes employed at Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were due to model improvement and enhancements of algorithm efficiency and accuracy. With growing acceptance of software quality assurance requirements and methods, a program of extensive testing of modified software has been adopted within the regular maintenance activities. In this work a survey has been performed of various software quality assurance methods of software testing, which belong mainly to the two major categories of implementation ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method have been selected as especially suitable functional methods for testing reactor analysis codes. A separate study of software quality assurance methods and techniques has been performed in this work with the objective of establishing appropriate pre-test software specification methods. Two methods of software analysis and specification have been selected as the most suitable for this purpose: the method of data flow diagrams has been shown to be particularly valuable for performing the functional/procedural software specification, while the entity-relationship diagram method has proved efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed, in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs
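
    As a purely hypothetical illustration of the specification-based methods named above, the sketch below applies equivalence class partitioning and boundary value analysis to an assumed input specification (an enrichment limited to 0-5 wt%); the function, range and test values are not from the paper.

```python
def validate_enrichment(e: float) -> bool:
    """Accept enrichments in the closed interval [0.0, 5.0] wt% (assumed specification)."""
    return 0.0 <= e <= 5.0

# One representative value per equivalence class, plus the boundary values:
cases = {
    -1.0: False,  # invalid class: below range
     2.5: True,   # valid class: interior point
     7.0: False,  # invalid class: above range
     0.0: True,   # lower boundary
     5.0: True,   # upper boundary
}
assert all(validate_enrichment(x) == expected for x, expected in cases.items())
```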

  17. Validation of the LH antenna code ALOHA against Tore Supra experiments

    International Nuclear Information System (INIS)

    Hillairet, J.; Ekedahl, A.; Kocan, M.; Gunn, J. P.; Goniche, M.

    2009-01-01

    Comparisons between ALOHA code predictions and experimental measurements of reflection coefficients for the two different Lower Hybrid Current Drive (LHCD) antennas (named C2 and C3) in Tore Supra are presented. A large variation of density in front of the antennas was obtained by varying the distance between the plasma and the antennas. Low power was used in order to avoid non-linear effects on the wave coupling. Results obtained with ALOHA are in good agreement with the experimental measurements for both Tore Supra antennas and show that ALOHA is an efficient LH predictive tool.

  18. Modeling of solid/porous wall boundary conditions for the validation of computational fluid dynamics codes

    Science.gov (United States)

    Beutner, Thomas J.; Celik, Zeki Z.; Roberts, Leonard

    1992-01-01

    A computational study has been undertaken to investigate methods of modeling solid and porous wall boundary conditions in computational fluid dynamics (CFD) codes. The procedure utilizes experimental measurements at the walls to develop a flow field solution based on the method of singularities. This flow field solution is then imposed as a boundary condition in a CFD simulation of the internal flow field. The effectiveness of this method in describing the boundary conditions at the wind tunnel walls using only sparse experimental measurements has been investigated. The position and refinement of experimental measurement locations required to describe porous wall boundary conditions have also been considered.

  19. 3D Measurement Technology by Structured Light Using Stripe-Edge-Based Gray Code

    International Nuclear Information System (INIS)

    Wu, H B; Chen, Y; Wu, M Y; Guan, C R; Yu, X Y

    2006-01-01

    The key problem in 3D vision measurement using the triangulation method based on structured light is to acquire the projection angle of the projected light accurately. In order to acquire the projection angle, and thereby determine the correspondence between sampling points and image points, a method for encoding and decoding structured light based on Gray code stripe edges is presented. The method encodes with Gray code stripes and decodes using stripe edges located by sub-pixel techniques instead of pixel centres, so the one-bit decoding error of the latter is removed. The accuracy of image sampling point location, and of the correspondence between image sampling points and object sampling points, reaches the sub-pixel level. In addition, the measurement error caused by dividing the projection angle irregularly with even-width encoding stripes was analysed and corrected. The encoding and decoding principle and the decoding equations are described. Finally, 3ds Max and Matlab software were used to simulate the measurement system and reconstruct the measured surface. Experimental results indicate that the measurement error is about 0.05%
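
    A minimal sketch of binary-reflected Gray code conversion, the encoding underlying the stripe patterns described above (projector patterns, camera calibration and sub-pixel edge detection are omitted).

```python
def to_gray(n: int) -> int:
    """Binary-reflected Gray code of n."""
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    """Inverse conversion: recover the stripe index from its Gray code."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

assert all(from_gray(to_gray(i)) == i for i in range(1024))
# Adjacent stripe indices differ in exactly one bit, which is why decoding near a
# stripe edge can be wrong by at most one code word.
assert all(bin(to_gray(i) ^ to_gray(i + 1)).count("1") == 1 for i in range(1023))
```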

  20. Validation of activity determination codes and nuclide vectors by using results from processing of retired components and operational waste

    International Nuclear Information System (INIS)

    Lundgren, Klas; Larsson, Arne

    2012-01-01

    Decommissioning studies for nuclear power reactors are performed in order to assess the decommissioning costs and the waste volumes as well as to provide data for the licensing and construction of the LILW repositories. An important part of this work is to estimate the amount of radioactivity in the different types of decommissioning waste. Studsvik ALARA Engineering has performed such assessments for LWRs and other nuclear facilities in Sweden. These assessments depend to a large extent on calculations, senior experience and sampling at the facilities. The precision of the calculations has been found to be relatively high close to the reactor core; for natural reasons, the precision declines with distance. Even if the activity values are lower, the content of hard-to-measure nuclides can cause problems in the long-term safety demonstration of LLW repositories. At the same time, Studsvik is processing significant volumes of metallic and combustible waste from power stations in operation and in the decommissioning phase, as well as from other nuclear facilities such as research and waste treatment facilities. By combining the unique knowledge in assessment of radioactivity inventories with the large data bank that waste processing represents, the activity determination codes can be validated and the waste processing analysis supported with additional data. The intention of this presentation is to highlight how the European nuclear industry could jointly use the waste processing data for validation of activity determination codes. (authors)

  1. Experimental validation for combustion analysis of GOTHIC 6.1b code in 2-dimensional premixed combustion experiments

    International Nuclear Information System (INIS)

    Lee, J. Y.; Lee, J. J.; Park, K. C.

    2003-01-01

    In this study, the prediction capability of the GOTHIC code for hydrogen combustion phenomena was validated against the results of two-dimensional premixed hydrogen combustion experiments performed at Seoul National University. In the experimental results, the propagation characteristics of the hydrogen flame, such as the buoyancy effect and the flame front shape, could be confirmed. The combustion time of the tests was about 0.1 s. In the GOTHIC analysis results, the GOTHIC code could predict the overall hydrogen flame propagation characteristics, but the buoyancy effect and flame shape did not compare well with the experimental results. In particular, in the case of the flame propagating toward the dead-end, GOTHIC predicted that the flame was not affected by the flow, and this caused flame propagation quite different from the experimental results. Moreover, the combustion time of the analyses was about 1 s, which is ten times longer than the experimental result. To obtain more reasonable analysis results, the combustion model parameters in the GOTHIC code need to be applied appropriately and the hydrogen flame characteristics need to be reflected in solving the governing equations

  2. Validation of Heat Transfer and Film Cooling Capabilities of the 3-D RANS Code TURBO

    Science.gov (United States)

    Shyam, Vikram; Ameri, Ali; Chen, Jen-Ping

    2010-01-01

    The capabilities of the 3-D unsteady RANS code TURBO have been extended to include heat transfer and film cooling applications. The results of simulations performed with the modified code are compared to experiment and to theory, where applicable. Wilcox's k-ω turbulence model has been implemented to close the RANS equations. Two simulations are conducted: (1) flow over a flat plate and (2) flow over an adiabatic flat plate cooled by one hole inclined at 35° to the free stream. For (1), agreement with theory is found to be excellent for heat transfer, represented by the local Nusselt number, and quite good for momentum, as represented by the local skin friction coefficient. This report compares the local skin friction coefficients and Nusselt numbers on a flat plate obtained using Wilcox's k-ω model with the theory of Blasius. The study looks at laminar and turbulent flows over an adiabatic flat plate and over an isothermal flat plate for two different wall temperatures. It is shown that TURBO is able to accurately predict heat transfer on a flat plate. For (2), TURBO shows good qualitative agreement with film cooling experiments performed on a flat plate with one cooling hole. Quantitatively, film effectiveness is underpredicted downstream of the hole.
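
    For reference, the standard laminar flat-plate relations this kind of comparison is usually made against are sketched below (Blasius skin friction and the corresponding local Nusselt number correlation); the exact expressions and constants used in the report may differ.

```python
def cf_laminar(re_x: float) -> float:
    """Local skin friction coefficient for a laminar flat-plate boundary layer (Blasius)."""
    return 0.664 / re_x**0.5

def nu_laminar(re_x: float, pr: float) -> float:
    """Local Nusselt number for a laminar boundary layer on an isothermal flat plate."""
    return 0.332 * re_x**0.5 * pr ** (1.0 / 3.0)

# Illustrative values of Re_x and Pr:
print(cf_laminar(1.0e5), nu_laminar(1.0e5, 0.7))
```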

  3. Development and Validation of a Momentum Integral Numerical Analysis Code for Liquid Metal Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xiangyi; Suh, Kune Y. [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this work, a benchmark problem is used to assess the accuracy of the upgraded in-house code MINA. Results from different best-estimate codes employing various grid spacer pressure drop correlations are compared in order to suggest the best one. With In's method, MINA shows good agreement with the experimental data, as shown in Figure 7. The failure of the prediction in previous work was caused by the use of Rehme's method, which is categorized into four groups according to different fitting strategies. Through comparison of the drag coefficients calculated by the four groups of Rehme's method, the equivalent drag coefficient calculated by In's method, and the experimental data shown in Figure 8, we conclude that Rehme's method considerably underestimates the drag coefficients of the grid spacers used in HELIOS, while In's method gives a reasonable prediction. Starting from the core inlet, the accumulated pressure losses along the forced convection flow path are presented in Figure 9; the good agreement of the MINA prediction with the experimental result shows that MINA has very good capability in integrated momentum analysis, which makes it robust for future design scoping of LFRs.
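
    A minimal sketch of the accumulated-pressure-loss bookkeeping described above: friction and local (grid spacer) losses summed segment by segment along the flow path. The geometry, friction factor, loss coefficients and fluid properties below are placeholders, not HELIOS or MINA data.

```python
def accumulated_dp(segments, rho, v):
    """segments: (length_m, hydraulic_diameter_m, friction_factor, K_local) per segment.
    Returns the running total of pressure loss [Pa] along the flow path."""
    dyn = 0.5 * rho * v**2          # dynamic pressure
    total, losses = 0.0, []
    for length, d_h, f, k_local in segments:
        total += (f * length / d_h + k_local) * dyn
        losses.append(total)
    return losses

# Two hypothetical segments, the second containing a grid spacer (K = 0.8):
print(accumulated_dp([(0.5, 0.01, 0.02, 0.0), (0.5, 0.01, 0.02, 0.8)], rho=10500.0, v=1.0))
```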

  4. Calibration/Validation Technology for the CO2 Satellite Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We are proposing to develop high altitude CO2 analyzer technology that can be deployed on the research aircraft of NASA's Airborne Science Program (ASP). The...

  5. Validation of PV-RPM Code in the System Advisor Model.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Geoffrey Taylor [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lavrova, Olga [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-04-01

    This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.
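
    A minimal sketch, not the PV-RPM algorithm itself, of the general approach the abstract describes: drawing component failure and repair times from assumed probability distributions over a simulation period. The exponential distributions and the mean-time values used here are illustrative placeholders.

```python
import random

def simulate_component(years: float, mttf_years: float, mttr_days: float, seed: int = 0):
    """Return a list of (failure_time, back_in_service_time) events, in years."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(1.0 / mttf_years)          # time to next failure
        if t >= years:
            break
        repair = rng.expovariate(365.0 / mttr_days)     # repair duration, in years
        events.append((t, t + repair))
        t += repair
    return events

# Hypothetical component: 8-year mean time to failure, 14-day mean time to repair.
print(simulate_component(25.0, mttf_years=8.0, mttr_days=14.0))
```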

  6. Validation of main nuclear libraries used in thorium reactors using the Serpent code

    International Nuclear Information System (INIS)

    Faga, Lucas J.

    2017-01-01

    The purpose of this work is to validate the standard Serpent nuclear data library for systems containing U-233, U-235, Th-232, Pu-239 and Pu-240. The project will support the other projects of the newly created study group of the Nuclear Engineering Center (CEN) of Instituto de Pesquisas Energéticas e Nucleares (IPEN), linked to the study of several types of reactors and their application to thorium cycles, a subject that gains more and more visibility due to strong and potential promises of an energy revolution. The results obtained at the end of the simulations were satisfactory, with the effective multiplication factors within about 100 pcm of the values provided by the benchmarks, as expected for a validated library. The minimum difference between these values was 2 pcm and the maximum 280 pcm. The final analysis demonstrates that the ENDF/B-VII library has validated nuclear data for the isotopes of interest and may be used in future thorium study group projects
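
    A minimal sketch of the figure of merit quoted above: the deviation of a calculated effective multiplication factor from a benchmark value, expressed in pcm (1 pcm = 1e-5 in k); the k values below are hypothetical.

```python
def diff_pcm(k_calc: float, k_benchmark: float) -> float:
    """Deviation of k-effective from the benchmark value, in pcm (1 pcm = 1e-5)."""
    return (k_calc - k_benchmark) * 1.0e5

print(diff_pcm(1.00280, 1.00000))  # hypothetical values -> about 280 pcm
```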

  7. Health Information Technology Usability Evaluation Scale (Health-ITUES) for Usability Assessment of Mobile Health Technology: Validation Study.

    Science.gov (United States)

    Schnall, Rebecca; Cho, Hwayoung; Liu, Jianfang

    2018-01-05

    Mobile technology has become ubiquitous and can be particularly useful in the delivery of health interventions. This technology can allow us to deliver interventions at scale, cover broad geographic areas, and deliver technologies in highly tailored ways based on the preferences or characteristics of users. The broad use of mobile technologies supports the need for usability assessments of these tools. Although a number of usability assessment instruments have been developed, none have been validated for use with mobile technologies. The goal of this work was to validate the Health Information Technology Usability Evaluation Scale (Health-ITUES), a customizable usability assessment instrument, in a sample of community-dwelling adults who were testing the use of a new mobile health (mHealth) technology. A sample of 92 community-dwelling adults living with HIV used a new mobile app for symptom self-management and completed the Health-ITUES to assess the usability of the app. They also completed the Post-Study System Usability Questionnaire (PSSUQ), a widely used and well-validated usability assessment tool. Correlations between these scales and each of the subscales were assessed. The subscales of the Health-ITUES showed high internal consistency reliability (Cronbach alpha=.85-.92). Each of the Health-ITUES subscales and the overall scale was moderately to strongly correlated with the PSSUQ scales (r=.46-.70), demonstrating the criterion validity of the Health-ITUES. The Health-ITUES has demonstrated reliability and validity for use in assessing the usability of mHealth technologies in community-dwelling adults living with a chronic illness. ©Rebecca Schnall, Hwayoung Cho, Jianfang Liu. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 05.01.2018.
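
    A minimal sketch of the internal consistency statistic reported above (Cronbach's alpha), computed from a small hypothetical item-response matrix; the Health-ITUES data themselves are not reproduced here.

```python
def cronbach_alpha(scores):
    """scores: one list of item scores per respondent (all rows the same length)."""
    k = len(scores[0])                                   # number of items
    def var(xs):                                         # sample variance (n-1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1.0 - sum(item_vars) / total_var)

# Four hypothetical respondents answering three Likert-type items:
print(cronbach_alpha([[4, 5, 4], [3, 4, 4], [5, 5, 5], [2, 3, 3]]))
```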

  8. Validation of the Serpent 2-DYNSUB code sequence using the Special Power Excursion Reactor Test III (SPERT III)

    International Nuclear Information System (INIS)

    Knebel, Miriam; Mercatali, Luigi; Sanchez, Victor; Stieglitz, Robert; Macian-Juan, Rafael

    2016-01-01

    Highlights: • Full few-group cross section tables created by the Monte Carlo lattice code Serpent 2. • Serpent 2 group constant methodology verified for HFP static and transient cases. • Serpent 2-DYNSUB tool chain validated using SPERT III REA experiments. • Serpent 2-DYNSUB tool chain suitable to model RIAs in PWRs. - Abstract: The Special Power Excursion Reactor Test III (SPERT III) is studied using the Serpent 2-DYNSUB code sequence in order to validate it for modeling reactivity insertion accidents (RIA) in PWRs. The SPERT III E-core was a thermal research reactor constructed to analyze reactor dynamics. Its configuration resembles a commercial PWR in terms of fuel type, choice of moderator, coolant flow and system pressure. The initial conditions of the rod ejection accident (REA) experiments performed cover cold startup, hot startup, hot standby and operating power scenarios. Eight of these experiments were analyzed in detail. Firstly, multi-dimensional nodal diffusion cross section tables were created for the three-dimensional reactor simulator DYNSUB employing the Monte Carlo neutron transport code Serpent 2. In a second step, DYNSUB stationary simulations were compared to Monte Carlo reference three-dimensional full-scale solutions obtained with Serpent 2 (cold startup conditions) and Serpent 2/SUBCHANFLOW (operating power conditions), with a good agreement being observed. The latter tool is an internal coupling of Serpent 2 and the sub-channel thermal-hydraulics code SUBCHANFLOW. Finally, DYNSUB was utilized to study the eight selected transient experiments. Results were found to match measurements well. As the selected experiments cover much of the possible transient (delayed super-critical, prompt super-critical and super-prompt critical excursion) and initial conditions (cold and hot as well as zero, little and full power reactor states) one expects in commercial PWRs, the obtained results give confidence that the Serpent 2-DYNSUB tool chain is

  9. Validation of current procedural terminology codes for rotavirus vaccination among infants in two commercially insured US populations.

    Science.gov (United States)

    Hoffman, Veena; Everage, Nicholas J; Quinlan, Scott C; Skerry, Kathleen; Esposito, Daina; Praet, Nicolas; Rosillon, Dominique; Holick, Crystal N; Dore, David D

    2016-12-01

    We validated procedure codes used in health insurance claims for reimbursement of rotavirus vaccination by comparing claims for monovalent live-attenuated human rotavirus vaccine (RV1) and live, oral pentavalent rotavirus vaccine (RV5) to medical records. Using administrative data from two commercially insured United States populations, we randomly sampled vaccination claims for RV1 and RV5 from a cohort of infants aged less than 1 year from an ongoing post-licensure safety study of rotavirus vaccines. The codes for RV1 and RV5 found in claims were confirmed through medical record review. The positive predictive value (PPV) of the Current Procedural Terminology codes for RV1 and RV5 was calculated as the number of medical record-confirmed vaccinations divided by the number of medical records obtained. Medical record review confirmed 92 of 104 RV1 vaccination claims (PPV: 88.5%; 95% CI: 80.7-93.9%) and 98 of 113 RV5 vaccination claims (PPV: 86.7%; 95% CI: 79.1-92.4%). Among the 217 medical records abstracted, only three (1.4%) of the vaccinations were misclassified in claims; all were RV5 misclassified as RV1. The medical records corresponding to 9 RV1 and 15 RV5 claims contained insufficient information to classify the type of rotavirus vaccine. Misclassification of rotavirus vaccines is infrequent within claims. The PPVs reported here are conservative estimates, as those with insufficient information in the medical records were assumed to be incorrectly coded in the claims. Copyright © 2016 John Wiley & Sons, Ltd.
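
    The positive predictive value calculation described above is simple enough to reproduce directly from the reported counts; the sketch below does so (the confidence intervals, which the authors also report, are not recomputed here).

```python
def ppv(confirmed: int, reviewed: int) -> float:
    """Positive predictive value: record-confirmed vaccinations / records obtained."""
    return confirmed / reviewed

print(f"RV1: {ppv(92, 104):.1%}")  # 88.5%
print(f"RV5: {ppv(98, 113):.1%}")  # 86.7%
```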

  10. Verification and validation of LMFBR static core mechanics codes. Pt. 2

    International Nuclear Information System (INIS)

    1990-07-01

    Although core static mechanics had received sporadic attention at international gatherings in the 1970s (e.g. the SMIRT series), the first major coordinated international review was organized as an IWGFR Specialists' Meeting, which was held in the United Kingdom at Wilmslow, Cheshire in October 1984. The problems of structural analysis raised by core static mechanics were novel and difficult. They involved analysing the non-linear interactive behaviour of hundreds of wrappers separated by gaps which might be open or closed. Because this problem had not responded to conventional analysis, each country had set about solving this 'discontinuum' problem by specialist coding. The current document presents the proceedings of the two Research Coordination Meetings held in Vienna (March 1987) and in Oarai, Japan (May 1989). The proceedings of the first meeting contain 11 presentations and those of the second 14. A separate abstract was prepared for each of these 25 papers. Refs, figs and tabs

  11. Experimental benchmark and code validation for airfoils equipped with passive vortex generators

    DEFF Research Database (Denmark)

    Baldacchino, D.; Manolesos, M.; Ferreira, Célia Maria Dias

    2016-01-01

    Experimental results and complementary computations for airfoils with vortex generators are compared in this paper, as part of an effort within the AVATAR project to develop tools for wind turbine blade control devices. Measurements from two airfoils equipped with passive vortex generators, a 30% thick DU97W300 and an 18% thick NTUA T18, have been used for benchmarking several simulation tools. These tools span low-to-high complexity, ranging from engineering-level integral boundary layer tools to fully-resolved computational fluid dynamics codes. Results indicate that with appropriate calibration, engineering-type tools can capture the effects of vortex generators and outperform more complex tools. Fully resolved CFD comes at a much higher computational cost and does not necessarily capture the increased lift due to the VGs. However, in lieu of the limited experimental data available...

  12. Threats to Validity When Using Open-Ended Items in International Achievement Studies: Coding Responses to the PISA 2012 Problem-Solving Test in Finland

    Science.gov (United States)

    Arffman, Inga

    2016-01-01

    Open-ended (OE) items are widely used to gather data on student performance in international achievement studies. However, several factors may threaten validity when using such items. This study examined Finnish coders' opinions about threats to validity when coding responses to OE items in the PISA 2012 problem-solving test. A total of 6…

  13. Development of blow down and sodium-water reaction jet analysis codes-Validation by sodium-water reaction tests (SWAT-1R)

    International Nuclear Information System (INIS)

    Hiroshi Seino; Akikazu Kurihara; Isao Ono; Koji Jitsu

    2005-01-01

    The blow down analysis code (LEAP-BLOW) and the sodium-water reaction jet analysis code (LEAP-JET) have been developed in order to improve the evaluation method for the sodium-water reaction event in the steam generator (SG) of a sodium cooled fast breeder reactor (FBR). Validation analyses with these two codes were carried out using the data of the Sodium-Water Reaction Test (SWAT-1R). The following main results were obtained through this validation: (1) The calculational results from LEAP-BLOW, such as internal pressure and water flow rate, show good agreement with the results of the SWAT-1R test. (2) The LEAP-JET code can qualitatively simulate the behavior of the sodium-water reaction. However, it is found that the code has a tendency to overestimate the maximum temperature of the reaction jet. (authors)

  14. Validation and configuration management plan for the KE basins KE-PU spreadsheet code

    International Nuclear Information System (INIS)

    Harris, R.A.

    1996-01-01

    This report provides documentation of the spreadsheet KE-PU software that is used to verify compliance with the Operational Safety Requirement and Process Standard limit on the amount of plutonium in the KE-Basin sandfilter backwash pit. Included are: A summary of the verification of the method and technique used in KE-PU that were documented elsewhere, the requirements, plans, and results of validation tests that confirm the proper functioning of the software, the procedures and approvals required to make changes to the software, and the method used to maintain configuration control over the software

  15. Electrically Driven Thermal Management: Flight Validation, Experiment Development, Future Technologies

    Science.gov (United States)

    Didion, Jeffrey R.

    2018-01-01

    Electrically Driven Thermal Management is an active research and technology development initiative incorporating ISS technology flight demonstrations (STP-H5), development of a Microgravity Science Glovebox (MSG) flight experiment, and laboratory-based investigations of electrically based thermal management techniques. The program targets integrated thermal management for future generations of RF electronics and power electronic devices. This presentation reviews four program elements: (i) results from the Electrohydrodynamic (EHD) Long Term Flight Demonstration launched in February 2017; (ii) development of the Electrically Driven Liquid Film Boiling Experiment; (iii) two university-based research efforts; and (iv) development of an Oscillating Heat Pipe evaluation at Goddard Space Flight Center.

  16. Preventing another Chernobyl: codes, practices, and the role of new technology

    International Nuclear Information System (INIS)

    Egan, J.R.

    1988-01-01

    Steps to prevent a nuclear accident and to mitigate its consequences are considered. The national and international legislation now available is mentioned elsewhere in the book. This chapter discusses the role of codes of conduct and personal agreements which could also prevent nuclear accidents. It is suggested that a radical departure from existing nuclear technology is needed, and some definite goals and objectives are proposed. The first is that there should be no meltdowns: all reactors must be physically incapable of core meltdown or explosion under any circumstances. This would mean no huge emergency evacuation plans and no need for expensive emergency core cooling systems. No spiralling cost overruns, no nuclear waste problem, no more huge reactors too big for national electrical systems, no military complications, no intractable decommissionings and no quality control problems are also given as goals. Some of these goals could be achieved with inherently safe reactor designs, and with much smaller reactors assembled in factories rather than constructed on site. Other suggestions are also made for achieving the proposed new nuclear technology. (U.K.)

  17. Integral and Separate Effects Tests for Thermal Hydraulics Code Validation for Liquid-Salt Cooled Nuclear Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, Per

    2012-10-30

    The objective of the 3-year project was to collect integral effects test (IET) data to validate the RELAP5-3D code and other thermal hydraulics codes for use in predicting the transient thermal hydraulics response of liquid salt cooled reactor systems, including integral transient response for forced and natural circulation operation. The reference system for the project is a modular, 900-MWth Pebble Bed Advanced High Temperature Reactor (PB-AHTR), a specific type of Fluoride salt-cooled High temperature Reactor (FHR). Two experimental facilities were developed for thermal-hydraulic integral effects tests (IETs) and separate effects tests (SETs). The facilities use simulant fluids for the liquid fluoride salts, with very little distortion to the heat transfer and fluid dynamics behavior. The CIET Test Bay facility was designed, built, and operated. IET data for steady state and transient natural circulation was collected. SET data for convective heat transfer in pebble beds and straight channel geometries was collected. The facility continues to be operational and will be used for future experiments, and for component development. The CIET 2 facility is larger in scope, and its construction and operation has a longer timeline than the duration of this grant. The design for the CIET 2 facility has drawn heavily on the experience and data collected on the CIET Test Bay, and it was completed in parallel with operation of the CIET Test Bay. CIET 2 will demonstrate start-up and shut-down transients and control logic, in addition to LOFC and LOHS transients, and buoyant shut down rod operation during transients. Design of the CIET 2 Facility is complete, and engineering drawings have been submitted to an external vendor for outsourced quality controlled construction. CIET 2 construction and operation continue under another NEUP grant. IET data from both CIET facilities is to be used for validation of system codes used for FHR modeling, such as RELAP5-3D. A set of

  18. Validation and application of the system code TRACE for safety related investigations of innovative nuclear energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Wadim

    2011-12-19

    The system code TRACE is the latest development of the U.S. Nuclear Regulatory Commission (US NRC). TRACE, developed for the analysis of operational conditions, transients and accidents of light water reactors (LWR), is a best-estimate code with two-fluid, six-equation models for mass, energy, and momentum conservation, and related closure models. Since TRACE is mainly applied to LWR-specific issues, its validation basis for innovative nuclear systems (liquid metal cooled systems, systems operated with supercritical water, etc.) is very limited, almost non-existent. In this work, an essential contribution to the validation of TRACE for lead and lead alloy cooled systems, as well as systems operated with supercritical water, is provided in a consistent and unified way. In a first step, model discrepancies in the TRACE source code were removed. These inconsistencies caused incorrect prediction of the thermophysical properties of supercritical water and lead-bismuth eutectic, and hence incorrect prediction of heat transfer relevant characteristic numbers like the Reynolds or Prandtl number. In addition to the correction of the models used to predict these quantities, models describing the thermophysical properties of lead and Diphyl THT (a synthetic heat transfer medium) were implemented. Several experiments and numerical benchmarks were used to validate the modified TRACE version. These experiments, mainly focused on wall-to-fluid heat transfer, revealed that not only the thermophysical properties but also the heat transfer models were afflicted with inconsistencies. The models for heat transfer to liquid metals were enhanced so that the code can now distinguish between pipe and bundle flow and use the appropriate correlation. Heat transfer to supercritical water was not modeled in TRACE at all up to now; completely new routines were implemented to overcome that issue. The comparison of the calculations to the experiments showed, on one hand, the necessity
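
    A minimal sketch of the heat transfer relevant characteristic numbers mentioned above, evaluated from thermophysical properties; the property values used here are rough placeholders for a lead-bismuth-like fluid, not the correlations implemented in TRACE.

```python
def reynolds(rho: float, v: float, d_h: float, mu: float) -> float:
    """Reynolds number from density, velocity, hydraulic diameter and dynamic viscosity."""
    return rho * v * d_h / mu

def prandtl(cp: float, mu: float, k: float) -> float:
    """Prandtl number from specific heat, dynamic viscosity and thermal conductivity."""
    return cp * mu / k

# Placeholder properties, roughly lead-bismuth-like:
print(reynolds(rho=10300.0, v=1.0, d_h=0.01, mu=1.8e-3))   # ~5.7e4
print(prandtl(cp=146.0, mu=1.8e-3, k=12.0))                 # ~0.02
```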

  19. The Sandia MEMS Passive Shock Sensor : FY08 testing for functionality, model validation, and technology readiness.

    Energy Technology Data Exchange (ETDEWEB)

    Walraven, Jeremy Allen; Blecke, Jill; Baker, Michael Sean; Clemens, Rebecca C.; Mitchell, John Anthony; Brake, Matthew Robert; Epp, David S.; Wittwer, Jonathan W.

    2008-10-01

    This report summarizes the functional, model validation, and technology readiness testing of the Sandia MEMS Passive Shock Sensor in FY08. Functional testing of a large number of revision 4 parts showed robust and consistent performance. Model validation testing helped tune the models to match data well and identified several areas for future investigation related to high frequency sensitivity and thermal effects. Finally, technology readiness testing demonstrated the integrated elements of the sensor under realistic environments.

  20. Use of bar-code technology in MC and A system

    International Nuclear Information System (INIS)

    Mykhaylov, V.; Odeychuk, N.; Tovkanetz, V.; Lapshin, V.; Ewing, T.

    2001-01-01

    Full text: A significant problem in the handling of nuclear materials is the need for reliable, rapid, integrated automated systems for nuclear material control and accounting (MC and A); such systems also substantially reduce the dose loading of the attending technical personnel. One approach to these problems is the use of bar-code technology. Such an integrated system should include protection of materials, measurement of materials, recording of materials and the drawing up of an inventory list. This is especially important for enterprises that use enriched uranium and other nuclear materials under IAEA safeguards. Under the US assistance program in the field of MC and A, NSC KIPT has received the necessary equipment and software, including non-destructive analysis equipment and the automated inventory material accounting system (AIMAS), intended to modernize the nuclear material accounting system at NSC KIPT. The purpose of the work was to evaluate the generalized procedures for both MC and A and non-destructive analysis, and to adapt them to the specific conditions of the enterprise and the requirements of the Ukraine Regulatory Administration. At NSC KIPT, the largest nuclear and physics research center in Ukraine, measures to introduce bar-code technology for nuclear material control and accounting, using equipment and software from leading US firms (Intermec, Prodigy Max, Tharo Systems, Inc.), have been under way since 1999. During the introduction of this technology, the software for nuclear material control and accounting (the AIMAS database) was installed on NSC KIPT computers. The structure of the NSC KIPT facility has been defined in accordance with State and IAEA requirements. The key measurement points for inventory quantities have been determined in the nuclear material balance zone and the concrete computers, on which is kept

  1. Development and validation of an improved version of the DART code

    International Nuclear Information System (INIS)

    Taboada, H; Moscarda, M.V.; Markiewicz, M.; Estevez, E.; Rest, J.

    2002-01-01

    ANL/USDOE and CNEA Argentina have been participating within a SisterLab Program in the area of Low Enriched Uranium Advanced Fuels since October 16, 1997 under the 'Implementation Arrangement for Technical Exchange and Cooperation in the Area of Peaceful Uses of Nuclear Energy'. An annex concerning DART code optimization has been operative since February 8, 1999. Previously, as a part of this annex we developed a visual version of DART named FASTDART for silicide and U-Mo fuels that was presented at the RERTR Meeting in Las Vegas, Nevada. This paper describes several major improvements in the FASTDART code: a thermal calculation subroutine, a fuel particle size distribution subroutine and several visual interfaces for thermal output plotting and particle size input. Using the power history, coolant regime data and fuel dimensions, the new thermal subroutine is able to calculate at each time step the maximum temperature along the z-longitudinal axis as a function of plate/rod morphology (corrosion oxide, cladding, meat, aluminide particle layer, each radial shell of a central fuel particle, and particle center). Calculated temperatures at each time step are coupled to the DART calculation kernel such that swelling processes, volume phase fractions and meat thermal conductivity are calculated synergistically. The new fuel particle size-distribution subroutine is essential in order to determine the evolution of the volume fraction of reaction product. This phase degrades the heat transport by a twofold mechanism: its appearance implies a diminution of aluminium phase and its thermal conductivity is lower than those of fuel and dispersant phase. The new version includes the capability of plotting thermal data output by means of the plate/rod temperature profile at a given irradiation step, and displaying the maximum temperature evolution of each layer. A comparison between the reaction layer thickness and matrix and fuel volume fractions of several RERTR-3 experiment
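
    A minimal sketch of a one-dimensional, steady-state temperature build-up through a stack of plate layers, the kind of calculation the new thermal subroutine performs across the meat and cladding; the layer data and heat flux below are purely illustrative, and heat generation within the meat is ignored in this simplification.

```python
def layer_temperatures(t_coolant, h_coolant, q_flux, layers):
    """layers: list of (thickness_m, conductivity_W_per_mK), ordered from the coolant inward.
    Returns the temperature at the wetted surface and after each layer."""
    temps = [t_coolant + q_flux / h_coolant]                  # film temperature rise at the wall
    for thickness, k in layers:
        temps.append(temps[-1] + q_flux * thickness / k)      # conduction across the layer
    return temps

# Hypothetical plate: oxide, aluminium cladding, fuel meat; q'' = 1 MW/m^2.
print(layer_temperatures(320.0, 3.0e4, 1.0e6,
                         [(5.0e-6, 2.0), (4.0e-4, 180.0), (3.5e-4, 15.0)]))
```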

  2. NSDann2BS, a neutron spectrum unfolding code based on neural networks technology and two bonner spheres

    Science.gov (United States)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solís Sánches, L. O.; Miranda, R. Castañeda; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2013-07-01

    In this work, a neutron spectrum unfolding code based on artificial intelligence technology is presented. The code, called "Neutron Spectrometry and Dosimetry with Artificial Neural Networks and two Bonner spheres" (NSDann2BS), was designed with a graphical user interface under the LabVIEW programming environment. The main features of this code are the use of an embedded artificial neural network architecture optimized with the "robust design of artificial neural networks" methodology and the use of two Bonner spheres as the only piece of information. In order to build the code presented here, once the net topology was optimized and properly trained, the knowledge stored in the synaptic weights was extracted and, using a graphical framework built on the LabVIEW programming environment, the NSDann2BS code was designed. This code is friendly, intuitive and easy to use for the end user. The code is freely available upon request to the authors. To demonstrate the use of the neural net embedded in the NSDann2BS code, the count rates of 252Cf, 241AmBe and 239PuBe neutron sources measured with a Bonner sphere system were used.
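
    A minimal sketch, not the NSDann2BS network itself: a tiny feed-forward network that maps two Bonner sphere count rates to a coarse neutron spectrum. The topology and the random placeholder weights stand in for the trained synaptic weights the code actually embeds.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), np.zeros(8)   # hidden layer: 2 count rates -> 8 nodes
W2, b2 = rng.normal(size=(5, 8)), np.zeros(5)   # output layer: 8 nodes -> 5 spectrum bins

def unfold(count_rates):
    """Map two Bonner sphere count rates to a coarse (5-bin) spectrum."""
    h = np.tanh(W1 @ np.asarray(count_rates) + b1)
    return W2 @ h + b2

print(unfold([120.0, 45.0]))   # hypothetical count rates for the two spheres
```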

  3. NSDann2BS, a neutron spectrum unfolding code based on neural networks technology and two bonner spheres

    International Nuclear Information System (INIS)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solís Sánches, L. O.; Miranda, R. Castañeda; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2013-01-01

    In this work, a neutron spectrum unfolding code based on artificial intelligence technology is presented. The code, called "Neutron Spectrometry and Dosimetry with Artificial Neural Networks and two Bonner spheres" (NSDann2BS), was designed with a graphical user interface under the LabVIEW programming environment. The main features of this code are the use of an embedded artificial neural network architecture optimized with the "robust design of artificial neural networks" methodology and the use of two Bonner spheres as the only piece of information. In order to build the code presented here, once the net topology was optimized and properly trained, the knowledge stored in the synaptic weights was extracted and, using a graphical framework built on the LabVIEW programming environment, the NSDann2BS code was designed. This code is friendly, intuitive and easy to use for the end user. The code is freely available upon request to the authors. To demonstrate the use of the neural net embedded in the NSDann2BS code, the count rates of 252Cf, 241AmBe and 239PuBe neutron sources measured with a Bonner sphere system were used.

  4. NSDann2BS, a neutron spectrum unfolding code based on neural networks technology and two bonner spheres

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz-Rodriguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solis Sanches, L. O.; Miranda, R. Castaneda; Cervantes Viramontes, J. M. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica. Av. Ramon Lopez Velarde 801. Col. Centro Zacatecas, Zac (Mexico); Vega-Carrillo, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica. Av. Ramon Lopez Velarde 801. Col. Centro Zacatecas, Zac., Mexico. and Unidad Academica de Estudios Nucleares. C. Cip (Mexico)

    2013-07-03

    In this work, a neutron spectrum unfolding code based on artificial intelligence technology is presented. The code, called "Neutron Spectrometry and Dosimetry with Artificial Neural Networks and two Bonner spheres" (NSDann2BS), was designed with a graphical user interface under the LabVIEW programming environment. The main features of this code are the use of an embedded artificial neural network architecture optimized with the "robust design of artificial neural networks" methodology and the use of two Bonner spheres as the only piece of information. In order to build the code presented here, once the net topology was optimized and properly trained, the knowledge stored in the synaptic weights was extracted and, using a graphical framework built on the LabVIEW programming environment, the NSDann2BS code was designed. This code is friendly, intuitive and easy to use for the end user. The code is freely available upon request to the authors. To demonstrate the use of the neural net embedded in the NSDann2BS code, the count rates of 252Cf, 241AmBe and 239PuBe neutron sources measured with a Bonner sphere system were used.

  5. Development and validation of a model TRIGA Mark III reactor with code MCNP5

    International Nuclear Information System (INIS)

    Galicia A, J.; Francois L, J. L.; Aguilar H, F.

    2015-09-01

    The main purpose of this paper is to obtain a model of the TRIGA Mark III reactor core that accurately represents the real operating conditions at 1 MWth, using the Monte Carlo code MCNP5. To provide a more detailed analysis, different models of the reactor core were built, simulating the control rods extracted and inserted in cold conditions (293 K), and including an analysis of the shutdown margin, so that the Operational Technical Specifications were satisfied. The positions the control rods must have to reach a power of 1 MWth were obtained from the laboratory practice entitled Operation in Manual Mode performed at the Instituto Nacional de Investigaciones Nucleares (ININ). Later, the behavior of keff was analyzed considering different temperatures in the fuel elements, in order to calculate the values that best represent actual reactor operation. Finally, the calculations with the developed model to obtain the distribution of the average thermal, epithermal and fast neutron flux in the six new experimental facilities are presented. (Author)

  6. Calibration and Validation of the Dynamic Wake Meandering Model for Implementation in an Aeroelastic Code

    DEFF Research Database (Denmark)

    Aagaard Madsen, Helge; Larsen, Gunner Chr.; Larsen, Torben J.

    2010-01-01

    As the major part of new wind turbines are installed in clusters or wind farms, there is a strong need for reliable and accurate tools for predicting the increased loadings due to wake operation and the associated reduced power production. The dynamic wake meandering (DWM) model has been developed on this background, and the basic physical mechanisms in the wake, i.e., the velocity deficit, the meandering of the deficit, and the added turbulence, are modeled as simply as possible in order to make fast computations. In the present paper, the DWM model is presented in a version suitable for full integration in an aeroelastic model. Calibration and validation of the different parts of the model is carried out by comparisons with actuator disk and actuator line (ACL) computations as well as with inflow measurements on a full-scale 2 MW turbine. It is shown that the load generating part of the increased turbulence...

  7. Efficacy analysis of LDPC coded APSK modulated differential space-time-frequency coded for wireless body area network using MB-pulsed OFDM UWB technology.

    Science.gov (United States)

    Manimegalai, C T; Gauni, Sabitha; Kalimuthu, K

    2017-12-04

    Wireless body area networks (WBAN) are a breakthrough technology in healthcare areas such as hospitals and telemedicine. The human body has a complex mixture of different tissues, and the nature of electromagnetic signal propagation is expected to be distinct in each of these tissues. This forms the basis for the WBAN channel, which is different from other environments. In this paper, knowledge of the Ultra Wide Band (UWB) channel is explored in the WBAN (IEEE 802.15.6) system. Measurements of the parameters in the frequency range 3.1-10.6 GHz are taken. The proposed system transmits data at up to 480 Mbps by using LDPC coded APSK modulated differential space-time-frequency coded MB-OFDM to increase throughput and power efficiency.

  8. Validation of the thermal code of RadTherm-IR, IR-Workbench, and F-TOM

    Science.gov (United States)

    Schwenger, Frédéric; Grossmann, Peter; Malaplate, Alain

    2009-05-01

    System assessment by image simulation requires synthetic scenarios that can be viewed by the device to be simulated. In addition to physical modeling of the camera, reliable modeling of scene elements is necessary. Software products for modeling of target data in the IR should be capable of (i) predicting surface temperatures of scene elements over a long period of time and (ii) computing sensor views of the scenario. For such applications, FGAN-FOM acquired the software products RadTherm-IR (ThermoAnalytics Inc., Calumet, USA) and IR-Workbench (OKTAL-SE, Toulouse, France). Inspection of the accuracy of simulation results by validation is necessary before using these products for applications. In the first step of validation, the performance of both "thermal solvers" was determined through comparison of the computed diurnal surface temperatures of a simple object with the corresponding values from measurements. CUBI is a rather simple geometric object with well known material parameters, which makes it suitable for testing and validating object models in the IR; it was used in this study as a test body. Comparison of calculated and measured surface temperature values will be presented, together with the results from the FGAN-FOM thermal object code F-TOM. In the second validation step, radiances of the simulated sensor views computed by RadTherm-IR and IR-Workbench will be compared with radiances retrieved from the recorded sensor images taken by the sensor that was simulated. Strengths and weaknesses of the models RadTherm-IR, IR-Workbench and F-TOM will be discussed.

  9. Qualitative and quantitative validation of the SINBAD code on complex HPGe gamma-ray spectra

    Energy Technology Data Exchange (ETDEWEB)

    Rohee, E.; Coulon, R.; Normand, S.; Carrel, F. [CEA, LIST, Laboratoire Capteurs et Architectures electroniques, F-91191 Gif-sur-Yvette, (France); Dautremer, T.; Barat, E.; Montagu, T. [CEA, LIST, Laboratoire Modelisation, Simulation et Systemes, F-91191 Gif-sur-Yvette, (France); Jammes, C. [CEA/DEN/SPEx/LDCI, Centre de Cadarache, F-13109 Saint-Paul-lez-Durance, (France)

    2015-07-01

    Radionuclide identification and quantification is a serious concern for many applications, such as the safety or security of nuclear power plants or fuel cycle facilities, CBRN risk identification, environmental radioprotection and waste measurements. High-resolution gamma-ray spectrometry based on HPGe detectors is an effective solution for all these topics. During the last decades, a great deal of software has been developed to improve gamma spectrum analysis. However, some difficulties remain in the analysis when photoelectric peaks are folded together with a high ratio between their amplitudes, when the Compton background is much larger than the signal of a single peak, and when spectra are composed of a great number of peaks. This study deals with the comparison between conventional methods of radionuclide identification and quantification and the code called SINBAD ('Spectrometrie par Inference Non parametrique Bayesienne Deconvolutive'). For many years, SINBAD has been developed by CEA LIST for unfolding complex spectra from HPGe detectors. Contrary to conventional methods using fitting procedures, SINBAD uses a probabilistic approach with Bayesian inference to describe spectrum data. The conventional fitting method found, for example, in Genie 2000 is compared with the nonparametric SINBAD approach with regard to some key figures of merit, namely peak centroid evaluation (identification) and peak area evaluation (quantification). Difficult cases are studied for nuclide detection with close gamma-ray energies and large differences in photoelectric peak intensities. Tests are performed with spectra from the International Atomic Energy Agency (IAEA) gamma spectrum analysis software benchmark and with spectra acquired at the laboratory. Results show that the performances of SINBAD and Genie 2000 are quite similar, with sometimes better results for SINBAD, with the important difference that to achieve the same performance the nonparametric method is user-friendly compared
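    To make the contrast concrete, the sketch below (Python/SciPy) shows the kind of parametric fitting step used by conventional analysis software: a Gaussian photopeak on a linear background is fitted to obtain the centroid (identification) and the net area (quantification). The synthetic spectrum and starting values are assumptions for illustration; this is neither the Genie 2000 algorithm nor SINBAD's Bayesian nonparametric approach.

        import numpy as np
        from scipy.optimize import curve_fit

        def peak_model(ch, amp, centroid, sigma, b0, b1):
            """Gaussian photopeak on a linear Compton-like background."""
            return amp * np.exp(-0.5 * ((ch - centroid) / sigma) ** 2) + b0 + b1 * ch

        # Synthetic single-peak region (illustrative numbers only).
        ch = np.arange(980, 1020, dtype=float)
        true = peak_model(ch, 500.0, 1000.0, 2.0, 50.0, -0.01)
        counts = np.random.default_rng(1).poisson(true).astype(float)

        p0 = [counts.max(), ch[np.argmax(counts)], 1.5, counts.min(), 0.0]
        popt, pcov = curve_fit(peak_model, ch, counts, p0=p0, sigma=np.sqrt(counts + 1))

        amp, centroid, sigma = popt[:3]
        net_area = amp * sigma * np.sqrt(2 * np.pi)   # area under the Gaussian only
        print(f"centroid = {centroid:.2f} ch, net area = {net_area:.0f} counts")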

  10. Qualitative and quantitative validation of the SINBAD code on complex HPGe gamma-ray spectra

    International Nuclear Information System (INIS)

    Rohee, E.; Coulon, R.; Normand, S.; Carrel, F.; Dautremer, T.; Barat, E.; Montagu, T.; Jammes, C.

    2015-01-01

    Radionuclide identification and quantification is a serious concern for many applications, such as the safety or security of nuclear power plants or fuel cycle facilities, CBRN risk identification, environmental radioprotection and waste measurements. High-resolution gamma-ray spectrometry based on HPGe detectors is an effective solution for all these topics. During the last decades, a great deal of software has been developed to improve gamma spectrum analysis. However, some difficulties remain in the analysis when photoelectric peaks are folded together with a high ratio between their amplitudes, when the Compton background is much larger than the signal of a single peak, and when spectra are composed of a great number of peaks. This study deals with the comparison between conventional methods of radionuclide identification and quantification and the code called SINBAD ('Spectrometrie par Inference Non parametrique Bayesienne Deconvolutive'). For many years, SINBAD has been developed by CEA LIST for unfolding complex spectra from HPGe detectors. Contrary to conventional methods using fitting procedures, SINBAD uses a probabilistic approach with Bayesian inference to describe spectrum data. The conventional fitting method found, for example, in Genie 2000 is compared with the nonparametric SINBAD approach with regard to some key figures of merit, namely peak centroid evaluation (identification) and peak area evaluation (quantification). Difficult cases are studied for nuclide detection with close gamma-ray energies and large differences in photoelectric peak intensities. Tests are performed with spectra from the International Atomic Energy Agency (IAEA) gamma spectrum analysis software benchmark and with spectra acquired at the laboratory. Results show that the performances of SINBAD and Genie 2000 are quite similar, with sometimes better results for SINBAD, with the important difference that to achieve the same performance the nonparametric method is user-friendly compared

  11. Initial validation of 4D-model for a clinical PET scanner using the Monte Carlo code gate

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Igor F.; Lima, Fernando R.A.; Gomes, Marcelo S., E-mail: falima@cnen.gov.b [Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil); Vieira, Jose W.; Pacheco, Ludimila M. [Instituto Federal de Educacao, Ciencia e Tecnologia (IFPE), Recife, PE (Brazil); Chaves, Rosa M. [Instituto de Radium e Supervoltagem Ivo Roesler, Recife, PE (Brazil)

    2011-07-01

    Several dedicated computing tools based on Monte Carlo techniques (SimSET, SORTEO, SIMIND, GATE) are currently available for building exposure computational models (ECM) of emission tomography (PET and SPECT). This work is divided into two steps: (1) using the dedicated code GATE (Geant4 Application for Tomographic Emission) to build a 4D model (where the fourth dimension is time) of a clinical PET scanner from General Electric, the GE ADVANCE, simulating the geometric and electronic structures of this scanner as well as some 4D phenomena, for example the rotating gantry; (2) evaluating the performance of the model built here in reproducing the noise equivalent count rate (NEC) test based on the NEMA Standards Publication NU 2-2007 protocols for this tomograph. The results of steps (1) and (2) will be compared with experimental and theoretical values from the literature, showing the current state of the art of the validation. (author)
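    For reference, the sketch below (Python) evaluates the noise-equivalent count rate commonly written as NEC = T^2 / (T + S + k*R) in NEMA-style analyses, from trues, scatter and randoms rates. The numerical rates and the choice of k are placeholders for illustration, not GE ADVANCE measurements from this record.

        def nec_rate(trues, scatter, randoms, k=1.0):
            """Noise-equivalent count rate, NEC = T^2 / (T + S + k*R).
            k is commonly 1 or 2 depending on how randoms are estimated."""
            total = trues + scatter + randoms
            return trues ** 2 / (trues + scatter + k * randoms), trues / total

        # Placeholder rates in counts per second (not measured scanner data).
        nec, trues_fraction = nec_rate(trues=120e3, scatter=45e3, randoms=60e3, k=1.0)
        print(f"NEC = {nec:.3e} cps, trues fraction = {trues_fraction:.2f}")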

  12. Physical analysis and modelling of aerosols transport. implementation in a finite elements code. Experimental validation in laminar and turbulent flows

    International Nuclear Information System (INIS)

    Armand, Patrick

    1995-01-01

    The aim of this work is the coupling of fluid mechanics and aerosol physics. In the first part, an order-of-magnitude analysis of the dynamics of a particle embedded in a non-uniform unsteady flow is carried out. Flow approximations around the inclusion are described and the corresponding aerodynamic drag formulae are expressed. Possible situations related to the problem data are extensively listed. In the second part, the turbulent transport of particles is studied. The Eulerian approach, which is particularly well suited to industrial codes, is preferred over Lagrangian methods. The two-fluid formalism, in which the slip between the carrier gas and the particles is taken into account, is chosen. Turbulence is modelled with a k-epsilon model modulated by the action of the inclusions on the flow. The model is implemented in a finite-element code. Finally, in the third part, the modelling is validated in laminar and turbulent cases. Simulations are compared to various experiments (settling battery, inertial impaction in a bend, jets loaded with glass-bead particles) taken from the literature or performed by ourselves at the laboratory. The results are very close, which is encouraging for the future use of the particle transport model and the associated software. (author) [fr

  13. Two-dimensional Kolmogorov complexity and an empirical validation of the Coding theorem method by compressibility

    Directory of Open Access Journals (Sweden)

    Hector Zenil

    2015-09-01

    Full Text Available We propose a measure based upon the fundamental theoretical concept in algorithmic information theory that provides a natural approach to the problem of evaluating n-dimensional complexity by using an n-dimensional deterministic Turing machine. The technique is interesting because it provides a natural algorithmic process for symmetry breaking, generating complex n-dimensional structures from perfectly symmetric and fully deterministic computational rules and producing a distribution of patterns as described by algorithmic probability. Algorithmic probability also elegantly connects the frequency of occurrence of a pattern with its algorithmic complexity, hence effectively providing estimations of the complexity of the generated patterns. Experiments to validate estimations of algorithmic complexity based on these concepts are presented, showing that the measure is stable in the face of some changes in computational formalism and that results are in agreement with the results obtained using lossless compression algorithms when both methods overlap in their range of applicability. We then use the output frequency of the set of 2-dimensional Turing machines to classify the algorithmic complexity of the space-time evolutions of Elementary Cellular Automata.
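    The coding theorem method referred to in this record estimates the algorithmic complexity of a pattern from its output frequency, roughly K(s) ~ -log2 D(s), where D(s) is the frequency with which s is produced by an enumeration of small Turing machines. The sketch below (Python) applies that relation to an assumed toy frequency table; real CTM distributions come from exhaustive runs over billions of machines.

        import math

        def ctm_estimate(freq):
            """Coding-theorem-method style estimate: K(s) ~ -log2 of output frequency D(s)."""
            total = sum(freq.values())
            return {s: -math.log2(n / total) for s, n in freq.items()}

        # Toy output-frequency table (illustrative only).
        freq = {"0000": 4000, "0101": 1200, "0110": 900, "0111": 150}
        for s, k in sorted(ctm_estimate(freq).items(), key=lambda kv: kv[1]):
            print(f"{s}: K ~ {k:.2f} bits")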

  14. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Yidong [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andrs, David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Martineau, Richard Charles [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-01

    This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is discussed and followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving compressible fluid flow problems is presented next. A Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration is also presented. The multi-fluid formulation is still being developed; although it is not yet fully developed, BIGHORN has been designed to handle multi-fluid problems. Due to the flexibility in the underlying MOOSE framework, BIGHORN is quite extensible, and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent for this suite of problems is to provide baseline comparison data that demonstrates the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods, and to suggest best practices when using BIGHORN.

  15. Experimental Definition and Validation of Protein Coding Transcripts in Chlamydomonas reinhardtii

    Energy Technology Data Exchange (ETDEWEB)

    Kourosh Salehi-Ashtiani; Jason A. Papin

    2012-01-13

    Algal fuel sources promise unsurpassed yields in a carbon neutral manner that minimizes resource competition between agriculture and fuel crops. Many challenges must be addressed before algal biofuels can be accepted as a component of the fossil fuel replacement strategy. One significant challenge is that the cost of algal fuel production must become competitive with existing fuel alternatives. Algal biofuel production presents the opportunity to fine-tune microbial metabolic machinery for an optimal blend of biomass constituents and desired fuel molecules. Genome-scale model-driven algal metabolic design promises to facilitate both goals by directing the utilization of metabolites in the complex, interconnected metabolic networks to optimize production of the compounds of interest. Using Chlamydomonas reinhardtii as a model, we developed a systems-level methodology bridging metabolic network reconstruction with annotation and experimental verification of enzyme encoding open reading frames. We reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. Our approach to generate a predictive metabolic model integrated with cloned open reading frames, provides a cost-effective platform to generate metabolic engineering resources. While the generated resources are specific to algal systems, the approach that we have developed is not specific to algae and

  16. New Source Code: Spelman Women Transforming the Grid of Science and Technology

    Science.gov (United States)

    Okonkwo, Holly

    From a seminary for newly freedwomen in the 19th century "Deep South" of the United States to a "Model Institution for Excellence" in undergraduate science, technology, engineering, and math education, the narrative of Spelman College is a critical piece to understanding the overall history and socially constructed nature of science and higher education in the U.S. Making a place for science at Spelman College disrupts and redefines the presumed and acceptable roles of African American women in science and their social, political and economic engagements in U.S. society as a whole. Over the course of 16 months, I explore the narrative experiences of members of the Spelman campus community and immerse myself in the environment to experience becoming a member of a scientific community that asserts a place for women of African descent in science and technology and perceives this positionality as positive, powerful and the locus of agency. My intention is to offer this research as an in-depth ethnographic presentation of intentional science learning, knowledge production and practice as lived experiences at the multiple intersections of the constructs of race, gender, positionality and U.S. science itself. In this research, I am motivated to move the contemporary discourse of diversifying science, technology, engineering and mathematics fields in the U.S. academy beyond the chronicling of women of African descent as statistical rarities over time, as subjectivities and the deficit frameworks that theoretically encapsulate their narratives. The findings of this research demonstrate that Spelman students, staff and alumni are themselves the cultural capital that validates Spelman's identity as a place, its institutional mission and are at the core of the institutional success of the college. It is a personal mission as much as it is an institutional mission, which is precisely what makes it powerful.

  17. Validation of the DITUBS computer code system for LWHCR physics calculations of the PROTEUS-Phase-II experiments

    International Nuclear Information System (INIS)

    Axmann, J.K.

    1992-02-01

    Research and development activities related to the light water high conversion reactor concept have been conducted at the Paul Scherrer Institut (PSI) in the framework of a joint Swiss/German co-operation, together with the Karlsruhe Nuclear Research Centre, Siemens/KWU and the Technical University of Braunschweig. The present report documents principally the validation of the DITUBS computer code system, developed at the Technical University of Braunschweig for LWHCR physics design calculations. Experimental results from six of the fourteen PROTEUS-LWHCR core configurations investigated in the Phase II programme serve as a basis for the study. Thus, alternative methods and data-set options within the DITUBS system have been developed and applied for (a) obtaining an independent set of calculated correction factors for various individual effects in the experiments and (b) achieving improvements in C/E (calculation/experiment) values for the measured integral parameters, viz. k-infinity and reaction rate ratios. The solution of numerical benchmark problems - for validation of burnup calculations and fuel-element-geometry treatment - forms part of the study, the DITUBS system being finally used to address questions related to technical and economic feasibility for a range of LWHCR designs. (author) figs., tabs., 112 refs

  18. Validation of the coupling of mesh models to GEANT4 Monte Carlo code for simulation of internal sources of photons

    International Nuclear Information System (INIS)

    Caribe, Paulo Rauli Rafeson Vasconcelos; Cassola, Vagner Ferreira; Kramer, Richard; Khoury, Helen Jamil

    2013-01-01

    The use of three-dimensional models described by polygonal meshes in numerical dosimetry enables more accurate modeling of complex objects than the use of simple solids. The objectives of this work were to validate the coupling of mesh models to the Monte Carlo code GEANT4 and to evaluate the influence of the number of vertices in the simulations used to obtain absorbed fractions of energy (AFEs). Validation of the coupling was performed for internal photon sources with energies between 10 keV and 1 MeV, for spherical geometries described by GEANT4 and for three-dimensional models with different numbers of vertices and triangular or quadrilateral faces modeled using the Blender program. As a result, it was found that there were no significant differences between the AFEs for objects described by mesh models and objects described using GEANT4 solid volumes. Provided that the shape and volume are maintained, decreasing the number of vertices used to describe an object does not significantly influence the dosimetric data, but it significantly decreases the time required to perform the dosimetric calculations, especially for energies less than 100 keV

  19. Disruption Tolerant Network Technology Flight Validation Report: DINET

    Science.gov (United States)

    Jones, Ross M.

    2009-01-01

    In October and November of 2008, the Jet Propulsion Laboratory installed and tested essential elements of Delay/Disruption Tolerant Networking (DTN) technology on the Deep Impact spacecraft. This experiment, called Deep Impact Network Experiment (DINET), was performed in close cooperation with the EPOXI project which has responsibility for the spacecraft. During DINET some 300 images were transmitted from the JPL nodes to the spacecraft. Then, they were automatically forwarded from the spacecraft back to the JPL nodes, exercising DTN's bundle origination, transmission, acquisition, dynamic route computation, congestion control, prioritization, custody transfer, and automatic retransmission procedures, both on the spacecraft and on the ground, over a period of 27 days. All transmitted bundles were successfully received, without corruption. The DINET experiment demonstrated DTN readiness for operational use in space missions.

  20. Disruption Tolerant Network Technology Flight Validation Report: DINET

    Science.gov (United States)

    Jones, Ross M.

    2009-01-01

    In October and November of 2008, the Jet Propulsion Laboratory installed and tested essential elements of Delay/Disruption Tolerant Networking (DTN) technology on the Deep Impact spacecraft. This experiment, called Deep Impact Network Experiment (DINET), was performed in close cooperation with the EPOXI project which has responsibility for the spacecraft. During DINET some 300 images were transmitted from the JPL nodes to the spacecraft. Then, they were automatically forwarded from the spacecraft back to the JPL nodes, exercising DTN's bundle origination, transmission, acquisition, dynamic route computation, congestion control, prioritization, custody transfer, and automatic retransmission procedures, both on the spacecraft and on the ground, over a period of 27 days. All transmitted bundles were successfully received, without corruption. The DINET experiment demonstrated DTN readiness for operational use in space missions.

  1. Validation Study on the MCC-based Technology

    International Nuclear Information System (INIS)

    Park, Sungkeun; Lee, Dowhan; Kang, Shincheul; Choi, Hyunwoo; Chai, Jangbom

    2006-01-01

    KEPRI and M and D Corporation have developed a methodology, called the NEST I (Non-intrusive Evaluation of Stem Thrust), for determining the stem thrust for a Motor Operated Valve (MOV) based on the motor torque and the stem displacement. The motor torque is determined using another method called NEET (Non-intrusive Evaluation of Electric Torque), which uses the voltage and current data from the three phases to obtain the motor torque. The stem displacement is obtained from the voltage and current data along with the nameplate information of the motor, actuator and stem. The motor data (voltage, current and coil current) are measured using MOVIDS (Motor Operated Valve Intelligent Diagnostic System). The motor torque is determined using a NEET algorithm and the stem thrust is calculated using the NEST I method. The goal of this testing was to obtain data from operation of an MOV and to compare the actual measured thrust with the thrust calculated using the NEET / NEST I methods and therefore validate the NEET / NEST I methods
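    The record's chain goes from electrical measurements to motor torque and then to stem thrust. The sketch below (Python) shows only the last, purely mechanical step under a simple lumped model; the gear ratio, efficiency and stem factor values are placeholders, and this is not the NEST I algorithm itself.

        def stem_thrust(motor_torque, gear_ratio, efficiency, stem_factor):
            """Lumped MOV model: motor torque -> stem torque -> stem thrust.
            stem_factor (torque per unit thrust) bundles stem diameter, lead and
            thread friction; here it is simply taken as a given input."""
            stem_torque = motor_torque * gear_ratio * efficiency
            return stem_torque / stem_factor

        # Placeholder values (N*m, -, -, m); not measured NEST I data.
        print(f"thrust ~ {stem_thrust(20.0, 35.0, 0.5, 0.006):.0f} N")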

  2. Development of safety analysis codes and experimental validation for a very high temperature gas-cooled reactor Final report

    Energy Technology Data Exchange (ETDEWEB)

    Chang Oh

    2006-03-01

    The very high-temperature gas-cooled reactor (VHTR) is envisioned as a single- or dual-purpose reactor for electricity and hydrogen generation. The concept has average coolant temperatures above 900°C and operational fuel temperatures above 1250°C. The concept provides the potential for increased energy conversion efficiency and for high-temperature process heat application in addition to power generation. While all the High Temperature Gas Cooled Reactor (HTGR) concepts have sufficiently high temperature to support process heat applications, such as coal gasification, desalination or cogenerative processes, the VHTR’s higher temperatures allow broader applications, including thermochemical hydrogen production. However, the very high temperatures of this reactor concept can be detrimental to safety if a loss-of-coolant accident (LOCA) occurs. Following the loss of coolant through the break and coolant depressurization, air will enter the core through the break by molecular diffusion and ultimately by natural convection, leading to oxidation of the in-core graphite structure and fuel. The oxidation will accelerate heatup of the reactor core and the release of toxic gases (CO and CO2) and fission products. Thus, without any effective countermeasures, a pipe break may lead to significant fuel damage and fission product release. Prior to the start of this Korean/United States collaboration, no computer codes were available that had been sufficiently developed and validated to reliably simulate a LOCA in the VHTR. Therefore, we have worked for the past three years on developing and validating advanced computational methods for simulating LOCAs in a VHTR. Research Objectives: As described above, a pipe break may lead to significant fuel damage and fission product release in the VHTR. The objectives of this Korean/United States collaboration were to develop and validate advanced computational methods for VHTR safety analysis. The methods that have been developed are now

  3. Validation of coupled Relap5-3D code in the analysis of RBMK-1500 specific transients

    International Nuclear Information System (INIS)

    Evaldas, Bubelis; Algirdas, Kaliatka; Eugenijus, Uspuras

    2003-01-01

    This paper deals with the modelling of RBMK-1500 specific transients taking place at Ignalina NPP. These transients include: measurements of void and fast power reactivity coefficients, a change of the graphite cooling conditions, and reactor power reduction transients. The simulation of these transients was performed using the RELAP5-3D code model of the RBMK-1500 reactor. At the Ignalina NPP, void and fast power reactivity coefficients are measured on a regular basis and, based on the total reactor power, reactivity, control and protection system control rod positions and the main circulation circuit parameter changes during the experiments, the actual values of these reactivity coefficients are determined. The graphite temperature reactivity coefficient at the plant is determined by changing the graphite cooling conditions in the reactor cavity. This type of transient is unique and important for validating the model of the gap between the fuel channel and the graphite bricks. The measurement results obtained during this transient made it possible to determine the thermal conductivity coefficient for this gap and to validate the graphite temperature reactivity feedback model. Reactor power reduction is a regular operating procedure during the entire lifetime of the reactor. In all cases it starts with either a scram or a power reduction signal activated by the reactor control and protection system or by an operator. The obtained calculation results demonstrate reasonable agreement with Ignalina NPP measured data. The behaviour of the individual MCC thermal-hydraulic parameters, as well as the physical processes occurring in the primary circuit of the RBMK-1500 reactor, is predicted reasonably well. Reasonable agreement between the measured and calculated total reactor power change in time demonstrates the correct modelling of the neutronic processes taking place in the RBMK-1500 reactor core. And finally, the performed validation of RELAP5-3D model of Ignalina NPP RBMK-1500

  4. Software requirements, design, and verification and validation for the FEHM application - a finite-element heat- and mass-transfer code

    International Nuclear Information System (INIS)

    Dash, Z.V.; Robinson, B.A.; Zyvoloski, G.A.

    1997-07-01

    The requirements, design, and verification and validation of the software used in the FEHM application, a finite-element heat- and mass-transfer computer code that can simulate nonisothermal multiphase multicomponent flow in porous media, are described. The test of the DOE Code Comparison Project, Problem Five, Case A, which verifies that FEHM has correctly implemented heat and mass transfer and phase partitioning, is also covered

  5. Laser technology to manage periodontal disease: a valid concept?

    Science.gov (United States)

    Low, Samuel B; Mott, Angie

    2014-06-01

    Present day dental lasers can create oral environments conducive for periodontal repair. With the bacterial etiology of periodontitis and the resulting host inflammatory reaction, clinicians continue to search for therapeutic modalities to assist in the non-surgical management of periodontal disease. Traditional chairside therapies consist of mechanical debridement with manual and/or ultrasonic instrumentation with the objective of removing calculus, biofilm, and endotoxin from tooth root surfaces. Decreasing the microbial stimuli and associated end products decreases the inflammatory reaction and allows the host an opportunity to regenerate tissue through wound healing. The purpose of this article is to examine whether dental lasers, which have been in use for the past 3 decades, may augment traditional non-surgical periodontal therapy. Review of research publications related to lasers and non-surgical periodontics with attention focused on systematic studies. Studies utilizing laser technology may demonstrate positive effects on 1) selectively decreasing the biofilm environment, 2) removing calculus deposits and neutralizing endotoxin, 3) removing sulcular epithelium to assist in reattachment and decreased pocket depth, and 4) biostimulation for enhanced wound healing. Comparisons of studies to determine the difference between lasers and their respective effects on the periodontium are difficult to assess due to a wide variation of laser protocols. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Technological change in the wine market? The role of QR codes and wine apps in consumer wine purchases

    Directory of Open Access Journals (Sweden)

    Lindsey M. Higgins

    2014-06-01

    Full Text Available As an experiential good, wine purchases in the absence of tastings are often challenging and information-laden decisions. Technology has shaped the way consumers negotiate this complex purchase process. Using a sample of 631 US wine consumers, this research aims to identify the role of mobile applications and QR codes in the wine purchase decision. Results suggest that wine consumers who consider themselves wine connoisseurs or experts, enjoy talking about wine, and are interested in wine that is produced locally, organically, or sustainably are more likely to employ technology in their wine purchase decision. While disruption appears to have occurred on the supply side (the number of wine applications available and the number of wine labels with a QR code), this research suggests that relatively little change is occurring on the demand side (a relatively small segment of the population—those already interested in wine—are employing the technology to aid in their purchase decision).

  7. Validation of an instrument to measure students' motivation and self-regulation towards technology learning

    Science.gov (United States)

    Liou, Pey-Yan; Kuo, Pei-Jung

    2014-05-01

    Background: Few studies have examined students' attitudinal perceptions of technology. There is no appropriate instrument to measure senior high school students' motivation and self-regulation toward technology learning among the current existing instruments in the field of technology education. Purpose: The present study is to validate an instrument for assessing senior high school students' motivation and self-regulation towards technology learning. Sample: A total of 1822 Taiwanese senior high school students (1020 males and 802 females) responded to the newly developed instrument. Design and method: The Motivation and Self-regulation towards Technology Learning (MSRTL) instrument was developed based on the previous instruments measuring students' motivation and self-regulation towards science learning. Exploratory and confirmatory factor analyses were utilized to investigate the structure of the items. Cronbach's alpha was applied for measuring the internal consistency of each scale. Furthermore, multivariate analysis of variance was used to examine gender differences. Results: Seven scales, including 'Technology learning self-efficacy,' 'Technology learning value,' 'Technology active learning strategies,' 'Technology learning environment stimulation,' 'Technology learning goal-orientation,' 'Technology learning self-regulation-triggering,' and 'Technology learning self-regulation-implementing' were confirmed for the MSRTL instrument. Moreover, the results also showed that male and female students did not present the same degree of preference in all of the scales. Conclusions: The MSRTL instrument composed of seven scales corresponding to 39 items was shown to be valid based on validity and reliability analyses. While male students tended to express more positive and active performance in the motivation scales, no gender differences were found in the self-regulation scales.
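    For reference, the sketch below (Python/NumPy) shows how the internal-consistency statistic named in this record, Cronbach's alpha, is computed from a respondents-by-items score matrix. The toy Likert responses are placeholders, not MSRTL data.

        import numpy as np

        def cronbach_alpha(scores):
            """scores: respondents x items matrix for one scale."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        # Toy 5-point responses (6 respondents x 4 items), illustrative only.
        toy = [[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4], [2, 3, 2, 2], [4, 4, 5, 4], [3, 4, 3, 3]]
        print(f"alpha = {cronbach_alpha(toy):.2f}")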

  8. Large-scale, multi-compartment tests in PANDA for LWR-containment analysis and code validation

    International Nuclear Information System (INIS)

    Paladino, Domenico; Auban, Olivier; Zboray, Robert

    2006-01-01

    The large-scale thermal-hydraulic PANDA facility has been used in recent years for investigating passive decay heat removal systems and related containment phenomena relevant for next-generation and current light water reactors. As part of the 5th EURATOM framework programme project TEMPEST, a series of tests was performed in PANDA to experimentally investigate the distribution of hydrogen inside the containment and its effect on the performance of the Passive Containment Cooling System (PCCS) designed for the Economic Simplified Boiling Water Reactor (ESBWR). In a postulated severe accident, a large amount of hydrogen could be released in the Reactor Pressure Vessel (RPV) as a consequence of the cladding Metal-Water (M-W) reaction and discharged together with steam to the Drywell (DW) compartment. In PANDA tests, hydrogen was simulated by using helium. This paper illustrates the results of a TEMPEST test performed in PANDA and named Test T1.2. In Test T1.2, the gas stratification (steam-helium) patterns forming in the large-scale multi-compartment PANDA DW, and the effect of non-condensable gas (helium) on the overall behaviour of the PCCS, were identified. Gas mixing and stratification in a large-scale multi-compartment system are currently being further investigated in PANDA in the frame of the OECD project SETH. The testing philosophy in this new PANDA program is to produce data for code validation in relation to specific phenomena, such as: gas stratification in the containment, gas transport between containment compartments, wall condensation, etc. These types of phenomena are driven by buoyant high-momentum injections (jets) and/or low-momentum injections (plumes), depending on the transient scenario. In this context, the new SETH tests in PANDA are particularly valuable to produce an experimental database for code assessment. This paper also presents an overview of the PANDA SETH tests and the major improvements in instrumentation carried out in the PANDA

  9. Validation of CATHARE Code for Gas-Cooled Reactors: Comparison with E.V.O Experimental Data on Oberhausen II Facility

    International Nuclear Information System (INIS)

    Bentivoglio, Fabrice; Tauveron, Nicolas

    2006-01-01

    Extensively validated and qualified for light-water reactor safety studies, the thermal-hydraulics code CATHARE has been adapted to deal also with gas-cooled reactor applications. In order to validate the code for these new applications, CEA (Commissariat a l'Energie Atomique) has initiated an ambitious long-term experimental program. The foreseen experimental facilities range from small-scale loops for physical correlations to component technology and system demonstration loops. In the short-term perspective, CATHARE is being validated against existing experimental data, in particular from the German power plant Oberhausen II. Oberhausen II, operated by the German utility E.V.O (Energie Versorgung Oberhausen AG), is a 50 MW(e) direct-cycle helium turbine plant. The power source is a gas burner instead of a nuclear reactor core, but the power conversion system resembles those of the GFR (Gas-cooled Fast Reactor) and other high-temperature reactor concepts. Oberhausen II was operated for more than 25 000 hours between 1974 and 1988. Design specifications, drawings and experimental data have been obtained through the European HTR-E project, offering a unique opportunity to validate CATHARE on a large-scale Brayton cycle. Available measurements of temperature, pressure and mass flow rate throughout the circuit have allowed a very comprehensive thermal-hydraulic description of the plant, in steady-state conditions for design and operating data as well as during transients. First, the paper presents the modeling of the Oberhausen II plant with the CATHARE code, with a complete description of the modeling of each component: the recuperator, a complex gas-to-gas counter-flow heat exchanger; the pre-cooler and inter-cooler, two complex gas-to-water cross-flow heat exchangers; the heater, which is a gas burner; and the two turbines and two compressors. Particular attention is given to the modeling of leakages all along the circuit and to the

  10. Development and Application of Subchannel Analysis Code Technology for Advanced Reactor Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae Hyun; Seo, K. W

    2006-01-15

    A study has been performed for the development and assessment of a subchannel analysis code which is intended to be used for the analysis of advanced reactor conditions with various configurations of the reactor core and several kinds of reactor coolant fluids. The subchannel analysis code was developed on the basis of the MATRA code which is being developed at KAERI. A GUI (Graphic User Interface) system was adopted in order to reduce input errors and to enhance user convenience. The property calculation modules of the subchannel code were complemented by including various fluids such as heavy liquid metal, gas, refrigerant, and supercritical water. The subchannel code was applied to calculate the local thermal-hydraulic conditions inside the non-square test bundles which were employed for the analysis of CHF. The applicability of the subchannel code was evaluated for a high-temperature gas-cooled reactor condition and for supercritical pressure conditions with water and Freon. A subchannel analysis has been conducted for the European ADS (Accelerator-Driven Subcritical System) with Pb-Bi coolant through the international cooperation between KAERI and FZK, Germany. In addition, the prediction capability of the subchannel code was evaluated against subchannel void distribution data by participating in an international code benchmark program organized by OECD/NRC.

  11. Validation and Comparison of 2D and 3D Codes for Nearshore Motion of Long Waves Using Benchmark Problems

    Science.gov (United States)

    Velioǧlu, Deniz; Cevdet Yalçıner, Ahmet; Zaytsev, Andrey

    2016-04-01

    Tsunamis are huge waves with long wave periods and wave lengths that can cause great devastation and loss of life when they strike a coast. The interest in experimental and numerical modeling of tsunami propagation and inundation increased considerably after the 2011 Great East Japan earthquake. In this study, two numerical codes, FLOW 3D and NAMI DANCE, that analyze tsunami propagation and inundation patterns are considered. FLOW 3D simulates linear and nonlinear propagating surface waves as well as long waves by solving the three-dimensional Navier-Stokes (3D-NS) equations. NAMI DANCE uses a finite-difference computational method to solve the 2D depth-averaged linear and nonlinear forms of the shallow water equations (NSWE) for long wave problems, specifically tsunamis. In order to validate these two codes and analyze the differences between the 3D-NS and the 2D depth-averaged NSWE equations, two benchmark problems are applied. One benchmark problem investigates the runup of long waves over a complex 3D beach. The experimental setup is a 1:400 scale model of Monai Valley located on the west coast of Okushiri Island, Japan. The other benchmark problem was discussed at the 2015 National Tsunami Hazard Mitigation Program (NTHMP) Annual Meeting in Portland, USA; it is a field dataset recording the 2011 Japan tsunami in Hilo Harbor, Hawaii. The computed water surface elevation and velocity data are compared with the measured data. The comparisons showed that both codes are in fairly good agreement with each other and with the benchmark data. The differences between the 3D-NS and 2D depth-averaged NSWE equations are highlighted. All results are presented with discussions and comparisons. Acknowledgements: Partial support by Japan-Turkey Joint Research Project by JICA on earthquakes and tsunamis in Marmara Region (JICA SATREPS - MarDiM Project), 603839 ASTARTE Project of EU, UDAP-C-12-14 project of AFAD Turkey, 108Y227, 113M556 and 213M534 projects of TUBITAK Turkey, RAPSODI (CONCERT_Dis-021) of CONCERT
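    To illustrate the equation class solved by the depth-averaged code (this is not NAMI DANCE itself), the sketch below (Python/NumPy) advances the 1D linear shallow-water equations on a staggered grid with an explicit scheme. The still-water depth, grid spacing and initial surface hump are assumed values for illustration.

        import numpy as np

        g, h = 9.81, 100.0                 # gravity, still-water depth (assumed)
        nx, dx = 200, 500.0                # grid (assumed)
        dt = 0.5 * dx / np.sqrt(g * h)     # time step satisfying the CFL condition

        x = np.arange(nx) * dx
        eta = np.exp(-((x - 0.5 * nx * dx) / (10 * dx)) ** 2)   # initial surface hump (m)
        u = np.zeros(nx + 1)                                     # velocities on staggered faces

        for _ in range(200):
            # momentum: du/dt = -g * d(eta)/dx  (interior faces; walls stay at u = 0)
            u[1:-1] -= dt * g * (eta[1:] - eta[:-1]) / dx
            # continuity: d(eta)/dt = -h * du/dx
            eta -= dt * h * (u[1:] - u[:-1]) / dx

        print(f"max surface elevation after 200 steps: {eta.max():.3f} m")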

  12. Validation of One-Dimensional Module of MARS-KS1.2 Computer Code By Comparison with the RELAP5/MOD3.3/patch3 Developmental Assessment Results

    International Nuclear Information System (INIS)

    Bae, S. W.; Chung, B. D.

    2010-07-01

    This report records the results of the code validation for the one-dimensional module of the MARS-KS thermal hydraulics analysis code by means of result-comparison with the RELAP5/MOD3.3 computer code. For the validation calculations, simulations of the RELAP5 Code Developmental Assessment Problem, which consists of 22 simulation problems in 3 categories, have been selected. The results of the 3 categories of simulations demonstrate that the one-dimensional module of the MARS code and the RELAP5/MOD3.3 code are essentially the same code. This is expected as the two codes have basically the same set of field equations, constitutive equations and main thermal hydraulic models. The result suggests that the high level of code validity of the RELAP5/MOD3.3 can be directly applied to the MARS one-dimensional module

  13. NanoString, a novel digital color-coded barcode technology: current and future applications in molecular diagnostics.

    Science.gov (United States)

    Tsang, Hin-Fung; Xue, Vivian Weiwen; Koh, Su-Pin; Chiu, Ya-Ming; Ng, Lawrence Po-Wah; Wong, Sze-Chuen Cesar

    2017-01-01

    Formalin-fixed, paraffin-embedded (FFPE) tissue sample is a gold mine of resources for molecular diagnosis and retrospective clinical studies. Although molecular technologies have expanded the range of mutations identified in FFPE samples, the applications of existing technologies are limited by the low nucleic acids yield and poor extraction quality. As a result, the routine clinical applications of molecular diagnosis using FFPE samples has been associated with many practical challenges. NanoString technologies utilize a novel digital color-coded barcode technology based on direct multiplexed measurement of gene expression and offer high levels of precision and sensitivity. Each color-coded barcode is attached to a single target-specific probe corresponding to a single gene which can be individually counted without amplification. Therefore, NanoString is especially useful for measuring gene expression in degraded clinical specimens. Areas covered: This article describes the applications of NanoString technologies in molecular diagnostics and challenges associated with its applications and the future development. Expert commentary: Although NanoString technology is still in the early stages of clinical use, it is expected that NanoString-based cancer expression panels would play more important roles in the future in classifying cancer patients and in predicting the response to therapy for better personal therapeutic care.

  14. Validation of a modified PENELOPE Monte Carlo code for applications in digital and dual-energy mammography

    Science.gov (United States)

    Del Lama, L. S.; Cunha, D. M.; Poletti, M. E.

    2017-08-01

    The presence and morphology of microcalcification clusters provide the main early indication of breast carcinomas. However, the visualization of those structures may be jeopardized due to overlapping tissues, even for digital mammography systems. Although digital mammography is the current standard for breast cancer diagnosis, further improvements should be achieved in order to address some of those physical limitations. One possible solution for such issues is the application of the dual-energy technique (DE), which is able to highlight specific lesions or cancel out the tissue background. In this sense, this work aimed to evaluate several quantities of interest in radiation applications and compare those values with works present in the literature to validate a modified PENELOPE code for digital mammography applications. For instance, the scatter-to-primary ratio (SPR), the scatter fraction (SF) and the normalized mean glandular dose (DgN) were evaluated by simulations and the resulting values were compared to those found in earlier studies. Our results present a good correlation for the evaluated quantities, showing agreement of 5% or better for the scatter and dosimetric-related quantities when compared to the literature. Finally, a DE imaging chain was simulated and the visualization of microcalcifications was investigated.
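    For reference, the sketch below (Python) shows how the two scatter quantities named in this record relate to each other when computed from primary and scattered signal tallies: SPR = S/P and SF = S/(S+P) = SPR/(1+SPR). The tally values are placeholders, not the paper's simulated results.

        def scatter_metrics(primary, scatter):
            spr = scatter / primary                  # scatter-to-primary ratio
            sf = scatter / (scatter + primary)       # scatter fraction, SF = SPR / (1 + SPR)
            return spr, sf

        # Placeholder tallies (arbitrary units), illustrative only.
        spr, sf = scatter_metrics(primary=1.0e6, scatter=4.0e5)
        print(f"SPR = {spr:.2f}, SF = {sf:.2f}")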

  15. Validation of RALOC4 code for Ignalina NPP Accident Localisation System employing parameters measured during MSV opening

    International Nuclear Information System (INIS)

    Urbonavicius, E.; Rimkevicius, S.

    2001-01-01

    The Accident Localisation System (ALS) of Ignalina NPP is a pressure-suppression type confinement. It consists of a number of interconnected compartments with 10 condensing pools to condense the accident-generated steam and to reduce the peak pressures that can be reached during any LOCA. The condensing pools are located at five elevations in two ALS towers. In the case of a main safety valve (MSV) opening, the released steam is directed to the top (5th) condensing pool of the ALS. The ALS thermal-hydraulic parameters measured during the unintended opening of a single MSV, which occurred on November 8, 1998 at Ignalina NPP Unit 2, were used for validation of the RALOC4 code (Germany). Post-event calculations were performed, and the calculated water temperatures and water levels in the condensing pools as well as the condenser tray cooling system (CTCS) parameters were compared with the corresponding measured data. The results of the performed sensitivity analysis showed that in the best-estimate analysis the heat transfer coefficient in the CTCS heat exchangers could be increased to 2500 W/(m2·K) compared to the conservative value of 1000 W/(m2·K) applied in former calculations. (author)

  16. Test and validation of the iterative code for neutron spectrometry and dosimetry: NSDUAZ

    Energy Technology Data Exchange (ETDEWEB)

    Reyes H, A.; Ortiz R, J. M.; Reyes A, A.; Castaneda M, R.; Solis S, L. O.; Vega C, H. R., E-mail: alfredo_reyesh@hotmail.com [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Av. Lopez Velarde 801, Col. Centro, 98000 Zacatecas (Mexico)

    2014-08-15

    In this work, the test and validation of an iterative neutron spectrometry code known as Neutron Spectrometry and Dosimetry of the Universidad Autonoma de Zacatecas (NSDUAZ) were carried out. The code was designed with a friendly and intuitive graphical user interface in the LabVIEW programming environment, using the iterative algorithm known as SPUNIT. The main characteristics of the program are: the automatic selection of the initial spectrum from the neutron spectra catalogue compiled by the International Atomic Energy Agency, and the possibility to generate a report in HTML format that shows the neutron fluence graphically and numerically and calculates the ambient dose equivalent from it. To test the designed code, the count rates of a Bonner sphere spectrometer system with a 6LiI(Eu) detector and 7 polyethylene spheres with diameters of 0, 2, 3, 5, 8, 10 and 12 were used. The count rates measured with two neutron sources, 252Cf and 239PuBe, were used to validate the code, and the obtained results were compared against those obtained using the BUNKIUT code. We find that the reconstructed spectra present an error within the limit reported in the literature, which is around 15%. Therefore, it was concluded that the designed code gives results similar to those of the techniques currently in use. (Author)
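    The record's SPUNIT algorithm iteratively unfolds a neutron spectrum from Bonner-sphere count rates. The sketch below (Python/NumPy) shows a generic multiplicative iterative unfolding update of the same family, not the exact SPUNIT recurrence; the response matrix and count rates are toy placeholders.

        import numpy as np

        def unfold(R, counts, n_iter=200, phi0=None):
            """Generic multiplicative iterative unfolding (illustrative, not SPUNIT's
            exact recurrence). R[i, j]: response of sphere i to energy bin j."""
            R = np.asarray(R, float)
            phi = np.ones(R.shape[1]) if phi0 is None else np.asarray(phi0, float)
            for _ in range(n_iter):
                predicted = R @ phi                       # expected count rates
                ratio = counts / predicted
                phi *= (R.T @ ratio) / R.sum(axis=0)      # positivity-preserving update
            return phi

        # Toy 3-sphere / 4-bin response matrix and count rates (placeholders).
        R = [[0.9, 0.5, 0.2, 0.1], [0.3, 0.8, 0.6, 0.2], [0.1, 0.3, 0.7, 0.9]]
        counts = np.array([1.2e3, 1.5e3, 1.1e3])
        print(unfold(R, counts))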

  17. Validation of fuel performance codes at the NRI Rez plc for Temelin and Dukovany NPPs fuel safety evaluations and operation support

    International Nuclear Information System (INIS)

    Valach, M.; Hejna, J.; Zymak, J.

    2003-05-01

    The report summarises the first phase of the FUMEX II related work performed in the period September 2002 - May 2003. An inventory of the PIN and FRAS code family, used and developed during previous years, was made in light of their applicability (validity) in the high burn-up domain and to the FUMEX II Project experimental database. KOLA data were chosen as appropriate for the first step of fixing both codes (both originally tuned for VVER fuel). The modern requirements - expressed by the adaptation of the UO2 conductivity degradation from the OECD HRP, the implementation of RIM and FGR (athermal) modelling into the PIN code, and a diffusion FGR model planned for embedding into this code - allow us to keep reasonably close contact with top-quality models such as TRANSURANUS, COPERNIC, CYRANO, FEMAXI, FRAPCON3 or ENIGMA. Testing and validation runs with the prepared KOLA input deck were made. The FUMEX II exercise proposes LOCA- and RIA-like transients, so we started developing a coupling of these two codes, denominated the PIN2FRAS code. The principles of the interface were tested, and benchmarking on tentative RIA pulses on highly burned KOLA fuel is presented as the first achievement of our work. (author)

  18. Review of the status of validation of the computer codes used in the severe accident source term reassessment study (BMI-2104)

    International Nuclear Information System (INIS)

    Kress, T.S.

    1985-04-01

    The determination of severe accident source terms must, by necessity it seems, rely heavily on the use of complex computer codes. Source term acceptability, therefore, rests on the assessed validity of such codes. Consequently, one element of NRC's recent efforts to reassess LWR severe accident source terms is to provide a review of the status of validation of the computer codes used in the reassessment. The results of this review are the subject of this document. The separate review documents compiled in this report were used as a resource along with the results of the BMI-2104 study by BCL and the QUEST study by SNL to arrive at a more-or-less independent appraisal of the status of source term modeling at this time

  19. Review of the status of validation of the computer codes used in the severe accident source term reassessment study (BMI-2104). [PWR; BWR]

    Energy Technology Data Exchange (ETDEWEB)

    Kress, T. S. [comp.]

    1985-04-01

    The determination of severe accident source terms must, by necessity it seems, rely heavily on the use of complex computer codes. Source term acceptability, therefore, rests on the assessed validity of such codes. Consequently, one element of NRC's recent efforts to reassess LWR severe accident source terms is to provide a review of the status of validation of the computer codes used in the reassessment. The results of this review are the subject of this document. The separate review documents compiled in this report were used as a resource along with the results of the BMI-2104 study by BCL and the QUEST study by SNL to arrive at a more-or-less independent appraisal of the status of source term modeling at this time.

  20. Feasibility of use of color-coded rings by nurse midwives: An appropriate technology based on partographic principles

    Directory of Open Access Journals (Sweden)

    Asha K Pratinidhi

    2013-01-01

    Full Text Available Objectives: To study the feasibility of use of color-coded rings as a proxy for the partograph for early identification of slow progress of labor. Materials and Methods: Color-coded rings were devised as a tool using appropriate technology to translate the partographic principles into a simpler, easy-to-understand methodology. The rings were in pairs of 4 colors, i.e., red, blue, yellow, and green, ranging from 3 cm to 10 cm in diameter with a difference of 4 cm between rings of the same color. The midwife performed a p/v examination of the woman in labor to assess the initial cervical dilatation and identify the corresponding ring. The p/v examination was to be repeated after 4 hours to reassess the cervical dilatation and compare it with the bigger ring of the same color indicating the expected cervical dilatation. If the existing cervical dilatation measured less, it was interpreted as slow progress of labor indicating referral. Results: 44 women [23 (22.1%) primis and 21 (13%) multis] showed delayed progress of labor as judged by use of the color-coded rings. 20 women (4 primis and 16 multis) showed satisfactory progress or delivered by the time arrangements for referral were made. Conclusion: Use of color-coded rings may serve as a valuable tool based on appropriate technology to assess slow progress of labor not only in the hands of nurse midwives but it also can serve as a training tool for TBAs to help facilitate timely referral of such cases.

  1. Targeting non-coding RNAs in Plants with the CRISPR-Cas technology is a challenge yet worth accepting

    Directory of Open Access Journals (Sweden)

    Jolly Basak

    2015-11-01

    Full Text Available Non-coding RNAs (ncRNAs) have emerged as versatile master regulators of biological functions in recent years. MicroRNAs (miRNAs) are small endogenous ncRNAs of 18-24 nucleotides in length that originate from long self-complementary precursors. Besides their direct involvement in developmental processes, plant miRNAs play key roles in gene regulatory networks and varied biological processes. Alternatively, long ncRNAs (lncRNAs) are a large and diverse class of transcribed ncRNAs whose length exceeds 200 nucleotides. Plant lncRNAs are transcribed by different RNA polymerases, showing diverse structural features. Plant lncRNAs also are important regulators of gene expression in diverse biological processes. There has been a breakthrough in the technology of genome editing, the CRISPR-Cas9 (clustered regularly interspaced short palindromic repeats/CRISPR-associated protein 9) technology, in the last decade. CRISPR loci are transcribed into ncRNA and eventually form a functional complex with Cas9 and further guide the complex to cleave complementary invading DNA. The CRISPR-Cas technology has been successfully applied in model plants such as Arabidopsis and tobacco and important crops like wheat, maize and rice. However, all these studies are focused on protein coding genes. Information about targeting non-coding genes is scarce. Hitherto, the CRISPR-Cas technology has been exclusively used in vertebrate systems to engineer miRNA/lncRNAs, but it is still relatively unexplored in plants. While briefing miRNAs, lncRNAs and applications of the CRISPR-Cas technology in humans and animals, this review essentially elaborates several strategies to overcome the challenges of applying the CRISPR-Cas technology in editing ncRNAs in plants and the future perspective of this field.

  2. Validation of an administrative claims-based diagnostic code for pneumonia in a US-based commercially insured COPD population

    Directory of Open Access Journals (Sweden)

    Kern DM

    2015-07-01

    Full Text Available David M Kern,1 Jill Davis,2 Setareh A Williams,3 Ozgur Tunceli,1 Bingcao Wu,1 Sally Hollis,4 Charlie Strange,5 Frank Trudo2 1HealthCore, Inc., Wilmington, DE, 2AstraZeneca Pharmaceuticals, Wilmington, DE, 3AstraZeneca Pharmaceuticals, Gaithersburg, MD, USA; 4AstraZeneca Pharmaceuticals, Cheshire, UK; 5Department of Medicine, Division of Pulmonary and Critical Care Medicine, Medical University of South Carolina, Charleston, SC, USA Objective: To estimate the accuracy of claims-based pneumonia diagnoses in COPD patients using clinical information in medical records as the reference standard. Methods: Selecting from a repository containing members’ data from 14 regional United States health plans, this validation study identified pneumonia diagnoses within a group of patients initiating treatment for COPD between March 1, 2009 and March 31, 2012. Patients with ≥1 claim for pneumonia (International Classification of Diseases Version 9-CM code 480.xx–486.xx) were identified during the 12 months following treatment initiation. A subset of 800 patients was randomly selected to abstract medical record data (paper based and electronic) for a target sample of 400 patients, to estimate validity within 5% margin of error. Positive predictive value (PPV) was calculated for the claims diagnosis of pneumonia relative to the reference standard, defined as a documented diagnosis in the medical record. Results: A total of 388 records were reviewed; 311 included a documented pneumonia diagnosis, indicating 80.2% (95% confidence interval [CI]: 75.8% to 84.0%) of claims-identified pneumonia diagnoses were validated by the medical charts. Claims-based diagnoses in inpatient or emergency departments (n=185) had greater PPV versus outpatient settings (n=203), 87.6% (95% CI: 81.9%–92.0%) versus 73.4% (95% CI: 66.8%–79.3%), respectively. Claims-diagnoses verified with paper-based charts had similar PPV as the overall study sample, 80.2% (95% CI: 71.1%–87.5%), and
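    The headline numbers in this record are positive predictive values with 95% confidence intervals. The sketch below (Python) computes a PPV and a Wilson-score interval from the abstract's overall counts (311 chart-confirmed of 388 reviewed); the study's own interval method may differ slightly, so the printed bounds are illustrative rather than a reproduction of the published ones.

        from math import sqrt

        def ppv_with_wilson_ci(confirmed, reviewed, z=1.96):
            """Positive predictive value with a Wilson-score 95% CI."""
            p = confirmed / reviewed
            denom = 1 + z ** 2 / reviewed
            center = (p + z ** 2 / (2 * reviewed)) / denom
            half = z * sqrt(p * (1 - p) / reviewed + z ** 2 / (4 * reviewed ** 2)) / denom
            return p, center - half, center + half

        # Counts quoted in the abstract: 311 chart-confirmed of 388 reviewed.
        print("PPV = %.3f, 95%% CI (%.3f, %.3f)" % ppv_with_wilson_ci(311, 388))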

  3. Technological change in the wine market? The role of QR codes and wine apps in consumer wine purchases

    OpenAIRE

    Lindsey M. Higgins; Marianne McGarry Wolf; Mitchell J. Wolf

    2014-01-01

    Because wine is an experiential good, purchasing it without a tasting is often a challenging, information-laden decision. Technology has shaped the way consumers negotiate this complex purchase process. Using a sample of 631 US wine consumers, this research aims to identify the role of mobile applications and QR codes in the wine purchase decision. Results suggest that wine consumers who consider themselves wine connoisseurs or experts, enjoy talking about wine, and are interested in wine th...

  4. Assistive technology for visually impaired women for use of the female condom: a validation study

    Directory of Open Access Journals (Sweden)

    Luana Duarte Wanderley Cavalcante

    2015-02-01

    Full Text Available OBJECTIVE: To validate an assistive technology for visually impaired women to learn how to use the female condom. METHOD: A methodological development study conducted on a web page, with data collection between May and October 2012. Participants were 14 judges: seven judges in sexual and reproductive health (1st stage) and seven in special education (2nd stage). RESULTS: All items reached the adopted parameter of 70% agreement. In Stage 1, new materials were added to represent the cervix, and instructions that must be heard twice were included in the 2nd stage. CONCLUSION: The technology has been validated and is appropriate for its objectives, structure/presentation and relevance. It is an innovative, low-cost and valid instrument for promoting health and one which may help women with visual disabilities to use the female condom.

  5. Adverse drug events in German hospital routine data: A validation of International Classification of Diseases, 10th revision (ICD-10) diagnostic codes.

    Directory of Open Access Journals (Sweden)

    Nils Kuklik

    Full Text Available Adverse drug events (ADEs) during hospital stays are a significant problem for healthcare systems. Established monitoring systems lack completeness or are cost intensive. Routinely assigned International Statistical Classification of Diseases and Related Health Problems (ICD) codes could complement existing systems for ADE identification. To analyze the potential of using routine data for ADE detection, the validity of a set of ICD codes was determined, focusing on hospital-acquired events. The study utilized routine data from four German hospitals covering the years 2014 and 2015. A set of ICD, 10th Revision, German Modification (ICD-10-GM) diagnoses coded most frequently in the routine data and identified as codes indicating ADEs was analyzed. Data from psychiatric and psychotherapeutic departments were excluded. Retrospective chart review was performed to calculate positive predictive values (PPV) and sensitivity. Of 807 reviewed ADE codes, 91.2% (95% confidence interval: 89.0, 93.1) were identified as disease in the medical records and 65.1% (61.7, 68.3) were confirmed as ADE. For code groups being predominantly hospital-acquired, 78.5% (73.7, 82.9) were confirmed as ADE, ranging from 68.5% to 94.4% depending on the ICD code. However, sensitivity for inpatient ADEs was relatively low: 49.7% (45.2, 54.2) of 495 identified hospital-acquired ADEs were coded as disease in the routine data, of which a subgroup of 12.1% (9.4, 15.3) was coded as drug-associated disease. ICD codes from routine data can provide an important contribution to the development and improvement of ADE monitoring systems. Documentation quality is crucial to further increase the PPV, and actions against under-reporting of ADEs in routine data need to be taken.

  6. ELLA-V and technology usage in an English language and literacy acquisition validation randomized controlled trial study

    Directory of Open Access Journals (Sweden)

    Roisin P. Corcoran

    2014-12-01

    Full Text Available This paper describes the use of technology to provide virtual professional development (VPD) for teachers and to conduct classroom observations in a study of English Language Learner (ELL) instruction in grades K–3. The technology applications were part of a cluster randomized control trial (RCT) design for a federally funded longitudinal validation study of a particular program, English Language and Literacy Acquisition-Validation (ELLA-V), to determine its degree of impact on English oral language/literacy, reading, and science across 63 randomly assigned urban, suburban, and rural schools (first year of implementation). ELLA-V also examines the impact of bimonthly VPD for treatment teachers, compared to comparison-group teachers, on pedagogical skills, measured by sound observation instruments, and on student achievement, measured by state/national English language/literacy/reading tests and a national science test. This study features extensive technology use via virtual observations, bimonthly VPD, and randomly assigned treatment and control schools with students served during English as a second language (ESL) instructional time. The study design and methodology are discussed relative to the specialized uses of technology and issues involving the evaluation of technology’s contribution to the intervention of interest and of the efficient, cost-effective execution of the study.

  7. Supply Chain Management: A Case Study of Using EDI and Bar Code Information Technology

    National Research Council Canada - National Science Library

    Wu, Wei-Ya

    1997-01-01

    ...) technology at Kang Kuo Company in Republic of China (ROC) is presented. This case study of the Kang Kuo Company provides insights into the use of information technologies through the supply chain in a Taiwan company...

  8. Advanced Technology and Mitigation (ATDM) SPARC Re-Entry Code Fiscal Year 2017 Progress and Accomplishments for ECP.

    Energy Technology Data Exchange (ETDEWEB)

    Crozier, Paul [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Howard, Micah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Freno, Brian Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bova, Steven W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carnes, Brian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    The SPARC (Sandia Parallel Aerodynamics and Reentry Code) will provide nuclear weapon qualification evidence for the random vibration and thermal environments created by re-entry of a warhead into the earth’s atmosphere. SPARC incorporates the innovative approaches of ATDM projects on several fronts including: effective harnessing of heterogeneous compute nodes using Kokkos, exascale-ready parallel scalability through asynchronous multi-tasking, uncertainty quantification through Sacado integration, implementation of state-of-the-art reentry physics and multiscale models, use of advanced verification and validation methods, and enabling of improved workflows for users. SPARC is being developed primarily for the Department of Energy nuclear weapon program, with additional development and use of the code being supported by the Department of Defense for conventional weapons programs.

  9. The responsibility of technology. Weighting goods - risk assessment - codes of conduct

    International Nuclear Information System (INIS)

    Lenk, H.; Maring, M.

    1991-01-01

    Under the heading 'The responsibility of technology' nineteen authors contribute to the following topics: Basic questions of weighting 'goods' relating to resources, the relationship of technical possibilities and (self)restraint of individuals and collectives, striving for knowledge and appropriate limits, possibilities of directing and influencing technology and questions concerning (the limits of) individual and collective responsibility. Dealt with in more detail are: The assessment of technical risks and uncertainties from an ethical and legal point of view, problems in nuclear research and technology illustrated by the example of Chernobyl, information technology, and the influence of the press when presenting technology matters. (orig./HSCH) [de

  10. Validation study of SRAC2006 code system based on evaluated nuclear data libraries for TRIGA calculations by benchmarking integral parameters of TRX and BAPL lattices of thermal reactors

    International Nuclear Information System (INIS)

    Khan, M.J.H.; Sarker, M.M.; Islam, S.M.A.

    2013-01-01

    Highlights: ► To validate the SRAC2006 code system for TRIGA neutronics calculations. ► TRX and BAPL lattices are treated as standard benchmarks for this purpose. ► To compare the calculated results with experiment as well as MCNP values in this study. ► The study demonstrates a good agreement with the experiment and the MCNP results. ► Thus, this analysis reflects the validation study of the SRAC2006 code system. - Abstract: The goal of this study is to present the validation study of the SRAC2006 code system based on the evaluated nuclear data libraries ENDF/B-VII.0 and JENDL-3.3 for neutronics analysis of the TRIGA Mark-II Research Reactor at AERE, Bangladesh. This study is achieved through the analysis of integral parameters of TRX and BAPL benchmark lattices of thermal reactors. In integral measurements, the thermal reactor lattices TRX-1, TRX-2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3 are treated as standard benchmarks for validating/testing the SRAC2006 code system as well as the nuclear data libraries. The integral parameters of the said lattices are calculated using the collision probability transport code PIJ of the SRAC2006 code system at room temperature 20 °C based on the above libraries. The calculated integral parameters are compared to the measured values as well as the MCNP values based on the Chinese evaluated nuclear data library CENDL-3.0. It was found that in most cases, the values of the integral parameters demonstrate a good agreement with the experiment and the MCNP results. In addition, the group constants in SRAC format for the TRX and BAPL lattices in the fast and thermal energy ranges, respectively, are compared between the above libraries, and it was found that the group constants are identical with very insignificant differences. Therefore, this analysis reflects the validation study of the SRAC2006 code system based on the evaluated nuclear data libraries JENDL-3.3 and ENDF/B-VII.0 and can also be essential for implementing further neutronics calculations.

  11. Three-dimensional all-speed CFD code for safety analysis of nuclear reactor containment: Status of GASFLOW parallelization, model development, validation and application

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Jianjun, E-mail: jianjun.xiao@kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Travis, John R., E-mail: jack_travis@comcast.com [Engineering and Scientific Software Inc., 3010 Old Pecos Trail, Santa Fe, NM 87505 (United States); Royl, Peter, E-mail: peter.royl@partner.kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Necker, Gottfried, E-mail: gottfried.necker@partner.kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Svishchev, Anatoly, E-mail: anatoly.svishchev@kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Jordan, Thomas, E-mail: thomas.jordan@kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany)

    2016-05-15

    Highlights: • 3-D scalable semi-implicit pressure-based CFD code for containment safety analysis. • Robust solution algorithm valid for all-speed flows. • Well validated and widely used CFD code for hydrogen safety analysis. • Code applied in various types of nuclear reactor containments. • Parallelization enables high-fidelity models in large scale containment simulations. - Abstract: GASFLOW is a three dimensional semi-implicit all-speed CFD code which can be used to predict fluid dynamics, chemical kinetics, heat and mass transfer, aerosol transportation and other related phenomena involved in postulated accidents in nuclear reactor containments. The main purpose of the paper is to give a brief review on recent GASFLOW code development, validations and applications in the field of nuclear safety. GASFLOW code has been well validated by international experimental benchmarks, and has been widely applied to hydrogen safety analysis in various types of nuclear power plants in European and Asian countries, which have been summarized in this paper. Furthermore, four benchmark tests of a lid-driven cavity flow, low Mach number jet flow, 1-D shock tube and supersonic flow over a forward-facing step are presented in order to demonstrate the accuracy and wide-ranging capability of ICE’d ALE solution algorithm for all-speed flows. GASFLOW has been successfully parallelized using the paradigms of Message Passing Interface (MPI) and domain decomposition. The parallel version, GASFLOW-MPI, adds great value to large scale containment simulations by enabling high-fidelity models, including more geometric details and more complex physics. It will be helpful for the nuclear safety engineers to better understand the hydrogen safety related physical phenomena during the severe accident, to optimize the design of the hydrogen risk mitigation systems and to fulfill the licensing requirements by the nuclear regulatory authorities. GASFLOW-MPI is targeting a high

  12. Development of an automatic validation system for simulation codes of the fusion research; Entwicklung eines automatischen Validierungssystems fuer Simulationscodes der Fusionsforschung

    Energy Technology Data Exchange (ETDEWEB)

    Galonska, Andreas

    2010-03-15

    In this master thesis, the development of an automatic validation system for the simulation code ERO is documented. This 3D Monte Carlo code models the transport of impurities as well as plasma-wall interaction processes and is of great importance for fusion research. The validation system is based on JuBE (Julich Benchmarking Environment), whose flexibility allows the system to be easily extended to other codes, for instance those operated in the framework of the EU Task Force ITM (Integrated Tokamak Modelling). The chosen solution - JuBE together with a special program for the 'intelligent' comparison of actual and reference output data of ERO - is described and justified. The use of this program and the configuration of JuBE are described in detail. Simulations of different plasma experiments, which serve as reference cases for the automatic validation, are explained. The operation of the system is illustrated by a test case covering fault localization and improvement in the parallelization of an important ERO module (tracking of physically eroded particles). It is demonstrated how the system reacts to a failed validation and how the subsequent error correction leads to a positive result. Finally, a speed-up curve of the parallelization is established from the JuBE output data.

  13. Implementation, verification, and validation of the FPIN2 metal fuel pin mechanics model in the SASSYS/SAS4A LMR transient analysis codes

    International Nuclear Information System (INIS)

    Sofu, T.; Kramer, J.M.

    1994-01-01

    The metal fuel version of the FPIN2 code, which provides a validated pin mechanics model, is coupled with SASSYS/SAS4A Version 3.0 for single-pin calculations. In this implementation, SASSYS/SAS4A provides pin temperatures, and FPIN2 performs the analysis of pin deformation and predicts the time and location of cladding failure. FPIN2 results are also used for estimates of the axial expansion of fuel and the associated reactivity effects. The revalidation of the integrated SAS-FPIN2 code system is performed using TREAT tests.

  14. Validation evaluation of the technological process of leonurus turkestanicus liquid extract production

    Directory of Open Access Journals (Sweden)

    Олеся Владимировна Сермухамедова

    2016-01-01

    Full Text Available Aim. To transfer the Leonurus turkestanicus liquid extract production technology and validate it at LLC «FitOleum» (GMP compliance report, conclusion CT PK 1617-2006 «Good Manufacturing Practice. Remedies production. Basic provisions» № 18, Nov. 21, 2014). Methods. Different known statistical methods were used to evaluate and interpret both the technological parameters and the indicators determined during quality control of the herbal material, the intermediate product, and the final product. A statistical process control (SPC) concept is applied as the basis for all accepted international instruments: ICH Q8 «Pharmaceutical Development», ICH Q10 «Pharmaceutical Quality System», the PAT concept, and the FDA Guidance for Process Validation. Results. As a result of the research, normative documents that regulate the manufacturing process under the test conditions have been developed. Comparable data across the technological parameters of three consecutively manufactured test production series have been obtained, and the validity of the extract production process has been proved. Conclusion. On the basis of this research, the experimental-industrial procedure for Leonurus turkestanicus herb liquid extract production at LLC «FitOleum» has been developed.

  15. Employing optical code division multiple access technology in the all fiber loop vibration sensor system

    Science.gov (United States)

    Tseng, Shin-Pin; Yen, Chih-Ta; Syu, Rong-Shun; Cheng, Hsu-Chih

    2013-12-01

    This study proposes a spectral amplitude coding-optical code division multiple access (SAC-OCDMA) framework to access the vibration frequency of a test object on the all fiber loop vibration sensor (AFLVS). Each user possesses an individual SAC, and fiber Bragg grating (FBG) encoders/decoders using multiple FBG arrays were adopted, providing excellent orthogonal properties in the frequency domain. The system also mitigates multiple access interference (MAI) among users. When an optical fiber is bent to a point exceeding the critical radius, the fiber loop sensor becomes sensitive to external physical parameters (e.g., temperature, strain, and vibration). The AFLVS involves placing a fiber loop with a specific radius on a designed vibration platform.

  16. [Care with the child's health and validation of an educational technology for riverside families].

    Science.gov (United States)

    Teixeira, Elizabeth; de Almeida Siqueira, Aldo; da Silva, Joselice Pereira; Lavor, Lília Cunha

    2011-01-01

    This study aimed to assess the knowledge and ways of caring for child health from 0 to 5 years among riverside families (Phase 1), and to validate an educational technology (Phase 2). A descriptive qualitative study was carried out. Focus groups and content analysis were used with the mothers, and forms were applied with judges-specialists and the target public. The study revealed that concern with the care of children among the riverside families permeates their daily adversity, with dedication and commitment of these families to maintaining the health of their children. Sensitive listening to the mothers indicated the need for a closer relationship between nursing professionals and family. The validation of the educational technology was convergent, within the parameters considered adequate.

  17. Validation of innovative technologies and strategies for regulatory safety assessment methods: challenges and opportunities.

    Science.gov (United States)

    Stokes, William S; Wind, Marilyn

    2010-01-01

    Advances in science and innovative technologies are providing new opportunities to develop test methods and strategies that may improve safety assessments and reduce animal use for safety testing. These include high throughput screening and other approaches that can rapidly measure or predict various molecular, genetic, and cellular perturbations caused by test substances. Integrated testing and decision strategies that consider multiple types of information and data are also being developed. Prior to their use for regulatory decision-making, new methods and strategies must undergo appropriate validation studies to determine the extent that their use can provide equivalent or improved protection compared to existing methods and to determine the extent that reproducible results can be obtained in different laboratories. Comprehensive and optimal validation study designs are expected to expedite the validation and regulatory acceptance of new test methods and strategies that will support improved safety assessments and reduced animal use for regulatory testing.

  18. Validation of the Technological Process of the Preparation "Milk by Vidal".

    Science.gov (United States)

    Savchenko, L P; Mishchenko, V A; Georgiyants, V A

    2017-01-01

    Validation was performed on the technological process of the compounded preparation "Milk by Vidal" in accordance with the requirements of the regulatory framework of Ukraine. Critical stages of formulation which can affect the quality of the finished preparation were considered during the research. The obtained results indicated that the quality of the finished preparation met the requirements of the State Pharmacopoeia of Ukraine. Copyright© by International Journal of Pharmaceutical Compounding, Inc.

  19. A code guidance system for integrated nuclear data evaluation system on the basis of knowledge engineering technology

    International Nuclear Information System (INIS)

    Fukahori, Tokio; Nakagawa, Tsuneo

    1994-01-01

    The integrated nuclear data evaluation system (INDES) is being developed in order to support nuclear data evaluation work. A guidance system in INDES, 'Evaluation Tutor (ET)', is under development in order to support users in selecting the most suitable set of theoretical calculation codes by applying knowledge engineering technology and the experience of the evaluation work for JENDL-3. In this paper, the function of ET is introduced as well as the functions and databases of INDES. An example run of ET for 56Fe in the 1-20 MeV neutron energy region is also explained. (author)

  20. Temperature and heat flux datasets of a complex object in a fire plume for the validation of fire and thermal response codes.

    Energy Technology Data Exchange (ETDEWEB)

    Jernigan, Dann A.; Blanchat, Thomas K.

    2010-09-01

    It is necessary to improve understanding and develop temporally- and spatially-resolved integral scale validation data of the heat flux incident to a complex object in addition to measuring the thermal response of said object located within the fire plume for the validation of the SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE codes. To meet this objective, a complex calorimeter with sufficient instrumentation to allow validation of the coupling between FUEGO/SYRINX/CALORE has been designed, fabricated, and tested in the Fire Laboratory for Accreditation of Models and Experiments (FLAME) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparison between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. This report presents the data validation steps and processes, the results of the penlight radiant heat experiments (for the purpose of validating the CALORE heat transfer modeling of the complex calorimeter), and the results of the fire tests in FLAME.

  1. CFD code development for incompressible two-phase flow using two-fluid model: preliminary calculation and plume validation experiment

    International Nuclear Information System (INIS)

    Heo, B. G.; Jung, C. H.; Yoon, H. Y.; Yeo, D. J.; Song, C. H.

    2002-01-01

    A multidimensional numerical code for solving incompressible two-fluid flow is presented, based on the Finite Volume Method (FVM) and the Simplified Marker And Cell (SMAC) method. Details of the present method and comparisons between calculation and experiment are described for two-dimensional bubbly flow patterns, which show good agreement. Further implementation of interfacial correlations is required for the application of the present code to various two-phase problems.

  2. Leveraging Quick Response Code Technology to Facilitate Simulation-Based Leaderboard Competition.

    Science.gov (United States)

    Chang, Todd P; Doughty, Cara B; Mitchell, Diana; Rutledge, Chrystal; Auerbach, Marc A; Frisell, Karin; Jani, Priti; Kessler, David O; Wolfe, Heather; MacKinnon, Ralph J; Dewan, Maya; Pirie, Jonathan; Lemke, Daniel; Khattab, Mona; Tofil, Nancy; Nagamuthu, Chenthila; Walsh, Catharine M

    2018-02-01

    Leaderboards provide feedback on relative performance and a competitive atmosphere for both self-guided improvement and social comparison. Because simulation can provide substantial quantitative participant feedback, leaderboards can be used, not only locally but also in a multidepartment, multicenter fashion. Quick Response (QR) codes can be integrated to allow participants to access and upload data. We present the development, implementation, and initial evaluation of an online leaderboard employing principles of gamification using points, badges, and leaderboards designed to enhance competition among healthcare providers. This article details the fundamentals behind the development and implementation of a user-friendly, online, multinational leaderboard that employs principles of gamification to enhance competition and integrates a QR code system to promote both self-reporting of performance data and data integrity. An open-ended survey was administered to capture perceptions of leaderboard implementation. Conceptual step-by-step instructions detailing how to apply the QR code system to any leaderboard using simulated or real performance metrics are outlined using an illustrative example of a leaderboard that employed simulated cardiopulmonary resuscitation performance scores to compare participants across 17 hospitals in 4 countries for 16 months. The following three major descriptive categories that captured perceptions of leaderboard implementation emerged from initial evaluation data from 10 sites: (1) competition, (2) longevity, and (3) perceived deficits. A well-designed leaderboard should be user-friendly and encompass best practices in gamification principles while collecting and storing data for research analyses. Easy storage and export of data allow for longitudinal record keeping that can be leveraged both to track compliance and to enable social competition.
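
    The report does not publish its implementation, but the core idea of pointing participants to a self-report form via a QR code can be sketched in a few lines. In the hypothetical example below, the upload URL, query parameter and site identifier are invented; the widely used Python qrcode package does the encoding.

```python
# Minimal sketch of the general idea (not the study's actual system): encode a
# per-site upload URL in a QR code so participants can self-report a score.
# The URL, query parameter, and site identifier below are hypothetical.
import qrcode  # pip install qrcode[pil]

site_id = "hospital-07"
upload_url = f"https://example.org/leaderboard/upload?site={site_id}"

img = qrcode.make(upload_url)          # returns a PIL image
img.save(f"upload_{site_id}.png")      # print or embed this code at the site
```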

  3. Validation of tissue microarray technology in squamous cell carcinoma of the esophagus.

    Science.gov (United States)

    Boone, Judith; van Hillegersberg, Richard; van Diest, Paul J; Offerhaus, G Johan A; Rinkes, Inne H M Borel; Kate, Fiebo J W Ten

    2008-05-01

    Tissue microarray (TMA) technology has been developed to facilitate high-throughput immunohistochemical and in situ hybridization analysis of tissues by inserting small tissue biopsy cores into a single paraffin block. Several studies have revealed novel prognostic biomarkers in esophageal squamous cell carcinoma (ESCC) by means of TMA technology, although this technique has not yet been validated for these tumors. Because representativeness of the donor tissue cores may be a disadvantage compared to full sections, the aim of this study was to assess if TMA technology provides representative immunohistochemical results in ESCC. A TMA was constructed containing triplicate cores of 108 formalin-fixed, paraffin-embedded squamous cell carcinomas of the esophagus. The agreement in the differentiation grade and immunohistochemical staining scores of CK5/6, CK14, E-cadherin, Ki-67, and p53 between TMA cores and a subset of 64 randomly selected donor paraffin blocks was determined using kappa statistics. The concurrence between TMA cores and donor blocks was moderate for Ki-67 (kappa = 0.42) and E-cadherin (kappa = 0.47), substantial for differentiation grade (kappa = 0.65) and CK14 (kappa = 0.71), and almost perfect for p53 (kappa = 0.86) and CK5/6 (kappa = 0.93). TMA technology appears to be a valid method for immunohistochemical analysis of molecular markers in ESCC provided that the staining pattern in the tumor is homogeneous.
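
    The agreement statistic used in the study, Cohen's kappa, is straightforward to reproduce for any pair of categorical score vectors. The sketch below uses invented TMA-core and full-section staining calls purely to show the calculation; the study's own kappa values come from its 108-case dataset.

```python
# Illustrative sketch: agreement between TMA-core and full-section staining
# scores expressed as Cohen's kappa. The score vectors below are made up;
# the study's kappa values (e.g., 0.86 for p53) come from its own data.
from sklearn.metrics import cohen_kappa_score

tma_scores     = ["neg", "pos", "pos", "neg", "pos", "neg", "pos", "pos"]
section_scores = ["neg", "pos", "pos", "neg", "neg", "neg", "pos", "pos"]

kappa = cohen_kappa_score(tma_scores, section_scores)
print(f"Cohen's kappa = {kappa:.2f}")
```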

  4. Analysis: including visually impaired participants in validation design studies of diabetes technology.

    Science.gov (United States)

    Uslan, Mark; Blubaugh, Morgan

    2010-09-01

    In an article in this issue of Journal of Diabetes Science and Technology, Sherwyn Schwartz, M.D., presents a study to validate the design of the ClikSTAR® insulin pen from sanofi-aventis and demonstrates that the device can be used correctly by participants with diabetes. Concern with this article lies with the selection of participants, which was meant to reflect the intended audience for the insulin pen device but does not address the inclusion of visually impaired individuals, who comprise over 20% of the adult diabetes population. Visually impaired individuals need to be included as part of the intended audience for insulin administration technology, and manufacturers of these devices need to design their products for safe use by all people, including those who are visually impaired. The study demonstrated successful use of the ClikSTAR insulin pen in a population that did not include subjects with severe visual impairment. We believe that future validation studies for insulin administration technology should also include samples of visually impaired users and that visually impaired patients will embrace the use of insulin pens designed with their needs in mind. © 2010 Diabetes Technology Society.

  5. Validation of the AZTRAN 1.1 code with problems Benchmark of LWR reactors; Validacion del codigo AZTRAN 1.1 con problemas Benchmark de reactores LWR

    Energy Technology Data Exchange (ETDEWEB)

    Vallejo Q, J. A.; Bastida O, G. E.; Francois L, J. L. [UNAM, Facultad de Ingenieria, Departamento de Sistemas Energeticos, Ciudad Universitaria, 04510 Ciudad de Mexico (Mexico); Xolocostli M, J. V.; Gomez T, A. M., E-mail: amhed.jvq@gmail.com [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2016-09-15

    The AZTRAN module is a computational program that is part of the AZTLAN platform (Mexican modeling platform for the analysis and design of nuclear reactors) and that solves the neutron transport equation in three dimensions using the discrete ordinates method S{sub N}, in steady state and Cartesian geometry. As part of the activities of Working Group 4 (users group) of the AZTLAN project, this work validates the AZTRAN code using the 2002 Yamamoto benchmark for LWR reactors. For comparison, the commercial code CASMO-4 and the free code Serpent-2 are used; in addition, the results are compared with the data obtained from an article of the PHYSOR 2002 conference. The benchmark consists of a fuel pin, two UO{sub 2} cells and two MOX cells; there is a problem for each cell for each reactor type, PWR and BWR. Although the AZTRAN code is at an early stage of development, the results obtained are encouraging and close to those reported with other internationally accepted codes and methodologies. (Author)

  6. French RSE-M and RCC-MR code appendices for flaw analysis: Presentation of the fracture parameters calculation-Part V: Elements of validation

    Energy Technology Data Exchange (ETDEWEB)

    Marie, S. [CEA Saclay, DEN/DM2S/SEMT/LISN, 91191 Gif sur Yvette Cedex (France)], E-mail: stephane.marie@cea.fr; Chapuliot, S.; Kayser, Y. [CEA Saclay, DEN/DM2S/SEMT/LISN, 91191 Gif sur Yvette Cedex (France); Lacire, M.H. [CEA Saclay, DEN/DDIN, 91191 Gif sur Yvette Cedex (France); Drubay, B. [CEA Saclay, DEN/DM2S/SEMT/LISN, 91191 Gif sur Yvette Cedex (France); Barthelet, B. [EDF/EPN, Site Cap Ampere, 1 place Pleyel 93207, Saint Denis Cedex 1 (France); Le Delliou, P. [EDF Pole Industrie-Division R and D, Site des Renardieres, Route de Sens, Ecuelles, 77250 Moret sur Loing Cedex (France); Rougier, V. [EDF/UTO, SIS/GAM, 6, avenue Montaigne, 93192 Noisy le Grand (France); Naudin, C. [EDF/SEPTEN, 12-14, avenue Dutrievoz, 69628 Villeurbanne Cedex (France); Gilles, P.; Triay, M. [AREVA ANP, Tour AREVA, 92084 Paris La Defense Cedex 16 (France)

    2007-10-15

    French nuclear codes include flaw assessment procedures: the RSE-M Code 'Rules for In-service Inspection of Nuclear Power Plant Components' and the RCC-MR code 'Design and Construction Rules for Mechanical Components of FBR Nuclear Islands and High Temperature Applications'. Development of analytical methods has been carried out for the last 10 years in the framework of a collaboration between CEA, EDF and AREVA-NP, and by R and D actions involving CEA and IRSN. These activities have led to a unification of the common methods of the two codes. The calculation of fracture mechanics parameters, in particular the stress intensity factor K{sub I} and the J integral, has been widely developed for industrial configurations. All the developments have been integrated in the 2005 edition of RSE-M and in the 2007 edition of RCC-MR. This series of articles consists of 5 parts: the first part presents an overview of the methods proposed in the RCC-MR and RSE-M codes. Parts II-IV provide the compendia for specific components. The geometries are plates (part II), pipes (part III) and elbows (part IV). This part presents the validation of the methods, with details on the process followed for their development and on the evaluation of the accuracy of the proposed analytical methods.

  7. French RSE-M and RCC-MR code appendices for flaw analysis: Presentation of the fracture parameters calculation-Part V: Elements of validation

    International Nuclear Information System (INIS)

    Marie, S.; Chapuliot, S.; Kayser, Y.; Lacire, M.H.; Drubay, B.; Barthelet, B.; Le Delliou, P.; Rougier, V.; Naudin, C.; Gilles, P.; Triay, M.

    2007-01-01

    French nuclear codes include flaw assessment procedures: the RSE-M Code 'Rules for In-service Inspection of Nuclear Power Plant Components' and the RCC-MR code 'Design and Construction Rules for Mechanical Components of FBR Nuclear Islands and High Temperature Applications'. Development of analytical methods has been carried out for the last 10 years in the framework of a collaboration between CEA, EDF and AREVA-NP, and by R and D actions involving CEA and IRSN. These activities have led to a unification of the common methods of the two codes. The calculation of fracture mechanics parameters, in particular the stress intensity factor K_I and the J integral, has been widely developed for industrial configurations. All the developments have been integrated in the 2005 edition of RSE-M and in the 2007 edition of RCC-MR. This series of articles consists of 5 parts: the first part presents an overview of the methods proposed in the RCC-MR and RSE-M codes. Parts II-IV provide the compendia for specific components. The geometries are plates (part II), pipes (part III) and elbows (part IV). This part presents the validation of the methods, with details on the process followed for their development and on the evaluation of the accuracy of the proposed analytical methods.

  8. Color coded multiple access scheme for bidirectional multiuser visible light communications in smart home technologies

    Science.gov (United States)

    Tiwari, Samrat Vikramaditya; Sewaiwar, Atul; Chung, Yeon-Ho

    2015-10-01

    In optical wireless communications, multiple channel transmission is an attractive solution to enhancing capacity and system performance. A new modulation scheme called color coded multiple access (CCMA) for bidirectional multiuser visible light communications (VLC) is presented for smart home applications. The proposed scheme uses red, green and blue (RGB) light emitting diodes (LED) for downlink and phosphor based white LED (P-LED) for uplink to establish a bidirectional VLC and also employs orthogonal codes to support multiple users and devices. The downlink transmission for data user devices and smart home devices is provided using red and green colors from the RGB LEDs, respectively, while uplink transmission from both types of devices is performed using the blue color from P-LEDs. Simulations are conducted to verify the performance of the proposed scheme. It is found that the proposed bidirectional multiuser scheme is efficient in terms of data rate and performance. In addition, since the proposed scheme uses RGB signals for downlink data transmission, it provides flicker-free illumination that would lend itself to multiuser VLC system for smart home applications.
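
    The abstract does not specify which orthogonal code family the CCMA scheme uses, but the separation principle behind any code-based multiple access can be illustrated with Walsh-Hadamard rows: because the rows are mutually orthogonal, correlating the superposed signal with one user's code cancels the other users' contributions. The sketch below is a minimal baseband illustration, not the proposed optical system.

```python
# Sketch of the orthogonality idea behind code-based multiple access: each
# user spreads a BPSK bit with a distinct Walsh-Hadamard row; a correlating
# receiver recovers one user's bit from the superposition of all users.
import numpy as np
from scipy.linalg import hadamard

H = hadamard(4)                      # 4 orthogonal spreading codes (+/-1 chips)
bits = np.array([1, -1, 1, -1])      # one bit per user
tx = bits @ H                        # all users superposed on the channel

user = 2
recovered = np.sign(tx @ H[user] / len(H[user]))   # correlate with user's code
print(int(recovered) == bits[user])                 # True: other users cancel
```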

  9. Development of Evaluation Technology for Hydrogen Combustion in containment and Accident Management Code for CANDU

    International Nuclear Information System (INIS)

    Kim, S. B.; Kim, D. H.; Song, Y. M.

    2011-08-01

    For the licensing of nuclear power plant (NPP) construction and operation, hydrogen combustion and the hydrogen mitigation system in the containment is one of the important safety issues. Hydrogen safety and its control for the new NPPs (Shin-Wolsong 1 and 2, Shin-Ulchin 1 and 2) have been evaluated in detail by using the 3-dimensional analysis code GASFLOW. The experimental and computational studies on hydrogen combustion, and participation in OECD/NEA programs such as THAI and ISP-49, secure the capability to resolve hydrogen safety and its control for the domestic nuclear power plants. ISAAC4.0, which has been developed for the assessment of severe accident management at CANDU plants, was already delivered to the regulatory body (KINS) for the assessment of the severe accident management guidelines (SAMG) for Wolsong units 1 to 4, which are scheduled to be submitted to KINS. The models for the severe accident management strategy were newly added and the graphic simulator, CAVIAR, was coupled to it. In addition, the ISAAC computer code is anticipated to serve as a platform for the development and maintenance of the Wolsong plant risk monitor and the Wolsong-specific SAMG.

  10. On-going activities in the European JASMIN project for the development and validation of ASTEC-Na SFR safety simulation code - 15072

    International Nuclear Information System (INIS)

    Girault, N.; Cloarec, L.; Herranz, L.; Bandini, G.; Perez-Martin, S.; Ammirabile, L.

    2015-01-01

    The 4-year JASMIN collaborative project (Joint Advanced Severe accidents Modelling and Integration for Na-cooled fast reactors) started in December 2011 in the frame of the 7th Framework Programme of the European Commission. It aims at developing a new European simulation code, ASTEC-Na, dealing with the primary phase of SFR core disruptive accidents. The development of a new code, based on a robust advanced simulation tool and able to encompass the in-vessel and in-containment phenomena occurring during a severe accident, is indeed of utmost interest for advanced and innovative future SFRs, for which an enhanced safety level will be required. This code, based on the ASTEC European code system developed by IRSN and GRS for severe accidents in water-cooled reactors, is progressively integrating and capitalizing the state-of-the-art knowledge of SFR accidents through improvement of physical models or development of new ones. New models are assessed on in-pile (CABRI, SCARABEE, etc.) and out-of-pile experiments conducted during the 1970s-80s, and code-to-code benchmarking with current accident simulation tools for SFRs is also conducted. During the first two and a half years of the project, model specifications and developments were carried out and the validation test matrix was built. The first version of ASTEC-Na, available in early 2014, already includes a thermal-hydraulics module able to simulate single- and two-phase sodium flow conditions, a zero-point neutronic model with a simple definition of channel and axial dependences of reactivity feedbacks, and models derived from the SCANAIR IRSN code for simulating fuel pin thermo-mechanical behaviour and fission gas release/retention. Meanwhile, models have been developed in the source term area for in-containment particle generation and particle chemical transformation, but their implementation is still to be done. As a first validation step, the ASTEC-Na calculations were satisfactorily compared to thermal-hydraulics experimental results

  11. Abstraction carrying code and resource-awareness

    OpenAIRE

    Hermenegildo, Manuel V.; Albert Albiol, Elvira; López García, Pedro; Puebla Sánchez, Alvaro Germán

    2005-01-01

    Proof-Carrying Code (PCC) is a general approach to mobile code safety in which the code supplier augments the program with a certificate (or proof). The intended benefit is that the program consumer can locally validate the certificate w.r.t. the "untrusted" program by means of a certificate checker—a process which should be much simpler, efficient, and automatic than generating the original proof. Abstraction Carrying Code (ACC) is an enabling technology for PCC in which an abstract mod...

  12. Coding a Weather Model: DOE-FIU Science & Technology Workforce Development Program.

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, Jon David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    DOE Fellow, Andres Cremisini, completed a 10-week internship with Sandia National Laboratories (SNL) in Albuquerque, New Mexico. Under the management of Kristopher Klingler and the mentorship of Jon Bradley, he was tasked with conceiving and coding a realistic weather model for use in physical security applications. The objective was to make a weather model that could use real data to accurately predict wind and precipitation conditions at any location of interest on the globe at any user-determined time. The intern received guidance on software design, the C++ programming language and clear communication of project goals and ongoing progress. In addition, Mr. Cremisini was given license to structure the program however he best saw fit, an experience that will benefit ongoing research endeavors.

  13. Proposal to consistently apply the International Code of Nomenclature of Prokaryotes (ICNP) to names of the oxygenic photosynthetic bacteria (cyanobacteria), including those validly published under the International Code of Botanical Nomenclature (ICBN)/International Code of Nomenclature for algae, fungi and plants (ICN), and proposal to change Principle 2 of the ICNP.

    Science.gov (United States)

    Pinevich, Alexander V

    2015-03-01

    This taxonomic note was motivated by the recent proposal [Oren & Garrity (2014) Int J Syst Evol Microbiol 64, 309-310] to exclude the oxygenic photosynthetic bacteria (cyanobacteria) from the wording of General Consideration 5 of the International Code of Nomenclature of Prokaryotes (ICNP), which entails unilateral coverage of these prokaryotes by the International Code of Nomenclature for algae, fungi, and plants (ICN; formerly the International Code of Botanical Nomenclature, ICBN). On the basis of key viewpoints, approaches and rules in the systematics, taxonomy and nomenclature of prokaryotes it is reciprocally proposed to apply the ICNP to names of cyanobacteria including those validly published under the ICBN/ICN. For this purpose, a change to Principle 2 of the ICNP is proposed to enable validation of cyanobacterial names published under the ICBN/ICN rules. © 2015 IUMS.

  14. The validity of visual acuity assessment using mobile technology devices in the primary care setting.

    Science.gov (United States)

    O'Neill, Samuel; McAndrew, Darryl J

    2016-04-01

    The assessment of visual acuity is indicated in a number of clinical circumstances. It is commonly conducted with a Snellen wall chart. Developments in mobile technology and its adoption by clinicians may provide more convenient methods of assessing visual acuity. Limited data exist on the validity of these devices and applications. The objective of this study was to evaluate the assessment of distance visual acuity using mobile technology devices against the commonly used 3-metre Snellen chart in a primary care setting. A prospective quantitative comparative study was conducted at a regional medical practice. The visual acuity of 60 participants was assessed on a Snellen wall chart and two mobile technology devices (iPhone, iPad). Visual acuity intervals were converted to logarithm of minimum angle of resolution (logMAR) scores and subjected to intraclass correlation coefficient (ICC) assessment. The results show a high level of general agreement between testing modalities (ICC 0.917 with a 95% confidence interval of 0.887-0.940). The high level of agreement of visual acuity results between the Snellen wall chart and both mobile technology devices suggests that clinicians can use this technology with confidence in the primary care setting.
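
    The conversion step mentioned above is the standard logMAR transform (logMAR is the base-10 logarithm of the reciprocal Snellen fraction), which puts chart readings on the linear scale needed before computing an agreement statistic such as the ICC. A minimal sketch:

```python
# Sketch of the logMAR conversion used to put Snellen acuities on a common
# scale before computing agreement statistics such as the ICC.
from math import log10

def snellen_to_logmar(test_distance: float, letter_distance: float) -> float:
    """logMAR = log10(letter distance / test distance), e.g. 6/12 -> 0.30."""
    return log10(letter_distance / test_distance)

for num, den in [(6, 6), (6, 12), (6, 60)]:
    print(f"{num}/{den} -> logMAR {snellen_to_logmar(num, den):.2f}")
```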

  15. Development and Validation of Web-Based Courseware for Junior Secondary School Basic Technology Students in Nigeria

    Directory of Open Access Journals (Sweden)

    Amosa Isiaka Gambari

    2018-02-01

    Full Text Available This research aimed to develop and validate a web-based courseware for junior secondary school basic technology students in Nigeria. In this study, a mixed-method quantitative pilot study design with qualitative components was used to test and ascertain the ease of development and validation of the web-based courseware. The Dick and Carey instructional system design model was adopted for developing the courseware. A convenience sampling technique was used in selecting the three content, computer and educational technology experts to validate the web-based courseware. Non-randomized and non-equivalent junior secondary school students from two schools were used for field trial validation. Four validating instruments were employed in conducting this study: (i) Content Validation Assessment Report (CVAR); (ii) Computer Expert Validation Assessment Report (CEAR); (iii) Educational Technology Experts Validation Assessment Report (ETEVAR); and (iv) Students Validation Questionnaire (SVQ). All the instruments were face and content validated. The SVQ was pilot tested and a reliability coefficient of 0.85 was obtained using Cronbach's alpha. CVAR, CEAR, and ETEVAR were administered to content specialists, computer experts, and educational technology experts, while the SVQ was administered to 83 JSS students from two selected secondary schools in Minna. The findings revealed that the process of developing the web-based courseware using the Dick and Carey instructional system design was successful. In addition, the report from the validating team revealed that the web-based courseware is valuable for learning basic technology. It is therefore recommended that web-based courseware should be produced to teach basic technology concepts on a large scale.

  16. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  17. Creating Tomorrow's Technologists: Contrasting Information Technology Curriculum in North American Library and Information Science Graduate Programs against Code4lib Job Listings

    Science.gov (United States)

    Maceli, Monica

    2015-01-01

    This research study explores technology-related course offerings in ALA-accredited library and information science (LIS) graduate programs in North America. These data are juxtaposed against a text analysis of several thousand LIS-specific technology job listings from the Code4lib jobs website. Starting in 2003, as a popular library technology…

  18. Refining the accuracy of validated target identification through coding variant fine-mapping in type 2 diabetes

    DEFF Research Database (Denmark)

    Mahajan, Anubha; Wessel, Jennifer; Willems, Sara M

    2018-01-01

    are driven by low-frequency variants: even for these, effect sizes are modest (odds ratio ≤1.29). Second, when we used large-scale genome-wide association data to fine-map the associated variants in their regional context, accounting for the global enrichment of complex trait associations in coding sequence...

  19. ANITA-2000 activation code package - updating of the decay data libraries and validation on the experimental data of the 14 MeV Frascati Neutron Generator

    Directory of Open Access Journals (Sweden)

    Frisoni Manuela

    2016-01-01

    Full Text Available ANITA-2000 is a code package for the activation characterization of materials exposed to neutron irradiation, released by ENEA to the OECD-NEA Data Bank and ORNL-RSICC. The main component of the package is the activation code ANITA-4M, which computes the radioactive inventory of a material exposed to neutron irradiation. The code requires the decay data library (file fl1) containing the quantities describing the decay properties of the unstable nuclides and the library (file fl2) containing the gamma ray spectra emitted by the radioactive nuclei. The fl1 and fl2 files of the ANITA-2000 code package, originally based on the evaluated nuclear data library FENDL/D-2.0, were recently updated on the basis of the JEFF-3.1.1 Radioactive Decay Data Library. This paper presents the results of the validation of the new fl1 decay data library through the comparison of the ANITA-4M calculated values with the measured electron and photon decay heats and activities of fusion material samples irradiated at the 14 MeV Frascati Neutron Generator (FNG) of the ENEA Frascati Research Centre. Twelve material samples were considered, namely: Mo, Cu, Hf, Mg, Ni, Cd, Sn, Re, Ti, W, Ag and Al. The ratios between calculated and experimental values (C/E) are shown and discussed in this paper.

  20. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and the absence of bar codes at the primary packaging level, such as single blister packs. The team in Ethiopia developed an open-source smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems
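
    The end-to-end visibility described above rests on GS1 Application Identifiers (AIs) such as 01 (GTIN), 17 (expiry date), 10 (batch/lot) and 21 (serial number) carried in the bar code. The sketch below parses the human-readable form of such an element string; the sample string is invented, and real scanner output uses FNC1 separators rather than parentheses.

```python
# Hedged sketch: parsing the human-readable form of a GS1 element string into
# the identifiers that give supply-chain visibility. The sample string is
# made up; real scanners emit FNC1-delimited data rather than parentheses.
import re

AI_NAMES = {"01": "GTIN", "17": "expiry (YYMMDD)", "10": "batch/lot", "21": "serial"}

def parse_gs1(element_string: str) -> dict:
    fields = re.findall(r"\((\d{2,4})\)([^(]+)", element_string)
    return {AI_NAMES.get(ai, ai): value for ai, value in fields}

sample = "(01)09506000134352(17)261130(10)LOT42A(21)S123456"
print(parse_gs1(sample))
```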

  1. Validation of Neutron Calculation Codes and Models by means of benchmark cases in the frame of the Binational Commission of Nuclear Energy. Probabilistic Models

    International Nuclear Information System (INIS)

    Dos Santos, Adimir; Siqueira, Paulo de Tarso D.; Andrade e Silva, Graciete Simões; Grant, Carlos; Tarazaga, Ariel E.; Barberis, Claudia

    2013-01-01

    In 2008 the Atomic Energy National Commission (CNEA) of Argentina and the Brazilian Institute of Energetic and Nuclear Research (IPEN), under the frame of the Nuclear Energy Argentine-Brazilian Agreement (COBEN), included, among many others, the project “Validation and Verification of Calculation Methods used for Research and Experimental Reactors”. At that time, it was established that the validation was to be performed with models implemented in the deterministic codes HUEMUL and PUMA (cell and reactor codes) developed by CNEA and those implemented in MCNP by CNEA and IPEN. The necessary data for these validations would correspond to theoretical-experimental reference cases in the research reactor IPEN/MB-01 located in São Paulo, Brazil. The staff of the Reactor and Nuclear Power Studies group (SERC) of CNEA, on the Argentine side, performed calculations with deterministic models (HUEMUL-PUMA) and probabilistic methods (MCNP), modeling a great number of physical situations of the reactor which had previously been studied and modeled by members of the Center of Nuclear Engineering of IPEN, whose results were extensively provided to CNEA. In this paper, results of the comparison between calculated and experimental values for critical configurations, temperature coefficients, kinetic parameters, and spatial distributions of fission rates evaluated with probabilistic models are shown. (author)

  2. Validation of Neutron Calculation Codes and Models by means of benchmark cases in the frame of the Binational Commission of Nuclear Energy. Criticality Experiments

    International Nuclear Information System (INIS)

    Dos Santos, Adimir; Siqueira, Paulo de Tarso D.; Andrade e Silva, Graciete Simões; Grant, Carlos; Tarazaga, Ariel E.; Barberis, Claudia

    2013-01-01

    In 2008 the Atomic Energy National Commission (CNEA) of Argentina and the Brazilian Institute of Energetic and Nuclear Research (IPEN), under the frame of the Nuclear Energy Argentine-Brazilian Agreement (COBEN), included, among many others, the project “Validation and Verification of Calculation Methods used for Research and Experimental Reactors”. At that time, it was established that the validation was to be performed with models implemented in the deterministic codes HUEMUL and PUMA (cell and reactor codes) developed by CNEA and those implemented in MCNP by CNEA and IPEN. The necessary data for these validations would correspond to theoretical-experimental reference cases in the research reactor IPEN/MB-01 located in São Paulo, Brazil. The staff of the Reactor and Nuclear Power Studies group (SERC) of CNEA, on the Argentine side, performed calculations with deterministic models (HUEMUL-PUMA) and probabilistic methods (MCNP), modeling a great number of physical situations of the reactor which had previously been studied and modeled by members of the Center of Nuclear Engineering of IPEN, whose results were extensively provided to CNEA. In this paper, results for critical configurations are shown. (author)

  3. Cost-Effective ISS Space-Environment Technology Validation of Advanced Roll-Out Solar Array (ROSA), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — DSS proposes to systematically mature, mitigate risk for, and perform hardware-based ground validations/demonstrations of a low-cost, high technology payoff,...

  4. Verification and validation of deterministic radiation transport numerical methods, codes, and nuclear data for estimating radiation dose to patients during CT scan

    International Nuclear Information System (INIS)

    Hykes, J. M.; Azmy, Y. Y.; Schunert, S.; King, S. H.; Klingensmith, J. J.

    2009-01-01

    The goal of this work is to determine the viability of modeling an important x-ray procedure, the computed tomography (CT) scan of a pregnant woman and her conceptus, using a deterministic radiation transport program. A prior experimental study provides the deposited dose as measured in an anthropomorphic phantom, with detectors positioned in the estimated uterine location. In this paper, we first verify the discrete ordinates code TORT3.2 and a suitably constructed multigroup cross section library against the Monte Carlo code MCNP5. Using MCNP, we demonstrate that accounting for the transport of secondary electrons is unnecessary in tissue-equivalent material. After demonstrating proper verification, we proceed to validate the MCNP and TORT simulations against data measured for the CTDI FDA phantom. In the model, the computed edge-to-center dose ratio is within experimental uncertainty, while the computed exposures differ from the measured values by less than 35%. (authors)

  5. Instrument for assessing mobile technology acceptability in diabetes self-management: a validation and reliability study

    Directory of Open Access Journals (Sweden)

    Frandes M

    2017-02-01

    Full Text Available Mirela Frandes,1 Anca V Deiac,2 Bogdan Timar,1,3 Diana Lungeanu1,2 1Department of Functional Sciences, “Victor Babes” University of Medicine and Pharmacy of Timisoara, 2Department of Mathematics, Polytechnic University of Timisoara, 3Third Medical Clinic, Emergency Hospital of Timisoara, Timisoara, Romania Background: Nowadays, mobile technologies are part of everyday life, but the lack of instruments to assess their acceptability for the management of chronic diseases makes their actual adoption for this purpose slow. Objective: The objective of this study was to develop a survey instrument for assessing patients’ attitude toward and intention to use mobile technology for diabetes mellitus (DM) self-management, as well as to identify sociodemographic characteristics and quality of life factors that affect them. Methods: We first conducted the documentation and instrument design phases, which were subsequently followed by the pilot study and instrument validation. Afterward, the instrument was administered to 103 patients (median age: 37 years; range: 18–65 years) diagnosed with type 1 or type 2 DM, who agreed to participate in the study. The reliability and construct validity were assessed by computing Cronbach’s alpha and using factor analysis, respectively. Results: The instrument included statements about the actual use of electronic devices for DM management, interaction between patient and physician, attitude toward using mobile technology, and quality of life evaluation. Cronbach’s alpha was 0.9 for attitude toward using mobile technology and 0.97 for attitude toward using mobile device applications for DM self-management. Younger patients (Spearman’s ρ=-0.429; P<0.001) with better glycemic control (Spearman’s ρ=-0.322; P<0.001) and higher education level (Kendall’s τ=0.51; P<0.001) had a significantly more favorable attitude toward using mobile assistive applications for DM control. Moreover, patients with a higher quality of
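
    As an illustration of the reliability statistic reported above, here is a minimal sketch of Cronbach's alpha computed from a respondents-by-items score matrix; the toy scores below are invented and are not data from the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (respondents x items) matrix of Likert-type scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the total score
    return k / (k - 1) * (1.0 - item_vars / total_var)

# toy data: 5 respondents x 4 items (invented)
scores = np.array([[4, 5, 4, 5],
                   [3, 3, 4, 3],
                   [5, 5, 5, 4],
                   [2, 2, 3, 2],
                   [4, 4, 4, 5]])
print(round(cronbach_alpha(scores), 2))   # ~0.93 for this toy matrix
```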

  6. The Validation of Macro and Micro Observations of Parent-Child Dynamics Using the Relationship Affect Coding System in Early Childhood.

    Science.gov (United States)

    Dishion, Thomas J; Mun, Chung Jung; Tein, Jenn-Yun; Kim, Hanjoe; Shaw, Daniel S; Gardner, Frances; Wilson, Melvin N; Peterson, Jenene

    2017-04-01

    This study examined the validity of micro social observations and macro ratings of parent-child interaction in early to middle childhood. Seven hundred and thirty-one families representing multiple ethnic groups were recruited and screened as at risk in the context of Women, Infant, and Children (WIC) Nutritional Supplement service settings. Families were randomly assigned to the Family Checkup (FCU) intervention or the control condition at age 2 and videotaped in structured interactions in the home at ages 2, 3, 4, and 5. Parent-child interaction videotapes were micro-coded using the Relationship Affect Coding System (RACS) that captures the duration of two mutual dyadic states: positive engagement and coercion. Macro ratings of parenting skills were collected after coding the videotapes to assess parent use of positive behavior support and limit setting skills (or lack thereof). Confirmatory factor analyses revealed that the measurement model of macro ratings of limit setting and positive behavior support was not supported by the data, and thus, were excluded from further analyses. However, there was moderate stability in the families' micro social dynamics across early childhood and it showed significant improvements as a function of random assignment to the FCU. Moreover, parent-child dynamics were predictive of chronic behavior problems as rated by parents in middle childhood, but not emotional problems. We conclude with a discussion of the validity of the RACS and on methodological advantages of micro social coding over the statistical limitations of macro rating observations. Future directions are discussed for observation research in prevention science.

  7. Chart validation of inpatient ICD-9-CM administrative diagnosis codes for ischemic stroke among IGIV users in the Sentinel Distributed Database.

    Science.gov (United States)

    Ammann, Eric M; Leira, Enrique C; Winiecki, Scott K; Nagaraja, Nandakumar; Dandapat, Sudeepta; Carnahan, Ryan M; Schweizer, Marin L; Torner, James C; Fuller, Candace C; Leonard, Charles E; Garcia, Crystal; Pimentel, Madelyn; Chrischilles, Elizabeth A

    2017-12-01

    The Sentinel Distributed Database (SDD) is a large database of patient-level medical and prescription records, primarily derived from insurance claims and electronic health records, and is sponsored by the U.S. Food and Drug Administration for drug safety assessments. In this chart validation study, we report on the positive predictive value (PPV) of inpatient ICD-9-CM acute ischemic stroke (AIS) administrative diagnosis codes (433.x1, 434.xx, and 436) in the SDD. As part of an assessment of the risk of thromboembolic adverse events following treatment with intravenous immune globulin (IGIV), charts were obtained for 131 potential post-IGIV AIS cases. Charts were abstracted by trained nurses and then adjudicated by stroke experts using pre-specified diagnostic criteria. Case status could be determined for 128 potential AIS cases, of which 34 were confirmed. The PPVs for the inpatient AIS diagnoses recorded in the SDD were 27% overall [95% confidence interval (95% CI): 19-35], 60% (95% CI: 32-84) for principal-position diagnoses, 42% (95% CI: 28-57) for secondary diagnoses, and 6% (95% CI: 2-15) for position-unspecified diagnoses (which in the SDD generally originate from separate physician claims associated with an inpatient stay). Position-unspecified diagnoses were unlikely to represent true AIS cases. PPVs for principal and secondary inpatient diagnosis codes were higher, but still meaningfully lower than estimates from prior chart validation studies. The low PPVs may be specific to the IGIV user study population. Additional research is needed to assess the validity of AIS administrative diagnosis codes in other study populations within the SDD. Copyright © 2017 The Authors. Published by Wolters Kluwer Health, Inc. All rights reserved.
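
    The PPVs above are simple proportions of confirmed cases with binomial confidence intervals. A minimal sketch follows, using the exact Clopper-Pearson interval as one plausible choice (the abstract does not state which interval method was used); the counts are the overall figures reported above.

```python
from scipy.stats import beta

def ppv_with_ci(confirmed: int, total: int, alpha: float = 0.05):
    """PPV with an exact (Clopper-Pearson) two-sided confidence interval."""
    ppv = confirmed / total
    lo = 0.0 if confirmed == 0 else beta.ppf(alpha / 2, confirmed, total - confirmed + 1)
    hi = 1.0 if confirmed == total else beta.ppf(1 - alpha / 2, confirmed + 1, total - confirmed)
    return ppv, lo, hi

# overall figures reported in the abstract: 34 confirmed out of 128 evaluable cases
print(ppv_with_ci(34, 128))   # roughly (0.27, 0.19, 0.35)
```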

  8. Validation of assistive technology on psychoactive substances for visually impaired people.

    Science.gov (United States)

    Guimarães, Fernanda Jorge; Pagliuca, Lorita Marlena Freitag

    2017-12-26

    To validate the assistive technology "Drugs: reflection for prevention" to be used with visually impaired people. Quantitative and quasi-experimental study, contrasting knowledge before and after the use of the assistive technology with 140 visually impaired people in institutes and associations for people with visual impairment. A questionnaire with identification data, a pre-test, a post-test and a questionnaire to assess the assistive technology were applied. Data were described through means and standard deviations, and analyses included the McNemar test, the exact binomial distribution test, and the intraclass correlation coefficient. Most participants were male (65.7%), 84.3% were blind, aged 37.1 years on average and with schooling of 10.1 years on average. There were more correct answers in the post-test than in the pre-test, indicating that the assistive technology is effective in educating visually impaired people about psychoactive substance abuse. Implications for rehabilitation: Created a new, easily accessible tool for the prevention of substance abuse. Improved information about psychoactive substances for users of the assistive technology. Improved quality of life for its users.

  9. Data validation platform for the sophisticated monitoring and communication of the energy technology sector

    Energy Technology Data Exchange (ETDEWEB)

    Flamos, Alexandros; Doukas, Haris; Psarras, J. [Management and Decision Support Systems Lab (EPU-NTUA), School of Electrical and Computer Engineering, National Technical University of Athens, 9, Iroon Polytechniou str., 15780, Athens (Greece)

    2010-05-15

    It has very often been stated that the difficulty and complexity of achieving green energy targets in the European Union (EU) will require strengthened measures to promote the implementation of new energy technologies and energy end-use efficiency, as well as measures to support the related energy Research and Technology Development (RTD). Often forgotten is the fact that, above all, a Europe-wide coordinated forum is needed to continuously develop and refine the monitoring methodology and results, bringing together specialised statisticians, energy researchers and experts on energy socio-economics. The aim of this paper is to present the Scientific Reference System (SRS) Scorecard: a data validation platform for the sophisticated monitoring and communication of the energy technology sector. In this respect, the concept of the SRS scorecard system is laid out, and the parameters, the scoring criteria and the assessment system are explained, so as to provide the interested reader with the basis needed to understand the technology evaluation examples provided, as well as their critical analysis. (author)

  10. The Theory of Planned Behavior (TPB) and Pre-Service Teachers' Technology Acceptance: A Validation Study Using Structural Equation Modeling

    Science.gov (United States)

    Teo, Timothy; Tan, Lynde

    2012-01-01

    This study applies the theory of planned behavior (TPB), a theory that is commonly used in commercial settings, to the educational context to explain pre-service teachers' technology acceptance, and examines the validity of the TPB when used for this purpose. It found evidence that the TPB is a valid model to explain pre-service…

  11. A Systematic Review of Technology-Based Dietary Intake Assessment Validation Studies That Include Carotenoid Biomarkers

    Science.gov (United States)

    Burrows, Tracy L.; Rollo, Megan E.; Williams, Rebecca; Wood, Lisa G.; Garg, Manohar L.; Jensen, Megan; Collins, Clare E.

    2017-01-01

    Technological advances have allowed for the evolution of traditional dietary assessment methods. The aim of this review is to evaluate the accuracy of technology-based dietary assessment methods to determine carotenoid and/or fruit and vegetable intake when compared with carotenoid biomarkers. An online search strategy was undertaken to identify studies published in the English language up to July 2016. Inclusion criteria were adults ≥18 years, a measure of dietary intake that used information and communication technologies that specified fruit and/or vegetable intake or dietary carotenoid, a biomarker of carotenoid status and the association between the two. Sixteen articles from 13 studies were included with the majority cross-sectional in design (n = 9). Some studies used multiple dietary assessment methods with the most common: food records (n = 7), 24-h diet recalls (n = 5), food frequency questionnaires (n = 3) and diet quality assessed by dietary screener (n = 1). Two studies were directly web based, with four studies using technology that could be completed offline and data later transferred. Two studies utilised technology in the collection of dietary data, while the majority (n = 11) automated the collection in combination with nutrient analysis of the dietary data. Four studies provided correlation values between dietary carotenoids with biomarkers, ranging from r = 0.13 to 0.62 with the remaining studies comparing a measure of fruit and vegetable intake with biomarkers (r = 0.09 to 0.25). This review provides an overview of technology-based dietary assessment methods that have been used in validation studies with objectively measured carotenoids. Findings were positive with these dietary assessment measures showing mostly moderate associations with carotenoid biomarkers. PMID:28216582

  12. A Systematic Review of Technology-Based Dietary Intake Assessment Validation Studies That Include Carotenoid Biomarkers.

    Science.gov (United States)

    Burrows, Tracy L; Rollo, Megan E; Williams, Rebecca; Wood, Lisa G; Garg, Manohar L; Jensen, Megan; Collins, Clare E

    2017-02-14

    Technological advances have allowed for the evolution of traditional dietary assessment methods. The aim of this review is to evaluate the accuracy of technology-based dietary assessment methods to determine carotenoid and/or fruit and vegetable intake when compared with carotenoid biomarkers. An online search strategy was undertaken to identify studies published in the English language up to July 2016. Inclusion criteria were adults ≥18 years, a measure of dietary intake that used information and communication technologies that specified fruit and/or vegetable intake or dietary carotenoid, a biomarker of carotenoid status and the association between the two. Sixteen articles from 13 studies were included with the majority cross-sectional in design ( n = 9). Some studies used multiple dietary assessment methods with the most common: food records ( n = 7), 24-h diet recalls ( n = 5), food frequency questionnaires ( n = 3) and diet quality assessed by dietary screener ( n = 1). Two studies were directly web based, with four studies using technology that could be completed offline and data later transferred. Two studies utilised technology in the collection of dietary data, while the majority ( n = 11) automated the collection in combination with nutrient analysis of the dietary data. Four studies provided correlation values between dietary carotenoids with biomarkers, ranging from r = 0.13 to 0.62 with the remaining studies comparing a measure of fruit and vegetable intake with biomarkers ( r = 0.09 to 0.25). This review provides an overview of technology-based dietary assessment methods that have been used in validation studies with objectively measured carotenoids. Findings were positive with these dietary assessment measures showing mostly moderate associations with carotenoid biomarkers.

  13. A Systematic Review of Technology-Based Dietary Intake Assessment Validation Studies That Include Carotenoid Biomarkers

    Directory of Open Access Journals (Sweden)

    Tracy L. Burrows

    2017-02-01

    Full Text Available Technological advances have allowed for the evolution of traditional dietary assessment methods. The aim of this review is to evaluate the accuracy of technology-based dietary assessment methods to determine carotenoid and/or fruit and vegetable intake when compared with carotenoid biomarkers. An online search strategy was undertaken to identify studies published in the English language up to July 2016. Inclusion criteria were adults ≥18 years, a measure of dietary intake that used information and communication technologies that specified fruit and/or vegetable intake or dietary carotenoid, a biomarker of carotenoid status and the association between the two. Sixteen articles from 13 studies were included with the majority cross-sectional in design (n = 9). Some studies used multiple dietary assessment methods with the most common: food records (n = 7), 24-h diet recalls (n = 5), food frequency questionnaires (n = 3) and diet quality assessed by dietary screener (n = 1). Two studies were directly web based, with four studies using technology that could be completed offline and data later transferred. Two studies utilised technology in the collection of dietary data, while the majority (n = 11) automated the collection in combination with nutrient analysis of the dietary data. Four studies provided correlation values between dietary carotenoids with biomarkers, ranging from r = 0.13 to 0.62 with the remaining studies comparing a measure of fruit and vegetable intake with biomarkers (r = 0.09 to 0.25). This review provides an overview of technology-based dietary assessment methods that have been used in validation studies with objectively measured carotenoids. Findings were positive with these dietary assessment measures showing mostly moderate associations with carotenoid biomarkers.

  14. Euler Technology Assessment - SPLITFLOW Code Applications for Stability and Control Analysis on an Advanced Fighter Model Employing Innovative Control Concepts

    Science.gov (United States)

    Jordan, Keith J.

    1998-01-01

    This report documents results from the NASA-Langley sponsored Euler Technology Assessment Study conducted by Lockheed-Martin Tactical Aircraft Systems (LMTAS). The purpose of the study was to evaluate the ability of the SPLITFLOW code using viscous and inviscid flow models to predict aerodynamic stability and control of an advanced fighter model. The inviscid flow model was found to perform well at incidence angles below approximately 15 deg, but not as well at higher angles of attack. The results using a turbulent, viscous flow model matched the trends of the wind tunnel data, but did not show significant improvement over the Euler solutions. Overall, the predictions were found to be useful for stability and control design purposes.

  15. Validating a measure to assess factors that affect assistive technology use by students with disabilities in elementary and secondary education.

    Science.gov (United States)

    Zapf, Susan A; Scherer, Marcia J; Baxter, Mary F; H Rintala, Diana

    2016-01-01

    The purpose of this study was to measure the predictive validity, internal consistency and clinical utility of the Matching Assistive Technology to Child & Augmentative Communication Evaluation Simplified (MATCH-ACES) assessment. Twenty-three assistive technology team evaluators assessed 35 children using the MATCH-ACES assessment. This quasi-experimental study examined the internal consistency, predictive validity and clinical utility of the MATCH-ACES assessment. The MATCH-ACES assessment predisposition scales had good internal consistency across all three scales. A significant relationship was found between (a) high student perseverance and need for assistive technology and (b) high teacher comfort and interest in technology use (p = 0.002). Study results indicate that the MATCH-ACES assessment has good internal consistency and validity. Predisposition characteristics of student and teacher combined can influence the level of assistive technology use; therefore, assistive technology teams should assess predisposition factors of the user when recommending assistive technology. Implications for Rehabilitation Educational and medical professionals should be educated on evidence-based assistive technology assessments. Personal experience and psychosocial factors can influence the outcome use of assistive technology. Assistive technology assessments must include an intervention plan for assistive technology service delivery to measure effective outcome use.

  16. Trace Code Validation for BWR Spray Cooling Injection and CCFL Condition Based on GÖTA Facility Experiments

    Directory of Open Access Journals (Sweden)

    Stefano Racca

    2012-01-01

    Full Text Available Best estimate codes have been used over the past thirty years for the design, licensing, and safety analysis of NPPs. Nevertheless, large efforts are necessary for the qualification and the assessment of such codes. The aim of this work is to study the main phenomena involved in the emergency spray cooling injection in a Swedish-designed BWR. For this purpose, data from the Swedish separate effect test facility GÖTA have been simulated using TRACE version 5.0 Patch 2. Furthermore, uncertainty calculations have been performed with the propagation of input errors method, and the input parameters that most influence the peak cladding temperature have been identified.
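
    For background on the uncertainty method mentioned above: the propagation of input errors approach varies the uncertain inputs over many code runs and bounds the output (here, peak cladding temperature) with order statistics, and the number of runs is commonly sized with Wilks' formula. A minimal sketch of that sizing step follows, assuming the usual one-sided tolerance-limit form; the abstract does not report how many runs were actually performed.

```python
import math

def wilks_one_sided(coverage: float = 0.95, confidence: float = 0.95) -> int:
    """Smallest number of code runs n such that 1 - coverage**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_one_sided())             # 59 runs for a 95%/95% one-sided tolerance limit
print(wilks_one_sided(0.95, 0.99))   # 90 runs for a 95%/99% limit
```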

  17. Using smartphone technology to deliver a virtual pedestrian environment: usability and validation.

    Science.gov (United States)

    Schwebel, David C; Severson, Joan; He, Yefei

    2017-09-01

    Various programs effectively teach children to cross streets more safely, but all are labor- and cost-intensive. Recent developments in mobile phone technology offer opportunity to deliver virtual reality pedestrian environments to mobile smartphone platforms. Such an environment may offer a cost- and labor-effective strategy to teach children to cross streets safely. This study evaluated usability, feasibility, and validity of a smartphone-based virtual pedestrian environment. A total of 68 adults completed 12 virtual crossings within each of two virtual pedestrian environments, one delivered by smartphone and the other a semi-immersive kiosk virtual environment. Participants completed self-report measures of perceived realism and simulator sickness experienced in each virtual environment, plus self-reported demographic and personality characteristics. All participants followed system instructions and used the smartphone-based virtual environment without difficulty. No significant simulator sickness was reported or observed. Users rated the smartphone virtual environment as highly realistic. Convergent validity was detected, with many aspects of pedestrian behavior in the smartphone-based virtual environment matching behavior in the kiosk virtual environment. Anticipated correlations between personality and kiosk virtual reality pedestrian behavior emerged for the smartphone-based system. A smartphone-based virtual environment can be usable and valid. Future research should develop and evaluate such a training system.

  18. In-vessel core degradation code validation matrix update 1996-1999. Report by an OECD/NEA group of experts

    International Nuclear Information System (INIS)

    2001-02-01

    In 1991 the Committee on the Safety of Nuclear Installations (CSNI) issued a State-of-the-Art Report (SOAR) on In-Vessel Core Degradation in Light Water Reactor (LWR) Severe Accidents. Based on the recommendations of this report a Validation Matrix for severe accident modelling codes was produced. Experiments performed up to the end of 1993 were considered for this validation matrix. To include recent experiments and to enlarge the scope, an update was formally inaugurated in January 1999 by the Task Group on Degraded Core Cooling, a sub-group of Principal Working Group 2 (PWG-2) on Coolant System Behaviour, and a selection of writing group members was commissioned. The present report documents the results of this study. The objective of the Validation Matrix is to define a basic set of experiments, for which comparison of the measured and calculated parameters forms a basis for establishing the accuracy of test predictions, covering the full range of in-vessel core degradation phenomena expected in light water reactor severe accident transients. The emphasis is on integral experiments, where interactions amongst key phenomena as well as the phenomena themselves are explored; however separate-effects experiments are also considered especially where these extend the parameter ranges to cover those expected in postulated LWR severe accident transients. As well as covering PWR and BWR designs of Western origin, the scope of the review has been extended to Eastern European (VVER) types. Similarly, the coverage of phenomena has been extended, starting as before from the initial heat-up but now proceeding through the in-core stage to include introduction of melt into the lower plenum and further to core coolability and retention to the lower plenum, with possible external cooling. Items of a purely thermal hydraulic nature involving no core degradation are excluded, having been covered in other validation matrix studies. Concerning fission product behaviour, the effect

  19. Validation of the Leap Motion Controller using markered motion capture technology.

    Science.gov (United States)

    Smeragliuolo, Anna H; Hill, N Jeremy; Disla, Luis; Putrino, David

    2016-06-14

    The Leap Motion Controller (LMC) is a low-cost, markerless motion capture device that tracks hand, wrist and forearm position. Integration of this technology into healthcare applications has begun to occur rapidly, making validation of the LMC's data output an important research goal. Here, we perform a detailed evaluation of the kinematic data output from the LMC, and validate this output against gold-standard, markered motion capture technology. We instructed subjects to perform three clinically-relevant wrist (flexion/extension, radial/ulnar deviation) and forearm (pronation/supination) movements. The movements were simultaneously tracked using both the LMC and a marker-based motion capture system from Motion Analysis Corporation (MAC). Adjusting for known inconsistencies in the LMC sampling frequency, we compared simultaneously acquired LMC and MAC data by performing Pearson's correlation (r) and root mean square error (RMSE). Wrist flexion/extension and radial/ulnar deviation showed good overall agreement (r=0.95; RMSE=11.6°, and r=0.92; RMSE=12.4°, respectively) with the MAC system. However, when tracking forearm pronation/supination, there were serious inconsistencies in reported joint angles (r=0.79; RMSE=38.4°). Hand posture significantly influenced the quality of wrist deviation (P<0.005) and forearm supination/pronation (P<0.001), but not wrist flexion/extension (P=0.29). We conclude that the LMC is capable of providing data that are clinically meaningful for wrist flexion/extension, and perhaps wrist deviation. It cannot yet return clinically meaningful data for measuring forearm pronation/supination. Future studies should continue to validate the LMC as updated versions of their software are developed. Copyright © 2016 Elsevier Ltd. All rights reserved.
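
    The agreement statistics reported above (Pearson's r and RMSE between simultaneously acquired joint-angle traces) are straightforward to compute once the two systems are resampled onto a common time base. A minimal sketch with synthetic traces follows; the signal shape and noise level are invented, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

def agreement(test_deg: np.ndarray, ref_deg: np.ndarray):
    """Pearson's r and RMSE between two simultaneously sampled joint-angle traces (degrees)."""
    r, _ = pearsonr(test_deg, ref_deg)
    rmse = float(np.sqrt(np.mean((test_deg - ref_deg) ** 2)))
    return r, rmse

# synthetic wrist flexion/extension traces resampled to a common time base (invented)
t = np.linspace(0.0, 10.0, 500)
ref = 40.0 * np.sin(0.8 * t)                      # reference (marker-based) system
test = ref + np.random.normal(0.0, 8.0, t.size)   # noisier low-cost sensor
print(agreement(test, ref))
```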

  20. Validation test of advanced technology for IPV nickel-hydrogen flight cells: Update

    Science.gov (United States)

    Smithrick, John J.; Hall, Stephen W.

    1992-01-01

    Individual pressure vessel (IPV) nickel-hydrogen technology was advanced at NASA Lewis and under Lewis contracts with the intention of improving cycle life and performance. One advancement was to use 26 percent potassium hydroxide (KOH) electrolyte to improve cycle life. Another advancement was to modify the state-of-the-art cell design to eliminate identified failure modes. The modified design is referred to as the advanced design. A breakthrough in the low-earth-orbit (LEO) cycle life of IPV nickel-hydrogen cells has been previously reported. The cycle life of boiler plate cells containing 26 percent KOH electrolyte was about 40,000 LEO cycles compared to 3,500 cycles for cells containing 31 percent KOH. The boiler plate test results are in the process of being validated using flight hardware and real time LEO testing at the Naval Weapons Support Center (NWSC), Crane, Indiana under a NASA Lewis Contract. An advanced 125 Ah IPV nickel-hydrogen cell was designed. The primary function of the advanced cell is to store and deliver energy for long-term, LEO spacecraft missions. The new features of this design are: (1) use of 26 percent rather than 31 percent KOH electrolyte; (2) use of a patented catalyzed wall wick; (3) use of serrated-edge separators to facilitate gaseous oxygen and hydrogen flow within the cell, while still maintaining physical contact with the wall wick for electrolyte management; and (4) use of a floating rather than a fixed stack (state-of-the-art) to accommodate nickel electrode expansion due to charge/discharge cycling. The significant improvements resulting from these innovations are: extended cycle life; enhanced thermal, electrolyte, and oxygen management; and accommodation of nickel electrode expansion. The advanced cell design is in the process of being validated using real time LEO cycle life testing at NWSC, Crane, Indiana. An update of validation test results confirming this technology is presented.

  1. Impact of Recent Trends in Information and Communication Technology on the Validity of the Construct Information Literacy in Higher Education

    NARCIS (Netherlands)

    A.A.J. (Jos) van Helvoort

    2010-01-01

    The objective of this paper is a reflective discussion on the validity of the construct Information Literacy in the perspective of changing information and communication technologies. The research question that will be answered is: what is the impact of technological developments on the relevance of

  2. Development and Validation of the Computer Technology Literacy Self-Assessment Scale for Taiwanese Elementary School Students

    Science.gov (United States)

    Chang, Chiung-Sui

    2008-01-01

    The purpose of this study was to describe the development and validation of an instrument to identify various dimensions of the computer technology literacy self-assessment scale (CTLS) for elementary school students. The instrument included five CTLS dimensions (subscales): the technology operation skills, the computer usages concepts, the…

  3. What Technology Skills Do Developers Need? A Text Analysis of Job Listings in Library and Information Science (LIS) from Jobs.code4lib.org

    Directory of Open Access Journals (Sweden)

    Monica Maceli

    2015-09-01

    Full Text Available Technology plays an indisputably vital role in library and information science (LIS) work; this rapidly moving landscape can create challenges for practitioners and educators seeking to keep pace with such change. In pursuit of building our understanding of currently sought technology competencies in developer-oriented positions within LIS, this paper reports the results of a text analysis of a large collection of job listings culled from the Code4lib jobs website. The Code4lib organization began over a decade ago as a popular mailing list covering the intersection of technology and library work; its current offerings include a website that collects and organizes LIS-related technology job listings. The results of the text analysis of this dataset suggest the currently vital technology skills and concepts that existing and aspiring practitioners may target in their continuing education as developers.
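
    A minimal sketch of the kind of term-frequency analysis described, counting how many listings mention each skill at least once; the skill vocabulary and sample listings are invented and are not the study's corpus or term list.

```python
import re
from collections import Counter

# hypothetical skill vocabulary; the actual study derived its terms from the corpus
SKILLS = {"python", "javascript", "sql", "xml", "drupal", "marc", "linux", "ruby", "java"}

def skill_counts(listings) -> Counter:
    """Count, across listings, how many listings mention each skill at least once."""
    counts = Counter()
    for text in listings:
        tokens = set(re.findall(r"[a-z+#]+", text.lower()))
        counts.update(tokens & SKILLS)
    return counts

jobs = ["Web developer: PHP, JavaScript, SQL and Drupal experience required",
        "Metadata developer with XML, MARC and Python skills",
        "Systems librarian: Linux administration, SQL, Python scripting"]
print(skill_counts(jobs).most_common())
```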

  4. Construction and validation of a tool to Assess the Use of Light Technologies at Intensive Care Units

    Science.gov (United States)

    Marinho, Pabliane Matias Lordelo; Campos, Maria Pontes de Aguiar; Rodrigues, Eliana Ofélia Llapa; Gois, Cristiane Franca Lisboa; Barreto, Ikaro Daniel de Carvalho

    2016-01-01

    ABSTRACT Objective: to construct and validate a tool to assess the use of light technologies by the nursing team at Intensive Care Units. Method: methodological study in which the tool was elaborated by means of the psychometric method for construction based on the categorization of health technologies by Merhy and Franco, from the National Humanization Policy, using the Nursing Intervention Classification taxonomy to categorize the domains of the tool. Agreement Percentages and Content Validity Indices were used for the purpose of validation. Results: The result of the application of the Interrater Agreement Percentage exceeded the recommended level of 80%, highlighting the relevance for the proposed theme in the assessment, with an agreement rate of 99%. Conclusion: the tool was validated with four domains (Bond, Autonomy, Welcoming and Management) and nineteen items that assess the use of light technologies at Intensive Care Units. PMID:27992025
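
    The content-validity arithmetic behind such tools is simple to reproduce. A minimal sketch follows, assuming the common convention that an expert rating of 3 or 4 on a 4-point relevance scale counts as agreement; the panel ratings are invented, not the study's data.

```python
import numpy as np

def item_cvi(ratings: np.ndarray) -> np.ndarray:
    """I-CVI per item: share of experts rating the item 3 or 4 on a 4-point relevance scale."""
    ratings = np.asarray(ratings)
    return (ratings >= 3).mean(axis=0)

# toy panel: 5 experts (rows) x 4 items (columns), invented
panel = np.array([[4, 4, 3, 4],
                  [4, 3, 4, 4],
                  [3, 4, 4, 2],
                  [4, 4, 4, 4],
                  [4, 3, 3, 4]])
icvi = item_cvi(panel)
print(icvi)          # item-level content validity indices
print(icvi.mean())   # scale-level CVI (average of the I-CVIs)
```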

  5. Imaging and image restoration of an on-axis three-mirror Cassegrain system with wavefront coding technology.

    Science.gov (United States)

    Guo, Xiaohu; Dong, Liquan; Zhao, Yuejin; Jia, Wei; Kong, Lingqin; Wu, Yijian; Li, Bing

    2015-04-01

    Wavefront coding (WFC) technology is adopted in the space optical system to resolve the problem of defocus caused by temperature difference or vibration of satellite motion. According to the theory of WFC, we calculate and optimize the phase mask parameter of the cubic phase mask plate, which is used in an on-axis three-mirror Cassegrain (TMC) telescope system. The simulation analysis and the experimental results indicate that the defocused modulation transfer function curves and the corresponding blurred images have a perfect consistency in the range of 10 times the depth of focus (DOF) of the original TMC system. After digital image processing by a Wiener filter, the spatial resolution of the restored images is up to 57.14 line pairs/mm. The results demonstrate that the WFC technology in the TMC system has superior performance in extending the DOF and less sensitivity to defocus, which has great value in resolving the problem of defocus in the space optical system.
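
    The restoration step described is a standard frequency-domain Wiener filter. A minimal sketch follows with an invented scene, a uniform point-spread function and an arbitrary regularization constant; the actual system restores images using the defocused PSF of the cubic-phase optics.

```python
import numpy as np

def wiener_restore(blurred: np.ndarray, psf: np.ndarray, k: float = 0.01) -> np.ndarray:
    """Frequency-domain Wiener filter: F_hat = conj(H) / (|H|^2 + k) * G."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) / (np.abs(H) ** 2 + k) * G
    return np.real(np.fft.ifft2(F_hat))

# toy example: blur a random "scene" with a small uniform PSF, then restore it
rng = np.random.default_rng(0)
scene = rng.random((64, 64))
psf = np.ones((5, 5)) / 25.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(psf, s=scene.shape) * np.fft.fft2(scene)))
print("blur error:", np.abs(blurred - scene).mean())
print("restoration error:", np.abs(wiener_restore(blurred, psf, k=1e-4) - scene).mean())
```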

  6. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  7. A rational method to evaluate tornado-borne missile speed in nuclear power plants. Validation of a numerical code based on Fujita's tornado model

    International Nuclear Information System (INIS)

    Eguchi, Yuzuru; Sugimoto, Soichiro; Hattori, Yasuo; Hirakuchi, Hiromaru

    2015-01-01

    Explanation is given about a rational method to evaluate tornado-borne missile speed, flight distance and flight height to be used for safety design of a nuclear power plant. In the method, the authors employed Fujita's DBT-77 model as a tornado wind model to take the near-ground tornado wind profile into account. A liftoff model of an object on the ground was developed by conservatively modeling the lift force due to ground effect. The wind field model and the liftoff model have been compiled together with a conventional flight model into a computer code, named TONBOS. In this study, especially, the code is verified for one- and two-dimensional free-fall problems as well as a case of 1957 Dallas tornado wind field model, whose solutions are theoretically or numerically known. Finally, the code is validated by typical car behaviors characterized by tornado wind speeds of the enhanced Fujita scale, as well as by an actual event where a truck was blown away by a tornado which struck a part of the town of Saroma, Hokkaido in November, 2006. (author)
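
    For orientation only, a minimal flight-model sketch follows. It is not TONBOS and omits Fujita's wind field and the ground-effect lift model described above: a lumped object is accelerated by gravity and by drag from a uniform horizontal wind, with all parameter values assumed for illustration.

```python
import numpy as np

rho_air = 1.2          # kg/m^3, air density
cd_a    = 3.0          # drag coefficient times reference area, m^2 (assumed)
mass    = 1200.0       # kg (assumed, roughly a compact car treated as a point mass)
g       = np.array([0.0, 0.0, -9.81])
wind    = np.array([60.0, 0.0, 0.0])   # m/s, uniform horizontal wind (assumed)

def fly(v0, x0, dt=0.01, t_end=5.0):
    """Explicit-Euler integration of dx/dt = v, m dv/dt = m g + 0.5 rho CdA |w - v| (w - v)."""
    x, v = np.array(x0, float), np.array(v0, float)
    track = [x.copy()]
    for _ in range(int(t_end / dt)):
        rel = wind - v
        drag = 0.5 * rho_air * cd_a * np.linalg.norm(rel) * rel / mass
        v = v + (g + drag) * dt
        x = x + v * dt
        track.append(x.copy())
        if x[2] < 0.0:          # stop once the object returns to the ground
            break
    return np.array(track)

path = fly(v0=[0.0, 0.0, 5.0], x0=[0.0, 0.0, 1.0])
print("flight distance ~", path[-1, 0], "m; peak height ~", path[:, 2].max(), "m")
```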

  8. Chart validation of inpatient ICD-9-CM administrative diagnosis codes for acute myocardial infarction (AMI) among intravenous immune globulin (IGIV) users in the Sentinel Distributed Database.

    Science.gov (United States)

    Ammann, Eric M; Schweizer, Marin L; Robinson, Jennifer G; Eschol, Jayasheel O; Kafa, Rami; Girotra, Saket; Winiecki, Scott K; Fuller, Candace C; Carnahan, Ryan M; Leonard, Charles E; Haskins, Cole; Garcia, Crystal; Chrischilles, Elizabeth A

    2018-02-15

    The Sentinel Distributed Database (SDD) is a large database of patient-level administrative health care records, primarily derived from insurance claims and electronic health records, and is sponsored by the US Food and Drug Administration for medical product safety evaluations. Acute myocardial infarction (AMI) is a common study endpoint for drug safety studies that rely on health records from the SDD and other administrative databases. In this chart validation study, we report on the positive predictive value (PPV) of inpatient International Classification of Diseases, Ninth Revision, Clinical Modification AMI administrative diagnosis codes (410.x1 and 410.x0) in the SDD. As part of an assessment of thromboembolic adverse event risk following treatment with intravenous immune globulin, charts were obtained for 103 potential post-intravenous immune globulin AMI cases. Charts were abstracted by trained nurses and physician-adjudicated based on prespecified diagnostic criteria. Acute myocardial infarction status could be determined for 89 potential cases. The PPVs for the inpatient AMI diagnoses recorded in the SDD were 75% overall (95% CI, 65-84%), 93% (95% CI, 78-99%) for principal-position diagnoses, 88% (95% CI, 72-97%) for secondary diagnoses, and 38% (95% CI, 20-59%) for position-unspecified diagnoses (eg, diagnoses originating from separate physician claims associated with an inpatient stay). Of the confirmed AMI cases, demand ischemia was the suspected etiology more often for those coded in secondary or unspecified positions (72% and 40%, respectively) than for principal-position AMI diagnoses (21%). The PPVs for principal and secondary AMI diagnoses were high and similar to estimates from prior chart validation studies. Position-unspecified diagnosis codes were less likely to represent true AMI cases. Copyright © 2018 John Wiley & Sons, Ltd.

  9. TECATE - a code for anisotropic thermoelasticity in high-average-power laser technology. Phase 1 final report

    International Nuclear Information System (INIS)

    Gelinas, R.J.; Doss, S.K.; Carlson, N.N.

    1985-01-01

    This report describes a totally Eulerian code for anisotropic thermoelasticity (code name TECATE) which may be used in evaluations of prospective crystal media for high-average-power lasers. The present TECATE code version computes steady-state distributions of material temperatures, stresses, strains, and displacement fields in 2-D slab geometry. Numerous heat source and coolant boundary condition options are available in the TECATE code for laser design considerations. Anisotropic analogues of plane stress and plane strain evaluations can be executed for any and all crystal symmetry classes. As with all new and/or large physics codes, it is likely that some code imperfections will emerge at some point in time

  10. TECATE - a code for anisotropic thermoelasticity in high-average-power laser technology. Phase 1 final report

    Energy Technology Data Exchange (ETDEWEB)

    Gelinas, R.J.; Doss, S.K.; Carlson, N.N.

    1985-01-01

    This report describes a totally Eulerian code for anisotropic thermoelasticity (code name TECATE) which may be used in evaluations of prospective crystal media for high-average-power lasers. The present TECATE code version computes steady-state distributions of material temperatures, stresses, strains, and displacement fields in 2-D slab geometry. Numerous heat source and coolant boundary condition options are available in the TECATE code for laser design considerations. Anisotropic analogues of plane stress and plane strain evaluations can be executed for any and all crystal symmetry classes. As with all new and/or large physics codes, it is likely that some code imperfections will emerge at some point in time.

  11. Measuring technology self efficacy: reliability and construct validity of a modified computer self efficacy scale in a clinical rehabilitation setting.

    Science.gov (United States)

    Laver, Kate; George, Stacey; Ratcliffe, Julie; Crotty, Maria

    2012-01-01

    To describe a modification of the computer self efficacy scale for use in clinical settings and to report on the modified scale's reliability and construct validity. The computer self efficacy scale was modified to make it applicable for clinical settings (for use with older people or people with disabilities using everyday technologies). The modified scale was piloted, then tested with patients in an Australian inpatient rehabilitation setting (n = 88) to determine the internal consistency using Cronbach's alpha coefficient. Construct validity was assessed by correlation of the scale with age and technology use. Factor analysis using principal components analysis was undertaken to identify important constructs within the scale. The modified computer self efficacy scale demonstrated high internal consistency with a standardised alpha coefficient of 0.94. Two constructs within the scale were apparent; using the technology alone, and using the technology with the support of others. Scores on the scale were correlated with age and frequency of use of some technologies thereby supporting construct validity. The modified computer self efficacy scale has demonstrated reliability and construct validity for measuring the self efficacy of older people or people with disabilities when using everyday technologies. This tool has the potential to assist clinicians in identifying older patients who may be more open to using new technologies to maintain independence.

  12. The Cubesat Radiometer Radio Frequency Interference Technology Validation (CubeRRT) Mission

    Science.gov (United States)

    Misra, S.; Johnson, J. T.; Ball, C.; Chen, C. C.; Smith, G.; McKelvey, C.; Andrews, M.; O'Brien, A.; Kocz, J.; Jarnot, R.; Brown, S. T.; Piepmeier, J. R.; Lucey, J.; Miles, L. R.; Bradley, D.; Mohammed, P.

    2016-12-01

    Passive microwave measurements made below 40GHz have experienced increased amounts of man-made radio frequency interference (RFI) over the past couple of decades. Such RFI has had a detrimental impact on various important geophysical retrievals such as soil moisture, sea-surface salinity, atmospheric water vapor, precipitation, etc. The commercial demand for spectrum allocation has increased over the past couple of years, infringing on frequencies traditionally reserved for scientific uses such as Earth observation at passive microwave frequencies. With the current trend in shared spectrum allocations, future microwave radiometers will have to co-exist with terrestrial RFI sources. The CubeSat Radiometer Radio Frequency Interference Technology Validation (CubeRRT) mission is developing a 6U Cubesat system to demonstrate RFI detection and filtering technologies for future microwave radiometer remote sensing missions. CubeRRT will operate between 6-40GHz, and demonstrate on-board real-time RFI detection on Earth brightness temperatures tuned over 1GHz steps. The expected launch date for CubeRRT is early 2018. Digital subsystems for higher frequency microwave radiometry require a larger bandwidth, as well as more processing power and on-board operation capabilities for RFI filtering. Real-time and on-board RFI filtering technology development is critical for future missions to allow manageable downlink data volumes. The enabling CubeRRT technology is a digital FPGA-based spectrometer with a bandwidth of 1 GHz that is capable of implementing advanced RFI filtering algorithms that use the kurtosis and cross-frequency RFI detection methods in real-time on board the spacecraft. The CubeRRT payload consists of 3 subsystems: a wideband helical antenna, a tunable analog radiometer subsystem, and a digital backend. The presentation will provide an overview of the system and results from the latest integration and test.
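
    The kurtosis method mentioned above flags blocks of pre-detection radiometer samples whose fourth moment departs from the Gaussian value of 3, since natural thermal emission is Gaussian while most man-made signals are not. A minimal sketch follows with an invented detection threshold and synthetic pulsed interference; it is not the CubeRRT flight algorithm.

```python
import numpy as np

def kurtosis_flag(samples: np.ndarray, threshold: float = 0.3) -> bool:
    """Flag a block of samples as RFI if its kurtosis deviates from 3 by more than `threshold`."""
    x = samples - samples.mean()
    kurt = np.mean(x ** 4) / np.mean(x ** 2) ** 2
    return bool(abs(kurt - 3.0) > threshold)

rng = np.random.default_rng(1)
noise = rng.normal(0.0, 1.0, 4096)                        # natural thermal emission (Gaussian)
pulsed_rfi = noise + (rng.random(4096) < 0.01) * 8.0      # 1% duty-cycle pulsed interferer
print(kurtosis_flag(noise), kurtosis_flag(pulsed_rfi))    # expect False, True
```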

  13. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  14. The use of a cubesat to validate technological bricks in space

    Science.gov (United States)

    Rakotonimbahy, E.; Vives, S.; Dohlen, K.; Savini, G.; Iafolla, V.

    2017-11-01

    In the framework of the FP7 program FISICA (Far Infrared Space Interferometer Critical Assessment), we are developing a cubesat platform which will be used for the validation in space of two technological bricks relevant for FIRI. The first brick is a high-precision accelerometer which could be used in a future space mission as fundamental element for the dynamic control loop of the interferometer. The second brick is a miniaturized version of an imaging multi-aperture telescope. Ultimately, such an instrument could be composed of numerous space-born mirror segments flying in precise formation on baselines of hundreds or thousands of meters, providing high-resolution glimpses of distant worlds. We are proposing to build a very first space-born demonstrator of such an instrument which will fit into the limited resources of one cubesat. In this paper, we will describe the detailed design of the cubesat hosting the two payloads.

  15. A systematic review of the characteristics and validity of monitoring technologies to assess Parkinson's disease.

    Science.gov (United States)

    Godinho, Catarina; Domingos, Josefa; Cunha, Guilherme; Santos, Ana T; Fernandes, Ricardo M; Abreu, Daisy; Gonçalves, Nilza; Matthews, Helen; Isaacs, Tom; Duffen, Joy; Al-Jawad, Ahmed; Larsen, Frank; Serrano, Artur; Weber, Peter; Thoms, Andrea; Sollinger, Stefan; Graessner, Holm; Maetzler, Walter; Ferreira, Joaquim J

    2016-03-12

    There is growing interest in having objective assessment of health-related outcomes using technology-based devices that provide unbiased measurements which can be used in clinical practice and scientific research. Many studies have investigated the clinical manifestations of Parkinson's disease using such devices. However, clinimetric properties and clinical validation vary among the different devices. Given such heterogeneity, we sought to perform a systematic review in order to (i) list, (ii) compare and (iii) classify technology-based devices used to measure motor function in individuals with Parkinson's disease into three groups, namely wearable, non-wearable and hybrid devices. A systematic literature search of the PubMed database resulted in the inclusion of 168 studies. These studies were grouped based on the type of device used. For each device we reviewed availability, use, reliability, validity, and sensitivity to change. The devices were then classified as (i) 'recommended', (ii) 'suggested' or (iii) 'listed' based on the following criteria: (1) used in the assessment of Parkinson's disease (yes/no), (2) used in published studies by people other than the developers (yes/no), and (3) successful clinimetric testing (yes/no). Seventy-three devices were identified: 22 were wearable, 38 were non-wearable, and 13 were hybrid devices. In accordance with our classification method, 9 devices were 'recommended', 34 devices were 'suggested', and 30 devices were classified as 'listed'. Within the wearable devices group, the Mobility Lab sensors from Ambulatory Parkinson's Disease Monitoring (APDM), Physilog®, StepWatch 3, TriTrac RT3 Triaxial accelerometer, McRoberts DynaPort, and Axivity (AX3) were classified as 'recommended'. Within the non-wearable devices group, the Nintendo Wii Balance Board and GAITRite® gait analysis system were classified as 'recommended'. Within the hybrid devices group only the Kinesia® system was classified as 'recommended'.
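
    As a small illustration, one plausible reading of the three-criterion grouping can be written as a classification rule; the abstract lists the criteria but not the exact mapping to 'suggested' and 'listed', so the rule below is an assumption rather than the review's definition.

```python
def classify_device(used_in_pd: bool, used_beyond_developers: bool, clinimetrics_ok: bool) -> str:
    """Hypothetical mapping of the three yes/no criteria onto the review's categories."""
    if used_in_pd and used_beyond_developers and clinimetrics_ok:
        return "recommended"                     # all three criteria met
    if used_in_pd and (used_beyond_developers or clinimetrics_ok):
        return "suggested"                       # used in PD plus one further criterion
    return "listed"

print(classify_device(True, True, True))    # recommended
print(classify_device(True, True, False))   # suggested
print(classify_device(True, False, False))  # listed
```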

  16. Development and validation of science, technology, engineering and mathematics (STEM) based instructional material

    Science.gov (United States)

    Gustiani, Ineu; Widodo, Ari; Suwarma, Irma Rahma

    2017-05-01

    This study examines the development and validation of simple machines instructional material developed on the basis of a Science, Technology, Engineering and Mathematics (STEM) framework that provides guidance to help students learn and practice for real life and enables individuals to use the knowledge and skills they need to be informed citizens. The sample of this study consists of one class of 8th graders at a junior secondary school in Bandung, Indonesia. To measure student learning, a pre-test and post-test were given before and after implementation of the STEM-based instructional material. In addition, a readability questionnaire was given to examine the clarity and difficulty level of each page of the instructional material. A questionnaire on students' response towards the instructional material was given to students and teachers at the end of the reading session to measure the layout, content and utility aspects of the instructional material for use in the junior secondary school classroom setting. The results show that the readability of the STEM-based instructional material and the students' responses towards it are categorized as very high. Pre-test and post-test responses revealed that students retained significant amounts of information upon completion of the STEM instructional material. The overall student learning gain is 0.67, which is categorized as moderate. In summary, the STEM-based instructional material that was developed is valid enough to be used as educational material for conducting effective STEM education.
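
    The reported learning gain of 0.67 is consistent with a Hake-style normalized gain, a common way to express pre/post improvement. A minimal sketch follows with illustrative pre/post class averages; the abstract does not give the raw scores, so the numbers below are invented.

```python
def normalized_gain(pre: float, post: float) -> float:
    """Hake-style normalized gain g = (post - pre) / (100 - pre), with scores in percent."""
    return (post - pre) / (100.0 - pre)

def gain_category(g: float) -> str:
    """Common interpretation: < 0.3 low, 0.3-0.7 moderate, > 0.7 high."""
    if g < 0.3:
        return "low"
    return "moderate" if g <= 0.7 else "high"

g = normalized_gain(pre=40.0, post=80.0)    # illustrative class averages, not study data
print(round(g, 2), gain_category(g))         # 0.67 moderate
```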

  17. Implementation of a Transition Model in a NASA Code and Validation Using Heat Transfer Data on a Turbine Blade

    Science.gov (United States)

    Ameri, Ali A.

    2012-01-01

    The purpose of this report is to summarize and document the work done to enable a NASA CFD code to model the laminar-turbulent transition process on an isolated turbine blade. The ultimate purpose of the present work is to down-select a transition model that would allow the flow simulation of a variable speed power turbine to be accurately performed. The flow modeling in its final form will account for the blade row interactions and their effects on transition, which would lead to accurate accounting for losses. The present work only concerns itself with steady flows of variable inlet turbulence. The low-Reynolds-number k-ω model of Wilcox and a modified version of the same model will be used for modeling of transition on experimentally measured blade pressure and heat transfer. It will be shown that the k-ω model and its modified variant fail to simulate the transition with any degree of accuracy. A case is thus made for the adoption of more accurate transition models. Three-equation models based on the work of Mayle on Laminar Kinetic Energy were explored. The three-equation model of Walters and Leylek was thought to be in a relatively mature state of development and was implemented in the Glenn-HT code. Two-dimensional heat transfer predictions of flat plate flow and two-dimensional and three-dimensional heat transfer predictions on a turbine blade were performed and reported herein. Surface heat transfer rate serves as a sensitive indicator of transition. With the newly implemented model, it was shown that the simulation of the transition process is much improved over the baseline k-ω model for the single Reynolds number and pressure ratio attempted, while agreement with heat transfer data became more satisfactory. Armed with the new transition model, total-pressure losses of the computed three-dimensional flow of the E3 tip section cascade were compared to the experimental data for a range of incidence angles. The results obtained form a partial loss bucket for the chosen blade

  18. Validation and Assessment of a Technology Familiarity Score in Patients Attending a Symptomatic Breast Clinic.

    Science.gov (United States)

    O'Brien, C; Kelly, J; Lehane, E A; Livingstone, V; Cotter, B; Butt, A; Kelly, L; Corrigan, M A

    2015-10-01

    New media technologies (computers, mobile phones and the internet) have the potential to transform the healthcare information needs of patients with breast disease (Ferlay et al. in Eur J Cancer 49:1374-1403, 2013). However, patients' current level of use and their willingness to accept new media for education and communication remain unknown. This was a single-centre clinic-based prospective cross-sectional study. A previously developed instrument was modified, validated and tested on patients attending a symptomatic breast clinic. The instrument was evaluated on 200 symptomatic breast patients. The commonest outlets for education were staff (95 %), leaflets (69 %) and websites (59 %). Websites were more likely to be consulted by younger patients. Patients new to the clinic were more likely to find text messaging and emailing useful, and younger patients were more likely to rate text messages, apps, websites and email as useful media. Overall, new media technology use among breast patients is expanding as expected along generational trends. As such, its further integration into healthcare systems can potentially ameliorate patient education and communication.

  19. A study of longwave radiation codes for climate studies: Validation with ARM observations and tests in general circulation models

    International Nuclear Information System (INIS)

    Ellingson, R.G.; Baer, F.

    1993-01-01

    This report summarizes the activities of our group to meet our stated objectives. The report is divided into sections entitled: Radiation Model Testing Activities, General Circulation Model Testing Activities, Science Team Activities, and Publications, Presentations and Meetings. The section on Science Team Activities summarizes our participation with the science team to further advance the observation and modeling programs. Appendix A lists graduate students supported and post-doctoral appointments during the project. Reports on the activities during each of the first two years are included as Appendix B. Significant progress has been made in: determining the ability of line-by-line radiation models to calculate the downward longwave flux at the surface; determining the uncertainties in calculating the downwelling radiance and flux at the surface associated with the use of different proposed profiling techniques; intercomparing clear-sky radiance and flux observations with calculations from radiation codes from different climate models; determining the uncertainties associated with estimating N* from surface longwave flux observations; and determining the sensitivity of model calculations to different formulations of the effects of finite-sized clouds

  20. Real Virtuality: A Code of Ethical Conduct. Recommendations for Good Scientific Practice and the Consumers of VR-Technology

    Directory of Open Access Journals (Sweden)

    Michael eMadary

    2016-02-01

    Full Text Available The goal of this article is to present a first list of ethical concerns that may arise from research and personal use of virtual reality (VR) and related technology, and to offer concrete recommendations for minimizing those risks. Many of the recommendations call for focused research initiatives. In the first part of the article, we discuss the relevant evidence from psychology that motivates our concerns. In section 1.1, we cover some of the main results suggesting that one’s environment can influence one’s psychological states, as well as recent work on inducing illusions of embodiment. Then, in section 1.2, we go on to discuss recent evidence indicating that immersion in VR can have psychological effects that last after leaving the virtual environment. In the second part of the article we turn to the risks and recommendations. We begin, in section 2.1, with the research ethics of VR, covering six main topics: the limits of experimental environments, informed consent, clinical risks, dual-use, online research, and a general point about the limitations of a code of conduct for research. Then, in section 2.2, we turn to the risks of VR for the general public, covering four main topics: long-term immersion, neglect of the social and physical environment, risky content, and privacy. We offer concrete recommendations for each of these ten topics, summarized in Table 1.

  1. Development and Validation of Web-Based Courseware for Junior Secondary School Basic Technology Students in Nigeria

    OpenAIRE

    Amosa Isiaka Gambari

    2018-01-01

    This research aimed to develop and validate a web-based courseware for junior secondary school basic technology students in Nigeria. In this study, a mixed method quantitative pilot study design with qualitative components was used to test and ascertain the ease of development and validation of the web-based courseware. Dick and Carey instructional system design model was adopted for developing the courseware. Convenience sampling technique was used in selecting the three content, computer an...

  2. Test and validation of CFD codes for the simulation of accident-typical phenomena in the reactor containment; Erprobung und Validierung von CFD-Codes fuer die Simulation von unfalltypischen Phaenomenen im Sicherheitseinschluss

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, Berthold; Stewering, Joern; Sonnenkalb, Martin

    2014-03-15

    CFD (Computational Fluid Dynamics) simulation techniques have a growing relevance for the simulation and assessment of accidents in nuclear reactor containments. Some fluid dynamic problems, like the calculation of flow resistances in a complex geometry, turbulence calculations or the calculation of deflagrations, can only be solved exactly for very simple cases. These fluid dynamic problems cannot be represented by lumped parameter models and must be approximated numerically. Therefore, CFD techniques are discussed by a growing international community in conferences like the CFD4NRS conference, and the number of articles with a CFD topic is increasing in professional journals like Nuclear Engineering and Design. CFD tools like GASFLOW or GOTHIC are already in use in European nuclear site licensing processes for future nuclear power plants like the EPR or AP1000, and the results of these CFD tools are accepted by the authorities. For these reasons it seems necessary to build up national competences in the field of CFD techniques, and it is important to validate and assess the existing CFD tools. GRS continues the work for the validation and assessment of CFD codes for the simulation of accident scenarios in a nuclear reactor containment within the framework of the BMWi-sponsored project RS1500. The focus of this report is on the following topics: - Further validation of condensation models from GRS, FZJ and ANSYS and development of a new condensate model. - Validation of a new turbulence model which was developed by the University of Stuttgart in cooperation with ANSYS. - The formation and dissolution of light gas stratifications are analyzed by large scale experiments; these experiments were simulated by GRS. - The AREVA correlations for hydrogen recombiners (PARs) could be improved by GRS after the analysis of experimental data, and relevant experiments were simulated with this improved recombiner correlation. - Analyses on the simulation of H2

  3. Internal dosimetry with the Monte Carlo code GATE: validation using the ICRP/ICRU female reference computational model

    Science.gov (United States)

    Villoing, Daphnée; Marcatili, Sara; Garcia, Marie-Paule; Bardiès, Manuel

    2017-03-01

    The purpose of this work was to validate GATE-based clinical scale absorbed dose calculations in nuclear medicine dosimetry. GATE (version 6.2) and MCNPX (version 2.7.a) were used to derive dosimetric parameters (absorbed fractions, specific absorbed fractions and S-values) for the reference female computational model proposed by the International Commission on Radiological Protection in ICRP report 110. Monoenergetic photons and electrons (from 50 keV to 2 MeV) and four isotopes currently used in nuclear medicine (fluorine-18, lutetium-177, iodine-131 and yttrium-90) were investigated. Absorbed fractions, specific absorbed fractions and S-values were generated with GATE and MCNPX for 12 regions of interest in the ICRP 110 female computational model, thereby leading to 144 source/target pair configurations. Relative differences between GATE and MCNPX obtained in specific configurations (self-irradiation or cross-irradiation) are presented. Relative differences in absorbed fractions, specific absorbed fractions or S-values are below 10%, and in most cases less than 5%. Dosimetric results generated with GATE for the 12 volumes of interest are available as supplemental data. GATE can be safely used for radiopharmaceutical dosimetry at the clinical scale. This makes GATE a viable option for Monte Carlo modelling of both imaging and absorbed dose in nuclear medicine.
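
    The dosimetric chain described above rests on the MIRD formalism, in which an S-value is the absorbed dose per decay obtained by weighting each emission by its yield, energy and specific absorbed fraction. The short Python sketch below only illustrates that bookkeeping; the emission spectrum and the flat SAF value are made-up placeholders, not data from the paper or from GATE/MCNPX.

    # Sketch of the MIRD relation S(target <- source) = sum_i y_i * E_i * SAF_i.
    # Emission data and the flat SAF below are illustrative placeholders only.
    MEV_TO_J = 1.602176634e-13  # joules per MeV

    def s_value(emissions, saf):
        """Absorbed dose per decay (Gy/(Bq s)) for one source/target pair.

        emissions: list of (yield_per_decay, energy_MeV)
        saf:       function energy_MeV -> specific absorbed fraction (1/kg)
        """
        return sum(y * e * MEV_TO_J * saf(e) for y, e in emissions)

    # Hypothetical two-line emission spectrum and a constant SAF of 0.05 kg^-1
    example_emissions = [(0.9, 0.364), (0.6, 0.192)]
    print(f"S = {s_value(example_emissions, lambda e: 0.05):.3e} Gy per decay")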

  4. Integration of electronic nose technology with spirometry: validation of a new approach for exhaled breath analysis.

    Science.gov (United States)

    de Vries, R; Brinkman, P; van der Schee, M P; Fens, N; Dijkers, E; Bootsma, S K; de Jongh, F H C; Sterk, P J

    2015-10-15

    New 'omics'-technologies have the potential to better define airway disease in terms of pathophysiological and clinical phenotyping. The integration of electronic nose (eNose) technology with existing diagnostic tests, such as routine spirometry, can bring this technology to 'point-of-care'. We aimed to determine and optimize the technical performance and diagnostic accuracy of exhaled breath analysis linked to routine spirometry. Exhaled breath was collected in triplicate in healthy subjects by an eNose (SpiroNose) based on five identical metal oxide semiconductor sensor arrays (three arrays monitoring exhaled breath and two reference arrays monitoring ambient air) at the rear end of a pneumotachograph. First, the influence of flow, volume, humidity, temperature, environment, etc., was assessed. Secondly, a two-centre case-control study was performed using diagnostic and monitoring visits in day-to-day clinical care in patients with a (differential) diagnosis of asthma, chronic obstructive pulmonary disease (COPD) or lung cancer. Breathprint analysis involved signal processing, environment correction based on alveolar gradients and statistics based on principal component (PC) analysis, followed by discriminant analysis (Matlab2014/SPSS20). Expiratory flow showed a significant linear correlation with raw sensor deflections (R² = 0.84) in 60 healthy subjects (age 43 ± 11 years). No correlation was found between sensor readings and exhaled volume, humidity and temperature. Exhaled data after environment correction were highly reproducible for each sensor array (Cohen's Kappa 0.81-0.94). Thirty-seven asthmatics (41 ± 14.2 years), 31 COPD patients (66 ± 8.4 years), 31 lung cancer patients (63 ± 10.8 years) and 45 healthy controls (41 ± 12.5 years) entered the cross-sectional study. SpiroNose could adequately distinguish between controls, asthma, COPD and lung cancer patients with cross-validation values
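
    As a rough illustration of the statistical chain mentioned above (principal component analysis followed by discriminant analysis with cross-validation), the sketch below assembles an equivalent scikit-learn pipeline on synthetic sensor data. The original work used Matlab2014/SPSS20; the array sizes, class labels and parameters here are assumptions for illustration only.

    # Equivalent scikit-learn pipeline for PCA + discriminant analysis with
    # cross-validation; the sensor matrix and labels are synthetic stand-ins.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(144, 7))      # 144 subjects x 7 sensor features (synthetic)
    y = rng.integers(0, 4, size=144)   # 4 classes: control / asthma / COPD / lung cancer

    clf = make_pipeline(StandardScaler(), PCA(n_components=4), LinearDiscriminantAnalysis())
    scores = cross_val_score(clf, X, y, cv=5)
    print("cross-validated accuracy:", scores.mean())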

  5. Development of LMR basic design technology - Development of 3-D multi-group nodal kinetics code for liquid metal reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myung Hyun [Kyunghee University, Seoul (Korea, Republic of)

    1996-07-01

    A development project of a 3-dimensional kinetics code for the ALMR has three levels of work. At the first level, a multi-group nodal kinetics code for HEX-Z geometry has been developed. The code showed very good results for static analysis as well as for kinetics problems. At the second level, a core thermal-hydraulic analysis code was developed for the temperature feedback calculation in ALMR transient analysis. This code is coupled with the kinetics code. A sodium property table was programmed and tested against the KAERI data, and a thermal feedback model was developed and coupled into the code. Benchmarking of the thermal-hydraulic calculation has been performed and showed fairly good results. At the third level of the research work, a reactivity feedback model for structure thermal expansion was developed and added to the code. At present, a basic model has been studied; however, code development is still ongoing. Benchmarking of the developed model could not be done because of a lack of data. 31 refs., 17 tabs., 38 figs. (author)
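
    The coupling between neutron kinetics and thermal feedback described above can be illustrated, in a drastically simplified form, by point kinetics with a single delayed-neutron group and a linear fuel-temperature reactivity coefficient; the sketch below is only that illustration, with generic placeholder constants rather than ALMR or HEX-Z nodal data.

    # Point kinetics with one delayed-neutron group and a linear fuel-temperature
    # reactivity feedback; all constants are generic placeholders, not ALMR data.
    beta, lam, Lambda = 0.0035, 0.08, 4.0e-7   # delayed fraction, decay const (1/s), generation time (s)
    alpha = -2.0e-5                            # reactivity per kelvin (negative feedback)
    heat_cap, cool = 50.0, 1.0                 # lumped fuel heat capacity and cooling coefficients

    n, C, T = 1.0, beta / (lam * Lambda), 600.0   # equilibrium initial state
    T0, rho_ext, dt = T, 0.001, 1.0e-5            # +100 pcm step insertion, time step (s)

    for _ in range(int(2.0 / dt)):                # 2 s transient, explicit Euler
        rho = rho_ext + alpha * (T - T0)
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        dT = ((n - 1.0) - cool * (T - T0)) / heat_cap   # excess power heats the fuel
        n, C, T = n + dt * dn, C + dt * dC, T + dt * dT

    print(f"relative power after 2 s: {n:.3f}, fuel temperature: {T:.2f} K")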

  6. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  7. Calculation of dose rate in escape channel of Research Irradiating Facility Army Technology Center using code MCNPX

    International Nuclear Information System (INIS)

    Gomes, Renato G.; Rebello, Wilson F.; Vellozo, Sergio O.; Moreira Junior, Luis; Vital, Helio C.; Rusin, Tiago; Silva, Ademir X.

    2013-01-01

    In order to evaluate new lines of research involving the irradiation of materials outside the research irradiator of the Army Technology Center (CTEx), it is necessary to study safety parameters and the magnitude of the dose rates from its escape channels. The objective was to calculate, with the MCNPX code, the dose rates (Gy/min) inside and outside the four gamma leakage channels of the irradiator. The channels were designed to leak radiation onto materials suitably placed in the area outside the irradiator, for items larger than the expected volume of the irradiation chambers (50 liters). This study aims to assess the magnitude of the dose rates within the channels, to calculate the opening angle of the beam outside each channel in order to analyse its spread, and to evaluate the safety conditions of the operators (radiological protection). The computer simulation was performed by distributing virtual ferrous sulfate (Fricke) dosimeters along the longitudinal axes of the vertical escape channels (anterior and posterior) and the horizontal ones (top and bottom). The results showed that the beams emerging from each channel are collimated towards the outside, with values of the order of tenths of Gy/min compared with the maximum value of 33 Gy/min in the irradiation chamber. The external beams from the two vertical channels showed a distribution shaped like a truncated pyramid, not collimated but scattered, with an opening angle of 83° in the longitudinal direction and 88° in the transverse direction. Thus, the cases allowed the evaluation of materials for irradiation outside the irradiator in terms of the magnitude of the dose rates and the positioning of the materials, and made it possible to take the necessary care in mounting shielding for the radiological protection of the operators, avoiding exposure to ionizing radiation. (author)
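
    The quoted opening angles allow a quick back-of-the-envelope estimate of how wide the escaping beam becomes with distance: for a full opening angle θ, the footprint at distance d is roughly 2·d·tan(θ/2). The sketch below applies this to the reported 83° and 88° angles; the 0.5 m distance is an arbitrary example, not a value from the study.

    # Beam footprint of a diverging beam with full opening angle theta at distance d.
    import math

    def footprint(full_angle_deg, distance_m):
        return 2.0 * distance_m * math.tan(math.radians(full_angle_deg) / 2.0)

    d = 0.5  # metres from the channel exit (hypothetical distance)
    print(f"longitudinal width (83 deg): {footprint(83, d):.2f} m")
    print(f"transverse width   (88 deg): {footprint(88, d):.2f} m")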

  8. Target validation for FCV technology development in Japan from energy competition point of view

    International Nuclear Information System (INIS)

    ENDO Eiichi

    2006-01-01

    The objective of this work is to validate the technical targets in the governmental hydrogen energy road-map of Japan by analyzing the market penetration of fuel cell vehicles (FCVs) and the effects of fuel price and carbon tax on it, from a technology competition point of view. In this analysis, an energy system model of Japan based on MARKAL is used. The results of the analysis show that hydrogen FCVs could not achieve cost-competitiveness until 2030 without a carbon tax, including under the government's actual carbon tax plan. However, as the carbon tax rate increases, hydrogen FCVs penetrate the market earlier and in greater numbers, in place of conventional vehicles including gasoline hybrid electric vehicles. Assuming a higher fuel price and a more severe carbon tax rate, the market share of hydrogen FCVs approaches the governmental goal. This suggests that a cheaper vehicle cost and/or hydrogen price than those targeted in the road-map is required. At the same time, achievement of the technical targets in the road-map also allows the market penetration target for hydrogen FCVs to be attained under some possible conditions. (authors)

  9. Validating the Technology Acceptance Model in the Context of the Laboratory Information System-Electronic Health Record Interface System

    Science.gov (United States)

    Aquino, Cesar A.

    2014-01-01

    This study represents research validating the efficacy of Davis' Technology Acceptance Model (TAM) by pairing it with the Organizational Change Readiness Theory (OCRT) to develop another extension to the TAM, using the medical Laboratory Information Systems (LIS)--Electronic Health Records (EHR) interface as the medium. The TAM posits that it is…

  10. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment

    Science.gov (United States)

    Hidiroglu, Çaglar Naci; Bukova Güzel, Esra

    2013-01-01

    The aim of the present study is to conceptualize the approaches displayed for the validation of models and the thought processes employed in the mathematical modeling process performed in a technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

  11. Development of the recombinase-based in vivo expression technology in Streptococcus thermophilus and validation using the lactose operon promoter

    NARCIS (Netherlands)

    Junjua, M.; Galia, W.; Gaci, N.; Uriot, O.; Genay, M.; Bachmann, H.; Kleerebezem, M.; Dary, A.; Roussel, Y.

    2014-01-01

    AIMS: To construct and validate the recombinase-based in vivo expression technology (R-IVET) tool in Streptococcus thermophilus (ST). METHODS AND RESULTS: The R-IVET system we constructed in the LMD-9 strain includes the plasmid pULNcreB allowing transcriptional fusion with the gene of the

  12. Development of the recombinase-based in vivo expression technology in Streptococcus thermophilus and validation using the lactose operon promoter

    NARCIS (Netherlands)

    Junjua, M.; Galia, W.; Gaci, N.; Uriot, O.; Genay, M.; Bachmann, H.; Kleerebezem, M.; Dary, A.; Roussel, Y.

    2014-01-01


    Aims

    To construct and validate the recombinase-based in vivo expression technology (R-IVET) tool in Streptococcus thermophilus (ST).

    Methods and Results

    The R-IVET system we constructed in the LMD-9 strain includes the plasmid pULNcreB allowing transcriptional fusion

  13. Validation of the TRACE code for the system dynamic simulations of the molten salt reactor experiment and the preliminary study on the dual fluid molten salt reactor

    Energy Technology Data Exchange (ETDEWEB)

    He, Xun

    2016-06-14

    The Molten Salt Reactor (MSR), confirmed as one of the six Generation IV reactor types by the GIF (Generation IV International Forum) in 2008, has recently drawn a lot of attention all around the world. Due to the use of liquid fuel, the MSR can in a sense be regarded as the most distinctive of those six GEN-IV reactor types. A unique advantage of using liquid nuclear fuel is that core melting accidents can be eliminated entirely. Besides, a molten salt reactor can have several fuel options; for instance, the fuel can be based on the 235U, 232Th-233U or 238U-239Pu cycle, or even on spent nuclear fuel (SNF), so the reactor can be operated as a breeder or as an actinide burner with a fast, thermal or epithermal neutron spectrum, and hence it has excellent features for fuel sustainability and non-proliferation. Furthermore, the lower operating pressure not only means a lower risk of explosion as well as of radioactive leakage, but also implies that the reactor vessel and its components can be lightweight, thus lowering the cost of equipment. So far no commercial MSR has been operated. However, the MSR concept and its technical validation date back to the 1960s and 1970s, when scientists and engineers from ORNL (Oak Ridge National Laboratory) in the United States managed to build and run the world's first civilian molten salt reactor, called the MSRE (Molten Salt Reactor Experiment). The MSRE was an experimental liquid-fueled reactor with 10 MW thermal output using LiF-BeF2-ZrF4-UF4 as both the fuel and the coolant. The MSRE is usually taken as a very important reference case for many current research efforts to validate their codes and simulations, and it also serves as a benchmark for this thesis. The current thesis consists of two main parts. The first part is about the validation of the current code for the old MSRE concept, while the second

  14. Validation of morphing wing methodologies on an unmanned aerial system and a wind tunnel technology demonstrator

    Science.gov (United States)

    Gabor, Oliviu Sugar

    To increase the aerodynamic efficiency of aircraft, in order to reduce fuel consumption, a novel morphing wing concept has been developed. It consists of replacing part of the wing upper and lower surfaces with a flexible skin whose shape can be modified using an actuation system placed inside the wing structure. Numerical studies in two and three dimensions were performed in order to determine the gains the morphing system achieves for the case of an Unmanned Aerial System and for a morphing technology demonstrator based on the wing tip of a transport aircraft. To obtain the optimal wing skin shapes as a function of the flight condition, different global optimization algorithms were implemented, such as the Genetic Algorithm and the Artificial Bee Colony Algorithm. To reduce calculation times, a hybrid method was created by coupling the population-based algorithm with a fast, gradient-based local search method. Validations were performed with commercial state-of-the-art optimization tools and demonstrated the efficiency of the proposed methods. For accurately determining the aerodynamic characteristics of the morphing wing, two new methods were developed, a nonlinear lifting line method and a nonlinear vortex lattice method. Both use strip analysis of the span-wise wing sections to account for the airfoil shape modifications induced by the flexible skin, and can provide accurate results for the wing drag coefficient. The methods do not require the generation of a complex mesh around the wing and are suitable for coupling with optimization algorithms, since their computational time is several orders of magnitude smaller than that of traditional three-dimensional Computational Fluid Dynamics methods. Two-dimensional and three-dimensional optimizations of the Unmanned Aerial System wing equipped with the morphing skin were performed, with the objective of improving its performance for an extended range of flight conditions. The chordwise positions of the internal actuators
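
    The hybrid strategy summarised above (a population-based global search refined by a fast gradient-based local method) can be sketched as follows. SciPy's differential evolution stands in for the genetic and artificial bee colony algorithms of the thesis, and the Rosenbrock function stands in for the aerodynamic cost function; both substitutions are assumptions made only to keep the example self-contained.

    # Population-based global search followed by gradient-based local refinement.
    # differential_evolution and the Rosenbrock function are stand-ins, not the
    # thesis' genetic / artificial bee colony algorithms or aerodynamic solver.
    from scipy.optimize import differential_evolution, minimize, rosen

    bounds = [(-2.0, 2.0)] * 4   # e.g. four actuator displacements (illustrative)

    # Stage 1: coarse population-based exploration (deliberately few iterations)
    coarse = differential_evolution(rosen, bounds, maxiter=30, polish=False, seed=1)

    # Stage 2: fast gradient-based refinement started from the best individual
    refined = minimize(rosen, coarse.x, method="L-BFGS-B", bounds=bounds)

    print("coarse optimum :", coarse.fun)
    print("refined optimum:", refined.fun)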

  15. Identification of mRNA-like non-coding RNAs and validation of a mighty one named MAR in Panax ginseng.

    Science.gov (United States)

    Wang, Meizhen; Wu, Bin; Chen, Chao; Lu, Shanfa

    2015-03-01

    Increasing evidence suggests that long non-coding RNAs (lncRNAs) play significant roles in plants. However, little is known about lncRNAs in Panax ginseng C. A. Meyer, an economically significant medicinal plant species. A total of 3,688 mRNA-like non-coding RNAs (mlncRNAs), a class of lncRNAs, were identified in P. ginseng. Approximately 40% of the identified mlncRNAs were processed into small RNAs, implying their regulatory roles via small RNA-mediated mechanisms. Eleven miRNA-generating mlncRNAs also produced siRNAs, suggesting the coordinated production of miRNAs and siRNAs in P. ginseng. The mlncRNA-derived small RNAs might be 21-, 22-, or 24-nt phased and could be generated from both or only one strand of mlncRNAs, or from super long hairpin structures. A full-length mlncRNA, termed MAR (multiple-function-associated mlncRNA), was cloned. It generated the most abundant siRNAs. The MAR siRNAs were predominantly 24-nt and some of them were distributed in a phased pattern. A total of 228 targets were predicted for 71 MAR siRNAs. Degradome sequencing validated 68 predicted targets involved in diverse metabolic pathways, suggesting the significance of MAR in P. ginseng. Consistently, MAR was detected in all tissues analyzed and responded to methyl jasmonate (MeJA) treatment. It sheds light on the function of mlncRNAs in plants. © 2014 Institute of Botany, Chinese Academy of Sciences.

  16. Development of LEAP-JET code for sodium-water reaction analysis. Validation by sodium-water reaction tests (SWAT-1R)

    International Nuclear Information System (INIS)

    Seino, Hiroshi; Hamada, Hirotsugu

    2004-03-01

    The sodium-water reaction event in an FBR steam generator (SG) has an influence on the safety, economic efficiency, etc. of the plant, so the selection of the design basis leak (DBL) of the SG is considered one of the important matters. Clarification of the sodium-water reaction phenomenon and the development of an analysis model are necessary to estimate the sodium-water reaction event with high accuracy and rationality in selecting the DBL. The reaction jet model is pointed out as a part of the necessary improvements to evaluate the overheating tube rupture of large SGs, since the behavior of overheating tube rupture is largely affected by the reaction jet conditions outside the tube. Therefore, LEAP-JET has been developed as an analysis code for the simulation of sodium-water reactions. This document presents the validation of the LEAP-JET code against the Sodium-Water Reaction Test (SWAT-1R). The following results have been obtained: (1) The reaction rate constant, K, is estimated to lie between 0.001≤K≤0.1 from the LEAP-JET analysis of the SWAT-1R data. (2) The analytical results on the high-temperature region and on the behavior of the reaction consumption (Na, H2O) and products (H2, NaOH, Na2O) are considered to be physically reasonable. (3) The LEAP-JET analysis shows a tendency to overestimate the maximum temperature and the temperature distribution of the reaction jet. (4) In the LEAP-JET analysis, the numerical calculation becomes unstable, especially in meshes containing a very small sodium mass. Therefore, it is necessary to modify the computational algorithm to stabilize it and to obtain the optimum value of K for sodium-water reactions. (author)

  17. Training program for energy conservation in new building construction. Volume III. Energy conservation technology for plan examiners and code administrators. Energy Conservation Technology Series 200

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    Under the sponsorship of the United States Department of Energy, a Model Code for Energy Conservation in New Building Construction has been developed by those national organizations primarily concerned with the development and promulgation of model codes. The technical provisions are based on ASHRAE Standard 90-75 and are intended for use by state and local officials. The subject of regulation of new building construction to assure energy conservation is recognized as one in which code officials have not had previous exposure. It was also determined that application of the model code would be made at varying levels by officials with both a specific requirement for knowledge and a differing degree of prior training in the state-of-the-art. Therefore, a training program and instructional materials were developed for code officials to assist them in the implementation and enforcement of energy efficient standards and codes. The training program for Energy Conservation Technology for Plan Examiners and Code Administrators (ECT Series 200) is presented.

  18. Validation of Align Technology's Treat III digital model superimposition tool and its case application.

    Science.gov (United States)

    Miller, R J; Kuo, E; Choi, W

    2003-01-01

    An assessment of the efficacy and accuracy of three-dimensional computer-based predictive orthodontic systems requires that new methods of treatment analysis be developed and validated. Invisalign is a digitally fabricated, removable orthodontic appliance that has been commercially available since 1999. It is made up of two main components: 1) computerized graphical images of a patient's teeth moving through a series of stages from initial to final position; 2) pressure-formed clear plastic appliances made from stereolithography models of the images in the first component. The manufacturer of Invisalign (Align Technology, Inc.) has created a software tool that can be used to superimpose digital models to evaluate treatment outcomes in three dimensions. Using this software, research was conducted to determine whether a single operator could repeatedly superimpose two identical digital models using 12 selected points from the palatal rugae over 10 trials. The tool was then applied to one subject's orthodontic treatment. EXPERIMENT VARIABLES: The output from this tool includes rotations, translations and morphological changes. For this study, translations and rotations were chosen. The results showed that the digital superimposition was reproducible and that, after multiple trials, the superimposition error decreased. The average error in x, y, z, Rx, Ry and Rz after 10 trials was determined to approach approximately 0.2 mm in translation and less than 1 degree in rotation, with standard deviations of 0.15 mm and 0.7 degrees, respectively. The treatment outcome from a single Invisalign-treated bicuspid extraction case was also evaluated tooth-by-tooth in the x, y, z, Rx, Ry and Rz dimensions. Using the palate as a stable reference seemed to work well, and the evaluation of the single case showed that many, but not all, of the planned movements occurred.
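
    The superimposition step described above is, at its core, a least-squares rigid registration of two landmark sets. The sketch below shows a generic Kabsch/SVD registration of 12 synthetic 'rugae' points and reports the mean residual; it is not Align Technology's Treat III software, only an illustration of the underlying operation.

    # Generic least-squares rigid registration (Kabsch/SVD) of two landmark sets.
    import numpy as np

    def rigid_register(P, Q):
        """Return R, t minimising sum ||R @ P_i + t - Q_i||^2 over all landmarks."""
        p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
        H = (P - p_mean).T @ (Q - q_mean)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = q_mean - R @ p_mean
        return R, t

    rng = np.random.default_rng(0)
    P = rng.normal(size=(12, 3))                          # 12 synthetic rugae landmarks
    true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))     # random orthogonal matrix
    if np.linalg.det(true_R) < 0:                         # make it a proper rotation
        true_R[:, 0] *= -1
    Q = P @ true_R.T + np.array([0.2, -0.1, 0.05])        # rotated + translated copy
    R, t = rigid_register(P, Q)
    residual = np.linalg.norm(P @ R.T + t - Q, axis=1).mean()
    print(f"mean landmark residual after superimposition: {residual:.2e}")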

  19. Validation test of advanced technology for IPV nickel-hydrogen flight cells - Update

    Science.gov (United States)

    Smithrick, John J.; Hall, Stephen W.

    1992-01-01

    Individual pressure vessel (IPV) nickel-hydrogen technology was advanced at NASA Lewis and under Lewis contracts with the intention of improving cycle life and performance. One advancement was to use 26 percent potassium hydroxide (KOH) electrolyte to improve cycle life. Another advancement was to modify the state-of-the-art cell design to eliminate identified failure modes. The modified design is referred to as the advanced design. A breakthrough in the LEO cycle life of IPV nickel-hydrogen cells has been previously reported. The cycle life of boiler plate cells containing 26 percent KOH electrolyte was about 40,000 LEO cycles compared to 3,500 cycles for cells containing 31 percent KOH. The boiler plate test results are in the process of being validated using flight hardware and real time LEO testing. The primary function of the advanced cell is to store and deliver energy for long-term, LEO spacecraft missions. The new features of this design are: (1) use of 26 percent rather than 31 percent KOH electrolyte; (2) use of a patented catalyzed wall wick; (3) use of serrated-edge separators to facilitate gaseous oxygen and hydrogen flow within the cell, while still maintaining physical contact with the wall wick for electrolyte management; and (4) use of a floating rather than a fixed stack (state-of-the-art) to accommodate nickel electrode expansion due to charge/discharge cycling. The significant improvements resulting from these innovations are: extended cycle life; enhanced thermal, electrolyte, and oxygen management; and accommodation of nickel electrode expansion.

  20. Development of LMR basic design technology - Development of 3-D. multi-group nodal kinetics code for liquid metal reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myung Hyun [Kyunghee University, Seoul (Korea, Republic of)

    1995-07-01

    A development project of a 3-dimensional kinetics code for the ALMR has four levels of work. At the first level, a multi-group nodal kinetics code for HEX-Z geometry has been developed. At this point the code showed very good results for static analysis. However, the kinetics routine has not been benchmarked because an exact benchmark problem was not found; for an artificial benchmark problem, the code showed satisfactory results. At the second level, a core thermal-hydraulic analysis code was developed for the temperature feedback calculation in ALMR transient analysis. A sodium property table was programmed and tested against the KAERI data. Benchmarking of the thermal-hydraulic calculation has been performed and showed fairly good results. At the third level of the research work, the two codes are to be combined; a reactivity feedback model for structure thermal expansion is also developed at this stage. The third and fourth levels are planned to be completed next year. At this point, the work is right on schedule. 24 refs., 12 tabs., 15 figs. (author)

  1. Rhodopsin in plasma from patients with diabetic retinopathy - development and validation of digital ELISA by Single Molecule Array (Simoa) technology

    DEFF Research Database (Denmark)

    Petersen, Eva Rabing Brix; Olsen, Dorte Aalund; Christensen, Henry

    2017-01-01

    was therefore to develop and validate a Rhodopsin assay by employing digital ELISA technology, and to investigate whether Rhodopsin concentrations in diabetes patients with DR are elevated compared with diabetes patients without DR. METHODS: A digital ELISA assay using a Simoa HD-1 Analyzer (Quanterix...... patients with or without DR, but significantly increased number of DR patients having concentrations above the LOD. CONCLUSION: We developed and validated a digital ELISA method for quantification of Rhodopsin in plasma but found no statistically significant difference in the plasma concentration...

  2. On the safety and performance demonstration tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and validation and verification of computational codes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Bum; Jeong, Ji Young; Lee, Tae Ho; Kim, Sung Kyun; Euh, Dong Jin; Joo, Hyung Kook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V and V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer codes V and V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program with the integral effect tests facility, STELLA-2, is in the detailed design stage of the design process. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core and the dynamic characteristic test of upper internal structure has been performed for the seismic analysis model for the PGSFR. The performance tests for control rod assemblies (CRAs) have been conducted for control rod drive mechanism driving parts and drop tests of the CRA under scram condition were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR were explained with significant results.

  3. Study of atmospheric stratification influence on pollutants dispersion using a numerical fluid mechanics model. Code-Saturne validation with the Prairie Grass experiment

    International Nuclear Information System (INIS)

    Coulon, Fanny

    2010-09-01

    A validation of Code-Saturne, a computational fluid dynamics model developed by EDF, is proposed for stable atmospheric conditions. The goal is to guarantee the performance of the model in order to use it for impact studies. A comparison with the Prairie Grass field experiment data and with two Gaussian plume models is carried out.

  4. Nanoparticle based bio-bar code technology for trace analysis of aflatoxin B1 in Chinese herbs.

    Science.gov (United States)

    Yu, Yu-Yan; Chen, Yuan-Yuan; Gao, Xuan; Liu, Yuan-Yuan; Zhang, Hong-Yan; Wang, Tong-Ying

    2018-04-01

    A novel and sensitive assay for aflatoxin B1 (AFB1) detection has been developed using the bio-bar code assay (BCA). The method relies on polyclonal antibodies encoded with DNA-modified gold nanoparticles (NPs) and monoclonal antibodies attached to magnetic microparticles (MMPs), with subsequent detection of the amplified target in the form of a bio-bar code using a fluorescent quantitative polymerase chain reaction (FQ-PCR) method. First, NP probes encoded with DNA unique to AFB1 and MMP probes carrying monoclonal antibodies that bind AFB1 specifically were prepared. Then, MMP-AFB1-NP sandwich compounds were formed; dehybridization of the oligonucleotides on the nanoparticle surface allows the presence of AFB1 to be determined by identifying, through FQ-PCR detection, the oligonucleotide sequence released from the NP. A bio-bar code system for detecting AFB1 was thus established; the sensitivity limit was about 10⁻⁸ ng/mL, and, compared with ELISA assays for the same target, AFB1 could be detected at low attomolar levels with the bio-bar-code amplification approach. This is also the first demonstration of a bio-bar-code-type assay for the detection of AFB1 in Chinese herbs. Copyright © 2017. Published by Elsevier B.V.
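
    The final quantification step in an FQ-PCR readout is normally a log-linear standard curve, Ct = slope·log10(amount) + intercept, inverted to read off unknowns. The sketch below shows that step on made-up calibration points; the concentrations, Ct values and the 35-cycle unknown are placeholders, not data from the assay.

    # Log-linear qPCR standard curve; calibration points are made-up numbers.
    import numpy as np

    std_amount = np.array([1e-2, 1e-4, 1e-6, 1e-8])   # standards, ng/mL (hypothetical)
    std_ct = np.array([18.1, 24.8, 31.5, 38.2])       # measured cycle thresholds (hypothetical)

    slope, intercept = np.polyfit(np.log10(std_amount), std_ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0           # amplification efficiency estimate

    def amount_from_ct(ct):
        return 10 ** ((ct - intercept) / slope)

    print(f"slope {slope:.2f}, estimated PCR efficiency {efficiency:.0%}")
    print(f"unknown with Ct = 35.0 -> {amount_from_ct(35.0):.2e} ng/mL equivalent")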

  5. Validating the Technology Proficiency Self-Assessment Questionnaire for 21st Century Learning (TPSA C-21)

    Science.gov (United States)

    Christensen, Rhonda; Knezek, Gerald

    2017-01-01

    Accurately measuring levels of technology proficiency in current and future classroom teachers is an important first step toward enhancing comfort level and confidence in integrating technology into the educational environment. The original Technology Proficiency Self-Assessment (TPSA) survey has maintained respectable psychometric properties for…

  6. Education Technology Standards Self-Efficacy (ETSSE) Scale: A Validity and Reliability Study

    Science.gov (United States)

    Simsek, Omer; Yazar, Taha

    2016-01-01

    Problem Statement: The educational technology standards for teachers set by the International Society for Technology in Education (the ISTE Standards-T) represent an important framework for using technology effectively in teaching and learning processes. These standards are widely used by universities, educational institutions, and schools. The…

  7. Integration of electronic nose technology with spirometry: validation of a new approach for exhaled breath analysis

    NARCIS (Netherlands)

    de Vries, R.; Brinkman, P.; van der Schee, M. P.; Fens, N.; Dijkers, E.; Bootsma, S. K.; de Jongh, F. H. C.; Sterk, P. J.

    2015-01-01

    New 'omics'-technologies have the potential to better define airway disease in terms of pathophysiological and clinical phenotyping. The integration of electronic nose (eNose) technology with existing diagnostic tests, such as routine spirometry, can bring this technology to 'point-of-care'. We

  8. Validity of Business Strategy as Driver in Technology Management – A Critical Discussion

    DEFF Research Database (Denmark)

    Tambo, Torben; Østergaard, Klaus

    2015-01-01

    in connecting technological design tightly to the business strategy. The purpose of this paper is to advance a research agenda, where long-term orientation of technology is connected to the necessary tools for obtaining insight in assessing adequacy, reliability and quality of business strategy and evaluation......Frameworks for technological development are increasingly requiring that technology must be developed in accordance with the corporate business strategy. It is an interesting tendency that technological development should reflect and interact with central change processes of the enterprise...... of alternatives. Alternatives are related to business environments, compliance, infrastructure, knowledge, skills and societal factors....

  9. Achievement Emotions in Technology Enhanced Learning: Development and Validation of Self-Report Instruments in the Italian Context

    Directory of Open Access Journals (Sweden)

    Daniela Raccanello

    2015-02-01

    Full Text Available The increased use of technology within the educational field gives rise to the need for developing valid instruments to measure key constructs associated with performance. We present some self-report instruments developed and/or validated in the Italian context that could be used to assess achievement emotions and correlates, within the theoretical framework of Pekrun’s control-value model. First, we propose some data related to the construction of two instruments developed to assess ten achievement emotions: the Brief Achievement Emotions Questionnaire, BR-AEQ, used with college students, and the Graduated Achievement Emotions Set, GR-AES, used with primary school students. Second, we describe some data concerning the validation within the Italian context of two instruments assessing achievement goals as antecedents of achievement emotions: the Achievement Goal Questionnaire-Revised, AGQ-R, and its more recent version based on the 3 X 2 achievement goal model.

  10. WIAMan Technology Demonstrator Sensor Codes Conforming to International Organization for Standardization/Technical Standard (ISO/TS) 13499

    Science.gov (United States)

    2016-03-01

    lumbar to coccyx interface bracket for collection of each of the ischial tuberosity and pelvic rami load-cells. The MAIN LOCATION code for this region is PELV, and the permissible codes are shown in Table 6. As the ischial tuberosity and pelvic rami load-cells collectively use only 5 channels for... [Table 6 fragments: PELV channel codes such as the right ischial Z force and the left and right pubic rami X and Z bending moments.]

  11. Establishing and evaluating bar-code technology in a blood sampling system: a model based on the human-centered design method.

    Science.gov (United States)

    Chou, Shin-Shang; Yan, Hsiu-Fang; Huang, Hsiu-Ya; Tseng, Kuan-Jui; Kuo, Shu-Chen

    2012-01-01

    This study intended to use a human-centered design method to develop bar-code technology for the blood sampling process. Using multilevel analysis to gather information, the bar-code technology was constructed to verify patient identification, simplify the work process, and prevent medical errors. A Technology Acceptance Model questionnaire was developed to assess the effectiveness of the system, and data on patient identification and sample errors were collected daily. The average score of the 8-item perceived ease of use scale was 25.21 (3.72), of the 9-item perceived usefulness scale 28.53 (5.00), and of the 14-item task-technology fit scale 52.24 (7.09). The rates of patient identification errors and of samples with cancelled orders dropped to zero; however, new errors, related to the position of barcode stickers on the sample tubes, emerged after the new system was deployed. Overall, more than half of the nurses (62.5%) were willing to use the new system.

  12. [Translation and validation of the Quebec User Evaluation of Satisfaction with Assistive Technology (QUEST 2.0) into Portuguese].

    Science.gov (United States)

    de Carvalho, Karla Emanuelle Cotias; Gois Júnior, Miburge Bolívar; Sá, Katia Nunes

    2014-01-01

    To translate and validate the Quebec User Evaluation of Satisfaction with Assistive Technology (QUEST 2.0) into Brazilian Portuguese. Certified translators translated and back-translated QUEST. Content validity (CVI) was determined by 5 experts and, after the final version of B-Quest was obtained, a pre-test was applied to users of manual wheelchairs, walkers and crutches. The psychometric properties were tested to assure the validity of the items and the reliability and stability of the scale. Data were obtained from 121 users of the above-mentioned devices. Our study showed a CVI of 91.66% and a satisfactory factor analysis regarding the two-dimensional structure of the instrument, which ensured the representativeness of the items. Cronbach's alpha for the device items, the service items and the total score of B-Quest was 0.862, 0.717 and 0.826, respectively. Test-retest stability, assessed after a time interval of 2 months, was analyzed using Spearman's correlation test, which showed high correlation (ρ > 0.6) for most items. The study suggests that B-Quest is a reliable, representative, and valid instrument to measure the satisfaction of users of assistive technology in Brazil. Copyright © 2014 Elsevier Editora Ltda. All rights reserved.
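
    The two reliability statistics reported above, Cronbach's alpha and a Spearman test-retest correlation, are easy to reproduce in outline. The sketch below computes both on simulated item responses; the sample size of 121 matches the study, but the scores themselves and the noise level are fabricated for illustration.

    # Cronbach's alpha for internal consistency and Spearman's rho for test-retest
    # stability, computed on simulated item responses (not B-Quest data).
    import numpy as np
    from scipy.stats import spearmanr

    def cronbach_alpha(items):
        """items: subjects x items matrix of scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_var / total_var)

    rng = np.random.default_rng(1)
    latent = rng.normal(size=(121, 1))                       # 121 respondents
    scores_t1 = latent + 0.5 * rng.normal(size=(121, 8))     # 8 'device' items, time 1
    scores_t2 = latent + 0.5 * rng.normal(size=(121, 8))     # same items, retest

    print(f"Cronbach's alpha: {cronbach_alpha(scores_t1):.2f}")
    rho, p = spearmanr(scores_t1.sum(axis=1), scores_t2.sum(axis=1))
    print(f"test-retest Spearman rho: {rho:.2f} (p = {p:.1e})")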

  13. DRACCAR, a new 3D-thermal mechanical computer code to simulate LOCA transient on nuclear power plants. Status of the development and the validation

    International Nuclear Information System (INIS)

    Georges, Repetto; Francois, Jacq; Francois, Barre; Francois, Lamare; Jean-Marc, Ricaud

    2009-01-01

    IRSN is developing the DRACCAR computational software within the scope of its safety analyses on pressurised water reactors (PWR). This software is used to study loss-of-coolant accidents (LOCA) in the reactor core or in a spent fuel storage pool, for example. During such an accident, the coolant vaporises and the fuel rods dry out, which leads to an increase of their temperature, swelling and fuel cladding failure. This swelling is responsible for major blockage in part of the core and can jeopardize the possibility of core cooling by means of back-up systems. The 3D multi-rod software is designed to model a fuel assembly so as to assess rod cooling and the blockage rate caused by deformed rods, taking into account mechanical and thermal interactions between rods. The software can provide a consistent interpretation of the entire experimental database for a 'single-rod' configuration or a 'rod-bundle' configuration with either real or simulant fuel, transpose these results to the reactor scale to determine what kind of research still needs to be conducted and, finally, carry out safety studies. The models developed for this software cover: heat transfers by conduction, convection and radiation; oxidation of Zircaloy elements (cladding, guide tubes, inner shroud layer, etc.) as well as the hydriding process, which can change mechanical properties; thermomechanical behaviour of fuel cladding (deformation and failure), including the bowing phenomenon; thermal-hydraulics at the scale of an assembly (coupled with an appropriate software), including a reflooding model; fuel relocation and release of fission gases. A first version (DRACCAR V1) was delivered in March 2008 and is being validated on the basis of available experimental data (EDGAR, PHEBUS LOCA, PERICLES, REBEKA, HALDEN, etc.). A second version will be released in 2012, for which a coupling, in particular in the frame of the European NURISP project, is planned to the advanced sub-channel thermal-hydraulics code CATHARE

  14. On the Safety and Performance Demonstration Tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and Validation and Verification of Computational Codes

    Directory of Open Access Journals (Sweden)

    Jong-Bum Kim

    2016-10-01

    Full Text Available The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR has been developed and the validation and verification (V&V activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1, produced satisfactory results, which were used for the computer codes V&V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program with the integral effect tests facility, STELLA-2, is in the detailed design stage of the design process. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core and the dynamic characteristic test of upper internal structure has been performed for the seismic analysis model for the PGSFR. The performance tests for control rod assemblies (CRAs have been conducted for control rod drive mechanism driving parts and drop tests of the CRA under scram condition were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR were explained with significant results.

  15. Towards Developing an Industry-Validated Food Technology Curriculum in Afghanistan

    Science.gov (United States)

    Ebner, Paul; McNamara, Kevin; Deering, Amanda; Oliver, Haley; Rahimi, Mirwais; Faisal, Hamid

    2017-01-01

    Afghanistan remains an agrarian country with most analyses holding food production and processing as key to recovery. To date, however, there are no public or private higher education departments focused on food technology. To bridge this gap, Herat University initiated a new academic department conferring BS degrees in food technology. Models for…

  16. Technological Pedagogical Content Knowledge (TPACK): The Development and Validation of an Assessment Instrument for Preservice Teachers

    Science.gov (United States)

    Schmidt, Denise A.; Baran, Evrim; Thompson, Ann D.; Mishra, Punya; Koehler, Matthew J.; Shin, Tae S.

    2009-01-01

    Based in Shulman's idea of Pedagogical Content Knowledge, Technological Pedagogical Content Knowledge (TPACK) has emerged as a useful frame for describing and understanding the goals for technology use in preservice teacher education. This paper addresses the need for a survey instrument designed to assess TPACK for preservice teachers. The paper…

  17. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    , Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  18. Digital Systems Validation Handbook. Volume 2. Chapter 18. Avionic Data Bus Integration Technology

    Science.gov (United States)

    1993-11-01

    interaction between a digital data bus and an avionic system. Very Large Scale Integration (VLSI) ICs and multiversion software, which make up digital...1984, the Sperry Corporation developed a fault tolerant system which employed multiversion programming, voting, and monitoring for error detection and...formulate all the significant behavior of a system. MULTIVERSION PROGRAMMING. N-version programming. N-VERSION PROGRAMMING. The independent coding of a

  19. Global Positioning System Technology (GPS for Psychological Research: A Test of Convergent and Nomological Validity

    Directory of Open Access Journals (Sweden)

    Pedro eWolf

    2013-06-01

    Full Text Available The purpose of this paper is to examine the convergent and nomological validity of a GPS-based measure of daily activity, operationalized as the Number of Places Visited (NPV). Relations among the GPS-based measure and two self-report measures of NPV, as well as relations among NPV and two factors made up of self-reported individual differences, were examined. The first factor was composed of variables related to an Active Lifestyle (AL; e.g. positive affect, extraversion, …) and the second factor was composed of variables related to a Sedentary Lifestyle (SL; e.g. depression, neuroticism, …). NPV was measured over a four-day period comprising two weekdays and two weekend days. A bivariate analysis established one level of convergent validity, and a split-plot GLM examined convergent validity, nomological validity, and alternative hypotheses related to constraints on activity throughout the week simultaneously. The first analysis revealed significant correlations among the NPV measures (weekday, weekend, and the entire four-day block), supporting the convergent validity of the Diary-, Google Maps-, and GPS-NPV measures. Results from the second analysis, indicating non-significant mean differences in NPV regardless of method, also support this conclusion. We also found that AL is a statistically significant predictor of NPV no matter how NPV was measured. We did not find a statistically significant relation between NPV and SL. These results permit us to infer that the GPS-based NPV measure has convergent and nomological validity.
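
    A naive sketch of how an NPV-style count can be derived from raw GPS fixes is to snap each fix to a coarse spatial grid and count the distinct cells visited per day. Real pipelines use proper stay-point detection and dwell-time thresholds; the grid size and the sample coordinates below are assumptions for illustration only.

    # Snap GPS fixes to a coarse grid (~100 m in latitude) and count distinct
    # cells per day; the fixes below are fabricated for illustration.
    from collections import defaultdict

    def places_per_day(fixes, cell_deg=0.001):
        """fixes: iterable of (day, lat, lon) tuples -> {day: NPV}."""
        cells = defaultdict(set)
        for day, lat, lon in fixes:
            cells[day].add((round(lat / cell_deg), round(lon / cell_deg)))
        return {day: len(c) for day, c in cells.items()}

    sample = [("mon", 40.7128, -74.0060), ("mon", 40.7130, -74.0061),  # same place
              ("mon", 40.7306, -73.9352),                              # second place
              ("sat", 40.7580, -73.9855)]
    print(places_per_day(sample))   # {'mon': 2, 'sat': 1}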

  20. The approach to the optimization of the NPP characteristics on a basis of the use of best estimate codes and of information technologies

    Energy Technology Data Exchange (ETDEWEB)

    Vorobyov, Y.B.; Kuznetsov, V.D. [Moscow Power Engineering Institute (Technical University), NPP dept., Moscow (Russian Federation)

    2007-07-01

    In this report, the coupling of information technologies with best estimate codes is considered as a means of increasing NPP safety. The creation of an operator support software tool for the NPP is presented. The aim is the identification of an accident at its beginning and during its development stage, and its control. This method has been applied to WWER-1000/320 reactors; the calculations have shown that the proposed algorithms can be adjusted for the identification of the accident type at practically any stage of the accident development and for finding the optimal controlling influences.

  1. A survey on the high reliability software verification and validation technology for instrumentation and control in NPP

    International Nuclear Information System (INIS)

    Kwon, Kee Choon; Lee, Chang Soo; Dong, In Sook

    1994-01-01

    This document presents the technical status of the software verification and validation (V and V) efforts to support developing and licensing digital instrumentation and control (I and C) systems in nuclear power plants. We have reviewed codes and standards considered to be consensus criteria among vendors, licensees and licensers. We have then described the software licensing procedures under 10 CFR 50 and 10 CFR 52 of the United States to cope with the licensing barrier. Finally, we have surveyed the technical issues related to developing and licensing high integrity software for digital I and C systems. These technical issues indicate the development direction for our own software V and V methodology. (Author) 13 refs., 2 figs.

  2. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory - until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct
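
    The prime codes the book is organised around have a simple textbook construction: for a prime p, each prime sequence s_i(j) = (i·j) mod p is mapped to a binary codeword of length p² with one pulse per block of p chips, giving p codewords of weight p. The sketch below generates such a family; it illustrates only the basic construction, not the extended or non-prime families discussed in the book.

    # Basic prime-code family: for prime p, codeword i has a pulse at position
    # j*p + (i*j mod p) in each of the p blocks, giving length p**2 and weight p.
    def prime_code_family(p):
        family = []
        for i in range(p):
            word = [0] * (p * p)
            for j in range(p):
                word[j * p + (i * j) % p] = 1
            family.append(word)
        return family

    for i, word in enumerate(prime_code_family(5)):   # p = 5: five codewords of length 25
        print(f"C{i}: {''.join(map(str, word))}")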

  3. Fostering creativity in product and service development: validation in the domain of information technology.

    Science.gov (United States)

    Zeng, Liang; Proctor, Robert W; Salvendy, Gavriel

    2011-06-01

    This research is intended to empirically validate a general model of creative product and service development proposed in the literature. A current research gap inspired construction of a conceptual model to capture fundamental phases and pertinent facilitating metacognitive strategies in the creative design process. The model also depicts the mechanism by which design creativity affects consumer behavior. The validity and assets of this model have not yet been investigated. Four laboratory studies were conducted to demonstrate the value of the proposed cognitive phases and associated metacognitive strategies in the conceptual model. Realistic product and service design problems were used in creativity assessment to ensure ecological validity. Design creativity was enhanced by explicit problem analysis, whereby one formulates problems from different perspectives and at different levels of abstraction. Remote association in conceptual combination spawned more design creativity than did near association. Abstraction led to greater creativity in conducting conceptual expansion than did specificity, which induced mental fixation. Domain-specific knowledge and experience enhanced design creativity, indicating that design can be of a domain-specific nature. Design creativity added integrated value to products and services and positively influenced customer behavior. The validity and value of the proposed conceptual model is supported by empirical findings. The conceptual model of creative design could underpin future theory development. Propositions advanced in this article should provide insights and approaches to facilitate organizations pursuing product and service creativity to gain competitive advantage.

  4. OECD/NEA International Benchmark exercises: Validation of CFD codes applied nuclear industry; OECD/NEA internatiion Benchmark exercices: La validacion de los codigos CFD aplicados a la industria nuclear

    Energy Technology Data Exchange (ETDEWEB)

    Pena-Monferrer, C.; Miquel veyrat, A.; Munoz-Cobo, J. L.; Chiva Vicent, S.

    2016-08-01

    In recent years, due among other factors to the slowdown of the nuclear industry, investment in the development and validation of CFD codes applied specifically to the problems of the nuclear industry has been seriously hampered. Thus the International Benchmark Exercises (IBE) sponsored by the OECD/NEA have been fundamental to analyzing the use of CFD codes in the nuclear industry, because although these codes are mature in many fields, doubts still exist about them in critical aspects of thermohydraulic calculations, even in single-phase scenarios. The Polytechnic University of Valencia (UPV) and the Universitat Jaume I (UJI), sponsored by the Nuclear Safety Council (CSN), have actively participated in all the benchmarks proposed by the NEA, as well as in the expert meetings. In this paper, a summary of the participation in the various IBEs is presented, describing each benchmark, the CFD model created for it, and the main conclusions. (Author)

  5. Translation and validation of the parent-adolescent communication scale: technology for DST/HIV prevention.

    Science.gov (United States)

    Gubert, Fabiane do Amaral; Vieira, Neiva Francenely Cunha; Pinheiro, Patrícia Neyva da Costa; Oriá, Mônica Oliveira Batista; de Almeida, Paulo César; de Araújo, Thábyta Silva

    2013-01-01

    The objective was the transcultural adaptation of the Parent-Adolescent Communication Scale, which evaluates the frequency of communication between parents and children concerning subjects related to sex, condoms, DST, HIV and pregnancy. This was a methodological study with a quantitative approach, carried out with 313 female adolescent students aged 14 to 18 years in Fortaleza-CE. The content validity was carried out by means of the initial translation, back translation, pre-final version and final version, analyzed by a committee of specialists; the reliability was verified by Cronbach's Alpha and ascertained by hypothesis testing and test-retest within five weeks. The scale was applied via computer, in the online modality, in the period from November/2010 to January/2011. The version of the instrument in Portuguese presented an Alpha of 0.86; the construct validity was partially verified, since the hypothesis testing for the contrasted groups was not confirmed. The version of the instrument adapted to Portuguese is considered valid and reliable in the study sample.

  6. 78 FR 23472 - Amendments to Existing Validated End-User Authorizations: CSMC Technologies Corporation in the...

    Science.gov (United States)

    2013-04-19

    ... Corporation in the People's Republic of China (PRC) AGENCY: Bureau of Industry and Security, Commerce. ACTION... Technologies Corporation (CSMC) in the People's Republic of China (PRC). Specifically, BIS amends Supplement No... comment are not required under the APA or by any other law, the analytical requirements of the Regulatory...

  7. Validation of tissue microarray technology in squamous cell carcinoma of the esophagus

    NARCIS (Netherlands)

    Boone, Judith; van Hillegersberg, Richard; van Diest, Paul J.; Offerhaus, G. Johan A.; Borel Rinkes, Inne H. M.; ten Kate, Fiebo J. W.

    2008-01-01

    Tissue microarray (TMA) technology has been developed to facilitate high-throughput immunohistochemical and in situ hybridization analysis of tissues by inserting small tissue biopsy cores into a single paraffin block. Several studies have revealed novel prognostic biomarkers in esophageal squamous

  8. A new application and experimental validation of moulding technology for ferrite magnet assisted synchronous reluctance machine

    DEFF Research Database (Denmark)

    Wu, Qian; Lu, Kaiyuan; Rasmussen, Peter Omand

    2016-01-01

    This paper introduces a new application of moulding technology to the installation of ferrite magnet material into the rotor flux barriers of Ferrite Magnet Assisted Synchronous Reluctance Machine (FASynRM). The feasibility of this application with respect to manufacturing process and motor...

  9. Follow-On Cooperative Research and Development Agreement: MFIX to FLUENT Technology Transfer and Validation Studies Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Syamlal, Madhava [US Department of Energy, Washington, DC (United States); Guenther, Chris [US Department of Energy, Washington, DC (United States); O' Brien, Thomas J. [US Department of Energy, Washington, DC (United States); Benyahia, Sofiane [Fluent Inc., New York, NY (United States); Shi, Shaoping [Fluent Inc., New York, NY (United States)

    2005-03-01

    This report summarizes the effort by NETL and Fluent on the Cooperative Research and Development Agreement No. 00-F039 signed in May 2000. The objective of the CRADA was to transfer technology from NETL's MFIX code into the commercial software FLUENT so as to increase the computational speed, accuracy, and utility of FLUENT. During the period of this CRADA, MFIX was used to develop granular flow theories and to simulate gas-solids chemical reactors. The FLUENT and MFIX predictions were compared with each other and with experimental data generated at NETL. The granular kinetic theory in FLUENT was improved as a result of this work, and a gas-solids reaction (ozone decomposition) was used as a test case for the gas-solids chemical reaction capability in FLUENT. Also, under a separate project, work has begun to transfer the coal combustion and gasification model in MFIX to FLUENT.

  10. Using GTO-Velo to Facilitate Communication and Sharing of Simulation Results in Support of the Geothermal Technologies Office Code Comparison Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Signe K.; Purohit, Sumit; Boyd, Lauren W.

    2015-01-26

    The Geothermal Technologies Office Code Comparison Study (GTO-CCS) aims to support the DOE Geothermal Technologies Office in organizing and executing a model comparison activity. This project is directed at testing, diagnosing differences, and demonstrating modeling capabilities of a worldwide collection of numerical simulators for evaluating geothermal technologies. Teams of researchers are collaborating in this code comparison effort, and it is important to be able to share results in a forum where technical discussions can easily take place without requiring teams to travel to a common location. Pacific Northwest National Laboratory has developed an open-source, flexible framework called Velo that provides a knowledge management infrastructure and tools to support modeling and simulation for a variety of types of projects in a number of scientific domains. GTO-Velo is a customized version of the Velo Framework that is being used as the collaborative tool in support of the GTO-CCS project. Velo is designed around a novel integration of a collaborative Web-based environment and a scalable enterprise Content Management System (CMS). The underlying framework provides a flexible and unstructured data storage system that allows for easy upload of files that can be in any format. Data files are organized in hierarchical folders and each folder and each file has a corresponding wiki page for metadata. The user interacts with Velo through a web browser based wiki technology, providing the benefit of familiarity and ease of use. High-level folders have been defined in GTO-Velo for the benchmark problem descriptions, descriptions of simulator/code capabilities, a project notebook, and folders for participating teams. Each team has a subfolder with write access limited only to the team members, where they can upload their simulation results. The GTO-CCS participants are charged with defining the benchmark problems for the study, and as each GTO-CCS Benchmark problem is

  11. Validation of smart sensor technologies for instrument calibration reduction in nuclear power plants

    International Nuclear Information System (INIS)

    Hashemian, H.M.; Mitchell, D.W.; Petersen, K.M.; Shell, C.S.

    1993-01-01

    This report presents the preliminary results of a research and development project on the validation of new techniques for on-line testing of calibration drift of process instrumentation channels in nuclear power plants. These techniques generally involve a computer-based data acquisition and data analysis system to trend the output of a large number of instrument channels and identify the channels that have drifted out of tolerance. This helps limit the calibration effort to those channels which need the calibration, as opposed to the current nuclear industry practice of calibrating essentially all the safety-related instrument channels at every refueling outage
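    As an illustration of the kind of trending described above, the following is a minimal sketch, assuming the process estimate is simply the average of redundant channels and that a fixed tolerance band is applied; the report's actual algorithm and parameters are not given in this record, so the function name and thresholds are hypothetical.

      # Minimal sketch of on-line drift screening across redundant channels.
      # Assumption: the process estimate is the average of the redundant channels;
      # the report's actual data analysis method is not given in this record.
      import numpy as np

      def flag_drifted_channels(readings, tolerance):
          """readings: 2-D array (samples x channels), tolerance in the same units.
          Returns indices of channels whose mean deviation from the channel-average
          process estimate exceeds the tolerance."""
          readings = np.asarray(readings, dtype=float)
          estimate = readings.mean(axis=1, keepdims=True)        # process parameter estimate
          deviation = np.abs(readings - estimate).mean(axis=0)   # average deviation per channel
          return np.where(deviation > tolerance)[0]

      # Example: four redundant channels, one drifting high by about 2 units
      samples = np.random.normal(100.0, 0.2, size=(500, 4))
      samples[:, 2] += 2.0
      print(flag_drifted_channels(samples, tolerance=1.0))       # expected: [2]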

  12. Validation of smart sensor technologies for instrument calibration reduction in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Hashemian, H M; Mitchell, D W; Petersen, K M; Shell, C S [Analysis and Measurement Services Corp., Knoxville, TN (United States)

    1993-01-01

    This report presents the preliminary results of a research and development project on the validation of new techniques for on-line testing of calibration drift of process instrumentation channels in nuclear power plants. These techniques generally involve a computer-based data acquisition and data analysis system to trend the output of a large number of instrument channels and identify the channels that have drifted out of tolerance. This helps limit the calibration effort to those channels which need the calibration, as opposed to the current nuclear industry practice of calibrating essentially all the safety-related instrument channels at every refueling outage.

  13. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with the regulatory auditing code have been accomplished for the establishment of a self-reliable, technology-based regulatory auditing system. The unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As a part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through an exercise of plant application. An education and training seminar and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications

  14. CFD Modelling and Validation of Mixing in a Model Single-Use-Technology Bioreactor

    OpenAIRE

    Maltby, Richard; Chew, Yong-Min

    2016-01-01

    Single-use-technologies (SUT) are a category of disposable bioprocessing components which have increased in popularity in the biopharmaceutical industry in recent years [1]. Stirred single use bioreactors use a polymeric bag supported by a rigid metal frame. The bag is disposed of and replaced after use, removing the need for energy-intensive and time consuming cleaning and sterilisation in place, as well as improving the flexibility of the production facility [2]. They are currently applied ...

  15. The Validity of an Extended Technology Acceptance Model (TAM) for Assessing the Acceptability of Autonomous Ships

    OpenAIRE

    Roestad, Viktor Olai Stokvik

    2016-01-01

    The study explored an extended Technology Acceptance Model (TAM) for the purpose of developing a reliable tool for measuring potential users' acceptance of autonomous ships. Correlation analysis was conducted to see if the 8 variables of the extended TAM model covary, and regression analysis to further explain the nature of the relationships. The study reinforced the notion of strong relationships between the original constructs in TAM. Results also showed that trus...

  16. Development and validation of a septoplasty training model using 3-dimensional printing technology.

    Science.gov (United States)

    AlReefi, Mahmoud A; Nguyen, Lily H P; Mongeau, Luc G; Haq, Bassam Ul; Boyanapalli, Siddharth; Hafeez, Nauman; Cegarra-Escolano, Francois; Tewfik, Marc A

    2017-04-01

    Providing alternative training modalities may improve trainees' ability to perform septoplasty. Three-dimensional printing has been shown to be a powerful tool in surgical training. The objectives of this study were to explain the development of our 3-dimensional (3D) printed septoplasty training model, to assess its face and content validity, and to present evidence supporting its ability to distinguish between levels of surgical proficiency. Imaging data of a patient with a nasal septal deviation was selected for printing. Printing materials reproducing the mechanical properties of human tissues were selected based on literature review and prototype testing. Eight expert rhinologists, 6 senior residents, and 6 junior residents performed endoscopic septoplasties on the model and completed a postsimulation survey. Performance metrics in quality (final product analysis), efficiency (time), and safety (eg, perforation length, nares damage) were recorded and analyzed in a study-blind manner. The model was judged to be anatomically correct and the steps performed realistic, with scores of 4.05 ± 0.82 and 4.2 ± 1, respectively, on a 5-point Likert scale. Ninety-two percent of residents desired the simulator to be integrated into their teaching curriculum. There was a significant difference (p materials mixed into the 3 relevant consistencies necessary to simulate septoplasty. Our findings provide evidence supporting the validity of the model. © 2016 ARS-AAOA, LLC.

  17. Remotely Accessible Instrumented Monitoring of Global Development Programs: Technology Development and Validation

    Directory of Open Access Journals (Sweden)

    Michael Fleming

    2013-08-01

    Many global development agencies self-report their project outcomes, often relying on subjective data that is collected sporadically and communicated months later. These reports often highlight successes and downplay challenges. Instrumented monitoring via distributed data collection platforms may provide crucial evidence to help inform the sector and public on the effectiveness of aid, and the on-going challenges. This paper presents the process of designing and validating an integrated sensor platform with cellular-to-internet reporting purposely targeted at global development programs. The integrated hardware platform has been applied to water, sanitation, energy and infrastructure interventions and validated through laboratory calibration and field observations. Presented here are two examples: a water pump and a household water filter, wherein field observations agreed with the data algorithm with a linear fit slope of between 0.91 and 1, and an r-squared of between 0.36 and 0.39, indicating a wide confidence interval but with low overall error (i.e., less than 0.5% in the case of structured field observations of water volume added to a household water filter) and few false negatives or false positives.
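    The agreement statistics quoted above (linear-fit slope and r-squared between field observations and the sensor data algorithm) amount to a short calculation; the sketch below uses illustrative numbers, not the study's data, and the variable names are hypothetical.

      # Hedged sketch: agreement between field-observed water volumes and
      # sensor-algorithm estimates, summarised as linear-fit slope and r-squared.
      # The values below are illustrative only.
      import numpy as np

      observed = np.array([2.0, 4.5, 6.0, 8.5, 10.0])    # volumes recorded by field observers
      estimated = np.array([2.2, 4.1, 6.3, 8.0, 9.6])    # volumes inferred by the sensor algorithm

      slope, intercept = np.polyfit(observed, estimated, 1)   # least-squares linear fit
      r = np.corrcoef(observed, estimated)[0, 1]
      print(f"slope={slope:.2f}, intercept={intercept:.2f}, r-squared={r**2:.2f}")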

  18. What Forty Years of Research Says about the Impact of Technology on Learning: A Second-Order Meta-Analysis and Validation Study

    Science.gov (United States)

    Tamim, Rana M.; Bernard, Robert M.; Borokhovski, Eugene; Abrami, Philip C.; Schmid, Richard F.

    2011-01-01

    This research study employs a second-order meta-analysis procedure to summarize 40 years of research activity addressing the question, does computer technology use affect student achievement in formal face-to-face classrooms as compared to classrooms that do not use technology? A study-level meta-analytic validation was also conducted for purposes…

  19. Sandia National Laboratories analysis code data base

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, C.W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code "ownership" and release status, and references describing the physical models and numerical implementation.

  20. A validation of technologies monitoring dairy cow feeding, ruminating, and lying behaviors.

    Science.gov (United States)

    Borchers, M R; Chang, Y M; Tsai, I C; Wadsworth, B A; Bewley, J M

    2016-09-01

    The objective of this study was to evaluate commercially available precision dairy technologies against direct visual observations of feeding, rumination, and lying behaviors. Primiparous (n=24) and multiparous (n=24) lactating Holstein dairy cattle (mean ± standard deviation; 223.4±117.8 d in milk, producing 29.2±8.2kg of milk/d) were fitted with 6 different triaxial accelerometer technologies evaluating cow behaviors at or before freshening. The AfiAct Pedometer Plus (Afimilk, Kibbutz Afikim, Israel) was used to monitor lying time. The CowManager SensOor (Agis, Harmelen, Netherlands) monitored rumination and feeding time. The HOBO Data Logger (HOBO Pendant G Acceleration Data Logger, Onset Computer Corp., Pocasset, MA) monitored lying time. The CowAlert IceQube (IceRobotics Ltd., Edinburgh, Scotland) monitored lying time. The Smartbow (Smartbow GmbH, Jutogasse, Austria) monitored rumination time. The Track A Cow (ENGS, Rosh Pina, Israel) monitored lying time and time spent around feeding areas for the calculation of feeding time. Over 8 d, 6 cows per day were visually observed for feeding, rumination, and lying behaviors for 2 h after morning and evening milking. The time of day was recorded when each behavior began and ended. These times were used to generate the length of time behaviors were visually observed. Pearson correlations (r; calculated using the CORR procedure of SAS Version 9.3, SAS Institute Inc., Cary, NC), and concordance correlations (CCC; calculated using the epiR package of R version 3.1.0, R Foundation for Statistical Computing, Vienna, Austria) evaluated association between visual observations and technology-recorded behaviors. Visually recorded feeding behaviors were moderately correlated with the CowManager SensOor (r=0.88, CCC=0.82) and Track A Cow (r=0.93, CCC=0.79) monitors. Visually recorded rumination behaviors were strongly correlated with the Smartbow (r=0.97, CCC=0.96), and weakly correlated with the CowManager SensOor (r=0
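    The two agreement statistics named in the record, Pearson's r and the concordance correlation coefficient, can be illustrated with a short sketch; the study used SAS PROC CORR and the R epiR package, so the plain-Python version below (with Lin's 1989 CCC formula and made-up numbers) is only an illustration of the statistics, not the authors' code.

      # Hedged sketch of Pearson's r and Lin's concordance correlation coefficient (CCC).
      import numpy as np

      def pearson_and_ccc(x, y):
          x, y = np.asarray(x, float), np.asarray(y, float)
          r = np.corrcoef(x, y)[0, 1]
          # Lin (1989): CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
          ccc = 2 * np.cov(x, y, bias=True)[0, 1] / (
              x.var() + y.var() + (x.mean() - y.mean()) ** 2)
          return r, ccc

      # Illustrative minutes of feeding per observation block (not study data)
      visual = [35, 42, 50, 28, 61, 45]
      sensor = [33, 40, 55, 30, 58, 47]
      r, ccc = pearson_and_ccc(visual, sensor)
      print(f"r={r:.2f}, CCC={ccc:.2f}")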

  1. Development of the ANL plant dynamics code and control strategies for the supercritical carbon dioxide Brayton cycle and code validation with data from the Sandia small-scale supercritical carbon dioxide Brayton cycle test loop.

    Energy Technology Data Exchange (ETDEWEB)

    Moisseytsev, A.; Sienicki, J. J. (Nuclear Engineering Division)

    2011-11-07

    Significant progress has been made in the ongoing development of the Argonne National Laboratory (ANL) Plant Dynamics Code (PDC), the ongoing investigation and development of control strategies, and the analysis of system transient behavior for supercritical carbon dioxide (S-CO2) Brayton cycles. Several code modifications have been introduced during FY2011 to extend the range of applicability of the PDC and to improve its calculational stability and speed. A new and innovative approach was developed to couple the Plant Dynamics Code for S-CO2 cycle calculations with SAS4A/SASSYS-1 Liquid Metal Reactor Code System calculations for the transient system level behavior on the reactor side of a Sodium-Cooled Fast Reactor (SFR) or Lead-Cooled Fast Reactor (LFR). The new code system allows use of the full capabilities of both codes such that whole-plant transients can now be simulated without additional user interaction. Several other code modifications, including the introduction of compressor surge control, a new approach for determining the solution time step for efficient computational speed, an updated treatment of S-CO2 cycle flow mergers and splits, a modified enthalpy equation to improve the treatment of negative flow, and a revised solution of the reactor heat exchanger (RHX) equations coupling the S-CO2 cycle to the reactor, were introduced to the PDC in FY2011. All of these modifications have improved the code computational stability and computational speed, while not significantly affecting the results of transient calculations. The improved PDC was used to continue the investigation of S-CO2 cycle control and transient behavior. The coupled PDC-SAS4A/SASSYS-1 code capability was used to study the dynamic characteristics of a S-CO2 cycle coupled to a SFR plant. Cycle control was investigated in terms of the ability of the cycle to respond to a linear reduction in the electrical grid demand from 100% to 0% at a rate of 5

  2. Validation of next generation sequencing technologies in comparison to current diagnostic gold standards for BRAF, EGFR and KRAS mutational analysis.

    Science.gov (United States)

    McCourt, Clare M; McArt, Darragh G; Mills, Ken; Catherwood, Mark A; Maxwell, Perry; Waugh, David J; Hamilton, Peter; O'Sullivan, Joe M; Salto-Tellez, Manuel

    2013-01-01

    Next Generation Sequencing (NGS) has the potential to become an important tool in clinical diagnosis and therapeutic decision-making in oncology owing to its enhanced sensitivity in DNA mutation detection, fast turnaround of samples in comparison to current gold standard methods and the potential to sequence a large number of cancer-driving genes at the same time. We aim to test the diagnostic accuracy of current NGS technology in the analysis of mutations that represent current standard-of-care, and its reliability in generating concomitant information on other key genes in human oncogenesis. Thirteen clinical samples (8 lung adenocarcinomas, 3 colon carcinomas and 2 malignant melanomas) already genotyped for EGFR, KRAS and BRAF mutations by current standard-of-care methods (Sanger Sequencing and q-PCR), were analysed for detection of mutations in the same three genes using two NGS platforms and an additional 43 genes with one of these platforms. The results were analysed using closed platform-specific proprietary bioinformatics software as well as open third party applications. Our results indicate that the existing format of the NGS technology performed well in detecting the clinically relevant mutations stated above but may not be reliable for a broader unsupervised analysis of the wider genome in its current design. Our study represents a diagnostically led validation of the major strengths and weaknesses of this technology before consideration for diagnostic use.

  3. Validation of a new library of nuclear constants of the WIMS code; Validacion de una nueva biblioteca de constantes nucleares del Codigo WIMS

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar H, F. [Departamento de Experimentacion, Gerencia del Reactor, ININ, 52045 Ocoyoacac, Estado de Mexico (Mexico)

    1991-10-15

    The objective of the present work is to reproduce with the WIMS code the experimental results of the thermal reference problems (benchmarks) TRX-1, TRX-2 and BAPL-1 to BAPL-3. The work proceeded in two stages: the first consisted of using the original library of the code, while in the second a library was generated that contains only the elements present in the benchmarks: H-1, O-16, Al-27, U-235 and U-238. To generate the nuclear data present in the WIMS library, the ENDF/B-IV database and the nuclear data processing system NJOY were used; the library was generated using the FIXER code. (Author)

  4. Atom for peace, code for war. The technology policy of the atomic power solution in Finland between 1955-1970

    International Nuclear Information System (INIS)

    Sarkikoski, T.

    2011-01-01

    This dissertation investigates the atomic power solution in Finland between 1955 and 1970. During these years a national arrangement for atomic energy technology evolved. The foundations of the Finnish atomic energy policy, the creation of basic legislation and the first governmental bodies, were laid between 1955 and 1965. In the late 1960s, the necessary technological and political decisions were made in order to purchase the first commercial nuclear reactor. A historical narration of this process is seen in the international context of 'atoms for peace' policies and Cold War history in general. The geopolitical position of Finland made it necessary to become involved in the balanced participation in international scientific-technical exchange and assistive nuclear programs. The Paris Peace Treaty of 1947 categorically denied Finland acquisition of nuclear weapons. Accordingly, from the 'Geneva year' of 1955, the emphasis was placed on peaceful purposes for atomic energy as well as on the education of national professionals in Finland. An initiative for the governmental atomic energy commission came from academia, but the ultimate motive behind it was an anticipated structural change in the supply of national energy. Economically exploitable hydro power resources were expected to be fully built within ten years, and atomic power was seen as a promising and complementary new energy technology. While importing fuels like coal was out of the question because of scarce foreign currency, domestic uranium mineral deposits were considered as a potential source of nuclear fuel. Nevertheless, even then nuclear energy was regarded as just one of the possible future energy options. In the mid-1960s a bandwagon effect of light water reactor orders was witnessed in the United States and soon elsewhere in the world. In Finland, two separate invitations for bids for nuclear reactors were initiated. This study explores at length both their preceding grounds and later phases. An

  5. Validation of finite element code DELFIN by means of the zero power experiments at the nuclear power plant of Atucha I

    International Nuclear Information System (INIS)

    Grant, C.R.

    1996-01-01

    Code DELFIN, developed in CNEA, treats the spatial discretization using heterogeneous finite elements, allowing a correct treatment of the continuity of fluxes and currents among elements and a more realistic representation of the hexagonal lattice of the reactor. It can be used for fuel management calculations, Xenon oscillations and spatial kinetics. Using the HUEMUL code for cell calculation (which uses a generalized two-dimensional collision probability theory and has the WIMS library incorporated in a data base), the zero-power experiments performed in 1974 were calculated. (author). 8 refs., 9 figs., 3 tabs

  6. Sprint mechanics evaluation using inertial sensor-based technology: A laboratory validation study.

    Science.gov (United States)

    Setuain, I; Lecumberri, P; Ahtiainen, J P; Mero, A A; Häkkinen, K; Izquierdo, M

    2018-02-01

    Advances in micro-electromechanical systems have turned magnetic inertial measurement units (MIMUs) into a suitable tool for vertical jumping biomechanical evaluation. Thus, this study aimed to determine whether appropriate reliability and agreement reports could also be obtained when analyzing 20-m sprint mechanics. Four bouts of 20-m sprints were evaluated to determine whether the data provided by a MIMU placed at the lumbar spine could reliably assess sprint mechanics and to examine the validity of the MIMU sensor compared to force plate recordings. Maximal power (P0), force (F0), and velocity (V0), as well as other mechanical determinants of sprint performance associated with the force-velocity, power-velocity, and ratio of forces-velocity relationships, such as applied horizontal force loss (Sfv) and decrease in ratio of forces (Drf), were calculated and compared between instrumentations. Extremely large to very large correlation levels between MIMU sensor-based sprint mechanics variables and force plate recordings were obtained (mean±SD, force plate vs MIMU: V0, 8.61±0.85 vs 8.42±0.69; F0, 383±110 vs 391±103; P0, 873±246 vs 799±241; Sfv, -44.6±12.7 vs -46.2±10.7), ranging from 0.88 to 0.94, except for Drf, which showed a weak-to-moderate correlation level (r=.45; -6.32±1.08 vs -5.76±0.68). Step-averaged force values measured with both systems were highly correlated (r=.88), with a regression slope close to the identity (1.01). Bland and Altman graphical representation showed a non-random distribution of measured force values. Finally, very large to extremely large retest correlation coefficients were found for the intertrial reliability of MIMU measurements of sprint performance variables (r values ranging from .72 to .96). Therefore, MIMUs showed appropriate validity and reliability values for 20-m sprint performance variables. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
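    The mechanical variables above (F0, V0, P0) come from force-velocity profiling of the sprint; under the commonly used assumption of an approximately linear horizontal force-velocity relationship, F0 and V0 are the axis intercepts of that line and maximal power is P0 = F0*V0/4. The sketch below illustrates this general approach with made-up step-averaged data; it is not the authors' exact processing pipeline.

      # Hedged sketch of sprint force-velocity profiling, assuming a linear
      # horizontal force-velocity relationship (not the study's exact pipeline).
      import numpy as np

      def fv_profile(velocity, force):
          """velocity in m/s, force in N (step-averaged horizontal values).
          Returns F0 (N), V0 (m/s) and Pmax (W) from the linear F-v fit."""
          slope, intercept = np.polyfit(velocity, force, 1)   # F = slope*v + intercept
          f0 = intercept                # force extrapolated to zero velocity
          v0 = -intercept / slope       # velocity extrapolated to zero force
          pmax = f0 * v0 / 4.0          # apex of the parabolic power-velocity curve
          return f0, v0, pmax

      # Illustrative step-averaged data (not the paper's numbers)
      v = np.array([1.5, 3.0, 4.5, 6.0, 7.5])
      f = np.array([330.0, 270.0, 205.0, 140.0, 75.0])
      print(fv_profile(v, f))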

  7. Point-of-care solution for osteoporosis management design, fabrication, and validation of new technology

    CERN Document Server

    Khashayar, Patricia

    2017-01-01

    This book addresses the important clinical problem of accurately diagnosing osteoporosis, and analyzes how Bone Turnover Markers (BTMs) can improve osteoporosis detection. In her research, the author integrated microfluidic technology with electrochemical sensing to embody a reaction/detection chamber to measure serum levels of different biomarkers, creating a microfluidic proteomic platform that can easily be translated into a biomarker diagnostic. The Osteokit System, a result of the integration of electrochemical system and microfluidic chips, is a unique design that offers the potential for greater sensitivity. The implementation, feasibility, and specificity of the Osteokit platform is demonstrated in this book, which is appropriate for researchers working on bone biology and mechanics, as well as clinicians.

  8. Validation of computer codes and modelling methods for giving proof of nuclear safety of transport and storage of spent VVER-type nuclear fuels. Part 1. Purposes and goals of the project. Final report

    International Nuclear Information System (INIS)

    Buechse, H.; Langowski, A.; Lein, M.; Nagel, R.; Schmidt, H.; Stammel, M.

    1995-01-01

    The report gives the results of investigations on the validation of computer codes used to prove nuclear safety during transport and storage of spent VVER fuel of the NPPs Greifswald and Rheinsberg. Characteristics of typical spent fuel (nuclide concentrations, neutron source strength, gamma spectrum, decay heat) - calculated with several codes - and dose rates (e.g. in the surroundings of a loaded spent fuel cask) - based on the different source terms - are presented. Differences and their possible reasons are discussed. The results show that despite the differences in the source terms, all relevant health physics requirements are met for all source term cases. The validation of the criticality code OMEGA was established by calculating approximately 200 critical experiments with LWR fuel, including VVER fuel rod arrangements. The mean error of the effective multiplication factor k_eff is -0.01 compared to experiment for this area of applicability. Thus, the OMEGA error of 2% assumed in earlier works has turned out to be sufficiently conservative. (orig.) [de]

  9. Validity of VR Technology on the Smartphone for the Study of Wind Park Soundscapes

    Directory of Open Access Journals (Sweden)

    Tianhong YU

    2018-04-01

    The virtual reality of the landscape environment supplies a high level of realism of the real environment, and may improve the public awareness and acceptance of wind park projects. The soundscape around wind parks could have a strong influence on the acceptance of, and annoyance with, wind parks. To explore the realism of this VR technology and the subjective responses toward different soundscapes of ambient wind parks, three different types of smartphone-based virtual reality tests were performed: aural only, visual only, and aural–visual combined. In total, 21 aural and visual combinations were presented to 40 participants. The aural and visual information used were of near wind park settings and rural spaces. Perceived annoyance levels and realism of the wind park environment were measured. Results indicated that most simulations were rated with relatively strong realism. Perceived realism was strongly correlated with light, color, and vegetation of the simulation. Most wind park landscapes were enthusiastically accepted by the participants. The addition of aural information was found to have a strong impact on whether the participant was annoyed. Furthermore, evaluation of the soundscape on a multidimensional scale revealed that the key components influencing the individual’s annoyance by wind parks were the factors of “calmness/relaxation” and “naturality/pleasantness”. “Diversity” of the soundscape might correlate with perceived realism. Finally, the dynamic aural–visual stimuli using virtual reality technology could improve the environmental assessment of wind park landscapes, and thus provide a more comprehensible scientific basis for decisions than conventional tools. In addition, this study could improve the participatory planning process for more acceptable wind park landscapes.

  10. Child protection and new technologies of communication: the code of regulatory PEGI videogames and games on-line

    Directory of Open Access Journals (Sweden)

    Petra Mª PÉREZ ALONSO-GETA

    2017-07-01

    It’s been said that the future of a village resides in its children, and not only because they’re the future, but because it’s better if we protect and teach our children well. Thus, in 1959 the United Nations published ten principles in the Declaration of the Rights of the Child. The second principle of the Declaration states that “the child shall enjoy special protection…to enable him to develop physically, mentally, morally, spiritually and socially…” Today, in a very different social context from that of 1959, the right to protection should also be settled within the diverse areas that define new communication technologies. However, current procedures established by the PEGI, while still necessary, don’t guarantee this basic childhood right.

  11. Asymmetrical current injection addressed by ENTSO-E draft network code. Requirements and development of technology for WEC

    Energy Technology Data Exchange (ETDEWEB)

    Diedrichs, Volker; Lorenzen, Helge [University of Applied Sciences Wilhelmshaven (Germany); Mackensen, Ingo; Gertjegerdes, Stefan [ENERCON GmbH, Aurich (Germany)

    2012-07-01

    Substitution of conventional power plants equipped with synchronous generators by inverter-based technologies mainly due to large scale integration of wind power is also associated with increasing concerns regarding negative sequence phenomena in power systems (over-voltages in the healthy phases and excitation of protection relays during asymmetrical faults, grid-wide increasing 'unbalancing' level of voltages due to unbalanced load or transmission equipment during normal operation). The paper presents concepts for asymmetrical current injection with inverters used in type IV wind turbines, results from model-based analyses and measurements from a laboratory test power system for both areas of concern. Feasibility and potential efficiency of asymmetrical current injection is the centre of interest. (orig.)

  12. Administration of neuropsychological tests using interactive voice response technology in the elderly: validation and limitations

    Directory of Open Access Journals (Sweden)

    Delyana Ivanova Miller

    2013-08-01

    Interactive voice response systems (IVR) are computer programs, which interact with people to provide a number of services from business to health care. We examined the ability of an IVR system to administer and score a verbal fluency task (fruits) and the digit span forward and backward in 158 community dwelling people aged between 65 and 92 years (full scale IQ of 68 to 134). Only 6 participants could not complete all tasks mostly due to early technical problems in the study. Participants were also administered the WAIS-IV and WMS-IV sub-tests. The IVR system correctly recognized 90% of the fruits in the verbal fluency task and 93-95% of the number sequences in the digit span. The IVR system typically underestimated the performance of participants because of voice recognition errors. In the digit span, these errors led to the erroneous discontinuation of the test; however, the correlation between IVR scoring and clinical scoring was still high (93-95%). The correlation between the IVR verbal fluency and the WAIS-IV Similarities sub-test was 0.31. The correlation between the IVR digit span forward and backward and the in-person administration was 0.46. We discuss how valid and useful IVR systems are for neuropsychological testing in the elderly.

  13. Administration of neuropsychological tests using interactive voice response technology in the elderly: validation and limitations.

    Science.gov (United States)

    Miller, Delyana Ivanova; Talbot, Vincent; Gagnon, Michèle; Messier, Claude

    2013-01-01

    Interactive voice response (IVR) systems are computer programs, which interact with people to provide a number of services from business to health care. We examined the ability of an IVR system to administer and score a verbal fluency task (fruits) and the digit span forward and backward in 158 community dwelling people aged between 65 and 92 years of age (full scale IQ of 68-134). Only six participants could not complete all tasks mostly due to early technical problems in the study. Participants were also administered the Wechsler Intelligence Scale fourth edition (WAIS-IV) and Wechsler Memory Scale fourth edition subtests. The IVR system correctly recognized 90% of the fruits in the verbal fluency task and 93-95% of the number sequences in the digit span. The IVR system typically underestimated the performance of participants because of voice recognition errors. In the digit span, these errors led to the erroneous discontinuation of the test: however the correlation between IVR scoring and clinical scoring was still high (93-95%). The correlation between the IVR verbal fluency and the WAIS-IV Similarities subtest was 0.31. The correlation between the IVR digit span forward and backward and the in-person administration was 0.46. We discuss how valid and useful IVR systems are for neuropsychological testing in the elderly.

  14. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover the "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover we conjecture that the canonical partition satisfies such a hypothesis. Finally we consider also some relationships between coding partitions and varieties of codes.
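    As background for the unique decipherability (UD) notion that the paper builds on, the classical Sardinas-Patterson test decides whether a finite code is UD; the sketch below implements that standard textbook test, not the paper's canonical-partition algorithm.

      # Sardinas-Patterson test for unique decipherability of a finite code
      # (standard textbook algorithm; not the canonical-partition algorithm of the paper).
      def is_uniquely_decipherable(code):
          code = set(code)

          def residuals(a, b):
              # words w (possibly empty) such that u + w == v for some u in a, v in b
              return {v[len(u):] for u in a for v in b if v.startswith(u)}

          current = residuals(code, code) - {""}   # S1: dangling suffixes
          seen = set()
          while current:
              if "" in current or current & code:  # ambiguity detected
                  return False
              frozen = frozenset(current)
              if frozen in seen:                   # no new suffix sets appear: code is UD
                  return True
              seen.add(frozen)
              current = residuals(code, current) | residuals(current, code)
          return True

      print(is_uniquely_decipherable({"0", "01"}))       # True: every message factors uniquely
      print(is_uniquely_decipherable({"a", "ab", "b"}))  # False: "ab" = "ab" or "a" + "b"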

  15. Development and Validation of a Three-Dimensional Diffusion Code Based on a High Order Nodal Expansion Method for Hexagonal-z Geometry

    Directory of Open Access Journals (Sweden)

    Daogang Lu

    2016-01-01

    A three-dimensional, multigroup diffusion code based on a high order nodal expansion method for hexagonal-z geometry (HNHEX) was developed to perform the neutronic analysis of hexagonal-z geometry. In this method, the one-dimensional radial and axial spatial flux of each node and energy group are defined as a quadratic polynomial expansion and a fourth-order polynomial expansion, respectively. The approximations for the one-dimensional radial and axial spatial flux both have second-order accuracy. Moment weighting is used to obtain the high order expansion coefficients of the polynomials of the one-dimensional radial and axial spatial flux. The partially integrated radial and axial leakages are both approximated by a quadratic polynomial. The coarse-mesh rebalance method with asymptotic source extrapolation is applied to accelerate the calculation. This code is used for calculation of the effective multiplication factor, neutron flux distribution, and power distribution. The numerical calculations in this paper for the three-dimensional SNR and VVER 440 benchmark problems demonstrate the accuracy of the code. In addition, the results show that the accuracy of the code is improved by applying a quadratic approximation for the partially integrated axial leakage and a fourth-order approximation for the one-dimensional axial spatial flux, in comparison to a flat approximation for the partially integrated axial leakage and a quadratic approximation for the one-dimensional axial spatial flux.
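    The "moment weighting" step can be pictured as projecting the 1-D nodal flux shape onto orthogonal polynomials over the node; the sketch below does this with Legendre polynomials on [-1, 1] as a generic illustration. The basis choice and normalisation are assumptions for the example, not the HNHEX implementation.

      # Generic illustration of moment weighting: project a 1-D flux shape onto
      # Legendre polynomials over the node [-1, 1] to obtain expansion coefficients.
      # (Basis and normalisation are assumptions; this is not the HNHEX code.)
      import numpy as np
      from numpy.polynomial import legendre as L

      def legendre_moments(flux, order):
          """Return a_0..a_order with flux(x) ~ sum_i a_i * P_i(x) on [-1, 1]."""
          x, w = L.leggauss(16)                    # Gauss-Legendre quadrature nodes/weights
          coeffs = []
          for i in range(order + 1):
              p_i = L.Legendre.basis(i)(x)
              # moment weighting: a_i = (2i+1)/2 * integral of flux * P_i over the node
              coeffs.append((2 * i + 1) / 2.0 * np.sum(w * flux(x) * p_i))
          return np.array(coeffs)

      # Example: a gently curved flux shape; the quadratic expansion recovers it exactly
      phi = lambda x: 1.0 - 0.3 * x - 0.2 * x**2
      print(legendre_moments(phi, order=2))        # approx. [0.933, -0.3, -0.133]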

  16. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    Summary of the most important points from the main report: Documentation and evaluation of Coding Class.

  17. Creating a Test Validated Structural Dynamic Finite Element Model of the Multi-Utility Technology Test Bed Aircraft

    Science.gov (United States)

    Pak, Chan-Gi; Truong, Samson S.

    2014-01-01

    Small modeling errors in the finite element model will eventually induce errors in the structural flexibility and mass, thus propagating into unpredictable errors in the unsteady aerodynamics and the control law design. One of the primary objectives of the Multi-Utility Technology Test Bed, X-56A, aircraft is the flight demonstration of active flutter suppression; therefore, this study identifies the primary and secondary modes for structural model tuning based on the flutter analysis of the X-56A. A ground vibration test validated structural dynamic finite element model of the X-56A is created in this study. The structural dynamic finite element model of the X-56A is improved using a model tuning tool. In this study, two different weight configurations of the X-56A have been improved in a single optimization run.
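    Model tuning of this kind is often posed as a least-squares update of a few stiffness or mass parameters so that the model's natural frequencies match the ground vibration test; the toy 2-degree-of-freedom sketch below shows that general idea and is only a generic illustration, not the model tuning tool used in the study.

      # Toy illustration of structural model tuning: adjust two stiffness parameters
      # of a 2-DOF spring-mass model so its natural frequencies match measured values.
      # (Generic least-squares update; not the model tuning tool used in the study.)
      import numpy as np
      from scipy.optimize import least_squares

      f_test = np.array([5.2, 13.8])                 # measured frequencies in Hz (illustrative)

      def model_frequencies(p):
          k1, k2 = p                                  # stiffness parameters being tuned
          k = np.array([[k1 + k2, -k2], [-k2, k2]])   # stiffness matrix, unit masses
          eigvals = np.linalg.eigvalsh(k)             # eigenvalues of M^-1 K with M = I
          return np.sqrt(np.abs(eigvals)) / (2 * np.pi)

      result = least_squares(lambda p: model_frequencies(p) - f_test, x0=[1000.0, 500.0])
      print("tuned stiffness parameters:", result.x)
      print("tuned frequencies (Hz):", model_frequencies(result.x))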

  18. Validation of Neutron Calculation Codes and Models by means of benchmark cases in the frame of the Binational Commission of Nuclear Energy. Kinetic Parameters, Temperature Coefficients and Power Distribution

    International Nuclear Information System (INIS)

    Dos Santos, Adimir; Siqueira, Paulo de Tarso D.; Andrade e Silva, Graciete Simões; Grant, Carlos; Tarazaga, Ariel E.; Barberis, Claudia

    2013-01-01

    In 2008 the Atomic Energy National Commission (CNEA) of Argentina and the Brazilian Institute of Energetic and Nuclear Research (IPEN), under the frame of the Nuclear Energy Argentine Brazilian Agreement (COBEN), included, among many others, the project “Validation and Verification of Calculation Methods used for Research and Experimental Reactors”. At this time, it was established that the validation was to be performed with models implemented in the deterministic codes HUEMUL and PUMA (cell and reactor codes) developed by CNEA and those implemented in MCNP by CNEA and IPEN. The necessary data for these validations would correspond to theoretical-experimental reference cases in the research reactor IPEN/MB-01 located in São Paulo, Brazil. The staff of the group Reactor and Nuclear Power Studies (SERC) of CNEA, from the Argentine side, performed calculations with deterministic models (HUEMUL-PUMA) and probabilistic methods (MCNP), modeling a great number of physical situations of the reactor, which previously had been studied and modeled by members of the Center of Nuclear Engineering of IPEN, whose results were extensively provided to CNEA. In this paper, results of the comparison of calculated and experimental results for temperature coefficients, kinetic parameters and fission rate spatial distributions are shown. (author)

  19. Validation of Lithium-ion cell technology for JPL's 2003 Mars Exploration Rover Mission

    Science.gov (United States)

    Smart, Marshall C.; Ratnakumar, Bugga V.; Ewell, R. C.; Whitcanack, L. D.; Chin, K. B.; Surampudi, S.

    2004-01-01

    In early 2004 JPL successfully landed two Rovers, named Spirit and Opportunity, on the surface of Mars after traveling >300 million miles over a 6-7 month period. In order to operate for extended duration on the surface of Mars, both Rovers are equipped with rechargeable Lithium-ion batteries, which were designed to aid in the launch, correct anomalies during cruise, and support surface operations in conjunction with triple-junction deployable solar arrays. The requirements of the Lithium-ion battery include the ability to provide power for at least 90 sols on the surface of Mars, operate over a wide temperature range (-20 C to +40 C), withstand long storage periods (e.g., cruise period), operate in an inverted position, and support high currents (e.g., firing pyro events). In order to determine the viability of Lithium-ion technology to meet these stringent requirements, a comprehensive test program was implemented aimed at demonstrating the performance capability of prototype cells fabricated by Lithion, Inc. (Yardney Technical Products, Inc.). The testing performed includes determining the (a) room temperature cycle life, (b) pulse capability as a function of temperature, (e) self-discharge and storage characteristics, mission profile capability, (f) cycle life under mission simulation conditions, (g) impedance characteristics, (h) impact of cell orientation, and (i) performance in 8-cell engineering batteries. As will be discussed, the Lithium-ion prototype cells and batteries were demonstrated to meet, as well as exceed, the requirements defined by the mission.

  20. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    Science.gov (United States)

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting and introducing Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding and quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fitted for its intended use. On this basis, Partial least squares (PLS) models were developed and optimized using chemometrics methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were evaluated as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. As, by nature, the variability of the sampling method and the reference method are included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and elimination of a rather difficult sampling step and of tedious off-line analyses. © 2013 Published by Elsevier B.V.
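    The PLS calibration step can be sketched in a few lines; the example below trains a scikit-learn PLSRegression model on synthetic "spectra" spanning the dosing ranges quoted above, and is only an assumption-laden illustration of the chemometric approach, not the validated industrial method.

      # Hedged sketch of a PLS calibration relating NIR-like spectra to API and
      # residual methanol content (synthetic data; not the validated method).
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n_samples, n_wavelengths = 120, 200

      api = rng.uniform(9.0, 12.0, n_samples)         # % w/w, dosing range from the record
      methanol = rng.uniform(0.18, 1.50, n_samples)   # % w/w
      bands = np.linspace(0.0, 1.0, n_wavelengths)
      pure_api = np.exp(-((bands - 0.3) ** 2) / 0.01)       # made-up pure-component bands
      pure_meoh = np.exp(-((bands - 0.7) ** 2) / 0.02)
      spectra = (np.outer(api, pure_api) + np.outer(methanol, pure_meoh)
                 + rng.normal(0.0, 0.05, (n_samples, n_wavelengths)))

      X_train, X_test, y_train, y_test = train_test_split(
          spectra, np.column_stack([api, methanol]), random_state=0)

      pls = PLSRegression(n_components=4)             # number of latent variables to be optimised
      pls.fit(X_train, y_train)
      print("R^2 on held-out spectra:", pls.score(X_test, y_test))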

  1. Construction and validation of a tool to Assess the Use of Light Technologies at Intensive Care Units.

    Science.gov (United States)

    Marinho, Pabliane Matias Lordelo; Campos, Maria Pontes de Aguiar; Rodrigues, Eliana Ofélia Llapa; Gois, Cristiane Franca Lisboa; Barreto, Ikaro Daniel de Carvalho

    2016-12-19

    To construct and validate a tool to assess the use of light technologies by the nursing team at Intensive Care Units. This was a methodological study in which the tool was elaborated by means of the psychometric method, based on the categorization of health technologies by Merhy and Franco and on the National Humanization Policy, using the Nursing Intervention Classification taxonomy to categorize the domains of the tool. Agreement percentages and Content Validity Indices were used for the purpose of validation. The result of the application of the interrater agreement percentage exceeded the recommended level of 80%, highlighting the relevance to the proposed theme in the assessment, with an agreement rate of 99%. The tool was validated with four domains (Bond, Autonomy, Welcoming and Management) and nineteen items that assess the use of light technologies at Intensive Care Units.
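    The Content Validity Index mentioned above is commonly computed as the proportion of expert judges who rate an item as relevant (3 or 4 on a 4-point scale); the sketch below assumes that common definition and uses made-up ratings, since the paper's exact scoring rules are not given in this record.

      # Hedged sketch of item-level and scale-level Content Validity Index (CVI),
      # assuming the common 4-point relevance scale (ratings below are made up).
      def item_cvi(ratings, relevant=(3, 4)):
          """Proportion of experts rating the item as relevant."""
          return sum(r in relevant for r in ratings) / len(ratings)

      def scale_cvi(items):
          """Scale-level CVI: average of the item-level indices."""
          return sum(item_cvi(r) for r in items) / len(items)

      ratings = [                      # 4 hypothetical items x 10 hypothetical judges
          [4, 4, 3, 4, 3, 4, 4, 3, 4, 4],
          [4, 3, 4, 4, 4, 4, 3, 4, 4, 4],
          [3, 4, 4, 4, 2, 4, 4, 4, 3, 4],
          [4, 4, 4, 3, 4, 4, 4, 4, 4, 3],
      ]
      print([round(item_cvi(r), 2) for r in ratings], round(scale_cvi(ratings), 2))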

  2. Validation of the code ETOBOX/BOXER for UO2 LWR lattices based on the experiments TRX, BAPL-UO2 and other critical experiments

    International Nuclear Information System (INIS)

    Paratte, J.M.

    1985-07-01

    The EIR code system for LWR arrays is based on cross sections extracted from ENDF/B-4 and ENDF/B-5 by the code ETOBOX. The calculation method for the arrays (code BOXER), together with these cross sections, was applied to the CSEWG benchmark experiments TRX-1 to 4 and BAPL-UO2-1 to 3. The results are compared to the measured values and to some calculations by other institutions as well. This demonstrates that the deviations of the parameters calculated by BOXER are typical for the cross sections used. A large number of critical experiments were calculated using the measured material bucklings in order to bring to light possible trends in the calculation of the multiplication factor k_eff. First, it turned out that the error bounds of B_m^2 evaluated in the measurements are often optimistic. Two-dimensional calculations improved the results of the cell calculations. With a mean scattering of 4 to 5 mk in the normal arrays, the multiplication factors calculated by BOXER are satisfactory. However, one has to take into account a slight trend of k_eff to grow with the moderator-to-fuel ratio and the enrichment. (author)
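    The two checks described above, the scatter of the calculated multiplication factors (in mk, where 1 mk = 0.001 in k) and a possible trend of k_eff with the moderator-to-fuel ratio, amount to simple statistics over the benchmark set; the sketch below uses made-up numbers purely to show the arithmetic.

      # Illustration of the validation statistics described above: scatter of k_eff
      # in mk and a linear trend versus moderator-to-fuel ratio (made-up numbers).
      import numpy as np

      mod_to_fuel = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5])
      k_eff = np.array([0.996, 0.997, 0.999, 1.000, 1.001, 1.003])

      scatter_mk = np.std(k_eff, ddof=1) * 1000.0
      trend_mk = np.polyfit(mod_to_fuel, k_eff, 1)[0] * 1000.0   # slope in mk per unit ratio
      print(f"scatter = {scatter_mk:.1f} mk, trend = {trend_mk:+.1f} mk per unit moderator-to-fuel ratio")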

  3. Application of NEA/CSNI standard problem 3 (blowdown and flow reversal in the IETA-1 rig) to the validation of the RELAP-UK Mk IV code

    International Nuclear Information System (INIS)

    Bryce, W.M.

    1977-10-01

    NEA/CSNI Standard Problem 3 consists of the modelling of an experiment on the IETI-1 rig, in which there is initially flow upwards through a feeder, heated section and riser. The inlet and outlet are then closed and a breach opened at the bottom so that the flow reverses and the rig depressurises. Calculations of this problem by many countries using several computer codes have been reported and show a wide spread of results. The purpose of the study reported here was the following. First, to show the sensitivity of the calculation of Standard Problem 3. Second, to perform an ab initio best estimate calculation using the RELAP-UK Mark IV code with the standard recommended options, and third, to use the results of the sensitivity study to show where tuning of the RELAP-UK Mark IV recommended model options was required. This study has shown that the calculation of Standard Problem 3 is sensitive to model assumptions and that the use of the loss-of-coolant accident code RELAP-UK Mk IV with the standard recommended model options predicts the experimental results very well over most of the transient. (U.K.)

  4. On the structure of Lattice code WIMSD-5B

    International Nuclear Information System (INIS)

    Kim, Won Young; Min, Byung Joo

    2004-03-01

    The WIMS-D code is a freely available thermal reactor physics lattice code widely used for thermal research and power reactor calculations. The code WIMS-AECL, developed on the basis of WIMS-D, has been used as one of the lattice codes for cell calculations in Canada, and in 1998 the latest version, WIMSD-5B, was released through the OECD/NEA Data Bank. In Korea, WIMS-KAERI, also originating from WIMS-D, was developed and has been used; it was adjusted for the cell calculation of the research reactor HANARO and so it has no conf