WorldWideScience

Sample records for code validation base

  1. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes must be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to incorrect estimation of the consequences of accidents, and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. "agreement is within 10%". A quantitative method is preferable, especially when several competing codes are available; the codes can then be ranked in order of merit. Such a method is described. (Author)
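The abstract does not give the ranking metric itself; one plausible figure of merit (assumed here purely for illustration, not taken from the paper) is the root-mean-square relative error between prediction and experiment, which turns "agreement is within 10%" into a single number per code:

```python
import math

def rms_relative_error(predicted, measured):
    """RMS relative error between code predictions and experimental
    measurements (a hypothetical merit figure, not the paper's method)."""
    terms = [((p - m) / m) ** 2 for p, m in zip(predicted, measured)]
    return math.sqrt(sum(terms) / len(terms))

# Rank competing codes against one data set: lower error = higher merit.
measured = [10.0, 20.0, 30.0]
codes = {
    "code_A": [10.5, 19.0, 31.0],
    "code_B": [12.0, 25.0, 28.0],
}
ranking = sorted(codes, key=lambda c: rms_relative_error(codes[c], measured))
```

Any monotonic error norm would serve; the point is only that a scalar metric makes an ordering of codes well defined.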

  2. Further validation of liquid metal MHD code for unstructured grid based on OpenFOAM

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jingchao; Chen, Hongli, E-mail: hlchen1@ustc.edu.cn; He, Qingyun; Ye, Minyou

    2015-11-15

    Highlights: • A specific correction scheme has been adopted to correct the calculated results on non-orthogonal meshes. • The MHD code developed on the OpenFOAM platform has been validated against benchmark cases under uniform and non-uniform magnetic fields in round and rectangular ducts. • ALEX experimental results have been used to validate the OpenFOAM-based MHD code. - Abstract: In fusion liquid metal blankets, complex geometries involving contractions, expansions, bends and manifolds are very common, and the characteristics of liquid metal flow in these geometries are significant. To extend the magnetohydrodynamic (MHD) solver developed on the OpenFOAM platform to such complex geometries, a version of the solver based on unstructured meshes has been implemented. The adoption of non-orthogonal correction techniques makes it possible to handle the non-orthogonal meshes that arise in complex geometries. The present paper focuses on the validation of the code under critical conditions. One analytical benchmark case and two experimental benchmark cases were used to validate the code. Benchmark case I is MHD flow in a circular pipe with arbitrary electric conductivity of the walls in a uniform magnetic field. Benchmark cases II and III are experimental cases of 3D laminar steady MHD flow under a fringing magnetic field. In all these cases, the numerical results match the benchmark data well.
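The abstract does not say which non-orthogonal correction is used; one common finite-volume scheme (the "over-relaxed" decomposition, an assumption here, not necessarily the authors' choice) splits each face-area vector S into a part aligned with the cell-centre vector d, treated implicitly, plus an explicit non-orthogonal remainder:

```python
def over_relaxed_decomposition(S, d):
    """Split face-area vector S into an orthogonal component delta
    (aligned with the cell-centre vector d, via delta = |S|^2/(S.d) * d)
    and a non-orthogonal remainder k = S - delta. The diffusive face
    flux is then discretized implicitly through delta and corrected
    explicitly through k."""
    dot_Sd = sum(si * di for si, di in zip(S, d))
    mag_S2 = sum(si * si for si in S)
    delta = [mag_S2 / dot_Sd * di for di in d]   # orthogonal part
    k = [si - dl for si, dl in zip(S, delta)]    # non-orthogonal correction
    return delta, k

# On an orthogonal mesh (S parallel to d) the correction vanishes:
delta, k = over_relaxed_decomposition([2.0, 0.0, 0.0], [1.0, 0.0, 0.0])
```

On strongly skewed cells k is large, which is why the explicit correction is typically iterated to convergence within each time step.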

  3. Further validation of liquid metal MHD code for unstructured grid based on OpenFOAM

    International Nuclear Information System (INIS)

    Feng, Jingchao; Chen, Hongli; He, Qingyun; Ye, Minyou

    2015-01-01

    Highlights: • A specific correction scheme has been adopted to correct the calculated results on non-orthogonal meshes. • The MHD code developed on the OpenFOAM platform has been validated against benchmark cases under uniform and non-uniform magnetic fields in round and rectangular ducts. • ALEX experimental results have been used to validate the OpenFOAM-based MHD code. - Abstract: In fusion liquid metal blankets, complex geometries involving contractions, expansions, bends and manifolds are very common, and the characteristics of liquid metal flow in these geometries are significant. To extend the magnetohydrodynamic (MHD) solver developed on the OpenFOAM platform to such complex geometries, a version of the solver based on unstructured meshes has been implemented. The adoption of non-orthogonal correction techniques makes it possible to handle the non-orthogonal meshes that arise in complex geometries. The present paper focuses on the validation of the code under critical conditions. One analytical benchmark case and two experimental benchmark cases were used to validate the code. Benchmark case I is MHD flow in a circular pipe with arbitrary electric conductivity of the walls in a uniform magnetic field. Benchmark cases II and III are experimental cases of 3D laminar steady MHD flow under a fringing magnetic field. In all these cases, the numerical results match the benchmark data well.

  4. Containment Code Validation Matrix

    International Nuclear Information System (INIS)

    Chin, Yu-Shan; Mathew, P.M.; Glowa, Glenn; Dickson, Ray; Liang, Zhe; Leitch, Brian; Barber, Duncan; Vasic, Aleks; Bentaib, Ahmed; Journeau, Christophe; Malet, Jeanne; Studer, Etienne; Meynet, Nicolas; Piluso, Pascal; Gelain, Thomas; Michielsen, Nathalie; Peillon, Samuel; Porcheron, Emmanuel; Albiol, Thierry; Clement, Bernard; Sonnenkalb, Martin; Klein-Hessling, Walter; Arndt, Siegfried; Weber, Gunter; Yanez, Jorge; Kotchourko, Alexei; Kuznetsov, Mike; Sangiorgi, Marco; Fontanet, Joan; Herranz, Luis; Garcia De La Rua, Carmen; Santiago, Aleza Enciso; Andreani, Michele; Paladino, Domenico; Dreier, Joerg; Lee, Richard; Amri, Abdallah

    2014-01-01

    The Committee on the Safety of Nuclear Installations (CSNI) formed the CCVM (Containment Code Validation Matrix) task group in 2002. The objective of this group was to define a basic set of available experiments for code validation, covering the range of containment (ex-vessel) phenomena expected in the course of light and heavy water reactor design basis accidents and beyond design basis accidents/severe accidents. It was to consider phenomena relevant to pressurised heavy water reactor (PHWR), pressurised water reactor (PWR) and boiling water reactor (BWR) designs of Western origin as well as of Eastern European VVER types. This work complements the two existing CSNI validation matrices for thermal hydraulic code validation (NEA/CSNI/R(1993)14) and in-vessel core degradation (NEA/CSNI/R(2001)21). The report initially provides a brief overview of the main features of PWR, BWR, CANDU and VVER reactors, as well as an overview of ex-vessel corium retention (the core catcher). It then gives a general overview of accident progression for light water and heavy water reactors. The main focus is to capture most of the phenomena and safety systems employed in these reactor types and to highlight the differences. This CCVM contains a description of 127 phenomena, broken down into 6 categories: - Containment Thermal-hydraulics Phenomena; - Hydrogen Behaviour (Combustion, Mitigation and Generation) Phenomena; - Aerosol and Fission Product Behaviour Phenomena; - Iodine Chemistry Phenomena; - Core Melt Distribution and Behaviour in Containment Phenomena; - Systems Phenomena. A synopsis is provided for each phenomenon, including a description, references for further information, its significance for DBA and SA/BDBA, and a list of experiments that may be used for code validation. The report identified 213 experiments, broken down into the same six categories as the phenomena. An experiment synopsis, along with a test description, is provided for each test.

  5. The COSIMA-experiments, a data base for validation of two-phase flow computer codes

    International Nuclear Information System (INIS)

    Class, G.; Meyder, R.; Stratmanns, E.

    1985-12-01

    The report presents an overview of the large data base generated with COSIMA. The data base is to be used to validate and develop computer codes for two-phase flow. In terms of fuel rod behavior, it was found that during blowdown under realistic conditions only small strains are reached; for clad rupture, extremely high rod internal pressure is necessary. Additionally, important results were found on the behavior of a fuel rod simulator and on the effect of thermocouples attached to the cladding outer surface. Post-test calculations performed with the codes RELAP and DRUFAN show good agreement with the experiments. This agreement could, however, be improved if the phase separation models in the codes were updated. (orig./HP)

  6. Development and validation of a criticality calculation scheme based on French deterministic transport codes

    International Nuclear Information System (INIS)

    Santamarina, A.

    1991-01-01

    A criticality-safety calculational scheme using the automated deterministic code system APOLLO-BISTRO has been developed. The cell/assembly code APOLLO is used mainly in LWR and HCR design calculations, and its validation spans a wide range of moderation ratios, including voided configurations. Its recent 99-group library and self-shielded cross-sections have been extensively qualified through critical experiments and PWR spent fuel analysis. The PIC self-shielding formalism enables a rigorous treatment of the fuel double heterogeneity in dissolver medium calculations. BISTRO is an optimized multidimensional SN code, part of the modular CCRR package used mainly in FBR calculations. The APOLLO-BISTRO scheme was applied to the 18 experimental benchmarks selected by the OECD/NEACRP Criticality Calculation Working Group. The calculation-experiment discrepancy was within ±1% in ΔK/K and always looked consistent with the experimental uncertainty margin. In the critical experiments corresponding to a dissolver-type benchmark, our tools computed a satisfactory Keff. In the VALDUC fuel storage experiments with hafnium plates, the computed Keff ranged between 0.994 and 1.003 for the various water gaps spacing the fuel clusters from the absorber plates. The APOLLO-KENOEUR statistical calculational scheme, based on the same self-shielded multigroup library, supplied consistent results within 0.3% in ΔK/K. (Author)
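The quoted ±1% band refers to the relative difference in the multiplication factor; a minimal sketch of that comparison (the numbers below are illustrative, not the paper's):

```python
def reactivity_discrepancy(k_calc, k_exp):
    """Calculation-experiment discrepancy expressed as delta-K/K,
    i.e. the relative difference in the effective multiplication factor."""
    return (k_calc - k_exp) / k_exp

# A computed keff of 0.998 against a critical experiment (keff = 1.000)
# is a -0.2% discrepancy, inside the quoted +/-1% band.
dk_over_k = reactivity_discrepancy(0.998, 1.000)
within_band = abs(dk_over_k) <= 0.01
```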

  7. Validation of a Computational Fluid Dynamics (CFD) Code for Supersonic Axisymmetric Base Flow

    Science.gov (United States)

    Tucker, P. Kevin

    1993-01-01

    The ability to accurately and efficiently calculate the flow structure in the base region of bodies of revolution in supersonic flight is a significant step in CFD code validation for applications ranging from base heating for rockets to drag for projectiles. The FDNS code is used to compute such a flow, and the results are compared to benchmark-quality experimental data. Flowfield calculations are presented for a cylindrical afterbody at M = 2.46 and angle of attack α = 0. Grid-independent solutions are compared to mean velocity profiles in the separated wake area and downstream of the reattachment point. Additionally, quantities such as turbulent kinetic energy and shear layer growth rates are compared to the data. Finally, the computed base pressures are compared to the measured values. An effort is made to elucidate the role of turbulence models in the flowfield predictions. The level of turbulent eddy viscosity, and its origin, are used to contrast the various turbulence models and compare the results to the experimental data.

  8. Flight code validation simulator

    Science.gov (United States)

    Sims, Brent A.

    1996-05-01

    An End-To-End Simulation capability for software development and validation of missile flight software on the actual embedded computer has been developed utilizing a 486 PC, i860 DSP coprocessor, embedded flight computer and custom dual port memory interface hardware. This system allows real-time interrupt driven embedded flight software development and checkout. The flight software runs in a Sandia Digital Airborne Computer and reads and writes actual hardware sensor locations in which Inertial Measurement Unit data resides. The simulator provides six degree of freedom real-time dynamic simulation, accurate real-time discrete sensor data and acts on commands and discretes from the flight computer. This system was utilized in the development and validation of the successful premier flight of the Digital Miniature Attitude Reference System in January of 1995 at the White Sands Missile Range on a two stage attitude controlled sounding rocket.

  9. Validation of the thermal-hydraulic system code ATHLET based on selected pressure drop and void fraction BFBT tests

    Energy Technology Data Exchange (ETDEWEB)

    Di Marcello, Valentino, E-mail: valentino.marcello@kit.edu; Escalante, Javier Jimenez; Espinoza, Victor Sanchez

    2015-07-15

    Highlights: • Simulation of BFBT-BWR steady-state and transient tests with ATHLET. • Validation of thermal-hydraulic models against pressure drop and void fraction measurements. • The TRACE system code is used for the comparative study. • Predictions are in good agreement with the experiments. • Discrepancies are smaller than, or comparable with, the measurement uncertainty. - Abstract: Validation and qualification of thermal-hydraulic system codes against separate effect tests are essential for the reliability of numerical tools when applied to nuclear power plant analyses. To this end, the Institute for Neutron Physics and Reactor Technology (INR) at the Karlsruhe Institute of Technology (KIT) is involved in various validation and qualification activities for different CFD, sub-channel and system codes. In this paper, the capabilities of the thermal-hydraulic code ATHLET are assessed against the experimental results provided within the NUPEC BFBT benchmark, which address key Boiling Water Reactor (BWR) phenomena. Void fraction and pressure drop measurements in the BFBT bundle, performed under steady-state and transient conditions representative of, e.g., turbine trip and recirculation pump trip events, are compared with the numerical results of ATHLET. The comparison of code predictions with the BFBT data shows good agreement given the experimental uncertainty, and the results are consistent with the trends obtained with similar thermal-hydraulic codes.

  10. Validity of Principal Diagnoses in Discharge Summaries and ICD-10 Coding Assessments Based on National Health Data of Thailand.

    Science.gov (United States)

    Sukanya, Chongthawonsatid

    2017-10-01

    This study examined the validity of the principal diagnoses on discharge summaries and coding assessments. Data were collected from the National Health Security Office (NHSO) of Thailand in 2015. In total, 118,971 medical records were audited. The sample was drawn from government hospitals and private hospitals covered by the Universal Coverage Scheme in Thailand; hospitals and cases were selected using NHSO criteria. The validity of the principal diagnoses listed in the "Summary and Coding Assessment" forms was established by comparing data from the discharge summaries with data obtained from medical record reviews and, additionally, by comparing data from the coding assessments with data in the computerized ICD (the data base used for reimbursement purposes). The summary assessments had low sensitivities (7.3%-37.9%), high specificities (97.2%-99.8%), low positive predictive values (9.2%-60.7%), and high negative predictive values (95.9%-99.3%). The coding assessments had low sensitivities (31.1%-69.4%), high specificities (99.0%-99.9%), moderate positive predictive values (43.8%-89.0%), and high negative predictive values (97.3%-99.5%). The discharge summaries and codings often contained mistakes, particularly in the categories "Endocrine, nutritional, and metabolic diseases", "Symptoms, signs, and abnormal clinical and laboratory findings not elsewhere classified", "Factors influencing health status and contact with health services", and "Injury, poisoning, and certain other consequences of external causes". The validity of the principal diagnoses on the summary and coding assessment forms was found to be low. The training of physicians and coders must be strengthened to improve the validity of discharge summaries and codings.
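The four reported statistics follow from a standard 2x2 comparison of the recorded diagnosis against the medical-record review taken as the gold standard; a minimal sketch (generic counts, not the study's data):

```python
def diagnostic_validity(tp, fp, fn, tn):
    """Validity statistics for a 2x2 table: recorded principal diagnosis
    (the 'test') versus medical-record review (the gold standard).
    tp/fp/fn/tn are the usual true/false positive/negative counts."""
    return {
        "sensitivity": tp / (tp + fn),  # truly-present cases recorded as present
        "specificity": tn / (tn + fp),  # truly-absent cases recorded as absent
        "ppv": tp / (tp + fp),          # recorded-present that are truly present
        "npv": tn / (tn + fn),          # recorded-absent that are truly absent
    }

# Illustrative counts only: low sensitivity/PPV with high specificity/NPV,
# the pattern the study reports.
stats = diagnostic_validity(tp=30, fp=20, fn=70, tn=880)
```

The reported pattern (high specificity and NPV alongside low sensitivity and PPV) is typical when a diagnosis category is rare and frequently under-recorded.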

  11. Validation of the reactor dynamics code HEXTRAN

    International Nuclear Information System (INIS)

    Kyrki-Rajamaeki, R.

    1994-05-01

    HEXTRAN is a new three-dimensional, hexagonal reactor dynamics code developed at the Technical Research Centre of Finland (VTT) for VVER-type reactors. This report describes the validation work on HEXTRAN, carried out with financing from the Finnish Centre for Radiation and Nuclear Safety (STUK). HEXTRAN is particularly intended for the calculation of accidents in which radially asymmetric phenomena occur and both accurate neutron dynamics and two-phase thermal hydraulics are important. HEXTRAN is based on already validated codes, and the models taken from these codes have been shown to function correctly within HEXTRAN as well. The main new model of HEXTRAN, the spatial neutron kinetics model, has been successfully validated against LR-0 test reactor and Loviisa plant measurements. Coupled with SMABRE, HEXTRAN can be reliably used for the calculation of transients involving the whole cooling system of VVERs. Further validation plans are also presented in the report. (orig.). (23 refs., 16 figs., 2 tabs.)

  12. Development and validation of a GUI-based input file generation code for RELAP

    International Nuclear Information System (INIS)

    Anwar, M.M.; Khan, A.A.; Chughati, I.R.; Chaudri, K.S.; Inyat, M.H.; Hayat, T.

    2009-01-01

    Reactor Excursion and Leak Analysis Program (RELAP) is a widely accepted computer code for thermal-hydraulic modeling of nuclear power plants. It calculates thermal-hydraulic transients in water-cooled nuclear reactors by solving approximations to the one-dimensional, two-phase equations of hydraulics in an arbitrarily connected system of nodes. However, preparing the input file and analyzing the results of this code is a tedious task. A Graphical User Interface (GUI) for preparing RELAP-5 input files has therefore been developed, and the input files it generates have been validated. The GUI is developed in Microsoft Visual Studio using Visual C Sharp (C#) as the programming language. The nodalization diagram is drawn graphically, and the program contains various component forms, along with a starting data form, which are launched to assign properties and generate the input file cards. The GUI provides an Open/Save function to store and recall the nodalization diagram along with the components' properties. The generated input files were validated for several case studies, and individual component cards were compared with the originally required format; the generated input files were found consistent with the requirements of RELAP. The GUI provides a useful platform for simulating complex hydrodynamic problems efficiently with RELAP. (author)
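The core of such a generator is serializing component properties into numbered cards; the sketch below is schematic only (a title line followed by numbered, whitespace-separated cards), since the abstract does not reproduce the actual RELAP card layout, and the card numbers and fields here are hypothetical:

```python
def write_cards(title, components, path):
    """Write a schematic card-style input deck: a title line followed by
    numbered cards, one per component property set, sorted by card number.
    The numbering and field layout are illustrative, not RELAP's format."""
    lines = [f"= {title}"]
    for card_no, fields in sorted(components.items()):
        lines.append(f"{card_no}  " + "  ".join(str(f) for f in fields))
    text = "\n".join(lines) + "\n"
    with open(path, "w") as fh:
        fh.write(text)
    return text

# Hypothetical component cards keyed by card number.
deck = write_cards("demo problem",
                   {1010000: ["pipe-1", "pipe"], 1010001: [10]},
                   "demo.i")
```

A GUI front end then reduces to populating the `components` mapping from forms and nodalization-diagram state before serializing.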

  13. Validation of DRAGON code in connection with WIMS-AECL/RFSP code system based on ENDF/B-VI library and two group model

    International Nuclear Information System (INIS)

    Hong, In Seob; Suk, Ho Chun; Kim, Soon Young; Jo, Chang Keun

    2002-06-01

    The major objective of this research is to validate the incremental cross section treatment of the DRAGON code within the WIMS-AECL/DRAGON/RFSP code system, using the ENDF/B-VI library and a full two-group (2G) calculation model. Direct comparison of the incremental cross sections calculated by DRAGON with ENDF/B-VI and ENDF/B-V against those of MULTICELL with ENDF/B-V indicates that there is little difference between the DRAGON results with ENDF/B-V and ENDF/B-VI, but that large discrepancies exist between the results of DRAGON and those of MULTICELL. In the analysis of the difference between calculated and measured reactivity worths of various types of control devices during the Phase-B post-simulation of Wolsong Units 2, 3 and 4, the WIMS-AECL/DRAGON/RFSP analysis agrees well with the previous WIMS-AECL/MULTICELL/RFSP analysis, with only very small differences. From these results, we conclude that the DRAGON code can be used as a general-purpose incremental cross section generation tool, not only for natural uranium fuel but also for slightly enriched fuel such as RU or SEU, covering the shortcomings of the natural-uranium-based MULTICELL code.

  14. The database 'EDUD Base' for validation of neutron-physics codes used to analyze the WWER-440 cores

    International Nuclear Information System (INIS)

    Rocek, J.; Belac, J.; Miasnikov, A.

    2003-01-01

    The program and data system EDUDBase for validation of reactor computing codes was developed at NRI. It is designed for validation and evaluation of the precision of different computer codes used for WWER core analyses. The main goal of this database is to provide data for comparison with the calculation results of tested codes, together with tools for statistical analysis of differences between the calculation results and the test data. The benchmark data sets are based on in-core measurements performed on the WWER-440 reactors of Dukovany NPP. The initial data from the NPP are verified, errors and inaccuracies are eliminated, and the data are transferred to a form suitable for comparison with calculation results. A special reduced operating history data set, the 'Benchmark Operation History', is created for each operating cycle to be used as input data for calculation. It contains values of several integral quantities for each time point: effective time, integral thermal power, boron concentration, position of the working group of control assemblies (group 6) and inlet coolant temperature. At present, sets are available for all completed cycles up to (unit/cycle) 1/17, 2/16, 3/15, 4/15. The power distribution is described for approximately 40 time steps during each operating cycle. 2D power distributions are transferred into a 60-degree symmetry sector of the reactor core. At present, such data sets are available only for the later cycles starting with (unit/cycle) 1/7, 2/6, 3/5, 4/5 (in other words, the last 11 cycles for each unit) (Authors)
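The reduced operating-history set pairs each time point with the handful of integral quantities listed above; a minimal sketch of such a record (field names and units are assumptions for illustration, not EDUDBase's actual schema):

```python
from dataclasses import dataclass

@dataclass
class HistoryPoint:
    """One time point of a reduced 'Benchmark Operation History'.
    Field names and units are illustrative, not the EDUDBase schema."""
    effective_time_days: float
    thermal_power_mw: float
    boron_concentration_g_per_kg: float
    group6_position_cm: float       # working group (group 6) control assemblies
    inlet_temperature_c: float

# A cycle is then just an ordered list of such points (values invented).
history = [
    HistoryPoint(0.0, 1375.0, 6.2, 200.0, 267.0),
    HistoryPoint(30.0, 1375.0, 5.4, 210.0, 267.0),
]
```

A tested code would be driven through these points in sequence and its predicted power distributions compared against the measured 2D maps at each step.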

  15. Development and validation of sodium fire codes

    International Nuclear Information System (INIS)

    Morii, Tadashi; Himeno, Yoshiaki; Miyake, Osamu

    1989-01-01

    Development, verification, and validation of the spray fire code, SPRAY-3M, the pool fire codes, SOFIRE-M2 and SPM, the aerosol behavior code, ABC-INTG, and the simultaneous spray and pool fires code, ASSCOPS, are presented. In addition, the state-of-the-art of development of the multi-dimensional natural convection code, SOLFAS, for the analysis of heat-mass transfer during a fire, is presented. (author)

  16. Enclosure environment characterization testing for the base line validation of computer fire simulation codes

    International Nuclear Information System (INIS)

    Nowlen, S.P.

    1987-03-01

    This report describes a series of fire tests conducted under the direction of Sandia National Laboratories for the US Nuclear Regulatory Commission. The primary purpose of these tests was to provide data against which to validate computer fire environment simulation models to be used in the analysis of nuclear power plant enclosure fire situations. Examples of the data gathered during three of the tests are presented, though the primary objective of this report is to provide a timely description of the test effort itself. The tests were conducted in an enclosure measuring 60x40x20 feet constructed at the Factory Mutual Research Corporation fire test facility in Rhode Island. All of the tests utilized forced ventilation conditions; the ventilation system was designed to simulate typical nuclear power plant installation practices and ventilation rates. A total of 22 tests using simple gas burner, heptane pool, methanol pool, and PMMA solid fires was conducted. Four of these tests were conducted with a full-scale control room mockup in place. Parameters varied during testing were fire intensity, enclosure ventilation rate, and fire location. Data gathered include air temperatures, air velocities, radiative and convective heat flux levels, optical smoke densities, inner and outer enclosure surface temperatures, enclosure surface heat flux levels, and gas concentrations within the enclosure and in the exhaust stream.

  17. Measurement of reactivity coefficients for code validation

    International Nuclear Information System (INIS)

    Nuding, Matthias; Loetsch, Thomas

    2005-01-01

    In 2003, measurements in the cold reactor state were performed at the NPP KKI 2 in order to validate the codes that are used for reactor core calculations, and especially for the proof of the shutdown margin, which is produced by calculations only. For full power states, code verification is comparatively easy because the calculations can be compared with various measured values, e.g. the activation values determined by the aeroball system. For cold reactor states, however, the data base is smaller, especially for reactor cores that are quite 'inhomogeneous' and have rather high contents of fissile plutonium and 235U. At the same time, the cold reactor state is important with regard to the shutdown margin. For these reasons, the measurements mentioned above were performed in order to check the accuracy of the codes that have been used by the operator and by our organization for many years. Basically, boron concentrations and control rod worths were measured for different configurations. The calculated results show very good agreement with the measured values. It can therefore be stated that both the operator's code system and ours are suitable for routine use, e.g. during licensing procedures (Authors)

  18. A fuel performance code TRUST VIc and its validation

    Energy Technology Data Exchange (ETDEWEB)

    Ishida, M; Kogai, T [Nippon Nuclear Fuel Development Co. Ltd., Oarai, Ibaraki (Japan)

    1997-08-01

    This paper describes the fuel performance code TRUST V1c, developed to analyze the thermal and mechanical behavior of LWR fuel rods. Submodels in the code include fission product (FP) gas models describing gaseous swelling, gas release from the pellet, and axial gas mixing. The code has an FEM-based structure to handle the interaction between the thermal and mechanical submodels introduced by the gas models. The code is validated against irradiation data on fuel centerline temperature, FGR, pellet porosity and cladding deformation. (author). 9 refs, 8 figs.

  19. A fuel performance code TRUST VIc and its validation

    International Nuclear Information System (INIS)

    Ishida, M.; Kogai, T.

    1997-01-01

    This paper describes the fuel performance code TRUST V1c, developed to analyze the thermal and mechanical behavior of LWR fuel rods. Submodels in the code include fission product (FP) gas models describing gaseous swelling, gas release from the pellet, and axial gas mixing. The code has an FEM-based structure to handle the interaction between the thermal and mechanical submodels introduced by the gas models. The code is validated against irradiation data on fuel centerline temperature, FGR, pellet porosity and cladding deformation. (author). 9 refs, 8 figs

  20. Cable SGEMP Code Validation Study

    Energy Technology Data Exchange (ETDEWEB)

    Ballard, William Parker [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Center for CA Weapons Systems Engineering

    2013-05-01

    This report compared data taken on the Modular Bremsstrahlung Simulator using copper jacketed (cujac) cables with calculations using the RHSD-RA Cable SGEMP analysis tool. The tool relies on CEPXS/ONBFP to perform radiation transport in a series of 1D slices through the cable, and then uses a Green function technique to evaluate the expected current drive on the center conductor. The data were obtained in 2003 as part of a Cabana verification and validation experiment using 1-D geometries, but were not evaluated until now. The agreement between data and model is not adequate unless gaps between the dielectric and outer conductor (ground) are assumed, and these gaps are large compared with what is believed to be in the actual cable.
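The Green-function step amounts to convolving a per-slice impulse response with the computed charge-deposition drive to obtain the center-conductor current; the sketch below illustrates the generic technique only (a discrete convolution), not the RHSD-RA tool's actual kernel:

```python
def green_response(drive, green):
    """Discrete convolution of a source drive with a Green function:
    response[n] = sum_k green[k] * drive[n - k]. Illustrative of the
    Green-function technique in general, not the RHSD-RA implementation."""
    n_out = len(drive) + len(green) - 1
    out = [0.0] * n_out
    for i, g in enumerate(green):
        for j, d in enumerate(drive):
            out[i + j] += g * d
    return out

# Sanity check: an impulse drive simply reproduces the Green function.
resp = green_response([1.0, 0.0, 0.0], [0.5, 0.3, 0.1])
```

The appeal of the method is that the expensive 1D radiation-transport solve (here, CEPXS/ONBFP) is done once per slice to build the kernel, after which any drive waveform can be evaluated cheaply.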

  1. Validations and applications of the FEAST code

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Z.; Tayal, M.; Lau, J.H.; Evinou, D. [Atomic Energy of Canada Limited, Mississauga, Ontario (Canada); Jun, J.S. [Korea Atomic Energy Research Inst. (Korea, Republic of)

    1999-07-01

    The FEAST (Finite Element Analysis for STresses) code is part of a suite of computer codes that are used to assess the structural integrity of CANDU fuel elements and bundles. A detailed validation of the FEAST code was recently performed. The FEAST calculations are in good agreement with a variety of analytical solutions (18 cases) for stresses, strains and displacements. This consistency shows that the FEAST code correctly incorporates the fundamentals of stress analysis. Further, the calculations of the FEAST code match the variations in axial and hoop strain profiles, measured by strain gauges near the sheath-endcap weld during an out-reactor compression test. The code calculations are also consistent with photoelastic measurements in simulated endcaps. (author)

  2. Validations and applications of the FEAST code

    International Nuclear Information System (INIS)

    Xu, Z.; Tayal, M.; Lau, J.H.; Evinou, D.; Jun, J.S.

    1999-01-01

    The FEAST (Finite Element Analysis for STresses) code is part of a suite of computer codes that are used to assess the structural integrity of CANDU fuel elements and bundles. A detailed validation of the FEAST code was recently performed. The FEAST calculations are in good agreement with a variety of analytical solutions (18 cases) for stresses, strains and displacements. This consistency shows that the FEAST code correctly incorporates the fundamentals of stress analysis. Further, the calculations of the FEAST code match the variations in axial and hoop strain profiles, measured by strain gauges near the sheath-endcap weld during an out-reactor compression test. The code calculations are also consistent with photoelastic measurements in simulated endcaps. (author)

  3. Nuclear data to support computer code validation

    International Nuclear Information System (INIS)

    Fisher, S.E.; Broadhead, B.L.; DeHart, M.D.; Primm, R.T. III

    1997-04-01

    The rate of plutonium disposition will be a key parameter in determining the degree of success of the Fissile Materials Disposition Program. Estimates of the disposition rate are dependent on neutronics calculations. To ensure that these calculations are accurate, the codes and data should be validated against applicable experimental measurements. Further, before mixed-oxide (MOX) fuel can be fabricated and loaded into a reactor, the fuel vendors, fabricators, fuel transporters, reactor owners and operators, regulatory authorities, and the Department of Energy (DOE) must accept the validity of design calculations. This report presents sources of neutronics measurements that have potential application for validating reactor physics (predicting the power distribution in the reactor core), predicting the spent fuel isotopic content, predicting the decay heat generation rate, certifying criticality safety of fuel cycle facilities, and ensuring adequate radiation protection at the fuel cycle facilities and the reactor. The U.S. in-reactor experience with MOX fuel is presented first. Other MOX fuel performance information would also be valuable to this program, but the relevant data base remains largely proprietary, and that information is therefore not reported here. It is expected that the selected consortium will make the necessary arrangements to procure or have access to the requisite information.

  4. Validation of comprehensive space radiation transport code

    International Nuclear Information System (INIS)

    Shinn, J.L.; Simonsen, L.C.; Cucinotta, F.A.

    1998-01-01

    The HZETRN code has been developed over the past decade to evaluate the local radiation fields within sensitive materials on spacecraft in the space environment. Most of the more important nuclear and atomic processes are now modeled, and evaluation within a complex spacecraft geometry with differing material components, including transition effects across boundaries of dissimilar materials, is included. The atomic/nuclear database and transport procedures have received limited validation in laboratory testing with high energy ion beams. The code has been applied in the design of the SAGE-III instrument, resulting in material changes to control injurious neutron production, in the study of Space Shuttle single event upsets, and in validation against space measurements (particle telescopes, tissue equivalent proportional counters, CR-39) on Shuttle and Mir. The present paper reviews the code development and presents recent results of laboratory and space flight validation.

  5. Langmuir probe-based observables for plasma-turbulence code validation and application to the TORPEX basic plasma physics experiment

    International Nuclear Information System (INIS)

    Ricci, Paolo; Theiler, C.; Fasoli, A.; Furno, I.; Labit, B.; Mueller, S. H.; Podesta, M.; Poli, F. M.

    2009-01-01

    The methodology for plasma-turbulence code validation is discussed, with focus on the quantities to use for the simulation-experiment comparison, i.e., the validation observables, and on application to the TORPEX basic plasma physics experiment [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)]. The considered validation observables are deduced from Langmuir probe measurements and are ordered into a primacy hierarchy, according to the number of model assumptions and to the combinations of measurements needed to form each of them. The lowest levels of the primacy hierarchy correspond to observables that require the fewest model assumptions and measurement combinations, such as the statistical and spectral properties of the ion saturation current time trace, while at the highest levels, quantities such as particle transport are considered. The comparison of observables at the lowest levels of the hierarchy is more stringent than at the highest levels. As examples of their use, the proposed observables are applied to a specific TORPEX plasma configuration characterized by interchange-driven turbulence.
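    The lowest-level observables mentioned above (statistical moments and the power spectrum of an ion saturation current time trace) reduce to a few lines of numerics. The following is a minimal sketch; the function names and the plain-periodogram spectral estimator are illustrative choices, not the paper's actual analysis chain:

```python
import numpy as np

def statistical_observables(isat):
    """Low-level validation observables of an ion saturation current trace:
    mean, standard deviation, skewness, and excess kurtosis."""
    x = np.asarray(isat, dtype=float)
    mu, sigma = x.mean(), x.std()
    z = (x - mu) / sigma
    return {"mean": mu, "std": sigma,
            "skewness": float(np.mean(z**3)),
            "kurtosis": float(np.mean(z**4) - 3.0)}

def power_spectrum(isat, dt):
    """One-sided power spectral density via a plain periodogram
    (mean removed, no windowing or segment averaging)."""
    x = np.asarray(isat, dtype=float) - np.mean(isat)
    freqs = np.fft.rfftfreq(len(x), dt)
    psd = (np.abs(np.fft.rfft(x)) ** 2) * dt / len(x)
    return freqs, psd
```

    In practice one would average the periodogram over many sub-records (Welch's method) before comparing simulated and measured spectra.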

  6. Validation study of SRAC2006 code system based on evaluated nuclear data libraries for TRIGA calculations by benchmarking integral parameters of TRX and BAPL lattices of thermal reactors

    International Nuclear Information System (INIS)

    Khan, M.J.H.; Sarker, M.M.; Islam, S.M.A.

    2013-01-01

    Highlights: ► To validate the SRAC2006 code system for TRIGA neutronics calculations. ► TRX and BAPL lattices are treated as standard benchmarks for this purpose. ► To compare the calculated results with experimental as well as MCNP values. ► The study demonstrates a good agreement with the experimental and the MCNP results. ► Thus, this analysis constitutes a validation study of the SRAC2006 code system. - Abstract: The goal of this work is to present the validation of the SRAC2006 code system, based on the evaluated nuclear data libraries ENDF/B-VII.0 and JENDL-3.3, for neutronics analysis of the TRIGA Mark-II Research Reactor at AERE, Bangladesh. This is achieved through the analysis of integral parameters of TRX and BAPL benchmark lattices of thermal reactors. In integral measurements, the thermal reactor lattices TRX-1, TRX-2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3 are treated as standard benchmarks for validating/testing the SRAC2006 code system as well as the nuclear data libraries. The integral parameters of these lattices are calculated using the collision probability transport code PIJ of the SRAC2006 code system at room temperature (20 °C), based on the above libraries. The calculated integral parameters are compared to the measured values as well as to MCNP values based on the Chinese evaluated nuclear data library CENDL-3.0. In most cases, the integral parameters demonstrate good agreement with the experimental and the MCNP results. In addition, the group constants in SRAC format for the TRX and BAPL lattices, in the fast and thermal energy ranges respectively, were compared between the above libraries and found to be nearly identical, with insignificant differences. Therefore, this analysis constitutes a validation study of the SRAC2006 code system based on the evaluated nuclear data libraries JENDL-3.3 and ENDF/B-VII.0, and provides a basis for further neutronics calculations.

  7. European Validation of the Integral Code ASTEC (EVITA)

    International Nuclear Information System (INIS)

    Allelein, H.-J.; Neu, K.; Dorsselaere, J.P. Van

    2005-01-01

    The main objective of the European Validation of the Integral Code ASTEC (EVITA) project is to distribute the severe accident integral code ASTEC to European partners in order to apply the validation strategy issued from the VASA project (4th EC FWP). Partners evaluate the code's capability through validation on reference experiments and plant applications accounting for severe accident management measures, and compare results with reference codes. The basis version V0 of ASTEC (Accident Source Term Evaluation Code), commonly developed and basically validated by GRS and IRSN, was made available in late 2000 to the EVITA partners on their individual platforms. Users' training was performed by IRSN and GRS. The code's portability across different computers was verified. A 'hot line' assistance service, continuously available to EVITA code users, was installed. The current version, V1, was released to the EVITA partners at the end of June 2002. It allows simulation of the front-end phase through two new modules: for the reactor coolant system, 2-phase simplified thermal hydraulics (5-equation approach) during both the front-end and core degradation phases; and for core degradation, models based on the structure and main models of ICARE2, the IRSN reference mechanistic code for core degradation, together with other simplified models. The next priorities are clearly identified: code consolidation in order to increase robustness, extension of all plant applications beyond vessel lower head failure with coupling to the fission product modules, and continuous improvement of users' tools. As EVITA has very successfully made the first step towards providing end-users (such as utilities, vendors and licensing authorities) with a well-validated European integral code for the simulation of severe accidents in NPPs, the EVITA partners strongly recommend continuing the validation, benchmarking and application of ASTEC. This work will continue in the Severe Accident Research Network (SARNET) in the 6th Framework Programme.

  8. 45 CFR 162.1011 - Valid code sets.

    Science.gov (United States)

    2010-10-01

    45 Public Welfare, Administrative Requirements, Code Sets, § 162.1011 Valid code sets (2010-10-01 edition): Each code set is valid within the dates specified by the organization responsible for maintaining that code set.

  9. Validation of the reactor dynamics code TRAB

    International Nuclear Information System (INIS)

    Raety, H.; Kyrki-Rajamaeki, R.; Rajamaeki, M.

    1991-05-01

    The one-dimensional reactor dynamics code TRAB (Transient Analysis code for BWRs) developed at VTT was originally designed for BWR analyses, but in its present version it can be used for various modelling purposes. The core model of TRAB can be used separately for LWR calculations. For PWR modelling, the core model of TRAB has been coupled to the circuit model SMABRE to form the SMATRA code. The versatile modelling capabilities of TRAB have also been utilized in analyses of, e.g., the heating reactor SECURE and the RBMK-type reactor (Chernobyl). The report summarizes the extensive validation of TRAB. TRAB has been validated with benchmark problems, comparative calculations against independent analyses, analyses of start-up experiments of nuclear power plants, and real plant transients. Comparative RBMK-type reactor calculations have been made against Soviet simulations, and the initial power excursion of the Chernobyl reactor accident has also been calculated with TRAB.

  10. Identifying Psoriasis and Psoriatic Arthritis Patients in Retrospective Databases When Diagnosis Codes Are Not Available: A Validation Study Comparing Medication/Prescriber Visit-Based Algorithms with Diagnosis Codes.

    Science.gov (United States)

    Dobson-Belaire, Wendy; Goodfield, Jason; Borrelli, Richard; Liu, Fei Fei; Khan, Zeba M

    2018-01-01

    Using diagnosis code-based algorithms is the primary method of identifying patient cohorts for retrospective studies; nevertheless, many databases lack reliable diagnosis code information. To develop precise algorithms based on medication claims/prescriber visits (MCs/PVs) to identify psoriasis (PsO) patients and psoriatic patients with arthritic conditions (PsO-AC), a proxy for psoriatic arthritis, in Canadian databases lacking diagnosis codes. Algorithms were developed using medications with narrow indication profiles in combination with prescriber specialty to define PsO and PsO-AC. For a 3-year study period from July 1, 2009, algorithms were validated using the PharMetrics Plus database, which contains both adjudicated medication claims and diagnosis codes. Positive predictive value (PPV), negative predictive value (NPV), sensitivity, and specificity of the developed algorithms were assessed using diagnosis code as the reference standard. Chosen algorithms were then applied to Canadian drug databases to profile the algorithm-identified PsO and PsO-AC cohorts. In the selected database, 183,328 patients were identified for validation. The highest PPVs for PsO (85%) and PsO-AC (65%) occurred when a predictive algorithm of two or more MCs/PVs was compared with the reference standard of one or more diagnosis codes. NPV and specificity were high (99%-100%), whereas sensitivity was low (≤30%). Reducing the number of MCs/PVs or increasing diagnosis claims decreased the algorithms' PPVs. We have developed an MC/PV-based algorithm to identify PsO patients with a high degree of accuracy, but accuracy for PsO-AC requires further investigation. Such methods allow researchers to conduct retrospective studies in databases in which diagnosis codes are absent. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
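    The four accuracy measures used in this validation (PPV, NPV, sensitivity, specificity) are simple ratios over the confusion matrix obtained by cross-tabulating the algorithm's flag against the diagnosis-code reference standard. A minimal sketch, with hypothetical counts rather than the study's actual data:

```python
def algorithm_accuracy(tp, fp, tn, fn):
    """Accuracy of a claims-based case-finding algorithm versus a
    diagnosis-code reference standard, from confusion-matrix counts."""
    return {
        "ppv": tp / (tp + fp),          # flagged patients who truly have the condition
        "npv": tn / (tn + fn),          # unflagged patients who truly do not
        "sensitivity": tp / (tp + fn),  # true cases the algorithm captures
        "specificity": tn / (tn + fp),  # non-cases correctly left unflagged
    }
```

    Note how a strict algorithm can combine a high PPV with low sensitivity, as reported above: requiring two or more MCs/PVs removes false positives but also misses many true cases.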

  11. Validation of intellectual disability coding through hospital morbidity records using an intellectual disability population-based database in Western Australia.

    Science.gov (United States)

    Bourke, Jenny; Wong, Kingsley; Leonard, Helen

    2018-01-23

    To investigate how well intellectual disability (ID) can be ascertained using hospital morbidity data compared with a population-based data source. All children born in 1983-2010 with a hospital admission in the Western Australian Hospital Morbidity Data System (HMDS) were linked with the Western Australian Intellectual Disability Exploring Answers (IDEA) database. The International Classification of Diseases hospital codes consistent with ID were also identified. The characteristics of those children identified with ID through either or both sources were investigated. Of the 488 905 individuals in the study, 10 218 (2.1%) were identified with ID in either IDEA or HMDS with 1435 (14.0%) individuals identified in both databases, 8305 (81.3%) unique to the IDEA database and 478 (4.7%) unique to the HMDS dataset only. Of those unique to the HMDS dataset, about a quarter (n=124) had died before 1 year of age and most of these (75%) before 1 month. Children with ID who were also coded as such in the HMDS data were more likely to be aged under 1 year, female, non-Aboriginal and have a severe level of ID, compared with those not coded in the HMDS data. The sensitivity of using HMDS to identify ID was 14.7%, whereas the specificity was much higher at 99.9%. Hospital morbidity data are not a reliable source for identifying ID within a population, and epidemiological researchers need to take these findings into account in their study design. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  12. WIMS-AECL/RFSP code validation of reactivity calculations following a long shutdown using the simple-cell history-based method

    International Nuclear Information System (INIS)

    Ardeshiri, F.; Donnelly, J.V.; Arsenault, B.

    1998-01-01

    The purpose of this analysis is to validate the Reactor Fuelling Simulation Program (RFSP), using the simple-cell model (SCM) history-based method, in a startup simulation following a reactor shutdown period. This study is part of the validation work for history-based calculations, using the WIMS-AECL code with the ENDF/B-V library and the SCM linked to the RFSP code. In this work, the RFSP code with the SCM history-based method was used to track a 1-year period of the Point Lepreau reactor operating history that included a 12-day reactor shutdown and subsequent startup. Measured boron and gadolinium concentrations were used in the RFSP simulations, and the predicted values of core reactivity were compared to the reference (pre-shutdown) value. The discrepancies in core reactivity are shown to be within ±2 milli-k at all times, and within about ±0.5 milli-k towards the end of the startup transient. The results of this analysis also show that the calculated maximum channel and bundle powers are within an acceptable range during both the core-follow and the reactor startup simulations. (author)
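    The milli-k figures above are thousandths of static reactivity, rho = (k_eff - 1)/k_eff. A minimal sketch of the comparison, using hypothetical k_eff values rather than actual RFSP output:

```python
def reactivity_mk(k_eff):
    """Static reactivity rho = (k_eff - 1)/k_eff, expressed in milli-k
    (1 mk = 0.001)."""
    return 1000.0 * (k_eff - 1.0) / k_eff

def reactivity_discrepancy_mk(k_sim, k_ref):
    """Discrepancy between simulated and reference core reactivity, in milli-k."""
    return reactivity_mk(k_sim) - reactivity_mk(k_ref)
```

    For example, a simulated k_eff of 1.0005 against a critical reference (k_eff = 1.0) is a discrepancy of about +0.5 mk, the level quoted towards the end of the startup transient.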

  13. Validity of vascular trauma codes at major trauma centres.

    Science.gov (United States)

    Altoijry, Abdulmajeed; Al-Omran, Mohammed; Lindsay, Thomas F; Johnston, K Wayne; Melo, Magda; Mamdani, Muhammad

    2013-12-01

    The use of administrative databases in vascular injury research has been increasing, but the validity of the diagnosis codes used in this research is uncertain. We assessed the positive predictive value (PPV) of International Classification of Diseases, tenth revision (ICD-10), vascular injury codes in administrative claims data in Ontario. We conducted a retrospective validation study using the Canadian Institute for Health Information Discharge Abstract Database, an administrative database that records all hospital admissions in Canada. We evaluated 380 randomly selected hospital discharge abstracts from the 2 main trauma centres in Toronto, Ont., St. Michael's Hospital and Sunnybrook Health Sciences Centre, between Apr. 1, 2002, and Mar. 31, 2010. We then compared these records with the corresponding patients' hospital charts to assess the level of agreement for procedure coding. We calculated the PPV and sensitivity to estimate the validity of vascular injury diagnosis coding. The overall PPV for vascular injury coding was estimated to be 95% (95% confidence interval [CI] 92.3-96.8). The PPV among code groups for neck, thorax, abdomen, upper extremity and lower extremity injuries ranged from 90.8% (95% CI 82.2-95.5) to 97.4% (95% CI 91.0-99.3), whereas sensitivity ranged from 90% (95% CI 81.5-94.8) to 98.7% (95% CI 92.9-99.8). Administrative claims hospital discharge data based on ICD-10 diagnosis codes have a high level of validity when identifying cases of vascular injury. Observational Study Level III.

  14. Experimental validation of the HARMONIE code

    International Nuclear Information System (INIS)

    Bernard, A.; Dorsselaere, J.P. van

    1984-01-01

    An experimental program of deformation, in air, of different groups of subassemblies (7 to 41 subassemblies) was performed on a scale-1 mock-up in the SPX1 geometry, in order to achieve a first experimental validation of the HARMONIE code. The agreement between tests and calculations was satisfactory: qualitatively for all the groups, and quantitatively for regular groups of at most 19 subassemblies. The differences arise mainly from friction between pads, and secondarily from the foot gaps. (author)

  15. Validation of the VTT's reactor physics code system

    International Nuclear Information System (INIS)

    Tanskanen, A.

    1998-01-01

    At VTT Energy, several international reactor physics codes and nuclear data libraries are used in a variety of applications. The codes and libraries are under constant development, and new updated versions are released from time to time; these are taken into use as soon as they have been validated at VTT Energy. The primary aim of the validation is to ensure that a code works properly and can be used correctly. Moreover, the applicability of the codes and libraries is studied in order to establish their advantages and weak points. The capability of generating program-specific nuclear data for different reactor physics codes starting from the same evaluated data is sometimes of great benefit. VTT Energy has acquired a nuclear data processing system based on the NJOY-94.105 and TRANSX-2.15 processing codes. The validity of the processing system has been demonstrated by generating pointwise (MCNP) and groupwise (ANISN) temperature-dependent cross-section sets for the benchmark calculations of the Doppler coefficient of reactivity. At VTT Energy, the KENO-VI three-dimensional Monte Carlo code is used in criticality safety analyses. The KENO-VI code and the 44GROUPNDF5 data library have been validated at VTT Energy against the ZR-6 and LR-0 critical experiments. Burnup credit refers to the reduction in reactivity of burned nuclear fuel due to the change in composition during irradiation. VTT Energy has participated in the calculational VVER-440 burnup credit benchmark in order to validate criticality safety calculation tools. (orig.)

  16. Development of boiling transition analysis code TCAPE-INS/B based on mechanistic methods for BWR fuel bundles. Models and validations with boiling transition experimental data

    International Nuclear Information System (INIS)

    Ishida, Naoyuki; Utsuno, Hideaki; Kasahara, Fumio

    2003-01-01

    The Boiling Transition (BT) analysis code TCAPE-INS/B, based on mechanistic methods coupled with subchannel analysis, has been developed for evaluating the integrity of Boiling Water Reactor (BWR) fuel rod bundles under abnormal operations. The objective of the development is to evaluate BT without the empirical BT and rewetting correlations that current analysis methods require for each bundle design. TCAPE-INS/B consists mainly of a drift-flux model, a film flow model, a cross-flow model, a thermal conductivity model, and heat transfer correlations. These models were validated systematically against experimental data, and the accuracy of the predictions of the steady-state Critical Heat Flux (CHF) and of the transient fuel rod surface temperature after the occurrence of BT was evaluated. Calculations of experiments with a single tube and with bundles were carried out to validate the models incorporated in the code. The results showed that the steady-state CHF was predicted within about 6% average error. In the transient calculations, the BT timing and the fuel rod surface temperature agreed well with the experimental results, but rewetting was predicted late, so the modeling of post-BT heat transfer phenomena is being revised. (author)

  17. WSRC approach to validation of criticality safety computer codes

    International Nuclear Information System (INIS)

    Finch, D.R.; Mincey, J.F.

    1991-01-01

    Recent hardware and operating system changes at the Westinghouse Savannah River Site (WSRC) have necessitated a review of the validation of the JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy is illustrated here with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (K_eff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should (1) be repeatable, (2) be demonstrated with defined confidence, and (3) identify the range of neutronic conditions (area of applicability) for which they are valid. The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems arise in validating computational methods when very few experiments are available (such as for enriched uranium systems whose principal second isotope is 236U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments spanning all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed.
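    The correlation of calculated K_eff with known benchmark values is typically summarized as a bias with a stated confidence. A minimal sketch of that bookkeeping, assuming a simple fixed tolerance multiplier; a real validation would take the one-sided tolerance factor from statistical tables for the sample size, and the data below are hypothetical:

```python
import statistics

def keff_validation_summary(calculated, experimental, tol_factor=2.0):
    """Bias (mean calculated-minus-experimental K_eff over a benchmark set),
    its sample standard deviation, and a simple lower acceptance limit.
    tol_factor is an illustrative one-sided tolerance multiplier."""
    diffs = [c - e for c, e in zip(calculated, experimental)]
    bias = statistics.mean(diffs)
    spread = statistics.stdev(diffs)
    # a calculated K_eff kept below this limit retains the stated margin
    lower_limit = 1.0 + bias - tol_factor * spread
    return bias, spread, lower_limit
```

    The distinction drawn in the abstract is then one of coverage: broad validation feeds this summary with experiments spanning the whole area of applicability, narrow validation with a few near worst-case conditions.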

  18. Contributions to the validation of the ASTEC V1 code

    International Nuclear Information System (INIS)

    Constantin, Marin; Rizoiu, Andrei; Turcu, Ilie

    2004-01-01

    In the frame of the PHEBEN2 project (Validation of severe accident codes for application to nuclear power plants, based on the PHEBUS FP experiments), developed within the EU Fifth Framework Programme (FP5), the INR-Pitesti team received the task of determining the sensitivity of the ASTEC code. The PHEBEN2 project was initiated in 1998 and gathered 13 partners from 6 EU member states; 4 partners from 3 candidate states (Hungary, Bulgaria and Romania) joined later. The work was contracted with the European Commission (under contract FIKS-CT1999-00009), which financially supports up to about 50% of the research effort. According to the contract provisions, the INR team participated in developing Working Package 1 (WP1), which concerns validation of the integral computation codes against the PHEBUS experimental data, and Working Package 3 (WP3), which concerns evaluation of the codes to be applied in nuclear power plants for risk evaluation, nuclear safety margin evaluation, and the determination/evaluation of the measures to be adopted in case of severe accident. The present work continues the effort to preliminarily validate the ASTEC code, focusing on stand-alone sensitivity analyses applied to two of the most important modules of the code, namely DIVA and SOPHAEROS.

  19. Validating the BISON fuel performance code to integral LWR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gamble, K.A., E-mail: Kyle.Gamble@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Pastore, G., E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gardner, R.J., E-mail: Russell.Gardner@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Liu, W., E-mail: Wenfeng.Liu@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States); Mai, A., E-mail: Anh.Mai@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States)

    2016-05-15

    Highlights: • The BISON multidimensional fuel performance code is being validated against integral LWR experiments. • Code and solution verification are necessary prerequisites to validation. • Fuel centerline temperature comparisons through all phases of fuel life are very reasonable. • Accuracy in predicting fission gas release is consistent with state-of-the-art modeling and the involved uncertainties. • Rod diameter comparisons are not satisfactory and further investigation is underway. - Abstract: BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Results demonstrate that (1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable, with deviations between predictions and experimental data within ±10% for early life through high-burnup fuel and only slightly outside these bounds for power ramp experiments, (2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties, and (3) rod diameter comparisons are not satisfactory, and further investigation is underway.
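    The ±10% acceptance band quoted for centerline temperatures can be checked mechanically once predictions and measurements are paired up. A minimal sketch, with hypothetical temperatures rather than the actual BISON assessment data:

```python
def fraction_within_band(predicted, measured, rel_tol=0.10):
    """Fraction of code predictions whose relative deviation from the
    measured value is within rel_tol (e.g. a +/-10% band for fuel
    centerline temperature)."""
    hits = sum(abs(p - m) / abs(m) <= rel_tol
               for p, m in zip(predicted, measured))
    return hits / len(measured)
```

    Consolidating such fractions over all 35 rods, per phase of fuel life, gives the kind of overall validation picture the abstract describes.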

  20. System verification and validation report for the TMAD code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the Verification and Validation Report for the TMAD code system, which includes the TMAD code and the LIBMAKR code. The TMAD code was commissioned to facilitate the interpretation of moisture probe measurements in the Hanford Site waste tanks. In principle, the code is an interpolation routine that acts over a library of benchmark data based on two independent variables, typically anomaly size and moisture content. Two additional variables, anomaly type and detector type, can also be considered independent variables, but no interpolation is done over them. The dependent variable is detector response. The intent is to provide the code with measured detector responses from two or more detectors. The code will then interrogate (and interpolate upon) the benchmark data library and find the anomaly-type/anomaly-size/moisture-content combination that provides the closest match to the measured data. The primary purpose of this document is to provide the results of the system testing and the conclusions based thereon. The results of the testing process are documented in the body of the report. Appendix A gives the test plan, including the test procedures used in conducting the tests. Appendix B lists the input data required to conduct the tests, and Appendices C and D list the numerical results of the tests.
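    At its core, the search described above is a nearest-match lookup over a benchmark library keyed by anomaly type, size, and moisture content. A minimal sketch with a hypothetical two-detector library; the entries and names are invented for illustration, and TMAD additionally interpolates between library points rather than only picking the nearest one:

```python
# Hypothetical benchmark library:
# (anomaly_type, size_cm, moisture_frac) -> benchmark response per detector
LIBRARY = {
    ("void", 5.0, 0.05): (120.0, 80.0),
    ("void", 5.0, 0.10): (150.0, 95.0),
    ("void", 10.0, 0.05): (135.0, 88.0),
    ("dense", 5.0, 0.05): (110.0, 70.0),
}

def closest_match(measured):
    """Return the library combination whose benchmark detector responses
    are nearest (least squares) to the measured responses."""
    def dist(responses):
        return sum((r - m) ** 2 for r, m in zip(responses, measured))
    return min(LIBRARY, key=lambda key: dist(LIBRARY[key]))
```

    With measured responses from two or more detectors, the least-squares distance disambiguates combinations that a single detector could not.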

  1. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, an additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analysis of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities, and a meso-scale heat transfer model is then applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of the multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The data measured in the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as for transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and numerical benchmarks of selected conceptual problems. The results show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data such as control rod withdrawal tests and loss-of-forced-convection tests.

  2. Validation of Dose Calculation Codes for Clearance

    International Nuclear Information System (INIS)

    Menon, S.; Wirendal, B.; Bjerler, J.; Studsvik; Teunckens, L.

    2003-01-01

    Various international and national bodies, such as the International Atomic Energy Agency, the European Commission, and the US Nuclear Regulatory Commission, have put forward proposals or guidance documents to regulate the 'clearance' from regulatory control of very low level radioactive material, in order to allow its recycling as a material management practice. All these proposals are based on predicted scenarios for subsequent utilization of the released materials. The calculation models used in these scenarios tend to utilize conservative data regarding exposure times and dose uptake, as well as other assumptions, as a safeguard against uncertainties. None of these models has ever been validated by comparison with the actual real-life practice of recycling. An international project was organized in order to validate some of the assumptions made in these calculation models and, thereby, better assess the radiological consequences of recycling on a practical large scale.

  3. The WINCON programme - validation of fast reactor primary containment codes

    International Nuclear Information System (INIS)

    Sidoli, J.E.A.; Kendall, K.C.

    1988-01-01

    In the United Kingdom, safety studies for the Commercial Demonstration Fast Reactor (CDFR) include an assessment of the capability of the primary containment to provide adequate defence against the hazards resulting from a hypothetical Whole Core Accident (WCA). The assessment is based on calculational estimates using computer codes, supported by measured evidence from small-scale experiments. The hydrodynamic containment code SEURBNUK-EURDYN is capable of representing a prescribed energy release, the sodium coolant and cover gas, and the main containment and safety-related internal structures. Containment loadings estimated using SEURBNUK-EURDYN are used in the structural dynamics code EURDYN-03 for the prediction of the containment response. The experiments serve two purposes: they demonstrate the response of the CDFR containment to accident loadings, and they provide data for the validation of the codes. This paper summarises the recently completed WINfrith CONtainment (WINCON) experiments, which studied the response of specific features of current CDFR design options to WCA loadings. The codes have been applied to some of the experiments, and a satisfactory prediction of the global response of the model containment is obtained. This provides confidence in the use of the codes in reactor assessments. (author)

  4. Modelling uranium solubilities in aqueous solutions: Validation of a thermodynamic data base for the EQ3/6 geochemical codes

    International Nuclear Information System (INIS)

    Puigdomenech, I.; Bruno, J.

    1988-01-01

    Experimental solubilities of U⁴⁺ and UO₂²⁺ reported in the literature have been collected. Data on oxides, hydroxides and carbonates have been selected for this work, including results both at 25 degrees C and at higher temperatures. The literature data have been compared with calculated uranium solubilities obtained with the EQ3/6 geochemical modelling programs and a uranium thermodynamic data base selected for the Swedish nuclear waste management program. This verification/validation exercise has shown that more experimental data are needed to determine the chemical composition of anionic uranyl hydroxo complexes as well as their equilibrium constants of formation. There is also a need for more solubility data on well characterised alkaline or alkaline-earth uranates. For the uranyl carbonate system, the calculated results agree reasonably well with the experimental literature values, which span a wide range of pH, total carbonate concentration, CO₂(g) pressure, and temperature. The experimental solubility of UO₂(s) also agrees well with the EQ3/6 calculations for pH greater than 6. However, in more acidic solutions the experimental solubilities are higher than the calculated values. This is due to the formation of polynuclear hydroxo complexes of uranium, which are not well characterised and are not included in the thermodynamic data base used in this study. (authors)

  5. Validation and verification plan for safety and PRA codes

    International Nuclear Information System (INIS)

    Ades, M.J.; Crowe, R.D.; Toffer, H.

    1991-04-01

    This report discusses a verification and validation (V&V) plan for computer codes used for safety analysis and probabilistic risk assessment calculations. The plan fulfills the commitments by Westinghouse Savannah River Company (WSRC) to the Department of Energy Savannah River Office (DOE-SRO) to bring the essential safety analysis and probabilistic risk assessment codes into compliance with verification and validation requirements.

  6. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability based code calibration. First, basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values. Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore, suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally, the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes.
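The FORM relationships described above can be illustrated with a minimal sketch. Assuming a linear limit state g = R - S with independent normal resistance R and load S (the numbers below are illustrative, not values from the paper), the reliability index, failure probability, sensitivity factor, and a partial safety factor on resistance follow directly:

```python
from statistics import NormalDist
import math

# Hypothetical linear limit state g = R - S with independent normal
# resistance R and load S (illustrative values, not from the paper).
mu_R, sd_R = 10.0, 1.0
mu_S, sd_S = 5.0, 1.0

# For a linear limit state with normal variables, FORM is exact:
# reliability index beta and failure probability Pf = Phi(-beta).
beta = (mu_R - mu_S) / math.hypot(sd_R, sd_S)
pf = NormalDist().cdf(-beta)

# FORM sensitivity factor for R and the design-point value of R.
alpha_R = sd_R / math.hypot(sd_R, sd_S)
r_design = mu_R - alpha_R * beta * sd_R

# Characteristic value taken as the 5% fractile of R; the partial
# safety factor links characteristic and design values.
r_char = mu_R + NormalDist().inv_cdf(0.05) * sd_R
gamma_R = r_char / r_design

print(f"beta={beta:.3f}  Pf={pf:.2e}  gamma_R={gamma_R:.3f}")
```

In a real calibration problem this computation is repeated over many design situations and the partial safety factors are optimized against the target reliabilities, which is what the CodeCal procedure described in the paper automates.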

  7. Gamma streaming experiments for validation of Monte Carlo code

    International Nuclear Information System (INIS)

    Thilagam, L.; Mohapatra, D.K.; Subbaiah, K.V.; Iliyas Lone, M.; Balasubramaniyan, V.

    2012-01-01

    Inhomogeneities in shield structures lead to a considerable amount of leakage radiation (streaming), increasing the radiation levels in accessible areas. Development work on experimental as well as computational methods for quantifying this streaming radiation is continuing. The Monte Carlo based radiation transport code MCNP is the usual tool for modeling and analyzing such problems involving complex geometries. In order to validate this computational method for streaming analysis, it is necessary to carry out experimental measurements simulating inhomogeneities such as the ducts and voids present in bulk shields for typical cases. The data thus generated are analysed by simulating the experimental set-up with the MCNP code, and optimized input parameters for the code for solving similar radiation streaming problems are formulated. Comparison of experimental data obtained from radiation streaming experiments through ducts gives a set of thumb rules and analytical fits for total radiation dose rates within and outside the duct. The present study highlights the validation of the MCNP code through gamma streaming experiments carried out with ducts of various shapes and dimensions. Overall, the present study demonstrates the suitability of the MCNP code for the analysis of gamma radiation streaming problems for all duct configurations considered. In the present study, only dose rate comparisons have been made; studies on spectral comparison of the streaming radiation are in progress. It is also planned to repeat the experiments with various shield materials. Since penetrations and ducts through bulk shields are unavoidable in an operating nuclear facility, the results of this kind of radiation streaming simulation and experiment will be very useful for shield structure optimization without compromising radiation safety.

  8. Gap Conductance model Validation in the TASS/SMR-S code using MARS code

    International Nuclear Information System (INIS)

    Ahn, Sang Jun; Yang, Soo Hyung; Chung, Young Jong; Lee, Won Jae

    2010-01-01

    Korea Atomic Energy Research Institute (KAERI) has been developing the TASS/SMR-S (Transient and Setpoint Simulation/Small and Medium Reactor) code, a thermal hydraulic code for the safety analysis of the advanced integral reactor. Appropriate work to validate the applicability of the thermal hydraulic models within the code is required. Among these models, the gap conductance model, which describes the thermal conductance of the gap between fuel and cladding, was validated through comparison with the MARS code. The validation of the gap conductance model was performed by evaluating the variation of the gap temperature and gap width as they changed with the power fraction. In this paper, a brief description of the gap conductance model in the TASS/SMR-S code is presented. In addition, the results calculated to validate the gap conductance model are demonstrated by comparison with the results of the MARS code for the test case.

  9. Validation and applicability of the 3D core kinetics and thermal hydraulics coupled code SPARKLE

    International Nuclear Information System (INIS)

    Miyata, Manabu; Maruyama, Manabu; Ogawa, Junto; Otake, Yukihiko; Miyake, Shuhei; Tabuse, Shigehiko; Tanaka, Hirohisa

    2009-01-01

    The SPARKLE code is a coupled code system based on three individual codes whose physical models have already been verified and validated. Mitsubishi Heavy Industries (MHI) confirmed the coupling calculation, including data transfer and the total reactor coolant system (RCS) behavior of the SPARKLE code. The confirmation uses the OECD/NEA MSLB benchmark problem, which is based on Three Mile Island Unit 1 (TMI-1) nuclear power plant data. This benchmark problem has been used to verify coupled codes developed and used by many organizations. Objectives of the benchmark program are as follows. Phase 1 is to compare the results of the system transient code using point kinetics. Phase 2 is to compare the results of the coupled three-dimensional (3D) core kinetics code and 3D core thermal-hydraulics (T/H) code, and Phase 3 is to compare the results of the combined coupled system transient code, 3D core kinetics code, and 3D core T/H code as a total validation of the coupled calculation. The calculation results of the SPARKLE code indicate good agreement with other benchmark participants' results. Therefore, the SPARKLE code is validated through these benchmark problems. In anticipation of applying the SPARKLE code to licensing analyses, MHI and Japanese PWR utilities have established a safety analysis method regarding the calculation conditions such as power distributions, reactivity coefficients, and event-specific features. (author)

  10. Development and validation of sodium fire analysis code ASSCOPS

    International Nuclear Information System (INIS)

    Ohno, Shuji

    2001-01-01

    Version 2.1 of the ASSCOPS sodium fire analysis code was developed to evaluate the thermal consequences of a sodium leak and consequent fire in LMFBRs. This report describes the computational models and the validation studies using the code. ASSCOPS calculates sodium droplet and pool fires and the consequent heat and mass transfer behavior. Analyses of sodium pool and spray fire experiments confirmed that the code and the parameters used in the validation studies give valid results on the thermal consequences of sodium leaks and fires. (author)

  11. System Based Code: Principal Concept

    International Nuclear Information System (INIS)

    Yasuhide Asada; Masanori Tashimo; Masahiro Ueta

    2002-01-01

    This paper introduces the concept of the 'System Based Code', initially proposed by the authors with the intention of giving the nuclear industry a leap of progress in system reliability, performance improvement, and cost reduction. The System Based Code concept provides a theoretical procedure to optimize the reliability of the system by administering every related engineering requirement throughout the life of the system, from design to decommissioning. (authors)

  12. Simulation codes and the impact of validation/uncertainty requirements

    International Nuclear Information System (INIS)

    Sills, H.E.

    1995-01-01

    Several of the OECD/CSNI members have adopted a proposed methodology for code validation and uncertainty assessment. Although the validation process adopted by members has a high degree of commonality, the uncertainty assessment processes selected are more variable, ranging from subjective to formal. This paper describes the validation and uncertainty assessment process, the sources of uncertainty, methods of reducing uncertainty, and methods of assessing uncertainty. Examples are presented from the Ontario Hydro application of the validation methodology and uncertainty assessment to the system thermal hydraulics discipline and the TUF (1) system thermal hydraulics code. (author)

  13. Computer code validation by high temperature chemistry

    International Nuclear Information System (INIS)

    Alexander, C.A.; Ogden, J.S.

    1988-01-01

    At least five of the computer codes utilized in analysis of severe fuel damage-type events are directly dependent upon or can be verified by high temperature chemistry. These codes are ORIGEN, CORSOR, CORCON, VICTORIA, and VANESA. With the exception of CORCON and VANESA, it is necessary that verification experiments be performed on real irradiated fuel. For ORIGEN, the familiar Knudsen effusion cell is the best choice: a small piece of known mass and known burn-up is selected and volatilized completely into the mass spectrometer. The mass spectrometer is used in the integral mode to integrate the entire signal from preselected radionuclides, and from this integrated signal the total mass of the respective nuclides can be determined. For CORSOR and VICTORIA, experiments are required in which flowing high pressure hydrogen/steam passes over the irradiated fuel and then enters the mass spectrometer. For these experiments, a high pressure, high temperature molecular beam inlet must be employed. Finally, in support of VANESA and CORCON, the very highest temperature molten fuels must be contained and analyzed. Results from all types of experiments will be discussed, and their applicability to present and future code development will also be covered.

  14. Verification and validation of XSDRNPM code for tank waste calculations

    International Nuclear Information System (INIS)

    ROGERS, C.A.

    1999-01-01

    This validation study demonstrates that the XSDRNPM computer code accurately calculates the infinite neutron multiplication for water-moderated systems of low enriched uranium, plutonium, and iron. Calculations are made on a 200 MHz Brvo MS 5200M personal computer.

  15. Validation uncertainty of MATRA code for subchannel void distributions

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae-Hyun; Kim, S. J.; Kwon, H.; Seo, K. W. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    To extend the code's capability to whole-core subchannel analysis, pre-conditioned Krylov matrix solvers such as BiCGSTAB and GMRES have been implemented in the MATRA code, together with parallel computing algorithms using MPI and OpenMP. The code is written in Fortran 90 and has some user-friendly features such as a graphical user interface. MATRA was approved by the Korean regulatory body for design calculations of the integral-type PWR named SMART. The major role of a subchannel code is to evaluate the core thermal margin through hot channel analysis and uncertainty evaluation for CHF predictions. In addition, it is potentially used for best estimation of the core thermal hydraulic field by incorporation into multi-physics and/or multi-scale code systems. In this study we examined a validation process for the subchannel code MATRA, specifically in the prediction of subchannel void distributions. The primary objective of validation is to estimate a range within which the simulation modeling error lies. The experimental data for subchannel void distributions at steady state and transient conditions were provided in the framework of the OECD/NEA UAM benchmark program. The validation uncertainty of the MATRA code was evaluated for a specific experimental condition by comparing the simulation result and the experimental data. A validation process should be preceded by code and solution verification; however, quantification of verification uncertainty was not addressed in this study. The validation uncertainty of the MATRA code for predicting subchannel void distribution was evaluated for a single data point of void fraction measurement in a 5x5 PWR test bundle in the framework of the OECD UAM benchmark program. The validation standard uncertainties were evaluated as 4.2%, 3.9%, and 2.8% with the Monte-Carlo approach at the axial levels of 2216 mm, 2669 mm, and 3177 mm, respectively. The sensitivity coefficient approach gave similar uncertainties but did not account for the nonlinear effects.
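The Monte-Carlo evaluation of a validation standard uncertainty can be sketched generically. The example below is not the MATRA procedure itself: it propagates assumed input uncertainties through a hypothetical void-fraction surrogate model and combines the result with an assumed measurement uncertainty; every number and the surrogate model are illustrative assumptions.

```python
import random
import math

random.seed(1)

def model_void_fraction(power, flow):
    """Toy surrogate for a subchannel-code void-fraction prediction
    (illustrative only, not the MATRA models)."""
    return max(0.0, min(1.0, 0.05 + 0.4 * power / flow))

# Nominal inputs with assumed relative standard uncertainties.
power, u_power = 1.0, 0.03      # normalized bundle power, 3%
flow,  u_flow  = 1.0, 0.02      # normalized mass flow, 2%

# Monte-Carlo propagation of input uncertainty through the model.
n = 100_000
samples = [model_void_fraction(random.gauss(power, u_power * power),
                               random.gauss(flow, u_flow * flow))
           for _ in range(n)]
mean_s = sum(samples) / n
u_input = math.sqrt(sum((s - mean_s) ** 2 for s in samples) / (n - 1))

# Combine with the (assumed) standard uncertainty of the void measurement.
u_data = 0.02
u_val = math.sqrt(u_input ** 2 + u_data ** 2)

print(f"u_input={u_input:.4f}  u_val={u_val:.4f}")
```

The combination u_val = sqrt(u_input² + u_data²) follows the usual validation-uncertainty bookkeeping (a numerical-solution term would be added as well, which the abstract notes was not quantified in this study).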

  16. Improvements, verifications and validations of the BOW code

    International Nuclear Information System (INIS)

    Yu, S.D.; Tayal, M.; Singh, P.N.

    1995-01-01

    The BOW code calculates the lateral deflections of a fuel element, consisting of sheath and pellets, due to temperature gradients, hydraulic drag and gravity. The fuel element is subjected to restraint from endplates, neighbouring fuel elements and the pressure tube. Many new features have been added to the BOW code since its original release in 1985. This paper outlines the major improvements made to the code and the verification/validation results. (author)

  17. Guide to Using the WIND Toolkit Validation Code

    Energy Technology Data Exchange (ETDEWEB)

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

    In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density at more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at 2-km and 5-minute resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, so that corrective steps can be taken. Therefore, we provide validation code, written in R and to be made public, that gives users tools to validate data for their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well the observational data compare to the model data. Ideally, validation will confirm beneficial locations to utilize wind energy and encourage regional wind integration studies using the WIND Toolkit.
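The error metrics listed above are standard, and a compact sketch (in Python rather than the toolkit's R, with made-up wind-speed values) shows how they are computed, including the decomposition RMSE² = bias² + CRMSE²:

```python
import math

def validation_metrics(model, obs):
    """Bias, RMSE, centered RMSE, MAE, and mean percent error for
    paired model/observation series (definitions as commonly used
    in wind-data validation)."""
    n = len(model)
    mm = sum(model) / n
    mo = sum(obs) / n
    bias = mm - mo
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    # Centered RMSE removes the bias component: rmse^2 = bias^2 + crmse^2.
    crmse = math.sqrt(sum(((m - mm) - (o - mo)) ** 2
                          for m, o in zip(model, obs)) / n)
    mae = sum(abs(m - o) for m, o in zip(model, obs)) / n
    pct = 100.0 * sum((m - o) / o for m, o in zip(model, obs)) / n
    return {"bias": bias, "rmse": rmse, "crmse": crmse,
            "mae": mae, "percent_error": pct}

# Hypothetical hourly wind speeds (m/s): model output vs. observations.
model = [5.1, 6.3, 7.8, 4.9, 6.0]
obs   = [5.0, 6.0, 8.0, 5.2, 5.8]

m = validation_metrics(model, obs)
print(m)
```

On real data the same functions would be applied per site and per season before drawing the diurnal-cycle and quantile-quantile plots the abstract mentions.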

  18. A rational method to evaluate tornado-borne missile speed in nuclear power plants. Validation of a numerical code based on Fujita's tornado model

    International Nuclear Information System (INIS)

    Eguchi, Yuzuru; Sugimoto, Soichiro; Hattori, Yasuo; Hirakuchi, Hiromaru

    2015-01-01

    This paper explains a rational method to evaluate the tornado-borne missile speed, flight distance and flight height to be used in the safety design of a nuclear power plant. In the method, the authors employed Fujita's DBT-77 model as the tornado wind model, to take the near-ground tornado wind profile into account. A liftoff model of an object on the ground was developed by conservatively modeling the lift force due to the ground effect. The wind field model and the liftoff model have been compiled, together with a conventional flight model, into a computer code named TONBOS. In this study the code is verified for one- and two-dimensional free-fall problems as well as for a case of the 1957 Dallas tornado wind field model, whose solutions are known theoretically or numerically. Finally, the code is validated against typical car behaviors characterized by tornado wind speeds of the enhanced Fujita scale, as well as against an actual event in which a truck was blown away by a tornado that struck part of the town of Saroma, Hokkaido in November 2006. (author)

  19. Results from the First Validation Phase of CAP code

    International Nuclear Information System (INIS)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2010-01-01

    The second stage of Safety Analysis Code Development for Nuclear Power Plants was launched in April 2010 and is scheduled to run through 2012; its scope of work covers code validation through licensing preparation. As a part of this project, CAP (Containment Analysis Package) will follow the same procedures. CAP's validation works are organized hierarchically into four validation steps using: 1) fundamental phenomena; 2) principal phenomena (mixing and transport) and components in containment; 3) demonstration tests in small, middle, and large facilities and International Standard Problems; and 4) comparison with other containment codes such as GOTHIC or CONTEMPT. In addition, collecting the experimental data related to containment phenomena and constructing the corresponding database is one of the major works during the second stage of this project. From the validation process for fundamental phenomena, the current capability and the future improvements of the CAP code can be expected to emerge. For this purpose, simple but significant problems, which have exact analytical solutions, were selected and calculated for validation of fundamental phenomena. In this paper, some results of validation problems for the selected fundamental phenomena are summarized and discussed briefly.

  20. Development and validation of a nodal code for core calculation

    International Nuclear Information System (INIS)

    Nowakowski, Pedro Mariano

    2004-01-01

    The code RHENO solves the multigroup three-dimensional diffusion equation using a nodal method of polynomial expansion. A comparative study has been made between this code and present international nodal diffusion codes, showing that RHENO is up to date. RHENO has been integrated into a calculation line and has been extended to make burnup calculations. Two methods for pin power reconstruction were developed: modulation and imbedded. The modulation method has been implemented in a program, while the implementation of the imbedded method will be concluded shortly. The validation carried out (which includes experimental data from a MPR) shows very good results and calculation efficiency.

  1. Verification and Validation of the Tritium Transport Code TMAP7

    International Nuclear Information System (INIS)

    Longhurst, Glen R.; Ambrosek, James

    2005-01-01

    The TMAP code has been upgraded to version 7, which includes radioactive decay along with many features implemented in prior versions. Pursuant to acceptance and release for distribution, the code was exercised in a variety of problem types to demonstrate that it provides results in agreement with theoretical results for cases where those are available. It has also been used to model certain experimental results. In this paper, the capabilities of the TMAP7 code are demonstrated by presenting some of the results from the verification and validation process

  2. Validation of the code ETOBOX/BOXER for UO2 LWR lattices based on the experiments TRX, BAPL-UO2 and other critical experiments

    International Nuclear Information System (INIS)

    Paratte, J.M.

    1985-07-01

    The EIR code system for LWR arrays is based on cross sections extracted from ENDF/B-4 and ENDF/B-5 by the code ETOBOX. The calculation method for the arrays (code BOXER) as well as the cross sections were applied to the CSEWG benchmark experiments TRX-1 to 4 and BAPL-UO₂-1 to 3. The results are compared to the measured values and to calculations by other institutions. This demonstrates that the deviations of the parameters calculated by BOXER are typical for the cross sections used. A large number of critical experiments were calculated using the measured material bucklings in order to bring to light possible trends in the calculation of the multiplication factor k_eff. It first emerged that the error bounds of B_m² evaluated in the measurements are often optimistic. Two-dimensional calculations improved the results of the cell calculations. With a mean scatter of 4 to 5 mk in the normal arrays, the multiplication factors calculated by BOXER are satisfactory. However, one has to take into account a slight trend of k_eff to grow with the moderator-to-fuel ratio and the enrichment. (author)

  3. Optimal, Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2002-01-01

    Reliability based code calibration is considered in this paper. It is described how the results of FORM based reliability analysis may be related to the partial safety factors and characteristic values. The code calibration problem is presented in a decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Finally, the practical implementation of reliability based code calibration of LRFD based design codes is described.

  4. Evaluation and validation of criticality codes for fuel dissolver calculations

    International Nuclear Information System (INIS)

    Santamarina, A.; Smith, H.J.; Whitesides, G.E.

    1991-01-01

    During the past ten years an OECD/NEA Criticality Working Group has examined the validity of criticality safety computational methods. International calculation tools which were shown to be valid in systems for which experimental data existed were demonstrated to be inadequate when extrapolated to fuel dissolver media. The spread of the results in the international calculations amounted to ± 12,000 pcm in the realistic fuel dissolver exercise no. 19 proposed by BNFL, and to ± 25,000 pcm in benchmark no. 20, in which fissile material in solid form is surrounded by fissile material in solution. A theoretical study of the main physical parameters involved in fuel dissolution calculations was performed, i.e. range of moderation, variation of pellet size and the fuel double heterogeneity effect. The APOLLO/PIC method, developed to treat this latter effect, permits us to supply the actual reactivity variation with pellet dissolution and to propose international reference values. The disagreement among contributors' calculations was analyzed through a neutron balance breakdown, based on three-group microscopic reaction rates solicited from the participants. The results pointed out that fast and resonance nuclear data in criticality codes are not sufficiently reliable. Moreover, the neutron balance analysis emphasized the inadequacy of the standard self-shielding formalism (NITAWL in the international SCALE package) to account for ²³⁸U resonance mutual self-shielding in the pellet-fissile liquor interaction. Improvements in the updated 1990 contributions, as well as recent complementary reference calculations (MCNP, VIM, ultrafine slowing-down CGM calculation), confirm the need to use rigorous self-shielding methods in criticality design-oriented codes. 6 refs., 11 figs., 3 tabs

  5. Evaluation and validation of criticality codes for fuel dissolver calculations

    International Nuclear Information System (INIS)

    Santamarina, A.; Smith, H.J.; Whitesides, G.E.

    1991-01-01

    During the past ten years an OECD/NEA Criticality Working Group has examined the validity of criticality safety computational methods. International calculation tools which were shown to be valid in systems for which experimental data existed were demonstrated to be inadequate when extrapolated to fuel dissolver media. A theoretical study of the main physical parameters involved in fuel dissolution calculations was performed, i.e. range of moderation, variation of pellet size and the fuel double heterogeneity effect. The APOLLO/PIC method developed to treat this latter effect permits us to supply the actual reactivity variation with pellet dissolution and to propose international reference values. The disagreement among contributors' calculations was analyzed through a neutron balance breakdown, based on three-group microscopic reaction rates. The results pointed out that fast and resonance nuclear data in criticality codes are not sufficiently reliable. Moreover, the neutron balance analysis emphasized the inadequacy of the standard self-shielding formalism to account for ²³⁸U resonance mutual self-shielding in the pellet-fissile liquor interaction. The benchmark exercise has resolved a potentially dangerous inadequacy in dissolver calculations. (author)

  6. Lawrence Livermore National Laboratory Probabilistic Seismic Hazard Codes Validation

    International Nuclear Information System (INIS)

    Savy, J B

    2003-01-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. LLNL has been developing the methodology and codes in support of the Nuclear Regulatory Commission (NRC) needs for reviews of site licensing of nuclear power plants since 1978. A number of existing computer codes have been validated, yet they can still lead to differing ranges of hazard estimates in some cases. Until now, the seismic hazard community had not agreed on any specific method for evaluation of these codes. The Earthquake Engineering Research Institute (EERI) and the Pacific Earthquake Engineering Research (PEER) center organized an exercise in testing of existing codes with the aim of developing a series of standard tests that future developers could use to evaluate and calibrate their own codes. Seven code developers participated in the exercise on a voluntary basis. Lawrence Livermore National Laboratory participated with some support from the NRC. The final product of the study will include a series of criteria for judging the validity of the results provided by a computer code. This EERI/PEER project was first planned to be completed by June 2003. As the group neared completion of the tests, the managing team decided that new tests were necessary. As a result, the present report documents only the work performed to this point. It demonstrates that the computer codes developed by LLNL perform all calculations correctly and as intended. Differences exist between the results of the codes tested, attributed to a series of assumptions on the parameters and models that the developers had to make. The managing team is planning a new series of tests to help in reaching a consensus on these assumptions.
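The exceedance computation at the heart of PSHA can be sketched generically. In the toy example below, the source rate, Gutenberg-Richter magnitude distribution, attenuation law and its scatter are all assumed for illustration (they are not the LLNL models under test); it builds a hazard curve for a single point source and converts annual exceedance rates to 50-year exceedance probabilities:

```python
import math
from statistics import NormalDist

# Assumed single point source at a fixed distance (illustrative values).
nu = 0.05                 # annual rate of events with M >= m_min
m_min, m_max, b = 5.0, 7.5, 1.0
r_km = 20.0               # source-to-site distance
sigma_ln = 0.6            # log-normal scatter of the attenuation law

def median_pga_g(m, r):
    """Assumed attenuation relation: ln PGA = -3.5 + 0.8*M - 1.1*ln(r)."""
    return math.exp(-3.5 + 0.8 * m - 1.1 * math.log(r))

def annual_exceedance_rate(a):
    """lambda(PGA > a): integrate over the truncated G-R magnitude pdf."""
    beta = b * math.log(10.0)
    norm = 1.0 - math.exp(-beta * (m_max - m_min))
    steps, total = 200, 0.0
    dm = (m_max - m_min) / steps
    nd = NormalDist()
    for i in range(steps):
        m = m_min + (i + 0.5) * dm
        pdf = beta * math.exp(-beta * (m - m_min)) / norm
        z = (math.log(a) - math.log(median_pga_g(m, r_km))) / sigma_ln
        total += pdf * (1.0 - nd.cdf(z)) * dm
    return nu * total

for a in (0.05, 0.1, 0.2, 0.4):
    lam = annual_exceedance_rate(a)
    p50 = 1.0 - math.exp(-lam * 50.0)   # Poisson exceedance in 50 years
    print(f"PGA > {a} g : lambda={lam:.2e}/yr  P(50 yr)={p50:.3f}")
```

A full PSHA code sums such integrals over many sources and distance distributions; the standard tests discussed in the report compare exactly these hazard-curve outputs between implementations.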

  7. The Mistra experiment for field containment code validation first results

    International Nuclear Information System (INIS)

    Caron-Charles, M.; Blumenfeld, L.

    2001-01-01

    The MISTRA facility is a large-scale experiment designed for the validation of multi-dimensional thermal-hydraulics codes. A short description of the facility, the set-up of the instrumentation and the test program is presented. Then the first experimental results, studying helium injection into the containment, and their calculations are detailed. (author)

  8. Experimental validation of the containment codes ASTARTE and SEURBNUK

    International Nuclear Information System (INIS)

    Kendall, K.C.; Arnold, L.A.; Broadhouse, B.J.; Jones, A.; Yerkess, A.; Benuzzi, A.

    1979-10-01

    The fast reactor containment codes ASTARTE and SEURBNUK are being validated against data from the COVA series of small scale experiments being performed jointly by the UKAEA and JRC Ispra. The experimental programme is nearly complete, and data are given. (U.K.)

  9. Validation of the TIARA code to tritium inventory data

    International Nuclear Information System (INIS)

    Billone, M.C.

    1994-03-01

    The TIARA code has been developed to predict tritium inventory in Li₂O breeder ceramic and to predict purge exit flow rate and composition. Inventory predictions are based on models for bulk diffusion, surface desorption, solubility and precipitation. Parameters for these models are determined from the results of laboratory annealing studies on unirradiated and irradiated Li₂O. Inventory data from in-reactor purge flow tests are used for model improvement, fine-tuning of model parameters and validation. In the current work, the inventory measurement near the purge inlet from the BEATRIX-II thin-ring sample is used to fine-tune the surface desorption model parameters for T > 470 degrees C, and the inventory measurement near the midplane from VOM-15H is used to fine-tune the moisture solubility model parameters. Predictions are then validated against the remaining inventory data from EXOTIC-2 (1 point), SIBELIUS (3 axial points), VOM-15H (2 axial points), CRITIC-1 (4 axial points), BEATRIX-II thin ring (3 axial points) and BEATRIX-II thick pellet (5 radial points). Thus, of the 20 data points, two were used for fine-tuning model parameters and 18 were used for validation. The inventory data span the range 0.05-1.44 wppm with an average of 0.48 wppm. The data pertain to samples whose end-of-life temperatures were in the range 490-1000 degrees C. On average, the TIARA predictions agree quite well with the data (< 0.02 wppm difference). However, the root-mean-square deviation is 0.44 wppm, mostly due to over-predictions for the SIBELIUS samples and the higher-temperature radial samples from the BEATRIX-II thick pellet.

  10. Validation of Magnetic Reconstruction Codes for Real-Time Applications

    International Nuclear Information System (INIS)

    Mazon, D.; Murari, A.; Boulbe, C.; Faugeras, B.; Blum, J.; Svensson, J.; Quilichini, T.; Gelfusa, M.

    2010-01-01

    The real-time reconstruction of the plasma magnetic equilibrium in a tokamak is a key point for access to high-performance regimes. Indeed, the shape of the plasma current density profile is a direct output of the reconstruction and has a leading effect on reaching a steady-state high-performance regime of operation. The challenge is thus to develop real-time methods and algorithms that reconstruct the magnetic equilibrium with the perspective of using these outputs for feedback control purposes. In this paper the validation of the JET real-time equilibrium reconstruction codes, using both a Bayesian approach and a full equilibrium solver named Equinox, is detailed; the comparison is performed with the off-line equilibrium code EFIT (equilibrium fitting) and the real-time boundary reconstruction code XLOC (X-point local expansion). In this way a significant database, a methodology, and a strategy for the validation are presented. The validation of the results has been performed using a validated database of 130 JET discharges with a large variety of magnetic configurations. Internal measurements such as polarimetry and motional Stark effect have also been used for the Equinox validation, including some magnetohydrodynamic signatures for the assessment of the reconstructed safety factor profile and current density. (authors)

  11. Validation of OPERA3D PCMI Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Jeun, Ji Hoon; Choi, Jae Myung; Yoo, Jong Sung [KEPCO Nuclear Fuel Co., Daejeon (Korea, Republic of); Cheng, G.; Sim, K. S.; Chassie, Girma [Candu Energy INC.,Ontario (Canada)

    2013-10-15

    This report describes the validation of the OPERA3D code, with validation results that are directly related to PCMI phenomena. OPERA3D was developed for PCMI analysis and validated using in-pile measurement data. The fuel centerline temperature and clad strain calculation results show close agreement with the measurement data. Moreover, the 3D FEM fuel model of OPERA3D shows slight hourglassing behavior of the fuel pellet in the contact case. Further optimization will be conducted for future application of the OPERA3D code. A nuclear power plant consists of many complicated systems, and an important objective of all the systems is maintaining nuclear fuel integrity. However, it is inevitable to experience PCMI (Pellet Cladding Mechanical Interaction) phenomena at currently operating reactors and at next-generation reactors designed for advanced safety and economics as well. To evaluate PCMI behavior, many studies are ongoing to develop 3-dimensional fuel performance evaluation codes. Moreover, these codes are essential to set the safety limits for the best-estimate PCMI phenomena for high-burnup fuel.

  12. Validation of containment thermal hydraulic computer codes for VVER reactor

    Energy Technology Data Exchange (ETDEWEB)

    Jiri Macek; Lubomir Denk [Nuclear Research Institute Rez plc Thermal-Hydraulic Analyses Department CZ 250 68 Husinec-Rez (Czech Republic)

    2005-07-01

    Full text of publication follows: The Czech Republic operates 4 VVER-440 units; two VVER-1000 units are being finalized (one of them is undergoing commissioning). The Thermal-Hydraulics Department of the Nuclear Research Institute Rez performs accident analyses for these plants using a number of computer codes. To model the primary and secondary circuit behaviour, the system codes ATHLET, CATHARE, RELAP and TRAC are applied. The containment and pressure-suppression system are modelled with the COCOSYS and MELCOR codes, the reactor power calculations (point and space-neutron kinetics) are made with DYN3D and NESTLE, and CFD codes (FLUENT, TRIO) are used for some specific problems. An integral part of the current Czech project 'New Energy Sources' is the selection of a new nuclear source. Within this and the preceding projects financed by the Czech Ministry of Industry and Trade and the EU PHARE, the Department carries out and has carried out the systematic validation of thermal-hydraulic and reactor physics computer codes, applying data obtained on several experimental facilities as well as real operational data. One of the important components of the VVER 440/213 NPP is its containment with pressure suppression system (bubble condenser). For safety analyses of this system, computer codes of the MELCOR and COCOSYS type are used in the Czech Republic. These codes were developed for containments of classic PWRs or BWRs; in order to apply these codes to VVER 440 systems, their validation on experimental facilities must be performed. The paper provides concise information on these activities of the NRI and its Thermal-Hydraulics Department. The containment system of the VVER 440/213, its functions and approaches to the solution of its safety are described, with definition of acceptance criteria. A detailed example of the containment code validation on the EREC Test facility (LOCA and MSLB) and the consequent utilisation of the results for real NPP purposes is included. An approach to

  13. Development and preliminary validation of flux map processing code MAPLE

    International Nuclear Information System (INIS)

    Li Wenhuai; Zhang Xiangju; Dang Zhen; Chen Ming'an; Lu Haoliang; Li Jinggang; Wu Yuanbao

    2013-01-01

    The self-reliant flux map processing code MAPLE was developed by China General Nuclear Power Corporation (CGN). The weight coefficient method (WCM), polynomial expansion method (PEM) and thin plate spline (TPS) method were applied to fit the deviation between measured and predicted detector signal results for the two-dimensional radial plane, and to interpolate or extrapolate the deviation at non-instrumented locations. Comparison of results in the test cases shows that the TPS method can better capture the information of curved fitting lines than the other methods. The measured flux map data of the Lingao Nuclear Power Plant were processed using MAPLE as validation test cases, combined with the SMART code. Validation results show that the calculation results of MAPLE are reasonable and satisfactory. (authors)
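
    A thin plate spline fit of this kind can be sketched with SciPy's RBF interpolator. Everything below is illustrative: the detector locations, the synthetic deviation field, and the query points are made up, and this is not MAPLE's actual implementation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical detector deviations (measured minus predicted signal) at
# instrumented (x, y) positions on the radial plane; values are synthetic.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(20, 2))   # instrumented locations
dev = np.sin(xy[:, 0]) * np.cos(xy[:, 1])   # synthetic deviation field

# Thin plate spline fit of the deviation over the 2-D plane
tps = RBFInterpolator(xy, dev, kernel="thin_plate_spline")

# Interpolate/extrapolate the deviation at non-instrumented locations
query = np.array([[0.25, -0.4], [0.8, 0.1]])
est = tps(query)
```

With the default zero smoothing the spline passes exactly through the fitted detector points, which is the property that lets it follow curved trends better than low-order polynomial fits.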

  14. Validation of the TAC/BLOOST code (Contract research)

    International Nuclear Information System (INIS)

    Takamatsu, Kuniyoshi; Nakagawa, Shigeaki

    2005-06-01

    Safety demonstration tests using the High Temperature engineering Test Reactor (HTTR) are in progress to verify the inherent safety features of High Temperature Gas-cooled Reactors (HTGRs). The coolant flow reduction test by tripping the gas circulators is one of the safety demonstration tests. The reactor power is brought safely to a stable level without a reactor scram, and the temperature transient of the reactor core is very slow. The TAC/BLOOST code was developed to analyze the reactor power and temperature transients during the coolant flow reduction test, taking account of reactor dynamics. This paper describes the validation of the TAC/BLOOST code against the measured values of the gas circulator tripping tests at 30% power (9 MW). It was confirmed that the TAC/BLOOST code is able to analyze the reactor transient during the test. (author)

  15. Reactor Fuel Isotopics and Code Validation for Nuclear Applications

    Energy Technology Data Exchange (ETDEWEB)

    Francis, Matthew W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Weber, Charles F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pigni, Marco T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gauld, Ian C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-02-01

    Experimentally measured isotopic concentrations of well characterized spent nuclear fuel (SNF) samples have been collected and analyzed by previous researchers. These sets of experimental data have been used extensively to validate the accuracy of depletion code predictions for given sets of burnups, initial enrichments, and varying power histories for different reactor types. The purpose of this report is to present the diversity of data in a concise manner and summarize the current accuracy of depletion modeling. All calculations performed for this report were done using the Oak Ridge Isotope GENeration (ORIGEN) code, an internationally used irradiation and decay solver within the SCALE comprehensive modeling and simulation code system. The diversity of data given in this report includes key actinides, stable fission products, and radioactive fission products. In general, when using the current ENDF/B-VII.0 nuclear data libraries in SCALE, the major actinides are predicted to within 5% of the measured values. Large improvements were seen for several of the curium isotopes when using improved cross section data found in evaluated nuclear data file ENDF/B-VII.0 as compared to ENDF/B-V-based results. The impact of the flux spectrum on the plutonium isotope concentrations as a function of burnup was also shown. The general accuracy noted for the actinide samples for reactor types with burnups greater than 5,000 MWd/MTU was not observed for the low-burnup Hanford B samples. More work is needed in understanding these large discrepancies. The stable neodymium and samarium isotopes were predicted to within a few percent of the measured values. Large improvements were seen in predictions for a few of the samarium isotopes when using the ENDF/B-VII.0 libraries compared to results obtained with ENDF/B-V libraries. Very accurate predictions were obtained for 133Cs and 153Eu. However, the predicted values for the stable ruthenium and rhodium isotopes varied
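
    A minimal sketch of the comparison underlying the quoted figures: compute calculated-to-experimental (C/E) ratios per nuclide and flag those within 5%. The nuclide concentrations below are invented for illustration and are not values from the report.

```python
# Hypothetical calculated (C) and experimental (E) concentrations for a few
# major actinides; the 5% criterion is from the report, the numbers are not.
calc = {"U-235": 7.9e-3, "Pu-239": 5.2e-3, "Pu-240": 2.1e-3}
meas = {"U-235": 8.1e-3, "Pu-239": 5.0e-3, "Pu-240": 2.2e-3}

def ce_ratio(c, e):
    """Calculated-to-experimental ratio; 1.0 means perfect agreement."""
    return c / e

for nuclide in calc:
    r = ce_ratio(calc[nuclide], meas[nuclide])
    within_5pct = abs(r - 1.0) <= 0.05
    print(f"{nuclide}: C/E = {r:.3f}, within 5%: {within_5pct}")
```

C/E ratios are the standard way such depletion validation results are reported, so tabulating them per nuclide makes bias (systematic over- or under-prediction) immediately visible.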

  16. The Initial Atmospheric Transport (IAT) Code: Description and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Morrow, Charles W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bartel, Timothy James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The Initial Atmospheric Transport (IAT) computer code was developed at Sandia National Laboratories as part of their nuclear launch accident consequences analysis suite of computer codes. The purpose of IAT is to predict the initial puff/plume rise resulting from either a solid rocket propellant or liquid rocket fuel fire. The code generates initial conditions for subsequent atmospheric transport calculations. The IAT code has been compared to two data sets which are appropriate to the design space of space launch accident analyses. The primary model uncertainties are the entrainment coefficients for the extended Taylor model. The Titan 34D accident (1986) was used to calibrate these entrainment settings for a prototypic liquid propellant accident, while the recent Johns Hopkins University Applied Physics Laboratory (JHU/APL, or simply APL) large propellant block tests (2012) were used to calibrate the entrainment settings for prototypic solid propellant accidents. North American Meteorology (NAM) formatted weather data profiles are used by IAT to determine the local buoyancy force balance. The IAT comparisons for the APL solid propellant tests illustrate the sensitivity of the plume elevation to the weather profiles; that is, the weather profile is a dominant factor in determining the plume elevation. The IAT code performed remarkably well and is considered validated for neutral weather conditions.

  17. Automation of RELAP5 input calibration and code validation using genetic algorithm

    International Nuclear Information System (INIS)

    Phung, Viet-Anh; Kööp, Kaspar; Grishchenko, Dmitry; Vorobyev, Yury; Kudinov, Pavel

    2016-01-01

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the
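
    The calibration loop described above, minimizing a weighted discrepancy between simulated and experimental SRQs over uncertain input parameters, can be sketched with a toy genetic algorithm. Everything here is hypothetical: the two input parameters, the analytic stand-in for a code run, the SRQ targets, weights, and GA settings are all made up, and this is not the authors' RELAP5 workflow.

```python
import random

# Toy stand-in for a code run: maps two uncertain input parameters to two
# system response quantities (SRQs). The real case would invoke RELAP5.
def model(p):
    k_loss, q_scale = p
    return (10.0 / (1.0 + k_loss), 2.0 * q_scale)  # (max flow rate, period)

target = model((0.5, 1.2))   # pretend these are the measured SRQs
weights = (1.0, 1.0)         # per-SRQ weighting in the fitness function

def fitness(p):
    """Weighted sum of squared relative SRQ discrepancies (lower is better)."""
    srq = model(p)
    return sum(w * ((s - t) / t) ** 2 for w, s, t in zip(weights, srq, target))

def ga(pop_size=30, gens=60, bounds=((0.0, 2.0), (0.5, 2.0))):
    rng = random.Random(1)
    pop = [tuple(rng.uniform(lo, hi) for lo, hi in bounds) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 3]          # keep the best third
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)       # crossover: average two elites
            child = tuple((x + y) / 2 + rng.gauss(0, 0.05) for x, y in zip(a, b))
            child = tuple(min(max(v, lo), hi)  # clamp to bounds
                          for v, (lo, hi) in zip(child, bounds))
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = ga()
```

The normalization by the target value and the explicit weights mirror the point the abstract makes: the outcome of the calibration depends directly on how the SRQs are selected and weighted in the fitness function.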

  18. Automation of RELAP5 input calibration and code validation using genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Phung, Viet-Anh, E-mail: vaphung@kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Vorobyev, Yury, E-mail: yura3510@gmail.com [National Research Center “Kurchatov Institute”, Kurchatov square 1, Moscow 123182 (Russian Federation); Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden)

    2016-04-15

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the

  19. Preliminary Validation of the MATRA-LMR Code Using Existing Sodium-Cooled Experimental Data

    International Nuclear Information System (INIS)

    Choi, Sun Rock; Kim, Sangji

    2014-01-01

    The main objective of the SFR prototype plant is to verify TRU metal fuel performance, reactor operation, and the transmutation ability of high-level wastes. The core thermal-hydraulic design is used to ensure safe fuel performance during the whole plant operation. The fuel design limit is highly dependent on both the maximum cladding temperature and the uncertainties of the design parameters. Therefore, an accurate temperature calculation in each subassembly is highly important to assure safe and reliable operation of the reactor systems. The current core thermal-hydraulic design is mainly performed using the SLTHEN (Steady-State LMR Thermal-Hydraulic Analysis Code Based on ENERGY Model) code, which has already been validated using the existing sodium-cooled experimental data. In addition to the SLTHEN code, a detailed analysis is performed using the MATRA-LMR (Multichannel Analyzer for Transient and steady-state in Rod Array-Liquid Metal Reactor) code. In this work, the MATRA-LMR code is validated for a single-subassembly evaluation using the previous experimental data. The results demonstrate that the design code appropriately predicts the temperature distributions compared with the experimental values. Major differences are observed in the experiments with large pin numbers, due to differences in radial mixing

  20. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  1. Validation of the transportation computer codes HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND

    International Nuclear Information System (INIS)

    Maheras, S.J.; Pippen, H.K.

    1995-05-01

    The computer codes HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND were used to estimate radiation doses from the transportation of radioactive material in the Department of Energy Programmatic Spent Nuclear Fuel Management and Idaho National Engineering Laboratory Environmental Restoration and Waste Management Programs Environmental Impact Statement. HIGHWAY and INTERLINE were used to estimate transportation routes for truck and rail shipments, respectively. RADTRAN 4 was used to estimate collective doses from incident-free transportation and the risk (probability × consequence) from transportation accidents. RISKIND was used to estimate incident-free radiation doses for maximally exposed individuals and the consequences from reasonably foreseeable transportation accidents. The purpose of this analysis is to validate the estimates made by these computer codes; critiques of the conceptual models used in RADTRAN 4 are also discussed. Validation is defined as ''the test and evaluation of the completed software to ensure compliance with software requirements.'' In this analysis, validation means that the differences between the estimates generated by these codes and independent observations are small (i.e., within the acceptance criterion established for the validation analysis). In some cases, the independent observations used in the validation were measurements; in other cases, the independent observations used in the validation analysis were generated using hand calculations. The results of the validation analyses performed for HIGHWAY, INTERLINE, RADTRAN 4, and RISKIND show that the differences between the estimates generated using the computer codes and independent observations were small. Based on the acceptance criterion established for the validation analyses, the codes yielded acceptable results; in all cases the estimates met the requirements for successful validation.
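
    The validation notion used here, that the difference between a code estimate and an independent observation stays within an acceptance criterion, can be expressed as a one-line check. The 10% default and the example numbers below are illustrative assumptions; the report defines its own acceptance criterion.

```python
def validated(code_value, independent_value, criterion=0.10):
    """True if the relative difference is within the acceptance criterion.

    The 10% default is illustrative only; the validation analysis in the
    report establishes its own criterion.
    """
    return abs(code_value - independent_value) / abs(independent_value) <= criterion

# e.g. a collective dose estimate vs. an independent hand calculation
print(validated(1.05e-2, 1.00e-2))   # relative difference 5%: within criterion
```

The same check applies whether the independent observation is a measurement or a hand calculation, which is exactly the two cases the abstract distinguishes.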

  2. Validation and testing of the VAM2D computer code

    International Nuclear Information System (INIS)

    Kool, J.B.; Wu, Y.S.

    1991-10-01

    This document describes two modeling studies conducted by HydroGeoLogic, Inc. for the US NRC under contract no. NRC-04089-090, entitled, ''Validation and Testing of the VAM2D Computer Code.'' VAM2D is a two-dimensional, variably saturated flow and transport code, with applications for performance assessment of nuclear waste disposal. The computer code itself is documented in a separate NUREG document (NUREG/CR-5352, 1989). The studies presented in this report involve application of the VAM2D code to two diverse subsurface modeling problems. The first one involves modeling of infiltration and redistribution of water and solutes in an initially dry, heterogeneous field soil. This application involves detailed modeling over a relatively short, 9-month time period. The second problem pertains to the application of VAM2D to the modeling of a waste disposal facility in a fractured clay, over much larger space and time scales and with particular emphasis on the applicability and reliability of using equivalent porous medium approach for simulating flow and transport in fractured geologic media. Reflecting the separate and distinct nature of the two problems studied, this report is organized in two separate parts. 61 refs., 31 figs., 9 tabs

  3. Integrated Validation System for a Thermal-hydraulic System Code, TASS/SMR-S

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hee-Kyung; Kim, Hyungjun; Kim, Soo Hyoung; Hwang, Young-Dong [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Hyeon-Soo [Chungnam National University, Daejeon (Korea, Republic of)

    2015-10-15

    Development, including enhancement and modification, of a thermal-hydraulic system computer code is indispensable for a new reactor, SMART. Usually, a thermal-hydraulic system code is validated by comparison with the results of corresponding physical effect tests. In the reactor safety field, a similar concept, referred to as separate effect tests, has been used for a long time. But there are a great many test data for comparison, because a lot of separate effect tests and integral effect tests are required for a code validation. It is not easy for a code developer to validate a computer code whenever a code modification occurs. IVS automatically produces graphs showing the comparison of the code calculation results with the corresponding test results. IVS was developed for the validation of the TASS/SMR-S code. The code validation could be achieved by comparing code calculation results with corresponding test results; this comparison is represented as a graph for convenience. IVS is useful before releasing a new code version: the code developer can validate code results easily using IVS. Even during code development, IVS can be used for validation of a code modification. The code developer can gain confidence about his code modification easily and quickly, and can be freed from tedious and long validation work. The popular software introduced in IVS supplies good usability and portability.

  4. Development validation and use of computer codes for inelastic analysis

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    A finite element scheme is a system which provides routines to carry out the operations which are common to all finite element programs. The list of items that can be provided as standard by a finite element scheme is surprisingly large, and the list provided by the UNCLE finite element scheme is unusually comprehensive. This presentation covers the following: construction of the program, setting up a finite element mesh, generation of coordinates, and incorporation of boundary and load conditions. Program validation was done by creep calculations performed using the CAUSE code. Program use is illustrated by calculating a typical inelastic analysis problem, including a computer model of the PFR intermediate heat exchanger

  5. Implementation of LT codes based on chaos

    International Nuclear Information System (INIS)

    Zhou Qian; Li Liang; Chen Zengqiang; Zhao Jiaxiang

    2008-01-01

    Fountain codes provide an efficient way to transfer information over erasure channels like the Internet. LT codes are the first codes fully realizing the digital fountain concept. They are asymptotically optimal rateless erasure codes with highly efficient encoding and decoding algorithms. In theory, for each encoding symbol of LT codes, its degree is randomly chosen according to a predetermined degree distribution, and the neighbours used to generate that encoding symbol are chosen uniformly at random. Practical implementations of LT codes usually realize the randomness through a pseudo-random number generator such as the linear congruential method. This paper applies the pseudo-randomness of chaotic sequences in the implementation of LT codes. Two Kent chaotic maps are used to determine the degree and neighbour(s) of each encoding symbol. It is shown that the implemented LT codes based on chaos perform better than LT codes implemented with a traditional pseudo-random number generator. (general)
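
    As a rough illustration of the scheme described above, the sketch below drives the degree and neighbour selection of one LT encoding symbol from a Kent chaotic map. The degree distribution, map parameter, seed, and source symbols are all illustrative assumptions; the paper's actual degree distribution (e.g. a robust soliton) and parameters are not reproduced here.

```python
# Kent (skew tent) map: a piecewise-linear chaotic map on (0, 1).
def kent_map(x, a=0.7):
    return x / a if x < a else (1.0 - x) / (1.0 - a)

def kent_stream(seed=0.123, a=0.7):
    """Infinite stream of chaotic values used in place of a PRNG."""
    x = seed
    while True:
        x = kent_map(x, a)
        yield x

def encode_symbol(source, stream, degree_cdf=(0.1, 0.5, 0.8, 1.0)):
    """Build one LT encoding symbol: XOR of 'degree' chosen source symbols.

    degree_cdf is a toy cumulative degree distribution, not the robust
    soliton distribution an actual LT code would use.
    """
    u = next(stream)
    degree = next(i + 1 for i, cdf in enumerate(degree_cdf) if u <= cdf)
    neighbours = set()
    while len(neighbours) < degree:           # distinct neighbour indices
        neighbours.add(int(next(stream) * len(source)) % len(source))
    value = 0
    for i in neighbours:
        value ^= source[i]
    return sorted(neighbours), value

source = [0b1010, 0b0111, 0b1100, 0b0001]     # toy source symbols
stream = kent_stream()
nbrs, sym = encode_symbol(source, stream)
```

Because the Kent map is deterministic for a given seed, sender and receiver can regenerate the same degree and neighbour choices, which is the property that lets a chaotic sequence substitute for a conventional pseudo-random number generator.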

  6. Development and Validation of A Nuclear Fuel Cycle Analysis Tool: A FUTURE Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. K.; Ko, W. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lee, Yoon Hee [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2013-10-15

    This paper presents the development and validation methods of the FUTURE (FUel cycle analysis Tool for nUcleaR Energy) code, which was developed for a dynamic material flow evaluation and economic analysis of the nuclear fuel cycle. This code enables an evaluation of a nuclear material flow and its economy for diverse nuclear fuel cycles based on a predictable scenario. The most notable virtue of this FUTURE code, which was developed using C# and MICROSOFT SQL DBMS, is that a program user can design a nuclear fuel cycle process easily using a standard process on the canvas screen through a drag-and-drop method. From the user's point of view, this code is very easy to use thanks to its high flexibility. In addition, the new code also enables the maintenance of data integrity by constructing a database environment of the results of the nuclear fuel cycle analyses.

  7. DEVELOPMENT AND VALIDATION OF A NUCLEAR FUEL CYCLE ANALYSIS TOOL: A FUTURE CODE

    Directory of Open Access Journals (Sweden)

    S.K. KIM

    2013-10-01

    Full Text Available This paper presents the development and validation methods of the FUTURE (FUel cycle analysis Tool for nUcleaR Energy) code, which was developed for a dynamic material flow evaluation and economic analysis of the nuclear fuel cycle. This code enables an evaluation of a nuclear material flow and its economy for diverse nuclear fuel cycles based on a predictable scenario. The most notable virtue of this FUTURE code, which was developed using C# and MICROSOFT SQL DBMS, is that a program user can design a nuclear fuel cycle process easily using a standard process on the canvas screen through a drag-and-drop method. From the user's point of view, this code is very easy to use thanks to its high flexibility. In addition, the new code also enables the maintenance of data integrity by constructing a database environment of the results of the nuclear fuel cycle analyses.

  8. Validation of system codes RELAP5 and SPECTRA for natural convection boiling in narrow channels

    Energy Technology Data Exchange (ETDEWEB)

    Stempniewicz, M.M., E-mail: stempniewicz@nrg.eu; Slootman, M.L.F.; Wiersema, H.T.

    2016-10-15

    Highlights: • Computer codes RELAP5/Mod3.3 and SPECTRA 3.61 validated for boiling in narrow channels. • Validated codes can be used for LOCA analyses in research reactors. • Code validation based on natural convection boiling in narrow channels experiments. - Abstract: Safety analyses of LOCA scenarios in nuclear power plants are performed with so-called thermal–hydraulic system codes, such as RELAP5. Such codes are validated for typical fuel geometries applied in nuclear power plants. The question considered by this article is whether the codes can be applied to LOCA analyses in research reactors, in particular exceeding CHF in very narrow channels. In order to answer this question, validation calculations were performed with two thermal–hydraulic system codes: RELAP and SPECTRA. The validation was based on natural convection boiling in narrow channels experiments, performed by Prof. Monde et al. in the years 1990–2000. In total 42 vertical tube and annulus experiments were simulated with both codes. A good agreement of the calculated values with the measured data was observed. The main conclusions are: • The computer codes RELAP5/Mod 3.3 (US NRC version) and SPECTRA 3.61 have been validated for natural convection boiling in narrow channels using the experiments of Monde. The experiments covered a range of dimensions that includes the values observed in typical research reactors. Therefore it is concluded that both codes are validated and can be used for LOCA analyses in research reactors, including natural convection boiling. The applicability range of the present validation is: hydraulic diameters of 1.1 ⩽ D{sub hyd} ⩽ 9.0 mm, heated lengths of 0.1 ⩽ L ⩽ 1.0 m, pressures of 0.10 ⩽ P ⩽ 0.99 MPa. In most calculations the burnout was predicted to occur at lower power than that observed in the experiments. In several cases the burnout was predicted at higher power. The overprediction was not larger than 16% in RELAP and 15% in
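
    The applicability range quoted in the conclusions can be wrapped as a simple envelope check before applying the validated codes to a new case. The function and dictionary names below are illustrative; only the numeric bounds come from the abstract.

```python
# Validation envelope from the abstract.
# Units: hydraulic diameter in mm, heated length in m, pressure in MPa.
ENVELOPE = {"d_hyd_mm": (1.1, 9.0), "length_m": (0.1, 1.0), "p_mpa": (0.10, 0.99)}

def within_validation_envelope(d_hyd_mm, length_m, p_mpa, envelope=ENVELOPE):
    """True if the case falls inside the validated parameter range."""
    values = {"d_hyd_mm": d_hyd_mm, "length_m": length_m, "p_mpa": p_mpa}
    return all(lo <= values[k] <= hi for k, (lo, hi) in envelope.items())

print(within_validation_envelope(2.5, 0.6, 0.4))    # inside the range
print(within_validation_envelope(12.0, 0.6, 0.4))   # diameter outside the range
```

A check like this makes the scope of a validation statement explicit: the conclusion that RELAP5 and SPECTRA are validated for research-reactor LOCA analyses holds only for cases inside this envelope.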

  9. Group representations, error bases and quantum codes

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E

    1996-01-01

    This report continues the discussion of unitary error bases and quantum codes. Nice error bases are characterized in terms of the existence of certain characters in a group. A general construction for error bases which are non-abelian over the center is given. The method for obtaining codes due to Calderbank et al. is generalized and expressed purely in representation theoretic terms. The significance of the inertia subgroup both for constructing codes and obtaining the set of transversally implementable operations is demonstrated.

  10. Validation of thermal hydraulic codes for fusion reactors safety

    International Nuclear Information System (INIS)

    Sardain, P.; Gulden, W.; Massaut, V.; Takase, K.; Merill, B.; Caruso, G.

    2006-01-01

    A significant effort has been made worldwide on the validation of thermal hydraulic codes that can be used for the safety assessment of fusion reactors. This work is an item of an implementing agreement under the umbrella of the International Energy Agency; the European part is supported by EFDA. Several programmes related to transient analysis in water-cooled fusion reactors were run in order to assess the capabilities of the codes to treat the main physical phenomena governing the accidental sequences related to water/steam discharge into the vacuum vessel or the cryostat. The typical phenomena are namely the pressurization of a volume at low initial pressure, the critical flow, the flashing, the relief into an expansion volume, the condensation of vapor in a pressure suppression system, the formation of ice on a cryogenic structure, and the heat transfer between walls and fluid in various thermodynamic conditions. · A benchmark exercise has been done involving different types of codes, from homogeneous equilibrium to six-equation non-equilibrium models. Several cases were defined, each one focusing on a particular phenomenon. · The ICE (Ingress of Coolant Event) facility has been operated in Japan. It has simulated an in-vessel LOCA and the discharge of steam into a pressure suppression system. · The EVITA (European Vacuum Impingement Test Apparatus) facility has been operated in France. It has simulated ingress of coolant into the cryostat, i.e. into a volume at low initial pressure containing surfaces at cryogenic temperature. This paper gives the main lessons learned from these programmes, in particular the possibilities for the improvement of the computer codes, extending their capabilities. For example, the water properties have been extended below the triple point, ice formation models have been implemented, and work has also been done on condensation models. The remaining needs for R and D are also highlighted. (author)

  11. Validation and application of the system code ATHLET-CD for BWR severe accident analyses

    Energy Technology Data Exchange (ETDEWEB)

    Di Marcello, Valentino, E-mail: valentino.marcello@kit.edu; Imke, Uwe; Sanchez, Victor

    2016-10-15

Highlights: • We present the application of the system code ATHLET-CD for BWR safety analyses. • Validation of core in-vessel models is performed based on KIT CORA experiments. • A SB-LOCA scenario is simulated on a generic German BWR plant up to vessel failure. • Different core reflooding possibilities are investigated to mitigate the accident consequences. • ATHLET-CD modelling features reflect the current state of the art of severe accident codes. - Abstract: This paper addresses the validation and application of the system code ATHLET-CD for the simulation of severe accident phenomena in Boiling Water Reactors (BWRs). The corresponding models for core degradation behaviour, e.g. oxidation, melting, and relocation of core structural components, are validated against experimental data available from the CORA-16 and -17 bundle tests. Model weaknesses are discussed along with needs for further code improvements. With the validated ATHLET-CD code, calculations are performed to assess the code capabilities for the prediction of in-vessel late-phase core behaviour and reflooding of damaged fuel rods. For this purpose, a small break LOCA scenario for a generic German BWR with postulated multiple failures of the safety systems was selected. In the analysis, accident management measures represented by cold water injection into the damaged reactor core are addressed to investigate their efficacy in avoiding or delaying the failure of the reactor pressure vessel. Results show that ATHLET-CD is applicable to the description of BWR plant behaviour, with reliable physical models and numerical methods adopted for the description of key in-vessel phenomena.

  12. Radiant Energy Measurements from a Scaled Jet Engine Axisymmetric Exhaust Nozzle for a Baseline Code Validation Case

    Science.gov (United States)

    Baumeister, Joseph F.

    1994-01-01

    A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.

  13. Research on verification and validation strategy of detonation fluid dynamics code of LAD2D

    Science.gov (United States)

    Wang, R. L.; Liang, X.; Liu, X. Z.

    2017-07-01

Verification and validation (V&V) is an important approach to the software quality assurance of codes in complex engineering applications; a reasonable and efficient V&V strategy can greatly reduce the effort required. This article introduces the Lagrangian adaptive hydrodynamics code in 2D space (LAD2D), self-developed software for detonation CFD with elastic-plastic structures. The V&V strategy of this detonation CFD code is presented on the foundation of V&V methodology for scientific software. The basic framework of module verification and function validation is proposed, which together compose the detonation fluid dynamics model V&V strategy of LAD2D.

  14. Validation and verification of the ORNL Monte Carlo codes for nuclear safety analysis

    International Nuclear Information System (INIS)

    Emmett, M.B.

    1993-01-01

The process of ensuring the quality of computer codes can be very time consuming and expensive. The Oak Ridge National Laboratory (ORNL) Monte Carlo codes all predate the existence of quality assurance (QA) standards and configuration control. The number of person-years and the amount of money spent on code development make it impossible to adhere strictly to all the current requirements. At ORNL, the Nuclear Engineering Applications Section of the Computing Applications Division is responsible for the development, maintenance, and application of the Monte Carlo codes MORSE and KENO. The KENO code is used for criticality analyses; the MORSE code, which has two official versions, CGA and SGC, is used for radiation transport analyses. Because KENO and MORSE were very thoroughly checked out over many years of extensive use both in the United States and in the international community, the existing codes were "baselined". This means that the versions existing at the time the original configuration plan was written are considered to be validated and verified code systems based on the established experience with them.

  15. Spent reactor fuel benchmark composition data for code validation

    International Nuclear Information System (INIS)

    Bierman, S.R.

    1991-09-01

To establish criticality safety margins utilizing burnup credit in the storage and transport of spent reactor fuels requires a knowledge of the uncertainty in the calculated fuel composition used in making the reactivity assessment. To provide data for validating such calculated burnup fuel compositions, radiochemical assays are being obtained as part of the United States Department of Energy From-Reactor Cask Development Program. Destructive assay data are being obtained from representative reactor fuels that have experienced irradiation exposures up to about 55 GWD/MTM. Assay results and associated operating histories for the initial three samples analyzed in this effort are presented. The three samples were taken from different axial regions of the same fuel rod and represent exposures of about 27, 37, and 44 GWD/MTM. The data are presented in a benchmark-type format to facilitate identification/referencing and computer code input.

  16. Context discovery using attenuated Bloom codes: model description and validation

    NARCIS (Netherlands)

    Liu, F.; Heijenk, Geert

    A novel approach to performing context discovery in ad-hoc networks based on the use of attenuated Bloom filters is proposed in this report. In order to investigate the performance of this approach, a model has been developed. This document describes the model and its validation. The model has been
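The record does not give the filter parameters used in the model, but the mechanism itself is standard: an attenuated Bloom filter is a stack of ordinary Bloom filters in which layer i summarizes the context names reachable i hops away, so deeper (more "attenuated") layers carry progressively less certain information. A minimal sketch, with the depth, bit-array size, and hash count chosen arbitrarily for illustration:

```python
import hashlib

class AttenuatedBloomFilter:
    """Stack of Bloom filters; layer i summarizes names reachable in i hops."""

    def __init__(self, depth=3, size=256, hashes=4):
        self.depth, self.size, self.hashes = depth, size, hashes
        self.layers = [[0] * size for _ in range(depth)]

    def _bits(self, item):
        # Derive `hashes` independent bit positions from SHA-256 digests.
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:4], "big") % self.size

    def add(self, item, hops=0):
        """Record that `item` is reachable `hops` hops away."""
        for b in self._bits(item):
            self.layers[hops][b] = 1

    def query(self, item):
        """Return the smallest hop count at which `item` may be reachable,
        or None if no layer matches (false positives are possible)."""
        for hops, layer in enumerate(self.layers):
            if all(layer[b] for b in self._bits(item)):
                return hops
        return None
```

In an ad-hoc network each node would periodically broadcast its filter stack shifted down by one layer, so a neighbour learns that whatever the sender advertises at layer i is reachable at layer i+1.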

  17. Test and validation of the iterative code for the neutrons spectrometry and dosimetry: NSDUAZ

    International Nuclear Information System (INIS)

    Reyes H, A.; Ortiz R, J. M.; Reyes A, A.; Castaneda M, R.; Solis S, L. O.; Vega C, H. R.

    2014-08-01

This work presents the test and validation of an iterative code for neutron spectrometry known as Neutron Spectrometry and Dosimetry of the Universidad Autonoma de Zacatecas (NSDUAZ). The code was designed with a friendly and intuitive graphical user interface in the LabVIEW programming environment, using the iterative algorithm known as SPUNIT. Its main features are the automatic selection of the initial spectrum from the catalogue of neutron spectra compiled by the International Atomic Energy Agency, and the ability to generate a report in HTML format that shows the neutron fluence in graphical and numerical form and calculates the ambient dose equivalent based on it. To test the designed code, the count rates of a Bonner sphere spectrometer system with a {sup 6}LiI(Eu) detector and 7 polyethylene spheres with diameters of 0, 2, 3, 5, 8, 10 and 12 were used. The count rates measured with two neutron sources, {sup 252}Cf and {sup 239}PuBe, were used to validate the code, and the results were compared against those obtained with the BUNKIUT code. The reconstructed spectra present an error within the limit reported in the literature, which oscillates around 15%. It was therefore concluded that the designed code gives results similar to those of the techniques in use at present. (Author)
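SPUNIT belongs to the family of multiplicative iterative unfolding algorithms, which adjust a trial spectrum until the folded count rates reproduce the measured ones. The sketch below shows a generic update of this kind, not the exact SPUNIT formulation; `response[i][j]` stands for the response of sphere i to energy bin j, and all names are illustrative:

```python
def iterative_unfold(response, counts, guess, iterations=100):
    """Multiplicative iterative unfolding: refine a guess spectrum `phi`
    so that the folded counts R*phi approach the measured count rates."""
    phi = list(guess)
    n_det, n_bin = len(counts), len(phi)
    for _ in range(iterations):
        # Folded counts predicted by the current spectrum estimate.
        folded = [sum(response[i][j] * phi[j] for j in range(n_bin))
                  for i in range(n_det)]
        for j in range(n_bin):
            # Scale each bin by the response-weighted measured/predicted ratio.
            num = sum(response[i][j] * counts[i] / folded[i] for i in range(n_det))
            den = sum(response[i][j] for i in range(n_det))
            phi[j] *= num / den
    return phi
```

Starting from a library spectrum (as NSDUAZ does from the IAEA catalogue) rather than a flat guess typically speeds convergence and reduces unphysical oscillations.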

  18. Preliminary validation of the MATRA-LMR-FB code for the flow blockage in a subassembly

    International Nuclear Information System (INIS)

    Jeong, H. Y.; Ha, K. S.; Kwon, Y. M.; Chang, W. P.; Lee, Y. B.; Heo, S.

    2005-01-01

To analyze flow blockage in a subassembly of a Liquid Metal-cooled Reactor (LMR), the MATRA-LMR-FB code has been developed and validated against existing experimental data. Compared to the MATRA-LMR code, which was successfully applied to the core thermal-hydraulic design of KALIMER, the MATRA-LMR-FB code includes some advanced modeling features. Firstly, a Distributed Resistance Model (DRM), which enables a very accurate description of the effects of wire-wrap and blockage in a flow path, was developed for MATRA-LMR-FB. Secondly, the hybrid difference method is used to minimize numerical diffusion, especially in low-flow regions such as the recirculating wakes behind a blockage. In addition, the code is equipped with various turbulent mixing models to describe the active mixing due to turbulent motion as accurately as possible. For the validation of MATRA-LMR-FB, the ORNL THORS test and the KOS 169-pin test are analyzed. Based on the analysis of the temperature data, the accuracy of the code is evaluated quantitatively. MATRA-LMR-FB predicts very accurately the exit temperatures measured in the wire-wrapped subassembly. However, the temperatures predicted for the experiment with spacer grids show some deviations from the measured values. To enhance the accuracy of MATRA-LMR-FB for flow paths with grid spacers, it is suggested to improve the model for pressure loss due to spacer grids and the modeling of the blockage itself. The MATRA-LMR-FB code is considered applicable to the flow blockage analysis of KALIMER-600, which adopts wire-wrapped subassemblies.

  19. Optimization and Validation of the Developed Uranium Isotopic Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Kang, M. Y.; Kim, Jinhyeong; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

γ-ray spectroscopy is a representative non-destructive assay (NDA) for nuclear material, less time-consuming and less expensive than destructive analysis methods. Destructive techniques are more precise than NDA, but correction algorithms can improve the performance of γ-spectroscopy. For this reason, an analysis code for uranium isotopic analysis has been developed by the Applied Nuclear Physics Group at Seoul National University. Overlapping γ- and x-ray peaks in the 89-101 keV X{sub α}-region are fitted with Gaussian and Lorentzian peak functions plus tail and background functions. In this study, the full-energy peak efficiency calibration and the fitting parameters for peak tail and background are optimized, and the results are validated with 24-hour acquisitions of CRM uranium samples. The analysis performance is improved for HEU samples, but further optimization of the fitting parameters is required for LEU sample analysis. In the future, the fitting parameters will be optimized with various types of uranium samples, and {sup 234}U isotopic analysis algorithms and correction algorithms (coincidence effect, self-attenuation effect) will be developed.

  20. INL Experimental Program Roadmap for Thermal Hydraulic Code Validation

    Energy Technology Data Exchange (ETDEWEB)

    Glenn McCreery; Hugh McIlroy

    2007-09-01

Advanced computer modeling and simulation tools and protocols will be heavily relied on for a wide variety of system studies, engineering design activities, and other aspects of the Next Generation Nuclear Plant (NGNP) Very High Temperature Reactor (VHTR), the DOE Global Nuclear Energy Partnership (GNEP), and light-water reactors. The goal is for all modeling and simulation tools to be demonstrated accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are to be used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Recent literature identifies specific experimental principles that must be followed in order to ensure that experimental data meet the standards required for a “benchmark” database. Even for well conducted experiments, missing experimental details, such as geometrical definition, data reduction procedures, and manufacturing tolerances, have led to poor benchmark calculations. The INL has a long and deep history of research in thermal hydraulics, especially in the 1960s through 1980s when many programs such as LOFT and Semiscale were devoted to light-water reactor safety research, the EBR-II fast reactor was in operation, and a strong geothermal energy program was established. The past can serve as a partial guide for reinvigorating thermal hydraulic research at the laboratory. However, new research programs need to fully incorporate modern experimental methods such as measurement techniques using the latest instrumentation, computerized data reduction, and scaling methodology. The path forward for establishing experimental research for code model validation will require benchmark experiments conducted in suitable facilities located at the INL. This document describes thermal hydraulic facility requirements and candidate buildings and presents examples of suitable validation experiments related

  1. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

Independent testing of the FLASH computer code, Version 3.0, was conducted to determine whether the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to check the correctness of the FORTRAN coding, the computational accuracy, and the suitability for simulating actual hydrologic conditions. The testing was performed using a structured evaluation protocol consisting of blind testing, independent applications, and test cases of graduated difficulty. Both quantitative and qualitative testing were performed by evaluating relative root-mean-square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and the correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. The test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions, and the validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that FLASH is a versatile and powerful two-dimensional analysis tool for fluid flow. All aspects of the code that were tested, except for the unit-gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.
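The report quantifies agreement through relative root-mean-square values but does not spell out the normalization used. The sketch below assumes one common convention, the RMS of the prediction-reference differences normalized by the RMS of the reference profile:

```python
import math

def relative_rms(predicted, reference):
    """Relative root-mean-square difference between a code's output profile
    and an analytical or experimental reference profile (one convention)."""
    if len(predicted) != len(reference):
        raise ValueError("profiles must have the same length")
    num = sum((p - r) ** 2 for p, r in zip(predicted, reference))
    den = sum(r ** 2 for r in reference)
    return math.sqrt(num / den)
```

A value of 0 means the profiles coincide; values well below 1 indicate the kind of quantitative agreement the verification tests report.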

  2. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    Science.gov (United States)

    Springate, David A; Kontopantelis, Evangelos; Ashcroft, Darren M; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David

    2014-01-01

    Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.

  3. First experimental validation on the core equilibrium code: HARMONIE

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.; Cozzani, M.; Gnuffi, M.

    1981-08-01

The code HARMONIE calculates the mechanical equilibrium of a fast reactor core. An experimental programme of deformation, in air, of groups of subassemblies was performed on a mock-up in the Super Phenix 1 geometry. This programme included three kinds of tests, all performed first without and then with grease: on groups of 2 or 3 rings of subassemblies subjected to a force acting upon flats or angles; on groups of 35 and 41 subassemblies subjected to a force acting on the first row, then with 1 or 2 empty cells; and on groups with 1 or 2 bowed subassemblies or 1 subassembly enlarged over flats. A preliminary test on the friction coefficient in air between two pads showed some dependence on the pad surface condition, with a scattering factor of 8. Two basic code hypotheses were validated: the rotation of the subassemblies around their axes is negligible after deformation of the group, and choosing a mean Maxwell coefficient, between those of the 1st and 2nd slope, leads to results very similar to the experimental ones. The agreement between the tests and the HARMONIE calculations was good, qualitatively for all the groups and quantitatively for regular groups of at most 3 rings. The difference increased for the larger groups of 35 or 41 subassemblies; friction between pads, neglected by HARMONIE, seems to be the main reason. Other reasons for these differences are the influence of the loading order on the mock-up and the initial contacts arising from the gap between foot and diagrid insert and from manufacturing bowing

  4. Verification and Validation of The Tritium Transport Code TMAP7

    International Nuclear Information System (INIS)

    Glen R. Longhurst; James Ambrosek

    2004-01-01

    The TMAP Code was written at the Idaho National Engineering and Environmental Laboratory in the late 1980s as a tool for safety analysis of systems involving tritium. Since then it has been upgraded several times and has been used in numerous applications including experiments supporting fusion safety, predictions for advanced systems such as the International Thermonuclear Experimental Reactor (ITER), and estimates involving tritium production technologies. Its most recent upgrade to TMAP7 was accomplished in response to several needs. Prior versions had the capacity to deal with only a single trap for diffusing gaseous species in solid structures. TMAP7 includes up to three separate traps and up to 10 diffusing species. The original code had difficulty dealing with heteronuclear molecule formation such as HD and DT. That has been removed. Under pre-specified boundary enclosure conditions and solution-law dependent diffusion boundary conditions, such as Sieverts' law, TMAP7 automatically generates heteronuclear molecular partial pressures when solubilities and partial pressures of the homonuclear molecular species are provided for law-dependent diffusion boundary conditions. A further sophistication is the addition of non-diffusing surface species. Atoms such as oxygen or nitrogen or formation of hydroxyl radicals on metal surfaces are sometimes important in molecule formation with diffusing hydrogen isotopes but do not, themselves, diffuse appreciably in the material. TMAP7 will accommodate up to 30 such surface species, allowing the user to specify relationships between those surface concentrations and partial pressures of gaseous species above the surfaces or to form them dynamically by combining diffusion species or other surface species. Additionally, TMAP7 allows the user to include a surface binding energy and an adsorption barrier energy and includes asymmetrical diffusion between the surface sites and regular diffusion sites in the bulk. All of the
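TMAP7's actual implementation is documented in its manual; the relations underlying the heteronuclear-molecule feature, however, are standard physical chemistry. Under Sieverts' law the dissolved concentration of a diatomic gas scales with the square root of its partial pressure, and for ideal isotopic mixing the heteronuclear partial pressure follows from the homonuclear ones through an equilibrium constant K ≈ 4. A hedged sketch of both relations (function names and the ideal-K assumption are illustrative, not TMAP7's API):

```python
import math

def heteronuclear_pressure(p_h2, p_d2, k_eq=4.0):
    """Equilibrium HD partial pressure from the homonuclear pressures:
    p_HD = sqrt(K * p_H2 * p_D2), with K ~ 4 for ideal statistical mixing."""
    return math.sqrt(k_eq * p_h2 * p_d2)

def dissolved_concentration(pressure, sieverts_constant):
    """Sieverts' law: dissolved atomic concentration of a diatomic gas in a
    metal scales with the square root of its partial pressure."""
    return sieverts_constant * math.sqrt(pressure)
```

Given the solubilities and the homonuclear partial pressures at a boundary, relations of this form are enough to fix the heteronuclear partial pressure automatically, which is the capability the abstract describes.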

  5. Integral large scale experiments on hydrogen combustion for severe accident code validation-HYCOM

    International Nuclear Information System (INIS)

    Breitung, W.; Dorofeev, S.; Kotchourko, A.; Redlinger, R.; Scholtyssek, W.; Bentaib, A.; L'Heriteau, J.-P.; Pailhories, P.; Eyink, J.; Movahed, M.; Petzold, K.-G.; Heitsch, M.; Alekseev, V.; Denkevits, A.; Kuznetsov, M.; Efimenko, A.; Okun, M.V.; Huld, T.; Baraldi, D.

    2005-01-01

A joint research project was carried out in the EU Fifth Framework Programme, concerning hydrogen risk in a nuclear power plant. The goals were: firstly, to create a new database of results on hydrogen combustion experiments in the slow to turbulent combustion regimes; secondly, to validate the partners' CFD and lumped-parameter codes on the experimental data, and to evaluate suitable parameter sets for application calculations; thirdly, to conduct a benchmark exercise by applying the codes to the full-scale analysis of a postulated hydrogen combustion scenario in a light water reactor containment after a core melt accident. The paper describes the work programme of the project and the partners' activities. Significant progress has been made in the experimental area, where test series in medium- and large-scale facilities were carried out with the focus on specific effects of scale, multi-compartment geometry, heat losses and venting. The data were used for the validation of the partners' CFD and lumped-parameter codes, which included blind predictive calculations and pre- and post-test intercomparison exercises. Finally, a benchmark exercise was conducted by applying the codes to the full-scale analysis of a hydrogen combustion scenario. The comparison and assessment of the results of the validation phase and of the challenging containment calculation exercise give deep insight into the quality, capabilities and limits of the CFD and lumped-parameter tools currently in use at various research laboratories.

  6. Analytical validation of the CACECO containment analysis code

    International Nuclear Information System (INIS)

    Peak, R.D.

    1979-08-01

The CACECO containment analysis code was developed to predict the thermodynamic responses of LMFBR containment facilities to a variety of accidents. This report covers the verification of the CACECO code against problems that can be solved by hand calculations or by reference to textbook and literature examples. The verification concentrates on the accuracy of the material and energy balances maintained by the code and on the independence of the four cells analyzed by the code, so that the user can be assured that the code analyses are numerically correct and independent of the organization of the input data submitted to the code.

  7. Analysis of a database in order to validate a computer code; Analyse d'une base de donnees pour la calibration d'un code de calcul

    Energy Technology Data Exchange (ETDEWEB)

    Feuillard, V

    2007-05-15

This thesis takes place in the general context of calibration for industrial applications. Calibration consists in assessing the values of the parameters of a model in order to best simulate reality. This work aims at evaluating the quality of a database by checking that the data, with respect to our objectives, 'best fill' the space. It provides a synthesis of algorithmic and mathematical tools to achieve this purpose. Extraction and importation techniques to improve the global quality of the data are proposed; these methods allow the identification of certain defects in the data structure. An application is presented in the context of functional estimation with orthogonal functions. (author)

  8. Validation of the 3D finite element transport theory code EVENT for shielding applications

    International Nuclear Information System (INIS)

    Warner, Paul; Oliveira, R.E. de

    2000-01-01

This paper is concerned with the validation of the 3D deterministic neutral-particle transport theory code EVENT for shielding applications. The code is based on the finite element-spherical harmonics (FE-P{sub N}) method, which has been extensively developed over the last decade. A general multi-group, anisotropic scattering formalism enables the code to address realistic steady-state and time-dependent, multi-dimensional coupled neutron/gamma radiation transport problems involving high scattering and deep penetration alike. The powerful geometrical flexibility and competitive computational effort make the code an attractive tool for shielding applications. In recognition of this, EVENT is currently in the process of being adopted by the UK nuclear industry. The theory behind EVENT is described and its numerical implementation is outlined. Numerical results obtained by the code are compared with predictions of the Monte Carlo code MCBEND and also with the results from benchmark shielding experiments. In particular, results are presented for the ASPIS experimental configuration, for both neutron and gamma ray calculations, using the BUGLE 96 nuclear data library. (author)
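For context, the P{sub N} part of the FE-P{sub N} method refers to the standard truncated spherical-harmonics expansion of the angular flux, with the finite element mesh carrying the spatial moments:

```latex
\psi(\mathbf{r},\boldsymbol{\Omega}) \;\approx\; \sum_{n=0}^{N}\,\sum_{m=-n}^{n} \phi_n^m(\mathbf{r})\, Y_n^m(\boldsymbol{\Omega})
```

Here \(\psi\) is the angular flux, \(Y_n^m\) are the spherical harmonics, and the moments \(\phi_n^m(\mathbf{r})\) are the unknowns discretized on the finite element mesh; increasing the truncation order N improves the angular resolution needed for the high-scattering, deep-penetration problems the abstract mentions.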

  9. Sandia National Laboratories analysis code data base

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, C.W.

    1994-11-01

Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code "ownership" and release status, and references describing the physical models and numerical implementation.

  10. Sandia National Laboratories analysis code data base

    Science.gov (United States)

    Peterson, C. W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.

  11. Validation and uncertainty analysis of the Athlet thermal-hydraulic computer code

    International Nuclear Information System (INIS)

    Glaeser, H.

    1995-01-01

The computer code ATHLET is being developed by GRS as an advanced best-estimate code for the simulation of breaks and transients in Pressurized Water Reactors (PWRs) and Boiling Water Reactors (BWRs), including beyond-design-basis accidents. A systematic validation of ATHLET is based on a well-balanced set of integral and separate-effects tests emphasizing the German combined Emergency Core Cooling (ECC) injection system. When best-estimate codes are used for predictions of reactor plant states during assumed accidents, quantification of the uncertainty in these calculations is highly desirable. A method for uncertainty and sensitivity evaluation has been developed by GRS in which the computational effort is independent of the number of uncertain parameters. (author)

  12. Adaption, validation and application of advanced codes with 3-dimensional neutron kinetics for accident analysis calculations - STC with Bulgaria

    International Nuclear Information System (INIS)

    Grundmann, U.; Kliem, S.; Mittag, S.; Rohde, U.; Seidel, A.; Panayotov, D.; Ilieva, B.

    2001-08-01

In the frame of a project on scientific-technical co-operation funded by BMBF/BMWi, the program code DYN3D and the coupled code ATHLET-DYN3D have been transferred to the Institute for Nuclear Research and Nuclear Energy (INRNE) Sofia. The coupled code represents an implementation of the 3D core model DYN3D, developed by FZR, into the GRS thermal-hydraulics code system ATHLET. For the validation of these codes, a measurement data base for a start-up experiment at unit 6 of Kozloduy NPP (VVER-1000/V-320) has been generated. The results of the validation calculations were compared with measured values from the data base. A simplified model for estimating cross-flow mixing between fuel assemblies has been implemented in DYN3D by Bulgarian experts. Using this cross-flow model, transient processes with asymmetrical boundary conditions can be analysed more realistically. The validation of the implemented model was performed by means of comparison calculations between the modified DYN3D code and the thermal-hydraulics code COBRA-4I, and also on the basis of the measurement data collected from Kozloduy NPP. (orig.) [de]

  13. QC-LDPC code-based cryptography

    CERN Document Server

    Baldi, Marco

    2014-01-01

    This book describes the fundamentals of cryptographic primitives based on quasi-cyclic low-density parity-check (QC-LDPC) codes, with a special focus on the use of these codes in public-key cryptosystems derived from the McEliece and Niederreiter schemes. In the first part of the book, the main characteristics of QC-LDPC codes are reviewed, and several techniques for their design are presented, while tools for assessing the error correction performance of these codes are also described. Some families of QC-LDPC codes that are best suited for use in cryptography are also presented. The second part of the book focuses on the McEliece and Niederreiter cryptosystems, both in their original forms and in some subsequent variants. The applicability of QC-LDPC codes in these frameworks is investigated by means of theoretical analyses and numerical tools, in order to assess their benefits and drawbacks in terms of system efficiency and security. Several examples of QC-LDPC code-based public key cryptosystems are prese...

  14. Validity of International Classification of Diseases (ICD) coding for dengue infections in hospital discharge records in Malaysia.

    Science.gov (United States)

    Woon, Yuan-Liang; Lee, Keng-Yee; Mohd Anuar, Siti Fatimah Zahra; Goh, Pik-Pin; Lim, Teck-Onn

    2018-04-20

    Hospitalization due to dengue illness is an important measure of dengue morbidity. However, few studies are based on administrative databases, because the validity of the diagnosis codes is unknown. We validated the International Classification of Diseases, 10th revision (ICD) diagnosis coding for dengue infections in the Malaysian Ministry of Health's (MOH) hospital discharge database. This validation study involved a retrospective review of available hospital discharge records and hand-searching of medical records for the years 2010 and 2013. We randomly selected 3219 hospital discharge records coded with dengue and non-dengue infections as their discharge diagnoses from the national hospital discharge database. We then randomly sampled 216 and 144 records for patients with and without codes for dengue, respectively, in keeping with their relative frequency in the MOH database, for chart review. The ICD codes for dengue were validated against a laboratory-based diagnostic standard (NS1 or IgM). The ICD-10-CM codes for dengue had a sensitivity of 94%, a modest specificity of 83%, a positive predictive value of 87% and a negative predictive value of 92%. These results were stable between 2010 and 2013. However, specificity decreased substantially when patients presented with bleeding or a low platelet count. The diagnostic performance of the ICD codes for dengue in the MOH's hospital discharge database is adequate for use in health services research on dengue.
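    The four reported metrics follow from a standard 2x2 table of ICD coding versus the laboratory standard. A minimal sketch, with invented cell counts (the abstract does not give the raw counts):

```python
# Sketch: the standard 2x2 validation metrics reported in the abstract
# (sensitivity, specificity, PPV, NPV). The counts below are invented
# for illustration; the study's raw cell counts are not given here.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # coded dengue among lab-confirmed dengue
        "specificity": tn / (tn + fp),   # not coded among lab-negative cases
        "ppv": tp / (tp + fp),           # lab-confirmed among coded dengue
        "npv": tn / (tn + fn),           # lab-negative among records not coded
    }

m = diagnostic_metrics(tp=94, fp=14, fn=6, tn=86)
for name, value in m.items():
    print(f"{name}: {value:.0%}")
```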

  15. Study of the microstructure of neutron irradiated beryllium for the validation of the ANFIBE code

    International Nuclear Information System (INIS)

    Rabaglino, E.; Ferrero, C.; Reimann, J.; Ronchi, C.; Schulenberg, T.

    2002-01-01

    The behaviour of beryllium under fast neutron irradiation is a key issue for the helium cooled pebble bed tritium breeding blanket, due to the production of large quantities of helium and of a non-negligible amount of tritium. To optimise the design, a reliable prediction of swelling due to helium bubbles and of tritium inventory during normal and off-normal operation of a fusion power reactor is needed. The ANFIBE code (ANalysis of Fusion Irradiated BEryllium) is being developed to meet this need. The code has to be applied over a range of irradiation conditions where no experimental data are available; therefore a detailed gas kinetics model and a specific, particularly careful validation strategy are needed. The validation procedure for the first version of the code was based on macroscopic data of swelling and tritium release. This approach is, however, incomplete, since a verification of the microscopic behaviour of the gas in the metal is necessary to obtain a reliable description of swelling. This paper discusses a general strategy for a thorough validation of the gas kinetics models in ANFIBE. The microstructure characterisation of weakly irradiated beryllium pebbles, using different visual examination techniques, is then presented as an example of the application of this strategy. In particular, the advantage of developing 3D techniques, such as X-ray microtomography, is demonstrated.

  16. ESE a 2D compressible multiphase flow code developed for MFCI analysis - code validation

    International Nuclear Information System (INIS)

    Leskovar, M.; Mavko, B.

    1998-01-01

    ESE (Evaluation of Steam Explosions) is a general, second-order accurate, two-dimensional compressible multiphase flow computer code. It has been developed to model the interaction of molten core debris with water during the first premixing stage of a steam explosion. A steam explosion is a physical event which may occur during a severe reactor accident following core meltdown, when the molten fuel comes into contact with the coolant water. Since the exchanges of mass, momentum and energy are regime dependent, different exchange laws have been incorporated in ESE for the major flow regimes. With ESE, a number of premixing experiments performed at Oxford University and at the QUEOS facility at Forschungszentrum Karlsruhe have been simulated. In these premixing experiments, different jets of spheres were injected into a water pool. The ESE validation plan was carefully chosen, starting from very simple, well-defined problems and gradually working up to more complicated ones. The results of the ESE simulations, which were compared with experimental data and also with first-order accurate calculations, are presented in the form of graphs. Most of the ESE results agree qualitatively as well as quantitatively reasonably well with the experimental data, and in general better than the results obtained with the first-order accurate calculations. (author)

  17. Use of operational data for the validation of the SOPHT thermal-hydraulic code

    Energy Technology Data Exchange (ETDEWEB)

    Ho, S F; Martin, G; Shoukas, L; Siddiqui, Z; Phillips, B [Ontario Hydro, Bowmanville, ON (Canada). Darlington Nuclear Generating Station

    1996-12-31

    The primary objective of this paper is to describe the validation process of the SOPHT and MINI-SOPHT codes using reactor operational data. The secondary objective is to illustrate the effectiveness of the code as a performance monitoring tool by discussing the discoveries that were made during the validation process. (author). 2 refs.

  18. Validation of SCALE code package on high performance neutron shields

    International Nuclear Information System (INIS)

    Bace, M.; Jecmenica, R.; Smuc, T.

    1999-01-01

    The shielding ability and other properties of new high-performance neutron shielding materials from the KRAFTON series have recently been published. We compared the published experimental and MCNP results for two materials of the KRAFTON series with our own calculations. Two control modules of the SCALE-4.4 code system have been used, one based on one-dimensional radiation transport analysis (SAS1) and the other based on the three-dimensional Monte Carlo method (SAS3). The comparison of the calculated neutron dose equivalent rates shows good agreement between experimental and calculated results for the KRAFTON-N2 material. Our results indicate that the N2-M-N2 sandwich type is approximately 10% inferior as a neutron shield to the KRAFTON-N2 material. All neutron dose equivalent values obtained with SAS1 are approximately 25% lower than the SAS3 results, which indicates the magnitude of the discrepancies introduced by the one-dimensional geometry approximation. (author)

  19. Coding Transparency in Object-Based Video

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2006-01-01

    A novel algorithm for coding gray level alpha planes in object-based video is presented. The scheme is based on segmentation in multiple layers. Different coders are specifically designed for each layer. In order to reduce the bit rate, cross-layer redundancies as well as temporal correlation are...

  20. Improvements and validation of the transient analysis code MOREL for molten salt reactors

    International Nuclear Information System (INIS)

    Zhuang Kun; Zheng Youqi; Cao Liangzhi; Hu Tianliang; Wu Hongchun

    2017-01-01

    The liquid fuel salt used in molten salt reactors (MSRs) serves as fuel and coolant simultaneously. On the one hand, the delayed neutron precursors circulate in the whole primary loop, and part of them decay outside the core. On the other hand, the fission heat is carried off directly by the fuel flow. These two features require a new analysis method coupling fluid flow, heat transfer and neutronics. In this paper, the recent update of the MOREL code is presented. The update includes: (1) an improved quasi-static method for the kinetics equation with convection term; (2) a multi-channel thermal-hydraulic model based on the geometric features of MSRs; (3) the Variational Nodal Method for solving the neutron diffusion equation, replacing the original analytic basis function expansion nodal method. The update significantly improves the efficiency of the MOREL code and extends its capability to real core simulation with feedback. Numerical results and experimental data from the Molten Salt Reactor Experiment (MSRE) are used to verify and validate the updated code. The results agree well with the experimental data, proving that the new development of the MOREL code is correct and effective. (author)

  1. Thermal-hydraulic codes validation for safety analysis of NPPs with RBMK

    International Nuclear Information System (INIS)

    Brus, N.A.; Ioussoupov, O.E.

    2000-01-01

    This work is devoted to the validation of western thermal-hydraulic codes (RELAP5/MOD3.2 and ATHLET 1.1 Cycle C) as applied to Russian-designed light water reactors. Such validation is needed because of the differences in reactor design and thermal-hydraulics between the RBMK and the PWR and BWR reactors for which these codes were developed and validated. The validation studies conclude with a comparison of the results calculated with the thermal-hydraulic codes against experimental data obtained earlier at thermal-hydraulic test facilities. (authors)

  2. Development and validation of a fuel performance analysis code

    International Nuclear Information System (INIS)

    Majalee, Aaditya V.; Chaturvedi, S.

    2015-01-01

    CAD has been developing a computer code, 'FRAVIZ', for the calculation of the steady-state thermomechanical behaviour of nuclear reactor fuel rods. It contains four major modules: a Thermal module, a Fission Gas Release module, a Material Properties module and a Mechanical module. All four modules are coupled, and feedback from each module is fed to the others to obtain a self-consistent evolution in time. The code has been checked against two FUMEX benchmarks. Modelling fuel performance in the Advanced Heavy Water Reactor will require additional inputs related to the fuel and some modifications to the code. (author)

  3. Optimal interference code based on machine learning

    Science.gov (United States)

    Qian, Ye; Chen, Qian; Hu, Xiaobo; Cao, Ercong; Qian, Weixian; Gu, Guohua

    2016-10-01

    In this paper, we analyze the characteristics of pseudo-random codes, using the m-sequence as a case study. Building on coding theory, we introduce the jamming methods and simulate the interference effect and its probability model in MATLAB. Based on the length of decoding time the adversary spends, we derive the optimal formula and optimal coefficients using machine learning, and thereby obtain a new optimal interference code. First, in the recognition phase, this study judges the effect of interference by simulating the decoding time of the laser seeker. Then, we use laser active deception jamming to simulate the interference process in the tracking phase. To improve the performance of the interference, the model is simulated in MATLAB. We determine the least number of pulse intervals that must be received, from which the precise interval number for m-sequence encoding of the laser pointer can be concluded. To find the shortest interval, we use the greatest common divisor method. Then, combining this with the coding regularity found earlier, we restore the pulse intervals of the pseudo-random code that has already been received. Finally, we can control the time period of laser interference, obtain the optimal interference code, and increase the probability of successful interference.
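    An m-sequence of the kind analysed here is generated by a linear-feedback shift register (LFSR). A minimal sketch, assuming a 4-bit register with the primitive polynomial x^4 + x + 1 (an illustrative choice; the paper's actual code parameters are not given):

```python
# Sketch: generating one period of an m-sequence with a Fibonacci LFSR.
# Taps [0, 3] correspond to the primitive polynomial x^4 + x + 1, an
# assumed textbook example, not the parameters used in the paper.

def m_sequence(taps, state):
    """Yield one full period of the LFSR output sequence."""
    n = len(state)
    period = 2 ** n - 1          # maximal period for a primitive polynomial
    out = []
    for _ in range(period):
        out.append(state[-1])    # output the oldest stage
        fb = 0
        for t in taps:           # feedback = XOR of tapped stages
            fb ^= state[t]
        state = [fb] + state[:-1]  # shift and insert feedback
    return out

seq = m_sequence(taps=[0, 3], state=[1, 0, 0, 0])
print(len(seq), sum(seq))  # period 15; an m-sequence of degree 4 has 8 ones
```

    The balance property printed above (2^(n-1) ones per period) is one of the pseudo-random characteristics the paper exploits.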

  4. Indications for spine surgery: validation of an administrative coding algorithm to classify degenerative diagnoses

    Science.gov (United States)

    Lurie, Jon D.; Tosteson, Anna N.A.; Deyo, Richard A.; Tosteson, Tor; Weinstein, James; Mirza, Sohail K.

    2014-01-01

    Study Design Retrospective analysis of Medicare claims linked to a multi-center clinical trial. Objective The Spine Patient Outcomes Research Trial (SPORT) provided a unique opportunity to examine the validity of a claims-based algorithm for grouping patients by surgical indication. SPORT enrolled patients for lumbar disc herniation, spinal stenosis, and degenerative spondylolisthesis. We compared the surgical indication derived from Medicare claims to that provided by SPORT surgeons, the “gold standard”. Summary of Background Data Administrative data are frequently used to report procedure rates, surgical safety outcomes, and costs in the management of spinal surgery. However, the accuracy of using diagnosis codes to classify patients by surgical indication has not been examined. Methods Medicare claims were linked to beneficiaries enrolled in SPORT. The sensitivity and specificity of three claims-based approaches to grouping patients by surgical indication were examined: 1) using the first listed diagnosis; 2) using all diagnoses independently; and 3) using a diagnosis hierarchy based on the support for fusion surgery. Results Medicare claims were obtained from 376 SPORT participants, including 21 with disc herniation, 183 with spinal stenosis, and 172 with degenerative spondylolisthesis. The hierarchical coding algorithm was the most accurate approach for classifying patients by surgical indication, with sensitivities of 76.2%, 88.1%, and 84.3% for the disc herniation, spinal stenosis, and degenerative spondylolisthesis cohorts, respectively. The specificity was 98.3% for disc herniation, 83.2% for spinal stenosis, and 90.7% for degenerative spondylolisthesis. Misclassifications were primarily due to codes attributing more complex pathology to the case. Conclusion Standardized approaches for using claims data to accurately group patients by surgical indication are of widespread interest. We found that a hierarchical coding approach correctly classified over 90

  5. Description and validation of ANTEO, an optimised PC code for the thermalhydraulic analysis of fuel bundles

    International Nuclear Information System (INIS)

    Cevolani, S.

    1995-01-01

    The paper describes a Personal-Computer-oriented subchannel code devoted to the steady-state thermal-hydraulic analysis of nuclear reactor fuel bundles. The development of such a code was made possible by two facts: firstly, the increase in the computing power of desktop machines; secondly, several years of experience in operating subchannel codes have shown how to simplify many of the physical models without a significant loss of accuracy. For the sake of validation, the developed code was compared with a traditional subchannel code, COBRA. The results of the comparison show very good agreement between the two codes. (author)

  6. Development of an Auto-Validation Program for MARS Code Assessments

    International Nuclear Information System (INIS)

    Lee, Young Jin; Chung, Bub Dong

    2006-01-01

    MARS (Multi-dimensional Analysis of Reactor Safety) is a best-estimate thermal-hydraulic system analysis code developed at KAERI. It is important for a thermal-hydraulic computer code to be assessed against theoretical and experimental data to verify and validate the performance and the integrity of its structure, models and correlations. The code assessment effort for a complex thermal-hydraulics code such as MARS can be tedious and time-consuming and requires a large amount of human intervention in data transfer to see the results in graphic form. Code developers produce many versions of a code during development, and each version needs to be verified for integrity. Thus, for the MARS code developers, it is desirable to have an automatic way of carrying out the code assessment calculations. In the present work, an Auto-Validation program that carries out the code assessment effort has been developed. The program uses a user-supplied configuration file (with the '.vv' extension) which contains commands to read the input file, to execute the user-selected MARS program, and to generate result graphs. The program is useful when the same set of code assessments is repeated with different versions of the code. The program is written in the Delphi programming language and runs under the Microsoft Windows environment.
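    The workflow described above (parse a configuration file, run the selected code version on each input deck, collect results) can be sketched as follows. The '.vv' directive names used here (case/exec/input) are hypothetical, since the abstract does not document the real file format of the MARS tool:

```python
# Sketch of an auto-validation driver in the spirit described above.
# The configuration directives are invented for illustration; only the
# overall parse-then-run structure mirrors the abstract.

import shlex
import subprocess

def parse_vv(text):
    """Parse 'key value' directives, ignoring blanks and '#' comments."""
    cases, current = [], {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(" ")
        if key == "case":              # a new assessment case starts
            current = {"name": value}
            cases.append(current)
        else:
            current[key] = value
    return cases

def run_cases(cases, dry_run=True):
    """Run the configured solver on each input deck (or just echo commands)."""
    commands = []
    for c in cases:
        cmd = [c["exec"], c["input"]]
        commands.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)   # real execution of the solver
    return commands

config = """
# two regression cases for one code version (names are hypothetical)
case edwards_pipe
exec ./mars_v3
input edwards.inp
case lb_test1
exec ./mars_v3
input lb1.inp
"""
for cmd in run_cases(parse_vv(config)):
    print(shlex.join(cmd))
```

    Repeating the same case list against each new code version is exactly the repetitive task the abstract says such a driver removes.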

  7. In-core fuel management code package validation for BWRs

    International Nuclear Information System (INIS)

    1995-12-01

    The main goal of the present CRP (Coordinated Research Programme) was to develop benchmarks appropriate for checking and improving fuel management computer code packages and their procedures. Benchmark specifications were therefore established which included a set of realistic data for running in-core fuel management codes. Secondly, the results of measurements and/or operating data were provided for verification and comparison with the parameters calculated by the in-core fuel management codes or code packages. For the BWR, it was established that the Mexican Laguna Verde 1 BWR would serve as the model providing the data for the benchmark specifications. It was decided to provide results for the first two cycles of Unit 1 of the Laguna Verde reactor. The analyses of the above benchmarks are performed in two stages. In the first stage, the lattice parameters are generated as a function of burnup at different void fractions, with and without the control rod. These lattice parameters form the input for three-dimensional diffusion theory codes for overall reactor analysis. The lattice calculations were performed using different methods, such as Monte Carlo, 2-D integral transport theory methods, the Supercell Model, and a transport-diffusion model with proper correction for burnable absorbers. The variety of results should thus provide adequate information for any institute or organization to develop competence in analyzing in-core fuel management codes. 15 refs, figs and tabs

  8. Wavelet based multicarrier code division multiple access ...

    African Journals Online (AJOL)

    This paper presents the study on Wavelet transform based Multicarrier Code Division Multiple Access (MC-CDMA) system for a downlink wireless channel. The performance of the system is studied for Additive White Gaussian Noise Channel (AWGN) and slowly varying multipath channels. The bit error rate (BER) versus ...

  9. Validation of coupled neutronic / thermal-hydraulic codes for VVER reactors. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Mittag, S.; Grundmann, U.; Kliem, S.; Kozmenkov, Y.; Rindelhardt, U.; Rohde, U.; Weiss, F.-P.; Langenbuch, S.; Krzykacz-Hausmann, B.; Schmidt, K.-D.; Vanttola, T.; Haemaelaeinen, A.; Kaloinen, E.; Kereszturi, A.; Hegyi, G.; Panka, I.; Hadek, J.; Strmensky, C.; Darilek, P.; Petkov, P.; Stefanova, S.; Kuchin, A.; Khalimonchuk, V.; Hlbocky, P.; Sico, D.; Danilin, S.; Ionov, V.; Nikonov, S.; Powney, D.

    2004-08-01

    In recent years, the simulation methods for the safety analysis of nuclear power plants have been continuously improved to perform realistic calculations. Therefore in VALCO work package 2 (WP 2), the usual application of coupled neutron-kinetic / thermal-hydraulic codes to VVER has been supplemented by systematic uncertainty and sensitivity analyses. A comprehensive uncertainty analysis has been carried out. The GRS uncertainty and sensitivity method based on the statistical code package SUSA was applied to the two transients studied earlier in SRR-1/95: A load drop of one turbo-generator in Loviisa-1 (VVER-440), and a switch-off of one feed water pump in Balakovo-4 (VVER-1000). The main steps of these analyses and the results obtained by applying different coupled code systems (SMABRE - HEXTRAN, ATHLET - DYN3D, ATHLET - KIKO3D, ATHLET - BIPR-8) are described in this report. The application of this method is only based on variations of input parameter values. No internal code adjustments are needed. An essential result of the analysis using the GRS SUSA methodology is the identification of the input parameters, such as the secondary-circuit pressure, the control-assembly position (as a function of time), and the control-assembly efficiency, that most sensitively affect safety-relevant output parameters, like reactor power, coolant heat-up, and primary pressure. Uncertainty bands for these output parameters have been derived. The variation of potentially uncertain input parameter values as a consequence of uncertain knowledge can activate system actions causing quite different transient evolutions. This gives indications about possible plant conditions that might be reached from the initiating event assuming only small disturbances. In this way, the uncertainty and sensitivity analysis reveals the spectrum of possible transient evolutions. 
Deviations of SRR-1/95 coupled code calculations from measurements also led to the objective to separate neutron kinetics from

  10. Validation of computer codes used in the safety analysis of Canadian research reactors

    International Nuclear Information System (INIS)

    Bishop, W.E.; Lee, A.G.

    1998-01-01

    AECL has embarked on a validation program for the suite of computer codes that it uses in performing the safety analyses for its research reactors. Current focus is on codes used for the analysis of the two MAPLE reactors under construction at Chalk River but the program will be extended to include additional codes that will be used for the Irradiation Research Facility. The program structure is similar to that used for the validation of codes used in the safety analyses for CANDU power reactors. (author)

  11. Image content authentication based on channel coding

    Science.gov (United States)

    Zhang, Fan; Xu, Lei

    2008-03-01

    Content authentication determines whether an image has been tampered with and, if necessary, locates malicious alterations made to the image. Authentication of a still image or a video is motivated by the recipient's interest, and its principle is that a receiver must be able to identify the source of the document reliably. Several techniques and concepts based on data hiding or steganography have been designed as a means of image authentication. This paper presents a color image authentication algorithm based on convolutional coding. The high bits of the color digital image are coded with convolutional codes for tamper detection and localization, and the authentication messages are hidden in the low bits of the image in order to keep the authentication invisible. All communication channels are subject to errors introduced by additive Gaussian noise in their environment. Data perturbations cannot be eliminated, but their effect can be minimized by the use of Forward Error Correction (FEC) techniques in the transmitted data stream and decoders in the receiving system that detect and correct bits in error. The message of each pixel is convolutionally encoded, and after parity check and block interleaving the redundant bits are embedded in the image offset. Tampering can be detected and restored without access to the original image.
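    The encoding step can be illustrated with a minimal rate-1/2 convolutional encoder. The generator polynomials (7, 5) in octal and constraint length 3 are a common textbook choice assumed here, not necessarily the ones used in the paper:

```python
# Sketch: a rate-1/2 convolutional encoder (two output bits per input
# bit). Generators g1 = 111, g2 = 101 (octal 7, 5) with constraint
# length K = 3 are an assumed textbook choice.

def conv_encode(bits, g1=0b111, g2=0b101):
    """Encode a bit list; returns two parity bits per input bit."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111          # slide the 3-bit window
        out.append(bin(state & g1).count("1") % 2)  # parity under g1
        out.append(bin(state & g2).count("1") % 2)  # parity under g2
    return out

print(conv_encode([1, 0, 1, 1]))  # -> [1, 1, 1, 0, 0, 0, 0, 1]
```

    The redundancy (output twice as long as the input) is what lets the receiver detect and localize corrupted authentication bits, as described above.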

  12. Phenomenological modeling of critical heat flux: The GRAMP code and its validation

    International Nuclear Information System (INIS)

    Ahmad, M.; Chandraker, D.K.; Hewitt, G.F.; Vijayan, P.K.; Walker, S.P.

    2013-01-01

    Highlights: ► Assessment of CHF limits is vital for LWR optimization and safety analysis. ► Phenomenological modeling is a valuable adjunct to pure empiricism. ► It is based on empirical representations of the (several, competing) phenomena. ► Phenomenological modeling codes making ‘aggregate’ predictions need careful assessment against experiments. ► The physical and mathematical basis of the phenomenological modeling code GRAMP is presented. ► The GRAMP code is assessed against measurements from BARC (India) and Harwell (UK), and the Look Up Tables. - Abstract: Reliable knowledge of the critical heat flux is vital for the design of light water reactors, for both safety and optimization. The use of wholly empirical correlations, or equivalently “Look Up Tables”, can be very effective, but is generally less so in more complex cases, in particular where the heat flux is axially non-uniform. Phenomenological models are in principle able to take into account a wider range of conditions, with less comprehensive coverage of experimental measurements. These models are themselves in part based upon empirical correlations, albeit of the more fundamental individual phenomena rather than the aggregate behaviour, and as such they too require experimental validation. In this paper we present the basis of a general-purpose phenomenological code, GRAMP, and then use two independent ‘direct’ sets of measurements, from BARC in India and from Harwell in the United Kingdom, together with the large dataset embodied in the Look Up Tables, to perform a validation exercise on it. Very good agreement between predictions and experimental measurements is observed, adding to the confidence with which the phenomenological model can be used. Remaining important uncertainties in the phenomenological modeling of CHF are the importance of the initial entrained fraction on entry to annular flow, and the influence of the heat flux on entrainment rate

  13. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.
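    One simple modularity indicator, the share of dependencies that stay inside a module rather than crossing module boundaries, can be sketched as follows. This is a generic illustration, not the specific metric developed at Carleton University:

```python
# Sketch: a simple modularity indicator for a code base - the fraction
# of dependency edges that are internal to a module. The file and
# module names are invented for illustration.

def modularity_ratio(deps, module_of):
    """deps: (src_file, dst_file) pairs; module_of: file -> module name."""
    internal = sum(1 for a, b in deps if module_of[a] == module_of[b])
    return internal / len(deps)

module_of = {"a.py": "core", "b.py": "core", "c.py": "ui", "d.py": "ui"}
deps = [("a.py", "b.py"), ("c.py", "d.py"), ("c.py", "a.py"), ("b.py", "a.py")]
print(modularity_ratio(deps, module_of))  # 3 of 4 edges are intra-module
```

    Tracking such a ratio across releases is one way to quantify the "evolving modularity" the article studies, though real metrics for large systems must also handle module size imbalance and transitive dependencies.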

  14. Systematic review of validated case definitions for diabetes in ICD-9-coded and ICD-10-coded data in adult populations.

    Science.gov (United States)

    Khokhar, Bushra; Jette, Nathalie; Metcalfe, Amy; Cunningham, Ceara Tess; Quan, Hude; Kaplan, Gilaad G; Butalia, Sonia; Rabi, Doreen

    2016-08-05

    With steady increases in 'big data' and data analytics over the past two decades, administrative health databases have become more accessible and are now used regularly for diabetes surveillance. The objective of this study is to systematically review validated International Classification of Diseases (ICD)-based case definitions for diabetes in the adult population. The electronic databases MEDLINE and Embase were searched for validation studies in which an administrative case definition (using ICD codes) for diabetes in adults was validated against a reference standard and statistical measures of performance were reported. The search yielded 2895 abstracts, and of the 193 potentially relevant studies, 16 met the criteria. Diabetes definitions for adults varied by data source, including physician claims (sensitivity ranged from 26.9% to 97%, specificity from 94.3% to 99.4%, positive predictive value (PPV) from 71.4% to 96.2%, negative predictive value (NPV) from 95% to 99.6% and κ from 0.8 to 0.9), hospital discharge data (sensitivity ranged from 59.1% to 92.6%, specificity from 95.5% to 99%, PPV from 62.5% to 96%, NPV from 90.8% to 99% and κ from 0.6 to 0.9) and a combination of both (sensitivity ranged from 57% to 95.6%, specificity from 88% to 98.5%, PPV from 54% to 80%, NPV from 98% to 99.6% and κ from 0.7 to 0.8). Overall, administrative health databases are useful for undertaking diabetes surveillance, but an awareness of the variation in performance caused by the case definition is essential. The performance characteristics of these case definitions depend on the variations in the definition of primary diagnosis in ICD-coded discharge data and/or the methodology adopted by the healthcare facility to extract information from patient records.
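    Cohen's kappa, reported above alongside sensitivity and specificity, corrects observed agreement between the administrative definition and the reference standard for the agreement expected by chance. A minimal sketch with invented counts:

```python
# Sketch: Cohen's kappa from a 2x2 agreement table. The cell counts
# below are invented for illustration, not taken from any of the 16
# included studies.

def cohens_kappa(a, b, c, d):
    """a: both positive, b: ref pos/admin neg, c: ref neg/admin pos, d: both negative."""
    n = a + b + c + d
    p_obs = (a + d) / n                                   # observed agreement
    # chance agreement expected from the marginal totals
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

print(round(cohens_kappa(a=90, b=10, c=10, d=90), 3))  # -> 0.8
```

    Values around 0.8, as in the physician-claims definitions above, indicate substantial agreement beyond chance.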

  15. Base Flow Model Validation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  16. First steps towards a validation of the new burnup and depletion code TNT

    Energy Technology Data Exchange (ETDEWEB)

    Herber, S.C.; Allelein, H.J. [RWTH Aachen (Germany). Inst. for Reactor Safety and Reactor Technology; Research Center Juelich (Germany). Inst. for Energy and Climate Research - Nuclear Waste Disposal and Reactor Safety (IEK-6); Friege, N. [RWTH Aachen (Germany). Inst. for Reactor Safety and Reactor Technology; Kasselmann, S. [Research Center Juelich (Germany). Inst. for Energy and Climate Research - Nuclear Waste Disposal and Reactor Safety (IEK-6)

    2012-11-01

    In the frame of the fusion of the core design calculation capabilities, represented by V.S.O.P., and the accident calculation capabilities, represented by MGT(-3D), the successor of the TINTE code, difficulties were observed in defining an interface between a program backbone and the ORIGEN and ORIGENJUEL codes. Estimating the effort of refactoring the ORIGEN code against that of writing a new burnup code from scratch led to the decision that it would be more efficient to write a new code, which could benefit from existing programming and software engineering tools and which can use the latest knowledge of nuclear reactions, e.g. consider all documented reaction channels. Therefore, a new code with an object-oriented approach was developed at IEK-6. Object-oriented programming is currently state of the art and generally provides improved extensibility and maintainability. The new code was named TNT, which stands for Topological Nuclide Transformation, since the code makes use of the real topology of the nuclear reactions. Here we present some first validation results from code-to-code benchmarks against ORIGEN V2.2 and FISPACT2005, with analytical results also used for the comparison whenever possible. The two reference codes were chosen due to their high reputation in the fields of fission reactor analysis (ORIGEN) and fusion facilities (FISPACT). (orig.)
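    The analytical results used in such benchmarks typically come from closed-form Bateman solutions for simple decay chains. A minimal sketch, comparing explicit Euler integration of a two-nuclide chain with the analytic result (decay constants are arbitrary illustrative values, not data from TNT or ORIGEN):

```python
# Sketch of an analytical benchmark: a two-nuclide decay chain
# A -> B -> (stable), integrated numerically and compared with the
# closed-form Bateman solution. All numbers are illustrative.

import math

la, lb = 0.5, 0.2          # decay constants of A and B [1/s], assumed values
n0 = 1000.0                # initial atoms of A; B starts empty

def bateman_b(t):
    """Closed-form population of nuclide B at time t."""
    return n0 * la / (lb - la) * (math.exp(-la * t) - math.exp(-lb * t))

# explicit Euler integration of dA/dt = -la*A, dB/dt = la*A - lb*B
a, b, dt = n0, 0.0, 1e-4
for _ in range(int(5.0 / dt)):
    a, b = a - la * a * dt, b + (la * a - lb * b) * dt

print(round(b, 1), round(bateman_b(5.0), 1))  # numerical vs analytic B(5 s)
```

    A burnup code generalizes this to thousands of coupled nuclides, which is why code-to-code comparison supplements the few chains that admit closed-form checks.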

  17. Toward a CFD nose-to-tail capability - Hypersonic unsteady Navier-Stokes code validation

    Science.gov (United States)

    Edwards, Thomas A.; Flores, Jolen

    1989-01-01

    Computational fluid dynamics (CFD) research for hypersonic flows presents new problems in code validation because of the added complexity of the physical models. This paper surveys code validation procedures applicable to hypersonic flow models that include real gas effects. The current status of hypersonic CFD flow analysis is assessed with the Compressible Navier-Stokes (CNS) code as a case study. The methods of code validation discussed go beyond comparison with experimental data to include comparisons with other codes and formulations, component analyses, and estimation of numerical errors. Current results indicate that the prediction of hypersonic flows of perfect gases and equilibrium air is well in hand. Pressure, shock location, and integrated quantities are relatively easy to predict accurately, while surface quantities such as heat transfer are more sensitive to the solution procedure. Modeling transition to turbulence needs refinement, though preliminary results are promising.
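
One standard way to obtain the numerical-error estimates mentioned above is Richardson extrapolation from solutions on systematically refined grids. The sketch below uses made-up values for an integrated quantity computed on three grids with refinement ratio 2; all numbers are illustrative, not CNS results.

```python
import math

# Hypothetical integrated quantity (e.g. a drag coefficient) on three grids
# with constant refinement ratio r; values are invented for the sketch.
f_coarse, f_medium, f_fine = 1.1200, 1.0450, 1.0263
r = 2.0  # grid refinement ratio

# Observed order of accuracy from the three solutions (expect ~2 here)
p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Richardson-extrapolated (grid-converged) estimate and fine-grid error
f_exact = f_fine + (f_fine - f_medium) / (r**p - 1)
err_fine = abs(f_fine - f_exact)

print(round(p, 2), round(f_exact, 4), round(err_fine, 4))
```

The extrapolated value serves as a surrogate "exact" solution, so the discretization error on each grid can be quantified without experimental data.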

  18. Validation of computer code TRAFIC used for estimation of charcoal heatup in containment ventilation systems

    International Nuclear Information System (INIS)

    Yadav, D.H.; Datta, D.; Malhotra, P.K.; Ghadge, S.G.; Bajaj, S.S.

    2005-01-01

    Full text of publication follows: Standard Indian PHWRs are provided with a Primary Containment Filtration and Pump-Back System (PCFPB) incorporating charcoal filters in the ventilation circuit to remove radioactive iodine that may be released from the reactor core into the containment during a LOCA with ECCS failure, which is a Design Basis Accident for containment of radioactive release. This system is provided with two identical air circulation loops, each having 2 full capacity fans (1 operating and 1 standby) for a bank of four combined charcoal and High Efficiency Particulate Air (HEPA) filters, in addition to other filters. While the filtration circuit is designed to operate under forced flow conditions, it is of interest to understand the performance of the charcoal filters in the event of failure of the fans after operating for some time, i.e., when the radio-iodine inventory is at its peak value, and to check whether the buoyancy-driven natural circulation occurring in the filtration circuit is sufficient to keep the temperature in the charcoal under safe limits. A computer code TRAFIC (Transient Analysis of Filters in Containment) was developed using a conservative one-dimensional model to analyze the system. Suitable parametric studies were carried out to understand the problem and to assess the safety of the existing system. The TRAFIC code has two important components. The first estimates the heat generation in the charcoal filter based on the 'Source Term', while the other performs thermal-hydraulic computations. In an attempt to validate the code, experimental studies have been carried out. For this purpose, an experimental set up comprising a scaled-down model of the filtration circuit, with heating coils embedded in charcoal to simulate the heating effect due to radio-iodine, has been constructed. The present work of validation consists of utilizing the results obtained from experiments conducted for different heat loads, elevations and adsorbent

  19. Validation of the THIRMAL-1 melt-water interaction code

    Energy Technology Data Exchange (ETDEWEB)

    Chu, C.C.; Sienicki, J.J.; Spencer, B.W. [Argonne National Lab., IL (United States)]

    1995-09-01

    The THIRMAL-1 computer code has been used to calculate nonexplosive LWR melt-water interactions both in-vessel and ex-vessel. To support the application of the code and enhance its acceptability, THIRMAL-1 has been compared with available data from two of the ongoing FARO experiments at Ispra and two of the Corium Coolant Mixing (CCM) experiments performed at Argonne. THIRMAL-1 calculations for the FARO Scoping Test and Quenching Test 2, as well as the CCM-5 and -6 experiments, were found to be in excellent agreement with the experimental results. This lends confidence to the modeling that has been incorporated in the code describing melt stream breakup due to the growth of both Kelvin-Helmholtz and large wave instabilities, the sizes of droplets formed, and multiphase flow and heat transfer in the mixing zone surrounding and below the melt metallic phase. As part of the analysis of the FARO tests, a mechanistic model was developed to calculate the prefragmentation as it may have occurred when melt relocated from the release vessel to the water surface, and the model predictions were compared with the relevant data from FARO.

  20. Study of experimental validation for combustion analysis of GOTHIC code

    International Nuclear Information System (INIS)

    Lee, J. Y.; Yang, S. Y.; Park, K. C.; Jeong, S. H.

    2001-01-01

    This study presents lumped-parameter and subdivided GOTHIC6 code analyses of a premixed hydrogen combustion experiment at Seoul National University and compares them with the experimental results. The experimental facility has a free volume of 16367 cc and a rectangular shape. The tests were performed with a unit equivalence ratio of hydrogen and air and with various igniter positions. Using the lumped and mechanistic combustion models in the GOTHIC6 code, the experiments were simulated under the same conditions. In the comparison, the GOTHIC6 prediction of the combustion response does not agree well with the experimental results. In terms of combustion time, the lumped combustion model of the GOTHIC6 code does not simulate the physical phenomena of combustion appropriately. With the mechanistic combustion model, the combustion time is predicted well, but the calculated induction time is considerably longer than the experimental one. Also, the laminar combustion model of GOTHIC6 cannot simulate the combustion phenomena unless a user-defined value is adjusted appropriately. Finally, pressure is not a suitable variable for characterizing the three-dimensional effects of combustion

  1. Validation Study of CODES Dragonfly Network Model with Theta Cray XC System

    Energy Technology Data Exchange (ETDEWEB)

    Mubarak, Misbah [Argonne National Lab. (ANL), Argonne, IL (United States); Ross, Robert B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-05-31

    This technical report describes the experiments performed to validate the MPI performance measurements reported by the CODES dragonfly network simulation with the Theta Cray XC system at the Argonne Leadership Computing Facility (ALCF).

  2. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    Energy Technology Data Exchange (ETDEWEB)

    Bravenec, Ronald [Fourth State Research, Austin, TX (United States)

    2017-11-14

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  3. Numerical verification/validation of the theory of coupled reactors for deuterium critical assembly, using MCNP5 and Serpent codes

    International Nuclear Information System (INIS)

    Hussein, M.S; Lewis, B.J.; Bonin, H.W.

    2013-01-01

    The theory of multipoint coupled reactors developed by multi-group transport is verified by using the probabilistic transport code MCNP5 and the continuous-energy Monte Carlo reactor physics burnup calculation Serpent code. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly, DCA. The multiplication factors k_eff calculated numerically and independently from simulations of the DCA by MCNP5 and Serpent codes are compared with the multiplication factors k_eff calculated based on the coupled reactor theory. Excellent agreement was obtained between the multiplication factors k_eff calculated with the Serpent code, with MCNP5, and from the coupled reactor theory. This analysis demonstrates that the Serpent code is valid for the multipoint coupled reactor calculations. (author)

  4. Numerical verification/validation of the theory of coupled reactors for deuterium critical assembly, using MCNP5 and Serpent codes

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, M.S, E-mail: mohamed.hussein@rmc.ca [Royal Military College of Canada, Dept. of Chemistry and Chemical Engineering, Kingston, Ontario (Canada); Lewis, B.J., E-mail: Brent.Lewis@uoit.ca [Univ. of Ontario Inst. of Technology, Faculty of Energy Systems and Nuclear Science, Oshawa, Ontario (Canada); Bonin, H.W., E-mail: bonin-h@rmc.ca [Royal Military College of Canada, Dept. of Chemistry and Chemical Engineering, Kingston, Ontario (Canada)

    2013-07-01

    The theory of multipoint coupled reactors developed by multi-group transport is verified by using the probabilistic transport code MCNP5 and the continuous-energy Monte Carlo reactor physics burnup calculation Serpent code. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly, DCA. The multiplication factors k_eff calculated numerically and independently from simulations of the DCA by MCNP5 and Serpent codes are compared with the multiplication factors k_eff calculated based on the coupled reactor theory. Excellent agreement was obtained between the multiplication factors k_eff calculated with the Serpent code, with MCNP5, and from the coupled reactor theory. This analysis demonstrates that the Serpent code is valid for the multipoint coupled reactor calculations. (author)
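
The comparison described in these records can be sketched numerically: in multipoint coupled-reactor theory the system multiplication factor is the dominant eigenvalue of the region-to-region coupling matrix. The 2x2 matrix below is invented for illustration, not DCA data.

```python
import numpy as np

# Hypothetical two-region coupling matrix: K[i, j] is the number of fission
# neutrons produced in region i per fission neutron born in region j.
# Values are illustrative only.
K = np.array([[0.95, 0.04],
              [0.04, 0.95]])

# The coupled-system multiplication factor is the dominant eigenvalue of K
k_eff = max(np.linalg.eigvals(K).real)

# For this symmetric two-point system the dominant eigenvalue reduces to
# the diagonal term plus the coupling term: k1 + k12
k_analytic = 0.95 + 0.04

print(abs(k_eff - k_analytic) < 1e-12)
```

A Monte Carlo code supplies the k_ij terms (or region-wise k values and coupling coefficients), and the eigenvalue of the coupling matrix is then compared with the directly simulated k_eff of the full system.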

  5. Validation study of computer code SPHINCS for sodium fire safety evaluation of fast reactor

    International Nuclear Information System (INIS)

    Yamaguchi, Akira; Tajima, Yuji

    2003-01-01

    A computer code SPHINCS solves the coupled phenomena of thermal hydraulics and sodium fire based on a multi-zone model. It deals with an arbitrary number of rooms, each of which is connected to the others by doorways and penetrations. With regard to the combustion phenomena, a flame sheet model and a liquid droplet combustion model are used for pool and spray fires, respectively, with the chemical equilibrium model based on the Gibbs free energy minimization method. The chemical reaction and mass and heat transfer are solved interactively. A specific feature of SPHINCS is its detailed representation of the thermalhydraulics of a sodium pool and a steel liner, which is placed on the floor to prevent sodium-concrete contact. The authors analyzed a series of pool combustion experiments in which gas and liner temperatures were measured in detail. Good agreement was obtained, and the SPHINCS code has thereby been validated with regard to pool combustion phenomena. Further research needs are identified for pool-spreading modeling that considers thermal deformation of the steel liner, and for measurement of the fluidity of the pool as a mixture of liquid sodium and reaction products. The SPHINCS code is to be used mainly in the safety evaluation of the consequences of a sodium fire accident in a liquid metal cooled fast reactor, as well as in fire safety analysis in general

  6. Validation of the DRAGON/DONJON code package for MNR using the IAEA 10 MW benchmark problem

    International Nuclear Information System (INIS)

    Day, S.E.; Garland, W.J.

    2000-01-01

    The first step in developing a framework for reactor physics analysis is to establish appropriate and proven reactor physics codes. The chosen code package is tested by executing a benchmark problem and comparing the results to the accepted standards. The IAEA 10 MW Benchmark problem is suitable for static reactor physics calculations on plate-fueled research reactor systems and has been used previously to validate codes for the McMaster Nuclear Reactor (MNR). The flexible and advanced geometry capabilities of the DRAGON transport theory code make it a desirable tool, and the accompanying DONJON diffusion theory code also has useful features applicable to safety analysis work at MNR. This paper describes the methodology used to benchmark the DRAGON/DONJON code package against this problem, and the results herein extend the domain of validation of this code package. The results are directly applicable to MNR and are relevant to a reduced-enrichment fuel program. The DRAGON transport code models used in this study are based on the 1-D infinite slab approximation, whereas the DONJON diffusion code models are defined in 3-D Cartesian geometry. The cores under consideration are composed of HEU (93% enrichment), MEU (45% enrichment) and LEU (20% enrichment) fuel and are examined in a fresh state, as well as at beginning-of-life (BOL) and end-of-life (EOL) exposures. The required flux plots and flux-ratio plots are included, as are transport-theory k∞ and diffusion-theory k_eff results. In addition, selected isotope atom densities are charted as a function of fuel burnup. Results from this analysis are compared to and are in good agreement with previously published results. (author)

  7. RELAP5-3D code validation for RBMK phenomena

    International Nuclear Information System (INIS)

    Fisher, J.E.

    1999-01-01

    The RELAP5-3D thermal-hydraulic code was assessed against Japanese Safety Experiment Loop (SEL) and Heat Transfer Loop (HTL) tests. These tests were chosen because the phenomena present are applicable to analyses of Russian RBMK reactor designs. The assessment cases included parallel channel flow fluctuation tests at reduced and normal water levels, a channel inlet pipe rupture test, and a high power, density wave oscillation test. The results showed that RELAP5-3D has the capability to adequately represent these RBMK-related phenomena

  8. Validation analysis of pool fire experiment (Run-F7) using SPHINCS code

    International Nuclear Information System (INIS)

    Yamaguchi, Akira; Tajima, Yuji

    1998-04-01

    SPHINCS (Sodium Fire Phenomenology IN multi-Cell System) code has been developed for the safety analysis of sodium fire accidents in a Fast Breeder Reactor. The main features of the SPHINCS code with respect to sodium pool fire phenomena are multi-dimensional modeling of the thermal behavior in the sodium pool and steel liner, modeling of the extension of the sodium pool area based on sodium mass conservation, and an equilibrium model for the chemical reaction of the pool fire on the flame sheet at the surface of the sodium pool. The SPHINCS code is therefore capable of detailed temperature evaluation of the steel liner during small and/or medium scale sodium leakage accidents. In this study, the Run-F7 experiment, in which the sodium leakage rate was 11.8 kg/hour, has been analyzed. In the experiment the diameter of the sodium pool was approximately 60 cm and the maximum steel liner temperature was 616 °C. The agreement between the SPHINCS analysis and the experiment is excellent with respect to the time history and spatial distribution of the liner temperature, the sodium pool extension behavior, and the atmosphere gas temperature. It is concluded that the pool fire modeling of the SPHINCS code has been validated for this experiment, and that the SPHINCS code is currently applicable to sodium pool fire phenomena and the temperature evaluation of the steel liner. The experiment series is being continued to examine further parameters, i.e., the sodium leakage rate and the height of sodium leakage. The author will analyze the subsequent experiments to check the influence of these parameters and apply SPHINCS to the sodium fire consequence analysis of fast reactors. (author)
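
The zone-model idea behind such codes can be caricatured by a single-zone liner energy balance. This is emphatically not the SPHINCS model (which is multi-dimensional and mechanistic); every parameter below is invented, with the steady state chosen to be of the same order as the ~616 °C liner temperature quoted above.

```python
# Toy single-zone energy balance for a steel liner heated by a pool fire.
# All parameters are invented for illustration.
q_fire = 1.5e4     # net heat flux from the fire into the liner [W/m^2]
h_loss = 25.0      # effective loss coefficient to the cell gas [W/(m^2 K)]
T_gas = 40.0       # cell gas temperature [C]
C_liner = 3.9e4    # liner heat capacity per unit area [J/(m^2 K)]

T, dt = T_gas, 1.0
for _ in range(20000):                   # explicit Euler, ~5.6 h of transient
    T += dt * (q_fire - h_loss * (T - T_gas)) / C_liner

T_steady = T_gas + q_fire / h_loss       # analytical steady state (640 C here)
print(T_gas < T < T_steady)
```

Multi-zone codes solve many such coupled balances (gas, liner, pool, walls) per cell, with combustion chemistry supplying the heat source instead of a fixed flux.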

  9. Image Coding Based on Address Vector Quantization.

    Science.gov (United States)

    Feng, Yushu

    Image coding is finding increased application in teleconferencing, archiving, and remote sensing. This thesis investigates the potential of Vector Quantization (VQ), a relatively new source coding technique, for compression of monochromatic and color images. Extensions of the Vector Quantization technique to the Address Vector Quantization method have been investigated. In Vector Quantization, the image data to be encoded are first processed to yield a set of vectors. A codeword from the codebook which best matches the input image vector is then selected. Compression is achieved by replacing the image vector with the index of the codeword which produced the best match; the index is sent over the channel. Reconstruction of the image is done by using a table-lookup technique, where the label is simply used as an address for a table containing the representative vectors. A codebook of representative vectors (codewords) is generated using an iterative clustering algorithm such as K-means, or the generalized Lloyd algorithm. A review of different Vector Quantization techniques is given in chapter 1. Chapter 2 gives an overview of codebook design methods, including the use of the Kohonen neural network to design codebooks. During the encoding process, the correlation of the addresses is considered, and Address Vector Quantization is developed for color image and monochrome image coding. Address VQ, which includes static and dynamic processes, is introduced in chapter 3. In order to overcome the problems in Hierarchical VQ, Multi-layer Address Vector Quantization is proposed in chapter 4. This approach gives the same performance as the normal VQ scheme, but the bit rate is about 1/2 to 1/3 that of the normal VQ method. In chapter 5, a Dynamic Finite State VQ, based on a probability transition matrix to select the best subcodebook to encode the image, is developed.
In chapter 6, a new adaptive vector quantization scheme, suitable for color video coding, called "A Self -Organizing
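
The basic VQ round trip described in this abstract (codebook training, nearest-codeword encoding, index transmission, table-lookup decoding) can be sketched in a few lines. Random vectors stand in for image blocks, and the codebook is trained with a generalized-Lloyd (K-means-style) loop.

```python
import numpy as np

# Random stand-ins for 4x4 image blocks flattened to 16-dim vectors
rng = np.random.default_rng(0)
vectors = rng.normal(size=(512, 16))

def train_codebook(data, k=8, iters=20):
    """Generalized Lloyd / K-means codebook design."""
    codebook = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        # nearest-codeword assignment
        d = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        # centroid update (keep the old codeword if a cell is empty)
        for j in range(k):
            if (labels == j).any():
                codebook[j] = data[labels == j].mean(0)
    return codebook

codebook = train_codebook(vectors)

# Encoding: transmit only the index of the best-matching codeword
d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
indices = d.argmin(1)

# Decoding: table lookup of representative vectors
reconstructed = codebook[indices]

print(indices.max() < len(codebook), reconstructed.shape == vectors.shape)
```

With 8 codewords, each 16-sample block is represented by a 3-bit index; Address VQ then exploits correlation between neighboring indices to lower the rate further.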

  10. Concurrent validation of an inertial measurement system to quantify kicking biomechanics in four football codes.

    Science.gov (United States)

    Blair, Stephanie; Duthie, Grant; Robertson, Sam; Hopkins, William; Ball, Kevin

    2018-05-17

    Wearable inertial measurement systems (IMS) allow for three-dimensional analysis of human movements in a sport-specific setting. This study examined the concurrent validity of an IMS (Xsens MVN system) for measuring lower extremity and pelvis kinematics in comparison to a Vicon motion analysis system (MAS) during kicking. Thirty footballers from Australian football (n = 10), soccer (n = 10), rugby league and rugby union (n = 10) clubs completed 20 kicks across four conditions. Concurrent validity was assessed using a linear mixed-modelling approach, which allowed the partition of between- and within-subject variance from the device measurement error. Results were expressed in raw and standardised units for assessments of differences in means and measurement error, and interpreted via non-clinical magnitude-based inferences. Trivial to small differences were found in linear velocities (foot and pelvis), angular velocities (knee, shank and thigh), sagittal joint (knee and hip) and segment angle (shank and pelvis) means (mean difference: 0.2-5.8%) between the IMS and MAS in Australian football, soccer and the rugby codes. Trivial to small measurement errors (from 0.1 to 5.8%) were found between the IMS and MAS in all kinematic parameters. The IMS demonstrated acceptable levels of concurrent validity compared to a MAS when measuring kicking biomechanics across the four football codes. Wearable IMS offers various benefits over MAS, such as out-of-laboratory testing, a larger measurement range and quick data output, to help improve the ecological validity of biomechanical testing and the timing of feedback. The results advocate the use of IMS to quantify biomechanics of high-velocity movements in sport-specific settings. Copyright © 2018 Elsevier Ltd. All rights reserved.
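
A stripped-down version of such a concurrent-validity check computes the mean bias and the typical (standard) error from paired trials of the two systems. The data below are simulated, and this sketch deliberately omits the study's mixed-model partition of between- and within-subject variance.

```python
import numpy as np

# Simulated paired measurements: a criterion system (MAS) and a practical
# system (IMS) with a small proportional bias and random error. Values are
# invented, not the study's data.
rng = np.random.default_rng(1)
mas = rng.normal(20.0, 2.0, size=100)            # criterion, e.g. foot speed [m/s]
ims = mas * 1.02 + rng.normal(0.0, 0.3, size=100)

diff = ims - mas
mean_bias = diff.mean()                          # systematic difference
typical_error = diff.std(ddof=1) / np.sqrt(2)    # error attributed to one device

print(round(mean_bias, 2), round(typical_error, 2))
```

Dividing the standard deviation of the differences by sqrt(2) assumes the two systems contribute equally to the random disagreement, which is the usual convention for typical error between paired devices.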

  11. Test Data for USEPR Severe Accident Code Validation

    Energy Technology Data Exchange (ETDEWEB)

    J. L. Rempe

    2007-05-01

    This document identifies data that can be used for assessing various models embodied in severe accident analysis codes. Phenomena considered in this document, which were limited to those anticipated to be of interest in assessing severe accidents in the USEPR developed by AREVA, include:
    • Fuel Heatup and Melt Progression
    • Reactor Coolant System (RCS) Thermal Hydraulics
    • In-Vessel Molten Pool Formation and Heat Transfer
    • Fuel/Coolant Interactions during Relocation
    • Debris Heat Loads to the Vessel
    • Vessel Failure
    • Molten Core Concrete Interaction (MCCI) and Reactor Cavity Plug Failure
    • Melt Spreading and Coolability
    • Hydrogen Control
    Each section of this report discusses one phenomenon of interest to the USEPR. Within each section, an effort is made to describe the phenomenon and identify what data are available for modeling it. As noted in this document, models in US accident analysis codes (MAAP, MELCOR, and SCDAP/RELAP5) differ. Where possible, this report identifies previous assessments that illustrate the impact of modeling differences on predicting various phenomena. Finally, recommendations regarding the status of data available for modeling USEPR severe accident phenomena are summarized.

  12. Mesh-based parallel code coupling interface

    Energy Technology Data Exchange (ETDEWEB)

    Wolf, K.; Steckel, B. (eds.) [GMD - Forschungszentrum Informationstechnik GmbH, St. Augustin (DE). Inst. fuer Algorithmen und Wissenschaftliches Rechnen (SCAI)]

    2001-04-01

    MpCCI (mesh-based parallel code coupling interface) is an interface for multidisciplinary simulations. It provides industrial end-users as well as commercial code-owners with the facility to combine different simulation tools in one environment. Thereby new solutions for multidisciplinary problems will be created, and new application dimensions open up for existing simulation tools. This Book of Abstracts gives a short overview of ongoing activities in industry and research - all presented at the 2nd MpCCI User Forum in February 2001 at GMD Sankt Augustin. (orig.)

  13. Validation of simulation codes for future systems: motivations, approach, and the role of nuclear data

    International Nuclear Information System (INIS)

    Palmiotti, G.; Salvatores, M.; Aliberti, G.

    2007-01-01

    The validation of advanced simulation tools will still play a very significant role in several areas of reactor system analysis. This is the case in reactor physics and neutronics, where nuclear data uncertainties still play a crucial role for many core and fuel cycle parameters. The present paper gives a summary of validation motivations, objectives and approach. A validation effort is in particular necessary in the frame of advanced (e.g. Generation-IV or GNEP) reactor and associated fuel cycle assessment and design. Validation of simulation codes is complementary to the 'verification' process. In fact, 'verification' addresses the question 'are we solving the equations correctly', while validation addresses the question 'are we solving the correct equations with the correct parameters'. Verification implies comparisons with 'reference' equation solutions or with analytical solutions, when they exist. Most of what is called 'numerical validation' falls in this category. Validation strategies differ according to the relative weight of the methods and of the parameters that enter into the simulation tools. Most validation is based on experiments, and the focus of this paper is neutronics, a field where a 'robust' physics description model exists as a function of 'input' parameters that are not fully known. In fact, in the case of reactor core, shielding and fuel cycle physics the model (theory) is well established (the Boltzmann and Bateman equations) and the parameters are the nuclear cross-sections, decay data etc. Two types of validation approaches can and have been used: (a) Mock-up experiments ('global' validation): these need a very close experimental simulation of a reference configuration, and bias factors cannot be extrapolated beyond the reference configuration; (b) Use of 'clean', 'representative' integral experiments ('bias factor and adjustment' method): this allows one to define bias factors and uncertainties and can be used for a wide range of applications. 
It

  14. Validation of a new 39 neutron group self-shielded library based on the nucleonics analysis of the Lotus fusion-fission hybrid test facility performed with the Monte Carlo code

    International Nuclear Information System (INIS)

    Pelloni, S.; Cheng, E.T.

    1985-02-01

    The Swiss LOTUS fusion-fission hybrid test facility was used to investigate the influence of the self-shielding of resonance cross sections on the tritium breeding and thorium ratios. Nucleonic analyses were performed using the discrete-ordinates transport codes ANISN and ONEDANT, the surface-flux code SURCU, and version 3 of the MCNP code for the Li2CO3 and Li2O blanket designs with lead, thorium and beryllium multipliers. Except for the MCNP calculation, which is based on the ENDF/B-V files, all nuclear data are generated from the ENDF/B-IV basic library. For the deterministic methods three NJOY group libraries were considered. The first, a 39 neutron group self-shielded library, was generated at EIR. The second is based on the same group structure as the first and consists of infinitely diluted cross sections. The third library was processed at LANL and consists of coupled 30+12 neutron and gamma groups; these cross sections are not self-shielded. The Monte Carlo analysis is based on a continuous-energy and on a discrete 262-group library from the ENDF/B-V evaluation. It is shown that the results agree within 3% between the unshielded libraries and between the different transport codes and theories. The self-shielding of resonance cross sections results in a decrease of the thorium capture rate and an increase of the tritium breeding of about 6%. The remaining computed ratios are not affected by the self-shielding of cross sections. (Auth.)
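
The self-shielding effect discussed above can be illustrated with a Bondarenko-style calculation: the effective group cross section is flux-weighted with a narrow-resonance flux proportional to 1/(sigma(E) + sigma_0), so a finite background cross section sigma_0 suppresses the contribution of the resonance peak relative to the infinitely diluted value. The single resonance and all numbers below are invented.

```python
import numpy as np

# Energy grid (arbitrary units) with one invented Lorentzian resonance [barn]
E = np.linspace(0.5, 1.5, 20001)
sigma = 1.0 + 5000.0 / (1.0 + ((E - 1.0) / 0.01) ** 2)

def group_xs(sigma, sigma_0):
    """Flux-weighted group cross section for background sigma_0 [barn]."""
    flux = 1.0 / (sigma + sigma_0)   # narrow-resonance flux shape
    return (sigma * flux).sum() / flux.sum()

inf_dilute = group_xs(sigma, 1.0e10)  # infinite dilution: plain average
shielded = group_xs(sigma, 50.0)      # strongly self-shielded case
f = shielded / inf_dilute             # Bondarenko self-shielding factor < 1

print(f < 1.0)
```

Self-shielded libraries tabulate such factors versus sigma_0 and temperature; ignoring them (as in an infinitely diluted library) overestimates resonance reaction rates like thorium capture.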

  15. Are industry codes and standards a valid cost containment approach

    International Nuclear Information System (INIS)

    Rowley, C.W.; Simpson, G.T.; Young, R.K.

    1990-01-01

    The nuclear industry has historically concentrated on safety design features, but recently has been shifting its attention to the reliability of operating systems and components. The Navy has already gone through this transition and has found that Reliability Centered Maintenance (RCM) is an invaluable tool for improving the reliability of components, systems, ships, and classes of ships. There is a close correlation between Navy ships and equipment and commercial nuclear power plants and equipment. The Navy has a central engineering and configuration management organization (Naval Sea Systems Command) for over 500 ships, whereas the over 100 commercial nuclear power plants and 52 nuclear utilities represent a fragmented owner/management structure. This paper suggests that the results of the application of RCM in the Navy can be duplicated to a large degree in the commercial nuclear power industry through the development and utilization of nuclear codes and standards

  16. Validation of the containment code Sirius: interpretation of an explosion experiment on a scale model

    International Nuclear Information System (INIS)

    Blanchet, Y.; Obry, P.; Louvet, J.; Deshayes, M.; Phalip, C.

    1979-01-01

    The explicit 2-D axisymmetric Lagrangian code SIRIUS, developed at the CEA/DRNR, Cadarache, deals with transient compressive flows in deformable primary tanks with more or less complex internal component geometries. This code has been subjected to a two-year intensive validation program based on scale-model experiments, and a number of improvements have been incorporated. This paper presents a recent calculation of one of these experiments using the SIRIUS code; the comparison with experimental results shows the encouraging possibilities of this Lagrangian code

  17. COCOSYS: Status of development and validation of the German containment code system

    International Nuclear Information System (INIS)

    Allelein, H.-J.; Arndt, S.; Klein-Hessling, W.; Schwarz, S.; Spengler, C.; Weber, G.

    2006-01-01

    For the simulation of severe accident propagation in containments of nuclear power plants it is necessary to assess the efficiency of severe accident measures under conditions as realistic as possible. Therefore the German containment code system COCOSYS is under development and validation at GRS. The main objective is to provide a code system, on the basis of mostly mechanistic models, for the comprehensive simulation of all relevant processes and plant states during severe accidents in the containment of light water reactors, covering design basis accidents as well. COCOSYS is being used for the identification of possible deficits in plant safety, qualification of the safety reserves of the entire system, assessment of damage-limiting or mitigating accident management measures, support of integral codes in PSA level 2 studies, and safety evaluation of new plants. COCOSYS is composed of three main modules, which are separate executable files. The communication between them is realized via PVM (parallel virtual machine). The thermal-hydraulic main module (THY) contains several specific models relevant for the simulation of severe accidents. Besides the usual capabilities to calculate the gas distribution and thermal behavior inside the containment, there are special models for the simulation of hydrogen deflagration, pressure suppression systems, etc. Further detailed models exist for the simulation of safety systems, like catalytic recombiners (PARs), safety relief valves (used in WWR-440/V-230 type plants), an ice condenser model, and pump and spray system models for the complete simulation of cooling systems. The aerosol and fission product part (AFP) describes the behavior of insoluble as well as hygroscopic aerosols, iodine chemistry and fission product transport. Further, the decay of nuclides is considered using ORIGEN-like routines. The corium concrete interaction (CCI) main module is based on an improved version of WECHSL extended by the ChemApp module for the

  18. Validation studies of thermal-hydraulic code for safety analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Haapalehto, T.

    1995-01-01

    The thesis gives an overview of the validation process for thermal-hydraulic system codes and presents in more detail the assessment and validation of the French code CATHARE for VVER calculations. Three assessment cases are presented: loop seal clearing, core reflooding and flow in a horizontal steam generator. The experience gained during these assessment and validation calculations has been used to analyze the behavior of the horizontal steam generator and the natural circulation in the geometry of the Loviisa nuclear power plant. A large part of the work has been performed in cooperation with the CATHARE team in Grenoble, France. (41 refs., 11 figs., 8 tabs.)

  19. Validation of physics and thermalhydraulic computer codes for advanced Candu reactor applications

    International Nuclear Information System (INIS)

    Wren, D.J.; Popov, N.; Snell, V.G.

    2004-01-01

    Atomic Energy of Canada Ltd. (AECL) is developing an Advanced Candu Reactor (ACR) that is an evolutionary advancement of the currently operating Candu 6 reactors. The ACR is being designed to produce electrical power for a capital cost and at a unit-energy cost significantly less than those of current reactor designs. The ACR retains the modular Candu concept of horizontal fuel channels surrounded by a heavy water moderator. However, the ACR uses slightly enriched uranium fuel in place of the natural uranium used in the Candu 6. This achieves the twin goals of improved economics (via large reductions in the heavy water moderator volume and replacement of the heavy water coolant with light water coolant) and improved safety. AECL has developed and implemented a software quality assurance program to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. Since the basic design of the ACR is equivalent to that of the Candu 6, most of the key phenomena associated with the safety analyses of the ACR are common, and the Candu industry standard tool-set of safety analysis codes can be applied to the analysis of the ACR. A systematic assessment of computer code applicability addressing the unique features of the ACR design was performed, covering the important aspects of the computer code structure, models, constitutive correlations, and validation database. Arising from this assessment, limited additional requirements for code modifications and extensions to the validation databases have been identified. This paper provides an outline of the AECL software quality assurance program process for the validation of computer codes used to perform physics and thermal-hydraulics safety analyses of the ACR. It describes the additional validation work that has been identified for these codes and the planned, and ongoing, experimental programs to extend the code validation as required to address specific ACR design

  20. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    Science.gov (United States)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open-source code for modeling the performance of wave energy converters in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and it solves the WEC's governing equations of motion in 6 degrees of freedom using the Cummins time-domain impulse response formulation. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation and, as a result, are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable power-take-off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be
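    As a toy illustration of the Cummins formulation cited in this abstract, the Python sketch below evaluates the radiation-force convolution term F_r(t) = -∫ K(t-τ) ẋ(τ) dτ for a single degree of freedom; the impulse-response kernel and velocity history are entirely hypothetical placeholders, not WEC-Sim data.

```python
import math

# Hypothetical radiation impulse-response kernel K(t) and body velocity history;
# a real solver would obtain K from frequency-domain hydrodynamic coefficients.
dt = 0.05
times = [i * dt for i in range(200)]
kernel = [math.exp(-0.5 * t) * math.cos(2.0 * t) for t in times]
velocity = [0.3 * math.sin(1.2 * t) for t in times]

def radiation_force(step):
    """Discrete Cummins convolution: -dt * sum_j K(j*dt) * v((step-j)*dt)."""
    return -dt * sum(kernel[j] * velocity[step - j] for j in range(step + 1))

print(f"radiation force at t = {times[-1]:.1f} s: {radiation_force(199):+.4f}")
```

    In a full time-domain simulation this term is re-evaluated every step and added to the excitation, restoring, and power-take-off forces before integrating the equations of motion.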

  1. VALIDATION OF SIMBAT-PWR USING STANDARD CODE OF COBRA-EN ON REACTOR TRANSIENT CONDITION

    Directory of Open Access Journals (Sweden)

    Muhammad Darwis Isnaini

    2016-03-01

    Full Text Available The validation of the Pressurized Water Reactor type Nuclear Power Plant simulator developed by BATAN (SIMBAT-PWR) using the standard code COBRA-EN under reactor transient conditions has been done. The development of SIMBAT-PWR has accomplished several neutronic and thermal-hydraulic calculation modules; therefore, validation of the simulator is needed, especially for transient reactor operation conditions. The purpose of the research is to characterize the thermal-hydraulic parameters of the PWR1000 core, which can then be applied as a comparison in developing the SIMBAT-PWR. The validation involves calculation of the thermal-hydraulic parameters using the COBRA-EN code. The calculation schemes are based on COBRA-EN with fixed material properties and with dynamic properties calculated by the MATPRO subroutine (COBRA-EN+MATPRO), for reactor conditions of startup, power rise and power fluctuation from nominal to over power. The comparison of the temperature distribution at nominal 100% power shows that the fuel centerline temperature calculated by SIMBAT-PWR is 8.76% higher than the COBRA-EN result and 7.70% lower than the COBRA-EN+MATPRO result. In general, the SIMBAT-PWR fuel temperature distributions lie mostly between the COBRA-EN and COBRA-EN+MATPRO results. The deviations of the fuel centerline, fuel surface, inner and outer cladding as well as coolant bulk temperatures between the SIMBAT-PWR and COBRA-EN calculations are due to differences in the gap heat transfer coefficient and the cladding thermal conductivity.
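    The percent deviations quoted above are simple relative differences. As an illustrative sketch (not from the paper), the Python snippet below reproduces that comparison with placeholder temperatures chosen so the deviations roughly echo the quoted 8.76% and 7.70% figures:

```python
# Illustrative sketch: percent deviation of a simulator result from two
# reference-code results. All temperature values are hypothetical placeholders.

def percent_deviation(value, reference):
    """Relative deviation of `value` from `reference`, in percent."""
    return 100.0 * (value - reference) / reference

# Hypothetical fuel centerline temperatures (K) at nominal 100% power.
t_simbat = 1850.0         # simulator result (assumed)
t_cobra_en = 1701.0       # fixed-property reference (assumed)
t_cobra_matpro = 2004.3   # dynamic-property reference (assumed)

print(f"vs COBRA-EN:        {percent_deviation(t_simbat, t_cobra_en):+.2f}%")
print(f"vs COBRA-EN+MATPRO: {percent_deviation(t_simbat, t_cobra_matpro):+.2f}%")
```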

  2. Validation of Westinghouse integrated code POLCA-T against OECD NEACRP-L-335 rod ejection benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Panayotov, Dobromir [Westinghouse Electric Sweden AB, Vaesteraas, SE-721 63 (Sweden)

    2008-07-01

    This paper describes the work performed and the results obtained in the validation of the POLCA-T code against the NEACRP PWR rod ejection transients benchmark. The presented work is part of the POLCA-T licensing assessment database for BWR Control Rod Drop Accident (CRDA) application. Validation against a PWR Rod Ejection Accident (REA) benchmark is relevant to validation of the code for BWR CRDA, as the analyses of both transients require identical phenomena to be modelled. All six benchmark cases have been analyzed in the presented work. Initial-state steady-state calculations, including boron search, control rod worth, and final-state power search, were performed with the POLCA7 code. Initial-state boron adjustment and steady-state CR worth as well as the transient analyses were performed with the POLCA-T code. Benchmark results, including 3D transient power distributions, are compared with reference PANTHER solutions and published results of other codes. Given the similarity of the kinetics modelling for a BWR CRDA and a PWR REA, and the fact that POLCA-T accurately predicts the local transient power and thus the resulting fuel enthalpy, it is concluded that POLCA-T is a state-of-the-art tool also for BWR CRDA analysis. (author)

  3. Validation of Westinghouse integrated code POLCA-T against OECD NEACRP-L-335 rod ejection benchmark

    International Nuclear Information System (INIS)

    Panayotov, Dobromir

    2008-01-01

    This paper describes the work performed and the results obtained in the validation of the POLCA-T code against the NEACRP PWR rod ejection transients benchmark. The presented work is part of the POLCA-T licensing assessment database for BWR Control Rod Drop Accident (CRDA) application. Validation against a PWR Rod Ejection Accident (REA) benchmark is relevant to validation of the code for BWR CRDA, as the analyses of both transients require identical phenomena to be modelled. All six benchmark cases have been analyzed in the presented work. Initial-state steady-state calculations, including boron search, control rod worth, and final-state power search, were performed with the POLCA7 code. Initial-state boron adjustment and steady-state CR worth as well as the transient analyses were performed with the POLCA-T code. Benchmark results, including 3D transient power distributions, are compared with reference PANTHER solutions and published results of other codes. Given the similarity of the kinetics modelling for a BWR CRDA and a PWR REA, and the fact that POLCA-T accurately predicts the local transient power and thus the resulting fuel enthalpy, it is concluded that POLCA-T is a state-of-the-art tool also for BWR CRDA analysis. (author)

  4. Evaporation over sump surface in containment studies: code validation on TOSQAN tests

    International Nuclear Information System (INIS)

    Malet, J.; Gelain, T.; Degrees du Lou, O.; Daru, V.

    2011-01-01

    During the course of a severe accident in a nuclear power plant, water can collect in the containment sump through steam condensation on walls and spray system activation. The objective of this paper is to present code validation on evaporative sump tests performed on the TOSQAN facility. The ASTEC-CPA code is used as a lumped-parameter code, and specific user-defined functions are developed for the TONUS-CFD code. The tests are air-steam tests, as well as tests with other non-condensable gases (He, CO2 and SF6), under steady and transient conditions. The results show good agreement between codes and experiments, indicating good behaviour of the sump models in both codes. (author)

  5. In-vessel core degradation code validation matrix

    International Nuclear Information System (INIS)

    Haste, T.J.; Adroguer, B.; Gauntt, R.O.; Martinez, J.A.; Ott, L.J.; Sugimoto, J.; Trambauer, K.

    1996-01-01

    The objective of the current Validation Matrix is to define a basic set of experiments, for which comparison of the measured and calculated parameters forms a basis for establishing the accuracy of test predictions, covering the full range of in-vessel core degradation phenomena expected in light water reactor severe accident transients. The scope of the review covers PWR and BWR designs of Western origin: the coverage of phenomena extends from the initial heat-up through to the introduction of melt into the lower plenum. Concerning fission product behaviour, the effect of core degradation on fission product release is considered. The report provides brief overviews of the main LWR severe accident sequences and of the dominant phenomena involved. The experimental database is summarised. These data are cross-referenced against a condensed set of the phenomena and test condition headings presented earlier, judging the results against a set of selection criteria and identifying key tests of particular value. The main conclusions and recommendations are listed. (K.A.)

  6. Lessons learned in the verification, validation and application of a coupled heat and fluid flow code

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1986-01-01

    A summary is given of the author's recent studies in the verification, validation and application of a coupled heat and fluid flow code. Verification has been done against eight analytic and semi-analytic solutions. These solutions include those involving thermal buoyancy flow and fracture flow. Comprehensive field validation studies over a period of four years are discussed. The studies are divided into three stages: (1) history matching, (2) double-blind prediction and confirmation, and (3) design optimization. At each stage, parameter sensitivity studies are performed. To study the applications of mathematical models, a problem proposed by the International Energy Agency (IEA) is solved using this verified and validated numerical model as well as two simpler models. One of the simpler models is a semi-analytic method assuming the uncoupling of the heat and fluid flow processes. The other is a graphical method based on a large number of approximations. Variations are added to the basic IEA problem to point out the limits of the ranges of application of each model. A number of lessons are learned from the above investigations. These are listed and discussed.

  7. A proposed framework for computational fluid dynamics code calibration/validation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1993-01-01

    The paper reviews the terminology and methodology that have been introduced during the last several years for building confidence in the predictions from Computational Fluid Dynamics (CFD) codes. Code validation terminology developed for nuclear reactor analyses and aerospace applications is reviewed and evaluated. Currently used terminology such as ''calibrated code,'' ''validated code,'' and ''validation experiment'' is discussed along with the shortcomings and criticisms of these terms. A new framework is proposed for building confidence in CFD code predictions that overcomes some of the difficulties of past procedures and delineates the causes of uncertainty in CFD predictions. Building on previous work, new definitions of code verification and calibration are proposed. These definitions provide more specific requirements for the knowledge level of the flow physics involved and the solution accuracy of the given partial differential equations. As part of the proposed framework, categories are also proposed for flow physics research, flow modeling research, and the application of numerical predictions. The contributions of physical experiments, analytical solutions, and other numerical solutions are discussed, showing that each should be designed to achieve a distinctively separate purpose in building confidence in the accuracy of CFD predictions. A number of examples are given for each approach to suggest methods for obtaining the highest value for CFD code quality assurance.

  8. Energy information data base: report number codes

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used. (RWR)
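    As a hypothetical illustration of the "code plus sequential number" convention described above, the short Python sketch below splits a report number into its two parts; the separator convention and the sample identifiers are assumptions for illustration, not taken from the TIC cataloging rules:

```python
import re

# Illustrative sketch: split a report number of the assumed form "CODE-NUMBER"
# (or "CODE--NUMBER") into an installation/program code and a sequential part.

def split_report_number(report_number):
    """Return (code, sequential_number); raise ValueError on unexpected input."""
    match = re.match(r"^([A-Z][A-Z0-9/-]*?)--?(\d[\w.-]*)$", report_number)
    if not match:
        raise ValueError(f"unrecognized report number: {report_number!r}")
    return match.group(1), match.group(2)

print(split_report_number("ANL--7982"))  # ('ANL', '7982')
```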

  9. Energy information data base: report number codes

    International Nuclear Information System (INIS)

    1979-09-01

    Each report processed by the US DOE Technical Information Center is identified by a unique report number consisting of a code plus a sequential number. In most cases, the code identifies the originating installation. In some cases, it identifies a specific program or a type of publication. Listed in this publication are all codes that have been used by DOE in cataloging reports. This compilation consists of two parts. Part I is an alphabetical listing of report codes identified with the issuing installations that have used the codes. Part II is an alphabetical listing of installations identified with codes each has used

  10. A proposed methodology for computational fluid dynamics code verification, calibration, and validation

    Science.gov (United States)

    Aeschliman, D. P.; Oberkampf, W. L.; Blottner, F. G.

    Verification, calibration, and validation (VCV) of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. The exact manner in which code VCV activities are planned and conducted, however, is critically important. It is suggested that the way in which code validation, in particular, is often conducted (by comparison to published experimental data obtained for other purposes) is in general difficult and unsatisfactory, and that a different approach is required. This paper describes a proposed methodology for CFD code VCV that meets the technical requirements and is philosophically consistent with code development needs. The proposed methodology stresses teamwork and cooperation between code developers and experimentalists throughout the VCV process, and takes advantage of certain synergisms between CFD and experiment. A novel approach to uncertainty analysis is described which can both distinguish between and quantify various types of experimental error, and whose attributes are used to help define an appropriate experimental design for code VCV experiments. The methodology is demonstrated with an example of laminar, hypersonic, near-perfect-gas, three-dimensional flow over a sliced sphere/cone of varying geometrical complexity.

  11. An approach to verification and validation of MHD codes for fusion applications

    Energy Technology Data Exchange (ETDEWEB)

    Smolentsev, S., E-mail: sergey@fusion.ucla.edu [University of California, Los Angeles (United States); Badia, S. [Centre Internacional de Mètodes Numèrics en Enginyeria, Barcelona (Spain); Universitat Politècnica de Catalunya – Barcelona Tech (Spain); Bhattacharyay, R. [Institute for Plasma Research, Gandhinagar, Gujarat (India); Bühler, L. [Karlsruhe Institute of Technology (Germany); Chen, L. [University of Chinese Academy of Sciences, Beijing (China); Huang, Q. [Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui (China); Jin, H.-G. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Krasnov, D. [Technische Universität Ilmenau (Germany); Lee, D.-W. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Mas de les Valls, E. [Centre Internacional de Mètodes Numèrics en Enginyeria, Barcelona (Spain); Universitat Politècnica de Catalunya – Barcelona Tech (Spain); Mistrangelo, C. [Karlsruhe Institute of Technology (Germany); Munipalli, R. [HyPerComp, Westlake Village (United States); Ni, M.-J. [University of Chinese Academy of Sciences, Beijing (China); Pashkevich, D. [St. Petersburg State Polytechnical University (Russian Federation); Patel, A. [Universitat Politècnica de Catalunya – Barcelona Tech (Spain); Pulugundla, G. [University of California, Los Angeles (United States); Satyamurthy, P. [Bhabha Atomic Research Center (India); Snegirev, A. [St. Petersburg State Polytechnical University (Russian Federation); Sviridov, V. [Moscow Power Engineering Institute (Russian Federation); Swain, P. [Bhabha Atomic Research Center (India); and others

    2015-11-15

    Highlights: • Review of status of MHD codes for fusion applications. • Selection of five benchmark problems. • Guidance for verification and validation of MHD codes for fusion applications. - Abstract: We propose a new activity on verification and validation (V&V) of MHD codes presently employed by the fusion community as a predictive capability tool for liquid metal cooling applications, such as liquid metal blankets. The important steps in the development of MHD codes starting from the 1970s are outlined first and then basic MHD codes, which are currently in use by designers of liquid breeder blankets, are reviewed. A benchmark database of five problems has been proposed to cover a wide range of MHD flows from laminar fully developed to turbulent flows, which are of interest for fusion applications: (A) 2D fully developed laminar steady MHD flow, (B) 3D laminar, steady developing MHD flow in a non-uniform magnetic field, (C) quasi-two-dimensional MHD turbulent flow, (D) 3D turbulent MHD flow, and (E) MHD flow with heat transfer (buoyant convection). Finally, we introduce important details of the proposed activities, such as basic V&V rules and schedule. The main goal of the present paper is to help in establishing an efficient V&V framework and to initiate benchmarking among interested parties. The comparison results computed by the codes against analytical solutions and trusted experimental and numerical data as well as code-to-code comparisons will be presented and analyzed in companion paper/papers.
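    For benchmark problems of type (A), a closed-form reference exists: classical fully developed Hartmann flow between electrically insulating walls, whose velocity profile normalized by the centerline value is u(y)/u_c = (cosh(Ha) − cosh(Ha·y))/(cosh(Ha) − 1) for y ∈ [−1, 1]. The Python sketch below evaluates this textbook profile as an illustrative stand-in for the benchmark's exact specification, which the paper defines in full:

```python
import math

# Classical Hartmann-flow velocity profile between insulating walls,
# normalized by the centerline velocity u_c; Ha is the Hartmann number.

def hartmann_profile(y, ha):
    """Normalized velocity u(y)/u_c at wall-normal position y, |y| <= 1."""
    return (math.cosh(ha) - math.cosh(ha * y)) / (math.cosh(ha) - 1.0)

for ha in (1.0, 10.0, 50.0):
    # Higher Hartmann number -> flatter core, thinner Hartmann wall layers.
    print(f"Ha={ha:5.1f}  u(0)={hartmann_profile(0.0, ha):.3f}  "
          f"u(0.9)={hartmann_profile(0.9, ha):.3f}")
```

    A verification exercise would evaluate this profile on the solver's grid points and report a discrete error norm against the computed solution.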

  12. San Onofre PWR Data for Code Validation of MOX Fuel Depletion Analyses - Revision 1

    International Nuclear Information System (INIS)

    Hermann, O.W.

    2000-01-01

    The isotopic composition of mixed-oxide fuel (fabricated with both uranium and plutonium isotopes) discharged from reactors is of interest to the Fissile Material Disposition Program. The validation of depletion codes used to predict isotopic compositions of MOX fuel, similar to studies concerning uranium-only fueled reactors, is thus very important. The EEI-Westinghouse Plutonium Recycle Demonstration Program was conducted to examine the use of MOX fuel in the San Onofre PWR, Unit I, during cycles 2 and 3. The data, usually required as input to depletion codes, either one-dimensional or lattice codes, were taken from various sources and compiled into this report. Where data were either lacking or determined inadequate, the appropriate data were supplied from other references. The scope of the reactor operations and design data, in addition to the isotopic analyses, was considered to be of sufficient quality for depletion code validation.

  13. Gap conductance model validation in the TASS/SMR-S code

    International Nuclear Information System (INIS)

    Ahn, Sang-Jun; Yang, Soo-Hyung; Chung, Young-Jong; Bae, Kyoo-Hwan; Lee, Won-Jae

    2011-01-01

    An advanced integral pressurized water reactor, SMART (System-Integrated Modular Advanced ReacTor), has been developed by KAERI (Korea Atomic Energy Research Institute). The purposes of SMART are seawater desalination and electricity generation. For the safety evaluation and performance analysis of SMART, the TASS/SMR-S (Transient And Setpoint Simulation/System-integrated Modular Reactor) code has been developed. In this paper, the gap conductance model for the calculation of gap conductance has been validated against another system code, the MARS code, and experimental results. In the validation, the behaviors of fuel temperature and gap width are selected as the major parameters. According to the evaluation results, the TASS/SMR-S code predicts well the behaviors of fuel temperature and gap width variation, compared to the MARS calculation results and experimental data. (author)
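    As context for the quantity being validated, a minimal conduction-only gap conductance sketch is shown below. This is an illustrative textbook form, h_gap = k_gas/(t_gap + g_jump), not the actual TASS/SMR-S model, and all numerical values are assumed:

```python
# Illustrative sketch (not the TASS/SMR-S model): conduction component of a
# fuel-cladding gap conductance, h_gap = k_gas / (t_gap + g_jump), where k_gas
# is the fill-gas thermal conductivity and g_jump lumps the temperature-jump
# distances at the fuel and cladding surfaces. All values are hypothetical.

def gap_conductance(k_gas, gap_width, jump_distance):
    """Conduction-only gap conductance in W/m^2-K."""
    return k_gas / (gap_width + jump_distance)

k_helium = 0.30   # W/m-K, helium fill gas at operating temperature (assumed)
gap = 50e-6       # m, open gap width (assumed)
jump = 10e-6      # m, combined temperature-jump distance (assumed)

h = gap_conductance(k_helium, gap, jump)
print(f"h_gap = {h:.0f} W/m^2-K")  # conductance rises as the gap closes
```

    The fuel temperature sensitivity noted in the abstract follows directly: a smaller gap or higher gap conductance lowers the temperature drop across the gap and hence the fuel centerline temperature.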

  14. ASTEC V2 severe accident integral code: Fission product modelling and validation

    International Nuclear Information System (INIS)

    Cantrel, L.; Cousin, F.; Bosland, L.; Chevalier-Jabet, K.; Marchetto, C.

    2014-01-01

    One main goal of the severe accident integral code ASTEC V2, jointly developed for more than 15 years by IRSN and GRS, is to simulate the overall behaviour of fission products (FP) in a damaged nuclear facility. ASTEC applications are source term determinations, level 2 Probabilistic Safety Assessment (PSA2) studies including the determination of uncertainties, accident management studies and physical analyses of FP experiments to improve the understanding of the phenomenology. ASTEC is a modular code, and models of a part of the phenomenology are implemented in each module: the release of FPs and structural materials from degraded fuel in the ELSA module; the transport through the reactor coolant system, approximated as a sequence of control volumes, in the SOPHAEROS module; and the radiochemistry inside the containment nuclear building in the IODE module. Three other modules, CPA, ISODOP and DOSE, allow computing, respectively, the deposition rate of aerosols inside the containment, the activities of the isotopes as a function of time, and the gaseous dose rate, which is needed to model radiochemistry in the gaseous phase. In ELSA, release models are semi-mechanistic and have been validated against a wide range of experimental data, notably the VERCORS experiments. For SOPHAEROS, the models can be divided into two parts: vapour phase phenomena and aerosol phase phenomena. For IODE, iodine and ruthenium chemistry are modelled based on a semi-mechanistic approach; these FPs can form volatile species and are particularly important in terms of potential radiological consequences. The models in these 3 modules are based on a wide experimental database, resulting for a large part from international programmes, and they are considered to be at the state of the art of R and D knowledge. This paper illustrates some FP modelling capabilities of ASTEC, and computed values are compared to some experimental results, which are part of the validation matrix.

  15. Incorporating Code-Based Software in an Introductory Statistics Course

    Science.gov (United States)

    Doehler, Kirsten; Taylor, Laura

    2015-01-01

    This article is based on the experiences of two statistics professors who have taught students to write and effectively utilize code-based software in a college-level introductory statistics course. Advantages of using software and code-based software in this context are discussed. Suggestions are made on how to ease students into using code with…

  16. Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code

    Science.gov (United States)

    Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William

    2006-01-01

    The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued, correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation processes will use new/improved low Earth orbit (LEO) environmental models with a recently improved International Space Station (ISS) shield model to validate computational models and procedures using measured data aboard ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.

  17. Password Authentication Based on Fractal Coding Scheme

    Directory of Open Access Journals (Sweden)

    Nadia M. G. Al-Saidi

    2012-01-01

    Full Text Available Password authentication is a mechanism used to authenticate user identity over an insecure communication channel. In this paper, a new method to improve the security of password authentication is proposed. It is based on the compression capability of fractal image coding to provide an authorized user secure access to the registration and login process. In the proposed scheme, a hashed password string is generated and encrypted, to be captured together with the user identity using text-to-image mechanisms. The compression capability of fractal image coding is used to securely send the compressed image data through a non-secured communication channel to the server. Verification of the client information against the database system is performed on the server to authenticate the legal user. The encrypted hashed password in the decoded fractal image is recognized using optical character recognition. The authentication process is performed after successful verification of the client identity by comparing the decrypted hashed password with the one stored in the database system. The system is analyzed and discussed from the attacker's viewpoint. A security comparison is performed to show that the proposed scheme provides essential security requirements, while its efficiency makes it easy to apply alone or in hybrid with other security methods. Computer simulation and statistical analysis are presented.
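    The hash-then-compare step that the scheme ultimately relies on can be sketched generically in Python. The fractal encoding/decoding, text-to-image, and OCR stages are omitted, and the salted PBKDF2 construction below is a common illustrative choice, not the paper's exact hash:

```python
import hashlib
import hmac
import os

# Illustrative sketch: the server stores a salted password hash at registration
# and compares it in constant time at login.

def hash_password(password, salt):
    """Salted PBKDF2-HMAC-SHA256 hash of the password."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def register(database, user, password):
    salt = os.urandom(16)
    database[user] = (salt, hash_password(password, salt))

def login(database, user, password):
    salt, stored = database[user]
    # Constant-time comparison avoids leaking match length via timing.
    return hmac.compare_digest(stored, hash_password(password, salt))

db = {}
register(db, "alice", "correct horse")
print(login(db, "alice", "correct horse"))  # True
print(login(db, "alice", "wrong guess"))    # False
```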

  18. Triboelectric-Based Transparent Secret Code.

    Science.gov (United States)

    Yuan, Zuqing; Du, Xinyu; Li, Nianwu; Yin, Yingying; Cao, Ran; Zhang, Xiuling; Zhao, Shuyu; Niu, Huidan; Jiang, Tao; Xu, Weihua; Wang, Zhong Lin; Li, Congju

    2018-04-01

    Private and security information for personal identification requires an encrypted tool to extend communication channels between human and machine through a convenient and secure method. Here, a triboelectric-based transparent secret code (TSC) that enables self-powered sensing and information identification simultaneously in a rapid process is reported. The transparent and hydrophobic TSC can be conformed to any cambered surface due to its high flexibility, which greatly extends the application scenarios. Independent of a power source, the TSC can induce obvious electric signals by surface contact alone. The TSC is velocity-dependent and capable of achieving a peak voltage of ≈4 V at a resistance load of 10 MΩ and a sliding speed of 0.1 m s⁻¹ for a 2 mm × 20 mm rectangular stripe. The fabricated TSC can maintain its performance after about 5000 reciprocating rolling cycles. The applications of the TSC as a self-powered code device are demonstrated, and the ordered signals can be recognized through the height of the electric peaks, which can be further transferred into specific information by the processing program. The designed TSC has great potential in personal identification, commodity circulation, valuables management, and security defense applications.

  19. Validation of KENO-based criticality calculations at Rocky Flats

    International Nuclear Information System (INIS)

    Felsher, P.D.; McKamy, J.N.; Monahan, S.P.

    1992-01-01

    In the absence of experimental data, it is necessary to rely on computer-based computational methods in evaluating the criticality condition of a nuclear system. The validity of the computer codes is established in a two-part procedure as outlined in ANSI/ANS 8.1. The first step, usually the responsibility of the code developer, involves verification that the algorithmic structure of the code is performing the intended mathematical operations correctly. The second step involves an assessment of the code's ability to realistically portray the governing physical processes in question. This is accomplished by determining the code's bias, or systematic error, through a comparison of computational results to accepted values obtained experimentally. In this paper, the authors discuss the validation process for KENO and the Hansen-Roach cross sections in use at EG and G Rocky Flats. The validation process at Rocky Flats consists of both global and local techniques. The global validation resulted in a maximum k_eff limit of 0.95 for the limiting-accident scenarios of a criticality evaluation.
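    As a rough illustration of determining a code's bias from benchmark comparisons (not the actual Rocky Flats procedure or data), the Python sketch below averages calculated k_eff values for critical experiments, where the accepted value is 1.0, and folds the bias and spread into a conservative limit with assumed margins:

```python
import statistics

# Illustrative sketch: estimate a code bias from calculated k_eff values for
# critical benchmark experiments (expected k_eff = 1.0) and derive a
# conservative upper limit. Benchmark values and margins are hypothetical.

calculated_keff = [0.9968, 1.0012, 0.9945, 0.9990, 0.9973]  # assumed benchmarks

bias = statistics.mean(calculated_keff) - 1.0  # negative => code underpredicts
sigma = statistics.stdev(calculated_keff)
admin_margin = 0.03                            # administrative margin (assumed)

# Only a negative (non-conservative) bias reduces the limit; a positive bias
# is conventionally not credited.
limit = 1.0 - admin_margin + min(bias, 0.0) - 2.0 * sigma
print(f"bias = {bias:+.4f}, sigma = {sigma:.4f}, limit = {limit:.4f}")
```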

  20. Experimental data bases useful for quantification of model uncertainties in best estimate codes

    International Nuclear Information System (INIS)

    Wilson, G.E.; Katsma, K.R.; Jacobson, J.L.; Boodry, K.S.

    1988-01-01

    A data base is necessary for assessment of thermal hydraulic codes within the context of the new NRC ECCS Rule. Separate effect tests examine particular phenomena that may be used to develop and/or verify models and constitutive relationships in the code. Integral tests are used to demonstrate the capability of codes to model global characteristics and sequence of events for real or hypothetical transients. The nuclear industry has developed a large experimental data base of fundamental nuclear, thermal-hydraulic phenomena for code validation. Given a particular scenario, and recognizing the scenario's important phenomena, selected information from this data base may be used to demonstrate applicability of a particular code to simulate the scenario and to determine code model uncertainties. LBLOCA experimental data bases useful to this objective are identified in this paper. 2 tabs

  1. Sizing and scaling requirements of a large-scale physical model for code validation

    International Nuclear Information System (INIS)

    Khaleel, R.; Legore, T.

    1990-01-01

    Model validation is an important consideration in application of a code for performance assessment and therefore in assessing the long-term behavior of the engineered and natural barriers of a geologic repository. Scaling considerations relevant to porous media flow are reviewed. An analysis approach is presented for determining the sizing requirements of a large-scale, hydrology physical model. The physical model will be used to validate performance assessment codes that evaluate the long-term behavior of the repository isolation system. Numerical simulation results for sizing requirements are presented for a porous medium model in which the media properties are spatially uncorrelated

  2. Implementation of QR Code and Digital Signature to Determine the Validity of KRS and KHS Documents

    Directory of Open Access Journals (Sweden)

    Fatich Fazlur Rochman

    2017-05-01

    Full Text Available Universitas Airlangga students often find it difficult to verify the marks reported in the Kartu Hasil Studi (KHS, Study Result Card) or the courses registered in the Kartu Rencana Studi (KRS, Study Plan Card) when the data change in the system used by Universitas Airlangga. Verification of KRS and KHS is complicated because the printed documents held by students are easier to counterfeit than the data in the system. This work implements digital signature and QR Code technology as a solution that can prove the validity of a KRS or KHS document. The KRS and KHS validation system was developed with digital signatures and QR Codes. The QR Code is a type of matrix code developed to allow its contents to be decoded at high speed, while the digital signature serves as a marker on the data ensuring that the data are authentic. The verification process is divided into two parts: reading the digital signature, and validating a printed document by scanning the data from its QR Code. Applying the system required the addition of a QR Code to the KRS and KHS documents, as well as a readiness of human resources.
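The sign-and-scan workflow can be sketched with standard-library primitives. This is a hedged illustration, not the paper's implementation: HMAC-SHA256 stands in for the asymmetric digital signature a real deployment would use (so this sketch needs a shared secret), and the record fields and key are hypothetical. The hex signature is what would be packed, alongside the record, into the document's QR Code.

```python
# Sketch: the university signs the KRS/KHS payload; the signature travels
# with the printed document (encoded in a QR Code); a verifier recomputes
# it and compares. Tampering with any field invalidates the signature.
import hashlib
import hmac
import json

SECRET = b"university-signing-key"     # hypothetical key material

def sign_document(record: dict) -> str:
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify_document(record: dict, signature: str) -> bool:
    return hmac.compare_digest(sign_document(record), signature)

khs = {"student": "0123456789", "course": "IF-101", "grade": "A"}
sig = sign_document(khs)               # string embedded in the QR Code

assert verify_document(khs, sig)
forged = dict(khs, grade="B")          # a counterfeited grade
assert not verify_document(forged, sig)
```

With a real digital signature (e.g. ECDSA), verification would need only the university's public key, so anyone scanning the QR Code could check authenticity without shared secrets.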

  3. Experimental validation of decay heat calculation codes and associated nuclear data libraries for fusion energy

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Wada, Masayuki; Ikeda, Yujiro

    2001-01-01

    Validity of decay heat calculations for safety designs of fusion reactors was investigated by using decay heat experimental data on thirty-two fusion reactor relevant materials obtained at the 14-MeV neutron source facility of FNS in JAERI. Calculation codes developed in Japan, ACT4 and CINAC version 4, and nuclear data bases such as JENDL/Act-96, FENDL/A-2.0 and Lib90 were used for the calculation. Although several corrections in algorithms for both the calculation codes were needed, it was shown by comparing calculated results with the experimental data that most of the activation cross sections and decay data were adequate. In the cases of type 316 stainless steel and copper, which are important for ITER, a prediction accuracy of decay heat within ±10% was confirmed. However, it was pointed out that there were some problems in parts of the data, such as improper activation cross sections, e.g., the ⁹²Mo(n,2n)⁹¹ᵍMo reaction in FENDL, and lack of activation cross section data, e.g., the ¹³⁸Ba(n,2n)¹³⁷ᵐBa reaction in JENDL. Modifications of cross section data were recommended for 19 reactions in JENDL and FENDL. It was also pointed out that X-ray and conversion electron energies should be included in decay data. (author)

  4. Experimental validation of decay heat calculation codes and associated nuclear data libraries for fusion energy

    Energy Technology Data Exchange (ETDEWEB)

    Maekawa, Fujio; Wada, Masayuki; Ikeda, Yujiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-01-01

    Validity of decay heat calculations for safety designs of fusion reactors was investigated by using decay heat experimental data on thirty-two fusion reactor relevant materials obtained at the 14-MeV neutron source facility of FNS in JAERI. Calculation codes developed in Japan, ACT4 and CINAC version 4, and nuclear data bases such as JENDL/Act-96, FENDL/A-2.0 and Lib90 were used for the calculation. Although several corrections in algorithms for both the calculation codes were needed, it was shown by comparing calculated results with the experimental data that most of the activation cross sections and decay data were adequate. In the cases of type 316 stainless steel and copper, which are important for ITER, a prediction accuracy of decay heat within {+-}10% was confirmed. However, it was pointed out that there were some problems in parts of the data, such as improper activation cross sections, e.g., the {sup 92}Mo(n, 2n){sup 91g}Mo reaction in FENDL, and lack of activation cross section data, e.g., the {sup 138}Ba(n, 2n){sup 137m}Ba reaction in JENDL. Modifications of cross section data were recommended for 19 reactions in JENDL and FENDL. It was also pointed out that X-ray and conversion electron energies should be included in decay data. (author)

  5. Protograph-Based Raptor-Like Codes

    Science.gov (United States)

    Divsalar, Dariush; Chen, Tsung-Yi; Wang, Jiadong; Wesel, Richard D.

    2014-01-01

    Theoretical analysis has long indicated that feedback improves the error exponent but not the capacity of point-to-point memoryless channels. Analytic and empirical results indicate that, in the short-blocklength regime, practical rate-compatible punctured convolutional (RCPC) codes achieve low latency with the use of noiseless feedback. In 3GPP, standard rate-compatible punctured turbo (RCPT) codes did not outperform convolutional codes in the short-blocklength regime, because convolutional codes with a small number of states can be decoded optimally using the Viterbi decoder. Despite the excellent performance of convolutional codes at very short blocklengths, their strength does not scale with blocklength for a fixed number of trellis states.
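The claim about optimal decoding of few-state convolutional codes can be made concrete with a toy example: a 4-state, rate-1/2 terminated convolutional code (generators 7 and 5 octal) with a hard-decision Viterbi decoder. This is a generic textbook construction, not a code from the 3GPP standard.

```python
# Rate-1/2, constraint-length-3 convolutional code with Viterbi decoding.
G1, G2 = 0b111, 0b101            # generator polynomials (7, 5 octal)

def par(x):
    """Parity of the set bits of x."""
    return bin(x).count("1") & 1

def encode(bits):
    state, out = 0, []
    for b in bits + [0, 0]:      # two tail bits terminate the trellis
        reg = (b << 2) | state   # register = [newest bit, 2-bit state]
        out += [par(reg & G1), par(reg & G2)]
        state = reg >> 1
    return out

def viterbi(received):
    INF = float("inf")
    metrics = [0] + [INF] * 3    # Hamming path metric per state
    paths = [[] for _ in range(4)]
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_m, new_p = [INF] * 4, [[] for _ in range(4)]
        for s in range(4):
            if metrics[s] == INF:
                continue
            for b in (0, 1):     # hypothesize each input bit
                reg = (b << 2) | s
                o0, o1 = par(reg & G1), par(reg & G2)
                ns = reg >> 1
                m = metrics[s] + (o0 != r[0]) + (o1 != r[1])
                if m < new_m[ns]:
                    new_m[ns], new_p[ns] = m, paths[s] + [b]
        metrics, paths = new_m, new_p
    best = min(range(4), key=lambda s: metrics[s])
    return paths[best][:-2]      # strip the tail bits

msg = [1, 0, 1, 1, 0, 0, 1]
assert viterbi(encode(msg)) == msg   # error-free round trip
noisy = encode(msg)
noisy[5] ^= 1                        # a single channel bit error
assert viterbi(noisy) == msg         # corrected by the decoder
```

With only 4 states the decoder is exhaustive over the trellis, which is exactly why such codes are maximum-likelihood-decodable at low complexity in the short-blocklength regime.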

  6. Optimal Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kroon, I. B.; Faber, Michael Havbro

    1994-01-01

    Calibration of partial safety factors is considered in general, including classes of structures where no code exists beforehand. The partial safety factors are determined such that the difference between the reliability for the different structures in the class considered and a target reliability level is minimized. Code calibration on a decision-theoretical basis is also considered, and it is shown how target reliability indices can be calibrated. Results from code calibration for rubble mound breakwater designs are shown.
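The calibration idea can be sketched as a one-parameter optimization. The reliability model below is a deliberately stylized assumption (normal resistance with coefficient of variation cov_j and a deterministic load), not the paper's formulation; it only illustrates minimizing the weighted squared deviation of the reliability indices from a target.

```python
# Stylized reliability-based code calibration: choose one partial safety
# factor gamma so the reliability indices beta_j of several design cases
# stay close to a target beta_t. Assumed model: design sets the mean
# resistance to gamma*S for load S, with scatter cov_j * gamma * S, giving
# beta_j(gamma) = (gamma - 1) / (cov_j * gamma).

def beta(gamma, cov):
    return (gamma - 1.0) / (cov * gamma)

def calibrate(covs, weights, beta_t, grid):
    def penalty(g):
        return sum(w * (beta(g, c) - beta_t) ** 2
                   for c, w in zip(covs, weights))
    return min(grid, key=penalty)       # grid search over gamma

covs = [0.08, 0.10, 0.12]               # resistance scatter per design case
weights = [0.5, 0.3, 0.2]               # relative frequency of each case
grid = [1.0 + 0.001 * i for i in range(1, 2001)]
gamma = calibrate(covs, weights, beta_t=3.8, grid=grid)
```

No single gamma hits the target for every case; the weighted least-squares compromise is the essence of code calibration for a structure class.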

  7. Quantum BCH Codes Based on Spectral Techniques

    International Nuclear Information System (INIS)

    Guo Ying; Zeng Guihua

    2006-01-01

    When the time variable in quantum signal processing is discrete, the Fourier transform exists on the vector space of n-tuples over the Galois field F₂, which plays an important role in the investigation of quantum signals. By using Fourier transforms, the ideas of quantum coding theory can be described in a setting much different from that seen thus far. Quantum BCH codes can be defined as codes whose quantum states have certain specified consecutive spectral components equal to zero, with the error-correcting ability described by the number of consecutive zeros. Moreover, the decoding of quantum codes can be described spectrally with more efficiency.

  8. ASTEC code development, validation and applications for severe accident management within the CESAM European project - 15392

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Chatelard, P.; Chevalier-Jabet, K.; Nowack, H.; Herranz, L.E.; Pascal, G.; Sanchez-Espinoza, V.H.

    2015-01-01

    ASTEC, jointly developed by IRSN and GRS, is considered the European reference code since it capitalizes on knowledge from European research in the domain. The CESAM project aims at its enhancement and extension for use in severe accident management (SAM) analysis of the Generation II-III nuclear power plants (NPPs) presently in operation or foreseen in the near future in Europe, spent fuel pools included. Within the CESAM project, 3 main types of research activities are performed: -) further validation of ASTEC models important for SAM, in particular for phenomena of importance in the Fukushima-Daiichi accidents, such as reflooding of degraded cores, pool scrubbing, hydrogen combustion, or spent fuel pool behaviour; -) modelling improvements, especially for BWRs or based on the feedback of validation tasks; and -) ASTEC applications to severe accident scenarios in European NPPs in order to assess prevention and mitigation measures. An important step will be reached with the next major ASTEC V2.1 version, planned to be delivered in the first part of 2015. Its main improvements will concern the possibility to simulate in detail the core degradation of BWRs and PHWRs, and a model of reflooding of severely degraded cores. A new user-friendly Graphical User Interface will be available for plant analyses.

  9. Development and validation of computer codes for analysis of PHWR containment behaviour

    International Nuclear Information System (INIS)

    Markandeya, S.G.; Haware, S.K.; Ghosh, A.K.; Venkat Raj, V.

    1997-01-01

    In order to ensure that the design intent of the containment of Indian Pressurised Heavy Water Reactors (IPHWRs) is met, both analytical and experimental studies are being pursued at BARC. As a part of analytical studies, computer codes for predicting the behaviour of containment under various accident scenarios are developed/adapted. These include codes for predicting 1) pressure, temperature transients in the containment following either Loss of Coolant Accident (LOCA) or Main Steam Line Break (MSLB), 2) hydrogen behaviour in respect of its distribution, combustion and the performance of proposed mitigation systems, and 3) behaviour of fission product aerosols in the piping circuits of the primary heat transport system and in the containment. All these codes have undergone thorough validation using data obtained from in-house test facilities or from international sources. Participation in the International Standard Problem (ISP) exercises has also helped in validation of the codes. The present paper briefly describes some of these codes and the various exercises performed for their validation. (author)

  10. A progressive data compression scheme based upon adaptive transform coding: Mixture block coding of natural images

    Science.gov (United States)

    Rost, Martin C.; Sayood, Khalid

    1991-01-01

    A method for efficiently coding natural images using a vector-quantized, variable-blocksize transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. The choice of coder for any given image region is made through a threshold-driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incorporating extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.
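The threshold-driven coder selection can be sketched in one dimension: try the cheapest truncated-DCT coder first and fall back to finer coders only when the distortion criterion fails. Block length, the coefficient counts, and the threshold below are illustrative choices, not values from the paper.

```python
# 1-D sketch of mixture block coding's coder selection: pick the coarsest
# truncated-DCT representation whose mean-squared error meets a threshold.
import math

def dct(x):                     # orthonormal DCT-II
    N = len(x)
    return [math.sqrt((1 if k == 0 else 2) / N) *
            sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N)
                for n in range(N))
            for k in range(N)]

def idct(X):                    # orthonormal DCT-III (exact inverse)
    N = len(X)
    return [sum(math.sqrt((1 if k == 0 else 2) / N) * X[k] *
                math.cos(math.pi * (n + 0.5) * k / N)
                for k in range(N))
            for n in range(N)]

def code_block(block, threshold):
    """Return (coefficients_kept, reconstruction) for the chosen coder."""
    X = dct(block)
    for kept in (2, 4, len(block)):          # cheap -> expensive coders
        trunc = X[:kept] + [0.0] * (len(block) - kept)
        rec = idct(trunc)
        mse = sum((a - b) ** 2 for a, b in zip(block, rec)) / len(block)
        if mse <= threshold:
            return kept, rec
    return len(block), idct(X)

smooth = [10.0, 10.5, 11.0, 11.5, 12.0, 12.5, 13.0, 13.5]
busy = [10.0, -3.0, 8.0, 0.0, 12.0, -7.0, 5.0, 1.0]
assert code_block(smooth, 0.1)[0] <= 4   # smooth block: cheap coder suffices
assert code_block(busy, 0.1)[0] == 8     # busy block: full coder needed
```

Smooth regions spend few bits, busy regions spend more, which is how MBC achieves variable-rate coding without per-block side information beyond the coder choice.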

  11. Intercomparison and validation of computer codes for thermalhydraulic safety analysis of heavy water reactors

    International Nuclear Information System (INIS)

    2004-08-01

    Activities within the frame of the IAEA's Technical Working Group on Advanced Technologies for HWRs (TWG-HWR) are conducted in a project within the IAEA's subprogramme on nuclear power reactor technology development. The objective of the activities on HWRs is to foster, within the frame of the TWG-HWR, information exchange and co-operative research on technology development for current and future HWRs, with an emphasis on safety, economics and fuel resource sustainability. One of the activities recommended by the TWG-HWR was an international standard problem exercise entitled: Intercomparison and validation of computer codes for thermalhydraulics safety analyses. Intercomparison and validation of computer codes used in different countries for thermalhydraulics safety analyses will enhance the confidence in the predictions made by these codes. However, the intercomparison and validation exercise needs a set of reliable experimental data. The RD-14M Large-Loss Of Coolant Accident (LOCA) test B9401 simulating HWR LOCA behaviour that was conducted by Atomic Energy of Canada Ltd (AECL) was selected for this validation project. This report provides a comparison of the results obtained from six participating countries, utilizing four different computer codes. General conclusions are reached and recommendations made

  12. Validation of a pre-coded food record for infants and young children

    DEFF Research Database (Denmark)

    Gondolf, Ulla Holmboe; Tetens, Inge; Hills, A. P.

    2012-01-01

    Background/Objectives:To assess the validity of a 7-day pre-coded food record (PFR) method in 9-month-old infants against metabolizable energy intake (ME(DLW)) measured by doubly labeled water (DLW); additionally to compare PFR with a 7-day weighed food record (WFR) in 9-month-old infants and 36...

  13. CFD Code Validation against Stratified Air-Water Flow Experimental Data

    Directory of Open Access Journals (Sweden)

    F. Terzuoli

    2008-01-01

    Full Text Available Pressurized thermal shock (PTS modelling has been identified as one of the most important industrial needs related to nuclear reactor safety. A severe PTS scenario limiting the reactor pressure vessel (RPV lifetime is the cold water emergency core cooling (ECC injection into the cold leg during a loss of coolant accident (LOCA. Since it represents a big challenge for numerical simulations, this scenario was selected within the European Platform for Nuclear Reactor Simulations (NURESIM Integrated Project as a reference two-phase problem for computational fluid dynamics (CFDs code validation. This paper presents a CFD analysis of a stratified air-water flow experimental investigation performed at the Institut de Mécanique des Fluides de Toulouse in 1985, which shares some common physical features with the ECC injection in PWR cold leg. Numerical simulations have been carried out with two commercial codes (Fluent and Ansys CFX, and a research code (NEPTUNE CFD. The aim of this work, carried out at the University of Pisa within the NURESIM IP, is to validate the free surface flow model implemented in the codes against experimental data, and to perform code-to-code benchmarking. Obtained results suggest the relevance of three-dimensional effects and stress the importance of a suitable interface drag modelling.

  14. Validation of thermal hydraulic computer codes for advanced light water reactor

    International Nuclear Information System (INIS)

    Macek, J.

    2001-01-01

    The Czech Republic operates 4 WWER-440 units; two WWER-1000 units are being finalised (one of them is undergoing commissioning). The Thermal-hydraulics Department of the Nuclear Research Institute Rez performs accident analyses for these plants using a number of computer codes. To model the primary and secondary circuit behaviour, the system codes ATHLET, CATHARE, RELAP and TRAC are applied. The containment and pressure-suppression system are modelled with the RALOC and MELCOR codes, the reactor power calculations (point and space neutron kinetics) are made with DYN3D and NESTLE, and CFD codes (FLUENT, TRIO) are used for some specific problems. An integral part of the current Czech project 'New Energy Sources' is the selection of a new nuclear source. Within this and the preceding projects financed by the Czech Ministry of Industry and Trade and the EU PHARE, the Department has carried out and continues the systematic validation of thermal-hydraulic and reactor physics computer codes, applying data obtained on several experimental facilities as well as real operational data. The paper provides concise information on these activities of the NRI and its Thermal-hydraulics Department. A detailed example of system code validation and the consequent utilisation of the results for real NPP purposes is included. (author)

  15. CFD Code Validation against Stratified Air-Water Flow Experimental Data

    International Nuclear Information System (INIS)

    Terzuoli, F.; Galassi, M.C.; Mazzini, D.; D'Auria, F.

    2008-01-01

    Pressurized thermal shock (PTS) modelling has been identified as one of the most important industrial needs related to nuclear reactor safety. A severe PTS scenario limiting the reactor pressure vessel (RPV) lifetime is the cold water emergency core cooling (ECC) injection into the cold leg during a loss of coolant accident (LOCA). Since it represents a big challenge for numerical simulations, this scenario was selected within the European Platform for Nuclear Reactor Simulations (NURESIM) Integrated Project as a reference two-phase problem for computational fluid dynamics (CFDs) code validation. This paper presents a CFD analysis of a stratified air-water flow experimental investigation performed at the Institut de Mecanique des Fluides de Toulouse in 1985, which shares some common physical features with the ECC injection in PWR cold leg. Numerical simulations have been carried out with two commercial codes (Fluent and Ansys CFX), and a research code (NEPTUNE CFD). The aim of this work, carried out at the University of Pisa within the NURESIM IP, is to validate the free surface flow model implemented in the codes against experimental data, and to perform code-to-code benchmarking. Obtained results suggest the relevance of three-dimensional effects and stress the importance of a suitable interface drag modelling

  16. Design validation of the ITER EC upper launcher according to codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Spaeh, Peter, E-mail: peter.spaeh@kit.edu [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); Aiello, Gaetano [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); Gagliardi, Mario [Karlsruhe Institute of Technology, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); F4E, Fusion for Energy, Joint Undertaking, Barcelona (Spain); Grossetti, Giovanni; Meier, Andreas; Scherer, Theo; Schreck, Sabine; Strauss, Dirk; Vaccaro, Alessandro [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); Weinhorst, Bastian [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany)

    2015-10-15

    Highlights: • A set of applicable codes and standards has been chosen for the ITER EC upper launcher. • For a particular component load combinations, failure modes and stress categorizations have been determined. • The design validation was performed in accordance with the “design by analysis”-approach of the ASME boiler and pressure vessel code section III. - Abstract: The ITER electron cyclotron (EC) upper launcher has passed the CDR (conceptual design review) in 2005 and the PDR (preliminary design review) in 2009 and is in its final design phase now. The final design will be elaborated by the European consortium ECHUL-CA with contributions from several research institutes in Germany, Italy, the Netherlands and Switzerland. Within this consortium KIT is responsible for the design of the structural components (the upper port plug, UPP) and also the design integration of the launcher. As the selection of applicable codes and standards was under discussion for the past decade, the conceptual and the preliminary design of the launcher structure were not elaborated in straight accordance with a particular code but with a variety of well-acknowledged engineering practices. For the final design it is compulsory to validate the design with respect to a typical engineering code in order to be compliant with the ITER quality and nuclear requirements and to get acceptance from the French regulator. This paper presents typical design validation of the closure plate, which is the vacuum and Tritium barrier and thus a safety relevant component of the upper port plug (UPP), performed with the ASME boiler and pressure vessel code. Rationales for choosing this code are given as well as a comparison between different design methods, like the “design by rule” and the “design by analysis” approach. Also the selections of proper load specifications and the identification of potential failure modes are covered. In addition to that stress categorizations, analyses

  17. Design validation of the ITER EC upper launcher according to codes and standards

    International Nuclear Information System (INIS)

    Spaeh, Peter; Aiello, Gaetano; Gagliardi, Mario; Grossetti, Giovanni; Meier, Andreas; Scherer, Theo; Schreck, Sabine; Strauss, Dirk; Vaccaro, Alessandro; Weinhorst, Bastian

    2015-01-01

    Highlights: • A set of applicable codes and standards has been chosen for the ITER EC upper launcher. • For a particular component load combinations, failure modes and stress categorizations have been determined. • The design validation was performed in accordance with the “design by analysis”-approach of the ASME boiler and pressure vessel code section III. - Abstract: The ITER electron cyclotron (EC) upper launcher has passed the CDR (conceptual design review) in 2005 and the PDR (preliminary design review) in 2009 and is in its final design phase now. The final design will be elaborated by the European consortium ECHUL-CA with contributions from several research institutes in Germany, Italy, the Netherlands and Switzerland. Within this consortium KIT is responsible for the design of the structural components (the upper port plug, UPP) and also the design integration of the launcher. As the selection of applicable codes and standards was under discussion for the past decade, the conceptual and the preliminary design of the launcher structure were not elaborated in straight accordance with a particular code but with a variety of well-acknowledged engineering practices. For the final design it is compulsory to validate the design with respect to a typical engineering code in order to be compliant with the ITER quality and nuclear requirements and to get acceptance from the French regulator. This paper presents typical design validation of the closure plate, which is the vacuum and Tritium barrier and thus a safety relevant component of the upper port plug (UPP), performed with the ASME boiler and pressure vessel code. Rationales for choosing this code are given as well as a comparison between different design methods, like the “design by rule” and the “design by analysis” approach. Also the selections of proper load specifications and the identification of potential failure modes are covered. In addition to that stress categorizations, analyses

  18. Non-binary unitary error bases and quantum codes

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E.

    1996-06-01

    Error operator bases for systems of any dimension are defined and natural generalizations of the bit-flip/sign-change error basis for qubits are given. These bases allow generalizing the construction of quantum codes based on eigenspaces of Abelian groups. As a consequence, quantum codes can be constructed from linear codes over {ital Z}{sub {ital n}} for any {ital n}. The generalization of the punctured code construction leads to many codes which permit transversal (i.e. fault-tolerant) implementations of certain operations compatible with the error basis.
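The natural generalization mentioned above is commonly realized by the shift and clock operators over Z_n, whose n² products X^a Z^b form a unitary error basis that reduces to the Pauli basis for n = 2. A small numerical check of their Weyl commutation relation (the sign convention may differ from the paper's):

```python
# Shift (X) and clock (Z) operators on an n-dimensional system:
#   X|j> = |j+1 mod n>,   Z|j> = w^j |j>,   w = exp(2*pi*i/n).
# They satisfy X Z = w^{-1} Z X, generalizing bit-flip/sign-change.
import cmath

def shift_clock(n):
    w = cmath.exp(2j * cmath.pi / n)
    X = [[1 if r == (c + 1) % n else 0 for c in range(n)] for r in range(n)]
    Z = [[w ** r if r == c else 0 for c in range(n)] for r in range(n)]
    return X, Z, w

def matmul(A, B):
    n = len(A)
    return [[sum(A[r][k] * B[k][c] for k in range(n)) for c in range(n)]
            for r in range(n)]

n = 3
X, Z, w = shift_clock(n)
XZ, ZX = matmul(X, Z), matmul(Z, X)
# Weyl commutation relation: each entry of X Z equals (Z X) / w.
ok = all(abs(XZ[r][c] - ZX[r][c] / w) < 1e-12
         for r in range(n) for c in range(n))
```

Stabilizer-style code constructions over Z_n pick commuting subsets of these X^a Z^b operators, which is where the link to linear codes over Z_n enters.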

  19. System code improvements for modelling passive safety systems and their validation

    Energy Technology Data Exchange (ETDEWEB)

    Buchholz, Sebastian; Cron, Daniel von der; Schaffrath, Andreas [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany)

    2016-11-15

    GRS has been developing the system code ATHLET over many years. Because ATHLET, among other codes, is widely used in nuclear licensing and supervisory procedures, it has to represent the current state of science and technology. New reactor concepts such as Generation III+ and IV reactors and SMRs make intensive use of passive safety systems. The simulation of passive safety systems with the GRS system code ATHLET is still a big challenge because of non-defined operation points and self-setting operation conditions. Additionally, the driving forces of passive safety systems are smaller, and uncertainties of parameters have a larger impact than for active systems. This paper addresses the code validation and qualification work on ATHLET using the example of slightly inclined horizontal heat exchangers, which are used, e.g., as emergency condensers (as in the KERENA and the CAREM) or as heat exchangers in the passive auxiliary feedwater system (PAFS) of the APR+.

  20. Initial verification and validation of RAZORBACK - A research reactor transient analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
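The point reactor kinetics piece of such a coupled solution can be sketched with one delayed-neutron group and explicit Euler integration. All parameter values below are generic illustrations, not ACRR or Razorback data.

```python
# One-delayed-group point kinetics:
#   dP/dt = (rho - beta)/Lambda * P + lam * C
#   dC/dt = beta/Lambda * P - lam * C
# integrated with explicit Euler from an equilibrium initial condition.
def point_kinetics(rho, beta=0.0073, lam=0.08, Lambda=3e-5,
                   dt=1e-5, t_end=0.1):
    P = 1.0                           # normalized power
    C = beta * P / (Lambda * lam)     # precursors at equilibrium
    t = 0.0
    while t < t_end:
        dP = ((rho(t) - beta) / Lambda * P + lam * C) * dt
        dC = (beta / Lambda * P - lam * C) * dt
        P, C, t = P + dP, C + dC, t + dt
    return P

steady = point_kinetics(lambda t: 0.0)          # zero reactivity
step = point_kinetics(lambda t: 0.1 * 0.0073)   # +0.1 dollar step
```

At zero reactivity the equilibrium is preserved; a small positive step produces the familiar prompt jump followed by a slow rise governed by the precursors.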

  1. Development of authentication code for multi-access optical code division multiplexing based quantum key distribution

    Science.gov (United States)

    Taiwo, Ambali; Alnassar, Ghusoon; Bakar, M. H. Abu; Khir, M. F. Abdul; Mahdi, Mohd Adzir; Mokhtar, M.

    2018-05-01

    A one-weight authentication code for multi-user quantum key distribution (QKD) is proposed. The code is developed for an Optical Code Division Multiplexing (OCDMA) based QKD network. A unique address assigned to each user, coupled with the degrading probability of predicting the source of a qubit transmitted in the channel, offers an excellent security mechanism against any form of channel attack on an OCDMA-based QKD network. Flexibility in design, as well as ease of modifying the number of users, are equally exceptional qualities presented by the code in contrast to the Optical Orthogonal Code (OOC) earlier implemented for the same purpose. The code was successfully applied to eight simultaneous users at an effective key rate of 32 bps over a 27 km transmission distance.

  2. Uncertainty propagation applied to multi-scale thermal-hydraulics coupled codes. A step towards validation

    Energy Technology Data Exchange (ETDEWEB)

    Geffray, Clotaire Clement

    2017-03-20

    The work presented here constitutes an important step towards validating the use of coupled system thermal-hydraulics and computational fluid dynamics codes for the simulation of complex flows in liquid-metal-cooled pool-type facilities. First, a set of methods suited to uncertainty and sensitivity analysis and validation activities, with regard to the specific constraints of work with coupled and expensive-to-run codes, is proposed. Then, these methods are applied to the ATHLET - ANSYS CFX model of the TALL-3D facility. Several transients performed at that facility are investigated. The results are presented, discussed and compared to the experimental data. Finally, assessments of the validity of the selected methods and of the quality of the model are offered.

  3. Validation of ICD-9 Codes for Stable Miscarriage in the Emergency Department.

    Science.gov (United States)

    Quinley, Kelly E; Falck, Ailsa; Kallan, Michael J; Datner, Elizabeth M; Carr, Brendan G; Schreiber, Courtney A

    2015-07-01

    International Classification of Disease, Ninth Revision (ICD-9) diagnosis codes have not been validated for identifying cases of missed abortion, where a pregnancy is no longer viable but the cervical os remains closed. Our goal was to assess whether ICD-9 code "632" for missed abortion has high sensitivity and positive predictive value (PPV) in identifying patients in the emergency department (ED) with cases of stable early pregnancy failure (EPF). We studied females ages 13-50 years presenting to the ED of an urban academic medical center. We approached our analysis from two perspectives, evaluating both the sensitivity and PPV of ICD-9 code "632" in identifying patients with stable EPF. All patients with chief complaints "pregnant and bleeding" or "pregnant and cramping" over a 12-month period were identified. We randomly reviewed two months of patient visits and calculated the sensitivity of ICD-9 code "632" for true cases of stable miscarriage. To establish the PPV of ICD-9 code "632" for capturing missed abortions, we identified patients whose visits from the same time period were assigned ICD-9 code "632," and identified those with actual cases of stable EPF. We reviewed 310 patient records (17.6% of 1,762 sampled). Thirteen of the 31 true cases of stable EPF were assigned the ICD-9 code for missed abortion (sensitivity=41.9%), and 140 of the 142 patients without EPF were not assigned the ICD-9 code "632" (specificity=98.6%). Of the 52 eligible patients identified by ICD-9 code "632," 39 cases met the criteria for stable EPF (PPV=75.0%). ICD-9 code "632" has low sensitivity for identifying stable EPF, but its high specificity and moderately high PPV are valuable for studying cases of stable EPF in epidemiologic studies using administrative data.
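The reported accuracy figures follow directly from the counts given in the abstract; note that the sensitivity/specificity review and the PPV review used different samples, so the three ratios are computed independently rather than from a single 2x2 table.

```python
# Reproducing the abstract's validation statistics from its raw counts.
def rate(hits, total):
    return hits / total

sensitivity = rate(13, 31)    # true stable-EPF cases carrying code "632"
specificity = rate(140, 142)  # non-EPF visits without the code
ppv = rate(39, 52)            # code-"632" visits that were true stable EPF

assert round(sensitivity, 3) == 0.419
assert round(specificity, 3) == 0.986
assert ppv == 0.75
```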

  4. Verification and validation of the THYTAN code for the graphite oxidation analysis in the HTGR systems

    International Nuclear Information System (INIS)

    Shimazaki, Yosuke; Isaka, Kazuyoshi; Nomoto, Yasunobu; Seki, Tomokazu; Ohashi, Hirofumi

    2014-12-01

The analytical models for the evaluation of graphite oxidation were implemented into the THYTAN code, which employs a mass balance and a node-link computational scheme to evaluate tritium behavior in High Temperature Gas-cooled Reactor (HTGR) systems for hydrogen production, in order to analyze graphite oxidation during air or water ingress accidents in HTGR systems. This report describes the analytical models of the THYTAN code for graphite oxidation analysis and their verification and validation (V and V) results. Mass transfer from the gas mixture in the coolant channel to the graphite surface, diffusion in the graphite, graphite oxidation by air or water, chemical reaction, and release from the primary circuit to the containment vessel through a safety valve were modeled to calculate the mass balance in the graphite and in the gas mixture in the coolant channel. The computed solutions from the THYTAN code for simple problems were compared to analytical results obtained by hand calculation to verify the algorithms for each implemented model. A graphite oxidation experiment was then analyzed using the THYTAN code, and the results were compared to the experimental data and to the computed solutions from the GRACE code, which was used for the safety analysis of the High Temperature Engineering Test Reactor (HTTR), with regard to the corrosion depth of the graphite and the oxygen concentration at the outlet of the test section, in order to validate the analytical models of the THYTAN code. The THYTAN code results showed good agreement with the analytical solutions, the experimental data and the GRACE code results. (author)

  5. A Framework for Reverse Engineering Large C++ Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Byelas, Heorhiy; Voinea, Lucian

    2009-01-01

    When assessing the quality and maintainability of large C++ code bases, tools are needed for extracting several facts from the source code, such as: architecture, structure, code smells, and quality metrics. Moreover, these facts should be presented in such ways so that one can correlate them and

  6. A Framework for Reverse Engineering Large C++ Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Byelas, Heorhiy; Voinea, Lucian

    2008-01-01

    When assessing the quality and maintainability of large C++ code bases, tools are needed for extracting several facts from the source code, such as: architecture, structure, code smells, and quality metrics. Moreover, these facts should be presented in such ways so that one can correlate them and

  7. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

Fuel performance codes approximate the complex behavior of nuclear fuel using an axisymmetric, axially-stacked, one-dimensional radial representation to save computation cost. However, the need for improved modeling of pellet-cladding mechanical interaction (PCMI) and, in particular, the importance of multidimensional capability for accurate fuel performance simulation have been identified as safety margins decrease. The finite element (FE) method, a reliable and proven solution technique in the mechanical field, has therefore been introduced into fuel performance codes for multidimensional analysis. The present state of the art in FE-based fuel performance simulation predominantly involves 2-D axisymmetric models and 3-D volumetric models. The FRAPCON and FRAPTRAN codes contain 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ) fully thermo-mechanically coupled steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D and are typically used to investigate localized behavior. Since 2008, the Idaho National Laboratory (INL) has been developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on this investigation of the codes, requirements and directions of development for a new FE-based fuel performance code can be discussed. A new FE-based fuel performance code should include the typical pellet and cladding models that all the codes contain. In particular, specialized pellet and cladding models, such as gaseous swelling and high burnup structure (HBS) models, should be developed to improve the accuracy of the code as well as to cover accident (AC) conditions. To reduce computation cost, approximate gap and optimized contact models should also be developed

  8. Manual versus automated coding of free-text self-reported medication data in the 45 and Up Study: a validation study.

    Science.gov (United States)

    Gnjidic, Danijela; Pearson, Sallie-Anne; Hilmer, Sarah N; Basilakis, Jim; Schaffer, Andrea L; Blyth, Fiona M; Banks, Emily

    2015-03-30

    Increasingly, automated methods are being used to code free-text medication data, but evidence on the validity of these methods is limited. To examine the accuracy of automated coding of previously keyed in free-text medication data compared with manual coding of original handwritten free-text responses (the 'gold standard'). A random sample of 500 participants (475 with and 25 without medication data in the free-text box) enrolled in the 45 and Up Study was selected. Manual coding involved medication experts keying in free-text responses and coding using Anatomical Therapeutic Chemical (ATC) codes (i.e. chemical substance 7-digit level; chemical subgroup 5-digit; pharmacological subgroup 4-digit; therapeutic subgroup 3-digit). Using keyed-in free-text responses entered by non-experts, the automated approach coded entries using the Australian Medicines Terminology database and assigned corresponding ATC codes. Based on manual coding, 1377 free-text entries were recorded and, of these, 1282 medications were coded to ATCs manually. The sensitivity of automated coding compared with manual coding was 79% (n = 1014) for entries coded at the exact ATC level, and 81.6% (n = 1046), 83.0% (n = 1064) and 83.8% (n = 1074) at the 5, 4 and 3-digit ATC levels, respectively. The sensitivity of automated coding for blank responses was 100% compared with manual coding. Sensitivity of automated coding was highest for prescription medications and lowest for vitamins and supplements, compared with the manual approach. Positive predictive values for automated coding were above 95% for 34 of the 38 individual prescription medications examined. Automated coding for free-text prescription medication data shows very high to excellent sensitivity and positive predictive values, indicating that automated methods can potentially be useful for large-scale, medication-related research.
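The digit-level agreement described above (exact 7-digit chemical substance match down to the 3-digit therapeutic subgroup) amounts to prefix comparison of ATC codes. A hedged sketch, with illustrative codes rather than study data:

```python
# Hedged sketch of ATC prefix matching at the levels used in the study
# (3-, 4-, 5- and 7-digit); the example codes below are illustrative,
# not taken from the study data.
def atc_match_levels(auto_code, manual_code):
    """Return the ATC levels (as prefix lengths) at which two codes agree."""
    return [n for n in (3, 4, 5, 7) if auto_code[:n] == manual_code[:n]]

# paracetamol (N02BE01) vs a code agreeing down to the 5-digit subgroup
print(atc_match_levels("N02BE01", "N02BE51"))  # [3, 4, 5]
```

This prefix structure is why the abstract's sensitivities rise monotonically from the exact level (79%) to the 3-digit level (83.8%): any exact match is also a match at every coarser level.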

  9. Validation of a Subchannel Analysis Code MATRA Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae Hyun; Seo, Kyung Won; Kwon, Hyouk

    2008-10-15

A subchannel analysis code, MATRA, has been developed for the thermal-hydraulic analysis of the SMART core. The governing equations and important models were established, and validation calculations were performed for subchannel flow and enthalpy distributions in rod bundles under steady-state conditions. The governing equations of MATRA are based on the integral balance equations of the two-phase mixture. The effects of non-homogeneous and non-equilibrium states were considered by employing a subcooled boiling model and a phasic slip model. The solution scheme and main structure of the MATRA code, as well as the differences between the MATRA and COBRA-IV-I codes, are summarized. Eight different test data sets were employed for the validation of the MATRA code. The collected data consisted of single-phase subchannel flow and temperature distribution data, single-phase inlet flow maldistribution data, single-phase partial flow blockage data, and two-phase subchannel flow and enthalpy distribution data. The prediction accuracy, as well as the limitations, of the MATRA code was evaluated from this analysis.

  10. CSNI Integral Test Facility Matrices for Validation of Best-Estimate Thermal-Hydraulic Computer Codes

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

Internationally agreed Integral Test Facility (ITF) matrices for the validation of realistic thermal-hydraulic system computer codes have been established, mainly for Pressurised Water Reactors (PWRs) and Boiling Water Reactors (BWRs); a separate activity covered Russian Pressurised Water-cooled and Water-moderated Energy Reactors (WWERs). Firstly, the main physical phenomena that occur during the considered accidents were identified, test types were specified, and test facilities suitable for reproducing these aspects were selected. Secondly, a list of selected experiments carried out in these facilities was set down. The criteria used to achieve these objectives are outlined, and some specific examples from the ITF matrices are provided in this paper. The matrices will serve as a guide for code validation, as a basis for comparisons of code predictions performed with different system codes, and as a contribution to the quantification of the uncertainty range of code model predictions. In addition to this objective, the construction of such a matrix is an attempt to record information generated around the world over recent years, so that it is more accessible to present and future workers in the field than would otherwise be the case.

  11. Application of advanced validation concepts to oxide fuel performance codes: LIFE-4 fast-reactor and FRAPCON thermal-reactor fuel performance codes

    Energy Technology Data Exchange (ETDEWEB)

    Unal, C., E-mail: cu@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Williams, B.J. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Yacout, A. [Argonne National Laboratory, 9700 S. Cass Avenue, Lemont, IL 60439 (United States); Higdon, D.M. [Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States)

    2013-10-15

    Highlights: ► The application of advanced validation techniques (sensitivity, calibration and prediction) to nuclear performance codes FRAPCON and LIFE-4 is the focus of the paper. ► A sensitivity ranking methodology narrows down the number of selected modeling parameters from 61 to 24 for FRAPCON and from 69 to 35 for LIFE-4. ► Fuel creep, fuel thermal conductivity, fission gas transport/release, crack/boundary, and fuel gap conductivity models of LIFE-4 are identified for improvements. ► FRAPCON sensitivity results indicated the importance of the fuel thermal conduction and the fission gas release models. -- Abstract: Evolving nuclear energy programs expect to use enhanced modeling and simulation (M and S) capabilities, using multiscale, multiphysics modeling approaches, to reduce both cost and time from the design through the licensing phases. Interest in the development of the multiscale, multiphysics approach has increased in the last decade because of the need for predictive tools for complex interacting processes as a means of eliminating the limited use of empirically based model development. Complex interacting processes cannot be predicted by analyzing each individual component in isolation. In most cases, the mathematical models of complex processes and their boundary conditions are nonlinear. As a result, the solutions of these mathematical models often require high-performance computing capabilities and resources. The use of multiscale, multiphysics (MS/MP) models in conjunction with high-performance computational software and hardware introduces challenges in validating these predictive tools—traditional methodologies will have to be modified to address these challenges. The advanced MS/MP codes for nuclear fuels and reactors are being developed within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program of the US Department of Energy (DOE) – Nuclear Energy (NE). 
This paper does not directly address challenges in calibration/validation

  12. Development of Monte Carlo-based pebble bed reactor fuel management code

    International Nuclear Information System (INIS)

    Setiadipura, Topan; Obara, Toru

    2014-01-01

Highlights: • A new Monte Carlo-based fuel management code for OTTO-cycle pebble bed reactors was developed. • The double heterogeneity was modeled using a statistical method in the MVP-BURN code. • The code can analyze both the equilibrium and the non-equilibrium phase. • Code-to-code comparisons for a Once-Through-Then-Out case were investigated. • The ability of the code to accommodate a void cavity was confirmed. - Abstract: A fuel management code for pebble bed reactors (PBRs) based on the Monte Carlo method has been developed in this study. The code, named Monte Carlo burnup analysis code for PBR (MCPBR), enables simulation of the Once-Through-Then-Out (OTTO) cycle of a PBR from the running-in phase to the equilibrium condition. In MCPBR, a burnup calculation based on a continuous-energy Monte Carlo code, MVP-BURN, is coupled with an additional utility code so that the OTTO cycle of a PBR can be simulated. MCPBR has several advantages in modeling PBRs, namely its Monte Carlo neutron transport modeling, its capability of explicitly modeling the double heterogeneity of the PBR core, and its ability to model different axial fuel speeds in the PBR core. Analysis of the equilibrium condition of a simplified PBR was used as the validation test of MCPBR. The calculation results of the code were compared with the results of the diffusion-based PBR fuel management codes VSOP and PEBBED. Using the JENDL-4.0 nuclide library, MCPBR gave keff values 4.15% and 3.32% lower than those of VSOP and PEBBED, respectively, while using JENDL-3.3 it gave keff values 2.22% and 3.11% higher than those of VSOP and PEBBED, respectively. The ability of MCPBR to analyze neutron transport in the top void of the PBR core, and its effects, was also confirmed.
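A note on the keff comparisons above: code-to-code discrepancies of this kind are commonly expressed as a reactivity difference in pcm (per cent mille). A small sketch of that conversion (our illustration, not from the paper):

```python
# Sketch (our illustration, not from the paper): k_eff discrepancies
# between codes are often quoted as a reactivity difference in pcm,
# delta_rho = (1/k_ref - 1/k_test) * 1e5.
def reactivity_diff_pcm(k_test, k_ref):
    return (1.0 / k_ref - 1.0 / k_test) * 1e5

# a k_eff 1% below a reference of 1.0 is roughly -1000 pcm
print(round(reactivity_diff_pcm(0.99, 1.00)))  # -1010
```

The percent differences quoted in the abstract (2-4%) therefore correspond to reactivity differences of a few thousand pcm, which is why nuclide-library choice dominates the comparison.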

  13. Validation of the WIMSD4M cross-section generation code with benchmark results

    International Nuclear Information System (INIS)

    Deen, J.R.; Woodruff, W.L.; Leal, L.E.

    1995-01-01

The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D2O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.

  14. Validation of the WIMSD4M cross-section generation code with benchmark results

    Energy Technology Data Exchange (ETDEWEB)

    Deen, J.R.; Woodruff, W.L. [Argonne National Lab., IL (United States); Leal, L.E. [Oak Ridge National Lab., TN (United States)

    1995-01-01

The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D2O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.

  15. Large leak sodium-water reaction code SWACS and its validation

    International Nuclear Information System (INIS)

    Miyake, O.; Shindo, Y.; Hiroi, H.; Tanabe, H.; Sato, M.

    1982-01-01

A computer code, SWACS, for analyzing large-leak accidents in LMFBR steam generators has been developed and validated. Data from five tests performed in the SWAT-3 test facility were compared with code results. In each SWAT-3 test, a double-ended guillotine rupture of one tube was simulated in a helical-coil steam generator model with a test vessel scaled 1/2.5 relative to the prototype SG. The analytical results, including the initial pressure spike, the propagated pressure in the secondary system, and the quasi-steady pressure, indicate that the overall large-leak event can be predicted with reasonably good agreement

  16. Results of a survey on accident and safety analysis codes, benchmarks, verification and validation methods

    International Nuclear Information System (INIS)

    Lee, A.G.; Wilkin, G.B.

    1995-01-01

    This report is a compilation of the information submitted by AECL, CIAE, JAERI, ORNL and Siemens in response to a need identified at the 'Workshop on R and D Needs' at the IGORR-3 meeting. The survey compiled information on the national standards applied to the Safety Quality Assurance (SQA) programs undertaken by the participants. Information was assembled for the computer codes and nuclear data libraries used in accident and safety analyses for research reactors and the methods used to verify and validate the codes and libraries. Although the survey was not comprehensive, it provides a basis for exchanging information of common interest to the research reactor community

  17. Validation of coding algorithms for the identification of patients hospitalized for alcoholic hepatitis using administrative data.

    Science.gov (United States)

    Pang, Jack X Q; Ross, Erin; Borman, Meredith A; Zimmer, Scott; Kaplan, Gilaad G; Heitman, Steven J; Swain, Mark G; Burak, Kelly W; Quan, Hude; Myers, Robert P

    2015-09-11

Epidemiologic studies of alcoholic hepatitis (AH) have been hindered by the lack of a validated International Classification of Disease (ICD) coding algorithm for use with administrative data. Our objective was to validate coding algorithms for AH using a hospitalization database. The Hospital Discharge Abstract Database (DAD) was used to identify consecutive adults (≥18 years) hospitalized in the Calgary region with a diagnosis code for AH (ICD-10, K70.1) between 01/2008 and 08/2012. Medical records were reviewed to confirm the diagnosis of AH, defined as a history of heavy alcohol consumption, elevated AST and/or ALT, elevated bilirubin (>34 μmol/L), and elevated INR. Subgroup analyses were performed according to the diagnosis field in which the code was recorded (primary vs. secondary) and AH severity. Algorithms that incorporated ICD-10 codes for cirrhosis and its complications were also examined. Of 228 potential AH cases, 122 patients had confirmed AH, corresponding to a positive predictive value (PPV) of 54% (95% CI 47-60%). PPV improved when AH was the primary rather than a secondary diagnosis (67% vs. 21%). Algorithms that also incorporated codes for ascites (PPV 75%; 95% CI 63-86%), cirrhosis (PPV 60%; 47-73%), and gastrointestinal hemorrhage (PPV 62%; 51-73%) had improved performance; however, the prevalence of these diagnoses in confirmed AH cases was low (29-39%). In conclusion, the low PPV of the diagnosis code for AH suggests that caution is necessary if this hospitalization database is used in large-scale epidemiologic studies of this condition.
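The overall PPV and confidence interval reported above can be approximated from the raw counts with a normal (Wald) interval; the sketch below is our illustration, and the study may have used a different interval method:

```python
import math

# Sketch reproducing the PPV and 95% CI quoted above with a normal
# (Wald) approximation; the study may have used a different interval.
def ppv_with_ci(true_pos, coded_pos, z=1.96):
    p = true_pos / coded_pos
    half = z * math.sqrt(p * (1.0 - p) / coded_pos)
    return p, p - half, p + half

p, lo, hi = ppv_with_ci(122, 228)   # 122 confirmed of 228 coded
print(f"PPV {p:.0%} (95% CI {lo:.0%}-{hi:.0%})")  # PPV 54% (95% CI 47%-60%)
```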

  18. Validation of VHTRC calculation benchmark of critical experiment using the MCB code

    Directory of Open Access Journals (Sweden)

    Stanisz Przemysław

    2016-01-01

The calculation benchmark problem of the Very High Temperature Reactor Critical (VHTRC) assembly, a pin-in-block type core critical assembly, has been investigated with the Monte Carlo Burnup (MCB) code in order to validate the latest version of the Nuclear Data Library based on the ENDF format. The benchmark was executed on the basis of the VHTRC benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying discrepancies in keff values between various libraries and experimental values, which helps to improve the accuracy of neutron transport calculations and may assist in designing high-performance commercial VHTRs. Almost all safety parameters depend on the accuracy of the neutron transport calculation results, which in turn depend on the accuracy of the nuclear data libraries. Thus, evaluating the applicability of the libraries to VHTR modelling is an important subject. We compared the numerical results with experimental measurements using two versions of the available nuclear data (ENDF/B-VII.1 and JEFF-3.2) prepared for the required temperatures. Calculations were performed with the MCB code, which allows a very precise representation of the complex VHTR geometry, including the double heterogeneity of a fuel element. In this paper, together with the impact of nuclear data, we also discuss the impact of different lattice modelling inside the fuel pins. The keff discrepancies show good agreement with each other and with the experimental data within the 1σ range of the experimental uncertainty. Because some propagated discrepancies were observed, we proposed appropriate corrections to the experimental constants which can improve the reactivity coefficient dependency. The obtained results confirm the accuracy of the new Nuclear Data Libraries.

  19. A validation study of the BURNUP and associated options of the MONTE CARLO neutronics code MONK5W

    International Nuclear Information System (INIS)

    Howard, E.A.

    1985-11-01

    This is a report on the validation of the burnup option of the Monte Carlo Neutronics Code MONK5W, together with the associated facilities which allow for control rod movements and power changes. The validation uses reference solutions produced by the Deterministic Neutronics Code LWR-WIMS for a 2D model which represents a whole reactor calculation with control rod movements. (author)

  20. Validation of a new continuous Monte Carlo burnup code using a Mox fuel assembly

    International Nuclear Information System (INIS)

    El bakkari, B.; El Bardouni, T.; Merroun, O.; El Younoussi, C.; Boulaich, Y.; Boukhal, H.; Chakir, E.

    2009-01-01

The reactivity of nuclear fuel decreases with irradiation (or burnup) due to the transformation of heavy nuclides and the formation of fission products. Burnup credit studies aim at accounting for fuel irradiation in criticality studies of the nuclear fuel cycle (transport, storage, etc.). The principal objective of this study is to evaluate the potential capabilities of a newly developed burnup code called BUCAL1. BUCAL1 differs from other burnup codes in that it does not pass the calculated neutron flux to other computer codes to generate the nuclide inventory for the next time step. Instead, BUCAL1 directly uses the neutron reaction tally information generated by MCNP for each nuclide of interest to determine the new nuclide inventory. This allows the full capabilities of MCNP to be incorporated into the calculation and a more accurate and robust analysis to be performed. Validation of BUCAL1 proceeded by code-to-code comparisons using predictions of several codes from the NEA/OECD. Infinite multiplication factors (k∞) and important fission product and actinide concentrations were compared for a MOX core benchmark exercise. The results of the calculations are analysed and discussed.

  1. Development and validation of ALEPH Monte Carlo burn-up code

    International Nuclear Information System (INIS)

    Stankovskiy, A.; Van den Eynde, G.; Vidmar, T.

    2011-01-01

The Monte Carlo burnup code ALEPH has been under development at SCK-CEN since 2004. Belonging to the category of shells coupling Monte Carlo transport (MCNP or MCNPX) with 'deterministic' depletion codes (ORIGEN-2.2), ALEPH possesses some unique features that distinguish it from other codes. The most important feature is full data consistency between the steady-state Monte Carlo and the time-dependent depletion calculations. Recent improvements of ALEPH concern the full implementation of general-purpose nuclear data libraries (JEFF-3.1.1, ENDF/B-VII, JENDL-3.3). The upgraded version of the code is capable of treating isomeric branching ratios, neutron-induced fission product yields, spontaneous fission yields and the energy release per fission recorded in ENDF-formatted data files. An alternative algorithm for the time evolution of nuclide concentrations has been added. A predictor-corrector mechanism and the calculation of nuclear heating are available as well. The code has been validated against the results of the REBUS experimental programme. The upgraded version of ALEPH shows better agreement with the measured data than other codes, including the previous version of ALEPH. (authors)
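As an illustration of the predictor-corrector mechanism mentioned above, here is a minimal single-nuclide depletion step (our sketch with an assumed concentration-dependent reaction rate; it is not ALEPH's actual algorithm):

```python
import math

# Minimal sketch of a predictor-corrector depletion step (the mechanism
# the record mentions), for a single nuclide with dN/dt = -r(N)*N and an
# assumed concentration-dependent reaction rate r(N); our illustration,
# not ALEPH's actual algorithm.
def pc_step(n0, rate_of, dt):
    r0 = rate_of(n0)
    n_pred = n0 * math.exp(-r0 * dt)       # predictor: beginning-of-step rate
    r_avg = 0.5 * (r0 + rate_of(n_pred))   # corrector: rate averaged over step
    return n0 * math.exp(-r_avg * dt)

# With a constant rate the step reproduces the exact exponential decay.
print(pc_step(1.0, lambda n: 0.1, 1.0))  # exp(-0.1), about 0.9048
```

The point of the corrector is that the reaction rate is re-evaluated at the predicted end-of-step composition, so a rate that changes over the step is captured to second order rather than frozen at its initial value.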

  2. Development and validation of the ENIGMA code for MOX fuel performance modelling

    International Nuclear Information System (INIS)

    Palmer, I.; Rossiter, G.; White, R.J.

    2000-01-01

The ENIGMA fuel performance code has been under development in the UK since the mid-1980s with contributions made by both the fuel vendor (BNFL) and the utility (British Energy). In recent years it has become the principal code for UO2 fuel licensing for both PWR and AGR reactor systems in the UK and has also been used by BNFL in support of overseas UO2 and MOX fuel business. A significant new programme of work has recently been initiated by BNFL to further develop the code specifically for MOX fuel application. Model development is proceeding hand in hand with a major programme of MOX fuel testing and PIE studies, with the objective of producing a fuel modelling code suitable for mechanistic analysis, as well as for licensing applications. This paper gives an overview of the model developments being undertaken and of the experimental data being used to underpin and to validate the code. The paper provides a summary of the code development programme together with specific examples of new models produced. (author)

  3. Contributions to the validation of advanced codes for accident analysis calculations with 3-dimensional neutron kinetics. STC with the Ukraine. Final report

    International Nuclear Information System (INIS)

    Grundmann, U.; Kliem, S.; Rohde, U.; Khalimonchuk, V.; Kuchin, A.; Seidel, A.

    2000-10-01

In the frame of a project of scientific-technical cooperation funded by BMBF/BMWi, the coupled code ATHLET-DYN3D was transferred to the Scientific and Technical Centre on Nuclear and Radiation Safety in Kiev (Ukraine). This program code is an implementation of the 3D core model DYN3D, developed by FZR, into the GRS thermohydraulics code system ATHLET. For the purpose of validating this coupled code, a measurement database was generated, in which suitable experimental data on operational transients from NPPs are collected. The data collection and documentation were performed in accordance with a directive on the requirements for measurement data for code validation, which was elaborated within the project. Validation calculations were performed for two selected transients, and their results were compared with measured values from the database. In addition, the code DYN3D was extended with a subroutine for calculating reactivity coefficients; using this modification, reactivity contributions during different operational processes can be investigated. (orig.)

  4. Examples of Use of SINBAD Database for Nuclear Data and Code Validation

    Science.gov (United States)

    Kodeli, Ivan; Žerovnik, Gašper; Milocco, Alberto

    2017-09-01

The SINBAD database currently contains compilations and evaluations of over 100 shielding benchmark experiments. The SINBAD database is widely used for code and data validation. Materials covered include: air, N, O, H2O, Al, Be, Cu, graphite, concrete, Fe, stainless steel, Pb, Li, Ni, Nb, SiC, Na, W, V and mixtures thereof. Over 40 organisations from 14 countries and 2 international organisations have contributed data and work in support of SINBAD. Examples of the use of the database in the scope of different international projects, such as the Working Party on Evaluation Cooperation of the OECD and the European Fusion Programme, demonstrate the merit and possible usage of the database for the validation of modern nuclear data evaluations and new computer codes.

  5. AEEW comments on the NNC/CEGB LOCA code validation report RX 440-A

    International Nuclear Information System (INIS)

    Brittain, I.; Bryce, W.M.; O'Mahoney, R.; Richards, C.G.; Gibson, I.H.; Porter, W.H.L.; Fell, J.

    1984-03-01

    Comments are made on the NNC/CEGB report PWR/RX 440-A, Review of Validation for the ECCS Evaluation Model Codes, by K.T. Routledge et al, 1982. This set out to review methods and models used in the LOCA safety case for Sizewell B. These methods are embodied in the Evaluation Model Computer codes SATAN-VI, WREFLOOD, WFLASH, LOCTA-IV and COCO. The main application of these codes is the determination of peak clad temperature and overall containment pressure. The comments represent the views of a group which has been involved for a number of years in the development and application of Best-Estimate methods for LOCA analysis. It is the judgement of this group that, overall, the EM methods can be used to make an acceptable safety case, but there are a number of points of detail still to be resolved. (U.K.)

  6. Validation of a CFD code for Unsteady Flows with cyclic boundary Conditions

    International Nuclear Information System (INIS)

    Kim, Jong-Tae; Kim, Sang-Baik; Lee, Won-Jae

    2006-01-01

    Currently the LILAC code is under development to analyze the thermo-hydraulics of a high-temperature gas-cooled reactor (GCR). Interesting thermo-hydraulic phenomena in a nuclear reactor are usually unsteady and turbulent, and the analysis of unsteady flows with a three-dimensional CFD code is time-consuming if the flow domain is very large. Fortunately, many flow domains encountered in nuclear thermo-hydraulics are periodic, so it is advantageous to exploit this geometrical characteristic to reduce the computational resources required. To gain the benefit of a reduced computational domain, especially for calculations of unsteady flows, cyclic boundary conditions have been implemented in the parallelized CFD code LILAC. In this study, the parallelized cyclic boundary conditions are validated by solving unsteady laminar and turbulent flows past a circular cylinder.
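
    The effect of a cyclic boundary can be illustrated with a toy example (this is not LILAC's discretization; the function name and scheme are illustrative assumptions): a first-order upwind advection solver on a 1D domain whose last cell wraps back into the first, exactly as ghost cells would be exchanged between the two ends of a periodic flow domain.

```python
import numpy as np

def advect_periodic(u0, c, dx, dt, steps):
    """First-order upwind advection of u0 on a periodic 1D domain.

    The cyclic boundary condition is applied by wrapping the stencil
    with np.roll, so the last cell feeds the first -- the 1D analogue
    of exchanging ghost-cell data between the two periodic ends.
    Assumes c > 0 and a Courant number c*dt/dx <= 1 for stability.
    """
    u = u0.copy()
    nu = c * dt / dx                           # Courant number
    for _ in range(steps):
        u = u - nu * (u - np.roll(u, 1))       # upwind difference, wraps at x=0
    return u

# A Gaussian pulse advected exactly one full period returns near its start,
# smeared by the scheme's numerical diffusion but with mass conserved.
n = 100
x = np.linspace(0.0, 1.0, n, endpoint=False)
u0 = np.exp(-200.0 * (x - 0.5) ** 2)
dx = 1.0 / n
u = advect_periodic(u0, c=1.0, dx=dx, dt=0.5 * dx, steps=2 * n)
```

    Because the update only ever references wrapped neighbours, no special treatment of the domain ends is needed; that is the whole appeal of cyclic boundaries for reduced-domain unsteady calculations.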

  7. Validation of the ORIGEN-S code for predicting radionuclide inventories in used CANDU Fuel

    International Nuclear Information System (INIS)

    Tait, J.C.; Gauld, I.; Kerr, A.H.

    1994-10-01

    The safety assessment being conducted by AECL Research for the concept of deep geological disposal of used CANDU UO2 fuel requires the calculation of radionuclide inventories in the fuel to provide source terms for radionuclide release. This report discusses the validation of selected actinide and fission-product inventories calculated using the ORIGEN-S code coupled with the WIMS-AECL lattice code, using data from analytical measurements of radioisotope inventories in Pickering CANDU reactor fuel. The recent processing of new ENDF/B-VI cross-section data has allowed the ORIGEN-S calculations to be performed using the most up-to-date nuclear data available. The results indicate that the code is reliably predicting actinide and the majority of fission-product inventories to within the analytical uncertainty. 38 refs., 4 figs., 5 tabs.
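
    Inventory codes of the ORIGEN family integrate coupled production/decay balance equations of the form dN/dt = A N. As a minimal sketch of what that entails (the two-member chain and its decay constants below are hypothetical, not taken from the report), the analytic Bateman solution can be checked against brute-force numerical integration:

```python
import numpy as np

def bateman_two_member(n1_0, lam1, lam2, t):
    """Analytic Bateman solution for the chain 1 -> 2 -> (stable),
    decay constants lam1 != lam2, starting from pure nuclide 1."""
    n1 = n1_0 * np.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (np.exp(-lam1 * t) - np.exp(-lam2 * t))
    return n1, n2

def integrate_chain(n0, lam1, lam2, t_end, steps=20000):
    """RK4 integration of dN/dt = A N for the same two-member chain."""
    A = np.array([[-lam1, 0.0],
                  [lam1, -lam2]])
    n = np.array(n0, dtype=float)
    dt = t_end / steps
    for _ in range(steps):
        k1 = A @ n
        k2 = A @ (n + 0.5 * dt * k1)
        k3 = A @ (n + 0.5 * dt * k2)
        k4 = A @ (n + dt * k3)
        n = n + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
    return n

# Hypothetical decay constants (per unit time) and a 10-time-unit interval.
n1, n2 = bateman_two_member(1.0, 0.1, 0.3, 10.0)
n_num = integrate_chain([1.0, 0.0], 0.1, 0.3, 10.0)
```

    Production codes solve the same structure for hundreds of coupled nuclides, with the matrix A built from the cross-section libraries whose quality the validation above is testing.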

  8. Validation of the ORIGEN-S code for predicting radionuclide inventories in used CANDU fuel

    International Nuclear Information System (INIS)

    Tait, J.C.; Gauld, I.; Kerr, A.H.

    1995-01-01

    The safety assessment being conducted by AECL Research for the concept of deep geological disposal of used CANDU UO2 fuel requires the calculation of radionuclide inventories in the fuel to provide source terms for radionuclide release. This report discusses the validation of selected actinide and fission-product inventories calculated using the ORIGEN-S code coupled with the WIMS-AECL lattice code, using data from analytical measurements of radioisotope inventories in Pickering CANDU reactor fuel. The recent processing of new ENDF/B-VI cross-section data has allowed the ORIGEN-S calculations to be performed using the most up-to-date nuclear data available. The results indicate that the code is reliably predicting actinide and the majority of fission-product inventories to within the analytical uncertainty. ((orig.))

  9. Validation of the ASSERT subchannel code for prediction of CHF in standard and non-standard CANDU bundle geometries

    International Nuclear Information System (INIS)

    Kiteley, J.C.; Carver, M.B.; Zhou, Q.N.

    1993-01-01

    The ASSERT code has been developed to address the three-dimensional computation of flow and phase distribution and fuel element surface temperatures within the horizontal subchannels of CANDU PHWR fuel channels, and to provide a detailed prediction of critical heat flux distribution throughout the bundle. The ASSERT subchannel code has been validated extensively against a wide repertoire of experiments; its combination of three-dimensional prediction of local flow conditions with a comprehensive method of predicting critical heat flux (CHF) at these local conditions makes it a unique tool for predicting CHF for situations outside the existing experimental data base. In particular, ASSERT is the only tool available to systematically investigate CHF under conditions of local geometric variations, such as pressure tube creep and fuel element strain. This paper discusses the numerical methodology used in ASSERT, the constitutive relationships incorporated, and the CHF assessment methodology. The evolutionary validation plan is discussed, and early validation exercises are summarized. The paper concentrates, however, on more recent validation exercises in standard and non-standard geometries. 28 refs., 12 figs

  10. Validation of the assert subchannel code: Prediction of CHF in standard and non-standard Candu bundle geometries

    International Nuclear Information System (INIS)

    Carver, M.B.; Kiteley, J.C.; Zhou, R.Q.N.; Junop, S.V.; Rowe, D.S.

    1993-01-01

    The ASSERT code has been developed to address the three-dimensional computation of flow and phase distribution and fuel element surface temperatures within the horizontal subchannels of CANDU PHWR fuel channels, and to provide a detailed prediction of critical heat flux (CHF) distribution throughout the bundle. The ASSERT subchannel code has been validated extensively against a wide repertoire of experiments; its combination of three-dimensional prediction of local flow conditions with a comprehensive method of predicting CHF at these local conditions makes it a unique tool for predicting CHF for situations outside the existing experimental data base. In particular, ASSERT is an appropriate tool to systematically investigate CHF under conditions of local geometric variations, such as pressure tube creep and fuel element strain. This paper discusses the numerical methodology used in ASSERT, the constitutive relationships incorporated, and the CHF assessment methodology. The evolutionary validation plan is discussed, and early validation exercises are summarized. The paper concentrates, however, on more recent validation exercises in standard and non-standard geometries

  11. Validation of Mean Drift Forces Computed with the BEM Code NEMOH

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg

    This report covers a simple investigation of mean drift forces found by use of the boundary element method code NEMOH. The results from NEMOH are compared to analytical results from the literature and to numerical values from the commercial software package WADAM by DNV-GL. The work was conducted under the project "Mooring Solutions for Large Wave Energy Converters", Work Package 4: "Full Dynamic Analysis". The validation compares results for a simple sphere and for a vertical cylinder.

  12. IAEA programme to support development and validation of advanced design and safety analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Choi, J., E-mail: J.H.Choi@iaea.org [International Atomic Energy Agency, Vienna (Austria)

    2013-07-01

    The International Atomic Energy Agency (IAEA) has organized many international collaboration programs to support the development and validation of design and safety analysis computer codes for nuclear power plants. These programs are normally implemented within the framework of a Coordinated Research Project (CRP) or an International Collaborative Standard Problem (ICSP). This paper introduces CRPs and ICSPs currently being organized or recently completed by the IAEA for this purpose. (author)

  13. National assessment of validity of coding of acute mastoiditis: a standardised reassessment of 1966 records.

    Science.gov (United States)

    Stalfors, J; Enoksson, F; Hermansson, A; Hultcrantz, M; Robinson, Å; Stenfeldt, K; Groth, A

    2013-04-01

    To investigate the internal validity of the diagnosis code used at discharge after treatment of acute mastoiditis. Retrospective national re-evaluation study of patient records from 1993-2007, with comparison against the original ICD codes. All ENT departments at university hospitals and one large county hospital department in Sweden. A total of 1966 records were reviewed for patients with ICD codes for in-patient treatment of acute (529), chronic (44) and unspecified mastoiditis (21) and acute otitis media (1372). ICD codes were reviewed by the authors against a defined protocol for the clinical diagnosis of acute mastoiditis; records not satisfying the diagnosis were given an alternative diagnosis. Of 529 records with ICD coding for acute mastoiditis, 397 (75%) were found to meet the definition of acute mastoiditis used in this study, while 18% were not diagnosed as having any type of mastoiditis after review. Review of the in-patients treated for acute otitis media identified an additional 60 cases fulfilling the definition of acute mastoiditis. Overdiagnosis was common, and many patients with a diagnostic code indicating acute mastoiditis had been treated for external otitis or otorrhoea with transmyringeal drainage. The internal validity of the diagnosis acute mastoiditis is dependent on the use of standardised, well-defined criteria. Reliability of diagnosis is fundamental for the comparison of results from different studies. Inadequate reliability in the diagnosis of acute mastoiditis also affects calculations of incidence rates and statistical power, and may affect the conclusions drawn from the results. © 2013 Blackwell Publishing Ltd.

  14. Validation of ASTEC v1.0 computer code against FPT2 test

    International Nuclear Information System (INIS)

    Mladenov, I.; Tusheva, P.; Kalchev, B.; Dimov, D.; Ivanov, I.

    2005-01-01

    The aim of this work is to investigate the sensitivity of the ASTEC v1.0 computer code to various nodalization schemes and to validate the code against the PHEBUS FPT2 experiment. This code is used for severe accident analysis. The aim corresponds to the main technical objective of the experiment, which is to contribute to the validation of models and computer codes to be used for the calculation of the source term in case of a severe accident in a Light Water Reactor. The scope of the FPT2 objectives is broad, covering separately the bundle, the experimental circuit and the containment. Additional objectives are to characterize aerosol sizing and deposition processes, as well as potential fission-product poisoning effects on hydrogen recombiner coupons exposed to containment atmospheric conditions representative of a LWR severe accident. The analyses of the results of the performed calculations show good agreement with the reference case calculations, and hence with the experimental data. Some differences in the calculated thermal behavior appear locally during the oxidation phase and the heat-up phase. There is very good agreement regarding the volatile and semi-volatile fission-product release from the fuel pellets. Important for the analysis of the process is the final axial distribution of the mass of relocated fuel obtained at the end of the calculation

  15. Validation of the AZTRAN 1.1 code with problems Benchmark of LWR reactors

    International Nuclear Information System (INIS)

    Vallejo Q, J. A.; Bastida O, G. E.; Francois L, J. L.; Xolocostli M, J. V.; Gomez T, A. M.

    2016-09-01

    The AZTRAN module is a computational program that is part of the AZTLAN platform (Mexican modeling platform for the analysis and design of nuclear reactors) and that solves the steady-state neutron transport equation in 3-D Cartesian geometry using the discrete ordinates (S_N) method. As part of the activities of Working Group 4 (users group) of the AZTLAN project, this work validates the AZTRAN code using the 2002 Yamamoto benchmark for LWR reactors. For comparison, the commercial code CASMO-4 and the free code Serpent-2 are used; in addition, the results are compared with the data obtained from an article of the PHYSOR 2002 conference. The benchmark consists of a fuel pin, two UO2 cells and two MOX cells, with one problem per cell for each reactor type, PWR and BWR. Although the AZTRAN code is at an early stage of development, the results obtained are encouraging and close to those reported with other internationally accepted codes and methodologies. (Author)
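
    The discrete ordinates method mentioned above can be sketched in its simplest setting. The code below is not AZTRAN (which is a 3-D Cartesian solver); it is a fixed-source 1D slab illustration, with assumed cross sections, showing the three ingredients the abstract names: an S_N angular quadrature, spatial transport sweeps (diamond difference), and source iteration on the scattering term.

```python
import numpy as np

def sn_slab(width=20.0, ncells=200, n_angles=8,
            sigma_t=1.0, sigma_s=0.5, source=1.0,
            tol=1e-8, max_iter=500):
    """1D slab S_N transport: isotropic scattering, diamond-difference
    spatial scheme, vacuum boundaries, solved by source iteration.
    Returns the cell-averaged scalar flux."""
    dx = width / ncells
    mu, w = np.polynomial.legendre.leggauss(n_angles)   # quadrature on [-1, 1]
    phi = np.zeros(ncells)
    for _ in range(max_iter):
        q = 0.5 * (sigma_s * phi + source)    # isotropic angular emission density
        phi_new = np.zeros(ncells)
        for m in range(n_angles):
            am = abs(mu[m]) / dx
            # sweep in the direction of particle travel
            cells = range(ncells) if mu[m] > 0 else range(ncells - 1, -1, -1)
            psi_in = 0.0                      # vacuum boundary: no incoming flux
            for i in cells:
                psi_out = (q[i] + (am - 0.5 * sigma_t) * psi_in) \
                          / (am + 0.5 * sigma_t)
                phi_new[i] += w[m] * 0.5 * (psi_in + psi_out)   # cell average
                psi_in = psi_out
        if np.max(np.abs(phi_new - phi)) < tol:
            return phi_new
        phi = phi_new
    return phi

phi = sn_slab()
```

    A quick sanity check: deep inside a thick slab the flux should approach the infinite-medium value S/(sigma_t - sigma_s) = 2 for the values above, while dipping near the vacuum boundaries.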

  16. Validity of the Italian Code of Ethics for everyday nursing practice.

    Science.gov (United States)

    Gobbi, Paola; Castoldi, Maria Grazia; Alagna, Rosa Anna; Brunoldi, Anna; Pari, Chiara; Gallo, Annamaria; Magri, Miriam; Marioni, Lorena; Muttillo, Giovanni; Passoni, Claudia; Torre, Anna La; Rosa, Debora; Carnevale, Franco A

    2016-12-07

    The research question for this study was as follows: Is the Code of Ethics for Nurses in Italy (Code) a valid or useful decision-making instrument for nurses faced with ethical problems in their daily clinical practice? Focus groups were conducted to analyze specific ethical problems through 11 case studies. The analysis was conducted using sections of the Code as well as other relevant documents. Each focus group had a specific theme and nurses participated freely in the discussions according to their respective clinical competencies. The executive administrative committee of the local nursing licensing council provided approval for conducting this project. Measures were taken to protect the confidentiality of consenting participants. The answer to the research question posed for this investigation was predominantly positive. Many sections of the Code were useful for discussion and identifying possible solutions for the ethical problems presented in the 11 cases. We concluded that the Code of Ethics for Nurses in Italy can be a valuable aid in daily practice in most clinical situations that can give rise to ethical problems. © The Author(s) 2016.

  17. Validity of the coding for herpes simplex encephalitis in the Danish National Patient Registry

    DEFF Research Database (Denmark)

    Jørgensen, Laura Krogh; Dalgaard, Lars Skov; Østergaard, Lars Jørgen

    2016-01-01

    BACKGROUND: Large health care databases are a valuable source of infectious disease epidemiology if diagnoses are valid. The aim of this study was to investigate the accuracy of the recorded diagnosis coding of herpes simplex encephalitis (HSE) in the Danish National Patient Registry (DNPR) … (7.3%) as probable cases, providing an overall PPV of 58.0% (95% confidence interval [CI]: 53.0-62.9). For "Encephalitis due to herpes simplex virus" (ICD-10 code B00.4), the PPV was 56.6% (95% CI: 51.1-62.0). Similarly, the PPV for "Meningoencephalitis due to herpes simplex virus" (ICD-10 code B00.4A) was 56.8% (95% CI: 39.5-72.9). "Herpes viral encephalitis" (ICD-10 code G05.1E) had a PPV of 75.9% (95% CI: 56.5-89.7), thereby representing the highest PPV. The estimated sensitivity was 95.5%. CONCLUSION: The PPVs of the ICD-10 diagnosis coding for adult HSE in the DNPR were relatively low. Hence …

  18. Validation of the REL2005 code package on Gd-poisoned PWR type assemblies through the CAMELEON experimental program

    International Nuclear Information System (INIS)

    Blaise, Patrick; Vidal, Jean-Francois; Santamarina, Alain

    2009-01-01

    This paper details the validation of Gd-poisoned 17x17 PWR lattices, through several configurations of the CAMELEON experimental program, using the newly qualified REL2005 French code package. After a general presentation of the CAMELEON program, which took place in the EOLE critical facility in Cadarache, we describe the new REL2005 code package, which relies on the deterministic transport code APOLLO2.8 based on the method of characteristics (MOC) and on its new CEA2005 library based on the latest JEFF-3.1.1 nuclear data evaluation. For critical masses, the average calculation-to-experiment (C/E) discrepancies on k_eff are (136 ± 80) pcm and (300 ± 76) pcm for the reference 281-group MOC and optimized 26-group MOC schemes, respectively. These values include a drastic improvement of about 250 pcm due to the change in the library from JEF2.2 to JEFF3.1. For pin-by-pin radial power distributions, reference and REL2005 results are very close, with maximum discrepancies of the order of 2%, i.e., within the experimental uncertainty limits. The optimized REL2005 code package predicts the reactivity worth of the Gd clusters (averaged over 9 experimental configurations) to C/E Δρ(Gd clusters) = +1.3% ± 2.3%. (author)
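
    The pcm figures quoted above are reactivity differences between calculated and experimental k_eff. The usual conversion, with reactivity defined as rho = (k - 1)/k, is delta_rho = (k_C - k_E)/(k_C * k_E) * 1e5 pcm; the numeric values in the sketch below are illustrative, not taken from the paper:

```python
def reactivity_diff_pcm(k_calc, k_exp):
    """Calculation-minus-experiment reactivity difference in pcm.

    rho = (k - 1)/k, so rho_C - rho_E = 1/k_E - 1/k_C
                                      = (k_C - k_E) / (k_C * k_E);
    multiplying by 1e5 expresses the difference in pcm.
    """
    return (k_calc - k_exp) / (k_calc * k_exp) * 1e5

# Illustrative: a calculated k_eff of 1.00136 against an experimental
# 1.00000 corresponds to a discrepancy of roughly +136 pcm.
diff = reactivity_diff_pcm(1.00136, 1.00000)
```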

  19. Validation of the Thermal-Hydraulic Model in the SACAP Code with the ISP Tests

    Energy Technology Data Exchange (ETDEWEB)

    Park, Soon-Ho; Kim, Dong-Min; Park, Chang-Hwan [FNC Technology Co., Yongin (Korea, Republic of)

    2016-10-15

    From the safety viewpoint, the containment pressure is an important parameter; the local hydrogen concentration is also a parameter of major concern because of its flammability and the risk of detonation. In Korea, there has been an extensive effort to develop a computer code that can analyze the severe accident behavior of the pressurized water reactor. The development has been done in a modularized manner, and the SACAP (Severe Accident Containment Analysis Package) code is now in the final stage of development. The SACAP code adopts a lumped-parameter (LP) model and is applicable to analyzing the overall behavior of the containment during severe accidents involving thermal-hydraulic transients, combustible gas burns, direct containment heating by high-pressure melt ejection, steam explosions and molten core-concrete interaction. Analyses of a number of ISP (International Standard Problem) experiments were done as part of the SACAP code V&V (verification and validation). In this paper, the SACAP analysis results for ISP-35 (NUPEC) and ISP-47 (TOSQAN) are presented, including comparison with other existing NPP simulation codes; these two problems were selected in order to confirm the computational performance of the SACAP code currently under development. The multi-node analysis for ISP-47 is in progress. As a result of the simulations, SACAP predicts the thermal-hydraulic variables such as temperature and pressure well, and we verify that the SACAP code is properly equipped to analyze gas distribution and condensation.

  20. Validation and comparison of two-phase flow modeling capabilities of CFD, sub channel and system codes by means of post-test calculations of BFBT transient tests

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Wadim; Manes, Jorge Perez; Imke, Uwe; Escalante, Javier Jimenez; Espinoza, Victor Sanchez, E-mail: victor.sanchez@kit.edu

    2013-10-15

    Highlights: • Simulation of BFBT turbine and pump transients at multiple scales. • CFD, sub-channel and system codes are used for the comparative study. • Heat transfer models are compared to identify differences between the code predictions. • All three scales predict results in good agreement with experiment. • Subcooled boiling models are identified as a field for future research. -- Abstract: The Institute for Neutron Physics and Reactor Technology (INR) at the Karlsruhe Institute of Technology (KIT) is involved in the validation and qualification of modern thermal-hydraulic simulation tools at various scales. In the present paper, the prediction capabilities of four codes from three different scales – NEPTUNE_CFD as a fine-mesh computational fluid dynamics code, SUBCHANFLOW and COBRA-TF as sub-channel codes, and TRACE as a system code – are assessed with respect to their two-phase flow modeling capabilities. The subject of the investigations is the well-known and widely used database provided within the NUPEC BFBT benchmark related to BWRs. Void fraction measurements simulating a turbine trip and a re-circulation pump trip are provided at several axial levels of the bundle. The prediction capabilities of the codes for transient conditions with various combinations of boundary conditions are validated by comparing the code predictions with the experimental data. In addition, the physical models of the different codes are described and compared to each other in order to explain the different results and to identify areas for further improvement.
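
    The benchmark's measured quantity is the void fraction. The two-fluid and sub-channel codes above model it with far more elaborate closure laws, but as a textbook point of reference, the homogeneous-equilibrium relation (extended with a slip ratio) converts flow quality and phase densities into a void fraction. The density values in the usage line are rough saturation values near BWR operating pressure (~7 MPa) and are assumptions for illustration only:

```python
def void_fraction_hem(x, rho_g, rho_l, slip=1.0):
    """Void fraction from flow quality x and phase densities.

    slip=1 is the homogeneous-equilibrium model (equal phase
    velocities); slip > 1 applies the classic slip-ratio correction,
    which lowers the predicted void fraction.
    """
    if x <= 0.0:
        return 0.0
    return 1.0 / (1.0 + (1.0 - x) / x * (rho_g / rho_l) * slip)

# Illustrative conditions near 7 MPa (approximate saturation densities):
# quality 10%, rho_g ~ 36.5 kg/m^3, rho_l ~ 740 kg/m^3.
alpha = void_fraction_hem(0.1, 36.5, 740.0)   # roughly 0.69
```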

  1. Verification and validation of the PLTEMP/ANL code for thermal hydraulic analysis of experimental and test reactors

    International Nuclear Information System (INIS)

    Kalimullah, M.; Olson, A.O.; Feldman, E.E.; Hanan, N.; Dionne, B.

    2012-01-01

    The document compiles in a single volume several verification and validation works done for the PLTEMP/ANL code during the years of its development and improvement. Some works that are available in the open literature are simply referenced at the outset, and are not included in the document. PLTEMP has been used in conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted. A list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. The model verification is usually done by comparing the code with hand calculation, Microsoft spreadsheet calculation, or Mathematica calculation. The model validation is done by comparing the code with experimental data or a more validated code like the RELAP5 code.

  2. Verification and Validation of the PLTEMP/ANL Code for Thermal-Hydraulic Analysis of Experimental and Test Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kalimullah, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Olson, Arne P. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. E. [Argonne National Lab. (ANL), Argonne, IL (United States); Hanan, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-04-07

    The document compiles in a single volume several verification and validation works done for the PLTEMP/ANL code during the years of its development and improvement. Some works that are available in the open literature are simply referenced at the outset, and are not included in the document. PLTEMP has been used in conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted. A list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. The model verification is usually done by comparing the code with hand calculation, Microsoft spreadsheet calculation, or Mathematica calculation. The model validation is done by comparing the code with experimental data or a more validated code like the RELAP5 code.

  3. Preliminary validation of RELAP5/Mod4.0 code for LBE cooled NACIE facility

    Energy Technology Data Exchange (ETDEWEB)

    Kumari, Indu; Khanna, Ashok, E-mail: akhanna@iitk.ac.in

    2017-04-01

    Highlights: • Detailed discussion of the thermophysical properties of lead-bismuth eutectic incorporated in the RELAP5/Mod4.0 code. • Benchmarking of LBE properties in RELAP5/Mod4.0 against the literature. • NACIE facility at three different power levels (10.8, 21.7 and 32.5 kW) under natural circulation considered for benchmarking. • Preliminary validation of the LBE properties against experimental data. • NACIE facility at a power level of 22.5 kW considered for validation. - Abstract: The one-dimensional thermal-hydraulic computer code RELAP5 was developed for thermal-hydraulic studies of light water reactors as well as nuclear research reactors. The purpose of this work is to evaluate the code RELAP5/Mod4.0 for analysis of research reactors. This paper consists of three major sections. The first section presents detailed discussions of the thermophysical properties of lead-bismuth eutectic (LBE) incorporated in the RELAP5/Mod4.0 code. In the second section, RELAP5/Mod4.0 is benchmarked against the Natural Circulation Experimental (NACIE) facility, in comparison with Barone's simulations using RELAP5/Mod3.3. Three different power levels (10.8 kW, 21.7 kW and 32.5 kW) under natural circulation conditions are considered. Results obtained for LBE temperatures, the temperature difference across the heat section, pin surface temperatures, mass flow rates and heat transfer coefficients in the heat section and heat exchanger agree with Barone's simulation results within 7% average relative error. The third section presents validation of RELAP5/Mod4.0 against the experimental data of the NACIE facility obtained by Tarantino et al. (test number 21, at a power of 22.5 kW), comparing the profiles of temperatures, mass flow rate and velocity of LBE. Simulation and experimental results agree within 7% average relative error.
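
    The LBE thermophysical properties discussed above are typically implemented as temperature correlations. The coefficients below are of the form commonly quoted from the OECD/NEA heavy-liquid-metal handbook, reproduced here from memory as an assumption: they should be checked against the handbook edition actually used (and against whatever is coded in RELAP5/Mod4.0) before being relied on.

```python
import math

def lbe_density(T):
    """Lead-bismuth eutectic density [kg/m^3], T in kelvin.

    Linear correlation of the commonly cited form; coefficients are
    assumptions to be verified against the source handbook.
    Valid roughly from the melting point (~398 K) to ~1300 K.
    """
    return 11096.0 - 1.3236 * T

def lbe_viscosity(T):
    """LBE dynamic viscosity [Pa*s], T in kelvin (Arrhenius-type form;
    coefficients are assumptions to be verified)."""
    return 4.94e-4 * math.exp(754.1 / T)

# At a typical loop temperature of 673 K (400 degC):
rho = lbe_density(673.0)     # on the order of 1.0e4 kg/m^3
mu = lbe_viscosity(673.0)    # on the order of 1e-3 Pa*s
```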

  4. Context based Coding of Quantized Alpha Planes for Video Objects

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2002-01-01

    In object-based video, each frame is a composition of objects that are coded separately. The composition is performed through the alpha plane, which represents the transparency of the object. We present an alternative to MPEG-4 for coding of alpha planes that considers their specific properties. … Comparisons in terms of rate and distortion are provided, showing that the proposed coding scheme for still alpha planes is better than the algorithms for I-frames used in MPEG-4.

  5. Development of libraries for ORIGEN2 code based on JENDL-3.2

    Energy Technology Data Exchange (ETDEWEB)

    Suyama, Kenya; Katakura, Jun-ichi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ishikawa, Makoto; Ohkawachi, Yasushi

    1998-03-01

    The JNDC working group 'Nuclide Generation Evaluation' has launched a project to produce libraries for the ORIGEN2 code based on the latest nuclear data library, JENDL-3.2, for current LWR and FBR fuel designs. Many of these libraries are under validation. (author)

  6. First validation of the new continuous energy version of the MORET5 Monte Carlo code

    International Nuclear Information System (INIS)

    Miss, Joachim; Bernard, Franck; Forestier, Benoit; Haeck, Wim; Richet, Yann; Jacquet, Olivier

    2008-01-01

    The 5.A.1 version is the next release of the MORET Monte Carlo code dedicated to criticality and reactor calculations. This new version combines all the capabilities already available in the multigroup version with many new and enhanced features. The main capabilities of the previous version are the powerful association of a deterministic and a Monte Carlo approach (as in APOLLO-MORET), the modular geometry, five source sampling techniques and two simulation strategies. The major advance in MORET5 is the ability to perform either multigroup or continuous-energy simulations. Thanks to these new developments, we now have better control over the whole process of criticality calculations, from reading the basic nuclear data to the Monte Carlo simulation itself. Moreover, this new capability enables us to better validate the deterministic/Monte Carlo multigroup calculations by performing continuous-energy calculations with the same code, using the same geometry and tracking algorithms. The aim of this paper is to describe the main options available in this new release, and to present the first results. Comparisons of the MORET5 continuous-energy results with experimental measurements and with another continuous-energy Monte Carlo code are provided in terms of validation and time performance. Finally, an analysis of the benefit of using a unified energy grid for continuous-energy Monte Carlo calculations is presented. (authors)

  7. Development and validation of corium oxidation model for the VAPEX code

    International Nuclear Information System (INIS)

    Blinkov, V.N.; Melikhov, V.I.; Davydov, M.V.; Melikhov, O.I.; Borovkova, E.M.

    2011-01-01

    In light water reactor core melt accidents, the molten fuel (corium) can be brought into contact with coolant water in the course of melt relocation in-vessel and ex-vessel, as well as during an accident mitigation action of water addition. Mechanical energy release from such an interaction is of interest in evaluating the structural integrity of the reactor vessel as well as of the containment. Usually, the source of the energy release is considered to be the rapid transfer of heat from the molten fuel to the water ('vapor explosion'). When the fuel contains a chemically reactive metal component, there can be an additional source of energy release: the heat release and hydrogen production due to the metal-water chemical reaction. At the Electrogorsk Research and Engineering Center, the computer code VAPEX (VAPor EXplosion) has been developed for analysis of molten fuel-coolant interaction. A multifield approach is used for modeling the dynamics of the following phases: water, steam, melt jet, melt droplets and debris. The VAPEX code was successfully validated against FARO experimental data. Hydrogen generation was observed in the FARO tests even though the corium did not contain a metal component. Since the reason for this hydrogen generation was not clear, a simplified empirical model of hydrogen generation was implemented in the VAPEX code to account for the contribution of hydrogen to the pressure increase. This paper describes a new, more detailed model of hydrogen generation due to the metal-water chemical reaction and the results of its validation against the ZREX experiments. (orig.)
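
    For the case where the melt does contain reactive metal, the classic example is zirconium-steam oxidation, Zr + 2 H2O -> ZrO2 + 2 H2 (the ZREX experiments involved zirconium-bearing melts). The hydrogen yield follows directly from stoichiometry; the sketch below is a back-of-the-envelope illustration, not the VAPEX model:

```python
M_ZR = 91.224   # molar mass of zirconium [g/mol]
M_H2 = 2.016    # molar mass of molecular hydrogen [g/mol]

def hydrogen_from_zr(mass_zr_kg):
    """Hydrogen mass [kg] produced by complete oxidation of a given
    mass of zirconium via Zr + 2 H2O -> ZrO2 + 2 H2: two moles of H2
    per mole of Zr."""
    return mass_zr_kg * 2.0 * M_H2 / M_ZR

# Complete oxidation of 1 kg of Zr yields about 44 g of hydrogen.
h2_per_kg = hydrogen_from_zr(1.0)
```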

  8. Validation of ICD-9-CM coding algorithm for improved identification of hypoglycemia visits

    Directory of Open Access Journals (Sweden)

    Lieberman Rebecca M

    2008-04-01

    Background: Accurate identification of hypoglycemia cases by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes will help to describe epidemiology, monitor trends, and propose interventions for this important complication in patients with diabetes. Prior hypoglycemia studies utilized incomplete search strategies and may be methodologically flawed. We sought to validate a new ICD-9-CM coding algorithm for accurate identification of hypoglycemia visits. Methods: This was a multicenter, retrospective cohort study using a structured medical record review at three academic emergency departments from July 1, 2005 to June 30, 2006. We prospectively derived a coding algorithm to identify hypoglycemia visits using ICD-9-CM codes (250.3, 250.8, 251.0, 251.1, 251.2, 270.3, 775.0, 775.6, and 962.3). We confirmed hypoglycemia cases identified by the candidate ICD-9-CM codes during the study period by chart review. The case definition for hypoglycemia was a documented blood glucose below 3.9 mmol/l or an emergency physician charted diagnosis of hypoglycemia. We evaluated individual components and calculated the positive predictive value. Results: We reviewed 636 charts identified by the candidate ICD-9-CM codes and confirmed 436 (64%) cases of hypoglycemia by chart review. Diabetes with other specified manifestations (250.8), often excluded in prior hypoglycemia analyses, identified 83% of hypoglycemia visits, and unspecified hypoglycemia (251.2) identified 13% of hypoglycemia visits. The absence of any predetermined co-diagnosis codes improved the positive predictive value of code 250.8 from 62% to 92%, while excluding only 10 (2%) true hypoglycemia visits. Although prior analyses included only the first-listed ICD-9 code, more than one-quarter of identified hypoglycemia visits were outside this primary diagnosis field. Overall, the proposed algorithm had 89% positive predictive value (95% confidence interval, 86-92) for …
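
    PPV estimates with 95% confidence intervals, like those quoted in this and the preceding validation abstracts, can be computed with a standard interval for a binomial proportion. The study does not state which interval method it used; the Wilson score interval below is one common choice, and the counts in the usage example are hypothetical:

```python
import math

def ppv_wilson(tp, n, z=1.96):
    """Positive predictive value tp/n with a Wilson score confidence
    interval (z=1.96 gives ~95% coverage). Returns (ppv, lower, upper)."""
    p = tp / n
    denom = 1.0 + z ** 2 / n
    centre = (p + z ** 2 / (2.0 * n)) / denom
    half = z * math.sqrt(p * (1.0 - p) / n + z ** 2 / (4.0 * n ** 2)) / denom
    return p, centre - half, centre + half

# Hypothetical chart-review counts: 120 confirmed cases among
# 150 code-identified visits.
ppv, lo, hi = ppv_wilson(120, 150)
```

    Unlike the naive normal approximation, the Wilson interval stays inside [0, 1] and behaves sensibly for the small subgroup counts that appear when individual codes are evaluated separately.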

  9. Verification and Validation of the k-kL Turbulence Model in FUN3D and CFL3D Codes

    Science.gov (United States)

    Abdol-Hamid, Khaled S.; Carlson, Jan-Renee; Rumsey, Christopher L.

    2015-01-01

    The implementation of the k-kL turbulence model in multiple computational fluid dynamics (CFD) codes is reported herein. The k-kL model is a two-equation turbulence model based on Abdol-Hamid's closure and Menter's modification to Rotta's two-equation model. Rotta shows that a reliable transport equation can be formed from the turbulent length scale L and the turbulent kinetic energy k. Rotta's equation is well suited for term-by-term modeling and displays useful features compared to other two-equation models. An important difference is that this formulation leads to the inclusion of higher-order velocity derivatives in the source terms of the scale equations. This can enhance the ability of the Reynolds-averaged Navier-Stokes (RANS) solvers to simulate unsteady flows. The present report documents the formulation of the model as implemented in the CFD codes FUN3D and CFL3D. Methodology, verification and validation examples are shown. Attached and separated flow cases are documented and compared with experimental data. The results show generally very good comparisons with canonical and experimental data, as well as matching results code-to-code. The results from this formulation are similar or better than results using the SST turbulence model.

  10. State of art in FE-based fuel performance codes

    International Nuclear Information System (INIS)

    Kim, Hyo Chan; Yang, Yong Sik; Kim, Dae Ho; Bang, Je Geon; Kim, Sun Ki; Koo, Yang Hyun

    2013-01-01

    The finite element (FE) method, a reliable and proven solution technique in the mechanical field, has been introduced into fuel performance codes for multidimensional analysis. The present state of the art in FE-based fuel performance simulation predominantly involves 2-D axisymmetric models and 3-D volumetric models. The FRAPCON and FRAPTRAN codes include 1.5-D and 2-D FE models to simulate PCMI and cladding ballooning. In 2-D simulation, the FALCON code, developed by EPRI, is a 2-D (R-Z and R-θ) fully thermal-mechanically coupled steady-state and transient FE-based fuel behavior code. The French codes TOUTATIS and ALCYONE are 3-D, and are typically used to investigate localized behavior. Since 2008, the Idaho National Laboratory (INL) has been developing a multidimensional (2-D and 3-D) nuclear fuel performance code called BISON. In this paper, the current state of FE-based fuel performance codes and their models is presented. Based on this investigation into the codes, the requirements and direction of development for a new FE-based fuel performance code can be discussed, and, based on a comparison of models in the FE-based fuel performance codes, the state of the art in the codes can be assessed. A new FE-based fuel performance code should include the typical pellet and cladding models that all codes share. In particular, specialized pellet and cladding models such as gaseous swelling and high burnup structure (HBS) models should be developed to improve the accuracy of the code as well as to consider AC conditions. To reduce computation cost, an approximated gap model and an optimized contact model should also be developed. Nuclear fuel operates in an extreme environment that induces complex multiphysics phenomena, occurring over distances ranging from inter-atomic spacing to meters, and time scales ranging from microseconds to years. This multiphysics behavior is often tightly coupled, a well known example being the thermomechanical behavior. Adding to this complexity, important aspects of fuel behavior are inherently

  11. Validity evidence based on test content.

    Science.gov (United States)

    Sireci, Stephen; Faulkner-Bond, Molly

    2014-01-01

    Validity evidence based on test content is one of the five forms of validity evidence stipulated in the Standards for Educational and Psychological Testing developed by the American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. In this paper, we describe the logic and theory underlying such evidence and describe traditional and modern methods for gathering and analyzing content validity data. A comprehensive review of the literature and of the aforementioned Standards is presented. For educational tests and other assessments targeting knowledge and skill possessed by examinees, validity evidence based on test content is necessary for building a validity argument to support the use of a test for a particular purpose. By following the methods described in this article, practitioners have a wide arsenal of tools available for determining how well the content of an assessment is congruent with and appropriate for the specific testing purposes.

  12. Development and validation of a new solver based on the interfacial area transport equation for the numerical simulation of sub-cooled boiling with OpenFOAM CFD code for nuclear safety applications

    Energy Technology Data Exchange (ETDEWEB)

    Alali, Abdullah

    2014-02-21

    The one-group interfacial area transport equation has been coupled to a wall heat flux partitioning model in the framework of two-phase Eulerian approach using the OpenFOAM CFD code for better prediction of subcooled boiling phenomena which is essential for safety analysis of nuclear reactors. The interfacial area transport equation has been modified to include the effect of bubble nucleation at the wall and condensation by subcooled liquid in the bulk that governs the non-uniform bubble size distribution.

  13. Development and validation of a new solver based on the interfacial area transport equation for the numerical simulation of sub-cooled boiling with OpenFOAM CFD code for nuclear safety applications

    International Nuclear Information System (INIS)

    Alali, Abdullah

    2014-01-01

    The one-group interfacial area transport equation has been coupled to a wall heat flux partitioning model in the framework of two-phase Eulerian approach using the OpenFOAM CFD code for better prediction of subcooled boiling phenomena which is essential for safety analysis of nuclear reactors. The interfacial area transport equation has been modified to include the effect of bubble nucleation at the wall and condensation by subcooled liquid in the bulk that governs the non-uniform bubble size distribution.

  14. Development, validation and application of NAFA 2D-CFD code

    International Nuclear Information System (INIS)

    Vaidya, A.M.; Maheshwari, N.K.; Vijayan, P.K.; Saha, D.

    2010-01-01

    A 2D axisymmetric code named NAFA (Version 1.0) has been developed for studying pipe flow under various conditions. It can handle laminar/turbulent flows, with or without heat transfer, under sub-critical/super-critical conditions. The code solves the momentum and energy equations with the standard k-ε turbulence model (with standard wall functions). It solves pipe flow subjected to 'velocity inlet', 'wall', 'axis' and 'pressure outlet' boundary conditions. It has been validated for several cases by comparing its results with experimental data/analytical solutions/correlations. The code has excellent convergence characteristics, as verified from the fall of the equation residuals in each case, and a proven capability of generating mesh-independent results for laminar as well as turbulent flows. The code has been applied to supercritical flows, for which the effect of mesh size on the prediction of the heat transfer coefficient was studied. With grid refinement, Y+ reduces and reaches the limiting value of 11.63; hence the accuracy is found to increase with grid refinement. NAFA is able to qualitatively predict the effect of heat flux and operating pressure on the heat transfer coefficient. The heat transfer coefficient matches well with experimental values under various conditions. (author)
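The Y+ behavior discussed above follows from the standard definition Y+ = u_τ·y/ν; a minimal sketch (variable names are illustrative, not NAFA's):

```python
def y_plus(u_tau, y, nu):
    """Dimensionless wall distance: friction velocity * first-cell distance / kinematic viscosity."""
    return u_tau * y / nu

def first_cell_height(u_tau, nu, y_plus_target=11.63):
    """Wall distance that yields a given Y+; 11.63 is the classical intersection
    of the viscous-sublayer and log-law profiles used by standard wall functions."""
    return y_plus_target * nu / u_tau
```

Refining the grid until the first cell sits at this limiting Y+ is what drives the accuracy improvement the abstract reports.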

  15. Independent validation testing of the FLAME computer code, Version 1.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-07-01

    Independent testing of the FLAME computer code, Version 1.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Validation tests (i.e., tests that compare field data to the computer-generated solutions) were used to determine the operational status of the FLAME computer code and were done on a qualitative basis through graphical comparisons of the experimental and numerical data. These tests were specifically designed to check: (1) correctness of the FORTRAN coding, (2) computational accuracy, and (3) suitability to simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: (1) independent applications, and (2) graduated difficulty of test cases. Three tests ranging in complexity from simple one-dimensional steady-state flow field problems under near-saturated conditions to two-dimensional transient flow problems with very dry initial conditions

  16. Thyc, a 3D thermal-hydraulic code for rod bundles. Recent developments and validation tests

    International Nuclear Information System (INIS)

    Caremoli, C.; Rascle, P.; Aubry, S.; Olive, J.

    1993-09-01

    PWR or LMFBR cores or fuel assemblies, PWR steam generators, condensers and tubular heat exchangers are basic components of a nuclear power plant involving two-phase flows in tube or rod bundles. A deep knowledge of the detailed flow patterns on the shell side is necessary to evaluate DNB margins in reactor cores, singularity effects (grids, wire spacers, support plates, baffles), corrosion on steam generator tube sheets, bypass effects and vibration risks. For that purpose, Electricite de France has developed, since 1986, a general purpose code named THYC (Thermal HYdraulic Code) designed to study three-dimensional single- and two-phase flows in rod or tube bundles (pressurized water reactor cores, steam generators, condensers, heat exchangers). It considers the three-dimensional domain to contain two kinds of components: fluid and solids. The THYC model is obtained by space-time averaging of the instantaneous equations (mass, momentum and energy) of each phase over control volumes including fluid and solids. This paper briefly presents the physical model and the numerical method used in THYC. Then, validation tests (comparison with experiments) and applications (coupling with a three-dimensional neutronics code and DNB predictions) are presented. They emphasize the latest developments and new capabilities of the code. (authors). 10 figs., 3 tabs., 21 refs

  17. Experimental validation for combustion analysis of GOTHIC code in 2-dimensional combustion chamber

    International Nuclear Information System (INIS)

    Lee, J. W.; Yang, S. Y.; Park, K. C.; Jung, S. H.

    2002-01-01

    In this study, the prediction capability of the GOTHIC code for hydrogen combustion phenomena was validated against the results of a two-dimensional premixed hydrogen combustion experiment performed by Seoul National University. The experimental chamber has about 24 liters of free volume (1 × 0.024 × 1 m³) and a 2-dimensional rectangular shape. The tests were performed with a 10% hydrogen/air gas mixture and conducted with combinations of two igniter positions (top center, top corner) and two boundary conditions (bottom fully open, bottom right half open). Using the lumped parameter and mechanistic combustion models in the GOTHIC code, the SNU experiments were simulated under the same conditions. The GOTHIC code predictions of the hydrogen combustion phenomena did not compare well with the experimental results. In the case of the lumped parameter simulation, the combustion time was predicted appropriately, but no other local information related to the combustion phenomena could be obtained. In the case of the mechanistic combustion analysis, the predicted physical combustion phenomena of the gas mixture did not match the experimental ones. In the open-boundary cases, GOTHIC predicted a very long combustion time and could not simulate the flame front propagation appropriately. Though GOTHIC showed the flame propagation phenomenon in an adiabatic calculation, the induction time of combustion was still very long compared with the experimental results. It was also found that the combustion model of the GOTHIC code has some weak points in simulating combustion at low hydrogen concentrations.

  18. Validation of TEMP: A finite line heat transfer code for geologic repositories for nuclear waste

    International Nuclear Information System (INIS)

    Atterbury, W.G.; Hetteburg, J.R.; Wurm, K.J.

    1987-09-01

    TEMP is a FORTRAN computer code for calculating temperatures in a geologic repository for nuclear waste. A previous report discusses the structure, usage, verification, and benchmarking of TEMP V1.0 (Wurm et al., 1987). This report discusses modifications to the program in the development of TEMP V1.1 and documents the validation of TEMP. The development of TEMP V1.1 from TEMP V1.0 consisted of two major efforts. The first was to recode several of the subroutines to improve logic flow and to allow for geometry-independent temperature calculation routines which, in turn, allowed for the addition of the geometry-independent validation option. The validation option provides TEMP with the ability to model any geometry of temperature sources with any step-wise heat release rate. This capability allows TEMP to model the geometry and heat release characteristics of the validation problems. The validation of TEMP V1.1 consists of the comparison of TEMP to three in-ground heater tests. The three tests chosen were Avery Island, Louisiana, Site A; Avery Island, Louisiana, Site C; and Asse Mine, Federal Republic of Germany, Site 2. TEMP shows marginal comparison with the two Avery Island sites and good comparison with the Asse Mine Site. 8 refs., 25 figs., 14 tabs
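A finite-line heat source of the kind TEMP models can be illustrated by superposing continuous point sources along the heater axis, each following the classical conduction solution ΔT = Q/(4πkr)·erfc(r/(2√(αt))). This is a simplified sketch of the general idea, not TEMP's actual implementation; all names are illustrative:

```python
import math

def point_source_rise(Q, k, alpha, r, t):
    """Temperature rise at distance r from a continuous point source of
    strength Q (W) in an infinite medium (conductivity k, diffusivity alpha)."""
    return Q / (4.0 * math.pi * k * r) * math.erfc(r / (2.0 * math.sqrt(alpha * t)))

def finite_line_rise(q_lin, k, alpha, x, y, z, z0, z1, t, n=200):
    """Superpose n point sources (midpoint rule) along a line from z0 to z1
    carrying q_lin watts per metre."""
    dz = (z1 - z0) / n
    total = 0.0
    for i in range(n):
        zi = z0 + (i + 0.5) * dz
        r = math.hypot(math.hypot(x, y), z - zi)
        total += point_source_rise(q_lin * dz, k, alpha, r, t)
    return total
```

At long times the point-source term approaches the steady-state value Q/(4πkr), a convenient sanity check for such a routine.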

  19. Methodology for testing and validating knowledge bases

    Science.gov (United States)

    Krishnamurthy, C.; Padalkar, S.; Sztipanovits, J.; Purves, B. R.

    1987-01-01

    A test and validation toolset developed for artificial intelligence programs is described. The basic premises of this method are: (1) knowledge bases have a strongly declarative character and represent mostly structural information about different domains, (2) the conditions for integrity, consistency, and correctness can be transformed into structural properties of knowledge bases, and (3) structural information and structural properties can be uniformly represented by graphs and checked by graph algorithms. The interactive test and validation environment has been implemented on a SUN workstation.

  20. Central Decoding for Multiple Description Codes based on Domain Partitioning

    Directory of Open Access Journals (Sweden)

    M. Spiertz

    2006-01-01

    Full Text Available Multiple Description Codes (MDC) can be used to trade redundancy against packet loss resistance for transmitting data over lossy diversity networks. In this work we focus on MD transform coding based on domain partitioning. Compared to Vaishampayan's quantizer-based MDC, domain-based MD coding is a simple approach for generating different descriptions, by using different quantizers for each description. Commonly, only the highest-rate quantizer is used for reconstruction. In this paper we investigate the benefit of using the lower-rate quantizers to enhance the reconstruction quality at the decoder side. The comparison is done on artificial source data and on image data.
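The central-decoder idea, using the lower-rate description to shrink the reconstruction cell of the higher-rate one, can be sketched with two uniform quantizers (step sizes and names are illustrative, not the paper's):

```python
def encode(x, step):
    """Uniform quantizer index for one description."""
    return round(x / step)

def central_decode(i_fine, step_fine, i_coarse, step_coarse):
    """Intersect the two quantization cells and return the midpoint."""
    lo = max((i_fine - 0.5) * step_fine, (i_coarse - 0.5) * step_coarse)
    hi = min((i_fine + 0.5) * step_fine, (i_coarse + 0.5) * step_coarse)
    return (lo + hi) / 2.0

x = 1.3
i_f, i_c = encode(x, 0.5), encode(x, 1.0)        # two descriptions
side = i_f * 0.5                                 # highest-rate-only reconstruction
central = central_decode(i_f, 0.5, i_c, 1.0)     # refined central reconstruction
```

Here the coarse cell clips the fine cell, so the central reconstruction error is smaller than the highest-rate-only error.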

  1. Results of a survey on accident and safety analysis codes, benchmarks, verification and validation methods

    International Nuclear Information System (INIS)

    Lee, A.G.; Wilkin, G.B.

    1996-03-01

    During the 'Workshop on R and D needs' at the 3rd Meeting of the International Group on Research Reactors (IGORR-III), the participants agreed that it would be useful to compile a survey of the computer codes and nuclear data libraries used in accident and safety analyses for research reactors and the methods various organizations use to verify and validate their codes and libraries. Five organizations, Atomic Energy of Canada Limited (AECL, Canada), China Institute of Atomic Energy (CIAE, People's Republic of China), Japan Atomic Energy Research Institute (JAERI, Japan), Oak Ridge National Laboratories (ORNL, USA), and Siemens (Germany) responded to the survey. The results of the survey are compiled in this report. (author) 36 refs., 3 tabs

  2. Off-take Model of the SPACE Code and Its Validation

    International Nuclear Information System (INIS)

    Oh, Myung Taek; Park, Chan Eok; Sohn, Jong Joo

    2011-01-01

    Liquid entrainment and vapor pull-through models of horizontal pipe have been implemented in the SPACE code. The model of SPACE accounts for the phase separation phenomena and computes the flux of mass and energy through an off-take attached to a horizontal pipe when stratified conditions occur in the horizontal pipe. This model is referred to as the off-take model. The importance of predicting the fluid conditions through an off-take in a small-break LOCA has been well known. In this case, the occurrence of the stratification can affect the break node void fraction and thus the break flow discharged from the primary system. In order to validate the off-take model newly developed for the SPACE code, a simulation of the HDU experiments has been performed. The main feature of the off-take model and its application results will be presented in this paper

  3. Using clinician text notes in electronic medical record data to validate transgender-related diagnosis codes.

    Science.gov (United States)

    Blosnich, John R; Cashy, John; Gordon, Adam J; Shipherd, Jillian C; Kauth, Michael R; Brown, George R; Fine, Michael J

    2018-04-04

    Transgender individuals are vulnerable to negative health risks and outcomes, but research remains limited because data sources, such as electronic medical records (EMRs), lack standardized collection of gender identity information. Most EMRs do not include the gold standard of self-identified gender identity, but the International Classification of Diseases (ICD) includes diagnostic codes indicating transgender-related clinical services. However, it is unclear if these codes can indicate transgender status. The objective of this study was to determine the extent to which patients' clinician notes in the EMR contained transgender-related terms that could corroborate ICD-coded transgender identity. Data are from the US Department of Veterans Affairs Corporate Data Warehouse. Transgender patients were defined by the presence of ICD-9 and ICD-10 codes associated with transgender-related clinical services, and a 3:1 comparison group of nontransgender patients was drawn. Patients' clinician text notes were extracted and searched for transgender-related words and phrases. Among 7560 patients defined as transgender based on ICD codes, the search algorithm identified 6753 (89.3%) with transgender-related terms. Among 22 072 patients defined as nontransgender without ICD codes, 246 (1.1%) had transgender-related terms; after review, 11 patients were identified as transgender, suggesting a 0.05% false negative rate. Using ICD-defined transgender status can facilitate health services research when self-identified gender identity data are not available in EMRs.
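The note-search step can be sketched as a simple pattern scan over clinician text; the term list below is purely illustrative, not the study's actual lexicon:

```python
import re

# Hypothetical term list for illustration; the study's real search terms are not given here.
TERMS = [r"\btransgender\b", r"\btranssexual\b", r"\bgender\s+dysphoria\b"]

def note_matches(note_text, patterns=TERMS):
    """Return True if any term appears in the (case-folded) note text."""
    text = note_text.lower()
    return any(re.search(p, text) for p in patterns)
```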

  4. Validity of administrative database code algorithms to identify vascular access placement, surgical revisions, and secondary patency.

    Science.gov (United States)

    Al-Jaishi, Ahmed A; Moist, Louise M; Oliver, Matthew J; Nash, Danielle M; Fleet, Jamie L; Garg, Amit X; Lok, Charmaine E

    2018-03-01

    We assessed the validity of physician billing codes and hospital admission using International Classification of Diseases 10th revision codes to identify vascular access placement, secondary patency, and surgical revisions in administrative data. We included adults (≥18 years) with a vascular access placed between 1 April 2004 and 31 March 2013 at the University Health Network, Toronto. Our reference standard was a prospective vascular access database (VASPRO) that contains information on vascular access type and dates of placement, dates for failure, and any revisions. We used VASPRO to assess the validity of different administrative coding algorithms by calculating the sensitivity, specificity, and positive predictive values of vascular access events. The sensitivity (95% confidence interval) of the best performing algorithm to identify arteriovenous access placement was 86% (83%, 89%) and specificity was 92% (89%, 93%). The corresponding numbers to identify catheter insertion were 84% (82%, 86%) and 84% (80%, 87%), respectively. The sensitivity of the best performing coding algorithm to identify arteriovenous access surgical revisions was 81% (67%, 90%) and specificity was 89% (87%, 90%). The algorithm capturing arteriovenous access placement and catheter insertion had a positive predictive value greater than 90% and arteriovenous access surgical revisions had a positive predictive value of 20%. The duration of arteriovenous access secondary patency was on average 578 (553, 603) days in VASPRO and 555 (530, 580) days in administrative databases. Administrative data algorithms have fair to good operating characteristics to identify vascular access placement and arteriovenous access secondary patency. Low positive predictive values for surgical revisions algorithm suggest that administrative data should only be used to rule out the occurrence of an event.

  5. Validation of full core geometry model of the NODAL3 code in the PWR transient Benchmark problems

    International Nuclear Information System (INIS)

    T-M Sembiring; S-Pinem; P-H Liem

    2015-01-01

    The coupled neutronic and thermal-hydraulic (T/H) code NODAL3 has been validated against several PWR static benchmarks and the NEACRP PWR transient benchmark cases. However, the NODAL3 code had not yet been validated against the transient benchmark cases of a control rod assembly (CR) ejection at the core periphery using a full core geometry model, the C1 and C2 cases. Through this research work, the accuracy of the NODAL3 code for one CR ejection, or for the ejection of an unsymmetrical group of CRs, can be validated. The calculations with the NODAL3 code have been carried out by the adiabatic method (AM) and the improved quasistatic method (IQS). All transient parameters calculated by the NODAL3 code were compared with the reference results of the PANTHER code. The maximum relative difference of 16% occurs in the calculated time of maximum power when using the IQS method, while the relative difference of the AM method is 4% for the C2 case. All calculation results of the NODAL3 code show no systematic difference, which means the neutronic and T/H modules adopted in the code can be considered correct. Therefore, all calculation results obtained with the NODAL3 code are in very good agreement with the reference results. (author)

  6. Validation and optimisation of an ICD-10-coded case definition for sepsis using administrative health data

    Science.gov (United States)

    Jolley, Rachel J; Jetté, Nathalie; Sawka, Keri Jo; Diep, Lucy; Goliath, Jade; Roberts, Derek J; Yipp, Bryan G; Doig, Christopher J

    2015-01-01

    Objective Administrative health data are important for health services and outcomes research. We optimised and validated in intensive care unit (ICU) patients an International Classification of Disease (ICD)-coded case definition for sepsis, and compared this with an existing definition. We also assessed the definition's performance in non-ICU (ward) patients. Setting and participants All adults (aged ≥18 years) admitted to a multisystem ICU with general medicosurgical ICU care from one of three tertiary care centres in the Calgary region in Alberta, Canada, between 1 January 2009 and 31 December 2012 were included. Research design Patient medical records were randomly selected and linked to the discharge abstract database. In ICU patients, we validated the Canadian Institute for Health Information (CIHI) ICD-10-CA (Canadian Revision)-coded definition for sepsis and severe sepsis against a reference standard medical chart review, and optimised this algorithm through examination of other conditions apparent in sepsis. Measures Sensitivity (Sn), specificity (Sp), positive predictive value (PPV) and negative predictive value (NPV) were calculated. Results Sepsis was present in 604 of 1001 ICU patients (60.4%). The CIHI ICD-10-CA-coded definition for sepsis had Sn (46.4%), Sp (98.7%), PPV (98.2%) and NPV (54.7%); and for severe sepsis had Sn (47.2%), Sp (97.5%), PPV (95.3%) and NPV (63.2%). The optimised ICD-coded algorithm for sepsis increased Sn by 25.5% and NPV by 11.9% with slightly lowered Sp (85.4%) and PPV (88.2%). For severe sepsis both Sn (65.1%) and NPV (70.1%) increased, while Sp (88.2%) and PPV (85.6%) decreased slightly. Conclusions This study demonstrates that sepsis is highly undercoded in administrative data, thus under-ascertaining the true incidence of sepsis. The optimised ICD-coded definition has a higher validity with higher Sn and should be preferentially considered if used for surveillance purposes. PMID:26700284
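The four operating characteristics reported above follow directly from a 2×2 confusion matrix; a minimal helper (the counts in the example call are back-calculated from the abstract's ICU sepsis percentages, 604 of 1001 patients with sepsis, and are approximate):

```python
def operating_characteristics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Approximate counts reproducing the CIHI sepsis definition's Sn 46.4%, Sp 98.7%
stats = operating_characteristics(tp=280, fp=5, fn=324, tn=392)
```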

  7. Real-time validation of receiver state information in optical space-time block code systems.

    Science.gov (United States)

    Alamia, John; Kurzweg, Timothy

    2014-06-15

    Free space optical interconnect (FSOI) systems are a promising solution to interconnect bottlenecks in high-speed systems. To overcome some sources of diminished FSOI performance caused by close proximity of multiple optical channels, multiple-input multiple-output (MIMO) systems implementing encoding schemes such as space-time block coding (STBC) have been developed. These schemes utilize information pertaining to the optical channel to reconstruct transmitted data. The STBC system is dependent on accurate channel state information (CSI) for optimal system performance. As a result of dynamic changes in optical channels, a system in operation will need to have updated CSI. Therefore, validation of the CSI during operation is a necessary tool to ensure FSOI systems operate efficiently. In this Letter, we demonstrate a method of validating CSI, in real time, through the use of moving averages of the maximum likelihood decoder data, and its capacity to predict the bit error rate (BER) of the system.
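The moving-average validation of a decoder metric can be sketched as follows; the window length, threshold, and class name are illustrative assumptions, not the paper's parameters:

```python
from collections import deque

class CSIValidator:
    """Tracks a moving average of a per-symbol decoder metric (e.g. the
    maximum-likelihood decision distance) and flags when it drifts past a
    threshold, suggesting the channel state information may be stale."""

    def __init__(self, window=64, threshold=2.0):
        self.buf = deque(maxlen=window)
        self.threshold = threshold

    def update(self, metric):
        """Add one metric sample and return the current moving average."""
        self.buf.append(metric)
        return sum(self.buf) / len(self.buf)

    def csi_stale(self):
        """True when the windowed average exceeds the threshold."""
        return bool(self.buf) and sum(self.buf) / len(self.buf) > self.threshold
```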

  8. The TALL-3D facility design and commissioning tests for validation of coupled STH and CFD codes

    Energy Technology Data Exchange (ETDEWEB)

    Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se; Jeltsov, Marti, E-mail: marti@safety.sci.kth.se; Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se; Karbojian, Aram, E-mail: karbojan@kth.se; Villanueva, Walter, E-mail: walter@safety.sci.kth.se; Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se

    2015-08-15

    Highlights: • Design of a heavy liquid thermal-hydraulic loop for CFD/STH code validation. • Description of the loop instrumentation and assessment of measurement error. • Experimental data from forced to natural circulation transient. - Abstract: Application of coupled CFD (Computational Fluid Dynamics) and STH (System Thermal Hydraulics) codes is a prerequisite for computationally affordable and sufficiently accurate prediction of thermal-hydraulics of complex systems. Coupled STH and CFD codes require validation for understanding and quantification of the sources of uncertainties in the code prediction. TALL-3D is a liquid Lead Bismuth Eutectic (LBE) loop developed according to the requirements for the experimental data for validation of coupled STH and CFD codes. The goals of the facility design are to provide (i) mutual feedback between natural circulation in the loop and complex 3D mixing and stratification phenomena in the pool-type test section, (ii) a possibility to validate standalone STH and CFD codes for each subsection of the facility, and (iii) sufficient number of experimental data to separate the process of input model calibration and code validation. Description of the facility design and its main components, approach to estimation of experimental uncertainty and calibration of model input parameters that are not directly measured in the experiment are discussed in the paper. First experimental data from the forced to natural circulation transient is also provided in the paper.

  9. Development and validation of the fast doppler broadening module coupled within RMC code

    International Nuclear Information System (INIS)

    Yu Jiankai; Liang Jin'gang; Yu Ganglin; Wang Kan

    2015-01-01

    On-the-fly Doppler broadening of temperature-dependent nuclear cross sections is an efficient approach to reducing memory consumption in Monte Carlo based reactor physics simulations. RXSP is a nuclear cross-section processing code being developed by the REAL team in the Department of Engineering Physics at Tsinghua University, which has excellent performance in Doppler broadening temperature-dependent continuous-energy neutron cross sections. To meet the dual requirements of accuracy and efficiency in Monte Carlo simulations involving many materials and many temperatures, this work enables on-the-fly Doppler broadening of cross sections during neutron transport by coupling the Fast Doppler Broadening module of the RXSP code into the RMC code, also developed by the REAL team at Tsinghua University. Additionally, the original OpenMP-based parallelism has been successfully converted into an MPI-based framework, fully compatible with neutron transport in the RMC code, which achieves a substantial improvement in parallel efficiency. This work also provides a flexible approach to Monte Carlo based full-core depletion calculations with temperature feedback in many isotopes. (author)
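As a rough illustration of why temperature dependence matters here, the Gaussian Doppler width of a resonance grows with √T; a hedged sketch using the standard free-gas width formula, not RXSP's actual broadening kernel:

```python
import math

K_B_EV_PER_K = 8.617333e-5  # Boltzmann constant in eV/K

def doppler_width(E0, T, A):
    """Gaussian Doppler width (eV) of a resonance at energy E0 (eV) for a
    nuclide of mass number A at temperature T (K): sqrt(4*E0*kT/A)."""
    return math.sqrt(4.0 * E0 * K_B_EV_PER_K * T / A)
```

For the well-known 6.67 eV U-238 resonance at room temperature this gives a width of roughly 0.05 eV, which is why room-temperature and operating-temperature cross sections differ enough to require per-temperature broadening.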

  10. Validation of the XLACS code related to contribution of resolved and unresolved resonances and background cross sections

    International Nuclear Information System (INIS)

    Anaf, J.; Chalhoub, E.S.

    1990-01-01

    The procedures for calculating the contributions of resolved and unresolved resonances and background cross sections in the XLACS code were revised. A constant weighting function and a temperature of zero kelvin were considered. The discrepancies found were corrected, and the validated XLACS code now generates results that are correct and in accordance with its originally established procedures. (author)

  11. Validation of the MCNP-DSP Monte Carlo code for calculating source-driven noise parameters of subcritical systems

    International Nuclear Information System (INIS)

    Valentine, T.E.; Mihalczo, J.T.

    1995-01-01

    This paper describes calculations performed to validate MCNP-DSP, the modified version of the MCNP code, with respect to: the neutron and photon spectra of the spontaneous fission of californium-252; the representation of the detection processes for scattering detectors; the timing of the detection process; and the calculation of the frequency analysis parameters of the MCNP-DSP code

  12. Relative validity of the pre-coded food diary used in the Danish National Survey of Diet and Physical Activity

    DEFF Research Database (Denmark)

    Knudsen, Vibeke Kildegaard; Gille, Maj-Britt; Nielsen, Trine Holmgaard

    2011-01-01

    Objective: To determine the relative validity of the pre-coded food diary applied in the Danish National Survey of Dietary Habits and Physical Activity. Design: A cross-over study among seventy-two adults (aged 20 to 69 years) recording diet by means of a pre-coded food diary over 4 d and a 4 d...

  13. Assessment of heat transfer correlations for supercritical water in the frame of best-estimate code validation

    International Nuclear Information System (INIS)

    Jaeger, Wadim; Espinoza, Victor H. Sanchez; Schneider, Niko; Hurtado, Antonio

    2009-01-01

    Within the frame of the Generation IV international forum six innovative reactor concepts are the subject of comprehensive investigations. In some projects supercritical water will be considered as coolant, moderator (as for the High Performance Light Water Reactor) or secondary working fluid (one possible option for Liquid Metal-cooled Fast Reactors). Supercritical water is characterized by a pronounced change of the thermo-physical properties when crossing the pseudo-critical line, which goes hand in hand with a change in the heat transfer (HT) behavior. Hence, it is essential to estimate, in a proper way, the heat-transfer coefficient and subsequently the wall temperature. The scope of this paper is to present and discuss the activities at the Institute for Reactor Safety (IRS) related to the implementation of correlations for wall-to-fluid HT at supercritical conditions in Best-Estimate codes like TRACE as well as its validation. It is important to validate TRACE before applying it to safety analyses of HPLWR or of other reactor systems. In the past 3 decades various experiments have been performed all over the world to reveal the peculiarities of wall-to-fluid HT at supercritical conditions. Several different heat transfer phenomena such as HT enhancement (due to higher Prandtl numbers in the vicinity of the pseudo-critical point) or HT deterioration (due to strong property variations) were observed. Since TRACE is a component based system code with a finite volume method the resolution capabilities are limited and not all physical phenomena can be modeled properly. But Best-Estimate system codes are nowadays the preferred option for safety related investigations of full plants or other integral systems. Thus, the increase of the confidence in such codes is of high priority. In this paper, the post-test analysis of experiments with supercritical parameters will be presented.
For that reason various correlations for the HT, which consider the characteristics
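The wall-temperature estimation the abstract refers to hinges on a wall-to-fluid heat-transfer correlation. A minimal sketch of the idea, using only the classical single-phase Dittus-Boelter form (the actual supercritical correlations implemented in TRACE add property-ratio corrections, and the input values below are made up):

```python
def htc_dittus_boelter(re, pr, k, d_h):
    """Heat-transfer coefficient from a Dittus-Boelter-type correlation.

    Supercritical-water correlations (e.g. of the Bishop or Jackson
    family) add property-ratio corrections to this basic shape; only
    the classical single-phase form is shown here as an illustration.
    """
    nu = 0.023 * re**0.8 * pr**0.4   # Nusselt number
    return nu * k / d_h              # h = Nu * k / D_h  [W/m^2/K]

def wall_temperature(q_wall, h, t_bulk):
    """Wall temperature from Newton's law of cooling: q'' = h (Tw - Tb)."""
    return t_bulk + q_wall / h

# Illustrative numbers only: Re = 1e5, Pr = 1.2, k = 0.1 W/m/K, D_h = 1 cm
h = htc_dittus_boelter(1e5, 1.2, 0.1, 0.01)
tw = wall_temperature(5.0e4, h, 650.0)   # 50 kW/m^2 wall heat flux
```

HT deterioration shows up in such a framework as a drop of `h` (and hence a wall-temperature excursion) that the uncorrected correlation fails to predict.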

  14. Researching on knowledge architecture of design by analysis based on ASME code

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan

    2003-01-01

    The quality of a knowledge-based system's knowledge architecture is one of the decisive factors for the system's validity and rationality. For designing the ASME code knowledge-based system, this paper presents a knowledge acquisition method that extracts knowledge through document analysis in consultation with domain experts. The paper then describes a knowledge architecture of design by analysis based on the related rules in the ASME code. The knowledge in this architecture is divided into two categories: empirical knowledge and ASME code knowledge. As the foundation of the knowledge architecture, a general procedural process of design by analysis that meets the engineering design requirements and designers' conventional mode of work is generalized and explained in detail in the paper. To improve the inference efficiency and concurrent computation of the knowledge-based system, a knowledge Petri net (KPN) model is proposed and adopted to express the knowledge architecture. Furthermore, for validation and verification of the empirical rules, five knowledge validation and verification theorems are given in the paper. The results of this research are also applicable to designing the knowledge architecture of ASME codes or other engineering standards. (author)

  15. Unfolding code for neutron spectrometry based on neural nets technology

    International Nuclear Information System (INIS)

    Ortiz R, J. M.; Vega C, H. R.

    2012-10-01

    The most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches, and novel methods based on Artificial Neural Networks have been widely investigated. In this work, a neutron spectrum unfolding code based on neural-net technology is presented. This unfolding code, called Neutron Spectrometry and Dosimetry by means of Artificial Neural Networks, was designed with a graphical interface in the LabVIEW programming environment. The core of the code is an embedded neural network architecture, previously optimized by the Robust Design of Artificial Neural Networks Methodology. The code is easy to use and friendly and intuitive for the user. It was designed for a Bonner sphere system based on a 6LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. The main feature of the code is that, as input data, only seven count rates measured with a Bonner sphere spectrometer are required to simultaneously unfold the 60 energy bins of the neutron spectrum and to calculate 15 dosimetric quantities for radiation protection purposes. The code generates a full report in html format with all relevant information. (Author)
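The ill-posedness that motivates the neural-net approach is visible directly in the forward problem: the sphere readings are a linear map of the spectrum through the response matrix, with far fewer equations than unknowns (7 versus 60 in the system described above). A small numpy sketch with made-up dimensions and a random response matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical response matrix: 3 sphere readings x 8 energy bins
# (the real Bonner-sphere problem above is 7 readings x 60 bins).
R = rng.random((3, 8))
phi_true = rng.random(8)      # an arbitrary "true" spectrum
counts = R @ phi_true         # forward problem: readings follow directly

# Inverse problem: 3 equations, 8 unknowns -> underdetermined.
phi_pinv = np.linalg.pinv(R) @ counts

# The pseudo-inverse reproduces the readings exactly...
print(np.allclose(R @ phi_pinv, counts))   # True
# ...but it is only one of infinitely many spectra that do so, and it
# generally differs from the true spectrum:
print(np.allclose(phi_pinv, phi_true))     # False
```

This is why unfolding needs prior information, whether through regularization or through a trained network that has learned plausible spectrum shapes.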

  16. Unfolding code for neutron spectrometry based on neural nets technology

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J. M.; Vega C, H. R., E-mail: morvymm@yahoo.com.mx [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Apdo. Postal 336, 98000 Zacatecas (Mexico)

    2012-10-15

    The most delicate part of neutron spectrometry is the unfolding process. The derivation of the spectral information is not simple because the unknown is not given directly as a result of the measurements. The drawbacks associated with traditional unfolding procedures have motivated the need for complementary approaches, and novel methods based on Artificial Neural Networks have been widely investigated. In this work, a neutron spectrum unfolding code based on neural-net technology is presented. This unfolding code, called Neutron Spectrometry and Dosimetry by means of Artificial Neural Networks, was designed with a graphical interface in the LabVIEW programming environment. The core of the code is an embedded neural network architecture, previously optimized by the Robust Design of Artificial Neural Networks Methodology. The code is easy to use and friendly and intuitive for the user. It was designed for a Bonner sphere system based on a 6LiI(Eu) neutron detector and a response matrix expressed in 60 energy bins taken from an International Atomic Energy Agency compilation. The main feature of the code is that, as input data, only seven count rates measured with a Bonner sphere spectrometer are required to simultaneously unfold the 60 energy bins of the neutron spectrum and to calculate 15 dosimetric quantities for radiation protection purposes. The code generates a full report in html format with all relevant information. (Author)

  17. Decay heat experiment and validation of calculation code systems for fusion reactor

    International Nuclear Information System (INIS)

    Maekawa, Fujio; Ikeda, Yujiro; Wada, Masayuki

    1999-10-01

    Although accurate estimation of decay heat is essential for safety analyses of fusion reactors against loss-of-coolant accidents and other events, no experimental work had been devoted to validating the estimation. Hence, a decay heat measurement experiment was performed as a task (T-339) of ITER/EDA. A new detector, the Whole Energy Absorption Spectrometer (WEAS), was developed for accurate and efficient measurements of decay heat. Decay heat produced in thirty-two sample materials irradiated by 14-MeV neutrons at FNS/JAERI was measured with WEAS over a wide cooling-time range from 1 min to 400 days. The data obtained were the first experimental decay heat data in the field of fusion. The validity of the decay heat calculation codes ACT4 and CINAC-V4, the activation cross-section libraries FENDL/A-2.0 and JENDL Activation File, and decay data was investigated through analyses of the experiment. As a result, several points that should be modified were found in the codes and data. After solving these problems, it was demonstrated that decay heat values calculated for most of the samples were in good agreement with the experimental data. In particular, for stainless steel 316 and copper, which are important materials for ITER, decay heat could be predicted with an accuracy of ±10%. (author)

  18. Decay heat experiment and validation of calculation code systems for fusion reactor

    Energy Technology Data Exchange (ETDEWEB)

    Maekawa, Fujio; Ikeda, Yujiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Wada, Masayuki

    1999-10-01

    Although accurate estimation of decay heat is essential for safety analyses of fusion reactors against loss-of-coolant accidents and other events, no experimental work had been devoted to validating the estimation. Hence, a decay heat measurement experiment was performed as a task (T-339) of ITER/EDA. A new detector, the Whole Energy Absorption Spectrometer (WEAS), was developed for accurate and efficient measurements of decay heat. Decay heat produced in thirty-two sample materials irradiated by 14-MeV neutrons at FNS/JAERI was measured with WEAS over a wide cooling-time range from 1 min to 400 days. The data obtained were the first experimental decay heat data in the field of fusion. The validity of the decay heat calculation codes ACT4 and CINAC-V4, the activation cross-section libraries FENDL/A-2.0 and JENDL Activation File, and decay data was investigated through analyses of the experiment. As a result, several points that should be modified were found in the codes and data. After solving these problems, it was demonstrated that decay heat values calculated for most of the samples were in good agreement with the experimental data. In particular, for stainless steel 316 and copper, which are important materials for ITER, decay heat could be predicted with an accuracy of ±10%. (author)

  19. Experiences using IAEA Code of practice for radiation sterilization of tissue allografts: Validation and routine control

    Energy Technology Data Exchange (ETDEWEB)

    Hilmy, N. [Batan Research Tissue Bank (BRTB), Centre for Research and Development of Isotopes and Radiation Technology, P.O. Box 7002, JKSKL, Jakarta 12070 (Indonesia)], E-mail: nazly@batan.go.id; Febrida, A.; Basril, A. [Batan Research Tissue Bank (BRTB), Centre for Research and Development of Isotopes and Radiation Technology, P.O. Box 7002, JKSKL, Jakarta 12070 (Indonesia)

    2007-11-15

    The problems in applying International Standard ISO 11137 to validation of the radiation sterilization dose (RSD) for tissue allografts are the limited and low numbers of uniform samples per production batch, a batch being the products obtained from one donor. An allograft is a graft transplanted between two different individuals of the same species. The minimum number of uniform samples needed for a verification dose (VD) experiment at the selected sterility assurance level (SAL) per production batch according to the IAEA Code is 20, i.e., 10 for bioburden determination and the remaining 10 for the sterility test. Three methods of the IAEA Code have been used for validation of the RSD, i.e., method A1, which is a modification of method 1 of ISO 11137:1995; method B (ISO 13409:1996); and method C (AAMI TIR 27:2001). This paper describes VD experiments using uniform products obtained from one cadaver donor, i.e., cancellous bones and demineralized bone powders, and amnion grafts from one living donor. The results of the verification dose experiments show that the RSD is 15.4 kGy for cancellous and demineralized bone grafts and 19.2 kGy for amnion grafts according to method A1, and 25 kGy according to methods B and C.
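The dose-setting logic behind such experiments can be illustrated with the textbook log-reduction relation between bioburden, microbial radiation resistance and the target SAL. Note that this single-D10 form is only illustrative: the IAEA Code methods named above (A1, B, C) instead use tabulated standard distributions of resistances, and the numbers below are invented.

```python
import math

def sterilization_dose(bioburden, d10_kgy, sal=1e-6):
    """Dose (kGy) to reduce an initial bioburden N0 to a target SAL,
    assuming simple first-order (single-D10) inactivation kinetics:

        D = D10 * (log10(N0) - log10(SAL))

    Illustrative only; ISO 11137 / IAEA Code dose-setting methods use
    tabulated distributions of microbial resistances instead.
    """
    return d10_kgy * (math.log10(bioburden) - math.log10(sal))

# e.g. bioburden of 100 CFU and a hypothetical D10 of 2 kGy:
dose = sterilization_dose(100.0, 2.0)   # 8 log reductions -> 16 kGy
```

The same relation shows why a verification dose is chosen to reach a laxer SAL (e.g. 10^-1 or 10^-2): a much lower dose suffices, and surviving positives in the 10-sample sterility test then validate or reject the full RSD.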

  20. Validation of two-phase flow code THYC on VATICAN experiment

    International Nuclear Information System (INIS)

    Maurel, F.; Portesse, A.; Rimbert, P.; Thomas, B.

    1997-01-01

    As part of a comprehensive program for THYC validation (THYC is a 3-dimensional two-phase flow computer code for PWR core configurations), an experimental project, VATICAN, has been initiated by the Direction des Etudes et Recherches of Electricite de France. Two mock-ups tested in Refrigerant-114, VATICAN-1 (with simple spacer grids) and VATICAN-2 (with mixing grids), were set up to investigate void fraction distributions using a single-beam gamma densitometer. First, experiments were conducted with the VATICAN-1 mock-up. A set of constitutive laws to be used in rod bundles was determined, but some doubts still remain about the friction-loss closure laws for oblique flow over tubes. For the VATICAN-2 tests, calculations were performed using the standard set of correlations. Comparison with the experimental data shows an underprediction of void fraction by THYC in disturbed regions. Analyses highlight the poor treatment of the axial relative velocity in these regions. Fitting the radial and axial relative velocity values in the disturbed region improves the prediction of void fraction by the code, but without any physical explanation. More analytical experiments should be carried out to validate friction-loss closure laws for oblique flows and the relative velocity downstream of a mixing grid. (author)
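The single-beam gamma densitometer mentioned above infers the line-averaged void fraction from beam attenuation. Assuming exponential attenuation with a mixture attenuation coefficient linear in void fraction, the standard data reduction (the textbook relation, not necessarily the exact VATICAN processing) is:

```python
import math

def void_fraction(i_meas, i_liquid, i_gas):
    """Line-averaged void fraction from single-beam gamma attenuation.

    With I = I0 * exp(-mu_eff * x) and a mixture coefficient linear in
    void fraction alpha, calibration shots through all-liquid (I_liq)
    and all-gas (I_gas) channels give

        alpha = ln(I / I_liq) / ln(I_gas / I_liq)
    """
    return math.log(i_meas / i_liquid) / math.log(i_gas / i_liquid)

# Calibration endpoints map to alpha = 0 (liquid) and alpha = 1 (gas);
# intermediate intensities interpolate logarithmically between them.
alpha = void_fraction(150.0, 100.0, 200.0)
```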

  1. Validation of two-phase flow code THYC on VATICAN experiment

    Energy Technology Data Exchange (ETDEWEB)

    Maurel, F.; Portesse, A.; Rimbert, P.; Thomas, B. [EDF/DER, Dept. TTA, 78 - Chatou (France)

    1997-12-31

    As part of a comprehensive program for THYC validation (THYC is a 3-dimensional two-phase flow computer code for PWR core configurations), an experimental project, VATICAN, has been initiated by the Direction des Etudes et Recherches of Electricite de France. Two mock-ups tested in Refrigerant-114, VATICAN-1 (with simple spacer grids) and VATICAN-2 (with mixing grids), were set up to investigate void fraction distributions using a single-beam gamma densitometer. First, experiments were conducted with the VATICAN-1 mock-up. A set of constitutive laws to be used in rod bundles was determined, but some doubts still remain about the friction-loss closure laws for oblique flow over tubes. For the VATICAN-2 tests, calculations were performed using the standard set of correlations. Comparison with the experimental data shows an underprediction of void fraction by THYC in disturbed regions. Analyses highlight the poor treatment of the axial relative velocity in these regions. Fitting the radial and axial relative velocity values in the disturbed region improves the prediction of void fraction by the code, but without any physical explanation. More analytical experiments should be carried out to validate friction-loss closure laws for oblique flows and the relative velocity downstream of a mixing grid. (author)

  2. The data requirements for the verification and validation of a fuel performance code - the transuranus perspective

    International Nuclear Information System (INIS)

    Schubert, A.; Di Marcello, V.; Rondinella, V.; Van De Laar, J.; Van Uffelen, P.

    2013-01-01

    In general, the verification and validation (V and V) of a fuel performance code like TRANSURANUS consists of three basic steps: a) verifying the correctness and numerical stability of the sub-models; b) comparing the sub-models with experimental data; c) comparing the results of the integral fuel performance code with experimental data. Only the second and third steps of the V and V rely on experimental information. This scheme can be further detailed according to the physical origin of the data: on the one hand, in-reactor ('in-pile') experimental data are generated in the course of the irradiation; on the other hand, ex-reactor ('out-of-pile') experimental data are obtained, for instance, from various post-irradiation examinations (PIE) or dedicated experiments with fresh samples. For both categories, we first discuss the V and V of sub-models of TRANSURANUS related to separate aspects of the fuel behaviour: this includes the radial variation of the composition and fissile isotopes, the thermal properties of the fuel (e.g. thermal conductivity, melting temperature, etc.), the mechanical properties of fuel and cladding (e.g. elastic constants, creep properties), as well as the models for fission product behaviour. Secondly, the integral code verification is addressed, as it treats various aspects of the fuel behaviour, including the geometrical changes in the fuel and the gas pressure and composition of the free volume in the rod. (authors)

  3. Validation of the metal fuel version of the SAS4A accident analysis code

    International Nuclear Information System (INIS)

    Tentner, A.M.

    1991-01-01

    This paper describes recent work directed towards the validation of the metal fuel version of the SAS4A accident analysis code. The SAS4A code system has been developed at Argonne National Laboratory for the simulation of hypothetical severe accidents in Liquid Metal-Cooled Reactors (LMR), designed to operate in a fast neutron spectrum. SAS4A was initially developed for the analysis of oxide-fueled liquid metal-cooled reactors and has played an important role in the simulation and assessment of the energetics potential for postulated severe accidents in these reactors. Due to the current interest in the metal-fueled liquid metal-cooled reactors, a metal fuel version of the SAS4A accident analysis code is being developed in the Integral Fast Reactor program at Argonne. During such postulated accident scenarios as the unprotected (i.e. without scram) loss-of-flow and transient overpower events, a large number of interrelated physical phenomena occur during a relatively short time. These phenomena include transient heat transfer and hydrodynamic events, coolant boiling, and fuel and cladding melting and relocation. Due to strong neutronic feedbacks these events can significantly influence the reactor power history in the accident progression. The paper presents the results of a recent SAS4A simulation of the M7 TREAT experiment. 6 refs., 5 figs

  4. Validation of the ASSERT subchannel code for MAPLE-X10 reactor conditions

    International Nuclear Information System (INIS)

    Carver, M.B.; Kiteley, J.C.; Junop, S.V.; Wasilewicz, J.F.

    1993-01-01

    The ASSERT subchannel analysis code has been developed specifically to model flow and phase distributions within CANDU fuel channels. Recently, ASSERT has been adapted for use in simulating the MAPLE-X10 reactor. ASSERT uses an advanced drift-flux model, which permits the phases to have unequal velocities and unequal temperatures (UVUT), and thus can model non-equilibrium effects such as phase separation tendencies and subcooled boiling. Modelling subcooled boiling accurately is particularly important for MAPLE-X10. This paper briefly summarizes the non-equilibrium model used in the ASSERT code, the equations used to represent these models, and the algorithms used to solve the equations numerically. Very few modifications to the ASSERT models were needed to address MAPLE conditions. These centered on the manner in which finned fuel rods are treated, and they are discussed in the paper. The paper also gives results from validation exercises, in which the ASSERT code predictions of subcooled boiling void-fraction and critical heat flux were compared to experiments using MAPLE-X10 finned fuel elements in annuli and various bundles. 18 refs., 13 figs., 3 tabs
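The advanced drift-flux model's key departure from homogeneous flow is that the vapour moves faster than the mixture. A minimal sketch of the resulting void-fraction relation (the parameter values are illustrative placeholders, not ASSERT's actual correlations):

```python
def drift_flux_void_fraction(j_g, j_f, c0=1.13, v_gj=0.2):
    """Void fraction from the drift-flux relation

        alpha = j_g / (C0 * (j_g + j_f) + V_gj)

    where j_g, j_f are gas/liquid superficial velocities [m/s], C0 is
    the distribution parameter and V_gj the drift velocity. The default
    values are illustrative, not ASSERT's closure relations.
    """
    j = j_g + j_f                  # total superficial velocity [m/s]
    return j_g / (c0 * j + v_gj)
```

With C0 = 1 and V_gj = 0 this collapses to the homogeneous (equal-velocity) model alpha = j_g / j; C0 > 1 and V_gj > 0 lower the predicted void fraction, which is exactly the non-equilibrium effect the abstract highlights.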

  5. Validation of CATHARE 3D code against UPTF TRAM C3 transients

    International Nuclear Information System (INIS)

    Glantz, Tony; Freitas, Roberto

    2007-01-01

    Within the nuclear reactor safety analysis, one of the events that could potentially lead to a recriticality accident in case of a Small Break LOCA (SBLOCA) in a pressurized water reactor (PWR) is a boron dilution scenario followed by a coolant mixing transient. Some UPTF experiments can be interpreted as generic boron dilution experiments. In fact, the UPTF experiments were originally designed to conduct separate effects studies focused on multi-dimensional thermal hydraulic phenomena. But, in the case of experimental program TRAM, some studies are realized on the boron mixing: tests C3. Some of these tests have been used for the validation and assessment of the 3D module of CATHARE code. Results are very satisfying; CATHARE 3D code is able to reproduce correctly the main features of the UPTF TRAM C3 tests, the temperature mixing in the cold leg, the formation of a strong stratification in the upper downcomer, the perfect mixing temperature in the lower downcomer and the strong stratification in the lower plenum. These results are also compared with the CFX-5 and TRIO-U codes results on these tests. (author)

  6. Validations of BWR nuclear design code using ABWR MOX numerical benchmark problems

    International Nuclear Information System (INIS)

    Takano, Shou; Sasagawa, Masaru; Yamana, Teppei; Ikehara, Tadashi; Yanagisawa, Naoki

    2017-01-01

    The BWR core design code package (the HINES assembly code and the PANACH core simulator), used for full-MOX ABWR core design, has been benchmarked against high-fidelity numerical solutions as references, for the purpose of validating its capability of predicting BWR core design parameters systematically from UO2 to 100% MOX cores. The reference solutions were created by whole-core criticality calculations using MCNP with precisely modeled ABWR cores, in both hot and cold conditions at BOC and EOC of the equilibrium cycle. A Doppler-Broadening Rejection Correction (DBRC) implemented in MCNP5-1.4 with ENDF/B-VII.0 was mainly used to evaluate the core design parameters, except for the effective delayed neutron fraction (βeff) and prompt neutron lifetime (l), which were evaluated with MCNP6.1. The discrepancies between the design codes HINES-PANACH and MCNP for core design parameters such as bundle powers, hot pin powers, control rod worth, boron worth, void reactivity, Doppler reactivity, βeff and l are almost within the target accuracy, leading to the conclusion that HINES-PANACH has sufficient fidelity for application to full-MOX ABWR core design. (author)
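Code-versus-reference discrepancies in multiplication factors of the kind benchmarked here are conventionally quoted as a reactivity difference in pcm. A one-line sketch with made-up k-eff values (not numbers from this benchmark):

```python
def reactivity_difference_pcm(k_code, k_ref):
    """Reactivity discrepancy between a design-code and a reference
    Monte Carlo multiplication factor, in pcm (1 pcm = 1e-5):

        delta_rho = (1/k_ref - 1/k_code) * 1e5
    """
    return (1.0 / k_ref - 1.0 / k_code) * 1e5

# Hypothetical example: design code 1.00100 vs Monte Carlo 1.00000
drho = reactivity_difference_pcm(1.001, 1.000)   # ~ +100 pcm
```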

  7. PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility

    Energy Technology Data Exchange (ETDEWEB)

    Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States); Skifton, Richard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Stoots, Carl [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Eung Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Conder, Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-12-01

    Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal-hydraulics field for design and safety analyses. To validate CFD codes, high-quality multi-dimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This image displacement is indicated by the location of the largest correlation peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. This study is developing a reliable method to analyze the uncertainty and sensitivity of the measured data, together with a computer code that automates the analysis. The main objective is to establish a well-founded uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. The uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing); each uncertainty source is then mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility with sample test results.
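The cross-correlation step described above can be sketched in a few lines: correlate two interrogation windows (via FFTs, as most PIV codes do) and take the location of the largest peak as the particle displacement. The synthetic image and shift below are made up for the demonstration:

```python
import numpy as np

def displacement_by_correlation(win_a, win_b):
    """Estimate the integer pixel displacement of win_b relative to
    win_a from the peak of their circular cross-correlation,
    computed via FFTs."""
    corr = np.fft.ifft2(np.fft.fft2(win_a).conj() * np.fft.fft2(win_b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peaks in the upper half back to negative displacements.
    return tuple(int(p) - s if p > s // 2 else int(p)
                 for p, s in zip(peak, corr.shape))

# Synthetic check: shift a random "particle image" by (3, 5) pixels.
rng = np.random.default_rng(1)
a = rng.random((32, 32))
b = np.roll(a, (3, 5), axis=(0, 1))
print(displacement_by_correlation(a, b))   # (3, 5)
```

Real PIV codes refine this integer estimate with sub-pixel peak fitting, and the peak-fitting choice is itself one of the data-processing uncertainty sources the study categorizes.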

  8. Water evaporation over sump surface in nuclear containment studies: CFD and LP codes validation on TOSQAN tests

    Energy Technology Data Exchange (ETDEWEB)

    Malet, J., E-mail: jeanne.malet@irsn.fr [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), PSN-RES/SCA BP 68, 91192 Gif-sur-Yvette (France); Degrees du Lou, O. [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), PSN-RES/SCA BP 68, 91192 Gif-sur-Yvette (France); Arts et Métiers ParisTech, DynFluid Lab. EA92, 151, boulevard de l’Hôpital, 75013 Paris (France); Gelain, T. [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), PSN-RES/SCA BP 68, 91192 Gif-sur-Yvette (France)

    2013-10-15

    Highlights: • Simulations of evaporative TOSQAN sump tests are performed. • These tests are under air–steam gas conditions with addition of He, CO2 and SF6. • The ASTEC-CPA LP and TONUS-CFD codes, with UDF for the sump model, are used. • Validation of the sump models of both codes shows good results. • The code–experiment differences are attributed to turbulent gas mixing modeling. -- Abstract: During the course of a severe accident in a Nuclear Power Plant, water can be collected in the containment sump through steam condensation on walls and spray system activation. The objective of this paper is to present code validation on evaporative sump tests performed in the TOSQAN facility. The ASTEC-CPA code is used as a lumped-parameter code, and specific user-defined functions are developed for the TONUS-CFD code. The seven tests are air–steam tests, as well as tests with other non-condensable gases (He, CO2 and SF6), under steady and transient conditions (two depressurization tests). The results show a good agreement between codes and experiments, indicating a good behavior of the sump models in both codes. The sump model developed as User-Defined Functions (UDF) for TONUS is considered well validated and is 'ready-to-use' for all CFD codes in which such UDF can be added. The remaining discrepancies between codes and experiments are caused by turbulent transport and gas mixing, especially in the presence of non-condensable gases other than air, so that code validation on this important topic for hydrogen safety analysis is still recommended.

  9. Validation of integrated burnup code system SWAT2 by the analyses of isotopic composition of spent nuclear fuel

    International Nuclear Information System (INIS)

    Suyama, K.; Mochizuki, H.; Okuno, H.; Miyoshi, Y.

    2004-01-01

    This paper provides validation results for SWAT2, the revised version of SWAT, a code system combining the point-burnup code ORIGEN2 and the continuous-energy Monte Carlo code MVP, through the analysis of post-irradiation examinations (PIEs). Some isotopes show differences between the results of SWAT and SWAT2. In general, however, the differences are smaller than the error of the PIE analysis reported in the previous SWAT validation activity, and improved results are obtained for several important fission-product nuclides. This study also includes a comparison between an assembly model and a single-pin-cell geometry model. (authors)
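Validation against PIE data of this kind is typically summarized as calculated-to-experimental (C/E) ratios per nuclide, with C/E = 1 meaning perfect agreement. A minimal sketch with invented concentrations (not actual SWAT2 or PIE values):

```python
def c_over_e(calculated, measured):
    """Calculated-to-experimental (C/E) ratios per nuclide, the usual
    figure of merit when validating a burnup code against PIE data."""
    return {iso: calculated[iso] / measured[iso] for iso in measured}

# Hypothetical illustration only (units could be kg/tHM, say):
calc = {"U-235": 9.8, "Pu-239": 5.2}
meas = {"U-235": 10.0, "Pu-239": 5.0}
ratios = c_over_e(calc, meas)   # C/E slightly below 1 for U-235,
                                # slightly above 1 for Pu-239
```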

  10. Diagnosis-based and external cause-based criteria to identify adverse drug reactions in hospital ICD-coded data: application to an Australian population-based study

    Directory of Open Access Journals (Sweden)

    Wei Du

    2017-04-01

    Objectives: External cause International Classification of Diseases (ICD) codes are commonly used to ascertain adverse drug reactions (ADRs) related to hospitalisation. We quantified ascertainment of ADR-related hospitalisation using external cause codes and additional ICD-based hospital diagnosis codes. Methods: We reviewed the scientific literature to identify different ICD-based criteria for ADR-related hospitalisations, developed algorithms to capture ADRs based on candidate hospital ICD-10 diagnoses and external cause codes (Y40–Y59), and incorporated previously published causality ratings estimating the probability that a specific diagnosis was ADR-related. We applied the algorithms to the NSW Admitted Patient Data Collection records of 45 and Up Study participants (2011–2013). Results: Of 493 442 hospitalisations among 267 153 study participants during 2011–2013, 18.8% (n = 92 953) had hospital diagnosis codes that were potentially ADR-related; 1.1% (n = 5305) had high/very-high-probability ADR-related diagnosis codes (causality ratings A1 and A2); and 2.0% (n = 10 039) had ADR-related external cause codes. Overall, 2.2% (n = 11 082) of cases were classified as ADR-related hospitalisations on the basis of either external cause codes or high/very-high-probability ADR-related diagnosis codes. Hence, adding high/very-high-probability ADR-related diagnosis codes to the standard external cause codes alone (Y40–Y59) increased the number of hospitalisations classified as having an ADR-related diagnosis by 10.4%. Only 6.7% of cases with high-probability ADR-related mental symptoms were captured by external cause codes. Conclusion: Selective use of high-probability ADR-related hospital diagnosis codes in addition to external cause codes yielded a modest increase in hospitalised ADR incidence, which is of potential clinical significance. Clinically validated combinations of diagnosis codes could potentially further enhance capture.
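The two-pronged ascertainment rule described above (an external cause code in Y40–Y59, or a diagnosis code with an A1/A2 causality rating) can be sketched as follows. The ICD codes and ratings in the table are hypothetical stand-ins, not the study's published list:

```python
# Hypothetical causality-rating table: diagnosis code -> rating,
# where "A1"/"A2" mean high/very-high probability of being ADR-related.
CAUSALITY = {"L27.0": "A1", "D61.1": "A2", "K52.1": "B"}

def is_adr_related(diagnosis_codes, external_cause_codes):
    """Classify a hospitalisation as ADR-related if it carries an
    external cause code in Y40-Y59 or a high/very-high-probability
    (A1/A2) diagnosis code. Illustrative logic only; the study's
    published algorithms are more detailed."""
    if any(c.startswith("Y") and 40 <= int(c[1:3]) <= 59
           for c in external_cause_codes):
        return True
    return any(CAUSALITY.get(c) in ("A1", "A2") for c in diagnosis_codes)

print(is_adr_related(["K52.1"], ["Y57.9"]))   # True (external cause)
print(is_adr_related(["L27.0"], []))          # True (A1 diagnosis)
print(is_adr_related(["K52.1"], ["W19"]))     # False
```

The study's 10.4% uplift corresponds to the cases caught only by the second branch of this rule.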

  11. Machine function based control code algebras

    NARCIS (Netherlands)

    Bergstra, J.A.

    Machine functions have been introduced by Earley and Sturgis in [6] in order to provide a mathematical foundation of the use of the T-diagrams proposed by Bratman in [5]. Machine functions describe the operation of a machine at a very abstract level. A theory of hardware and software based on

  12. Validation of the CATHARE2 code against experimental data from Brayton-cycle plants

    International Nuclear Information System (INIS)

    Bentivoglio, Fabrice; Tauveron, Nicolas; Geffraye, Genevieve; Gentner, Herve

    2008-01-01

    In recent years the Commissariat a l'Energie Atomique (CEA) has commissioned a wide range of feasibility studies of future advanced nuclear reactors, in particular gas-cooled reactors (GCR). The thermohydraulic behaviour of these systems is a key issue for, among other things, the design of the core, the assessment of thermal stresses, and the design of decay heat removal systems. These studies therefore require efficient and reliable simulation tools capable of modelling the whole reactor, including the core, the core vessel, piping, heat exchangers and turbo-machinery. CATHARE2 is a one-dimensional thermal-hydraulic reference safety code developed and extensively validated for the French pressurized water reactors. It has recently been adapted to deal also with gas-cooled reactor applications. In order to validate CATHARE2 for these new applications, CEA has initiated an ambitious long-term experimental program. The foreseen experimental facilities range from small-scale loops for physical correlations to component technology and system demonstration loops. In the short term, CATHARE2 is being validated against existing experimental data, in particular from the German power plants Oberhausen I and II. These facilities were both operated by the German utility Energie Versorgung Oberhausen (E.V.O.), and their power conversion systems resemble the high-temperature reactor concepts: Oberhausen I is a 13.75-MWe Brayton-cycle air turbine plant, and Oberhausen II is a 50-MWe Brayton-cycle helium turbine plant. The paper presents these two plants, the adopted CATHARE2 modelling, and a comparison between experimental data and code results for both steady-state and transient cases.
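As a point of reference for such Brayton-cycle plants, the ideal (cold-air-standard) cycle efficiency depends only on the pressure ratio and the working gas. A worked sketch with illustrative numbers, ignoring the recuperation that real plants like Oberhausen II employ:

```python
def brayton_ideal_efficiency(pressure_ratio, gamma):
    """Ideal (cold-air-standard) Brayton-cycle thermal efficiency:

        eta = 1 - r ** (-(gamma - 1) / gamma)

    Real recuperated plants such as Oberhausen II deviate substantially
    from this ideal figure; the pressure ratio below is illustrative.
    """
    return 1.0 - pressure_ratio ** (-(gamma - 1.0) / gamma)

# Helium is monatomic (gamma ~ 5/3), air diatomic (gamma ~ 1.4), so at
# the same pressure ratio the helium cycle has the higher ideal efficiency:
eta_he = brayton_ideal_efficiency(2.5, 5.0 / 3.0)
eta_air = brayton_ideal_efficiency(2.5, 1.4)
```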

  13. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    Science.gov (United States)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest high efficiency video coding (HEVC) standard significantly increases encoding complexity in exchange for improved coding efficiency. Due to the limited computational capability of handheld devices, complexity-constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Given the direct proportionality between encoding time and computational complexity, computational complexity is measured in terms of encoding time. First, the complexity target is mapped to a set of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, an optimal mode combination scheme, chosen through offline statistics, is applied at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (down to 10%) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, average gains of 0.63 and 0.17 dB in BD-PSNR, respectively, are observed for 18 sequences when the target complexity is around 40%.
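The budget-driven flavour of mode selection the abstract describes can be sketched as a greedy knapsack: enable prediction modes in order of estimated coding benefit per unit encoding time until the complexity budget is spent. The per-mode costs and benefit scores below are invented placeholders, not values from the paper:

```python
# Hypothetical per-CU time costs (ms) and coding-benefit scores for
# candidate HEVC prediction modes: (name, cost_ms, benefit).
MODES = [("merge", 1.0, 5.0), ("inter_2Nx2N", 3.0, 4.0),
         ("inter_NxN", 6.0, 2.0), ("intra", 4.0, 1.5)]

def select_modes(time_budget_ms):
    """Greedily enable modes with the best benefit-per-cost ratio until
    the complexity (time) budget is exhausted -- a simplified stand-in
    for the paper's adaptive mode selection."""
    chosen, spent = [], 0.0
    for name, cost, benefit in sorted(MODES, key=lambda m: m[2] / m[1],
                                      reverse=True):
        if spent + cost <= time_budget_ms:
            chosen.append(name)
            spent += cost
    return chosen

print(select_modes(4.0))   # ['merge', 'inter_2Nx2N']
```

Shrinking the budget prunes the cheap-but-weak and expensive modes first, which is how the encoder trades rate-distortion performance for a guaranteed complexity ceiling.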

  14. Hamming Code Based Watermarking Scheme for 3D Model Verification

    Directory of Open Access Journals (Sweden)

    Jen-Tse Wang

    2014-01-01

    Full Text Available Due to the explosive growth of the Internet and the maturing of 3D hardware techniques, protecting 3D objects has become an increasingly important issue. In this paper, a public Hamming code based fragile watermarking technique is proposed for 3D object verification. An adaptive watermark is generated from each cover model by using the Hamming code technique. A simple least significant bit (LSB) substitution technique is employed for watermark embedding. In the extraction stage, the Hamming code based watermark can be verified by Hamming code checking, without embedding any verification information. Experimental results show that 100% of the vertices of the cover model can be watermarked, extracted, and verified. They also show that the proposed method improves security and achieves low distortion of the stego object.
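The verification principle can be sketched with a generic Hamming (7,4) code: the parity relations inside each codeword let the extractor check (and locate) a tampered bit without any side information. This is a textbook Hamming (7,4) implementation, not the paper's exact embedding scheme.

```python
def hamming74_encode(d):
    """Four data bits -> seven-bit codeword [p1, p2, d1, p3, d2, d3, d4],
    parity bits at positions 1, 2 and 4 (1-based)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_syndrome(cw):
    """0 means the codeword is self-consistent; otherwise the returned
    value is the 1-based position of the single flipped bit."""
    s1 = cw[0] ^ cw[2] ^ cw[4] ^ cw[6]
    s2 = cw[1] ^ cw[2] ^ cw[5] ^ cw[6]
    s3 = cw[3] ^ cw[4] ^ cw[5] ^ cw[6]
    return s1 + 2 * s2 + 4 * s3
```

A zero syndrome certifies the watermark bits without any stored reference, which is why no extra verification information needs to be embedded.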

  15. COSINE software development based on code generation technology

    International Nuclear Information System (INIS)

    Ren Hao; Mo Wentao; Liu Shuo; Zhao Guang

    2013-01-01

    The code generation technology can significantly improve the quality and productivity of software development and reduce software development risk. At present, code generators are usually based on UML model-driven technology, which cannot satisfy the development demands of nuclear power calculation software. In this paper, the features of scientific computing programs were analyzed and a FORTRAN code generator (FCG) based on C# was developed. FCG can automatically generate FORTRAN code for module variable definitions according to input metadata. FCG can also generate memory allocation interfaces for dynamic variables as well as data access interfaces. FCG was applied to the development of the core and system integrated engine for design and analysis (COSINE) software. The results show that FCG can greatly improve the development efficiency of nuclear power calculation software and reduce the defect rate of software development. (authors)
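A much-simplified sketch of metadata-driven generation of FORTRAN variable declarations conveys the idea. FCG itself is written in C# and its metadata format is not described in the abstract, so the field names and type table below are invented for illustration.

```python
# Hypothetical mapping from metadata type tags to FORTRAN type names.
FTYPES = {"int": "INTEGER", "real8": "REAL(8)"}

def generate_module(name, variables):
    """Emit a FORTRAN module from a variable-description table.
    variables: dicts with 'name', 'type' and 'rank' (0 = scalar)."""
    lines = [f"MODULE {name}", "  IMPLICIT NONE"]
    for v in variables:
        attr = ""
        if v["rank"] > 0:
            # Dynamic arrays become deferred-shape ALLOCATABLE variables.
            dims = ",".join(":" for _ in range(v["rank"]))
            attr = f", DIMENSION({dims}), ALLOCATABLE"
        lines.append(f"  {FTYPES[v['type']]}{attr} :: {v['name']}")
    lines.append(f"END MODULE {name}")
    return "\n".join(lines)

src = generate_module("core_data", [
    {"name": "n_nodes", "type": "int", "rank": 0},
    {"name": "flux", "type": "real8", "rank": 2},
])
```

Generating allocation and data-access routines, as FCG does, follows the same pattern: one template per interface, filled from the same metadata record.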

  16. Computer code ENDSAM for random sampling and validation of the resonance parameters covariance matrices of some major nuclear data libraries

    International Nuclear Information System (INIS)

    Plevnik, Lucijan; Žerovnik, Gašper

    2016-01-01

    Highlights: • Methods for random sampling of correlated parameters. • Link to open-source code for sampling of resonance parameters in ENDF-6 format. • Validation of the code on realistic and artificial data. • Validation of covariances in three major contemporary nuclear data libraries. - Abstract: Methods for random sampling of correlated parameters are presented. The methods are implemented for sampling of resonance parameters in ENDF-6 format and a link to the open-source code ENDSAM is given. The code has been validated on realistic data. Additionally, consistency of covariances of resonance parameters of three major contemporary nuclear data libraries (JEFF-3.2, ENDF/B-VII.1 and JENDL-4.0u2) has been checked.
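The standard route to sampling correlated parameters is to multiply a Cholesky factor of the covariance matrix by a vector of independent standard normal deviates. The following pure-Python sketch illustrates that route only; it does not reproduce ENDSAM's actual implementation or its handling of ENDF-6 resonance data.

```python
import math
import random

def cholesky(cov):
    """Lower-triangular L with L L^T = cov (pure Python, small matrices)."""
    n = len(cov)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(cov[i][i] - s)
            else:
                L[i][j] = (cov[i][j] - s) / L[j][j]
    return L

def sample_correlated(mean, cov, rng):
    """Draw one sample of correlated, normally distributed parameters."""
    L = cholesky(cov)
    z = [rng.gauss(0.0, 1.0) for _ in mean]
    return [mean[i] + sum(L[i][k] * z[k] for k in range(i + 1))
            for i in range(len(mean))]
```

Since x = mean + L z with z standard normal, the samples have covariance L L^T = cov by construction, which is easy to confirm empirically on a large batch.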

  17. Modification and validation of the natural heat convection and subcooled void formation models in the code PARET

    International Nuclear Information System (INIS)

    Hainoun, A.; Alhabit, F.; Ghazi, N.

    2008-01-01

    Two new modifications have been included in the current PARET code, which is widely applied in the dynamic and safety analysis of research reactors. A new model was implemented for the simulation of void formation in the subcooled boiling regime; the other modification was the implementation of a new approach to improve the prediction of the heat transfer coefficient under natural circulation conditions. The modified code was successfully validated using adequate single-effect tests covering the physical phenomena of interest for both natural circulation and subcooled void formation at low pressure and low heat flux. The validation results indicate a significant improvement of the code compared to the default version. Additionally, to simplify the application of the code, an interactive user interface was developed enabling pre- and post-processing of the code predictions. (author)

  18. Novel power saving architecture for FBG based OCDMA code generation

    Science.gov (United States)

    Osadola, Tolulope B.; Idris, Siti K.; Glesk, Ivan

    2013-10-01

    A novel architecture for generating incoherent, 2-dimensional wavelength-hopping time-spreading optical CDMA codes is presented. The architecture is designed to facilitate the reuse of the optical source signal that remains unused after an OCDMA code has been generated using fiber Bragg grating based encoders. Effective utilization of the available optical power is thereby achieved by cascading several OCDMA encoders, enabling 3 dB savings in optical power.

  19. Validation Of Critical Knowledge-Based Systems

    Science.gov (United States)

    Duke, Eugene L.

    1992-01-01

    Report discusses approach to verification and validation of knowledge-based systems. Also known as "expert systems". Concerned mainly with development of methodologies for verification of knowledge-based systems critical to flight-research systems; e.g., fault-tolerant control systems for advanced aircraft. Subject matter also has relevance to knowledge-based systems controlling medical life-support equipment or commuter railroad systems.

  20. Initial validation of 4D-model for a clinical PET scanner using the Monte Carlo code gate

    International Nuclear Information System (INIS)

    Vieira, Igor F.; Lima, Fernando R.A.; Gomes, Marcelo S.; Vieira, Jose W.; Pacheco, Ludimila M.; Chaves, Rosa M.

    2011-01-01

    Building exposure computational models (ECM) of emission tomography (PET and SPECT) can currently draw on several dedicated computing tools based on Monte Carlo techniques (SimSET, SORTEO, SIMIND, GATE). This paper is divided into two steps: (1) using the dedicated code GATE (Geant4 Application for Tomographic Emission) to build a 4D model (where the fourth dimension is time) of a clinical PET scanner from General Electric, the GE ADVANCE, simulating the geometric and electronic structures suitable for this scanner, as well as some 4D phenomena, for example, the rotating gantry; (2) evaluating the performance of the model built here in reproducing the noise equivalent count rate (NEC) test based on the NEMA Standards Publication NU 2-2007 protocols for this scanner. The results of steps (1) and (2) will be compared with experimental and theoretical values from the literature, showing the actual state of the art of validation. (author)

  1. Initial validation of 4D-model for a clinical PET scanner using the Monte Carlo code gate

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Igor F.; Lima, Fernando R.A.; Gomes, Marcelo S., E-mail: falima@cnen.gov.b [Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil); Vieira, Jose W.; Pacheco, Ludimila M. [Instituto Federal de Educacao, Ciencia e Tecnologia (IFPE), Recife, PE (Brazil); Chaves, Rosa M. [Instituto de Radium e Supervoltagem Ivo Roesler, Recife, PE (Brazil)

    2011-07-01

    Building exposure computational models (ECM) of emission tomography (PET and SPECT) can currently draw on several dedicated computing tools based on Monte Carlo techniques (SimSET, SORTEO, SIMIND, GATE). This paper is divided into two steps: (1) using the dedicated code GATE (Geant4 Application for Tomographic Emission) to build a 4D model (where the fourth dimension is time) of a clinical PET scanner from General Electric, the GE ADVANCE, simulating the geometric and electronic structures suitable for this scanner, as well as some 4D phenomena, for example, the rotating gantry; (2) evaluating the performance of the model built here in reproducing the noise equivalent count rate (NEC) test based on the NEMA Standards Publication NU 2-2007 protocols for this scanner. The results of steps (1) and (2) will be compared with experimental and theoretical values from the literature, showing the actual state of the art of validation. (author)

  2. Model validation of GAMMA code with heat transfer experiment for KO TBM in ITER

    International Nuclear Information System (INIS)

    Yum, Soo Been; Lee, Eo Hwak; Lee, Dong Won; Park, Goon Cherl

    2013-01-01

    Highlights: ► In this study, a helium supplying system was constructed. ► Preparation for the heat transfer experiment under KO TBM conditions using the helium supplying system was carried out. ► To obtain more applicable results, a test matrix was made to cover the KO TBM conditions. ► Using the CFD code CFX 11, validation and modification of the system code GAMMA was performed. -- Abstract: By considering the requirements for a DEMO-relevant blanket concept, Korea (KO) has proposed a He cooled molten lithium (HCML) test blanket module (TBM) for testing in ITER. A performance analysis for the thermal–hydraulics and a safety analysis for the KO TBM have been carried out using a commercial CFD code, ANSYS-CFX, and a system code, GAMMA (GAs multicomponent mixture analysis), which was developed for gas-cooled reactors in Korea. To verify the codes, a preliminary study was performed by Lee using a single TBM first wall (FW) mock-up made from the same material as the KO TBM, ferritic martensitic steel, in a 6 MPa nitrogen gas loop. The test was performed at pressures of 1.1, 1.9 and 2.9 MPa, over a range of flow rates from 0.0105 to 0.0407 kg/s, with a constant wall temperature condition. In the present study, a thermal–hydraulic test was performed with the newly constructed helium supplying system, whose design pressure and temperature are 9 MPa and 500 °C, respectively. In the experiment, the same mock-up was used, and the test was performed at 3 MPa pressure, 30 °C inlet temperature and 70 m/s helium velocity, which are almost the same conditions as those of the KO TBM FW. One side of the mock-up was heated with a constant heat flux of 0.3–0.5 MW/m² using a graphite heating system, KoHLT-2 (Korea heat load test facility-2). Because the comparison between CFX 11 and GAMMA showed differing tendencies, the heat transfer correlation included in GAMMA was modified. The modified GAMMA showed strong agreement with CFX.
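System codes such as GAMMA obtain the wall heat transfer coefficient from empirical forced-convection correlations of this general form. The classical Dittus-Boelter correlation below is a standard textbook example of the kind of correlation involved; the abstract does not say which correlation was actually modified in GAMMA.

```python
def dittus_boelter_h(re, pr, k, d_h, heating=True):
    """Heat transfer coefficient h = Nu k / D_h from the classical
    Dittus-Boelter correlation Nu = 0.023 Re^0.8 Pr^n, with n = 0.4 for
    heating of the fluid and n = 0.3 for cooling. Roughly valid for
    fully developed turbulent flow (Re > 1e4, 0.7 < Pr < 160, L/D > 10).

    re: Reynolds number, pr: Prandtl number,
    k: fluid thermal conductivity [W/(m K)], d_h: hydraulic diameter [m].
    """
    n = 0.4 if heating else 0.3
    nu = 0.023 * re ** 0.8 * pr ** n
    return nu * k / d_h
```

Modifying a system code's heat transfer model, as done here for GAMMA, typically means replacing the coefficient or exponents of such a correlation, or switching to a different correlation, and re-comparing against CFD or experimental data.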

  3. PERMUTATION-BASED POLYMORPHIC STEGO-WATERMARKS FOR PROGRAM CODES

    Directory of Open Access Journals (Sweden)

    Denys Samoilenko

    2016-06-01

    Full Text Available Purpose: One of the most topical trends in program code protection is code marking. The problem consists in creating digital “watermarks” which allow distinguishing different copies of the same program code. Such marks could be useful for authorship protection, for numbering code copies, for program propagation monitoring, and for information security purposes in client-server communication processes. Methods: We used methods of digital steganography adapted for program codes as text objects. The same-shape-symbols method was transformed into a same-semantic-element method owing to the features of codes which make them different from ordinary texts. We use a dynamic principle of mark forming, making the codes polymorphic. Results: We examined the combinatorial capacity of the permutations possible in program codes. It was shown that a set of 5-7 polymorphic variables is suitable for most modern network applications. Mark creation and restoration algorithms were proposed and discussed. The main algorithm is based on full and partial permutations of variable names and their declaration order. The algorithm for partial permutation enumeration was optimized for computational complexity. PHP code fragments which realize the algorithms were listed. Discussion: The method proposed in this work allows distinguishing each client-server connection. If a clone of some network resource is found, the method can give information about the included marks and thereby data on the IP, date and time, and authentication information of the client that copied the resource. Usage of polymorphic stego-watermarks should improve information security indexes in network communications.
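The combinatorial capacity claim can be made concrete with a small sketch (the paper's own fragments are in PHP; the helper names here are invented). An integer identifier is embedded as the declaration order of a set of variable names via the factorial number system (Lehmer code), and recovered from the observed order. With 5-7 variables this gives 5! = 120 to 7! = 5040 distinguishable marks, matching the abstract's claim.

```python
import math

def index_to_permutation(items, index):
    """Encode an integer (e.g. a client/session ID) as an ordering of
    variable names -- the ordering itself is the watermark."""
    items = list(items)
    perm = []
    index %= math.factorial(len(items))
    while items:
        f = math.factorial(len(items) - 1)
        i, index = divmod(index, f)
        perm.append(items.pop(i))
    return perm

def permutation_to_index(original, perm):
    """Recover the embedded integer from the observed declaration order."""
    items = list(original)
    index = 0
    for p in perm:
        i = items.index(p)
        index += i * math.factorial(len(items) - 1)
        items.pop(i)
    return index
```

Because reordering declarations does not change program semantics, every copy stays functionally identical while carrying a distinct, machine-recoverable mark.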

  4. Development of a coupled code system based on system transient code, RETRAN, and 3-D neutronics code, MASTER

    International Nuclear Information System (INIS)

    Kim, K. D.; Jung, J. J.; Lee, S. W.; Cho, B. O.; Ji, S. K.; Kim, Y. H.; Seong, C. K.

    2002-01-01

    A coupled code system, RETRAN/MASTER, has been developed for best-estimate simulations of interactions between reactor core neutron kinetics and plant thermal-hydraulics by incorporating a 3-D reactor core kinetics analysis code, MASTER, into the system transient code RETRAN. The soundness of the consolidated code system is confirmed by simulating the MSLB benchmark problem developed by OECD/NEA to verify the performance of coupled kinetics and system transient codes.

  5. Design of Packet-Based Block Codes with Shift Operators

    Directory of Open Access Journals (Sweden)

    Jacek Ilow

    2010-01-01

    Full Text Available This paper introduces packet-oriented block codes for the recovery of lost packets and the correction of an erroneous single packet. Specifically, a family of systematic codes is proposed, based on a Vandermonde matrix applied to a group of k information packets to construct r redundant packets, where the elements of the Vandermonde matrix are bit-level right arithmetic shift operators. The code design is applicable to packets of any size, provided that the packets within a block of k information packets are of uniform length. In order to decrease the overhead associated with packet padding using shift operators, non-Vandermonde matrices are also proposed for designing packet-oriented block codes. An efficient matrix inversion procedure for the off-line design of the decoding algorithm is presented to recover lost packets. The error correction capability of the design is investigated as well. The decoding algorithm, based on syndrome decoding, to correct a single erroneous packet in a group of n=k+r received packets is presented. The paper is equipped with examples of codes using different parameters. The code designs and their performance are tested using Monte Carlo simulations; the results obtained exhibit good agreement with the corresponding theoretical results.
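A greatly simplified instance of packet-level block coding conveys the basic idea: with r = 1, a single XOR parity packet lets the receiver rebuild any one lost packet of the block. The paper's Vandermonde/shift-operator construction generalizes this to r > 1 redundant packets and to correction of an erroneous packet, which this sketch does not attempt.

```python
def make_parity(packets):
    """One XOR parity packet protecting k equal-length information packets."""
    parity = bytearray(len(packets[0]))
    for pkt in packets:
        for i, b in enumerate(pkt):
            parity[i] ^= b
    return bytes(parity)

def recover(received, lost_index, parity):
    """Rebuild the single lost packet (marked None) from the survivors
    and the parity packet: XOR-ing everything else cancels it out."""
    rebuilt = bytearray(parity)
    for j, pkt in enumerate(received):
        if j == lost_index or pkt is None:
            continue
        for i, b in enumerate(pkt):
            rebuilt[i] ^= b
    return bytes(rebuilt)
```

The uniform-length requirement quoted in the abstract appears here too: the XOR is defined byte-wise across packets, which is also why padding overhead becomes a design concern for the shift-operator codes.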

  6. Design of Packet-Based Block Codes with Shift Operators

    Directory of Open Access Journals (Sweden)

    Ilow Jacek

    2010-01-01

    Full Text Available This paper introduces packet-oriented block codes for the recovery of lost packets and the correction of an erroneous single packet. Specifically, a family of systematic codes is proposed, based on a Vandermonde matrix applied to a group of k information packets to construct r redundant packets, where the elements of the Vandermonde matrix are bit-level right arithmetic shift operators. The code design is applicable to packets of any size, provided that the packets within a block of k information packets are of uniform length. In order to decrease the overhead associated with packet padding using shift operators, non-Vandermonde matrices are also proposed for designing packet-oriented block codes. An efficient matrix inversion procedure for the off-line design of the decoding algorithm is presented to recover lost packets. The error correction capability of the design is investigated as well. The decoding algorithm, based on syndrome decoding, to correct a single erroneous packet in a group of n=k+r received packets is presented. The paper is equipped with examples of codes using different parameters. The code designs and their performance are tested using Monte Carlo simulations; the results obtained exhibit good agreement with the corresponding theoretical results.

  7. Contribution to the validation of the Apollo code library for thermal neutron reactors

    International Nuclear Information System (INIS)

    Tellier, H.; Van der Gucht, C.; Vanuxeem, J.

    1988-03-01

    The neutron nuclear data needed by reactor physicists to perform core calculations are brought together in evaluated files. The files are processed to provide multigroup cross sections. The accuracy of core calculations depends on the initial data, which are sometimes not accurate enough. Reactor physicists therefore carry out integral experiments. We show in this paper how the use of these integral experiments and the application of the tendency research method can improve the accuracy of the neutron data. This technique was applied to the validation of the Apollo code library. For this purpose, 60 buckling measurements (34 for uranium-fuelled multiplying media and 26 for plutonium-fuelled multiplying media) and 42 spent fuel analyses were used. Small modifications of the initial data are proposed. The final values are compared with recently recommended values of microscopic data, and the agreement is good.

  8. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    Science.gov (United States)

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Litsyn algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes derived from multiple component codes. We then show that several recently proposed classes of LDPC codes, such as convolutional and spatially-coupled codes, can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.

  9. Validation of Code ASTEC with LIVE-L1 Experimental Results

    International Nuclear Information System (INIS)

    Bachrata, Andrea

    2008-01-01

    Severe accidents with core melting are considered at the design stage of Generation 3+ nuclear power plant (NPP) projects. Moreover, there is an effort to apply severe accident management to operating NPPs. One of the main goals of severe accident mitigation is corium localization and stabilization. The two strategies that fulfil this requirement are in-vessel retention (e.g. AP-600, AP-1000) and ex-vessel retention (e.g. EPR). To study the scenario of in-vessel retention, a large experimental program and integrated codes have been developed. The LIVE-L1 experimental facility studied the formation of melt pools and the melt accumulation in the lower head under different cooling conditions. A new European computer code, ASTEC, is currently being developed jointly in France and Germany. One of the important steps in ASTEC development in the area of in-vessel retention of corium is its validation against LIVE-L1 experimental results. Details of the experiment are reported. Results of the application of ASTEC (module DIVA) to the analysis of the test are presented. (author)

  10. The reactor kinetics code tank: a validation against selected SPERT-1b experiments

    International Nuclear Information System (INIS)

    Ellis, R.J.

    1990-01-01

    The two-dimensional space-time analysis code TANK is being developed for the simulation of transient behaviour in the MAPLE class of research reactors. MAPLE research reactor cores are compact, light-water-cooled and -moderated, with a high degree of forced subcooling. The SPERT-1B(24/32) reactor core had many similarities to MAPLE-X10, and the results of the SPERT transient experiments are well documented. As a validation of TANK, a series of simulations of selected SPERT reactor transients was undertaken. Special features were added to the TANK code to model reactors with plate-type fuel and to allow for the simulation of rapid void production. The results of a series of super-prompt-critical reactivity step-insertion transient simulations are presented. The selected SPERT transients were all initiated from low power, at ambient temperatures, and with negligible coolant flow. The results of the TANK simulations are in good agreement with the trends in the experimental SPERT data.

  11. Decay heat measurement on fusion reactor materials and validation of calculation code system

    Energy Technology Data Exchange (ETDEWEB)

    Maekawa, Fujio; Ikeda, Yujiro; Wada, Masayuki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-03-01

    Decay heat rates for 32 fusion reactor relevant materials irradiated with 14-MeV neutrons were measured for cooling times between 1 minute and 400 days. Using this experimental database, the validity of decay heat calculation systems for fusion reactors was investigated. (author)

  12. Edge-preserving Intra Depth Coding based on Context-coding and H.264/AVC

    DEFF Research Database (Denmark)

    Zamarin, Marco; Salmistraro, Matteo; Forchhammer, Søren

    2013-01-01

    Depth map coding plays a crucial role in 3D video communication systems based on the “Multi-view Video plus Depth” representation, as view synthesis performance is strongly affected by the accuracy of depth information, especially at edges in the depth map image. In this paper an efficient algorithm for edge-preserving intra depth compression based on H.264/AVC is presented. The proposed method introduces a new Intra mode specifically targeted at depth macroblocks with arbitrarily shaped edges, which are typically not efficiently represented by the DCT. Edge macroblocks are partitioned into two regions, each approximated by a flat surface. Edge information is encoded by means of context-coding with an adaptive template. As a novel element, the proposed method exploits the edge structure of previously encoded edge macroblocks during the context-coding step to further increase compression.

  13. Test and validation of the iterative code for neutron spectrometry and dosimetry: NSDUAZ

    Energy Technology Data Exchange (ETDEWEB)

    Reyes H, A.; Ortiz R, J. M.; Reyes A, A.; Castaneda M, R.; Solis S, L. O.; Vega C, H. R., E-mail: alfredo_reyesh@hotmail.com [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Av. Lopez Velarde 801, Col. Centro, 98000 Zacatecas (Mexico)

    2014-08-15

    In this work, the test and validation of an iterative code for neutron spectrometry, known as Neutron Spectrometry and Dosimetry of the Universidad Autonoma de Zacatecas (NSDUAZ), was carried out. This code was designed with a friendly and intuitive graphical user interface in the LabVIEW programming environment, using the iterative algorithm known as SPUNIT. The main characteristics of the program are: the automatic selection of the initial spectrum from the catalog of neutron spectra compiled by the International Atomic Energy Agency, and the possibility to generate a report in HTML format that shows, graphically and numerically, the neutron fluence and the ambient dose equivalent calculated from it. To test the designed code, the count rates of a Bonner sphere spectrometer system with a {sup 6}LiI(Eu) detector and 7 polyethylene spheres with diameters of 0, 2, 3, 5, 8, 10 and 12 in were used. The count rates measured with two neutron sources, {sup 252}Cf and {sup 239}PuBe, were used to validate the code, and the obtained results were compared against those obtained using the BUNKIUT code. We find that the reconstructed spectra present errors within the limit reported in the literature, which oscillates around 15%. It was therefore concluded that the designed code produces results similar to those of the techniques currently in use. (Author)

  14. Entropy Evaluation Based on Value Validity

    Directory of Open Access Journals (Sweden)

    Tarald O. Kvålseth

    2014-09-01

    Full Text Available Besides its importance in statistical physics and information theory, the Boltzmann-Shannon entropy S has become one of the most widely used and misused summary measures of various attributes (characteristics) in diverse fields of study. It has also been the subject of extensive and perhaps excessive generalizations. This paper introduces the concept of and criteria for value validity as a means of determining whether an entropy takes on values that reasonably reflect the attribute being measured and that permit different types of comparisons to be made for different probability distributions. While neither S nor its relative entropy equivalent S* meets the value-validity conditions, certain power functions of S and S* do to a considerable extent. No parametric generalization offers any advantage over S in this regard. A measure based on Euclidean distances between probability distributions is introduced as a potential entropy that complies fully with the value-validity requirements, and its statistical inference procedure is discussed.
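For concreteness, here is a sketch of S, its normalized relative equivalent S*, and one possible Euclidean-distance-based uniformity measure. The last function only illustrates the kind of distance-based measure the abstract mentions; it is not the paper's exact definition.

```python
import math

def shannon_entropy(p):
    """Boltzmann-Shannon entropy S (natural log) of a distribution p."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def relative_entropy(p):
    """S* = S / log(n): S normalized to the interval [0, 1]."""
    return shannon_entropy(p) / math.log(len(p))

def euclidean_uniformity(p):
    """Illustrative distance-based measure: 1 minus the Euclidean distance
    of p from the uniform distribution, scaled so that the uniform
    distribution scores 1 and a degenerate (one-point) distribution 0."""
    n = len(p)
    d = math.sqrt(sum((pi - 1.0 / n) ** 2 for pi in p))
    # Maximum distance from uniform is attained at (1, 0, ..., 0).
    d_max = math.sqrt((1.0 - 1.0 / n) ** 2 + (n - 1) * (1.0 / n) ** 2)
    return 1.0 - d / d_max
```

All three measures agree at the extremes (uniform and degenerate distributions); the value-validity question concerns how they rank and space the distributions in between.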

  15. Validation matrix for the assessment of thermal-hydraulic codes for VVER LOCA and transients. A report by the OECD support group on the VVER thermal-hydraulic code validation matrix

    International Nuclear Information System (INIS)

    2001-06-01

    This report deals with an internationally agreed experimental test facility matrix for the validation of best-estimate thermal-hydraulic computer codes applied to the analysis of VVER reactor primary systems in accident and transient conditions. Firstly, the main physical phenomena that occur during the considered accidents are identified, test types are specified, and test facilities that supplement the CSNI CCVMs and are suitable for reproducing these aspects are selected. Secondly, a list of selected experiments carried out in these facilities has been set down. The criteria to achieve the objectives are outlined. The construction of the VVER Thermal-Hydraulic Code Validation Matrix follows the logic of the CSNI Code Validation Matrices (CCVM). Like the CCVM, it is an attempt to collect together in a systematic way the best sets of available test data for VVER-specific code validation, assessment and improvement, including quantitative assessment of uncertainties in the modelling of phenomena by the codes. In addition to this objective, it is an attempt to record information which has been generated in countries operating VVER reactors over the last 20 years, so that it is more accessible to present and future workers in that field than would otherwise be the case. (authors)

  16. Optical information encryption based on incoherent superposition with the help of the QR code

    Science.gov (United States)

    Qin, Yi; Gong, Qiong

    2014-01-01

    In this paper, a novel optical information encryption approach is proposed with the help of the QR code. The method is based on the concept of incoherent superposition, which we introduce for the first time. The information to be encrypted is first transformed into the corresponding QR code, and thereafter the QR code is further encrypted analytically into two phase-only masks by use of the intensity superposition of two diffraction wave fields. The proposed method has several advantages over previous interference-based methods, such as a higher security level, better robustness against noise attack, and a more relaxed working condition. Numerical simulation results and results actually captured with a smartphone are shown to validate our proposal.

  17. A neutron spectrum unfolding code based on iterative procedures

    International Nuclear Information System (INIS)

    Ortiz R, J. M.; Vega C, H. R.

    2012-10-01

    In this work, version 3.0 of the neutron spectrum unfolding code called Neutron Spectrometry and Dosimetry from Universidad Autonoma de Zacatecas (NSDUAZ) is presented. This code was designed with a graphical interface in the LabVIEW programming environment and is based on the iterative SPUNIT algorithm, using as input data only the count rates obtained with 7 Bonner spheres based on a 6LiI(Eu) neutron detector. The main features of the code are: it is intuitive and user-friendly, and it has a programming routine which automatically selects the initial guess spectrum from a set of neutron spectra compiled by the International Atomic Energy Agency. Besides the neutron spectrum, this code calculates the total flux, the mean energy, H(10), h(10), 15 dosimetric quantities for radiation protection purposes and 7 survey meter responses, in four energy grids, based on the International Atomic Energy Agency compilation. The code generates a full report in HTML format with all relevant information. In this work, the neutron spectrum of a 241AmBe neutron source in air, located 150 cm from the detector, is unfolded. (Author)
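The core of such unfolding codes is a multiplicative iterative update. The sketch below shows a generic update in the spirit of SPUNIT (the exact SPUNIT weighting may differ): each spectrum bin is rescaled by a response-weighted ratio of measured to computed counts, which keeps the spectrum non-negative throughout.

```python
def unfold(R, C, n_iter=5000):
    """Generic multiplicative iterative spectrum unfolding.
    R[i][j]: response of sphere i to energy bin j; C[i]: measured counts.
    Returns a non-negative spectrum phi with R phi ~= C."""
    n_det, n_bin = len(R), len(R[0])
    phi = [1.0] * n_bin                      # flat initial guess
    for _ in range(n_iter):
        # Counts the current spectrum estimate would produce.
        calc = [sum(R[i][j] * phi[j] for j in range(n_bin))
                for i in range(n_det)]
        for j in range(n_bin):
            num = sum(R[i][j] * C[i] / calc[i]
                      for i in range(n_det) if calc[i] > 0)
            den = sum(R[i][j] for i in range(n_det))
            if den > 0:
                phi[j] *= num / den          # response-weighted correction
    return phi
```

In a real code the flat initial guess is replaced by a catalog spectrum (as NSDUAZ does with the IAEA compilation), which matters because the problem is under-determined: few spheres, many energy bins.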

  18. Computer code validation study of PWR core design system, CASMO-3/MASTER-α

    International Nuclear Information System (INIS)

    Lee, K. H.; Kim, M. H.; Woo, S. W.

    1999-01-01

    In this paper, the feasibility of the CASMO-3/MASTER-α nuclear design system for commercial PWR cores was investigated. Validation calculations were performed as follows. Firstly, the accuracy of cross section generation from the table set using the linear feedback model was estimated. Secondly, the results of CASMO-3/MASTER-α were compared with those of CASMO-3/NESTLE 5.02 for a few benchmark problems. Microscopic cross sections computed from the table set were almost the same as those from CASMO-3, and there were only small differences between the calculated results of the two code systems. Thirdly, the CASMO-3/MASTER-α calculation for the Younggwang Unit 3, Cycle 1 core was repeated, and the results were compared with the nuclear design report (NDR) and with uncertainty analysis results from KAERI. The uncertainty analysis results were found to be reliable because the results agreed with each other. It was concluded that the use of the CASMO-3/MASTER-α nuclear design system is validated for commercial PWR cores.

  19. International integral experiments databases in support of nuclear data and code validation

    International Nuclear Information System (INIS)

    Briggs, J. Blair; Gado, Janos; Hunter, Hamilton; Kodeli, Ivan; Salvatores, Massimo; Sartori, Enrico

    2002-01-01

    The OECD/NEA Nuclear Science Committee (NSC) has identified the need to establish international databases containing all the important experiments that are available for sharing among the specialists. The NSC has set up or sponsored specific activities to achieve this. The aim is to preserve them in an agreed standard format in computer accessible form, to use them for international activities involving validation of current and new calculational schemes including computer codes and nuclear data libraries, for assessing uncertainties, confidence bounds and safety margins, and to record measurement methods and techniques. The databases so far established or in preparation related to nuclear data validation cover the following areas: SINBAD - A Radiation Shielding Experiments database encompassing reactor shielding, fusion blanket neutronics, and accelerator shielding. ICSBEP - International Criticality Safety Benchmark Experiments Project Handbook, with more than 2500 critical configurations with different combination of materials and spectral indices. IRPhEP - International Reactor Physics Experimental Benchmarks Evaluation Project. The different projects are described in the following including results achieved, work in progress and planned. (author)

  20. Lossless Image Compression Based on Multiple-Tables Arithmetic Coding

    Directory of Open Access Journals (Sweden)

    Rung-Ching Chen

    2009-01-01

    Full Text Available This paper presents a lossless image compression method based on multiple-tables arithmetic coding (MTAC) to encode a gray-level image f. First, because the gray levels of two adjacent pixels in an image are usually similar, the MTAC method employs a median edge detector (MED) to reduce the entropy rate of f. A base-switching transformation approach is then used to reduce the spatial redundancy of the image. Finally, since the gray levels of some pixels in an image are more common than those of others, the arithmetic encoding method is applied to reduce the coding redundancy of the image. To promote high performance of the arithmetic encoding method, the MTAC method first classifies the data and then encodes each cluster of data using a distinct code table. The experimental results show that, in most cases, the MTAC method uses storage space more efficiently than lossless JPEG2000 does.
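    The MED predictor named in the first stage is the standard median edge detector used in JPEG-LS; the sketch below (an illustration, not the authors' implementation) shows how it exploits the similarity of adjacent pixels to shrink residuals on smooth images:

    ```python
    def med_predict(a, b, c):
        """Median edge detector (MED) predictor, as in JPEG-LS.

        a = left neighbor, b = above neighbor, c = upper-left neighbor.
        Picks min(a, b) near a vertical edge, max(a, b) near a horizontal
        edge, and the planar estimate a + b - c otherwise.
        """
        if c >= max(a, b):
            return min(a, b)
        if c <= min(a, b):
            return max(a, b)
        return a + b - c

    def med_residuals(img):
        """Replace each pixel by its MED prediction error (out-of-image
        neighbors are taken as 0); smooth regions yield near-zero residuals."""
        h, w = len(img), len(img[0])
        out = []
        for y in range(h):
            row = []
            for x in range(w):
                a = img[y][x - 1] if x > 0 else 0
                b = img[y - 1][x] if y > 0 else 0
                c = img[y - 1][x - 1] if x > 0 and y > 0 else 0
                row.append(img[y][x] - med_predict(a, b, c))
            out.append(row)
        return out
    ```

    On a constant image every residual after the first pixel is zero, which is exactly the entropy-rate reduction the abstract refers to.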

  1. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation is derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling gives a syntactic structure to source and target models; however, semantic requirements also have to be imposed on them. A given transformation is sound when the source and target models fulfill both the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. The properties to be validated range from structural and semantic requirements of the models (pre- and post-conditions) to properties of the transformation itself (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  2. Trellis-coded CPM for satellite-based mobile communications

    Science.gov (United States)

    Abrishamkar, Farrokh; Biglieri, Ezio

    1988-01-01

    Digital transmission for satellite-based land mobile communications is discussed. To satisfy the power and bandwidth limitations imposed on such systems, a combination of trellis coding and continuous-phase modulated (CPM) signals is considered. Some schemes based on this idea are presented, and their performance is analyzed by computer simulation. The results obtained show that a scheme based on directional detection and Viterbi decoding appears promising for practical applications.
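    The Viterbi machinery behind such trellis-coded schemes can be illustrated, under heavy simplification, with a plain rate-1/2 binary convolutional code (generators 7, 5 octal) in place of CPM waveforms — a stand-in example, not the paper's modulation:

    ```python
    def conv_encode(bits, g=(0b111, 0b101)):
        """Rate-1/2 convolutional encoder, constraint length 3 (generators 7, 5)."""
        state = 0
        out = []
        for b in bits:
            reg = (b << 2) | state                       # [newest, s1, s0]
            out.extend(bin(reg & gi).count("1") & 1 for gi in g)
            state = reg >> 1                             # new state = (b, s1)
        return out

    def viterbi_decode(coded, g=(0b111, 0b101)):
        """Hard-decision Viterbi: keep the minimum-Hamming-distance survivor."""
        n_states = 4
        INF = float("inf")
        metric = [0] + [INF] * (n_states - 1)            # start in state 0
        paths = [[] for _ in range(n_states)]
        for i in range(0, len(coded), 2):
            rx = coded[i:i + 2]
            new_metric = [INF] * n_states
            new_paths = [None] * n_states
            for s in range(n_states):
                if metric[s] == INF:
                    continue
                for b in (0, 1):
                    reg = (b << 2) | s
                    tx = [bin(reg & gi).count("1") & 1 for gi in g]
                    ns = reg >> 1
                    m = metric[s] + sum(a != c for a, c in zip(rx, tx))
                    if m < new_metric[ns]:
                        new_metric[ns] = m
                        new_paths[ns] = paths[s] + [b]
            metric, paths = new_metric, new_paths
        return paths[metric.index(min(metric))]
    ```

    With a noiseless channel the decoder recovers the input exactly, since only the transmitted path accumulates zero Hamming distance.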

  3. Tuning iteration space slicing based tiled multi-core code implementing Nussinov's RNA folding.

    Science.gov (United States)

    Palkowski, Marek; Bielecki, Wlodzimierz

    2018-01-15

    RNA folding is an ongoing compute-intensive task of bioinformatics. Parallelization and improved code locality for this kind of algorithm are among the most relevant goals in computational biology. Fortunately, RNA secondary structure approaches such as Nussinov's recurrence involve mathematical operations over affine control loops whose iteration space can be represented by the polyhedral model. This allows us to apply powerful polyhedral compilation techniques, based on the transitive closure of dependence graphs, to generate parallel tiled code implementing Nussinov's RNA folding. These techniques fall within the iteration space slicing framework: the transitive dependences are applied to the statement instances of interest to produce valid tiles. The main problem in generating parallel tiled code is choosing a proper tile size and tile dimension, which impact the degree of parallelism and code locality. To choose the best tile size and tile dimension, we first construct parallel parametric tiled code (the parameters are variables defining tile size). For this purpose, we first generate two non-parametric tiled codes with different fixed tile sizes but the same code structure, and then derive a general affine model describing all integer factors available in the expressions of those codes. Using this model and the known integer factors present in the mentioned expressions (they define the left-hand side of the model), we find the unknown integers in this model for each integer factor appearing at the same position in the fixed tiled code, and replace the expressions including integer factors with expressions including parameters. We then use this parallel parametric tiled code to implement the well-known tile size selection (TSS) technique, which allows us to discover, within a given search space, the best tile size and tile dimension maximizing target code performance. For a given search space, the presented approach allows us to choose the best tile size and tile dimension in
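    Nussinov's recurrence itself is compact; a plain, untiled reference version is shown below to make the affine loop nest that the polyhedral tiling targets concrete (the O(n³) bifurcation loop is what the transitive-closure-based tiling reorganizes):

    ```python
    def nussinov_pairs(seq):
        """Nussinov maximum base-pairing DP: N[i][j] = max pairs in seq[i..j].

        O(n^3) time, O(n^2) space; the triply nested affine loops below are
        the iteration space that polyhedral tiling operates on.
        """
        pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
                 ("G", "U"), ("U", "G")}
        n = len(seq)
        N = [[0] * n for _ in range(n)]
        for span in range(1, n):
            for i in range(n - span):
                j = i + span
                best = max(N[i + 1][j], N[i][j - 1])          # i or j unpaired
                if (seq[i], seq[j]) in pairs:                 # i pairs with j
                    inner = N[i + 1][j - 1] if i + 1 <= j - 1 else 0
                    best = max(best, inner + 1)
                for k in range(i + 1, j):                     # bifurcation
                    best = max(best, N[i][k] + N[k + 1][j])
                N[i][j] = best
        return N[0][n - 1] if n else 0
    ```

    For example, `nussinov_pairs("GCAU")` pairs G–C and A–U, giving 2.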

  4. RELAP5-3D code validation of RBMK-1500 reactor reactivity measurement transients

    International Nuclear Information System (INIS)

    Kaliatka, Algirdas; Bubelis, Evaldas; Uspuras, Eugenijus

    2003-01-01

    This paper deals with the modeling of transients taking place during the measurements of the void and fast power reactivity coefficients performed at Ignalina NPP. The simulation of these transients was performed using a RELAP5-3D model of the RBMK-1500 reactor. At Ignalina NPP, the void and fast power reactivity coefficients are measured on a regular basis; the actual values of these coefficients are determined from the total reactor power, the reactivity, the control and protection system rod positions, and the main circulation circuit (MCC) parameter changes during the experiments. Following the simulation of the two above-mentioned transients with the RELAP5-3D code, it was concluded that the obtained calculation results demonstrate reasonable agreement with the Ignalina NPP measured data. The behavior of the individual MCC thermal-hydraulic parameters, as well as the physical processes occurring in the primary circuit of the RBMK-1500 reactor, is predicted reasonably well. The calculated reactivity and total reactor core power behavior in time are also in reasonable agreement with the measured plant data. Despite small differences, the RELAP5-3D code predicts the reactivity and total reactor core power behavior during the transients in a reasonable manner. The reasonable agreement between the measured and calculated total reactor power change in time demonstrates correct modeling of the neutronic processes taking place in the RBMK-1500 reactor core.

  5. TASS code topical report. V.2 TASS code validation report for the non-LOCA transient analysis of the CE and Westinghouse type plants

    International Nuclear Information System (INIS)

    Sim, Suk K.; Chang, W. P.; Kim, K. D.; Lee, S. J.; Kim, H. C.; Yoon, H. Y.

    1997-02-01

    The development of the TASS 1.0 code has been completed, and its applicability to the licensing transient analyses of CE- and Westinghouse-type operating reactors, as well as the PWR plants under construction in Korea, has been validated. The validation of the TASS 1.0 code has been achieved through comparison calculations against the FSAR transients, loss-of-AC-power transient plant data, load rejection and startup test data for the reference plants, as well as the BETHSY loop steam generator tube rupture test data. The TASS 1.0 calculations agree well with the FSAR transients and demonstrate the code's capability in simulating plant transient analyses. (author). 12 refs., 32 tabs., 132 figs

  6. Image coding based on maximum entropy partitioning for identifying ...

    Indian Academy of Sciences (India)

    A new coding scheme based on maximum entropy partitioning is proposed in our work, particularly to identify the improbable intensities related to different emotions. The improbable intensities, when used as a mask, decode the facial expression correctly, providing an effective platform for future emotion categorization ...
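    The abstract does not spell out the partitioning rule; a common starting point for entropy-based intensity partitioning is Kapur-style maximum-entropy thresholding of the gray-level histogram, sketched here purely as an illustration of the idea (not the authors' scheme):

    ```python
    import math

    def max_entropy_threshold(hist):
        """Kapur-style maximum-entropy split of a gray-level histogram.

        Returns the threshold t maximizing H(hist[:t]) + H(hist[t:]), i.e.
        the split that makes the two intensity classes jointly most
        informative -- the usual entropy-based partitioning criterion.
        """
        total = sum(hist)
        p = [h / total for h in hist]

        def entropy(lo, hi):
            mass = sum(p[lo:hi])
            if mass <= 0:
                return 0.0
            return -sum(q / mass * math.log(q / mass)
                        for q in p[lo:hi] if q > 0)

        best_t, best_h = 0, float("-inf")
        for t in range(1, len(hist)):
            h = entropy(0, t) + entropy(t, len(hist))
            if h > best_h:
                best_t, best_h = t, h
        return best_t
    ```

    On a bimodal histogram the maximizing threshold falls in the empty valley between the two modes, separating the "probable" from the "improbable" intensities.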

  7. Code it rite the first time : automated invoice processing solution designed to ensure validity to field ticket coding

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, G.

    2010-03-15

    An entrepreneur who ran 55 rigs for a major oilfield operator in Calgary has developed a solution for the oil industry that reduces field ticketing errors from 40 per cent to almost none. The Code-Rite not only simplifies field ticketing but can also eliminate weeks of trying to balance authorization for expenditure (AFE) numbers. A service provider who wants a field ticket signed for billing purposes following a service call to a well site receives all pertinent information on a barcode that includes the AFE number, location, routing, approval authority and mailing address. Attaching the label to the field ticket provides all the invoicing information needed. This article describes the job profile, education, life experiences and opportunities that led the innovator to develop this technology, which solves an industry-wide problem. Code-Rite is currently being used by 3 large upstream oil and gas operators, and plans are underway to automate the entire invoice processing system. 1 fig.

  8. Burn-up function of fuel management code for aqueous homogeneous reactors and its validation

    International Nuclear Information System (INIS)

    Wang Liangzi; Yao Dong; Wang Kan

    2011-01-01

    The Fuel Management Code for Aqueous Homogeneous Reactors (FMCAHR) is developed based on the Monte Carlo transport method to analyze the physics characteristics of aqueous homogeneous reactors. FMCAHR is capable of resonance treatment, searching for critical rod heights, thermal-hydraulic parameter calculation, radiolytic-gas bubble calculation and burn-up calculation. This paper introduces the theoretical model and scheme of its burn-up function, and then compares its calculation results with benchmarks and with DRAGON's burn-up results, which confirms its burn-up computing precision and its applicability to burn-up calculation and analysis for aqueous solution reactors. (authors)

  9. FARO base case post-test analysis by COMETA code

    Energy Technology Data Exchange (ETDEWEB)

    Annunziato, A.; Addabbo, C. [Joint Research Centre, Ispra (Italy)

    1995-09-01

    The paper analyzes the COMETA (Core Melt Thermal-Hydraulic Analysis) post-test calculations of FARO Test L-11, the so-called Base Case Test. The FARO facility, located at JRC Ispra, is used to simulate the consequences of severe accidents in nuclear power plants under a variety of conditions. The COMETA code has a six-equation two-phase flow field and a three-phase corium field: the jet, the droplets and the fused-debris bed. The analysis showed that the code is able to pick up all the major phenomena occurring during the pre-mixing phase of the fuel-coolant interaction.

  10. File compression and encryption based on LLS and arithmetic coding

    Science.gov (United States)

    Yu, Changzhi; Li, Hengjian; Wang, Xiyu

    2018-03-01

    We propose a file compression model based on arithmetic coding. First, the original symbols to be encoded are input to the encoder one by one, and a set of chaotic sequences is produced using the logistic and sine chaos system (LLS); the values of these chaotic sequences randomly modify the upper and lower limits of the current symbol's probability interval. To achieve encryption, we modify the upper and lower limits of all character probabilities when encoding each symbol. Experimental results show that the proposed model achieves data encryption while attaining almost the same compression efficiency as plain arithmetic coding.
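    The paper's exact LLS map is not reproduced in the abstract; a hypothetical logistic-sine keystream of the general kind described, where each chaotic value is quantized to a byte that could perturb a symbol's interval bounds, might look like the following (the map composition, parameter `r`, and byte quantization are all assumptions for illustration):

    ```python
    import math

    def lls_keystream(x0, n, r=3.99):
        """Hypothetical logistic-sine (LLS) keystream generator.

        Iterates a logistic step followed by a sine step (both chaotic on
        [0, 1]) and quantizes each value to a byte; the same seed always
        reproduces the same stream, which is what lets the decoder undo
        the interval perturbations.
        """
        x = x0
        out = []
        for _ in range(n):
            x = r * x * (1 - x)              # logistic step
            x = abs(math.sin(math.pi * x))   # sine step, stays in [0, 1]
            out.append(int(x * 256) % 256)
        return out
    ```

    Because the map is chaotic, nearby seeds diverge quickly, so the keystream is highly sensitive to the key, which is the property the encryption relies on.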

  11. Preliminary design of a small air loop for system analysis and validation of Cathare code

    International Nuclear Information System (INIS)

    Marchand, M.; Saez, M.; Tauveron, N.; Tenchine, D.; Germain, T.; Geffraye, G.; Ruby, G.P.

    2007-01-01

    The French Atomic Energy Commission (Cea) is carrying out the design of a Small Air Loop for System Analysis (SALSA), devoted to the study of gas-cooled nuclear reactor behaviour in normal and incidental/accidental operating conditions. The reduced size of the SALSA components compared to a full-scale reactor, and the use of air as the gaseous coolant instead of helium, will allow easy management of the loop. The main purpose of SALSA will be the validation of the associated thermal-hydraulic safety simulation codes, such as CATHARE. The main goal of this paper is to present the methodology used to define the characteristics of the loop. In a first step, the study focused on a direct-cycle system for the SALSA loop with few global constraints, using a similarity analysis to support the definition and design of the loop. Similarity requirements were evaluated to determine the scale factors to be applied to the SALSA loop components. The preliminary conceptual design of the SALSA plant, with a definition of each component, was then carried out. The whole plant was modelled using the CATHARE code. Calculations of the SALSA steady state in nominal conditions and of different plant transients in direct cycle were made. The first system results obtained on the global behaviour of the loop confirm that SALSA can be representative of a gas-cooled nuclear reactor with some minor design modifications. In a second step, the current work focuses on the SALSA loop's capability to correctly reproduce the heat transfer occurring in specific incidental situations. Decay heat removal by natural convection is a crucial point of interest. The first results show that the behaviour and efficiency of the loop are strongly influenced by the definition of the main parameters of each component. A complete definition of SALSA is in progress. (authors)

  12. Facial expression coding in children and adolescents with autism: Reduced adaptability but intact norm-based coding.

    Science.gov (United States)

    Rhodes, Gillian; Burton, Nichola; Jeffery, Linda; Read, Ainsley; Taylor, Libby; Ewing, Louise

    2018-05-01

    Individuals with autism spectrum disorder (ASD) can have difficulty recognizing emotional expressions. Here, we asked whether the underlying perceptual coding of expression is disrupted. Typical individuals code expression relative to a perceptual (average) norm that is continuously updated by experience. This adaptability of face-coding mechanisms has been linked to performance on various face tasks. We used an adaptation aftereffect paradigm to characterize expression coding in children and adolescents with autism. We asked whether face expression coding is less adaptable in autism and whether there is any fundamental disruption of norm-based coding. If expression coding is norm-based, then the face aftereffects should increase with adaptor expression strength (distance from the average expression). We observed this pattern in both autistic and typically developing participants, suggesting that norm-based coding is fundamentally intact in autism. Critically, however, expression aftereffects were reduced in the autism group, indicating that expression-coding mechanisms are less readily tuned by experience. Reduced adaptability has also been reported for coding of face identity and gaze direction. Thus, there appears to be a pervasive lack of adaptability in face-coding mechanisms in autism, which could contribute to face processing and broader social difficulties in the disorder. © 2017 The British Psychological Society.

  13. Hybrid Video Coding Based on Bidimensional Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Lorenzo Granai

    2004-12-01

    Full Text Available Hybrid video coding combines two stages: first, motion estimation and compensation predict each frame from the neighboring frames; then the prediction error is coded, reducing the correlation in the spatial domain. In this work, we focus on the latter stage, presenting a scheme that profits from some of the features introduced by the standard H.264/AVC for motion estimation and replaces the transform in the spatial domain. The prediction error is thus coded using the matching pursuit algorithm, which decomposes the signal over a specially designed bidimensional, anisotropic, redundant dictionary. Comparisons are made among the proposed technique, H.264, and a DCT-based coding scheme. Moreover, we introduce fast techniques for atom selection, which exploit the spatial localization of the atoms. An adaptive coding scheme aimed at optimizing the resource allocation is also presented, together with a rate-distortion study for the matching pursuit algorithm. Results show that the proposed scheme outperforms the standard DCT, especially at very low bit rates.
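    The core of the scheme, matching pursuit over a redundant dictionary, follows the classic greedy iteration; a minimal sketch (ignoring the anisotropic 2-D dictionary design and rate-distortion machinery, and assuming unit-norm atoms):

    ```python
    def matching_pursuit(signal, dictionary, n_atoms):
        """Greedy matching pursuit over a dictionary of unit-norm atoms.

        At each step, pick the atom with the largest (absolute) inner
        product against the residual, record (atom index, coefficient),
        and subtract the projection from the residual.
        """
        residual = list(signal)
        decomposition = []
        for _ in range(n_atoms):
            best_k, best_c = None, 0.0
            for k, atom in enumerate(dictionary):
                c = sum(r * a for r, a in zip(residual, atom))
                if abs(c) > abs(best_c):
                    best_k, best_c = k, c
            if best_k is None:          # residual orthogonal to all atoms
                break
            decomposition.append((best_k, best_c))
            residual = [r - best_c * a
                        for r, a in zip(residual, dictionary[best_k])]
        return decomposition, residual
    ```

    With an orthonormal dictionary this reduces to picking the largest transform coefficients; the paper's gain comes from using a redundant, signal-adapted dictionary instead.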

  14. Apar-T: code, validation, and physical interpretation of particle-in-cell results

    Science.gov (United States)

    Melzani, Mickaël; Winisdoerffer, Christophe; Walder, Rolf; Folini, Doris; Favre, Jean M.; Krastanov, Stefan; Messmer, Peter

    2013-10-01

    We present the parallel particle-in-cell (PIC) code Apar-T and, more importantly, address the fundamental question of the relations between the PIC model, the Vlasov-Maxwell theory, and real plasmas. First, we present four validation tests: spectra from simulations of thermal plasmas, linear growth rates of the relativistic tearing instability and of the filamentation instability, and nonlinear filamentation merging phase. For the filamentation instability we show that the effective growth rates measured on the total energy can differ by more than 50% from the linear cold predictions and from the fastest modes of the simulation. We link these discrepancies to the superparticle number per cell and to the level of field fluctuations. Second, we detail a new method for initial loading of Maxwell-Jüttner particle distributions with relativistic bulk velocity and relativistic temperature, and explain why the traditional method with individual particle boosting fails. The formulation of the relativistic Harris equilibrium is generalized to arbitrary temperature and mass ratios. Both are required for the tearing instability setup. Third, we turn to the key point of this paper and scrutinize the question of what description of (weakly coupled) physical plasmas is obtained by PIC models. These models rely on two building blocks: coarse-graining, i.e., grouping of the order of p ~ 1010 real particles into a single computer superparticle, and field storage on a grid with its subsequent finite superparticle size. We introduce the notion of coarse-graining dependent quantities, i.e., quantities depending on p. They derive from the PIC plasma parameter ΛPIC, which we show to behave as ΛPIC ∝ 1/p. We explore two important implications. One is that PIC collision- and fluctuation-induced thermalization times are expected to scale with the number of superparticles per grid cell, and thus to be a factor p ~ 1010 smaller than in real plasmas, a fact that we confirm with
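    The coarse-graining argument can be made concrete: grouping p real particles into one superparticle leaves the Debye length unchanged (charge density is preserved) but divides the simulated number density by p, so the PIC plasma parameter scales as Λ_PIC = Λ_real / p. A small numerical illustration (SI units, temperature in eV; a sketch of the scaling, not code from Apar-T):

    ```python
    import math

    def debye_particles(n_e, T_eV, p=1.0):
        """Number of simulated particles per Debye sphere (plasma parameter).

        n_e  : real electron number density [m^-3]
        T_eV : electron temperature [eV]
        p    : real particles per superparticle (coarse-graining factor)

        The Debye length depends only on the physical n_e and T, while the
        simulated density is n_e / p, hence Lambda_PIC = Lambda_real / p.
        """
        eps0 = 8.854e-12   # vacuum permittivity [F/m]
        e = 1.602e-19      # elementary charge [C]
        lambda_D = math.sqrt(eps0 * T_eV / (n_e * e))   # kT = e * T_eV
        return (4.0 / 3.0) * math.pi * (n_e / p) * lambda_D ** 3
    ```

    With p ~ 10^10, Λ_PIC is ten orders of magnitude below the real plasma parameter, which is why collision- and fluctuation-induced thermalization is correspondingly faster in the simulation, as the abstract notes.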

  15. A Mode Propagation Database Suitable for Code Validation Utilizing the NASA Glenn Advanced Noise Control Fan and Artificial Sources

    Science.gov (United States)

    Sutliff, Daniel L.

    2014-01-01

    The NASA Glenn Research Center's Advanced Noise Control Fan (ANCF) was developed in the early 1990s to provide a convenient test bed to measure and understand fan-generated acoustics, duct propagation, and radiation to the farfield. A series of tests was performed primarily for code validation and tool validation. Rotating Rake mode measurements were acquired for parametric sets of: (i) mode blockage, (ii) liner insertion loss, (iii) short ducts, and (iv) mode reflection.

  16. Development of a CAD-based neutron transport code with the method of characteristics

    International Nuclear Information System (INIS)

    Chen Zhenping; Wang Dianxi; He Tao; Wang Guozhong; Zheng Huaqing

    2012-01-01

    The main problem determining whether the method of characteristics (MOC) can be used in complicated and highly heterogeneous geometry is how to combine an effective geometry processing method with MOC. In this study, a new approach was brought forward to solve this geometry problem: MCAM, a Multi-Calculation Automatic Modeling for Neutronics and Radiation Transport program developed by the FDS Team, is used for geometry description and for ray tracing of particle transport. Based on this theory and approach, a two-dimensional neutron transport code was developed and integrated into VisualBUS, also developed by the FDS Team. Several benchmarks were used to verify the validity of the code, and the numerical results agreed very well with the reference values, which indicates the accuracy and feasibility of the method and of the MOC code. (authors)

  17. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Reznik, L.

    1994-01-01

    Various computer codes employed at the Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating-system compatibility problems, a number of significant modifications were due to model improvement and enhancements of algorithm efficiency and accuracy. With the growing acceptance of software quality assurance requirements and methods, a program of extensive testing of modified software has been adopted within the regular maintenance activities. In this work, a survey has been performed of software testing methods, which belong mainly to the two major categories of implementation-based ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing; in particular, the equivalence class partitioning method and the boundary value method have been selected as functional methods especially suitable for testing reactor analysis codes. A separate study of software analysis and specification methods has been performed with the objective of establishing appropriate pre-test software specification methods. Two methods have been selected as the most suitable for this purpose: data flow diagrams, shown to be particularly valuable for functional/procedural software specification, and entity-relationship diagrams, found to be efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed, in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs

  18. Validation of CONTAIN-LMR code for accident analysis of sodium-cooled fast reactor containments

    Energy Technology Data Exchange (ETDEWEB)

    Gordeev, S.; Hering, W.; Schikorr, M.; Stieglitz, R. [Inst. for Neutron Physic and Reactor Technology, Karlsruhe Inst. of Technology, Campus Nord (Germany)

    2012-07-01

    CONTAIN-LMR is an analytical tool for assessing the containment performance of sodium-cooled fast reactors. The code includes models for sodium fires: an oxygen diffusion model for sodium pool fires and a liquid droplet model for sodium spray fires. CONTAIN-LMR is also able to model the interaction of liquid sodium with concrete structures and is applicable to different concrete compositions. Testing and validation of these models help to qualify the simulation results. Three sodium experiments performed in the FAUNA facility at FZK have been used for the validation of CONTAIN-LMR. For the pool fire tests, calculations were performed with two models: the first consists of one gas cell representing the volume of the burn compartment, while in the second the volume is subdivided into 32 coupled gas cells. The agreement between calculations and experimental data is acceptable, and the detailed pool fire model shows less deviation from the experiments. In a spray fire, direct heating from the sodium burning in the atmosphere is dominant, so single-cell modeling is sufficient to describe the phenomena. The calculation results show reasonable agreement with the experimental data; limitations of the implemented spray model can cause an overestimation of the predicted pressure and temperature in the cell atmosphere. The ability of CONTAIN-LMR to simulate a sodium pool fire accompanied by sodium-concrete reactions was tested using experimental studies of sodium-concrete interactions for construction concrete as well as for shielding concrete. The model provides a reasonably good representation of the chemical processes during sodium-concrete interaction, but the comparison of the time-temperature profiles of sodium and concrete shows that the model requires modifications to predict the test results. (authors)

  19. Validity of congenital malformation diagnostic codes recorded in Québec's administrative databases.

    Science.gov (United States)

    Blais, Lucie; Bérard, Anick; Kettani, Fatima-Zohra; Forget, Amélie

    2013-08-01

    To assess the validity of the diagnostic codes of congenital malformations (CMs) recorded in two of Québec's administrative databases. A cohort of pregnancies and infants born to asthmatic and non-asthmatic women in 1990-2002 was reconstructed using Québec's administrative databases. From this cohort, we selected 269 infants with a CM and 144 without CM born to asthmatic women, together with 284 and 138 infants, respectively, born to non-asthmatic women. The diagnoses of CMs recorded in the databases were compared with the diagnoses written by the physicians in the infants' medical charts. The positive predictive values (PPV) and negative predictive values (NPV) for all, major, and several specific CMs were estimated. The PPVs for all CMs and major CMs were 82.2% (95% confidence interval (CI): 78.5%-85.9%) and 78.1% (74.1%-82.1%), respectively, in the asthmatic group and were 79.2% (75.4%-83.1%) and 69.0% (64.6%-73.4%), respectively, in the non-asthmatic group. PPVs >80% were found for several specific CMs, including cardiac, cleft, and limb CMs in both groups. The NPV for any CM was 88.2% (95% CI: 85.1%-91.3%) in the asthmatic group and 94.2% (92.2%-96.2%) in the non-asthmatic group. Québec's administrative databases are valid tools for epidemiological research of CMs. The results were similar between infants born to women with and without asthma. Copyright © 2013 John Wiley & Sons, Ltd.
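    The PPV and NPV figures with 95% confidence intervals quoted above reduce to simple proportions over the chart-review counts; a sketch using Wald intervals (the study's own statistical details may differ):

    ```python
    import math

    def predictive_values(tp, fp, tn, fn):
        """Positive and negative predictive values with 95% Wald CIs.

        tp/fp: database diagnoses confirmed/refuted by the medical chart;
        tn/fn: absences of a diagnosis confirmed/refuted by the chart.
        PPV = tp / (tp + fp), NPV = tn / (tn + fn).
        """
        def prop_ci(k, n):
            p = k / n
            half = 1.96 * math.sqrt(p * (1 - p) / n)
            return p, (p - half, p + half)

        ppv, ppv_ci = prop_ci(tp, tp + fp)
        npv, npv_ci = prop_ci(tn, tn + fn)
        return {"PPV": ppv, "PPV 95% CI": ppv_ci,
                "NPV": npv, "NPV 95% CI": npv_ci}
    ```

    For example, 80 confirmed out of 100 database-flagged malformations gives PPV = 0.80 with a CI of roughly ±0.08, the same order as the intervals reported in the abstract.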

  20. VULCAN: An Open-source, Validated Chemical Kinetics Python Code for Exoplanetary Atmospheres

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, Shang-Min; Grosheintz, Luc; Kitzmann, Daniel; Heng, Kevin [University of Bern, Center for Space and Habitability, Sidlerstrasse 5, CH-3012, Bern (Switzerland); Lyons, James R. [Arizona State University, School of Earth and Space Exploration, Bateman Physical Sciences, Tempe, AZ 85287-1404 (United States); Rimmer, Paul B., E-mail: shang-min.tsai@space.unibe.ch, E-mail: kevin.heng@csh.unibe.ch, E-mail: jimlyons@asu.edu [University of St. Andrews, School of Physics and Astronomy, St. Andrews, KY16 9SS (United Kingdom)

    2017-02-01

    We present an open-source and validated chemical kinetics code for studying hot exoplanetary atmospheres, which we name VULCAN. It is constructed for gaseous chemistry from 500 to 2500 K, using a reduced C–H–O chemical network with about 300 reactions. It uses eddy diffusion to mimic atmospheric dynamics and excludes photochemistry. We have provided a full description of the rate coefficients and thermodynamic data used. We validate VULCAN by reproducing chemical equilibrium and by comparing its output versus the disequilibrium-chemistry calculations of Moses et al. and Rimmer and Helling. It reproduces the models of HD 189733b and HD 209458b by Moses et al., which employ a network with nearly 1600 reactions. We also use VULCAN to examine the theoretical trends produced when the temperature–pressure profile and carbon-to-oxygen ratio are varied. Assisted by a sensitivity test designed to identify the key reactions responsible for producing a specific molecule, we revisit the quenching approximation and find that it is accurate for methane but breaks down for acetylene, because the disequilibrium abundance of acetylene is not directly determined by transport-induced quenching, but is rather indirectly controlled by the disequilibrium abundance of methane. Therefore we suggest that the quenching approximation should be used with caution and must always be checked against a chemical kinetics calculation. A one-dimensional model atmosphere with 100 layers, computed using VULCAN, typically takes several minutes to complete. VULCAN is part of the Exoclimes Simulation Platform (ESP; exoclime.net) and publicly available at https://github.com/exoclime/VULCAN.

  1. Validation of one-dimensional module of MARS 2.1 computer code by comparison with the RELAP5/MOD3.3 developmental assessment results

    International Nuclear Information System (INIS)

    Lee, Y. J.; Bae, S. W.; Chung, B. D.

    2003-02-01

    This report records the results of the code validation for the one-dimensional module of the MARS 2.1 thermal-hydraulics analysis code by comparison with the RELAP5/MOD3.3 computer code. For the validation calculations, simulations of the RELAP5 code developmental assessment problems, which consist of 22 simulation problems in 3 categories, were selected. The results of the 3 categories of simulations demonstrate that the one-dimensional module of the MARS 2.1 code and the RELAP5/MOD3.3 code are essentially the same code. This is expected, as the two codes have basically the same set of field equations, constitutive equations and main thermal-hydraulic models. The results suggest that the high level of code validity of RELAP5/MOD3.3 can be directly applied to the MARS one-dimensional module.

  2. OWL-based reasoning methods for validating archetypes.

    Science.gov (United States)

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: the reference model and the archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which many researchers consider to play a fundamental role in achieving semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This makes it possible to combine the two levels of the dual model-based architecture in one modeling framework, which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, the two largest publicly available ones, have been analyzed with our validation method. For this purpose, we have implemented a software tool called Archeck. Our results show that around one fifth of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals that different patterns of errors are found in the two repositories. This result reinforces the need for serious efforts to improve archetype design processes. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. GRADSPMHD: A parallel MHD code based on the SPH formalism

    Science.gov (United States)

    Vanaverbeke, S.; Keppens, R.; Poedts, S.

    2014-03-01

    We present GRADSPMHD, a completely Lagrangian parallel magnetohydrodynamics code based on the SPH formalism. The implementation of the equations of SPMHD in the “GRAD-h” formalism assembles known results, including the derivation of the discretized MHD equations from a variational principle, the inclusion of time-dependent artificial viscosity, resistivity and conductivity terms, as well as the inclusion of a mixed hyperbolic/parabolic correction scheme for satisfying the ∇·B = 0 constraint on the magnetic field. The code uses a tree-based formalism for neighbor finding and can optionally use the tree code for computing the self-gravity of the plasma. The structure of the code closely follows the framework of our parallel GRADSPH FORTRAN 90 code which we added previously to the CPC program library. We demonstrate the capabilities of GRADSPMHD by running 1-, 2-, and 3-dimensional standard benchmark tests and we find good agreement with previous work done by other researchers. The code is also applied to the problem of simulating the magnetorotational instability in 2.5D shearing box tests as well as in global simulations of magnetized accretion disks. We find good agreement with available results on this subject in the literature. Finally, we discuss the performance of the code on a parallel supercomputer with distributed memory architecture. Catalogue identifier: AERP_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERP_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 620503 No. of bytes in distributed program, including test data, etc.: 19837671 Distribution format: tar.gz Programming language: FORTRAN 90/MPI. Computer: HPC cluster. Operating system: Unix. Has the code been vectorized or parallelized?: Yes, parallelized using MPI. RAM: ~30 MB for a
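
    The mixed hyperbolic/parabolic divergence correction mentioned in the summary is, in the SPH-MHD literature, typically a Dedner-style cleaning scheme. As a hedged sketch (generic form in our own notation; the paper's exact coefficients may differ), a scalar field ψ is evolved alongside the induction equation:

```latex
\left.\frac{\partial \mathbf{B}}{\partial t}\right|_{\mathrm{clean}} = -\nabla\psi,
\qquad
\frac{\partial \psi}{\partial t} = -c_h^{2}\,\nabla\cdot\mathbf{B} \;-\; \frac{\psi}{\tau},
```

    Divergence errors are carried away as waves at the cleaning speed c_h (hyperbolic part) and damped on the timescale τ (parabolic part), keeping ∇·B small without requiring it to vanish identically.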

  4. Development of an automatic validation system for simulation codes of the fusion research; Entwicklung eines automatischen Validierungssystems fuer Simulationscodes der Fusionsforschung

    Energy Technology Data Exchange (ETDEWEB)

    Galonska, Andreas

    2010-03-15

    In the present master's thesis the development of an automatic validation system for the simulation code ERO is documented. This 3D Monte Carlo code models the transport of impurities as well as plasma-wall interaction processes and is of great importance for fusion research. The validation system is based on JuBE (Jülich Benchmarking Environment), whose flexibility allows the system to be easily extended to other codes, for instance those operated in the framework of the EU Task Force ITM (Integrated Tokamak Modelling). The chosen solution, JuBE combined with a special program for the ''intelligent'' comparison of actual and reference output data of ERO, is described and justified. The use of this program and the configuration of JuBE are described in detail. Simulations of different plasma experiments, which serve as reference cases for the automatic validation, are explained. The operation of the system is illustrated by the description of a test case: the localization and correction of a failure in the parallelization of an important ERO module (tracking of physically eroded particles). It is demonstrated how the system reacts to a failed validation and how the subsequently performed error correction leads to a positive result. Finally, a speed-up curve of the parallelization is established by means of the output data of JuBE.

  5. Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base

    Science.gov (United States)

    Savage, B.; Snoke, J. A.

    2017-12-01

    The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size from 16 to 32 in the 1980s, and 32 to 64 in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, variants of OSX processors (PowerPC and x86), and Windows (Cygwin). Traces of these systems are still visible in the source code and associated comments. A major concern while improving and maintaining a routinely used, legacy code is a fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Lab). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. However, there have been thousands of downloads a year of the package, either source code or binaries for specific systems. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC including use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion.
For the past thirty-plus years

  6. Validation of the TASS/SMR-S Code for the PRHRS Condensation Heat Transfer Model

    International Nuclear Information System (INIS)

    Jun, In Sub; Yang, Soo Hyoung; Chung, Young Jong; Lee, Won Jae

    2011-01-01

    When accidents or events occur in the SMART, the secondary system, such as the feedwater system, is used to remove the core decay heat over the long term. If the feedwater system cannot remove the residual core heat because of a malfunction, the core decay heat is removed using the Passive Residual Heat Removal System (PRHRS). The PRHRS is a passive safety system adopted to enhance the safety of the SMART. It can fundamentally eliminate the uncertainty of operator action. The TASS/SMR-S (Transient And Setpoint Simulation/System-integrated Modular Reactor-Safety) code has various heat transfer models reflecting the design features of the SMART. One of these is the PRHRS condensation heat transfer model. The role of this model is to calculate the heat transfer coefficient in the heat exchanger (H/X) tube side using the relevant heat transfer correlations for all of the heat transfer modes. In this paper, the validation of the condensation heat transfer model was carried out using the POSTECH H/X heat transfer test

  7. Validation of PV-RPM Code in the System Advisor Model.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Geoffrey Taylor [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lavrova, Olga [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States)]

    2017-04-01

    This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.

  8. On various validity criteria for the configuration average in collisional-radiative codes

    Energy Technology Data Exchange (ETDEWEB)

    Poirier, M [Commissariat à l'Energie Atomique, Service 'Photons, Atomes et Molécules', Centre d'Etudes de Saclay, F-91191 Gif-sur-Yvette Cedex (France)

    2008-01-28

    The characterization of out-of-local-thermal-equilibrium plasmas requires the use of collisional-radiative kinetic equations. This leads to the solution of large linear systems, for which statistical treatments such as configuration average may bring considerable simplification. In order to check the validity of this procedure, a criterion based on the comparison between a partial-rate system and the Saha-Boltzmann solution is discussed in detail here. Several forms of this criterion are discussed. The interest of these variants is that each involves one type of relevant transition (collisional or radiative), which allows one to check separately the influence of each of these processes on the configuration-average validity. The method is illustrated by a charge-distribution analysis in carbon and neon plasmas. Finally, it is demonstrated that when the energy dispersion of every populated configuration is smaller than the electron thermal energy, the proposed criterion is fulfilled in each of its forms.

  9. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. However, these methods can only predict the most probable faulty sensors, and their predictions are subject to the initial probabilities defined for the failures. The method developed in this work follows a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundant relations (ARRs).
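
    As a rough illustration of how analytical redundant relations (ARRs) support logical fault isolation, the sketch below (hypothetical; the names and the simple exoneration rule are ours, not taken from the work) treats each ARR as a residual function over a subset of sensors: a satisfied ARR exonerates its sensors, while sensors that appear only in violated ARRs remain suspect.

```python
def validate_sensors(readings, arrs, tol=1e-6):
    """Return the set of sensors logically implicated by violated ARRs.

    readings: dict mapping sensor name -> measured value
    arrs: list of (sensor_names, residual_fn); residual_fn(*values) ~ 0 when healthy
    """
    violated_sets = []
    exonerated = set()
    for names, residual_fn in arrs:
        residual = residual_fn(*(readings[n] for n in names))
        if abs(residual) > tol:
            violated_sets.append(set(names))   # some sensor here is faulty
        else:
            exonerated |= set(names)           # a satisfied ARR clears its sensors
    if not violated_sets:
        return set()
    # suspects: sensors in a violated ARR that no satisfied ARR exonerates
    return set.union(*violated_sets) - exonerated
```

    For readings {'a': 1.0, 'b': 1.0, 'c': 5.0} with the relations a − b and b − c, the first relation clears a and b, so only c remains suspect; no probability functions are needed.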

  10. Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.

    Science.gov (United States)

    Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen

    2014-02-01

    The exponential growth of surveillance videos presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards that were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., relatively static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are first classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely, the background reference prediction (BRP), which uses the background modeled from the original input frames as the long-term reference, and the background difference prediction (BDP), which predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency by using the higher-quality background as the reference; whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio on surveillance videos as AVC (MPEG-4 Advanced Video Coding) high profile, yet with only slightly higher encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
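
    The background difference prediction (BDP) idea can be sketched as follows (a simplified stand-in using a running-average background model; the actual BMAP block classification and coding pipeline is far more elaborate):

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Running-average background model (a simple stand-in for the modeling stage)."""
    return (1 - alpha) * background + alpha * frame

def bdp_residual(block, ref_block, bg_block, bg_ref_block):
    """Background difference prediction: form the residual in the
    background-subtracted domain instead of the raw pixel domain."""
    diff = block - bg_block              # current block minus its modeled background
    ref_diff = ref_block - bg_ref_block  # reference block minus its background
    return diff - ref_diff               # residual that would actually be encoded
```

    When the modeled backgrounds are accurate, the residual of a hybrid block reduces to the difference of its foreground parts, which is cheaper to encode than the raw inter residual.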

  11. Validation of the TRACR3D code for soil water flow under saturated/unsaturated conditions in three experiments

    International Nuclear Information System (INIS)

    Perkins, B.; Travis, B.; DePoorter, G.

    1985-01-01

    Validation of the TRACR3D code in a one-dimensional form was obtained for flow of soil water in three experiments. In the first experiment, a pulse of water entered a crushed-tuff soil and initially moved under conditions of saturated flow, quickly followed by unsaturated flow. In the second experiment, steady-state unsaturated flow took place. In the final experiment, two slugs of water entered crushed tuff under field conditions. In all three experiments, experimentally measured data for volumetric water content agreed, within experimental errors, with the volumetric water content predicted by the code simulations. The experiments and simulations indicated the need for accurate knowledge of boundary and initial conditions, amount and duration of moisture input, and relevant material properties as input into the computer code. During the validation experiments, limitations on monitoring of water movement in waste burial sites were also noted. 5 references, 34 figures, 9 tables

  12. CSNI Integral test facility validation matrix for the assessment of thermal-hydraulic codes for LWR LOCA and transients

    International Nuclear Information System (INIS)

    1996-07-01

    This report deals with an internationally agreed integral test facility (ITF) matrix for the validation of best-estimate thermal-hydraulic computer codes. Firstly, the main physical phenomena that occur during the considered accidents are identified, test types are specified, and test facilities suitable for reproducing these aspects are selected. Secondly, a list of selected experiments carried out in these facilities has been set down. The criteria to achieve the objectives are outlined. The construction of such a matrix is an attempt to collect together in a systematic way the best sets of openly available test data for code validation, assessment and improvement, including quantitative assessment of uncertainties in the modelling of phenomena by the codes. In addition to this objective, it is an attempt to record information which has been generated around the world over the last 20 years so that it is more accessible to present and future workers in that field than would otherwise be the case

  13. Validation study of the reactor physics lattice transport code WIMSD-5B by TRX and BAPL critical experiments of light water reactors

    International Nuclear Information System (INIS)

    Khan, M.J.H.; Alam, A.B.M.K.; Ahsan, M.H.; Mamun, K.A.A.; Islam, S.M.A.

    2015-01-01

    Highlights: • To validate the reactor physics lattice code WIMSD-5B by this analysis. • To model TRX and BAPL critical experiments using WIMSD-5B. • To compare the calculated results with experiment and MCNP results. • To rely on WIMSD-5B code for TRIGA calculations. - Abstract: The aim of this analysis is to validate the reactor physics lattice transport code WIMSD-5B by TRX (thermal reactor-one region lattice) and BAPL (Bettis Atomic Power Laboratory-one region lattice) critical experiments of light water reactors for neutronics analysis of the 3 MW TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh. This analysis is achieved through the analysis of integral parameters of five light water reactor critical experiments TRX-1, TRX-2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3 based on the evaluated nuclear data libraries JEFF-3.1 and ENDF/B-VII.1. In integral measurements, these experiments are considered standard benchmark lattices for validating the reactor physics lattice transport code WIMSD-5B as well as evaluated nuclear data libraries. The integral parameters of the said critical experiments are calculated using the reactor physics lattice transport code WIMSD-5B. The calculated integral parameters are compared to the measured values as well as the earlier published MCNP results based on the Chinese evaluated nuclear data library CENDL-3.0 for assessment of the deterministic calculation. It was found that the calculated integral parameters give mostly reasonable and globally consistent results with the experiment and the MCNP results. Besides, the group constants in WIMS format for the isotopes U-235 and U-238 between the two data files have been compared using the WIMS library utility code WILLIE and it was found that the group constants are well consistent with each other. Therefore, this analysis reveals the validation study of the reactor physics lattice transport code WIMSD-5B based on JEFF-3.1 and ENDF/B-VII.1 libraries and can also be essential to

  14. Validation of Printed Circuit Heat Exchanger Design Code KAIST{sub H}XD

    Energy Technology Data Exchange (ETDEWEB)

    Baik, Seungjoon; Kim, Seong Gu; Lee, Jekyoung; Lee, Jeong Ik [KAIST, Daejeon (Korea, Republic of)

    2015-05-15

    Supercritical carbon dioxide (S-CO{sub 2}) Brayton cycle has been suggested for the SFR due to the relatively mild sodium-CO{sub 2} interaction. The S-CO{sub 2} power conversion cycle can achieve not only high safety but also high efficiency with the SFR core thermal condition. However, due to the dramatic property change near the critical point, the compressor inlet pressure and temperature conditions can have a significant effect on the overall cycle efficiency. To maintain the compressor inlet condition, a sensitive precooler control system is required for stable operation. Therefore, understanding the precooler performance is essential for the S-CO{sub 2} power conversion system. According to the experimental results, the designed PCHE showed high effectiveness in various operating regions. Comparing the experimental and the design data, the heat transfer performance estimation showed less than 6% error. On the other hand, the pressure drop estimation showed a large gap. The water-side pressure drop showed 50-70% underestimation. Because the form losses were not included in the design code, the water-side pressure drop estimation result seems reliable. However, the CO{sub 2} side showed more than 70% overestimation in the pressure drop from the code. The authors suspect that the differences may have been caused by the channel corner shape. The real channel has round corners and smooth edges, but the correlation is based on the sharp-edged zig-zag channel. Further studies are required to understand and interpret the results correctly in the future.
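
    The gap attributed above to missing form losses can be illustrated with the generic channel pressure-drop decomposition (illustrative only; the actual KAIST{sub H}XD correlations are not reproduced here):

```python
def channel_dp(rho, v, f, length, d_h, k_form=0.0):
    """Channel pressure drop: Darcy friction plus summed form losses.

    rho: fluid density [kg/m^3], v: mean velocity [m/s],
    f: Darcy friction factor, length/d_h: channel length and hydraulic diameter [m],
    k_form: total form-loss coefficient (bends, inlets, sharp corners); 0 omits it.
    """
    dynamic_pressure = 0.5 * rho * v ** 2
    return f * (length / d_h) * dynamic_pressure + k_form * dynamic_pressure
```

    With k_form = 0 the estimate reduces to pure wall friction, which is one plausible way a design code would underestimate a measured pressure drop that includes entrance, bend and corner losses.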

  15. The PHREEQE Geochemical equilibrium code data base and calculations

    International Nuclear Information System (INIS)

    Andersoon, K.

    1987-01-01

    Compilation of a thermodynamic data base for actinides and fission products for use with PHREEQE has begun, and a preliminary set of actinide data has been tested for the PHREEQE code in a version run on an IBM XT computer. The work until now has shown that the PHREEQE code mostly gives satisfying results for the speciation of actinides in a natural water environment. For U and Np under oxidizing conditions, however, the code has difficulty converging with pH and Eh conserved when a solubility limit is applied. For further calculations of actinide and fission product speciation and solubility in a waste repository and in the surrounding geosphere, more data are needed. It is necessary to evaluate the influence of the large uncertainties of some data. A quality assurance and a check on the consistency of the data base are also needed. Further work with data bases should include: an extension to fission products, an extension to engineering materials, an extension to other ligands than hydroxide and carbonate, inclusion of more mineral phases, inclusion of enthalpy data, a control of primary references in order to decide if values from different compilations are taken from the same primary reference, and contacts and discussions with other groups working with actinide data bases, e.g. at the OECD/NEA and at the IAEA. (author)

  16. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal

    2010-09-01

    The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit the required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique that is based on Huffman coding. Our technique reduces the size of the generated decoding table, which takes a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new one called the Combined Compression Technique, to improve the final compression ratio by taking advantage of both previous techniques. The Instruction Splitting Technique is instruction set architecture (ISA)-independent. It splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman coding. The average compression ratios achieved using this technique are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent. It investigates the benefits of re-encoding unused bits (we call them re-encodable bits) in the instruction format for a specific application to improve the compression ratio. Re-encoding those bits can reduce the size of decoding tables by up to 40%. Using this technique, we improve the final compression ratios in comparison to the first technique to 46% and 45% for ARM and MIPS, respectively (including all incurred overhead). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively.
In our compression technique, we have conducted evaluations using a representative set of applications and we have applied each technique to two major embedded processor architectures
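
    The pattern-based Huffman stage described above can be sketched with the standard library (a toy version: instructions are bit strings split into fixed-width patterns; the decoding-table compaction and bit re-encoding steps are omitted):

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Build a Huffman code table (symbol -> bit string) from an iterable of symbols."""
    freq = Counter(symbols)
    if len(freq) == 1:                        # degenerate single-symbol stream
        return {next(iter(freq)): "0"}
    # heap entries: (count, tiebreaker, partial code table)
    heap = [(n, i, {s: ""}) for i, (s, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        n1, _, t1 = heapq.heappop(heap)       # merge the two least frequent subtrees
        n2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (n1 + n2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def compression_ratio(instructions, pattern_bits=8):
    """Compressed/original size after splitting each instruction (a bit string)
    into fixed-width patterns and Huffman-coding the patterns."""
    patterns = [instr[i:i + pattern_bits]
                for instr in instructions
                for i in range(0, len(instr), pattern_bits)]
    table = huffman_code(patterns)
    compressed_bits = sum(len(table[p]) for p in patterns)
    original_bits = sum(len(p) for p in patterns)
    return compressed_bits / original_bits
```

    For a highly repetitive instruction stream the ratio drops far below 1; real instruction traces are less repetitive, which is consistent with the 42-50% range quoted in the record.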

  17. Validation of coupled Relap5-3D code in the analysis of RBMK-1500 specific transients

    International Nuclear Information System (INIS)

    Evaldas, Bubelis; Algirdas, Kaliatka; Eugenijus, Uspuras

    2003-01-01

    This paper deals with the modelling of RBMK-1500 specific transients taking place at the Ignalina NPP. These transients include: measurements of void and fast power reactivity coefficients, a change of graphite cooling conditions, and reactor power reduction transients. The simulation of these transients was performed using the RELAP5-3D code model of the RBMK-1500 reactor. At the Ignalina NPP, void and fast power reactivity coefficients are measured on a regular basis and, based on the total reactor power, reactivity, control and protection system control rod positions and the main circulation circuit parameter changes during the experiments, the actual values of these reactivity coefficients are determined. The graphite temperature reactivity coefficient at the plant is determined by changing the graphite cooling conditions in the reactor cavity. This type of transient is unique and important for validating the model of the gap between the fuel channel and the graphite bricks. The measurement results obtained during this transient allowed the thermal conductivity coefficient for this gap to be determined and the graphite temperature reactivity feedback model to be validated. Reactor power reduction is a regular operation procedure during the entire lifetime of the reactor. In all cases it starts by either a scram or a power reduction signal activated by the reactor control and protection system or by an operator. The obtained calculation results demonstrate reasonable agreement with Ignalina NPP measured data. The behaviour of the individual MCC thermal-hydraulic parameters, as well as the physical processes occurring in the primary circuit of the RBMK-1500 reactor, is predicted reasonably well. The reasonable agreement of the measured and the calculated total reactor power change in time demonstrates the correct modelling of the neutronic processes taking place in the RBMK-1500 reactor core. And finally, the performed validation of the RELAP5-3D model of the Ignalina NPP RBMK-1500

  18. Three-dimensional all-speed CFD code for safety analysis of nuclear reactor containment: Status of GASFLOW parallelization, model development, validation and application

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Jianjun, E-mail: jianjun.xiao@kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Travis, John R., E-mail: jack_travis@comcast.com [Engineering and Scientific Software Inc., 3010 Old Pecos Trail, Santa Fe, NM 87505 (United States); Royl, Peter, E-mail: peter.royl@partner.kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Necker, Gottfried, E-mail: gottfried.necker@partner.kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Svishchev, Anatoly, E-mail: anatoly.svishchev@kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany); Jordan, Thomas, E-mail: thomas.jordan@kit.edu [Institute of Nuclear and Energy Technologies, Karlsruhe Institute of Technology, P.O. Box 3640, 76021 Karlsruhe (Germany)

    2016-05-15

    Highlights: • 3-D scalable semi-implicit pressure-based CFD code for containment safety analysis. • Robust solution algorithm valid for all-speed flows. • Well validated and widely used CFD code for hydrogen safety analysis. • Code applied in various types of nuclear reactor containments. • Parallelization enables high-fidelity models in large scale containment simulations. - Abstract: GASFLOW is a three dimensional semi-implicit all-speed CFD code which can be used to predict fluid dynamics, chemical kinetics, heat and mass transfer, aerosol transportation and other related phenomena involved in postulated accidents in nuclear reactor containments. The main purpose of the paper is to give a brief review on recent GASFLOW code development, validations and applications in the field of nuclear safety. GASFLOW code has been well validated by international experimental benchmarks, and has been widely applied to hydrogen safety analysis in various types of nuclear power plants in European and Asian countries, which have been summarized in this paper. Furthermore, four benchmark tests of a lid-driven cavity flow, low Mach number jet flow, 1-D shock tube and supersonic flow over a forward-facing step are presented in order to demonstrate the accuracy and wide-ranging capability of ICE’d ALE solution algorithm for all-speed flows. GASFLOW has been successfully parallelized using the paradigms of Message Passing Interface (MPI) and domain decomposition. The parallel version, GASFLOW-MPI, adds great value to large scale containment simulations by enabling high-fidelity models, including more geometric details and more complex physics. It will be helpful for the nuclear safety engineers to better understand the hydrogen safety related physical phenomena during the severe accident, to optimize the design of the hydrogen risk mitigation systems and to fulfill the licensing requirements by the nuclear regulatory authorities. GASFLOW-MPI is targeting a high

  19. GOTHIC-IST 6.1b code validation exercises relating to heat removal by dousing and air coolers in CANDU containment

    International Nuclear Information System (INIS)

    Ramachandran, S.; Krause, M.; Nguyen, T.

    2003-01-01

    This paper presents validation results relating to the use of the GOTHIC containment analysis code for CANDU safety analysis. The validation results indicate that GOTHIC predicts heat removal by dousing and air cooler heat transfer with reasonable accuracy. (author)

  20. Scintillator Based Coded-Aperture Imaging for Neutron Detection

    International Nuclear Information System (INIS)

    Hayes, Sean-C.; Gamage, Kelum-A-A.

    2013-06-01

    In this paper we assess the variations of neutron images using a series of Monte Carlo simulations. We study neutron images of the same neutron source with different source locations, using a scintillator-based coded-aperture system. The Monte Carlo simulations have been conducted making use of the EJ-426 neutron scintillator detector. This type of detector has a low sensitivity to gamma rays and is therefore of particular use in a system with a source that emits a mixed radiation field. Using different source locations, several neutron images have been produced and compared both qualitatively and quantitatively for each case. This allows conclusions to be drawn on how well suited the scintillator-based coded-aperture neutron imaging system is to detecting various neutron source locations. This type of neutron imaging system can be easily used to identify and locate nuclear materials precisely. (authors)
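
    The decoding principle of a coded-aperture imager can be sketched in one dimension (illustrative only, unrelated to the EJ-426 simulations): a point source projects a shifted copy of the mask onto the detector, and periodic cross-correlation with the mask recovers the source position:

```python
import numpy as np

rng = np.random.default_rng(0)
mask = rng.integers(0, 2, size=101).astype(float)   # random binary aperture pattern

def detect(source_shift):
    """Detector image of a point source: the mask pattern, cyclically shifted."""
    return np.roll(mask, source_shift)

def decode(detector):
    """Periodic cross-correlation with the mask; the peak marks the source shift."""
    corr = [float(np.dot(detector, np.roll(mask, k))) for k in range(mask.size)]
    return int(np.argmax(corr))
```

    A random mask's autocorrelation peaks sharply at zero lag, so the correlation peak lands at the source shift; real systems use specially designed mask families with flat sidelobes, but the recovery step has the same shape.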

  1. Finger Vein Recognition Based on Local Directional Code

    Science.gov (United States)

    Meng, Xianjing; Yang, Gongping; Yin, Yilong; Xiao, Rongyang

    2012-01-01

    Finger vein patterns are considered one of the most promising biometric authentication methods due to their security and convenience. Most of the currently available finger vein recognition methods utilize features from a segmented blood vessel network. As an improperly segmented network may degrade the recognition accuracy, binary pattern based methods have been proposed, such as Local Binary Pattern (LBP), Local Derivative Pattern (LDP) and Local Line Binary Pattern (LLBP). However, the rich directional information hidden in the finger vein pattern has not been fully exploited by the existing local patterns. Inspired by the Weber Local Descriptor (WLD), this paper presents a new direction-based local descriptor called Local Directional Code (LDC) and applies it to finger vein recognition. In LDC, the local gradient orientation information is coded as an octonary decimal number. Experimental results show that the proposed method using LDC achieves better performance than methods using LLBP. PMID:23202194
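
    The octonary coding step can be sketched with NumPy (one hypothetical realization of the idea, not the authors' exact formulation): quantize each pixel's gradient orientation to the nearest of eight 45° directions.

```python
import numpy as np

def local_directional_code(image):
    """Code each pixel's gradient orientation as an octonary digit (0-7),
    i.e. the nearest of eight 45-degree directions."""
    gy, gx = np.gradient(image.astype(float))   # row- and column-wise gradients
    theta = np.arctan2(gy, gx)                  # orientation in (-pi, pi]
    return np.rint(theta / (np.pi / 4)).astype(int) % 8
```

    A horizontal intensity ramp maps everywhere to code 0 and its transpose to code 2, so the code captures direction rather than magnitude, which is the property the descriptor exploits.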

  2. Finger Vein Recognition Based on Local Directional Code

    Directory of Open Access Journals (Sweden)

    Rongyang Xiao

    2012-11-01

    Full Text Available Finger vein patterns are considered one of the most promising biometric authentication methods because of their security and convenience. Most currently available finger vein recognition methods utilize features from a segmented blood vessel network. Because an improperly segmented network may degrade recognition accuracy, binary-pattern-based methods have been proposed, such as the Local Binary Pattern (LBP), Local Derivative Pattern (LDP) and Local Line Binary Pattern (LLBP). However, the rich directional information hidden in the finger vein pattern has not been fully exploited by the existing local patterns. Inspired by the Weber Local Descriptor (WLD), this paper presents a new direction-based local descriptor called the Local Directional Code (LDC) and applies it to finger vein recognition. In LDC, the local gradient orientation information is coded as an octonary (base-8) decimal number. Experimental results show that the proposed method using LDC achieves better performance than methods using LLBP.

  3. Automatic Structure-Based Code Generation from Coloured Petri Nets

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Westergaard, Michael

    2010-01-01

    Automatic code generation based on Coloured Petri Net (CPN) models is challenging because CPNs allow for the construction of abstract models that intermix control flow and data processing, making translation into conventional programming constructs difficult. We introduce Process-Partitioned CPNs (PP-CPNs), a subclass of CPNs equipped with an explicit separation of process control flow, message passing, and access to shared and local data. We show how PP-CPNs cater for a four-phase, structure-based automatic code generation process directed by the control flow of processes. The viability of our approach is demonstrated by applying it to automatically generate an Erlang implementation of the Dynamic MANET On-demand (DYMO) routing protocol specified by the Internet Engineering Task Force (IETF).

  4. A Secure Network Coding Based on Broadcast Encryption in SDN

    Directory of Open Access Journals (Sweden)

    Yue Chen

    2016-01-01

    Full Text Available By allowing intermediate nodes to encode received packets before sending them out, network coding improves the capacity and robustness of multicast applications. However, it is vulnerable to pollution attacks. Signature schemes have been proposed to thwart such attacks, but most of them must be homomorphic, so the keys cannot be generated and managed easily. In this paper, we propose a novel fast and secure switch network coding multicast (SSNC) scheme for software-defined networks (SDN). In our scheme, the complicated secure multicast management is separated from the fast data transmission based on the SDN. Multiple multicasts are aggregated into one multicast group according to the requirements of the services and the network status. The controller then routes the aggregated multicast group with network coding; only trusted switches are allowed to join the network coding, enforced by broadcast encryption. The proposed scheme can use traditional cryptography without homomorphism, which greatly reduces the computational complexity and improves transmission efficiency.
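    The underlying mechanism, intermediate nodes forwarding linear combinations of packets that receivers invert, can be sketched over GF(2). This toy omits SDN control, multicast aggregation, and broadcast encryption entirely; the coding vectors are chosen by hand for clarity.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # 3 source packets of 8 bits each.
    packets = rng.integers(0, 2, size=(3, 8))

    # An intermediate node transmits XOR (GF(2)) combinations of the
    # packets; this coding matrix is full rank over GF(2), so any
    # receiver holding all three coded packets can decode.
    coeffs = np.array([[1, 0, 1],
                       [0, 1, 1],
                       [1, 1, 1]])
    coded = coeffs @ packets % 2

    # Receiver: Gaussian elimination over GF(2) on [coeffs | coded].
    aug = np.concatenate([coeffs, coded], axis=1)
    for col in range(3):
        pivot = col + np.argmax(aug[col:, col])   # a row with a 1 in col
        aug[[col, pivot]] = aug[[pivot, col]]
        for row in range(3):
            if row != col and aug[row, col]:
                aug[row] ^= aug[col]              # XOR = GF(2) row op
    decoded = aug[:, 3:]
    print(np.array_equal(decoded, packets))  # prints True
    ```

    In practice random coding vectors over a larger field such as GF(2^8) are used, which makes independent combinations overwhelmingly likely.
    
    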

  5. Validation of CBZ code system for post-irradiation examination analysis and sensitivity analysis of (n,γ) branching ratio

    International Nuclear Information System (INIS)

    Kawamoto, Yosuke; Chiba, Go; Tsuji, Masashi; Narabayashi, Tadashi

    2013-01-01

    The CBZ code system is being developed at Hokkaido University. In order to validate it, post-irradiation examination (PIE) data, i.e. nuclide composition data of spent fuel, have been analyzed with CBZ. The validity is evaluated through the ratios of the calculated values to the experimental ones (C/E ratios). The differences between the experimental and calculated values are smaller than 20% except for some nuclides; the code system is thus validated. Additionally, we evaluate the influence of changes in the (n,γ) branching ratios on the inventories of fission products and actinides. As a result, the branching ratios of Sb-121, Pm-147 and Am-241 influence the inventories of several nuclides. We perform PIE analyses using different (n,γ) branching ratio data from the ORIGEN-2 library, JNDC-Ver.2 and JEFF-3.1A, and find that differences in (n,γ) branching ratios between nuclear data libraries have a non-negligible influence on the inventories of several nuclides. (author)
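    The C/E validation metric used above is straightforward to compute. The nuclides and values below are made-up placeholders for illustration, not actual CBZ or PIE results.

    ```python
    # C/E ratios for a post-irradiation examination (PIE) comparison.
    # All numbers here are hypothetical placeholders.
    calculated = {"Cs-137": 1.02e-3, "Nd-148": 9.7e-4, "Am-241": 1.9e-5}
    measured   = {"Cs-137": 1.00e-3, "Nd-148": 1.00e-3, "Am-241": 2.5e-5}

    for nuclide, e in measured.items():
        c = calculated[nuclide]
        ce = c / e
        # Flag nuclides whose calculation deviates by more than 20%.
        flag = "  <-- outside 20% band" if abs(ce - 1.0) > 0.20 else ""
        print(f"{nuclide}: C/E = {ce:.3f}{flag}")
    ```

    With these placeholder numbers, only Am-241 (C/E = 0.760) falls outside the 20% band.
    
    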

  6. Analysis of the impact of correlated benchmark experiments on the validation of codes for criticality safety analysis

    International Nuclear Information System (INIS)

    Bock, M.; Stuke, M.; Behler, M.

    2013-01-01

    The validation of a code for criticality safety analysis requires the recalculation of benchmark experiments. The selected benchmark experiments are chosen such that they have properties similar to the application case that has to be assessed. A common source of benchmark experiments is the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments' (ICSBEP Handbook) compiled by the 'International Criticality Safety Benchmark Evaluation Project' (ICSBEP). In order to take full advantage of the information provided by the individual benchmark descriptions for the application case, the recommended procedure is to perform an uncertainty analysis. The latter is based on the uncertainties of the experimental results included in most of the benchmark descriptions. Such analyses can be performed by means of the Monte Carlo sampling technique. The consideration of uncertainties is also being introduced in the supplementary sheet of DIN 25478 'Application of computer codes in the assessment of criticality safety'. However, for a correct treatment of uncertainties, taking into account only the individual uncertainties of the benchmark experiments is insufficient. In addition, correlations between benchmark experiments have to be handled correctly. For example, these correlations can arise when different cases of a benchmark experiment share the same components, like fuel pins or fissile solutions. Thus, manufacturing tolerances of these components (e.g. the diameter of the fuel pellets) have to be considered in a consistent manner in all cases of the benchmark experiment. At the 2012 meeting of the Expert Group on 'Uncertainty Analysis for Criticality Safety Assessment' (UACSA) of the OECD/NEA, a benchmark proposal was outlined that aimed at determining the impact of benchmark correlations on the estimation of the computational bias of the neutron multiplication factor (k_eff). The analysis presented here is based on this proposal. (orig.)
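    How shared components induce correlations between benchmark cases can be illustrated with a toy Monte Carlo sampling sketch. The sensitivities, uncertainties and linear k_eff surrogate below are invented purely for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000  # Monte Carlo histories

    # Two benchmark cases built from the SAME fuel-pin lot: sample one
    # shared pellet-diameter perturbation per history, plus independent
    # case-specific uncertainties (all in units of their own sigma).
    shared_diam = rng.normal(0.0, 1.0, n)   # shared manufacturing tolerance
    indep_1 = rng.normal(0.0, 1.0, n)       # case-1 independent uncertainty
    indep_2 = rng.normal(0.0, 1.0, n)       # case-2 independent uncertainty

    # Toy linear surrogates for the k_eff response of each case
    # (sensitivities in pcm per sigma are hypothetical).
    k1 = 1.000 + 200e-5 * shared_diam + 100e-5 * indep_1
    k2 = 0.998 + 200e-5 * shared_diam + 100e-5 * indep_2

    corr = np.corrcoef(k1, k2)[0, 1]
    print(f"correlation between benchmark cases: {corr:.2f}")
    ```

    With these made-up sensitivities the shared tolerance contributes 4/5 of each variance, so the sampled correlation comes out near 0.8 — ignoring it would misstate the combined evidence the two cases provide.
    
    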

  7. Trellis and turbo coding iterative and graph-based error control coding

    CERN Document Server

    Schlegel, Christian B

    2015-01-01

    This new edition has been extensively revised to reflect the progress in error control coding over the past few years. Over 60% of the material has been completely reworked, and 30% of the material is new. It presents convolutional, turbo, and low-density parity-check (LDPC) codes and polar codes in a unified framework, covers advanced research-related developments such as spatial coupling, and focuses on the algorithmic and implementation aspects of error control coding.

  8. Comparison and validation of the results of the AZNHEX v.1.0 code with the MCNP code simulating the core of a fast reactor cooled with sodium

    International Nuclear Information System (INIS)

    Galicia A, J.; Francois L, J. L.; Bastida O, G. E.; Esquivel E, J.

    2016-09-01

    The development of the AZTLAN platform for the analysis and design of nuclear reactors is led by the Instituto Nacional de Investigaciones Nucleares (ININ) and divided among four working groups, which have well-defined activities for achieving significant progress in this project both individually and jointly. Among these working groups is the users group, whose main task is to use the codes that make up the AZTLAN platform and provide feedback to the developers, so that the final versions of the codes are efficient and, at the same time, reliable and easy to understand. In this paper we present the results provided by the AZNHEX v.1.0 code when simulating the core of a sodium-cooled fast reactor at steady state. The validation of these results is a fundamental part of the platform development and a responsibility of the users group, so in this research the results obtained with AZNHEX are compared with and analyzed against those provided by the Monte Carlo code MCNP-5, software used and recognized worldwide. A description of the methodology used with MCNP-5 for the calculation of the variables of interest is also presented, together with the differences obtained with respect to the AZNHEX calculations. (Author)

  9. Validation of film dryout model in a three-fluid code FIDAS

    International Nuclear Information System (INIS)

    Sugawara, Satoru

    1989-11-01

    An analytical model for predicting the critical heat flux (CHF) has been developed on the basis of a film dryout criterion due to droplet deposition and entrainment in annular mist flow. CHF in round tubes was analyzed with the Film Dryout Analysis Code in Subchannels, FIDAS, which is based on a three-fluid, three-field formulation and the newly developed film dryout model. Predictions by FIDAS were compared with worldwide experimental CHF data obtained in water and Freon for uniformly and non-uniformly heated tubes under vertical upward flow conditions. Furthermore, the CHF prediction capability of FIDAS was compared with those of other film dryout models for annular flow and with Katto's CHF correlation. The predictions of FIDAS agree well with the experimental CHF data, and show better agreement than the other film dryout models and Katto's empirical correlation. (author)

  10. Online cross-validation-based ensemble learning.

    Science.gov (United States)

    Benkeser, David; Ju, Cheng; Lendle, Sam; van der Laan, Mark

    2018-01-30

    Online estimators update a current estimate with each new incoming batch of data, without having to revisit past data, thereby providing streaming estimates that are scalable to big data. We develop flexible, ensemble-based online estimators of an infinite-dimensional target parameter, such as a regression function, in the setting where data are generated sequentially by a common conditional data distribution given summary measures of the past. This setting encompasses a wide range of time-series models and, as a special case, models for independent and identically distributed data. Our estimator considers a large library of candidate online estimators and uses online cross-validation to identify the algorithm with the best performance. We show that by basing estimates on the cross-validation-selected algorithm, we are asymptotically guaranteed to perform as well as the true, unknown best-performing algorithm. We provide extensions of this approach, including online estimation of the optimal ensemble of candidate online estimators. We illustrate the excellent performance of our methods using simulations and a real data example in which we make streaming predictions of infectious disease incidence using data from a large database. Copyright © 2017 John Wiley & Sons, Ltd.
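    The core idea of online cross-validation, scoring each candidate on an incoming batch *before* updating it on that batch, can be sketched with simple online least-squares learners. This is a toy illustration with invented learning rates and data, not the authors' estimator.

    ```python
    import numpy as np

    class OnlineSGD:
        """Online least-squares learner: one SGD step per incoming batch."""
        def __init__(self, dim, lr):
            self.w = np.zeros(dim)
            self.lr = lr
        def predict(self, X):
            return X @ self.w
        def update(self, X, y):
            grad = 2 * X.T @ (self.predict(X) - y) / len(y)
            self.w -= self.lr * grad

    rng = np.random.default_rng(1)
    true_w = np.array([1.0, -2.0, 0.5])
    lrs = (0.001, 0.05, 0.2)
    learners = [OnlineSGD(3, lr) for lr in lrs]
    cum_loss = np.zeros(len(learners))

    for t in range(200):                     # stream of data batches
        X = rng.normal(size=(16, 3))
        y = X @ true_w + 0.1 * rng.normal(size=16)
        # Online CV: evaluate every candidate on the new batch FIRST,
        # so the score is an honest out-of-sample loss...
        for i, m in enumerate(learners):
            cum_loss[i] += np.mean((m.predict(X) - y) ** 2)
        # ...and only then let each candidate train on the batch.
        for m in learners:
            m.update(X, y)

    best = int(np.argmin(cum_loss))
    print("selected learning rate:", lrs[best])
    ```

    The discrete selector simply follows the candidate with the smallest cumulative out-of-sample loss; the paper's ensemble extension would instead weight the candidates.
    
    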

  11. MISTRA facility for containment lumped parameter and CFD codes validation. Example of the International Standard Problem ISP47

    International Nuclear Information System (INIS)

    Tkatschenko, I.; Studer, E.; Paillere, H.

    2005-01-01

    During a severe accident in a Pressurized Water Reactor (PWR), the formation of a combustible gas mixture in the complex geometry of the reactor depends on the understanding of hydrogen production, the complex 3D thermal-hydraulic flows due to gas/steam injection, natural convection, heat transfer by condensation on walls, and the effect of mitigation devices. Numerical simulation of such flows may be performed either by Lumped Parameter (LP) or by Computational Fluid Dynamics (CFD) codes. The advantages and drawbacks of LP and CFD codes are well known. LP codes are mainly developed for full-size containment analysis, but they need improvement, especially since they are not able to accurately predict the local gas mixing within the containment. CFD codes require a process of validation against well-instrumented experimental data before they can be used with a high degree of confidence. The MISTRA coupled-effect test facility has been built at CEA to fulfil this validation objective, with numerous measurement points in the gaseous volume - temperature, gas concentration, velocity and turbulence - and well-controlled boundary conditions. As an illustration of both the experimental and simulation aspects of this topic, a recent example of the use of MISTRA test data is presented for the International Standard Problem ISP47. The proposed experimental work in the MISTRA facility provides essential data to fill the gaps in the modelling/validation of computational tools. (author)

  12. Probabilistic Decision Based Block Partitioning for Future Video Coding

    KAUST Repository

    Wang, Zhao

    2017-11-29

    In the latest Joint Video Exploration Team development, the quadtree plus binary tree (QTBT) block partitioning structure has been proposed for future video coding. Compared to the traditional quadtree structure of the High Efficiency Video Coding (HEVC) standard, QTBT provides more flexible patterns for splitting the blocks, which results in dramatically more combinations of block partitions and high computational complexity. In view of this, a confidence-interval-based early termination (CIET) scheme is proposed for QTBT to identify partition modes that are unnecessary in the sense of rate-distortion (RD) optimization. In particular, an RD model is established to predict the RD cost of each partition pattern without the full encoding process. Subsequently, the mode decision problem is cast into a probabilistic framework to select the final partition based on a confidence interval decision strategy. Experimental results show that the proposed CIET algorithm can speed up QTBT block partitioning, reducing encoding time by 54.7% with only a 1.12% increase in bit rate. Moreover, the proposed scheme performs consistently well for high-resolution sequences, for which video coding efficiency is crucial in real applications.

  13. Development of Safety Analysis Codes and Experimental Validation for a Very High Temperature Gas-Cooled Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chang, H. Oh, PhD; Cliff Davis; Richard Moore

    2004-11-01

    The very high temperature gas-cooled reactors (VHTGRs) are those concepts that have average coolant temperatures above 900 degrees C or operational fuel temperatures above 1250 degrees C. These concepts offer the potential for increased energy conversion efficiency and for high-temperature process heat applications, in addition to power generation and nuclear hydrogen production. While all High Temperature Gas Cooled Reactor (HTGR) concepts have sufficiently high temperatures to support process heat applications, such as desalination and cogeneration, the VHTGR's higher temperatures are suitable for particular applications such as thermochemical hydrogen production. However, high-temperature operation can be detrimental to safety following a loss-of-coolant accident (LOCA) initiated by pipe breaks caused by seismic or other events. Following the loss of coolant through the break and coolant depressurization, air from the containment will enter the core by molecular diffusion and ultimately by natural convection, leading to oxidation of the in-core graphite structures and fuel. The oxidation will release heat and accelerate the heatup of the reactor core. Thus, without effective countermeasures, a pipe break may lead to significant fuel damage and fission product release. The Idaho National Engineering and Environmental Laboratory (INEEL) has investigated this event for the past three years for the HTGR. However, neither the computer codes used, nor in fact any of the world's computer codes, have been sufficiently developed and validated to reliably predict this event. New code development, improvement of the existing codes, and experimental validation are imperative to narrow the uncertainty in the predictions of this type of accident. The objectives of this Korean/United States collaboration are to develop advanced computational methods for VHTGR safety analysis codes and to validate these computer codes.

  14. Optical image encryption based on real-valued coding and subtracting with the help of QR code

    Science.gov (United States)

    Deng, Xiaopeng

    2015-08-01

    A novel optical image encryption method based on real-valued coding and subtraction is proposed with the help of a quick response (QR) code. In the encryption process, the original image to be encoded is first transformed into the corresponding QR code, and the QR code is then encoded into two phase-only masks (POMs) using basic vector operations. Finally, the absolute values of the real or imaginary parts of the two POMs are chosen as the ciphertexts. In the decryption process, the QR code can be approximately restored by recording the intensity of the subtraction between the ciphertexts, and hence the original image can be retrieved without any quality loss by scanning the restored QR code with a smartphone. Simulation results and results collected with an actual smartphone show that the method is feasible and has strong tolerance to noise, phase difference and the ratio between the intensities of the two decryption light beams.
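    One standard way to encode a binary pattern into two phase-only masks with basic vector operations is to decompose each target amplitude into the sum of two unit phasors. The sketch below shows that decomposition and its exact recombination; it is illustrative only, and the paper's actual encoding and intensity-subtraction scheme may differ in detail.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Toy stand-in for a binary QR-code matrix (1 = bright module).
    qr = rng.integers(0, 2, size=(16, 16)).astype(float)

    # Any amplitude A with |A| <= 2 can be written as
    #   exp(i*p1) + exp(i*p2) = 2*cos(theta)*exp(i*phi) = A
    # with phi a free common phase and theta = arccos(A/2).
    phi = rng.uniform(0, 2 * np.pi, qr.shape)   # random common phase
    theta = np.arccos(qr / 2.0)                 # half-angle between phasors
    p1, p2 = phi + theta, phi - theta           # the two phase-only masks

    # Recombining the two POMs restores the QR amplitude exactly.
    restored = np.abs(np.exp(1j * p1) + np.exp(1j * p2))
    print(np.allclose(restored, qr))  # prints True
    ```

    Because the information lives entirely in the phases, each mask on its own looks like random noise, which is what gives the scheme its ciphertext character.
    
    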

  15. Development and validation of GWHEAD, a three-dimensional groundwater head computer code

    International Nuclear Information System (INIS)

    Beckmeyer, R.R.; Root, R.W.; Routt, K.R.

    1980-03-01

    A computer code has been developed to solve the groundwater flow equation in three dimensions. The code uses finite-difference approximations solved by the strongly implicit solution procedure. Input parameters to the code include hydraulic conductivity, specific storage, porosity, accretion (recharge), and initial hydraulic head. These parameters may vary spatially. The hydraulic conductivity may be input as isotropic or anisotropic. The boundaries either may permit flow across them or may be impermeable. The code has been used to model leaky confined groundwater conditions and spherical flow to a continuous point sink, both of which have exact analytical solutions. The results generated by the computer code compare well with those of the analytical solutions. The code was designed to model groundwater flow beneath fuel reprocessing and waste storage areas at the Savannah River Plant
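    A heavily simplified relative of such a code, a 2-D steady-state head solver on a uniform grid with fixed-head boundaries, can be sketched as follows. GWHEAD itself is 3-D, handles transient storage and spatially varying properties, and uses the strongly implicit procedure rather than the simple Jacobi iteration used here.

    ```python
    import numpy as np

    # Homogeneous, isotropic aquifer with no recharge: the head obeys
    # Laplace's equation, discretized by central finite differences on
    # a uniform grid and solved by Jacobi iteration.
    nx, ny = 21, 21
    h = np.zeros((ny, nx))
    x = np.linspace(10.0, 0.0, nx)   # exact linear head profile (m)
    h[:, 0] = 10.0                   # fixed head on the left boundary
    h[:, -1] = 0.0                   # fixed head on the right boundary
    h[0, :] = x                      # linear head along the top boundary
    h[-1, :] = x                     # and along the bottom boundary

    for _ in range(2000):            # Jacobi sweeps over interior nodes
        h[1:-1, 1:-1] = 0.25 * (h[2:, 1:-1] + h[:-2, 1:-1]
                                + h[1:-1, 2:] + h[1:-1, :-2])

    # With these boundaries the exact solution is the linear profile h(x).
    print(np.abs(h - x).max() < 1e-3)  # prints True
    ```

    Production codes replace the slow Jacobi iteration with implicit solvers precisely because convergence like this degrades rapidly as the grid is refined.
    
    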

  16. Compact Hilbert Curve Index Algorithm Based on Gray Code

    Directory of Open Access Journals (Sweden)

    CAO Xuefeng

    2016-12-01

    Full Text Available The Hilbert curve has the best clustering among the various kinds of space-filling curves and has been used as an important tool in discrete global grid spatial index design. However, there is a great deal of redundancy in the standard Hilbert curve index when the data set has large differences between dimensions. In this paper, the construction features of the Hilbert curve are analyzed based on Gray code, and a compact Hilbert curve index algorithm is put forward, in which the redundancy problem is avoided while the clustering of the Hilbert curve is preserved. Finally, experimental results show that the compact Hilbert curve index outperforms the standard Hilbert index: their computational complexity is nearly equivalent, but tests on a real data set show that coding time and storage space decrease by 40%, and the speedup ratio for sorting is nearly 4.3.
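    The binary-reflected Gray code on which such Hilbert curve constructions rest is compactly expressed as `i ^ (i >> 1)`; a minimal sketch:

    ```python
    def gray_encode(i: int) -> int:
        """Binary-reflected Gray code of i."""
        return i ^ (i >> 1)

    def gray_decode(g: int) -> int:
        """Inverse mapping: recover i from its Gray code."""
        i = 0
        while g:
            i ^= g
            g >>= 1
        return i

    codes = [gray_encode(i) for i in range(8)]
    print(codes)  # [0, 1, 3, 2, 6, 7, 5, 4]

    # Adjacent integers map to codes differing in exactly one bit --
    # the locality property Hilbert curve constructions exploit.
    assert all(bin(a ^ b).count("1") == 1 for a, b in zip(codes, codes[1:]))
    ```

    Each level of the Hilbert curve orders the 2^d sub-cells by such a Gray code, so consecutive curve segments always cross into an adjacent cell.
    
    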

  17. An Infrastructure for UML-Based Code Generation Tools

    Science.gov (United States)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance as a means to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach and uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models and also performs weaving of aspects that have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  18. A restructuring proposal based on MELCOR for severe accident analysis code development

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sun Hee; Song, Y. M.; Kim, D. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    In order to develop a template based on the existing MELCOR code, the data saving and transfer methods currently used in MELCOR are addressed first. A naming convention for the constructed modules is then suggested, and an automatic program to convert old variables into new derived-type variables has been developed. Finally, a restructured module for the SPR package has been developed and applied to MELCOR. The current MELCOR code reserves fixed-size storage for four different data types and manages variable-sized data within the storage limit by storing the data in stacked packages, using pointers to identify the variables between the packages. This technique makes the meaning of the variables difficult to grasp and wastes memory. New features of FORTRAN 90, however, make it possible to allocate storage dynamically and to use user-defined data types, which led to the development of a restructured module for the SPR package. The developed module allows efficient memory handling and makes the code easier to understand. The template has been validated by comparing the results of the modified code with those of the existing code, and it is confirmed that the results are the same. The template for the SPR package suggested in this report points toward the extension of the template to the entire code. It is expected that the template will accelerate the code domestication thanks to a direct understanding of each variable and easy implementation of modified or newly developed models. 3 refs., 15 figs., 16 tabs. (Author)

  19. Views of Evidence-Based Practice: Social Workers' Code of Ethics and Accreditation Standards as Guides for Choice

    Science.gov (United States)

    Gambrill, Eileen

    2007-01-01

    Different views of evidence-based practice (EBP) include defining it as the use of empirically-validated treatments and practice guidelines (i.e., the EBPs approach) in contrast to the broad philosophy and related evolving process described by the originators. Social workers can draw on their code of ethics and accreditation standards both to…

  20. Experimental benchmark of non-local-thermodynamic-equilibrium plasma atomic physics codes; Validation experimentale des codes de physique atomique des plasmas hors equilibre thermodynamique local

    Energy Technology Data Exchange (ETDEWEB)

    Nagels-Silvert, V

    2004-09-15

    The main purpose of this thesis is to obtain experimental data for the testing and validation of atomic physics codes dealing with non-local-thermodynamic-equilibrium plasmas. The first part is dedicated to the spectroscopic study of xenon and krypton plasmas produced by a nanosecond laser pulse interacting with a gas jet. A Thomson scattering diagnostic has allowed us to measure independently plasma parameters such as the electron temperature, the electron density and the average ionisation state. We have obtained time-integrated spectra in the range between 5 and 10 angstroms. We have identified about one hundred xenon lines between 8.6 and 9.6 angstroms using the Relac code, and discovered previously unreported krypton lines between 5.2 and 7.5 angstroms. In a second experiment we extended the wavelength range to the XUV domain. The Averroes/Transpec code has been tested in the ranges from 9 to 15 angstroms and from 10 to 130 angstroms; the first range was well reproduced, while the second requires a more complex data analysis. The second part is dedicated to the spectroscopic study of aluminium, selenium and samarium plasmas on femtosecond time scales. We have designed a frequency-domain interferometry diagnostic that allowed us to measure the expansion speed of the rear surface of the target; via an adequate isothermal model, this parameter gives access to the plasma electron temperature. Spectra and emission times of various lines from the aluminium and selenium plasmas have been computed satisfactorily with the Averroes/Transpec code coupled to the Film and Multif hydrodynamic codes. (A.C.)

  2. Four Year-Olds Use Norm-Based Coding for Face Identity

    Science.gov (United States)

    Jeffery, Linda; Read, Ainsley; Rhodes, Gillian

    2013-01-01

    Norm-based coding, in which faces are coded as deviations from an average face, is an efficient way of coding visual patterns that share a common structure and must be distinguished by subtle variations that define individuals. Adults and school-aged children use norm-based coding for face identity but it is not yet known if pre-school aged…

  3. A predictive coding account of bistable perception - a model-based fMRI study.

    Directory of Open Access Journals (Sweden)

    Veith Weilnhammer

    2017-05-01

    Full Text Available In bistable vision, subjective perception wavers between two interpretations of a constant ambiguous stimulus. This dissociation between conscious perception and sensory stimulation has motivated various empirical studies on the neural correlates of bistable perception, but the neurocomputational mechanism behind endogenous perceptual transitions has remained elusive. Here, we recurred to a generic Bayesian framework of predictive coding and devised a model that casts endogenous perceptual transitions as a consequence of prediction errors emerging from residual evidence for the suppressed percept. Data simulations revealed close similarities between the model's predictions and key temporal characteristics of perceptual bistability, indicating that the model was able to reproduce bistable perception. Fitting the predictive coding model to behavioural data from an fMRI-experiment on bistable perception, we found a correlation across participants between the model parameter encoding perceptual stabilization and the behaviourally measured frequency of perceptual transitions, corroborating that the model successfully accounted for participants' perception. Formal model comparison with established models of bistable perception based on mutual inhibition and adaptation, noise or a combination of adaptation and noise was used for the validation of the predictive coding model against the established models. Most importantly, model-based analyses of the fMRI data revealed that prediction error time-courses derived from the predictive coding model correlated with neural signal time-courses in bilateral inferior frontal gyri and anterior insulae. Voxel-wise model selection indicated a superiority of the predictive coding model over conventional analysis approaches in explaining neural activity in these frontal areas, suggesting that frontal cortex encodes prediction errors that mediate endogenous perceptual transitions in bistable perception. 

  4. A predictive coding account of bistable perception - a model-based fMRI study.

    Science.gov (United States)

    Weilnhammer, Veith; Stuke, Heiner; Hesselmann, Guido; Sterzer, Philipp; Schmack, Katharina

    2017-05-01

    In bistable vision, subjective perception wavers between two interpretations of a constant ambiguous stimulus. This dissociation between conscious perception and sensory stimulation has motivated various empirical studies on the neural correlates of bistable perception, but the neurocomputational mechanism behind endogenous perceptual transitions has remained elusive. Here, we resorted to a generic Bayesian framework of predictive coding and devised a model that casts endogenous perceptual transitions as a consequence of prediction errors emerging from residual evidence for the suppressed percept. Data simulations revealed close similarities between the model's predictions and key temporal characteristics of perceptual bistability, indicating that the model was able to reproduce bistable perception. Fitting the predictive coding model to behavioural data from an fMRI experiment on bistable perception, we found a correlation across participants between the model parameter encoding perceptual stabilization and the behaviourally measured frequency of perceptual transitions, corroborating that the model successfully accounted for participants' perception. The predictive coding model was further validated by formal comparison with established models of bistable perception based on mutual inhibition and adaptation, noise, or a combination of adaptation and noise. Most importantly, model-based analyses of the fMRI data revealed that prediction error time-courses derived from the predictive coding model correlated with neural signal time-courses in bilateral inferior frontal gyri and anterior insulae. Voxel-wise model selection indicated a superiority of the predictive coding model over conventional analysis approaches in explaining neural activity in these frontal areas, suggesting that frontal cortex encodes prediction errors that mediate endogenous perceptual transitions in bistable perception. Taken together, our current work …

  5. Fractal Image Coding Based on a Fitting Surface

    Directory of Open Access Journals (Sweden)

    Sheng Bi

    2014-01-01

    Full Text Available A no-search fractal image coding method based on a fitting surface is proposed. In our research, an improved gray-level transform with a fitting surface is introduced. One advantage of this method is that the fitting surface is used for both the range and domain blocks, so one set of parameters can be saved. Another advantage is that the fitting surface can approximate the range and domain blocks better than the previous fitting planes; this results in smaller block-matching errors and better decoded image quality. Since the no-search and quadtree techniques are adopted, smaller matching errors also imply fewer block matches, which yields a faster encoding process. Moreover, by combining all the fitting surfaces, a fitting surface image (FSI) is also proposed to speed up the fractal decoding. Experiments show that the proposed method yields superior performance over the other three methods. Relative to the range-averaged image, the FSI provides a faster fractal decoding process. Finally, by combining the proposed fractal coding method with JPEG, a hybrid coding method is designed which provides a higher PSNR than JPEG while maintaining the same bpp.
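
    The core of the method above is fitting a low-order surface to each block and matching on the residuals. A minimal sketch of that idea, assuming a bilinear surface z = a + b·x + c·y + d·x·y fitted by least squares (the function name and surface order here are illustrative, not the paper's exact formulation):

```python
import numpy as np

def fit_surface(block):
    """Least-squares fit of a bilinear surface a + b*x + c*y + d*x*y
    to an image block; returns the coefficients and the fitted surface."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), xs.ravel(), ys.ravel(),
                         (xs * ys).ravel()]).astype(float)
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    return coeffs, (A @ coeffs).reshape(h, w)

block = np.arange(16, dtype=float).reshape(4, 4)  # a perfectly planar block
coeffs, surface = fit_surface(block)
residual = block - surface  # range/domain matching is done on the residuals
```

    Because the same surface form serves both range and domain blocks, only one parameter set per block pair needs to be stored, which is the saving the abstract describes.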

  6. Impact of Different Spreading Codes Using FEC on DWT Based MC-CDMA System

    OpenAIRE

    Masum, Saleh; Kabir, M. Hasnat; Islam, Md. Matiqul; Shams, Rifat Ara; Ullah, Shaikh Enayet

    2012-01-01

    The effect of different spreading codes in DWT based MC-CDMA wireless communication system is investigated. In this paper, we present the Bit Error Rate (BER) performance of different spreading codes (Walsh-Hadamard code, Orthogonal gold code and Golay complementary sequences) using Forward Error Correction (FEC) of the proposed system. The data is analyzed and is compared among different spreading codes in both coded and uncoded cases. It is found via computer simulation that the performance...

  7. MARS-KS code validation activity through the atlas domestic standard problem

    International Nuclear Information System (INIS)

    Choi, K. Y.; Kim, Y. S.; Kang, K. H.; Park, H. S.; Cho, S.

    2012-01-01

    The 2nd Domestic Standard Problem (DSP-02) exercise using the ATLAS integral effect test data was executed to transfer the integral effect test data to domestic nuclear industries and to contribute to improving the safety analysis methodology for PWRs. A small-break loss-of-coolant accident with a 6-inch break at the cold leg was determined as the target scenario in consideration of its technical importance and of interests expressed by participants. Ten calculation results using the MARS-KS code were collected; the major prediction results were described qualitatively, and the code prediction accuracy was assessed quantitatively using the FFTBM. In addition, special code assessment activities were carried out to identify the areas where model improvement is required in the MARS-KS code. The lessons from DSP-02 and recommendations to the code developers are described in this paper. (authors)
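
    The FFTBM mentioned above quantifies prediction accuracy as the average amplitude (AA) of the FFT of the calculation error, normalized by the FFT of the experimental signal. A minimal sketch of that figure of merit, assuming the standard AA definition (the signals below are illustrative, not DSP-02 data):

```python
import numpy as np

def fftbm_average_amplitude(exp_signal, calc_signal):
    """FFTBM figure of merit AA = sum|F(calc - exp)| / sum|F(exp)|;
    lower AA means better agreement with the experiment."""
    exp_signal = np.asarray(exp_signal, dtype=float)
    calc_signal = np.asarray(calc_signal, dtype=float)
    err_spectrum = np.abs(np.fft.rfft(calc_signal - exp_signal))
    exp_spectrum = np.abs(np.fft.rfft(exp_signal))
    return err_spectrum.sum() / exp_spectrum.sum()

t = np.linspace(0.0, 10.0, 512)
measured = np.exp(-0.3 * t)     # measured pressure trend (illustrative)
predicted = 1.05 * measured     # code prediction running 5% high
aa = fftbm_average_amplitude(measured, predicted)  # 0.05
```

    A uniform 5% overprediction gives AA = 0.05; commonly cited acceptability thresholds for AA in the FFTBM literature are on the order of a few tenths.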

  8. Performance enhancement of successive interference cancellation scheme based on spectral amplitude coding for optical code-division multiple-access systems using Hadamard codes

    Science.gov (United States)

    Eltaif, Tawfig; Shalaby, Hossam M. H.; Shaari, Sahbudin; Hamarsheh, Mohammad M. N.

    2009-04-01

    A successive interference cancellation scheme is applied to optical code-division multiple-access (OCDMA) systems with spectral amplitude coding (SAC). A detailed analysis of this system, with Hadamard codes used as signature sequences, is presented. The system can easily remove the effect of the strongest signal at each stage of the cancellation process. In addition, a simulation of the proposed system is performed in order to validate the theoretical results. The system shows a small bit error rate for a large number of active users compared to the SAC OCDMA system. Our results reveal that the proposed system is efficient in eliminating the effect of multiple-user interference and in enhancing the overall performance.
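
    The fixed in-phase cross-correlation of Hadamard signature sequences is what makes stage-by-stage cancellation tractable. A sketch of the code construction and that property, using the Sylvester construction (variable names illustrative):

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# 0/1 signature sequences: drop the all-ones row, map -1 -> 0
H = hadamard(8)
codes = (H[1:] + 1) // 2

weights = codes.sum(axis=1)    # every code has weight n/2 = 4
overlap = codes[0] @ codes[1]  # any two distinct codes overlap in n/4 = 2 chips
```

    The constant pairwise overlap of n/4 chips means each interfering user contributes the same known amount, which is what the cancellation stages subtract out.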

  9. Development and Application of a Code for Internal Exposure (CINEX) based on the CINDY code

    International Nuclear Information System (INIS)

    Kravchik, T.; Duchan, N.; Sarah, R.; Gabay, Y.; Kol, R.

    2004-01-01

    Internal exposure to radioactive materials at the NRCN is evaluated using the CINDY (Code for Internal Dosimetry) package. The code was developed by the Pacific Northwest Laboratory to assist in the interpretation of bioassay data, provide bioassay projections, and evaluate committed and calendar-year doses from intake or bioassay measurement data. It provides capabilities to calculate organ doses and effective dose equivalents using the International Commission on Radiological Protection (ICRP) 30 approach. The CINDY code operates under the DOS operating system, so running it requires a relatively long procedure with a great deal of manual typing, which invites human error. A new code has been developed at the NRCN, the CINEX (Code for Internal Exposure), which is an Excel application and leads to a significant reduction in calculation time (on the order of 5-10 times) and in the risk of human error. The code uses a database containing tables which were constructed with CINDY and contain the bioassay values predicted by the ICRP 30 model after an intake of a unit activity of each isotope. Using the database, the code then calculates the appropriate intake and, consequently, the committed effective dose and organ doses. Calculations with the CINEX code were compared to similar calculations with the CINDY code. The discrepancies were less than 5%, which is the rounding error of the CINDY code. Attached is a table which compares parameters calculated with the CINEX and CINDY codes (for class Y uranium). The CINEX is now used at the NRCN to calculate occupational intakes and doses for workers handling radioactive materials
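
    The calculation pattern described, dividing a bioassay measurement by the model-predicted value per unit intake and scaling by a dose coefficient, can be sketched as follows. All numeric values below are hypothetical placeholders, not ICRP 30 data:

```python
# Database table (as built from CINDY): bioassay quantity predicted per unit
# intake, keyed by days since intake -- hypothetical values for illustration.
predicted_per_unit_intake = {7: 1.2e-3, 30: 4.5e-4, 90: 9.0e-5}

measured = 0.9e-3  # bioassay result (Bq) taken 7 days after intake
days = 7

intake = measured / predicted_per_unit_intake[days]  # Bq
dose_coefficient = 3.5e-6  # committed effective dose per unit intake (Sv/Bq), hypothetical
committed_dose = intake * dose_coefficient  # Sv
```

    Tabulating the per-unit-intake predictions once (as the CINEX database does) reduces the per-case evaluation to this simple division and multiplication.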

  10. Code Development on Fission Product Behavior under Severe Accident-Validation of Aerosol Sedimentation

    International Nuclear Information System (INIS)

    Ha, Kwang Soon; Kim, Sung Il; Jang, Jin Sung; Kim, Dong Ha

    2016-01-01

    The gas and aerosol phases of the radioactive materials move through the reactor coolant systems and containments as loaded on the carrier gas or liquid, such as steam or water. Most radioactive materials might escape in the form of aerosols from a nuclear power plant during a severe reactor accident, and it is very important to predict the behavior of these radioactive aerosols in the reactor cooling system and in the containment building under severe accident conditions. Aerosols are very small solid particles or liquid droplets suspended in a gas phase. The suspended solid or liquid particles typically have sizes in the range of 0.01 µm to 20 µm. Aerosol concentrations in reactor accident analyses are typically less than 100 g/m³ and usually less than 1 g/m³. When there are continuing sources of aerosol to the gas phase, or when there are complicated processes involving engineered safety features, much more complicated size distributions develop. It is not uncommon for aerosols in reactor containments to have bimodal size distributions for at least some significant periods of time early during an accident. Salient features of aerosol physics under reactor accident conditions that affect the nature of the aerosols are (1) the formation of aerosol particles, (2) the growth of aerosol particles, and (3) the shape of aerosol particles. At KAERI, a fission product module has been developed to predict the behavior of the radioactive materials in the reactor coolant system under severe accident conditions. The fission product module consists of an estimation of the initial inventories, species release from the core, aerosol generation, gas transport, and aerosol transport. The final outcomes of the fission product module give the radioactive gas and aerosol distribution in the reactor coolant system. The aerosol sedimentation models in the fission product module were validated using the ABCOVE and LACE experiments. There were some discrepancies in the predicted
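
    Sedimentation of such aerosols is governed, in the simplest regime, by the Stokes terminal settling velocity, which scales with the square of the particle diameter. A minimal sketch of that physics (plain Stokes law with illustrative values, not the KAERI module's actual models):

```python
def stokes_settling_velocity(d, rho_p, mu=1.8e-5, rho_g=1.2, g=9.81):
    """Terminal settling velocity (m/s) of a sphere of diameter d (m) and
    density rho_p (kg/m^3) in gas: v = (rho_p - rho_g) * g * d**2 / (18 * mu).
    Valid only in the Stokes (low Reynolds number) regime."""
    return (rho_p - rho_g) * g * d ** 2 / (18.0 * mu)

v_small = stokes_settling_velocity(0.01e-6, 3000.0)  # 0.01 um particle
v_large = stokes_settling_velocity(20e-6, 3000.0)    # 20 um particle
ratio = v_large / v_small                            # (20 / 0.01)^2 = 4e6
```

    The d² scaling is why the 0.01-20 µm size range quoted above spans settling behaviors differing by more than six orders of magnitude, and why size distribution matters so much to sedimentation models.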

  11. Code Development on Fission Product Behavior under Severe Accident-Validation of Aerosol Sedimentation

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Kwang Soon; Kim, Sung Il; Jang, Jin Sung; Kim, Dong Ha [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    The gas and aerosol phases of the radioactive materials move through the reactor coolant systems and containments as loaded on the carrier gas or liquid, such as steam or water. Most radioactive materials might escape in the form of aerosols from a nuclear power plant during a severe reactor accident, and it is very important to predict the behavior of these radioactive aerosols in the reactor cooling system and in the containment building under severe accident conditions. Aerosols are very small solid particles or liquid droplets suspended in a gas phase. The suspended solid or liquid particles typically have sizes in the range of 0.01 µm to 20 µm. Aerosol concentrations in reactor accident analyses are typically less than 100 g/m³ and usually less than 1 g/m³. When there are continuing sources of aerosol to the gas phase, or when there are complicated processes involving engineered safety features, much more complicated size distributions develop. It is not uncommon for aerosols in reactor containments to have bimodal size distributions for at least some significant periods of time early during an accident. Salient features of aerosol physics under reactor accident conditions that affect the nature of the aerosols are (1) the formation of aerosol particles, (2) the growth of aerosol particles, and (3) the shape of aerosol particles. At KAERI, a fission product module has been developed to predict the behavior of the radioactive materials in the reactor coolant system under severe accident conditions. The fission product module consists of an estimation of the initial inventories, species release from the core, aerosol generation, gas transport, and aerosol transport. The final outcomes of the fission product module give the radioactive gas and aerosol distribution in the reactor coolant system. The aerosol sedimentation models in the fission product module were validated using the ABCOVE and LACE experiments. There were some discrepancies in the predicted

  12. Chiari malformation Type I surgery in pediatric patients. Part 1: validation of an ICD-9-CM code search algorithm.

    Science.gov (United States)

    Ladner, Travis R; Greenberg, Jacob K; Guerrero, Nicole; Olsen, Margaret A; Shannon, Chevis N; Yarbrough, Chester K; Piccirillo, Jay F; Anderson, Richard C E; Feldstein, Neil A; Wellons, John C; Smyth, Matthew D; Park, Tae Sung; Limbrick, David D

    2016-05-01

    OBJECTIVE Administrative billing data may facilitate large-scale assessments of treatment outcomes for pediatric Chiari malformation Type I (CM-I). Validated International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) code algorithms for identifying CM-I surgery are critical prerequisites for such studies but are currently only available for adults. The objective of this study was to validate two ICD-9-CM code algorithms using hospital billing data to identify pediatric patients undergoing CM-I decompression surgery. METHODS The authors retrospectively analyzed the validity of two ICD-9-CM code algorithms for identifying pediatric CM-I decompression surgery performed at 3 academic medical centers between 2001 and 2013. Algorithm 1 included any discharge diagnosis code of 348.4 (CM-I), as well as a procedure code of 01.24 (cranial decompression) or 03.09 (spinal decompression or laminectomy). Algorithm 2 restricted this group to the subset of patients with a primary discharge diagnosis of 348.4. The positive predictive value (PPV) and sensitivity of each algorithm were calculated. RESULTS Among 625 first-time admissions identified by Algorithm 1, the overall PPV for CM-I decompression was 92%. Among the 581 admissions identified by Algorithm 2, the PPV was 97%. The PPV for Algorithm 1 was lower in one center (84%) compared with the other centers (93%-94%), whereas the PPV of Algorithm 2 remained high (96%-98%) across all subgroups. The sensitivity of Algorithms 1 (91%) and 2 (89%) was very good and remained so across subgroups (82%-97%). CONCLUSIONS An ICD-9-CM algorithm requiring a primary diagnosis of CM-I has excellent PPV and very good sensitivity for identifying CM-I decompression surgery in pediatric patients. These results establish a basis for utilizing administrative billing data to assess pediatric CM-I treatment outcomes.
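
    The PPV and sensitivity reported above follow the standard confusion-matrix definitions; a minimal sketch (the counts below are illustrative, not the study's raw data):

```python
def ppv(true_pos, false_pos):
    """Positive predictive value: fraction of algorithm-flagged admissions
    that truly are CM-I decompressions."""
    return true_pos / (true_pos + false_pos)

def sensitivity(true_pos, false_neg):
    """Fraction of true CM-I decompressions that the algorithm flags."""
    return true_pos / (true_pos + false_neg)

# e.g. 575 of 625 flagged admissions confirmed -> PPV = 0.92
round(ppv(575, 50), 2)
round(sensitivity(575, 57), 2)  # ~0.91 if 57 true cases were missed
```

    Note the trade-off visible in the abstract: restricting to a primary diagnosis (Algorithm 2) raises PPV by discarding false positives, at the cost of a slightly lower sensitivity.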

  13. RAZORBACK - A Research Reactor Transient Analysis Code Version 1.0 - Volume 3: Verification and Validation Report.

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G.

    2017-04-01

    This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.
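
    The point reactor kinetics equations at the heart of such a solver can be illustrated with a one-delayed-group, explicit-Euler sketch (parameter values are illustrative; Razorback's actual numerics, coupling, and data are more elaborate):

```python
# One-delayed-group point kinetics:
#   dn/dt = ((rho - beta) / Lambda) * n + lam * C
#   dC/dt = (beta / Lambda) * n - lam * C
beta, Lambda, lam = 0.0065, 1.0e-4, 0.08  # delayed fraction, generation time (s), decay const (1/s)
rho = 0.0                                 # exactly critical

n = 1.0                                   # relative neutron population
C = beta / (Lambda * lam)                 # equilibrium precursor concentration
dt = 1.0e-5
for _ in range(50_000):
    dn = ((rho - beta) / Lambda * n + lam * C) * dt
    dC = (beta / Lambda * n - lam * C) * dt
    n, C = n + dn, C + dC
# at criticality with equilibrium precursors, n stays at 1.0
```

    Verifying that a critical, equilibrium initial condition stays steady is a typical first step in the kind of V&V campaign the report describes, before exercising reactivity transients.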

  14. VALIDATION OF FULL CORE GEOMETRY MODEL OF THE NODAL3 CODE IN THE PWR TRANSIENT BENCHMARK PROBLEMS

    Directory of Open Access Journals (Sweden)

    Tagor Malem Sembiring

    2015-10-01

    Full Text Available ABSTRACT VALIDATION OF FULL CORE GEOMETRY MODEL OF THE NODAL3 CODE IN THE PWR TRANSIENT BENCHMARK PROBLEMS. The coupled neutronic and thermal-hydraulic (T/H) code, NODAL3, has been validated against several PWR static benchmarks and the NEACRP PWR transient benchmark cases. However, the NODAL3 code has not yet been validated against the transient benchmark cases involving ejection of a control rod (CR) assembly at the core periphery using a full-core geometry model, the C1 and C2 cases. This work validates the accuracy of the NODAL3 code for a single CR ejection and for ejection of an unsymmetrical group of CRs. The NODAL3 calculations were carried out with the adiabatic method (AM) and the improved quasistatic method (IQS). All transient parameters calculated by the NODAL3 code were compared with the reference results of the PANTHER code. The maximum relative difference, 16%, occurs in the calculated time of maximum power when using the IQS method, while the relative difference of the AM method is 4% for the C2 case. The NODAL3 results show no systematic differences, indicating that the neutronic and T/H modules adopted in the code are correct. Therefore, the NODAL3 results are in very good agreement with the reference results. Keywords: nodal method, coupled neutronic and thermal-hydraulic code, PWR, transient case, control rod ejection.

  15. Validation of Advanced Computer Codes for VVER Technology: LB-LOCA Transient in PSB-VVER Facility

    Directory of Open Access Journals (Sweden)

    A. Del Nevo

    2012-01-01

    Full Text Available The OECD/NEA PSB-VVER project provided unique and useful experimental data for code validation from the PSB-VVER test facility. This facility represents the scaled-down layout of the Russian-designed pressurized water reactor, namely the VVER-1000. Five experiments were executed, dealing with loss-of-coolant scenarios (small, intermediate, and large break loss-of-coolant accidents), a primary-to-secondary leak, and a parametric study (a natural circulation test) aimed at characterizing the VVER system at reduced mass inventory conditions. The comparative analysis presented in the paper concerns the large break loss-of-coolant accident experiment. Four participants from three different institutions were involved in the benchmark and applied their own models and setups for four different thermal-hydraulic system codes. The benchmark demonstrated the performance of these codes in predicting phenomena relevant to safety on the basis of fixed criteria.

  16. Acceptance and validation test report for HANSF code version 1.3.2

    International Nuclear Information System (INIS)

    PIEPHO, M.G.

    2001-01-01

    The HANSF code, version 1.3.2, is a stand-alone code that runs only in DOS. As a result, it runs on any Windows platform, since each Windows platform can create a DOS-prompt window and execute HANSF in that window. The HANSF code is proprietary to Fauske and Associates, Inc. (FAI) of Burr Ridge, IL, the developers of the code. The SNF Project has a license from FAI to run the HANSF code on any computer, but only for work related to the SNF Project. The SNF Project owns the MCO.FOR routine, which is the main routine in HANSF for CVDF applications. The HANSF code calculates physical variables such as temperature, pressure, and oxidation rates due to chemical reactions of uranium metal/fuel with water or oxygen. The code is used by the Spent Nuclear Fuel (SNF) Project at Hanford; for example, in the report Thermal Analysis of Cold Vacuum Drying of Spent Nuclear Fuel (HNF-SD-SNF-CN-023). The primary facilities of interest are the K-Basins, the Cold Vacuum Drying Facility (CVDF), the Canister Storage Building (CSB), and T Plant. The overall summary is presented in Section 2.0, variances in Section 3.0, the comprehensive assessment in Section 4.0, results in Section 5.0, the evaluation in Section 6.0, and the summary of activities in Section 7.0

  17. Local coding based matching kernel method for image classification.

    Directory of Open Access Journals (Sweden)

    Yan Song

    Full Text Available This paper mainly focuses on how to effectively and efficiently measure visual similarity for local feature based representation. Among existing methods, metrics based on Bag of Visual Word (BoV techniques are efficient and conceptually simple, at the expense of effectiveness. By contrast, kernel based metrics are more effective, but at the cost of greater computational complexity and increased storage requirements. We show that a unified visual matching framework can be developed to encompass both BoV and kernel based metrics, in which local kernel plays an important role between feature pairs or between features and their reconstruction. Generally, local kernels are defined using Euclidean distance or its derivatives, based either explicitly or implicitly on an assumption of Gaussian noise. However, local features such as SIFT and HoG often follow a heavy-tailed distribution which tends to undermine the motivation behind Euclidean metrics. Motivated by recent advances in feature coding techniques, a novel efficient local coding based matching kernel (LCMK method is proposed. This exploits the manifold structures in Hilbert space derived from local kernels. The proposed method combines advantages of both BoV and kernel based metrics, and achieves a linear computational complexity. This enables efficient and scalable visual matching to be performed on large scale image sets. To evaluate the effectiveness of the proposed LCMK method, we conduct extensive experiments with widely used benchmark datasets, including 15-Scenes, Caltech101/256, PASCAL VOC 2007 and 2011 datasets. Experimental results confirm the effectiveness of the relatively efficient LCMK method.

  18. Visual Coding of Human Bodies: Perceptual Aftereffects Reveal Norm-Based, Opponent Coding of Body Identity

    Science.gov (United States)

    Rhodes, Gillian; Jeffery, Linda; Boeing, Alexandra; Calder, Andrew J.

    2013-01-01

    Despite the discovery of body-selective neural areas in occipitotemporal cortex, little is known about how bodies are visually coded. We used perceptual adaptation to determine how body identity is coded. Brief exposure to a body (e.g., anti-Rose) biased perception toward an identity with opposite properties (Rose). Moreover, the size of this…

  19. PetriCode: A Tool for Template-Based Code Generation from CPN Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Code generation is an important part of model driven methodologies. In this paper, we present PetriCode, a software tool for generating protocol software from a subclass of Coloured Petri Nets (CPNs). The CPN subclass is comprised of hierarchical CPN models describing a protocol system at different...

  20. Convolutional Code Based PAPR Reduction Scheme for Multicarrier Transmission with Higher Number of Subcarriers

    Directory of Open Access Journals (Sweden)

    SAJJAD ALIMEMON

    2017-10-01

    Full Text Available Multicarrier transmission has become a prominent technique in high-speed wireless communication systems, owing to its frequency diversity, small inter-symbol interference in multipath fading channels, simple equalizer structure, and high bandwidth efficiency. Nevertheless, in the time domain the multicarrier transmission signal has a high PAPR (Peak-to-Average Power Ratio), which translates to low power amplifier efficiency. To decrease the PAPR, a CCSLM (Convolutional Code Selective Mapping) scheme for multicarrier transmission with a high number of subcarriers is proposed in this paper. The proposed scheme is based on the SLM method and employs an interleaver and convolutional coding. Related works on PAPR reduction have considered either 128 or 256 subcarriers; however, the PAPR of a multicarrier transmission signal increases as the number of subcarriers increases. The proposed method achieves significant PAPR reduction for a higher number of subcarriers as well as better power amplifier efficiency. Simulation outcomes validate the usefulness of the proposed scheme.
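
    The PAPR metric itself, and its tendency to grow with the number of subcarriers, can be demonstrated directly. A minimal sketch (BPSK subcarriers and 4x oversampling are illustrative choices, not the paper's exact setup):

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    power = np.abs(x) ** 2
    return 10.0 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(1)
for n_sc in (128, 256, 1024):
    symbols = rng.choice([-1.0, 1.0], size=n_sc)  # BPSK on each subcarrier
    signal = np.fft.ifft(symbols, n=4 * n_sc)     # 4x-oversampled multicarrier symbol
    print(n_sc, "subcarriers -> PAPR", round(papr_db(signal), 2), "dB")
```

    Schemes like SLM work by generating several candidate phase-rotated versions of each symbol and transmitting the one whose papr_db value is lowest.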

  1. Validation of thermohydraulic codes by comparison of experimental results with computer simulations

    International Nuclear Information System (INIS)

    Madeira, A.A.; Galetti, M.R.S.; Pontedeiro, A.C.

    1989-01-01

    The results obtained by simulating three cases from the CANON depressurization experiment, using the TRAC-PF1 computer code, version 7.6, installed on the VAX-11/750 computer of the Brazilian CNEN, are presented. The CANON experiment was chosen as the first thermo-hydraulics standard problem to be discussed at ENFIR, for comparing results from different computer codes with results obtained experimentally. The ability of the TRAC-PF1 code to predict the depressurization phase of a loss of primary coolant accident in pressurized water reactors is evaluated. (M.C.K.) [pt

  2. On-going activities in the European JASMIN project for the development and validation of ASTEC-Na SFR safety simulation code - 15072

    International Nuclear Information System (INIS)

    Girault, N.; Cloarec, L.; Herranz, L.; Bandini, G.; Perez-Martin, S.; Ammirabile, L.

    2015-01-01

    The 4-year JASMIN collaborative project (Joint Advanced Severe accidents Modelling and Integration for Na-cooled fast reactors) started in December 2011 within the 7th Framework Programme of the European Commission. It aims at developing a new European simulation code, ASTEC-Na, dealing with the primary phase of SFR core disruptive accidents. The development of a new code, based on a robust advanced simulation tool and able to encompass the in-vessel and in-containment phenomena occurring during a severe accident, is indeed of utmost interest for advanced and innovative future SFRs, for which an enhanced safety level will be required. This code, based on the ASTEC European code system developed by IRSN and GRS for severe accidents in water-cooled reactors, is progressively integrating and capitalizing the state-of-the-art knowledge of SFR accidents through the improvement of physical models or the development of new ones. New models are assessed against in-pile (CABRI, SCARABEE, etc.) and out-of-pile experiments conducted during the 1970s-80s, and code-to-code benchmarking against current accident simulation tools for SFRs is also conducted. During the first two and a half years of the project, model specifications and developments were carried out and the validation test matrix was built. The first version of ASTEC-Na, available in early 2014, already includes a thermal-hydraulics module able to simulate single- and two-phase sodium flow conditions, a zero-dimensional point neutronic model with a simple definition of channel and axial dependences of reactivity feedbacks, and models derived from the SCANAIR IRSN code for simulating fuel pin thermo-mechanical behaviour and fission gas release/retention. Meanwhile, models have been developed in the source term area for in-containment particle generation and particle chemical transformation, but their implementation is still to be done. As a first validation step, the ASTEC-Na calculations were satisfactorily compared to thermal-hydraulics experimental

  3. Nuclear component design ontology building based on ASME codes

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan

    2005-01-01

    The adoption of ontology analysis in the study of concept knowledge acquisition and representation for the nuclear component design process, based on computer-supported cooperative work (CSCW), makes it possible to share and reuse the numerous concept knowledge of multi-disciplinary domains. A practical ontology building method is accordingly proposed, based on the Protege knowledge model in combination with both top-down and bottom-up approaches together with Formal Concept Analysis (FCA). FCA exhibits its advantages in the way it helps establish and improve the taxonomic hierarchy of concepts and resolve concept conflicts that occur when modeling multi-disciplinary domains. With Protege-3.0 as the ontology building tool, a nuclear component design ontology based on ASME codes is developed using the proposed method. The ontology serves as the basis for concept knowledge sharing and reuse in nuclear component design. (authors)

  4. Verification and Validation of Monte Carlo n-Particle Code 6 (MCNP6) with Neutron Protection Factor Measurements of an Iron Box

    Science.gov (United States)

    2014-03-27

    Vehicle Code System (VCS), the Monte Carlo Adjoint SHielding (MASH) code, and the Monte Carlo N-Particle (MCNP) code. Of the three, the oldest and still most widely utilized radiation transport code is MCNP. First created at Los Alamos National Laboratory (LANL) in 1957, the code simulated neutral… particle types, and previous versions of MCNP were repeatedly validated using both simple and complex geometries [12, 13]. Much greater discussion and

  5. A 3D transport-based core analysis code for research reactors with unstructured geometry

    International Nuclear Information System (INIS)

    Zhang, Tengfei; Wu, Hongchun; Zheng, Youqi; Cao, Liangzhi; Li, Yunzhao

    2013-01-01

    Highlights: • A core analysis code package based on 3D neutron transport calculation in complex geometry is developed. • Fine treatments of flux mapping, control rod effects, and isotope depletion are modeled. • The code is shown to be highly accurate and capable of handling flexible operational cases for research reactors. - Abstract: As an effort to enhance the accuracy of simulating the operation of research reactors, a 3D transport core analysis code system named REFT was developed. HELIOS is employed for its flexibility in describing complex geometry. A 3D triangular nodal SN transport solver, DNTR, gives the package the capability of modeling cores with unstructured-geometry assemblies. A series of dedicated methods was introduced to meet the requirements of research reactor simulations. Afterwards, to make it more user-friendly, a graphical user interface was also developed for REFT. In order to validate the developed code system, the calculated results were compared with experimental results. The numerical and experimental results are in close agreement with each other, with relative errors of k-eff less than 0.5%. Results of depletion calculations were also verified by comparing them with experimental data, and acceptable consistency was observed

  6. LSB-Based Steganography Using Reflected Gray Code

    Science.gov (United States)

    Chen, Chang-Chu; Chang, Chin-Chen

    Steganography aims to hide secret data in an innocuous cover medium for transmission, so that an attacker cannot easily recognize the presence of the secret data. Even if the stego-medium is captured by an eavesdropper, the slight distortion is hard to detect. LSB-based data hiding is one of the steganographic methods, used to embed secret data into the least significant bits of the pixel values of a cover image. In this paper, we propose an LSB-based scheme using reflected Gray code, which can be applied to determine the embedded bit from the secret information. Following the transforming rule, the LSBs of the stego-image are not always equal to the secret bits, and experiments show that the differences are up to almost 50%. According to mathematical deduction and experimental results, the proposed scheme has the same image quality and payload as the simple LSB substitution scheme. In fact, our proposed data hiding scheme in the case of the G1 (one-bit Gray code) system is equivalent to the simple LSB substitution scheme.
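
    The G1 idea can be sketched directly: the hidden bit is the LSB of the reflected Gray code of the pixel value, and flipping the pixel's lowest bit always flips that Gray-code LSB, so the distortion is at most ±1 per pixel, matching simple LSB substitution. (A sketch of the idea only; the paper's full scheme and notation may differ.)

```python
def gray(v):
    """Binary-reflected Gray code of v."""
    return v ^ (v >> 1)

def embed_bit(pixel, bit):
    """Return a pixel whose Gray-code LSB equals `bit`, changing the pixel
    by at most 1. Flipping the pixel's lowest bit suffices, because the
    Gray-code LSB is the XOR of the two lowest pixel bits."""
    return pixel if (gray(pixel) & 1) == bit else pixel ^ 1

def extract_bit(pixel):
    return gray(pixel) & 1

stego = embed_bit(100, 1)  # Gray LSB of 100 is 0, so the pixel becomes 101
recovered = extract_bit(stego)
```

    Because the stored bit is a function of two pixel bits rather than the raw LSB, the stego-image LSBs disagree with the secret bits about half the time, which is the ~50% difference the abstract mentions.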

  7. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments offer a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM, and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  8. Code cases for implementing risk-based inservice testing in the ASME OM code

    Energy Technology Data Exchange (ETDEWEB)

    Rowley, C.W.

    1996-12-01

Historically, inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly conservative scope for IST components is to use the PRA and plant expert panels to create a two-tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for each type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases is being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategies for each type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices.
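
The two-tier categorization described in the record above can be sketched as a simple decision rule combining the PRA's quantitative importance with the expert panel's deterministic judgment. The Fussell-Vesely importance measure and the 0.005 threshold below are illustrative assumptions for this sketch, not values taken from the OM Code Cases:

```python
# Toy two-tier IST categorization: a component is More Safety Significant
# (MSSC) if its PRA importance exceeds a threshold, or if the plant expert
# panel elevates it on deterministic grounds; otherwise it is Less Safety
# Significant (LSSC). The Fussell-Vesely measure and the 0.005 threshold
# are illustrative assumptions, not values from the OM Code Cases.
def categorize(fussell_vesely, panel_elevates, threshold=0.005):
    if fussell_vesely >= threshold or panel_elevates:
        return "MSSC"
    return "LSSC"

print(categorize(0.02, False))   # risk-significant by the PRA alone
print(categorize(0.001, True))   # elevated by the expert panel
print(categorize(0.001, False))  # remains LSSC
```

The blend is deliberately a logical OR: the panel can only elevate a component into the more-tested tier, never demote one the PRA flags as risk significant.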

  9. Code cases for implementing risk-based inservice testing in the ASME OM code

    International Nuclear Information System (INIS)

    Rowley, C.W.

    1996-01-01

Historically, inservice testing has been reasonably effective, but quite costly. Recent applications of plant PRAs to the scope of the IST program have demonstrated that of the 30 pumps and 500 valves in the typical plant IST program, less than half of the pumps and ten percent of the valves are risk significant. The way the ASME plans to tackle this overly conservative scope for IST components is to use the PRA and plant expert panels to create a two-tier IST component categorization scheme. The PRA provides the quantitative risk information and the plant expert panel blends the quantitative and deterministic information to place the IST component into one of two categories: More Safety Significant Component (MSSC) or Less Safety Significant Component (LSSC). With all the pumps and valves in the IST program placed in MSSC or LSSC categories, two different testing strategies will be applied. The testing strategies will be unique for each type of component, such as centrifugal pump, positive displacement pump, MOV, AOV, SOV, SRV, PORV, HOV, CV, and MV. A series of OM Code Cases is being developed to capture this process for a plant to use. One Code Case will be for Component Importance Ranking. The remaining Code Cases will develop the MSSC and LSSC testing strategies for each type of component. These Code Cases are planned for publication in early 1997. Later, after some industry application of the Code Cases, the alternative Code Case requirements will gravitate to the ASME OM Code as appendices

  10. Validation of the ASSERT subchannel code: Prediction of critical heat flux in standard and nonstandard CANDU bundle geometries

    International Nuclear Information System (INIS)

    Carver, M.B.; Kiteley, J.C.; Zhou, R.Q.N.; Junop, S.V.; Rowe, D.S.

    1995-01-01

The ASSERT code has been developed to address the three-dimensional computation of flow and phase distribution and fuel element surface temperatures within the horizontal subchannels of Canada deuterium uranium (CANDU) pressurized heavy water reactor fuel channels and to provide a detailed prediction of critical heat flux (CHF) distribution throughout the bundle. The ASSERT subchannel code has been validated extensively against a wide repertoire of experiments; its combination of three-dimensional prediction of local flow conditions with a comprehensive method of predicting CHF at these local conditions makes it a unique tool for predicting CHF for situations outside the existing experimental database. In particular, ASSERT is an appropriate tool to systematically investigate CHF under conditions of local geometric variations, such as pressure tube creep and fuel element strain. The numerical methodology used in ASSERT, the constitutive relationships incorporated, and the CHF assessment methodology are discussed. The evolutionary validation plan is also discussed and early validation exercises are summarized. More recent validation exercises in standard and nonstandard geometries are emphasized

  11. Design and Analysis of Self-Healing Tree-Based Hybrid Spectral Amplitude Coding OCDMA System

    Directory of Open Access Journals (Sweden)

    Waqas A. Imtiaz

    2017-01-01

Full Text Available This paper presents an efficient tree-based hybrid spectral amplitude coding optical code division multiple access (SAC-OCDMA) system that is able to provide high capacity transmission along with fault detection and restoration throughout the passive optical network (PON). An enhanced multidiagonal (EMD) code is adapted to elevate the system's performance, negating multiple access interference and the associated phase-induced intensity noise through an efficient two-matrix structure. Moreover, system connection availability is enhanced through an efficient protection architecture with tree and star-ring topology at the feeder and distribution level, respectively. The proposed hybrid architecture aims to provide seamless transmission of information at minimum cost. A mathematical model based on Gaussian approximation is developed to analyze the performance of the proposed setup, followed by simulation analysis for validation. It is observed that the proposed system supports 64 subscribers operating at data rates of 2.5 Gbps and above. Moreover, survivability and cost analysis in comparison with existing schemes shows that the proposed tree-based hybrid SAC-OCDMA system provides the required redundancy at minimum cost of infrastructure and operation.

  12. Depth Measurement Based on Infrared Coded Structured Light

    Directory of Open Access Journals (Sweden)

    Tong Jia

    2014-01-01

Full Text Available Depth measurement is a challenging problem in computer vision research. In this study, we first design a new grid pattern and develop a sequence coding and decoding algorithm to process the pattern. Second, we propose a linear fitting algorithm to derive the linear relationship between the object depth and pixel shift. Third, we obtain depth information on an object based on this linear relationship. Moreover, 3D reconstruction is implemented based on the Delaunay triangulation algorithm. Finally, we utilize the regularity of the error curves to correct the system errors and improve the measurement accuracy. The experimental results show that the accuracy of depth measurement is related to the step length of the moving object.
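
The linear depth-from-shift relationship described in the record above can be sketched as a least-squares line fit followed by inversion at measurement time. The calibration numbers below are hypothetical, not the study's data:

```python
import numpy as np

# Hypothetical calibration data: pixel shift of a coded-light feature
# measured at several known object depths (units are illustrative).
pixel_shift = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
depth_mm = np.array([998.0, 803.0, 601.0, 399.0, 202.0])

# Least-squares linear fit: depth ~ a * shift + b
a, b = np.polyfit(pixel_shift, depth_mm, deg=1)

def depth_from_shift(shift):
    """Estimate object depth (mm) from an observed pixel shift."""
    return a * shift + b

print(round(depth_from_shift(25.0), 1))
```

Once calibrated, each decoded grid point's pixel shift maps directly to a depth estimate, which is what feeds the Delaunay-based 3D reconstruction step.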

  13. Fire-safety engineering and performance-based codes

    DEFF Research Database (Denmark)

    Sørensen, Lars Schiøtt

Fire-safety Engineering is written as a textbook for Engineering students at universities and other institutions of higher education that teach in the area of fire. The book can also be used as a work of reference for consulting engineers, building product manufacturers, contractors, building project administrators, etc. The book deals with the following topics: • Historical presentation on the subject of fire • Legislation and building project administration • European fire standardization • Passive and active fire protection • Performance-based Codes • Fire-safety Engineering • Fundamental thermodynamics • Heat exchange during the fire process • Skin burns • Burning rate, energy release rate and design fires • Proposal to Risk-based design fires • Proposal to a Fire scale • Material ignition and flame spread • Fire dynamics in buildings • Combustion products and toxic gases • Smoke inhalation...

  14. Validation of the computer code system ATHLET / ATHLET-CD. Final report; Validierung des Rechenprogrammsystems ATHLET / ATHLET-CD. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Austregesilo, H.; Bals, C.; Erdmann, W.; Horche, W.; Krzykacz-Hausmann, B.; Pointner, W.; Schoeffel, P.; Skorek, T.; Weber, S.; Wielenberg, A.

    2010-04-15

In the frame of the reactor safety project RS1173, sponsored by the German Federal Ministry of Economics and Technology, analyses of international integral and separate effects tests have been performed for the validation of the code system ATHLET/ATHLET-CD. The work mainly comprised post-test calculations of selected experiments and contributions to the working groups accompanying the experimental programs. For the assessment of the thermal-hydraulic models in ATHLET, 8 integral tests and 4 separate effect tests have been considered. Together with the corroboration of the existing models, the validation analyses were mainly dedicated to the assessment of the modelling of non-condensable gases and their influence on two-phase natural circulation and on the primary heat removal through steam generators, as well as of the simulation of multi-dimensional flow processes. The validation calculations with respect to the simulation of multi-dimensional one- and two-phase flows aimed to investigate the range of applicability and limitations of the method of parallel channels in connection with the separate momentum equations for water and steam used in ATHLET, as well as to assess the status of the coupled version ATHLET/FLUBOX-3D. The ATHLET-CD validation analyses included the post-test calculations of 9 bundle tests, and were mainly focused on the assessment of the improved and new models for core degradation, including the models for oxidation, melt formation and relocation for BWR components, as well as the modelling of fission product and aerosol transport within the primary circuit, taking into account chemical reactions within the module SOPHAEROS. As an additional contribution to code validation, the GRS methodology of uncertainty and sensitivity analysis was applied exemplarily to two validation calculations, one with ATHLET and one with ATHLET-CD. The results of these uncertainty analyses endorse the capability of the code system to reproduce

  15. Validation of internal dosimetry protocols based on stochastic method

    International Nuclear Information System (INIS)

    Mendes, Bruno M.; Fonseca, Telma C.F.; Almeida, Iassudara G.; Trindade, Bruno M.; Campos, Tarcisio P.R.

    2015-01-01

Computational phantoms adapted to Monte Carlo codes have been applied successfully in radiation dosimetry fields. The NRI research group has been developing Internal Dosimetry Protocols (IDPs), addressing distinct methodologies, software and computational human simulators, to perform internal dosimetry, especially for new radiopharmaceuticals. Validation of the IDPs is critical to ensure the reliability of the simulation results. Intercomparison of data from the literature with those produced by our IDPs is a suitable method for validation. The aim of this study was to validate the IDPs following such an intercomparison procedure. The Golem phantom was reconfigured to run on MCNP5. The specific absorbed fractions (SAF) for photons at 30, 100 and 1000 keV were simulated based on the IDPs and compared with reference values (RV) published by Zankl and Petoussi-Henss, 1998. The average difference between the SAFs obtained in the IDP simulations and the RV was 2.3 %. The largest SAF differences were found in situations involving low energy photons at 30 keV. The adrenals and thyroid, i.e. the lowest mass organs, had the highest SAF discrepancies from the RV, at 7.2 % and 3.8 %, respectively. The statistical differences of the SAFs computed with our IDPs from the reference values were considered acceptable at the 30, 100 and 1000 keV energies. We believe that the main reason for the discrepancies in the IDP runs, found in the lower mass organs, was our source definition methodology. Improvements to the spatial distribution of the source in the voxels may provide outputs more consistent with reference values for lower mass organs. (author)

  16. Validation of internal dosimetry protocols based on stochastic method

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, Bruno M.; Fonseca, Telma C.F., E-mail: bmm@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Almeida, Iassudara G.; Trindade, Bruno M.; Campos, Tarcisio P.R., E-mail: tprcampos@yahoo.com.br [Universidade Federal de Minas Gerais (DEN/UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2015-07-01

Computational phantoms adapted to Monte Carlo codes have been applied successfully in radiation dosimetry fields. The NRI research group has been developing Internal Dosimetry Protocols (IDPs), addressing distinct methodologies, software and computational human simulators, to perform internal dosimetry, especially for new radiopharmaceuticals. Validation of the IDPs is critical to ensure the reliability of the simulation results. Intercomparison of data from the literature with those produced by our IDPs is a suitable method for validation. The aim of this study was to validate the IDPs following such an intercomparison procedure. The Golem phantom was reconfigured to run on MCNP5. The specific absorbed fractions (SAF) for photons at 30, 100 and 1000 keV were simulated based on the IDPs and compared with reference values (RV) published by Zankl and Petoussi-Henss, 1998. The average difference between the SAFs obtained in the IDP simulations and the RV was 2.3 %. The largest SAF differences were found in situations involving low energy photons at 30 keV. The adrenals and thyroid, i.e. the lowest mass organs, had the highest SAF discrepancies from the RV, at 7.2 % and 3.8 %, respectively. The statistical differences of the SAFs computed with our IDPs from the reference values were considered acceptable at the 30, 100 and 1000 keV energies. We believe that the main reason for the discrepancies in the IDP runs, found in the lower mass organs, was our source definition methodology. Improvements to the spatial distribution of the source in the voxels may provide outputs more consistent with reference values for lower mass organs. (author)
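
The SAF intercomparison described above reduces to computing the relative difference of each simulated value from its reference and averaging over organs. A minimal sketch with hypothetical numbers (not the study's data):

```python
# Hypothetical SAF values (kg^-1) for a few organs at one photon energy;
# these numbers are illustrative, not the study's data.
reference = {"liver": 1.20e-2, "thyroid": 5.00e-3, "adrenals": 2.50e-3}
simulated = {"liver": 1.22e-2, "thyroid": 5.19e-3, "adrenals": 2.68e-3}

def percent_diff(sim, ref):
    """Relative difference of a simulated value from its reference, in %."""
    return 100.0 * (sim - ref) / ref

diffs = {organ: percent_diff(simulated[organ], reference[organ])
         for organ in reference}
average = sum(abs(d) for d in diffs.values()) / len(diffs)
print({o: round(d, 1) for o, d in diffs.items()}, round(average, 1))
```

As in the record, the smallest-mass organs show the largest discrepancies, because fewer source and target voxels mean poorer Monte Carlo statistics and a coarser source spatial distribution.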

  17. Radiation sterilization of tissue allografts: Requirements for validation and routine control. A code of practice

    International Nuclear Information System (INIS)

    2007-12-01

    These recommendations for the radiation sterilization of tissue allografts adopt the principles that the International Organization for Standardization (ISO) applies to the radiation sterilization of health care products. The approach has been adapted to take into account the special features associated with human tissues and the features that distinguish them from industrially produced sterile health care products. The approach as described here is not applicable if viral contamination is identified. Thus it is emphasized that the human donors of the tissues must be medically and serologically screened. To further support this screening it is recommended that autopsy reports be reviewed if available. This adaptation of established ISO methods can thus only be applied to sterilization of tissue allografts if the radiation sterilization described here is the terminal stage of a careful, detailed, documented sequence of procedures involving: donor selection; tissue retrieval; tissue banking general procedures; specific processing procedures; labelling; and distribution. The methods proposed here for the establishment of a sterilization dose are based on statistical approaches used for the sterilization of health care products and modified appropriately for the low numbers of tissue allograft samples typically available. This code of practice will be useful to tissue banking staff, surgeons using tissues for transplantation, regulators who oversee the safety of transplantation and radiation sterilization procedures, members of tissue banking associations, health service personnel in hospitals in which tissue transplantations are performed and inter-governmental organizations involved in transplantation issues, for example the World Health Organization. 
This publication was discussed extensively at an international meeting in Wrexham in the United Kingdom and was approved by the Technical Advisory Committee of the relevant IAEA project, which included the Chairpersons

  18. Qualitative and quantitative validation of the SINBAD code on complex HPGe gamma-ray spectra

    Energy Technology Data Exchange (ETDEWEB)

    Rohee, E.; Coulon, R.; Normand, S.; Carrel, F. [CEA, LIST, Laboratoire Capteurs et Architectures electroniques, F-91191 Gif-sur-Yvette, (France); Dautremer, T.; Barat, E.; Montagu, T. [CEA, LIST, Laboratoire Modelisation, Simulation et Systemes, F-91191 Gif-sur-Yvette, (France); Jammes, C. [CEA/DEN/SPEx/LDCI, Centre de Cadarache, F-13109 Saint-Paul-lez-Durance, (France)

    2015-07-01

Radionuclide identification and quantification is a serious concern for many applications, such as the safety and security of nuclear power plants and fuel cycle facilities, CBRN risk identification, environmental radioprotection and waste measurements. High resolution gamma-ray spectrometry based on HPGe detectors is a well-performing solution for all these topics. During the last decades, a great number of software packages have been developed to improve gamma spectra analysis. However, some difficulties remain in the analysis when photoelectric peaks are folded together with a high ratio between their amplitudes, when the Compton background is much larger compared to the signal of a single peak, and when spectra are composed of a great number of peaks. This study deals with the comparison between conventional methods of radionuclide identification and quantification and the code called SINBAD ('Spectrometrie par Inference Non parametrique Bayesienne Deconvolutive'). For many years, SINBAD has been developed by CEA LIST for unfolding complex spectra from HPGe detectors. Contrary to conventional methods using fitting procedures, SINBAD uses a probabilistic approach with Bayesian inference to describe spectrum data. The conventional fitting method, found for example in Genie 2000, is compared with the nonparametric SINBAD approach with regard to some key figures of merit, such as peak centroid evaluation (identification) and peak surface evaluation (quantification). Difficult cases are studied for nuclide detection with close gamma-ray energies and high photoelectric peak intensity differences. Tests are performed with spectra from the International Atomic Energy Agency (IAEA) for gamma spectra analysis software benchmarking and with spectra acquired at the laboratory. Results show that SINBAD and Genie 2000 performances are quite similar, sometimes with better results for SINBAD, with the important difference that to achieve the same performances the nonparametric method is user-friendly compared
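
The figures of merit named above (peak centroid for identification, peak area for quantification) can be sketched for a single photopeak on a flat background. The spectrum below is synthetic, and the flanking-region background estimate is one common textbook choice, not necessarily Genie 2000's algorithm:

```python
import numpy as np

# Synthetic spectrum: one Gaussian photopeak on a flat background.
channels = np.arange(100)
counts = 50.0 + 1000.0 * np.exp(-0.5 * ((channels - 40.0) / 3.0) ** 2)

# Conventional evaluation: estimate the background from flanking regions,
# then take the net-count-weighted mean channel as the peak centroid and
# the summed net counts as the peak area.
region = (channels >= 30) & (channels <= 50)
background = 0.5 * (counts[25:30].mean() + counts[51:56].mean())
net = counts[region] - background
centroid = (channels[region] * net).sum() / net.sum()
area = net.sum()
print(round(centroid, 2))
```

The difficult cases cited in the record arise exactly where this simple scheme breaks down: overlapping peaks make the weighted mean biased, and a large Compton continuum makes the flat-background assumption invalid.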

  19. Performance evaluation based on data from code reviews

    OpenAIRE

    Andrej, Sekáč

    2016-01-01

Context. Modern code review tools such as Gerrit have made available great amounts of code review data from different open source projects as well as other commercial projects. Code reviews are used to keep the quality of produced source code under control, but the stored data could also be used for evaluation of the software development process. Objectives. This thesis uses machine learning methods to approximate a review expert's performance evaluation function. Due to limitations in ...

  20. Validation Calculations for the Application of MARS Code to the Safety Analysis of Research Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Park, Cheol; Kim, H.; Chae, H. T.; Lim, I. C

    2006-10-15

In order to investigate the applicability of the MARS code to the accident analysis of HANARO and other research reactors (RRs), the following test data were simulated: test data from HANARO design and operation, flow instability and void fraction test data from published documents, IAEA RR transient data in TECDOC-643, and Brazilian IEA-R1 experimental data. For the simulation of the HANARO data, with finned-rod-type fuels at low pressure and low temperature conditions, the MARS code, developed for the transient analysis of power reactors, was modified. Its prediction capability was assessed against the experimental data for HANARO. From the assessment results, it can be said that the modified MARS code could be used for analyzing the thermal-hydraulic transients of HANARO. Some other simulations, such as flow instability tests and reactor transients, were also performed for the application of the MARS code to RRs with plate-type fuels. In these cases, no modification was made. The results of the simulated cases show that the MARS code can be used for the transient analysis of RRs with careful consideration. In particular, it seems that an improvement of the void model may be necessary for dealing with phenomena in high void conditions.

  1. Validation Calculations for the Application of MARS Code to the Safety Analysis of Research Reactors

    International Nuclear Information System (INIS)

    Park, Cheol; Kim, H.; Chae, H. T.; Lim, I. C.

    2006-10-01

In order to investigate the applicability of the MARS code to the accident analysis of HANARO and other research reactors (RRs), the following test data were simulated: test data from HANARO design and operation, flow instability and void fraction test data from published documents, IAEA RR transient data in TECDOC-643, and Brazilian IEA-R1 experimental data. For the simulation of the HANARO data, with finned-rod-type fuels at low pressure and low temperature conditions, the MARS code, developed for the transient analysis of power reactors, was modified. Its prediction capability was assessed against the experimental data for HANARO. From the assessment results, it can be said that the modified MARS code could be used for analyzing the thermal-hydraulic transients of HANARO. Some other simulations, such as flow instability tests and reactor transients, were also performed for the application of the MARS code to RRs with plate-type fuels. In these cases, no modification was made. The results of the simulated cases show that the MARS code can be used for the transient analysis of RRs with careful consideration. In particular, it seems that an improvement of the void model may be necessary for dealing with phenomena in high void conditions

  2. Confirm Content Validity and Sender Authenticity for Text Messages by Using QR Code

    Directory of Open Access Journals (Sweden)

    Firas Mohammed Aswad

    2018-05-01

Full Text Available In light of the information revolution taking place in the modern world, it has become necessary and important to secure electronic messages. We therefore offer a technique to ensure the integrity of message content and the authenticity of the sender over communication networks by converting the message's symbols to numbers. Each symbol (letter, number, or punctuation mark) is converted into three digits: the first represents the ASCII code of the symbol, the second represents the frequency of the symbol in the message (the number of times the symbol appears in the message), and the third represents the total of the symbol's locations (positions are counted from the first symbol in the message, and blanks are counted too). The sender's digital signature is converted to numbers in the same way as the message symbols, and these numbers are combined to produce just three numbers, which are then combined with the numbers of each message symbol. The final numbers are converted into a QR code, which is placed with the message and sent to the recipient. The recipient repeats the sender's steps (producing a QR code from the received message) and compares the result with the received QR code to see whether they match. The recipient is thus assured that the content is secure and that the sender is authentic.
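
The per-symbol encoding described above can be sketched up to the numeric triples. Generating the actual QR code would require an external library (e.g. the qrcode package), so this sketch stops short of that step, and reading "total of the symbol's locations" as a sum of 1-based positions is an assumption made here:

```python
def symbol_triples(message):
    """For each distinct symbol, return (symbol, ASCII code, frequency,
    sum of 1-based positions). Position counting includes blanks, as the
    abstract specifies; interpreting the 'total of locations' as a
    positional sum is an assumption of this sketch."""
    triples = []
    for ch in set(message):
        positions = [i + 1 for i, c in enumerate(message) if c == ch]
        triples.append((ch, ord(ch), len(positions), sum(positions)))
    return sorted(triples)

for t in symbol_triples("aba"):
    print(t)
```

For "aba", 'a' occurs twice at positions 1 and 3 (ASCII 97, frequency 2, position sum 4) and 'b' once at position 2. The triples, combined with the signature numbers, would then be serialized and handed to a QR encoder.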

  3. Gröbner Bases, Coding, and Cryptography

    CERN Document Server

    Sala, Massimiliano; Perret, Ludovic

    2009-01-01

    Coding theory and cryptography allow secure and reliable data transmission, which is at the heart of modern communication. This book offers a comprehensive overview on the application of commutative algebra to coding theory and cryptography. It analyzes important properties of algebraic/geometric coding systems individually.

  4. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Yidong [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andrs, David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Martineau, Richard Charles [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-01

This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving compressible fluid flow problems is presented next. A Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration is also presented. A multi-fluid formulation is under development; although it is not yet complete, BIGHORN has been designed to handle multi-fluid problems. Due to the flexibility of the underlying MOOSE framework, BIGHORN is quite extensible, and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent of this suite of problems is to provide baseline comparison data that demonstrate the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and suggest best practices when using BIGHORN.

  5. PHEBUS FP release analysis using a microstructure-based code

    International Nuclear Information System (INIS)

    Carlucci, L.N.

    1992-03-01

    The results of pre-test fission-product (FP) release analyses of the first two PHEBUS FP experiments, FPT0 and FPT1, indicate that the FREEDOM microstructure-based code predicts significant differences in both the timing and percent of gaseous FP releases for the two tests. To provide an indication of its predictive capability, FREEDOM was also used to model the high-burnup fuel tested in the Oak Ridge National Laboratory experiments VI-2 and VI-3. For these, the code was found to overpredict releases during the early stages of the tests and to underpredict releases during the later stages. The release kinetics in both tests were reasonably predicted, however. In view of the above, it is likely that the FREEDOM predictions of the final cumulative releases for the first two PHEBUS FP tests are lower-bound estimates. However, the significant difference in the predicted timing of initial releases for the two tests is felt to be indicative of what will occur. Therefore, this difference should be considered in the planning and conduct of the two tests, particularly aspects related to on-line measurements

  6. Experiments for the validation of computer codes used to assess the protection factors afforded by dwellings

    International Nuclear Information System (INIS)

    Le Grand, J.; Roux, Y.; Kerlau, G.

    1988-09-01

Two experimental campaigns were carried out to verify: 1) the method of assessing the mean kerma in a house used in the computer code BILL, which calculates the protection factor afforded by dwellings; 2) in what conditions the kerma calculated in cubic meshes of a given size (code PIECE) agreed with TLD measurements. For that purpose, a house was built near the caesium-137 source of the Ecosystem irradiator located at the Cadarache Nuclear Research Center. During the first campaign, four experiments with different house characteristics were conducted. Some 50 TLD locations describing the inhabitable volume were defined in order to obtain the mean kerma; 16 locations were considered outside the house. During the second campaign, a cobalt-60 source was installed on the side. Only five measurement locations were defined, each with 6 TLDs. The results of the dosimetric measurements are presented and compared with the calculations of the two computer codes. The effects of wall heterogeneity were also studied [fr]
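
The protection factor evaluated from such measurements is the ratio of the outdoor kerma to the mean indoor kerma averaged over the TLD locations. A minimal sketch with made-up readings (not the campaign's data):

```python
# Hypothetical TLD readings (kerma, arbitrary units). The protection
# factor is taken here as the mean outdoor kerma divided by the mean
# indoor kerma over all measurement locations, mirroring the averaging
# approach attributed to the BILL code in the record above.
indoor_readings = [0.21, 0.18, 0.25, 0.22, 0.19, 0.15]
outdoor_readings = [1.05, 0.98, 1.03, 0.94]

mean_indoor = sum(indoor_readings) / len(indoor_readings)
mean_outdoor = sum(outdoor_readings) / len(outdoor_readings)
protection_factor = mean_outdoor / mean_indoor
print(round(protection_factor, 2))
```

A protection factor of 5 would mean the house reduces the kerma to one fifth of its open-field value; the validation question is whether the code's mesh-averaged kerma reproduces the TLD-based mean.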

  7. Validation of a new library of nuclear constants of the WIMS code

    International Nuclear Information System (INIS)

    Aguilar H, F.

    1991-10-01

The objective of the present work is to reproduce, with the WIMS code, the experimental results of the thermal benchmark problems TRX-1, TRX-2 and BAPL-1 to BAPL-3. The work proceeded in two stages: the first consisted of using the original library of the code, while in the second a library was generated that contains only the elements present in the benchmarks: H-1, O-16, Al-27, U-235 and U-238. To generate the nuclear data present in the WIMS library, the ENDF/B-IV database and the NJOY nuclear data processing system were used; the library was generated using the FIXER code. (Author)

  8. Spike-based population coding and working memory.

    Directory of Open Access Journals (Sweden)

    Martin Boerlin

    2011-02-01

    Full Text Available Compelling behavioral evidence suggests that humans can make optimal decisions despite the uncertainty inherent in perceptual or motor tasks. A key question in neuroscience is how populations of spiking neurons can implement such probabilistic computations. In this article, we develop a comprehensive framework for optimal, spike-based sensory integration and working memory in a dynamic environment. We propose that probability distributions are inferred spike-per-spike in recurrently connected networks of integrate-and-fire neurons. As a result, these networks can combine sensory cues optimally, track the state of a time-varying stimulus, and memorize accumulated evidence over periods much longer than the time constant of single neurons. Importantly, we propose that population responses and persistent working memory states represent entire probability distributions and not only single stimulus values. These memories are reflected by sustained, asynchronous patterns of activity which make relevant information available to downstream neurons within their short time window of integration. Model neurons act as predictive encoders, only firing spikes which account for new information that has not yet been signaled. Thus, spike times deterministically signal a prediction error, contrary to rate codes in which spike times are considered to be random samples of an underlying firing rate. As a consequence of this coding scheme, a multitude of spike patterns can reliably encode the same information. This results in weakly correlated, Poisson-like spike trains that are sensitive to initial conditions but robust to even high levels of external neural noise. This spike train variability reproduces that observed in cortical sensory spike trains, but cannot be equated to noise. On the contrary, it is a consequence of optimal spike-based inference. In contrast, we show that rate-based models perform poorly when implemented with stochastically spiking neurons.
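
    The predictive-encoder idea can be illustrated with a minimal single-neuron sketch: the neuron fires only when a spike would reduce the squared error between the stimulus and a leaky decoded estimate (a toy illustration of the greedy spike rule, assuming a constant decoding kernel; the paper's recurrent population model is far richer).

```python
import math

def predictive_encode(signal, dt=0.001, tau=0.02, kernel=1.0):
    """Greedy predictive spike rule: emit a spike only when doing so
    reduces the squared prediction error between the input signal and
    the leaky decoded estimate x_hat."""
    x_hat, spikes = 0.0, []
    decay = math.exp(-dt / tau)
    for t, x in enumerate(signal):
        x_hat *= decay                       # leaky readout decay
        err_no = (x - x_hat) ** 2            # error if we stay silent
        err_yes = (x - x_hat - kernel) ** 2  # error if we spike now
        if err_yes < err_no:
            x_hat += kernel
            spikes.append(t)
    return spikes, x_hat

# Encode a constant stimulus: spikes are emitted just often enough to
# balance the readout leak against the target value.
signal = [1.0] * 1000
spikes, final_estimate = predictive_encode(signal)
```

    Because spikes are emitted deterministically whenever the error criterion is met, the spike times look irregular even though the rule itself contains no noise, which mirrors the variability argument of the abstract.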

  9. Validation of Simulation Codes for Future Systems: Motivations, Approach and the Role of Nuclear Data

    International Nuclear Information System (INIS)

    G. Palmiotti; M. Salvatores; G. Aliberti

    2007-01-01

    The validation of advanced simulation tools will still play a very significant role in several areas of reactor system analysis. This is the case of reactor physics and neutronics, where nuclear data uncertainties still play a crucial role for many core and fuel cycle parameters. The present paper gives a summary of validation motivations, objectives and approach. A validation effort is in particular necessary in the frame of advanced (e.g. Generation-IV or GNEP) reactors and associated fuel cycle assessment and design.

  10. Comparison of SISEC code simulations with earthquake data of ordinary and base-isolated buildings

    International Nuclear Information System (INIS)

    Wang, C.Y.; Gvildys, J.

    1991-01-01

    At Argonne National Laboratory (ANL), a 3-D computer program SISEC (Seismic Isolation System Evaluation Code) is being developed for simulating the system response of isolated and ordinary structures (Wang et al. 1991). This paper describes comparisons of SISEC code simulations with building response data from actual earthquakes. To ensure the accuracy of the analytical simulations, recorded data of full-size reinforced concrete structures located in Sendai, Japan, are used in this benchmark comparison. The test structures consist of two three-story buildings, one base-isolated and the other conventionally founded. They were constructed side by side to investigate the effect of base isolation on the acceleration response. Among 20 earthquakes observed since April 1989, complete records of three representative earthquakes, no.2, no.6, and no.17, are used for the code validation presented in this paper. Correlations of observed and calculated accelerations at all instrument locations are made. Also, the relative response characteristics of the ordinary and isolated building structures are investigated. (J.P.N.)

  11. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments

    Science.gov (United States)

    Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu

    2018-01-01

    Abstract Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases have been established for experimentally validated lncRNAs. However, these databases are small in scale (with only a few hundred lncRNAs) and specific in focus (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset of experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNIncRBase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, which is 2.9 times larger than the current largest database of experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared to all existing experimentally validated databases. The established database allows users to browse, search and download, as well as to submit, experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. PMID:28985416

  12. Development of multi-physics code systems based on the reactor dynamics code DYN3D

    Energy Technology Data Exchange (ETDEWEB)

    Kliem, Soeren; Gommlich, Andre; Grahn, Alexander; Rohde, Ulrich [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany); Schuetze, Jochen [ANSYS Germany GmbH, Darmstadt (Germany); Frank, Thomas [ANSYS Germany GmbH, Otterfing (Germany); Gomez Torres, Armando M.; Sanchez Espinoza, Victor Hugo [Karlsruher Institut fuer Technologie (KIT), Eggenstein-Leopoldshafen (Germany)

    2011-07-15

    The reactor dynamics code DYN3D has been coupled with the CFD code ANSYS CFX and the 3D thermal hydraulic core model FLICA4. In the coupling with ANSYS CFX, DYN3D calculates the neutron kinetics and the fuel behavior including the heat transfer to the coolant. The physical data interface between the codes is the volumetric heat release rate into the coolant. In the coupling with FLICA4 only the neutron kinetics module of DYN3D is used. Fluid dynamics and related transport phenomena in the reactor's coolant and fuel behavior are calculated by FLICA4. The correctness of the coupling of DYN3D with both thermal hydraulic codes was verified by the calculation of different test problems. These test problems were set up in such a way that comparison with the DYN3D stand-alone code was possible. This included steady-state and transient calculations of a mini-core consisting of nine real-size PWR fuel assemblies with ANSYS CFX/DYN3D, as well as mini-core and full-core steady-state calculations using FLICA4/DYN3D. (orig.)

  13. Development of multi-physics code systems based on the reactor dynamics code DYN3D

    International Nuclear Information System (INIS)

    Kliem, Soeren; Gommlich, Andre; Grahn, Alexander; Rohde, Ulrich; Schuetze, Jochen; Frank, Thomas; Gomez Torres, Armando M.; Sanchez Espinoza, Victor Hugo

    2011-01-01

    The reactor dynamics code DYN3D has been coupled with the CFD code ANSYS CFX and the 3D thermal hydraulic core model FLICA4. In the coupling with ANSYS CFX, DYN3D calculates the neutron kinetics and the fuel behavior including the heat transfer to the coolant. The physical data interface between the codes is the volumetric heat release rate into the coolant. In the coupling with FLICA4 only the neutron kinetics module of DYN3D is used. Fluid dynamics and related transport phenomena in the reactor's coolant and fuel behavior are calculated by FLICA4. The correctness of the coupling of DYN3D with both thermal hydraulic codes was verified by the calculation of different test problems. These test problems were set up in such a way that comparison with the DYN3D stand-alone code was possible. This included steady-state and transient calculations of a mini-core consisting of nine real-size PWR fuel assemblies with ANSYS CFX/DYN3D, as well as mini-core and full-core steady-state calculations using FLICA4/DYN3D. (orig.)

  14. ANITA-2000 activation code package - updating of the decay data libraries and validation on the experimental data of the 14 MeV Frascati Neutron Generator

    Directory of Open Access Journals (Sweden)

    Frisoni Manuela

    2016-01-01

    Full Text Available ANITA-2000 is a code package for the activation characterization of materials exposed to neutron irradiation, released by ENEA to OECD-NEADB and ORNL-RSICC. The main component of the package is the activation code ANITA-4M, which computes the radioactive inventory of a material exposed to neutron irradiation. The code requires the decay data library (file fl1) containing the quantities describing the decay properties of the unstable nuclides, and the library (file fl2) containing the gamma ray spectra emitted by the radioactive nuclei. The fl1 and fl2 files of the ANITA-2000 code package, originally based on the evaluated nuclear data library FENDL/D-2.0, were recently updated on the basis of the JEFF-3.1.1 Radioactive Decay Data Library. This paper presents the results of the validation of the new fl1 decay data library through the comparison of the ANITA-4M calculated values with the measured electron and photon decay heats and activities of fusion material samples irradiated at the 14 MeV Frascati Neutron Generator (FNG of the NEA-Frascati Research Centre. Twelve material samples were considered, namely: Mo, Cu, Hf, Mg, Ni, Cd, Sn, Re, Ti, W, Ag and Al. The ratios between calculated and experimental values (C/E are shown and discussed in this paper.
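
    The quantity an activation code like ANITA-4M ultimately delivers, the decay heat of an irradiated sample as a function of cooling time, can be sketched for the simple single-decay case (the nuclide numbers below are illustrative placeholders, not values from the fl1/fl2 libraries):

```python
import math

def decay_heat(nuclides, t):
    """Total decay heat (W) at cooling time t (s). Each nuclide is
    given as (initial atoms N0, half-life in s, energy per decay in J);
    activity follows A(t) = lambda * N0 * exp(-lambda * t)."""
    power = 0.0
    for n0, t_half, e_per_decay in nuclides:
        lam = math.log(2.0) / t_half
        activity = lam * n0 * math.exp(-lam * t)  # decays per second
        power += activity * e_per_decay
    return power

# Two hypothetical activation products:
inventory = [(1e18, 3600.0, 1.6e-13),   # ~1 h half-life, ~1 MeV per decay
             (5e17, 86400.0, 3.2e-13)]  # ~1 d half-life, ~2 MeV per decay
p0 = decay_heat(inventory, 0.0)
p1 = decay_heat(inventory, 7200.0)      # after two hours of cooling
```

    A C/E comparison of the kind reported in the paper would divide such calculated decay heats by the measured values at matching cooling times.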

  15. Experimental validation of the DPM Monte Carlo code using minimally scattered electron beams in heterogeneous media

    International Nuclear Information System (INIS)

    Chetty, Indrin J.; Moran, Jean M.; Nurushev, Teamor S.; McShan, Daniel L.; Fraass, Benedick A.; Wilderman, Scott J.; Bielajew, Alex F.

    2002-01-01

    A comprehensive set of measurements and calculations has been conducted to investigate the accuracy of the Dose Planning Method (DPM) Monte Carlo code for electron beam dose calculations in heterogeneous media. Measurements were made using 10 MeV and 50 MeV minimally scattered, uncollimated electron beams from a racetrack microtron. Source distributions for the Monte Carlo calculations were reconstructed from in-air ion chamber scans and then benchmarked against measurements in a homogeneous water phantom. The in-air spatial distributions were found to have FWHM of 4.7 cm and 1.3 cm, at 100 cm from the source, for the 10 MeV and 50 MeV beams, respectively. Energy spectra for the electron beams were determined by simulating the components of the microtron treatment head using the code MCNP4B. Profile measurements were made using an ion chamber in a water phantom with slabs of lung or bone-equivalent materials submerged at various depths. DPM calculations are, on average, within 2% agreement with measurement for all geometries except for the 50 MeV beam incident on a 6 cm lung-equivalent slab. Measurements using approximately monoenergetic, 50 MeV, 'pencil-beam'-type electrons in heterogeneous media provide conditions for maximum electronic disequilibrium and hence present a stringent test of the code's electron transport physics; the agreement noted between calculation and measurement illustrates that the DPM code is capable of accurate dose calculation even under such conditions. (author)

  16. Validation of a thermal-hydraulic system code on a simple example

    International Nuclear Information System (INIS)

    Kopecek, Vit; Zacha, Pavel

    2014-01-01

    A mathematical model of a U tube was set up and the analytical solution was calculated and used in the assessment of the numerical solutions obtained by using the RELAP5 mod3.3 and TRACE V5 thermal hydraulics codes. A good agreement between the two types of calculation was obtained.
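
    A comparison of this kind can be reproduced in a few lines for the frictionless case: a liquid column of total length L in a U tube obeys x'' = -(2g/L) x, so the analytical cosine solution can be checked against a simple numerical integration (a generic sketch of the validation idea; RELAP5 and TRACE of course solve far more general two-phase field equations):

```python
import math

def utube_analytical(t, x0, L, g=9.81):
    # Frictionless liquid column: x'' = -(2g/L) x, released from rest at x0.
    w = math.sqrt(2.0 * g / L)
    return x0 * math.cos(w * t)

def utube_numerical(t_end, x0, L, g=9.81, dt=1e-4):
    # Semi-implicit (symplectic) Euler integration of the same ODE.
    w2 = 2.0 * g / L
    x, v = x0, 0.0
    for _ in range(int(round(t_end / dt))):
        v -= w2 * x * dt
        x += v * dt
    return x

L, x0, t_end = 1.0, 0.05, 2.0   # 1 m column, 5 cm initial displacement
exact = utube_analytical(t_end, x0, L)
approx = utube_numerical(t_end, x0, L)
```

    The assessment then consists of quantifying the difference between the two solutions over the transient, exactly as done for the system-code results.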

  17. Code Validation of CFD Heat Transfer Models for Liquid Rocket Engine Combustion Devices

    National Research Council Canada - National Science Library

    Coy, E. B

    2007-01-01

    .... The design of the rig and its capabilities are described. A second objective of the test rig is to provide CFD validation data under conditions relevant to liquid rocket engine thrust chambers...

  18. Design of ACM system based on non-greedy punctured LDPC codes

    Science.gov (United States)

    Lu, Zijun; Jiang, Zihong; Zhou, Lin; He, Yucheng

    2017-08-01

    In this paper, an adaptive coded modulation (ACM) scheme based on rate-compatible LDPC (RC-LDPC) codes is designed. The RC-LDPC codes are constructed by a non-greedy puncturing method which shows good performance in the high code rate region. Moreover, an incremental redundancy scheme for the LDPC-based ACM system over the AWGN channel is proposed. With this scheme, code rates vary from 2/3 to 5/6 and the complexity of the ACM system is reduced. Simulations show that the proposed ACM system achieves increasing coding gain together with higher throughput.
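
    The rate-compatibility mechanism underlying such a scheme is simple arithmetic: removing p parity bits from an (n, k) mother code raises the effective rate to k/(n-p). A sketch with a hypothetical rate-2/3 mother code (the lengths below are made-up, not the paper's codes):

```python
def punctured_rate(k, n, punctured):
    """Effective rate of a rate-compatible code after puncturing
    `punctured` parity bits from an (n, k) mother code."""
    assert 0 <= punctured < n - k   # cannot puncture information bits away
    return k / (n - punctured)

# Hypothetical rate-2/3 mother code of length 1200 (k = 800):
k, n = 800, 1200
rates = [punctured_rate(k, n, p) for p in (0, 100, 200, 240)]
# No puncturing gives the mother rate 2/3; 240 punctured bits give 5/6.
```

    Incremental redundancy then works in the opposite direction: transmission starts at the highest rate and previously punctured parity bits are sent only when the receiver requests them.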

  19. Development, verification and validation of the fuel channel behaviour computer code FACTAR

    Energy Technology Data Exchange (ETDEWEB)

    Westbye, C J; Brito, A C; MacKinnon, J C; Sills, H E; Langman, V J [Ontario Hydro, Toronto, ON (Canada)

    1996-12-31

    FACTAR (Fuel And Channel Temperature And Response) is a computer code developed to simulate the transient thermal and mechanical behaviour of 37-element or 28-element fuel bundles within a single CANDU fuel channel for moderate loss of coolant accident conditions, including transition and large break LOCAs (loss of coolant accidents) with emergency coolant injection assumed available. FACTAR's predictions of fuel temperature and sheath failure times are used in subsequent assessments of fission product releases and fuel string expansion. This paper discusses the origin and development history of FACTAR, presents the mathematical models and solution technique, describes the detailed quality assurance procedures followed during development, and reports the future development of the code. (author). 27 refs., 3 figs.

  20. Development of 3D CFD code based on structured non-orthogonal grids

    International Nuclear Information System (INIS)

    Vaidya, Abhijeet Mohan; Maheshwari, Naresh Kumar; Rama Rao, A.

    2016-01-01

    Most nuclear industry problems involve complex geometries. Solution of flow and heat transfer over complex geometries is a very important requirement for designing new reactor systems. Hence, development of a general purpose three dimensional (3D) CFD code has been undertaken. To handle complex computational domains, the code is implemented on structured non-orthogonal grids. The code is validated by comparing its results for a 3D inclined lid-driven cavity at different inclination angles and Reynolds numbers with OpenFOAM results. This paper contains the formulation and validation of the new code. (author)
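
    The extra work non-orthogonal grids require can be sketched at the level of a single face: the face-area vector is split into a component along the line joining the two cell centres (treated implicitly) plus a non-orthogonal correction (treated explicitly). The over-relaxed decomposition shown below is one common choice and an assumption here, not necessarily the exact scheme the authors use:

```python
def over_relaxed_split(S, d):
    """Split face-area vector S into an orthogonal part parallel to the
    cell-centre vector d and a non-orthogonal correction k, using the
    over-relaxed decomposition: delta = d * (S.S)/(d.S), k = S - delta."""
    dot_dS = sum(di * Si for di, Si in zip(d, S))
    SS = sum(Si * Si for Si in S)
    delta = tuple(SS / dot_dS * di for di in d)
    k = tuple(Si - Di for Si, Di in zip(S, delta))
    return delta, k

# A skewed face: the face normal is not aligned with the line joining
# the adjacent cell centres, so the correction k is non-zero.
S = (1.0, 0.2, 0.0)   # face-area vector
d = (1.0, 0.0, 0.0)   # unit vector between cell centres
delta, k = over_relaxed_split(S, d)
```

    On an orthogonal mesh k vanishes and the scheme reduces to the usual two-point flux; on skewed cells the explicit k term is what the iterative non-orthogonal corrections converge.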

  1. Improvement of QR Code Recognition Based on Pillbox Filter Analysis

    Directory of Open Access Journals (Sweden)

    Jia-Shing Sheu

    2013-04-01

    Full Text Available The objective of this paper is to improve the recognition of captured QR code images degraded by blur, through pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as low image quality. Focus is an important factor that affects image quality. This study discusses out-of-focus QR code images and aims to improve the recognition of their contents. Many studies have used the pillbox filter (circular averaging filter method to simulate an out-of-focus image; this method is also used in this investigation. A blurred QR code image is separated into nine levels. In the experiment, four different quantitative approaches are used to reconstruct and decode an out-of-focus QR code image. The nine reconstructed QR code images are then compared. The final experimental results indicate improvements in identification.
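
    The pillbox (circular averaging) filter mentioned here is easy to construct: uniform weights inside a disc, zero outside, normalised to sum to one. A minimal sketch that builds the kernel and blurs a small synthetic bitmap (the image here is a toy square, not a real QR code):

```python
def pillbox_kernel(radius):
    """Circular averaging (pillbox) kernel: uniform weights inside a
    disc of the given radius, normalised so the weights sum to 1."""
    size = 2 * radius + 1
    mask = [[1.0 if (x - radius) ** 2 + (y - radius) ** 2 <= radius ** 2
             else 0.0 for x in range(size)] for y in range(size)]
    total = sum(map(sum, mask))
    return [[v / total for v in row] for row in mask]

def convolve(image, kernel):
    """Same-size convolution with zero padding, enough to simulate
    defocus blur on a small bitmap."""
    h, w, r = len(image), len(image[0]), len(kernel) // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for ky, krow in enumerate(kernel):
                for kx, kv in enumerate(krow):
                    yy, xx = y + ky - r, x + kx - r
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += kv * image[yy][xx]
            out[y][x] = acc
    return out

kernel = pillbox_kernel(2)
# Toy 10x10 bitmap with a bright 4x4 square (stand-in for a QR module).
image = [[1.0 if 3 <= i <= 6 and 3 <= j <= 6 else 0.0 for j in range(10)]
         for i in range(10)]
blurred = convolve(image, kernel)
```

    Varying the kernel radius produces the different blur levels; deblurring approaches like those compared in the paper then try to invert this operation.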

  2. RELAP-7 Software Verification and Validation Plan - Requirements Traceability Matrix (RTM) Part 2: Code Assessment Strategy, Procedure, and RTM Update

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in a previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.

  3. Validation of the COBRA code for dry out power calculation in CANDU type advanced fuels

    International Nuclear Information System (INIS)

    Daverio, Hernando J.

    2003-01-01

    Stern Laboratories performed full-scale CHF testing of the CANFLEX bundle at AECL's request. This experiment is modeled with the COBRA IV HW code to verify its capability for dryout power calculation. Good results were obtained: errors below 10% with respect to all measured data, and below 1% within the standard operating range of CANDU reactors. The calculations were repeated for the CNEA advanced fuel CARA, which showed the same performance as the CANFLEX fuel. (author)

  4. Validation of favor code linear elastic fracture solutions for finite-length flaw geometries

    International Nuclear Information System (INIS)

    Dickson, T.L.; Keeney, J.A.; Bryson, J.W.

    1995-01-01

    One of the current tasks within the US Nuclear Regulatory Commission (NRC)-funded Heavy Section Steel Technology Program (HSST) at Oak Ridge National Laboratory (ORNL) is the continuing development of the FAVOR (Fracture Analysis of Vessels: Oak Ridge) computer code. FAVOR performs structural integrity analyses of embrittled nuclear reactor pressure vessels (RPVs) with stainless steel cladding, to evaluate compliance with the applicable regulatory criteria. Since the initial release of FAVOR, the HSST program has continued to enhance the capabilities of the FAVOR code. ABAQUS, a nuclear quality assurance certified (NQA-1) general multidimensional finite element code with fracture mechanics capabilities, was used to generate a database of stress-intensity-factor influence coefficients (SIFICs) for a range of axially and circumferentially oriented semielliptical inner-surface flaw geometries applicable to RPVs with an internal radius (Ri) to wall thickness (w) ratio of 10. This database of SIFICs has been incorporated into a development version of FAVOR, providing it with the capability to perform deterministic and probabilistic fracture analyses of RPVs subjected to transients, such as pressurized thermal shock (PTS), for various flaw geometries. This paper discusses the SIFIC database, comparisons with other investigators, and some of the benchmark verification problem specifications and solutions.
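
    The influence-coefficient method works by superposition: the through-wall stress is fitted as a polynomial in the crack-depth coordinate, and each polynomial term contributes to K_I through a precomputed coefficient G_j. A minimal sketch with a commonly used form K_I = sqrt(pi*a) * sum_j G_j C_j; the G values below are illustrative placeholders, not FAVOR/ABAQUS data, and crack-shape factors are omitted:

```python
import math

def stress_intensity(G, C, a):
    """K_I for a surface flaw of depth a, given influence coefficients
    G_j and a stress profile sigma(x) = sum_j C_j * (x/a)**j.
    Simplified form: shape-factor details are deliberately left out."""
    return math.sqrt(math.pi * a) * sum(g * c for g, c in zip(G, C))

G = [1.12, 0.68, 0.52, 0.43]     # illustrative influence coefficients
C = [200.0, -50.0, 10.0, 0.0]    # MPa, cubic fit of the through-wall stress
a = 0.01                         # crack depth, m
k_total = stress_intensity(G, C, a)
k_uniform = stress_intensity(G, [200.0, 0.0, 0.0, 0.0], a)
```

    The payoff of the tabulated-SIFIC approach is exactly this linearity: once the G_j are stored for a flaw geometry, K_I for any transient stress profile is a cheap weighted sum rather than a new finite element solve.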

  5. On Applicability of Network Coding Technique for 6LoWPAN-based Sensor Networks.

    Science.gov (United States)

    Amanowicz, Marek; Krygier, Jaroslaw

    2018-05-26

    In this paper, the applicability of the network coding technique in 6LoWPAN-based sensor multihop networks is examined. 6LoWPAN is one of the standards proposed for the Internet of Things architecture. Thus, we can expect significant growth of traffic in such networks, which can lead to overload and a decrease in the sensor network lifetime. The authors propose an inter-session network coding mechanism that can be implemented in resource-limited sensor motes. The solution reduces the overall traffic in the network, and in consequence, the energy consumption is decreased. The procedures take into account the deep header compression of native 6LoWPAN packets and the hop-by-hop changes of the header structure. The applied simplifications reduce the signaling traffic that typically occurs in network coding deployments, keeping the solution useful for wireless sensor networks with limited resources. The authors validate the proposed procedures in terms of end-to-end packet delay, packet loss ratio, traffic in the air, total energy consumption, and network lifetime. The solution has been tested in a real wireless sensor network. The results confirm the efficiency of the proposed technique, mostly in delay-tolerant sensor networks.
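
    The basic inter-session coding idea can be shown in a few lines: when two motes exchange packets through a common relay, the relay broadcasts the XOR of the two payloads once instead of forwarding each packet separately, and each endpoint recovers the other's packet using its own as a key (a toy sketch of the generic XOR technique, not the authors' 6LoWPAN-specific procedures):

```python
def xor_bytes(a, b):
    """Bytewise XOR of two equal-length payloads."""
    return bytes(x ^ y for x, y in zip(a, b))

# Two 9-byte payloads heading in opposite directions through a relay
# (equal lengths kept for the sketch; real schemes pad).
pkt_a = b"temp=21.5"
pkt_b = b"hum=40.2%"

coded = xor_bytes(pkt_a, pkt_b)      # one broadcast instead of two unicasts

# Each endpoint XORs the coded packet with the packet it sent itself:
recovered_b = xor_bytes(coded, pkt_a)
recovered_a = xor_bytes(coded, pkt_b)
```

    The traffic saving (one transmission instead of two at the relay) is what translates into the reduced energy consumption and longer network lifetime measured in the paper.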

  6. Development and validation of a low-frequency modeling code for high-moment transmitter rod antennas

    Science.gov (United States)

    Jordan, Jared Williams; Sternberg, Ben K.; Dvorak, Steven L.

    2009-12-01

    The goal of this research is to develop and validate a low-frequency modeling code for high-moment transmitter rod antennas, to aid in the design of future low-frequency TX antennas with high magnetic moments. To accomplish this goal, a quasi-static modeling algorithm was developed to simulate finite-length, permeable-core rod antennas. This quasi-static analysis is applicable at low frequencies where eddy currents are negligible, and it can handle solid or hollow cores with winding insulation thickness between the antenna's windings and its core. The theory was programmed in Matlab, and the modeling code can predict the TX antenna's gain, maximum magnetic moment, saturation current, series inductance, and core series loss resistance, provided the user enters the complex permeability corresponding to the desired core magnetic flux density. To use the linear modeling code with nonlinear core materials, the correct complex permeability for a specific core magnetic flux density must be supplied. In testing, the code accurately predicted the changes in the electrical parameters associated with variations in rod length and core thickness for antennas made of low-carbon steel wire. These tests demonstrate that the code successfully predicts changes in rod antenna characteristics under high-current, nonlinear conditions due to changes in the physical dimensions of the rod, provided the flux density in the core is held constant so that the complex permeability does not change.
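
    The dominant physics in a finite permeable rod is the demagnetising effect: beyond a certain material permeability, the apparent gain of the rod is capped by its geometry. A quasi-static sketch using the classical prolate-spheroid demagnetising factor as a stand-in for a cylindrical rod (an approximation, not the authors' algorithm):

```python
import math

def demag_factor_prolate(aspect):
    """Longitudinal demagnetising factor N_d of a prolate spheroid with
    length/diameter ratio `aspect` > 1 (classical closed form)."""
    m = aspect
    s = math.sqrt(m * m - 1.0)
    return (1.0 / (m * m - 1.0)) * (
        m / (2.0 * s) * math.log((m + s) / (m - s)) - 1.0)

def rod_gain(mu_r, aspect):
    """Apparent (effective) relative permeability of a finite rod:
    mu_eff = mu_r / (1 + N_d * (mu_r - 1)). For large mu_r this
    saturates toward ~1/N_d, i.e. geometry, not material, sets the gain."""
    nd = demag_factor_prolate(aspect)
    return mu_r / (1.0 + nd * (mu_r - 1.0))

gain = rod_gain(1000.0, 20.0)   # high-permeability core, 20:1 rod
```

    This geometric saturation is why predicting gain changes from rod length and core thickness, as the paper's tests do, is meaningful even when the material permeability is large and uncertain.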

  7. Development of System Based Code: Case Study of Life-Cycle Margin Evaluation

    International Nuclear Information System (INIS)

    Tai Asayama; Masaki Morishita; Masanori Tashimo

    2006-01-01

    For a leap of progress in the structural design of nuclear plant components, the late Professor Emeritus Yasuhide Asada proposed the System Based Code. The key concepts of the System Based Code are: (1) life-cycle margin optimization, (2) expansion of technical options as well as combinations of technical options beyond the current codes and standards, and (3) designing to clearly defined target reliabilities. Those concepts are very new to most nuclear power plant designers, who are naturally obliged to design to current codes and standards; the application of the concepts of the System Based Code to design will lead to an entire change of practices that designers have long been accustomed to. On the other hand, experienced designers are supposed to have expertise that can support and accelerate the development of the System Based Code. Therefore, interfacing with experienced designers is of crucial importance for the development of the System Based Code. The authors conducted a survey on the acceptability of the System Based Code concept. The results were analyzed for the possibility of improving structural design both in terms of reliability and cost effectiveness by the introduction of the System Based Code concept. It was concluded that the System Based Code is beneficial for those purposes. Also described is the expertise elicited from the results of the survey that can be reflected in the development of the System Based Code. (authors)

  8. Calibration Methods for Reliability-Based Design Codes

    DEFF Research Database (Denmark)

    Gayton, N.; Mohamed, A.; Sørensen, John Dalsgaard

    2004-01-01

    The calibration methods are applied to define the optimal code format according to some target safety levels. The calibration procedure can be seen as a specific optimization process where the control variables are the partial factors of the code. Different methods are available in the literature...

  9. Preserving Envelope Efficiency in Performance Based Code Compliance

    Energy Technology Data Exchange (ETDEWEB)

    Thornton, Brian A. [Thornton Energy Consulting (United States); Sullivan, Greg P. [Efficiency Solutions (United States); Rosenberg, Michael I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Baechler, Michael C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-06-20

    The City of Seattle 2012 Energy Code (Seattle 2014), one of the most progressive in the country, is under revision for its 2015 edition. Additionally, city personnel participate in the development of the next generation of the Washington State Energy Code and the International Energy Code. Seattle has pledged carbon neutrality by 2050 including buildings, transportation and other sectors. The United States Department of Energy (DOE), through Pacific Northwest National Laboratory (PNNL) provided technical assistance to Seattle in order to understand the implications of one potential direction for its code development, limiting trade-offs of long-lived building envelope components less stringent than the prescriptive code envelope requirements by using better-than-code but shorter-lived lighting and heating, ventilation, and air-conditioning (HVAC) components through the total building performance modeled energy compliance path. Weaker building envelopes can permanently limit building energy performance even as lighting and HVAC components are upgraded over time, because retrofitting the envelope is less likely and more expensive. Weaker building envelopes may also increase the required size, cost and complexity of HVAC systems and may adversely affect occupant comfort. This report presents the results of this technical assistance. The use of modeled energy code compliance to trade-off envelope components with shorter-lived building components is not unique to Seattle and the lessons and possible solutions described in this report have implications for other jurisdictions and energy codes.

  10. Efficacy of Code Optimization on Cache-based Processors

    Science.gov (United States)

    VanderWijngaart, Rob F.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    The current common wisdom in the U.S. is that the powerful, cost-effective supercomputers of tomorrow will be based on commodity (RISC) micro-processors with cache memories. Already, most distributed systems in the world use such hardware as building blocks. This shift away from vector supercomputers and towards cache-based systems has brought about a change in programming paradigm, even when ignoring issues of parallelism. Vector machines require inner-loop independence and regular, non-pathological memory strides (usually this means: non-power-of-two strides) to allow efficient vectorization of array operations. Cache-based systems require spatial and temporal locality of data, so that data once read from main memory and stored in high-speed cache memory is used optimally before being written back to main memory. This means that the most cache-friendly array operations are those that feature zero or unit stride, so that each unit of data read from main memory (a cache line) contains information for the next iteration in the loop. Moreover, loops ought to be 'fat', meaning that as many operations as possible are performed on cache data, provided instruction caches do not overflow and enough registers are available. If unit stride is not possible, for example because of some data dependency, then care must be taken to avoid pathological strides, just as on vector computers. For cache-based systems the issues are more complex, due to the effects of associativity and of non-unit block (cache line) size. But there is more to the story. Most modern micro-processors are superscalar, which means that they can issue several (arithmetic) instructions per clock cycle, provided that there are enough independent instructions in the loop body. This is another argument for providing fat loop bodies. With these restrictions, it appears fairly straightforward to produce code that will run efficiently on any cache-based system.
It can be argued that although some of the important
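
    The stride argument above can be demonstrated with a two-dimensional array traversed in both orders. Both loops compute the same sum; the row-major order touches memory with unit stride, the column-major order jumps a whole row per access (a sketch only; in interpreted Python the interpreter overhead masks much of the effect that dominates in compiled code):

```python
import time

def sum_row_major(a):
    # Unit-stride traversal: consecutive elements of each row are
    # adjacent in memory, so each cache line is fully used.
    total = 0.0
    for row in a:
        for v in row:
            total += v
    return total

def sum_col_major(a):
    # Stride-n traversal: each access jumps a whole row ahead, so a
    # different cache line is touched on (almost) every access.
    total = 0.0
    for j in range(len(a[0])):
        for row in a:
            total += row[j]
    return total

n = 500
a = [[float(i * n + j) for j in range(n)] for i in range(n)]
t0 = time.perf_counter(); s_row = sum_row_major(a); t_row = time.perf_counter() - t0
t0 = time.perf_counter(); s_col = sum_col_major(a); t_col = time.perf_counter() - t0
# Same result either way; on cache-based hardware the unit-stride
# version is the faster of the two, dramatically so in compiled code.
```

    Rewriting loop nests so the innermost loop runs over the unit-stride index is exactly the kind of code optimization whose efficacy the paper evaluates.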

  11. Computing Moment-Based Probability Tables for Self-Shielding Calculations in Lattice Codes

    International Nuclear Information System (INIS)

    Hebert, Alain; Coste, Mireille

    2002-01-01

    As part of the self-shielding model used in the APOLLO2 lattice code, probability tables are required to compute self-shielded cross sections for coarse energy groups (typically with 99 or 172 groups). This paper describes the replacement of the multiband tables (typically with 51 subgroups) with moment-based tables in release 2.5 of APOLLO2. An improved Ribon method is proposed to compute moment-based probability tables, allowing important savings in CPU resources while maintaining the accuracy of the self-shielding algorithm. Finally, a validation is presented where the absorption rates obtained with each of these techniques are compared with exact values obtained using a fine-group elastic slowing-down calculation in the resolved energy domain. Other results, relative to Rowland's benchmark and to three assembly production cases, are also presented.
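
    The core idea of a moment-based table can be sketched for the smallest case: a 2-point table whose cross sections and weights preserve the moments M0..M3 of a fine-group distribution, obtained as the Gauss-quadrature of that distribution (a minimal sketch of the principle, not the improved Ribon algorithm of the paper):

```python
def two_band_table(sigmas, weights):
    """2-point probability table preserving the cross-section moments
    M_k = sum_i w_i * sigma_i**k for k = 0..3. The table cross sections
    are the roots of the degree-2 orthogonal polynomial of the moments."""
    M = [sum(w * s ** k for w, s in zip(weights, sigmas)) for k in range(4)]
    # Monic polynomial x^2 + alpha*x + beta orthogonal to 1 and x:
    D = M[0] * M[2] - M[1] * M[1]          # > 0 for a genuine spread
    alpha = (M[1] * M[2] - M[0] * M[3]) / D
    beta = (M[1] * M[3] - M[2] * M[2]) / D
    disc = (alpha * alpha - 4.0 * beta) ** 0.5
    x1 = (-alpha + disc) / 2.0             # table cross sections
    x2 = (-alpha - disc) / 2.0
    p1 = (M[1] - M[0] * x2) / (x1 - x2)    # weights from M0, M1
    return (p1, x1), (M[0] - p1, x2)

# A crude 3-point "fine-group" distribution with a resonance-like tail:
(p1, s1), (p2, s2) = two_band_table([1.0, 5.0, 50.0], [0.5, 0.3, 0.2])
```

    Real tables use more points and preserve both positive and negative moments, but the construction principle, matching moments of the fine-structure distribution with far fewer quadrature points, is the same.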

  12. Plato: A localised orbital based density functional theory code

    Science.gov (United States)

    Kenny, S. D.; Horsfield, A. P.

    2009-12-01

    The Plato package allows both orthogonal and non-orthogonal tight-binding as well as density functional theory (DFT) calculations to be performed within a single framework. The package also provides extensive tools for analysing the results of simulations as well as a number of tools for creating input files. The code is based upon the ideas first discussed in Sankey and Niklewski (1989) [1] with extensions to allow high-quality DFT calculations to be performed. DFT calculations can utilise either the local density approximation or the generalised gradient approximation. Basis sets from minimal basis through to ones containing multiple radial functions per angular momenta and polarisation functions can be used. Illustrations of how the package has been employed are given along with instructions for its utilisation.
    Program summary:
    Program title: Plato
    Catalogue identifier: AEFC_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFC_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 219 974
    No. of bytes in distributed program, including test data, etc.: 1 821 493
    Distribution format: tar.gz
    Programming language: C/MPI and PERL
    Computer: Apple Macintosh, PC, Unix machines
    Operating system: Unix, Linux and Mac OS X
    Has the code been vectorised or parallelised?: Yes, up to 256 processors tested
    RAM: Up to 2 Gbytes per processor
    Classification: 7.3
    External routines: LAPACK, BLAS and optionally ScaLAPACK, BLACS, PBLAS, FFTW
    Nature of problem: Density functional theory study of electronic structure and total energies of molecules, crystals and surfaces.
    Solution method: Localised orbital based density functional theory.
    Restrictions: Tight-binding and density functional theory only, no exact exchange.
    Unusual features: Both atom centred and uniform meshes available

  13. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods are weak in completeness: they are carried out at a single scale and depend on human experience. A multiple scale validation based on SDG (Signed Directed Graph) models and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.
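
    Positive inference on an SDG can be sketched in a few lines: each directed edge carries a sign (+1 reinforcing, -1 inverting), and a disturbance trend at a source node is propagated along edges by sign multiplication (a toy sketch, ignoring conflicting paths and cycles, which real SDG engines must resolve; the process variables are made up):

```python
def propagate(edges, source, trend):
    """Qualitative trend propagation on a signed directed graph.
    `edges` maps (u, v) -> +1 or -1; a node's trend is the edge sign
    times its parent's trend. Each node is visited once, which cuts
    cycles (a simplification)."""
    trends = {source: trend}
    frontier = [source]
    while frontier:
        node = frontier.pop()
        for (u, v), sign in edges.items():
            if u == node and v not in trends:
                trends[v] = sign * trends[node]
                frontier.append(v)
    return trends

# Hypothetical loop: more feed flow raises level, higher level raises
# outflow and pressure, and the relief response inverts with pressure.
edges = {("feed", "level"): +1,
         ("level", "outflow"): +1,
         ("level", "pressure"): +1,
         ("pressure", "relief"): -1}
scenario = propagate(edges, "feed", +1)   # disturbance: feed increases
```

    Each such inferred scenario (a sign pattern over all variables) is one "testing scenario" to compare against the simulation model's outputs at different scales.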

  14. Validation of the LH antenna code ALOHA against Tore Supra experiments

    International Nuclear Information System (INIS)

    Hillairet, J.; Ekedahl, A.; Kocan, M.; Gunn, J. P.; Goniche, M.

    2009-01-01

    Comparisons between ALOHA code predictions and experimental measurements of reflection coefficients for the two Lower Hybrid Current Drive (LHCD) antennas (named C2 and C3) in Tore Supra are presented. A large variation of density in front of the antennas was obtained by varying the distance between the plasma and the antennas. Low power was used in order to avoid non-linear effects on the wave coupling. Results obtained with ALOHA are in good agreement with the experimental measurements for both Tore Supra antennas and show that ALOHA is an efficient LH predictive tool.

  15. Non-Binary Protograph-Based LDPC Codes: Analysis, Enumerators and Designs

    OpenAIRE

    Sun, Yizeng

    2013-01-01

    Non-binary LDPC codes can outperform binary LDPC codes under sum-product decoding, at the cost of higher computational complexity. Non-binary LDPC codes based on protographs have the advantage of a simple hardware architecture. In the first part of this thesis, we use EXIT chart analysis to compute the thresholds of different protographs over GF(q). Based on the threshold computation, some non-binary protograph-based LDPC codes are designed and their frame error rates are compared with those of binary LDPC codes. ...

  16. A novel construction method of QC-LDPC codes based on CRT for optical communications

    Science.gov (United States)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-05-01

    A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes based on the Chinese remainder theorem (CRT) is proposed. The method can not only increase the code length without reducing the girth, but also greatly enhance the code rate, so it is easy to construct high-rate codes. The simulation results show that at a bit error rate (BER) of 10^-7, the net coding gain (NCG) of the regular QC-LDPC(4 851, 4 546) code is 2.06 dB, 1.36 dB, 0.53 dB and 0.31 dB higher, respectively, than those of the classic RS(255, 239) code in ITU-T G.975, the LDPC(32 640, 30 592) code in ITU-T G.975.1, the QC-LDPC(3 664, 3 436) code constructed by the improved combining construction method based on CRT, and the irregular QC-LDPC(3 843, 3 603) code constructed by the construction method based on the Galois field (GF(q)) multiplicative group. Furthermore, all five codes have the same code rate of 0.937. Therefore, the regular QC-LDPC(4 851, 4 546) code constructed by the proposed method has excellent error-correction performance and is well suited to optical transmission systems.
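
    The CRT combining idea behind such constructions can be sketched as follows: given the shift-exponent matrices of two QC-LDPC codes with coprime circulant sizes m1 and m2, the Chinese remainder theorem yields a combined shift matrix modulo m1*m2, increasing the code length. The matrices below are toy values for illustration, not the codes from the abstract:

```python
# Hedged sketch: element-wise CRT combination of two QC-LDPC shift
# matrices with coprime circulant sizes. Toy shift values only.

def crt_pair(a, m1, b, m2):
    """Smallest x with x = a (mod m1) and x = b (mod m2), gcd(m1, m2) = 1."""
    inv = pow(m1, -1, m2)            # modular inverse (Python 3.8+)
    return (a + m1 * ((b - a) * inv % m2)) % (m1 * m2)

def combine_shift_matrices(A, m1, B, m2):
    """Element-wise CRT combination; resulting circulant size is m1 * m2."""
    return [[crt_pair(a, m1, b, m2) for a, b in zip(ra, rb)]
            for ra, rb in zip(A, B)]

A = [[1, 2], [0, 3]]   # shifts modulo m1 = 5
B = [[2, 0], [1, 6]]   # shifts modulo m2 = 7
C = combine_shift_matrices(A, 5, B, 7)
print(C)  # [[16, 7], [15, 13]]
```

Because each combined shift reduces back to the original shifts modulo m1 and m2, short cycles present in neither parent code are not introduced, which is why the girth is preserved while the code length grows.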

  17. Investigation of a two-phase nozzle flow and validation of several computer codes by the experimental data

    International Nuclear Information System (INIS)

    Kedziur, F.

    1980-03-01

    Stationary experiments with a convergent nozzle were performed in order to validate advanced two-phase computer codes that find application in the blowdown phase of a loss-of-coolant accident (LOCA). The steam/water flow covered a broad variety of initial conditions: the pressure varied between 2 and 13 MPa, the void fraction between 0 (subcooled) and about 80%, and a large number of subcritical as well as critical experiments with different flow patterns was investigated. Additional air/water experiments served to separate out phase-transition effects. The transient acceleration of the fluid in the LOCA case was simulated by a local acceleration in the experiments. The layout of the nozzle and the applied measurement technique allow for separate testing of physical models and the determination of empirical model parameters, respectively: in the four codes DUESE, DRIX-2D, RELAP4/MOD6 and STRUYA the models - if they exist - for slip between the phases, thermodynamic non-equilibrium, pipe friction and critical mass flow rate are validated and criticised in comparison with the experimental data, and the corresponding model parameters are determined. The parameters are essentially a function of the void fraction. (orig.) [de

  18. Validation of activity determination codes and nuclide vectors by using results from processing of retired components and operational waste

    International Nuclear Information System (INIS)

    Lundgren, Klas; Larsson, Arne

    2012-01-01

    Decommissioning studies for nuclear power reactors are performed in order to assess the decommissioning costs and the waste volumes, as well as to provide data for the licensing and construction of the LILW repositories. An important part of this work is to estimate the amount of radioactivity in the different types of decommissioning waste. Studsvik ALARA Engineering has performed such assessments for LWRs and other nuclear facilities in Sweden. These assessments depend to a large extent on calculations, senior experience and sampling at the facilities. The precision of the calculations has been found to be relatively high close to the reactor core; for natural reasons, the precision declines with distance. Even where the activity values are lower, the content of hard-to-measure nuclides can cause problems in the long-term safety demonstration of LLW repositories. At the same time, Studsvik is processing significant volumes of metallic and combustible waste from power stations in operation and in the decommissioning phase, as well as from other nuclear facilities such as research and waste treatment facilities. By combining its unique knowledge in assessment of the radioactivity inventory with the large data bank that the waste processing represents, the activity determination codes can be validated and the waste processing analysis supported with additional data. The intention of this presentation is to highlight how the European nuclear industry could jointly use the waste processing data for validation of activity determination codes. (authors)

  19. Validation and verification of MCNP6 against intermediate and high-energy experimental data and results by other codes

    International Nuclear Information System (INIS)

    Mashnik, Stepan G.

    2011-01-01

    MCNP6, the latest and most advanced LANL transport code, representing a recent merger of MCNP5 and MCNPX, has been validated and verified (V and V) against a variety of intermediate- and high-energy experimental data and against results by different versions of MCNPX and other codes. In the present work, we V and V MCNP6 using mainly the latest modifications of the Cascade-Exciton Model (CEM) and of the Los Alamos version of the Quark-Gluon String Model (LAQGSM) event generators, CEM03.02 and LAQGSM03.03. We find that MCNP6 describes reasonably well various reactions induced by particles and nuclei at incident energies from 18 MeV to about 1 TeV per nucleon, measured on thin and thick targets, and agrees very well with similar results obtained with MCNPX and with calculations by CEM03.02, LAQGSM03.01 (03.03), INCL4 + ABLA, and Bertini INC + Dresner evaporation, EPAX, ABRABLA, HIPSE, and AMD, used as stand-alone codes. Most of the computational bugs and more serious physics problems observed in MCNP6/X during our V and V have been fixed; we continue our work to solve all the known problems before MCNP6 is distributed to the public. (author)

  20. Experimental validation for combustion analysis of GOTHIC 6.1b code in 2-dimensional premixed combustion experiments

    International Nuclear Information System (INIS)

    Lee, J. Y.; Lee, J. J.; Park, K. C.

    2003-01-01

    In this study, the capability of the GOTHIC code to predict hydrogen combustion phenomena was validated against the results of a two-dimensional premixed hydrogen combustion experiment performed at Seoul National University. The experimental results confirmed the propagation characteristics of the hydrogen flame, such as the buoyancy effect and the flame front shape. The combustion time of the tests was about 0.1 s. In the GOTHIC analyses, the code could predict the overall flame propagation characteristics, but the buoyancy effect and flame shape did not compare well with the experimental results. In particular, when the flame propagated towards the dead-end, GOTHIC predicted that the flame was not affected by the flow, which caused flame propagation quite different from the experimental results. Moreover, the combustion time in the analyses was about 1 s, ten times longer than in the experiment. To obtain more reasonable analysis results, the combustion model parameters in the GOTHIC code need to be applied appropriately, and the characteristics of hydrogen flames need to be reflected in the governing equations

  1. Development of a FBR fuel pin bundle deformation analysis code 'BAMBOO'. Development of a dispersion model and its validation

    International Nuclear Information System (INIS)

    Uwaba, Tomoyuki; Ukai, Shigeharu; Asaga, Takeo

    2002-03-01

    Bundle-duct interaction (BDI) is one of the life-limiting factors of an FBR fuel subassembly. Under the BDI condition, fuel pin dispersion occurs mainly through the deviation of the wire position due to irradiation. In this study the effect of the dispersion on bundle deformation was evaluated using the BAMBOO code, and the following results were obtained. (1) A new contact analysis model was introduced into the BAMBOO code. This model considers the contact condition at axial positions other than the nodal points of the beam elements that compose the fuel pin. This improvement made it possible for the bundle deformation analysis to represent fuel pin dispersion due to deviations of the wire position. (2) The model was validated against the results of an out-of-pile compression test with wire deviation. The pin-to-duct and pin-to-pin clearances calculated with the dispersion model agreed well with the test results. It was therefore confirmed that the BAMBOO code reasonably predicts bundle deformation with dispersion. (3) In the dispersed bundle the pin-to-pin clearances scattered widely, and the minimum pin-to-duct clearance increased or decreased, depending on the dispersion condition, compared to the no-dispersion bundle. This result suggests that considerable dispersion could affect the thermal integrity of the bundle. (author)

  2. A GPU-based Monte Carlo dose calculation code for photon transport in a voxel phantom

    International Nuclear Information System (INIS)

    Bellezzo, M.; Do Nascimento, E.; Yoriyaz, H.

    2014-08-01

    As the most accurate method to estimate absorbed dose in radiotherapy, the Monte Carlo method has been widely used in radiotherapy treatment planning. Nevertheless, its efficiency can be improved for clinical routine applications. In this paper, we present the CUBMC code, a GPU-based MC photon transport algorithm for dose calculation under the Compute Unified Device Architecture (CUDA) platform. The simulation of physical events is based on the algorithm used in Penelope, and the cross-section table used is the one generated by the Material routine, also present in the Penelope code. Photons are transported in voxel-based geometries with different compositions. To demonstrate the capabilities of the algorithm developed in the present work, four 128 x 128 x 128 voxel phantoms have been considered: one composed of a homogeneous water-based medium, the second of bone, the third of lung, and the fourth of a heterogeneous bone-and-vacuum geometry. Simulations were done considering a 6 MeV monoenergetic photon point source. Two distinct approaches were used for transport simulation. The first forces the photon to stop at every voxel boundary; the second is the Woodcock method, in which whether the photon stops at a boundary is decided according to the material change along the photon's travel line. Dose calculations using these methods are compared for validation with the Penelope and MCNP5 codes. Speed-up factors are compared using an NVidia GTX 560-Ti GPU card against a 2.27 GHz Intel Xeon CPU processor. (Author)
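
    The Woodcock (delta-tracking) method mentioned above can be sketched in a few lines. This is an illustrative 1-D version with a made-up two-material cross-section profile, not the CUBMC implementation: distances are sampled with a majorant cross section, and a collision is accepted as real with probability sigma(x)/sigma_maj, so the photon never has to stop at voxel boundaries:

```python
# Illustrative 1-D Woodcock (delta) tracking sketch. Cross-section
# values are arbitrary toy numbers, not real material data.
import math
import random

def woodcock_free_path(sigma_of, sigma_maj, rng):
    """Sample the distance to a real interaction along a 1-D ray.

    Tentative steps are drawn with the majorant cross section sigma_maj;
    a collision at x is accepted ("real") with probability
    sigma_of(x) / sigma_maj, otherwise it is virtual and tracking continues.
    """
    x = 0.0
    while True:
        x += -math.log(rng()) / sigma_maj      # tentative exponential step
        if rng() < sigma_of(x) / sigma_maj:    # real vs. virtual collision
            return x

# Toy two-material "phantom": low cross section for x < 5, higher beyond.
def sigma(x):
    return 0.2 if x < 5.0 else 0.8

random.seed(0)
paths = [woodcock_free_path(sigma, 0.8, random.random) for _ in range(20000)]
print("mean distance to first interaction:", sum(paths) / len(paths))
```

When the medium is homogeneous the acceptance test always succeeds and the sampled path reduces to the usual exponential free path with mean 1/sigma.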

  3. A GPU-based Monte Carlo dose calculation code for photon transport in a voxel phantom

    Energy Technology Data Exchange (ETDEWEB)

    Bellezzo, M.; Do Nascimento, E.; Yoriyaz, H., E-mail: mbellezzo@gmail.br [Instituto de Pesquisas Energeticas e Nucleares / CNEN, Av. Lineu Prestes 2242, Cidade Universitaria, 05508-000 Sao Paulo (Brazil)

    2014-08-15

    As the most accurate method to estimate absorbed dose in radiotherapy, the Monte Carlo method has been widely used in radiotherapy treatment planning. Nevertheless, its efficiency can be improved for clinical routine applications. In this paper, we present the CUBMC code, a GPU-based MC photon transport algorithm for dose calculation under the Compute Unified Device Architecture (CUDA) platform. The simulation of physical events is based on the algorithm used in Penelope, and the cross-section table used is the one generated by the Material routine, also present in the Penelope code. Photons are transported in voxel-based geometries with different compositions. To demonstrate the capabilities of the algorithm developed in the present work, four 128 x 128 x 128 voxel phantoms have been considered: one composed of a homogeneous water-based medium, the second of bone, the third of lung, and the fourth of a heterogeneous bone-and-vacuum geometry. Simulations were done considering a 6 MeV monoenergetic photon point source. Two distinct approaches were used for transport simulation. The first forces the photon to stop at every voxel boundary; the second is the Woodcock method, in which whether the photon stops at a boundary is decided according to the material change along the photon's travel line. Dose calculations using these methods are compared for validation with the Penelope and MCNP5 codes. Speed-up factors are compared using an NVidia GTX 560-Ti GPU card against a 2.27 GHz Intel Xeon CPU processor. (Author)

  4. Validation of Heat Transfer and Film Cooling Capabilities of the 3-D RANS Code TURBO

    Science.gov (United States)

    Shyam, Vikram; Ameri, Ali; Chen, Jen-Ping

    2010-01-01

    The capabilities of the 3-D unsteady RANS code TURBO have been extended to include heat transfer and film cooling applications. The results of simulations performed with the modified code are compared to experiment and to theory, where applicable. Wilcox's k-ω turbulence model has been implemented to close the RANS equations. Two simulations are conducted: (1) flow over a flat plate and (2) flow over an adiabatic flat plate cooled by one hole inclined at 35° to the free stream. For (1), agreement with theory is excellent for heat transfer, represented by the local Nusselt number, and quite good for momentum, as represented by the local skin friction coefficient. This report compares the local skin friction coefficients and Nusselt numbers on a flat plate obtained using Wilcox's k-ω model with the theory of Blasius. The study looks at laminar and turbulent flows over an adiabatic flat plate and over an isothermal flat plate at two different wall temperatures. It is shown that TURBO is able to accurately predict heat transfer on a flat plate. For (2), TURBO shows good qualitative agreement with film cooling experiments performed on a flat plate with one cooling hole. Quantitatively, film effectiveness is underpredicted downstream of the hole.
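
    For reference, the laminar flat-plate baselines from Blasius theory used in case (1) are the textbook correlations Cf = 0.664/sqrt(Re_x) and Nu_x = 0.332 Re_x^(1/2) Pr^(1/3). A quick numerical check that they satisfy the Reynolds-Colburn analogy (the Reynolds number and Prandtl number below are example values, not conditions from the report):

```python
# Textbook laminar flat-plate correlations (Blasius theory), used as the
# theoretical baseline for code-validation comparisons of this kind.
import math

def cf_blasius(re_x):
    """Local skin-friction coefficient, laminar flat plate."""
    return 0.664 / math.sqrt(re_x)

def nu_blasius(re_x, pr):
    """Local Nusselt number, laminar flow over an isothermal flat plate."""
    return 0.332 * math.sqrt(re_x) * pr ** (1.0 / 3.0)

re_x, pr = 1.0e5, 0.71          # example air-like conditions
cf = cf_blasius(re_x)
nu = nu_blasius(re_x, pr)
# Reynolds-Colburn analogy: Nu_x = (Cf/2) * Re_x * Pr^(1/3)
print(cf, nu, (cf / 2.0) * re_x * pr ** (1.0 / 3.0))
```

The last two printed values coincide, which is the analogy the flat-plate heat-transfer and skin-friction checks exploit.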

  5. Development and Validation of a Momentum Integral Numerical Analysis Code for Liquid Metal Fast Reactor

    International Nuclear Information System (INIS)

    Chen, Xiangyi; Suh, Kune Y.

    2016-01-01

    In this work, a benchmark problem is used to assess the precision of the upgraded in-house code MINA. Results from different best-estimate codes employing various grid spacer pressure drop correlations are compared in order to identify the best correlation. With In's method modified, the prediction shows good agreement with the experimental data, as shown in Figure 7. The failure of the prediction in previous work was caused by the use of Rehme's method, which is categorised into four groups according to the fitting strategy. By comparing the drag coefficients calculated by the four groups of Rehme's method, the equivalent drag coefficient calculated by In's method, and the experimental data shown in Figure 8, we conclude that Rehme's method considerably underestimates the drag coefficients of the grid spacers used in HELIOS, while In's method gives a reasonable prediction. Starting from the core inlet, the accumulated pressure losses are presented in Figure 9 along the accumulated length of the forced-convection flow path. The good agreement of the MINA prediction with the experimental results shows that MINA has very good capability in integrated momentum analysis, which makes it robust for future design scoping method development for the LFR.

  6. Development and Validation of a Momentum Integral Numerical Analysis Code for Liquid Metal Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xiangyi; Suh, Kune Y. [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this work, a benchmark problem is used to assess the precision of the upgraded in-house code MINA. Results from different best-estimate codes employing various grid spacer pressure drop correlations are compared in order to identify the best correlation. With In's method modified, the prediction shows good agreement with the experimental data, as shown in Figure 7. The failure of the prediction in previous work was caused by the use of Rehme's method, which is categorised into four groups according to the fitting strategy. By comparing the drag coefficients calculated by the four groups of Rehme's method, the equivalent drag coefficient calculated by In's method, and the experimental data shown in Figure 8, we conclude that Rehme's method considerably underestimates the drag coefficients of the grid spacers used in HELIOS, while In's method gives a reasonable prediction. Starting from the core inlet, the accumulated pressure losses are presented in Figure 9 along the accumulated length of the forced-convection flow path. The good agreement of the MINA prediction with the experimental results shows that MINA has very good capability in integrated momentum analysis, which makes it robust for future design scoping method development for the LFR.

  7. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    Energy Technology Data Exchange (ETDEWEB)

    Santos-Villalobos, Hector J [ORNL; Gregor, Jens [University of Tennessee, Knoxville (UTK); Bingham, Philip R [ORNL

    2014-01-01

    At present, neutron sources cannot be fabricated small and powerful enough to achieve high-resolution radiography while maintaining an adequate flux. One solution is to employ computational imaging techniques such as a magnified Coded Source Imaging (CSI) system. A coded mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps around 50 µm. To overcome this challenge, the coded mask and object are magnified by making the distance from the coded mask to the object much smaller than the distance from the object to the detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of the modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model in our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.
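
    The model-based least-squares idea can be illustrated generically: fold a forward model A (here a toy matrix standing in for the coded-mask and source models) into the reconstruction and minimise ||Ax - b||^2. The plain gradient-descent solver below is a sketch under that assumption, not the ORNL reconstruction code:

```python
# Toy model-based least-squares reconstruction: recover x from b = A x by
# minimising ||A x - b||^2 with gradient descent. A is a made-up 3x3
# "system" matrix standing in for the coded-mask/source forward model.

def lstsq_gd(A, b, steps=5000, lr=0.1):
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(steps):
        # residual r = A x - b
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        # gradient g = A^T r
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = [x[j] - lr * g[j] for j in range(n)]
    return x

A = [[1.0, 0.5, 0.0],
     [0.5, 1.0, 0.5],
     [0.0, 0.5, 1.0]]
x_true = [0.0, 1.0, 0.5]
b = [sum(Ai[j] * x_true[j] for j in range(3)) for Ai in A]
x_hat = lstsq_gd(A, b)
print([round(v, 3) for v in x_hat])   # close to x_true
```

The benefit claimed in the abstract comes from making A faithful to the physical system (mask geometry, magnification, source flux distribution), not from the particular solver used here.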

  8. Highly parallel line-based image coding for many cores.

    Science.gov (United States)

    Peng, Xiulian; Xu, Jizheng; Zhou, You; Wu, Feng

    2012-01-01

    Computers are developing along a new trend, from dual-core and quad-core processors to ones with tens or even hundreds of cores. Multimedia, as one of the most important applications on computers, has an urgent need for parallel coding algorithms for compression. Taking intraframe/image coding as a starting point, this paper proposes a pure line-by-line coding scheme (LBLC) to meet this need. In LBLC, an input image is processed line by line sequentially, and each line is divided into small fixed-length segments. The compression of all segments, from prediction to entropy coding, is completely independent and concurrent across many cores. Results on a general-purpose computer show that our scheme achieves a 13.9 times speedup with 15 cores at the encoder and a 10.3 times speedup at the decoder. Ideally, such near-linear scaling with the number of cores can be maintained for more than 100 cores. In addition to the high parallelism, the proposed scheme performs comparably to, or even better than, the H.264 high profile above middle bit rates. At near-lossless coding, it outperforms H.264 by more than 10 dB. At lossless coding, up to 14% bit-rate reduction is observed compared with H.264 lossless coding at the High 4:4:4 profile.

  9. Short-Block Protograph-Based LDPC Codes

    Science.gov (United States)

    Divsalar, Dariush; Dolinar, Samuel; Jones, Christopher

    2010-01-01

    Short-block low-density parity-check (LDPC) codes of a special type are intended to be especially well suited for potential applications that include transmission of command and control data, cellular telephony, data communications in wireless local area networks, and satellite data communications. [In general, LDPC codes belong to a class of error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels.] The codes of the present special type exhibit low error floors, low bit and frame error rates, and low latency (in comparison with related prior codes). These codes also achieve a low maximum rate of undetected errors over all signal-to-noise ratios, without requiring the use of cyclic redundancy checks, which would significantly increase the overhead for short blocks. These codes have protograph representations; this is advantageous in that, for reasons that exceed the scope of this article, the applicability of protograph representations makes it possible to design high-speed iterative decoders that utilize belief-propagation algorithms.
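
    A protograph representation is turned into a full parity-check matrix by "lifting": each 1 in the small base matrix is replaced by a Z x Z circulant permutation and each 0 by a Z x Z zero block. The base matrix and shift values below are arbitrary toy choices, not one of the designed codes:

```python
# Hedged sketch of protograph lifting with circulant permutations.
# Base matrix and shifts are toy values for illustration only.

def circulant(z, shift):
    """Z x Z permutation matrix: row r has its 1 at column (r + shift) % z."""
    return [[1 if (r + shift) % z == c else 0 for c in range(z)]
            for r in range(z)]

def lift(base, shifts, z):
    """Expand a 0/1 base (protograph) matrix into a binary parity-check matrix."""
    rows = []
    for bi, brow in enumerate(base):
        blocks = [circulant(z, shifts[bi][bj]) if v
                  else [[0] * z for _ in range(z)]
                  for bj, v in enumerate(brow)]
        for r in range(z):
            rows.append([x for blk in blocks for x in blk[r]])
    return rows

base = [[1, 1, 0],
        [0, 1, 1]]
shifts = [[0, 1, 0],
          [0, 2, 1]]
H = lift(base, shifts, z=3)
print(len(H), "x", len(H[0]), "parity-check matrix")
```

Lifting preserves the row and column weights of the protograph, which is why threshold analysis done on the small base matrix carries over to the full code.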

  10. Retrieval-based Face Annotation by Weak Label Regularized Local Coordinate Coding.

    Science.gov (United States)

    Wang, Dayong; Hoi, Steven C H; He, Ying; Zhu, Jianke; Mei, Tao; Luo, Jiebo

    2013-08-02

    Retrieval-based face annotation is a promising paradigm of mining massive web facial images for automated face annotation. This paper addresses a critical problem of such paradigm, i.e., how to effectively perform annotation by exploiting the similar facial images and their weak labels which are often noisy and incomplete. In particular, we propose an effective Weak Label Regularized Local Coordinate Coding (WLRLCC) technique, which exploits the principle of local coordinate coding in learning sparse features, and employs the idea of graph-based weak label regularization to enhance the weak labels of the similar facial images. We present an efficient optimization algorithm to solve the WLRLCC task. We conduct extensive empirical studies on two large-scale web facial image databases: (i) a Western celebrity database with a total of 6,025 persons and 714,454 web facial images, and (ii) an Asian celebrity database with 1,200 persons and 126,070 web facial images. The encouraging results validate the efficacy of the proposed WLRLCC algorithm. To further improve the efficiency and scalability, we also propose a PCA-based approximation scheme and an offline approximation scheme (AWLRLCC), which generally maintains comparable results but significantly saves much time cost. Finally, we show that WLRLCC can also tackle two existing face annotation tasks with promising performance.

  11. Calibration and Validation of the Dynamic Wake Meandering Model for Implementation in an Aeroelastic Code

    DEFF Research Database (Denmark)

    Aagaard Madsen, Helge; Larsen, Gunner Chr.; Larsen, Torben J.

    2010-01-01

    in an aeroelastic model. Calibration and validation of the different parts of the model is carried out by comparisons with actuator disk and actuator line (ACL) computations as well as with inflow measurements on a full-scale 2 MW turbine. It is shown that the load generating part of the increased turbulence....... Finally, added turbulence characteristics are compared with correlation results from literature. ©2010 American Society of Mechanical Engineers...

  12. Validation of main nuclear libraries used in thorium reactors using the Serpent code

    International Nuclear Information System (INIS)

    Faga, Lucas J.

    2017-01-01

    The purpose of this work is to validate the Serpent standard database library for systems containing U-233, U-235, Th-232, Pu-239 and Pu-240. The project supports the other projects of the newly created study group of the Nuclear Engineering Center (CEN) of the Instituto de Pesquisas Energéticas e Nucleares (IPEN), devoted to the study of several types of reactors and their application to thorium cycles, a subject that is gaining more and more visibility due to its strong potential for an energy revolution. The results obtained at the end of the simulations were satisfactory, with the effective multiplication factors within about 100 pcm of the values provided by the benchmarks, as expected for a validated library. The minimum deviation from these values was 2 pcm and the maximum 280 pcm. The final analysis demonstrates that the ENDF/B-VII library has validated nuclear data for the isotopes of interest and may be used in future thorium study group projects
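
    Deviations of this kind are conventionally quoted in pcm (per cent mille, 1 pcm = 1e-5 in k-eff). A minimal helper under that common convention, with made-up k-eff values for illustration:

```python
# Hedged sketch: expressing a k-eff comparison in pcm, taking the common
# convention that the difference in k is scaled by 1e5. Example values
# below are made up, not the benchmark results from the abstract.

def diff_pcm(k_calc, k_benchmark):
    """Difference between calculated and benchmark k-eff, in pcm."""
    return (k_calc - k_benchmark) * 1.0e5

print(diff_pcm(1.00280, 1.00000))  # ~280 pcm
```

A deviation within about 100 pcm, as reported above, thus corresponds to agreement in k-eff to roughly the third decimal place.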

  13. Policy and Validity Prospects for Performance-Based Assessment.

    Science.gov (United States)

    Baker, Eva L.; And Others

    1994-01-01

    This article describes performance-based assessment as expounded by its proponents, comments on these conceptions, reviews evidence regarding the technical quality of performance-based assessment, and considers its validity under various policy options. (JDD)

  14. Supervised Learning Based on Temporal Coding in Spiking Neural Networks.

    Science.gov (United States)

    Mostafa, Hesham

    2017-08-01

    Gradient descent training techniques are remarkably successful in training analog-valued artificial neural networks (ANNs). Such training techniques, however, do not transfer easily to spiking networks due to the spike generation hard nonlinearity and the discrete nature of spike communication. We show that in a feedforward spiking network that uses a temporal coding scheme where information is encoded in spike times instead of spike rates, the network input-output relation is differentiable almost everywhere. Moreover, this relation is piecewise linear after a transformation of variables. Methods for training ANNs thus carry directly to the training of such spiking networks as we show when training on the permutation invariant MNIST task. In contrast to rate-based spiking networks that are often used to approximate the behavior of ANNs, the networks we present spike much more sparsely and their behavior cannot be directly approximated by conventional ANNs. Our results highlight a new approach for controlling the behavior of spiking networks with realistic temporal dynamics, opening up the potential for using these networks to process spike patterns with complex temporal information.
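
    The notion of temporal (latency) coding, information carried by when a neuron spikes rather than how often, can be illustrated with a deliberately simple encoder in which larger values produce earlier spikes. This linear toy scheme is an assumption for illustration only, not the paper's network:

```python
# Toy latency (temporal) coding sketch: each value in [0, 1] becomes a
# single spike whose time is earlier the larger the value. Illustrative
# only; the paper's spiking network uses a learned input-output relation.

T_MAX = 10.0  # encoding window (arbitrary time units)

def encode_latency(values):
    """Map values in [0, 1] to spike times: larger value -> earlier spike."""
    return [T_MAX * (1.0 - v) for v in values]

def decode_latency(times):
    """Invert the encoding to recover the original values."""
    return [1.0 - t / T_MAX for t in times]

x = [0.0, 0.25, 1.0]
spikes = encode_latency(x)
print(spikes)  # [10.0, 7.5, 0.0]
```

Each input emits exactly one spike per encoding window, which is why such temporally coded networks can be far sparser than rate-coded ones.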

  15. Smoothing-Based Relative Navigation and Coded Aperture Imaging

    Science.gov (United States)

    Saenz-Otero, Alvar; Liebe, Carl Christian; Hunter, Roger C.; Baker, Christopher

    2017-01-01

    This project will develop efficient smoothing software for incremental estimation of the relative poses and velocities between multiple small spacecraft in a formation, and a small, long-range depth sensor based on coded aperture imaging that is capable of identifying other spacecraft in the formation. The smoothing algorithm will obtain the maximum a posteriori estimate of the relative poses between the spacecraft by using all available sensor information in the spacecraft formation. This algorithm will be portable between satellite platforms that possess different sensor suites and computational capabilities, and will be adaptable in case one or more satellites in the formation become inoperable. It will obtain a solution that approaches the exact solution, as opposed to one with the linearization approximations typical of filtering algorithms. Thus, the algorithms developed and demonstrated as part of this program will enhance the applicability of small spacecraft to multi-platform operations, such as precisely aligned constellations and fractionated satellite systems.

  16. Use of Sensitivity and Uncertainty Analysis to Select Benchmark Experiments for the Validation of Computer Codes and Data

    International Nuclear Information System (INIS)

    Elam, K.R.; Rearden, B.T.

    2003-01-01

    Sensitivity and uncertainty analysis methodologies under development at Oak Ridge National Laboratory were applied to determine whether existing benchmark experiments adequately cover the area of applicability for the criticality code and data validation of PuO2 and mixed-oxide (MOX) powder systems. The study examined three PuO2 powder systems and four MOX powder systems that would be useful for establishing mass limits for a MOX fuel fabrication facility. Using traditional methods to choose experiments for criticality analysis validation, 46 benchmark critical experiments were identified as applicable to the PuO2 powder systems. However, only 14 experiments were thought to be within the area of applicability for dry MOX powder systems. The applicability of 318 benchmark critical experiments, including the 60 experiments initially identified, was assessed. Each benchmark and powder system was analyzed using the Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) one-dimensional (TSUNAMI-1D) or three-dimensional (TSUNAMI-3D) sensitivity analysis sequences, which will be included in the next release of the SCALE code system. These sensitivity data and cross-section uncertainty data were then processed with TSUNAMI-IP to determine the correlation of each application to