WorldWideScience

Sample records for model validation experiments

  1. Validation of dispersion model of RTARC-DSS based on 'KIT' field experiments

    International Nuclear Information System (INIS)

    Duran, J.

    2000-01-01

    The aim of this study is to present the performance of the Gaussian dispersion model RTARC-DSS (Real Time Accident Release Consequences - Decision Support System) against the 'Kit' field experiments. The Model Validation Kit is a collection of three experimental data sets - Kincaid, Copenhagen and Lillestrom - plus the supplementary Indianapolis campaign, accompanied by software for model evaluation. The validation of the model has been performed on the basis of the maximum arc-wise concentrations, using the bootstrap resampling procedure to estimate the variation of the model residuals. Validation was performed for short-range distances (about 1-10 km; up to 50 km from the source for the Kincaid data set). The model evaluation procedure and the degree of relative over- or under-prediction are discussed. (author)
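
    A bootstrap evaluation of paired arc-maximum concentrations can be sketched as follows; the data values and the fractional-bias metric are illustrative assumptions, not the study's actual numbers or code.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical paired data: observed and modelled arc-wise maximum
    # concentrations (arbitrary units) at a set of sampling arcs.
    c_obs = np.array([12.1, 8.4, 5.2, 3.9, 2.7, 1.8, 1.2])
    c_mod = np.array([10.3, 9.1, 4.6, 4.4, 2.2, 2.0, 0.9])

    def fractional_bias(obs, mod):
        """FB = 2*(mean_obs - mean_mod)/(mean_obs + mean_mod); 0 means no bias."""
        return 2.0 * (obs.mean() - mod.mean()) / (obs.mean() + mod.mean())

    # Bootstrap: resample arcs with replacement and recompute the metric.
    n_boot = 5000
    fb_samples = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, len(c_obs), len(c_obs))
        fb_samples[i] = fractional_bias(c_obs[idx], c_mod[idx])

    lo, hi = np.percentile(fb_samples, [2.5, 97.5])
    print(f"FB = {fractional_bias(c_obs, c_mod):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
    ```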

  2. Validation of ASTEC v2.0 corium jet fragmentation model using FARO experiments

    International Nuclear Information System (INIS)

    Hermsmeyer, S.; Pla, P.; Sangiorgi, M.

    2015-01-01

    Highlights: • Model validation base extended to six FARO experiments. • Focus on the calculation of the fragmented particle diameter. • Capability and limits of the ASTEC fragmentation model. • Sensitivity analysis of model outputs. - Abstract: ASTEC is an integral code for the prediction of severe accidents in nuclear power plants. As such, it needs to cover all physical processes that could occur during accident progression, yet keep its models simple enough for the ensemble to stay manageable and produce results within an acceptable time. The present paper is concerned with the validation of the corium jet fragmentation model of ASTEC v2.0 rev3 by means of a selection of six experiments carried out in the FARO facility. The different conditions applied in these six experiments help to analyse the model behaviour in different situations and to expose model limits. In addition to comparing model outputs with experimental measurements, sensitivity analyses are applied to investigate the model. The results of the paper are (i) validation runs, accompanied by an identification of situations where the implemented fragmentation model does not match the experiments well, and a discussion of the results; (ii) special attention to the models calculating the diameter of fragmented particles, the identification of a fault in one of the implemented models, and a discussion of simplifications and ad hoc modifications to improve the model fit; and (iii) an investigation of the sensitivity of predictions towards inputs and parameters. In this way, the paper offers a thorough investigation of the merits and limitations of the fragmentation model used in ASTEC.

  3. Discrete fracture modelling for the Stripa tracer validation experiment predictions

    International Nuclear Information System (INIS)

    Dershowitz, W.; Wallmann, P.

    1992-02-01

    Groundwater flow and transport through three-dimensional networks of discrete fractures was modelled to predict the recovery of tracer from tracer injection experiments conducted during phase 3 of the Stripa site characterization and validation project. Predictions were made on the basis of an updated version of the site-scale discrete fracture conceptual model used for flow predictions and preliminary transport modelling. In this model, individual fractures were treated as stochastic features described by probability distributions of geometric and hydrologic properties. Fractures were divided into three populations: fractures in fracture zones near the drift, non-fracture-zone fractures within 31 m of the drift, and fractures in fracture zones over 31 m from the drift axis. Fractures outside fracture zones were not modelled beyond 31 m from the drift axis. Transport predictions were produced using the FracMan discrete fracture modelling package for each of five tracer experiments. Output was produced in the seven formats specified by the Stripa task force on fracture flow modelling. (au)

  4. The difference between traditional experiments and CFD validation benchmark experiments

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Barton L., E-mail: barton.smith@usu.edu

    2017-02-15

    Computational Fluid Dynamics (CFD) provides attractive features for design, and perhaps licensing, of nuclear power plants. The most important of these features is low cost compared to experiments. However, uncertainty estimates for CFD calculations must accompany these calculations in order for the results to be useful for important decision making. In order to properly assess the uncertainty of a CFD calculation, it must be “validated” against experimental data. Unfortunately, traditional “discovery” experiments are normally ill-suited to provide all of the information necessary for the validation exercise. Traditionally, experiments are performed to discover new physics, determine model parameters, or to test designs. This article will describe a new type of experiment; one that is designed and carried out with the specific purpose of providing CFD validation benchmark data. We will demonstrate that the goals of traditional experiments and validation experiments are often in conflict, making use of traditional experimental results problematic and leading directly to larger predictive uncertainty of the CFD model.

  5. The difference between traditional experiments and CFD validation benchmark experiments

    International Nuclear Information System (INIS)

    Smith, Barton L.

    2017-01-01

    Computational Fluid Dynamics (CFD) provides attractive features for design, and perhaps licensing, of nuclear power plants. The most important of these features is low cost compared to experiments. However, uncertainty estimates for CFD calculations must accompany these calculations in order for the results to be useful for important decision making. In order to properly assess the uncertainty of a CFD calculation, it must be “validated” against experimental data. Unfortunately, traditional “discovery” experiments are normally ill-suited to provide all of the information necessary for the validation exercise. Traditionally, experiments are performed to discover new physics, determine model parameters, or to test designs. This article will describe a new type of experiment; one that is designed and carried out with the specific purpose of providing CFD validation benchmark data. We will demonstrate that the goals of traditional experiments and validation experiments are often in conflict, making use of traditional experimental results problematic and leading directly to larger predictive uncertainty of the CFD model.

  6. Design of an intermediate-scale experiment to validate unsaturated- zone transport models

    International Nuclear Information System (INIS)

    Siegel, M.D.; Hopkins, P.L.; Glass, R.J.; Ward, D.B.

    1991-01-01

    An intermediate-scale experiment is being carried out to evaluate instrumentation and models that might be used for transport-model validation for the Yucca Mountain Site Characterization Project. The experimental test bed is a 6-m high x 3-m diameter caisson filled with quartz sand with a sorbing layer at an intermediate depth. The experiment involves the detection and prediction of the migration of fluid and tracers through an unsaturated porous medium. Pre-test design requires estimation of physical properties of the porous medium such as the relative permeability, saturation/pressure relations, porosity, and saturated hydraulic conductivity, as well as geochemical properties such as surface complexation constants and empirical Kd values. The pre-test characterization data will be used as input to several computer codes to predict the fluid flow and tracer migration. These include a coupled chemical-reaction/transport model, a stochastic model, and a deterministic model using retardation factors. The calculations will be completed prior to elution of the tracers, providing a basis for validation by comparing the predictions to observed moisture and tracer behavior.
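
    For the deterministic model, the retardation factor follows from the bulk density, moisture content and distribution coefficient as R = 1 + (rho_b/theta)*Kd; a minimal worked sketch with assumed (not measured) caisson properties:

    ```python
    # Retardation factor for the deterministic transport model:
    # R = 1 + (rho_b / theta) * Kd, where rho_b is bulk density (g/cm^3),
    # theta is volumetric water content (-), and Kd is in mL/g.
    # Values below are illustrative, not the characterized caisson properties.
    rho_b = 1.6    # g/cm^3, bulk density of the quartz sand
    theta = 0.30   # volumetric moisture content
    for tracer, kd in [("Br (conservative)", 0.0), ("Li", 0.5), ("Ni", 5.0)]:
        R = 1.0 + (rho_b / theta) * kd
        print(f"{tracer:18s} Kd = {kd:4.1f} mL/g -> R = {R:5.1f}")
    ```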

  7. Ensemble of cell survival experiments after ion irradiation for validation of RBE models

    Energy Technology Data Exchange (ETDEWEB)

    Friedrich, Thomas; Scholz, Uwe; Scholz, Michael [GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Durante, Marco [GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Institut fuer Festkoerperphysik, TU Darmstadt, Darmstadt (Germany)

    2012-07-01

    There is persistent interest in understanding the systematics of the relative biological effectiveness (RBE). Models such as the Local Effect Model (LEM) or the Microdosimetric Kinetic Model aim to predict the RBE. For the validation of these models a collection of many in-vitro cell survival experiments is most appropriate. The set-up of an ensemble of in-vitro cell survival data comprising about 850 survival experiments after both ion and photon irradiation is reported. The survival curves have been taken from publications. The experiments encompass survival curves obtained in different labs, using different ion species from protons to uranium, varying irradiation modalities (shaped or monoenergetic beam), various energies and linear energy transfers, and a whole variety of cell types (human or rodent; normal, mutagenic or tumor; radioresistant or -sensitive). Each cell survival curve has been parameterized by the linear-quadratic model. The photon parameters have been added to the database to allow calculation of the experimental RBE at any survival level. We report on experimental trends found within the data ensemble. The data will serve as a testing ground for RBE models such as the LEM. Finally, a roadmap for further validation and first model results using the database in combination with the LEM are presented.
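
    With linear-quadratic parameters for an ion and for the reference photon radiation, the RBE at a given survival level is the ratio of the photon and ion doses producing that survival; a sketch with placeholder parameter values, not entries from the ensemble:

    ```python
    import numpy as np

    def dose_at_survival(alpha, beta, survival):
        """Invert S = exp(-(alpha*D + beta*D^2)) for the dose D (Gy)."""
        ln_s = -np.log(survival)
        if beta == 0.0:
            return ln_s / alpha
        return (-alpha + np.sqrt(alpha**2 + 4.0 * beta * ln_s)) / (2.0 * beta)

    # Illustrative LQ parameters (Gy^-1, Gy^-2), not values from the database.
    alpha_ion, beta_ion = 1.10, 0.05   # e.g. a carbon-ion exposure
    alpha_ph,  beta_ph  = 0.20, 0.03   # reference photon irradiation

    for s in (0.10, 0.01):
        rbe = (dose_at_survival(alpha_ph, beta_ph, s)
               / dose_at_survival(alpha_ion, beta_ion, s))
        print(f"RBE at {s:.0%} survival: {rbe:.2f}")
    ```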

  8. Design of experiments in medical physics: Application to the AAA beam model validation.

    Science.gov (United States)

    Dufreneix, S; Legrand, C; Di Bartolo, C; Bremaud, M; Mesgouez, J; Tiplica, T; Autret, D

    2017-09-01

    The purpose of this study is to evaluate the usefulness of the design of experiments in the analysis of multiparametric problems related to quality assurance in radiotherapy. The main motivation is to use this statistical method to optimize the quality assurance processes in the validation of beam models. Considering the Varian Eclipse system, eight parameters with several levels were selected: energy, MLC, depth, X, Y1 and Y2 jaw dimensions, wedge and wedge jaw. A Taguchi table was used to define 72 validation tests. Measurements were conducted in water using a CC04 on a TrueBeam STx, a TrueBeam Tx, a Trilogy and a 2300IX accelerator matched by the vendor. Dose was computed using the AAA algorithm. The same raw data were used for all accelerators during the beam modelling. The mean difference between computed and measured doses was 0.1±0.5% for all beams and all accelerators, with a maximum difference of 2.4% (under the 3% tolerance level). For all beams, the measured doses were within 0.6% for all accelerators. The energy was found to be an influencing parameter, but the deviations observed were smaller than 1% and not considered clinically significant. Design of experiments can help define the optimal measurement set to validate a beam model. The proposed method can be used to identify the prognostic factors of dose accuracy. The beam models were validated for the four accelerators, which were found dosimetrically equivalent even though the accelerator characteristics differ. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
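
    The analysis behind such a designed experiment reduces to estimating main effects, i.e. the mean response per factor level over the orthogonal array; a toy sketch with two assumed factors and invented dose differences, not the study's 72-test table:

    ```python
    import numpy as np

    # Toy fraction of a designed experiment: each row is one validation test,
    # the response is the computed-minus-measured dose difference in percent.
    energy = np.array(["6MV", "6MV", "10MV", "10MV", "6MV", "10MV", "6MV", "10MV"])
    depth  = np.array([5, 10, 5, 10, 5, 10, 10, 5])                # cm
    diff   = np.array([0.2, -0.1, 0.6, 0.4, 0.1, 0.5, -0.2, 0.7])  # %

    def main_effect(factor, response):
        """Mean response per factor level: the basic Taguchi main-effect table."""
        return {lvl: round(response[factor == lvl].mean(), 3)
                for lvl in np.unique(factor)}

    print("energy:", main_effect(energy, diff))
    print("depth :", main_effect(depth, diff))
    ```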

  9. Characterization of materials for a reactive transport model validation experiment: Interim report on the caisson experiment. Yucca Mountain Site Characterization Project

    International Nuclear Information System (INIS)

    Siegel, M.D.; Cheng, W.C.; Ward, D.B.; Bryan, C.R.

    1995-08-01

    Models used in performance assessment and site characterization activities related to nuclear waste disposal rely on simplified representations of solute/rock interactions, the hydrologic flow field and the material properties of the rock layers surrounding the repository. A crucial element in the design of these models is the validity of these simplifying assumptions. An intermediate-scale experiment is being carried out at the Experimental Engineered Test Facility at Los Alamos National Laboratory by the Los Alamos and Sandia National Laboratories to develop a strategy to validate key geochemical and hydrological assumptions in performance assessment models used by the Yucca Mountain Site Characterization Project.

  10. Characterization of materials for a reactive transport model validation experiment: Interim report on the caisson experiment. Yucca Mountain Site Characterization Project

    Energy Technology Data Exchange (ETDEWEB)

    Siegel, M.D.; Cheng, W.C. [Sandia National Labs., Albuquerque, NM (United States); Ward, D.B.; Bryan, C.R. [Univ. of New Mexico, Albuquerque, NM (United States). Dept. of Earth and Planetary Sciences

    1995-08-01

    Models used in performance assessment and site characterization activities related to nuclear waste disposal rely on simplified representations of solute/rock interactions, the hydrologic flow field and the material properties of the rock layers surrounding the repository. A crucial element in the design of these models is the validity of these simplifying assumptions. An intermediate-scale experiment is being carried out at the Experimental Engineered Test Facility at Los Alamos National Laboratory by the Los Alamos and Sandia National Laboratories to develop a strategy to validate key geochemical and hydrological assumptions in performance assessment models used by the Yucca Mountain Site Characterization Project.

  11. Validation of mechanistic models for gas precipitation in solids during postirradiation annealing experiments

    Science.gov (United States)

    Rest, J.

    1989-12-01

    A number of different phenomenological models for gas precipitation in solids during postirradiation annealing experiments have been proposed. Validation of such mechanistic models for gas release and swelling is complicated by the use of data containing large systematic errors, and phenomena characterized by synergistic effects as well as uncertainties in materials properties. Statistical regression analysis is recommended for the selection of a reasonably well characterized data base for gas release from irradiated fuel under transient heating conditions. It is demonstrated that an appropriate data selection method is required in order to realistically examine the impact of differing descriptions of the phenomena, and uncertainties in selected materials properties, on the validation results. The results of the analysis show that the kinetics of gas precipitation in solids depend on bubble overpressurization effects and need to be accounted for during the heatup phase of isothermal heating experiments. It is shown that if only the total gas release values (as opposed to time-dependent data) were available, differentiation between different gas precipitation models would be ambiguous. The observed sustained increase in the fractional release curve at relatively high temperatures after the total precipitation of intragranular gas in fission gas bubbles is ascribed to the effects of a grain-growth/grain-boundary sweeping mechanism.

  12. Validation of mechanistic models for gas precipitation in solids during postirradiation annealing experiments

    International Nuclear Information System (INIS)

    Rest, J.

    1989-01-01

    A number of different phenomenological models for gas precipitation in solids during postirradiation annealing experiments have been proposed. Validation of such mechanistic models for gas release and swelling is complicated by the use of data containing large systematic errors, and phenomena characterized by synergistic effects as well as uncertainties in materials properties. Statistical regression analysis is recommended for the selection of a reasonably well characterized data base for gas release from irradiated fuel under transient heating conditions. It is demonstrated that an appropriate data selection method is required in order to realistically examine the impact of differing descriptions of the phenomena, and uncertainties in selected materials properties, on the validation results. The results of the analysis show that the kinetics of gas precipitation in solids depend on bubble overpressurization effects and need to be accounted for during the heatup phase of isothermal heating experiments. It is shown that if only the total gas release values (as opposed to time-dependent data) were available, differentiation between different gas precipitation models would be ambiguous. The observed sustained increase in the fractional release curve at relatively high temperatures after the total precipitation of intragranular gas in fission gas bubbles is ascribed to the effects of a grain-growth/grain-boundary sweeping mechanism. (orig.)

  13. Optimal Design and Model Validation for Combustion Experiments in a Shock Tube

    KAUST Repository

    Long, Quan

    2014-01-06

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments which are being carried out at the KAUST Clean Combustion Center. The unknown parameters are the pre-exponential parameters and the activation energies in the reaction rate functions. The control parameters are the initial hydrogen concentration and the temperature. First, we build a polynomial-based surrogate model for the observable related to the reactions in the shock tube. Second, we use a novel MAP-based approach to estimate the expected information gain in the proposed experiments and select the best experimental set-ups corresponding to the optimal expected information gains. Third, we use synthetic data to carry out virtual validation of our methodology.
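
    The quantity being optimized can be illustrated with a plain nested Monte Carlo estimator of the expected information gain, a simpler stand-in for the MAP-based estimator of the paper; the surrogate, prior and noise level below are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def surrogate(theta, xi):
        """Toy polynomial surrogate: observable as a function of the unknown
        rate parameter theta and the control (e.g. temperature) xi."""
        return theta * xi + 0.1 * theta * xi**2

    def expected_information_gain(xi, n_outer=200, n_inner=200, sigma=0.05):
        """Nested Monte Carlo estimate of the EIG for design xi; Gaussian
        normalization constants cancel in the likelihood/evidence ratio."""
        thetas = rng.normal(1.0, 0.2, n_outer)              # prior samples
        eig = 0.0
        for th in thetas:
            y = surrogate(th, xi) + rng.normal(0.0, sigma)  # synthetic datum
            like = np.exp(-0.5 * ((y - surrogate(th, xi)) / sigma) ** 2)
            th_in = rng.normal(1.0, 0.2, n_inner)
            evid = np.mean(np.exp(-0.5 * ((y - surrogate(th_in, xi)) / sigma) ** 2))
            eig += np.log(like / evid)
        return eig / n_outer

    for xi in (0.5, 1.0, 2.0):
        print(f"design xi = {xi}: EIG ~ {expected_information_gain(xi):.2f}")
    ```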

  14. Preliminary characterization of materials for a reactive transport model validation experiment

    International Nuclear Information System (INIS)

    Siegel, M.D.; Ward, D.B.; Cheng, W.C.; Bryant, C.; Chocas, C.S.; Reynolds, C.G.

    1993-01-01

    The geochemical properties of a porous sand and several tracers (Ni, Br, and Li) have been characterized for use in a caisson experiment designed to validate sorption models used in models of reactive transport. The surfaces of the sand grains have been examined by a combination of techniques including potentiometric titration, acid leaching, optical microscopy, and scanning electron microscopy with energy-dispersive spectroscopy. The surface studies indicate the presence of small amounts of carbonate, kaolinite and iron oxyhydroxides. Adsorption of nickel, lithium and bromide by the sand was measured using batch techniques. Bromide was not sorbed by the sand. A linear (Kd) or an isotherm sorption model may adequately describe transport of Li; however, a model describing the changes of pH and the concentrations of other solution species as a function of time and position within the caisson, and the concomitant effects on Ni sorption, may be required for accurate predictions of nickel transport.
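
    Fitting the two candidate sorption models to batch data is straightforward; the sketch below fits a linear Kd and a Freundlich isotherm to invented equilibrium data, not the measured values:

    ```python
    import numpy as np

    # Hypothetical batch sorption data: equilibrium aqueous concentration
    # c (mg/L) and sorbed concentration s (mg/kg).
    c = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
    s = np.array([0.9, 1.7, 3.1, 6.8, 12.0])

    # Linear model s = Kd * c: least-squares slope through the origin.
    kd = (c @ s) / (c @ c)

    # Freundlich model s = Kf * c**n, fitted as a line in log-log space.
    n, log_kf = np.polyfit(np.log(c), np.log(s), 1)

    print(f"linear    : Kd = {kd:.2f} L/kg")
    print(f"Freundlich: Kf = {np.exp(log_kf):.2f}, n = {n:.2f}")
    ```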

  15. Validation of a CFD Analysis Model for Predicting CANDU-6 Moderator Temperature Against SPEL Experiments

    International Nuclear Information System (INIS)

    Churl Yoon; Bo Wook Rhee; Byung-Joo Min

    2002-01-01

    A validation of a 3D CFD model for predicting local subcooling of the moderator in the vicinity of calandria tubes in a CANDU-6 reactor is performed. The small-scale moderator experiments performed at Sheridan Park Experimental Laboratory (SPEL) in Ontario, Canada [1] are used for the validation. Also a comparison is made between previous CFD analyses based on 2DMOTH and PHOENICS, and the current analysis for the same SPEL experiment. For the current model, a set of grid structures for the same geometry as the experimental test section is generated and the momentum, heat and continuity equations are solved by CFX-4.3, a CFD code developed by AEA Technology. The matrix of calandria tubes is simplified by the porous media approach. The standard k-ε turbulence model associated with logarithmic wall treatment and the SIMPLEC algorithm on the body-fitted grid are used. Buoyancy effects are accounted for by the Boussinesq approximation. For the test conditions simulated in this study, the flow pattern identified is the buoyancy-dominated flow, which is generated by the interaction between the dominant buoyancy force by heating and inertial momentum forces by the inlet jets. As a result, the current CFD moderator analysis model predicts the moderator temperature reasonably, and the maximum error against the experimental data is kept at less than 2.0 deg. C over the whole domain. The simulated velocity field matches the visualization of the SPEL experiments quite well. (authors)

  16. Validation of models with multivariate output

    International Nuclear Information System (INIS)

    Rebba, Ramesh; Mahadevan, Sankaran

    2006-01-01

    This paper develops metrics for validating computational models with experimental data, considering uncertainties in both. A computational model may generate multiple response quantities and the validation experiment might yield corresponding measured values. Alternatively, a single response quantity may be predicted and observed at different spatial and temporal points. Model validation in such cases involves comparison of multiple correlated quantities. Multiple univariate comparisons may give conflicting inferences. Therefore, aggregate validation metrics are developed in this paper. Both classical and Bayesian hypothesis testing are investigated for this purpose, using multivariate analysis. Since commonly used statistical significance tests are based on normality assumptions, appropriate transformations are investigated in the case of non-normal data. The methodology is implemented to validate an empirical model for energy dissipation in lap joints under dynamic loading.
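
    One simple aggregate metric in this spirit is the squared Mahalanobis distance between measured and predicted response vectors, referred to a chi-square distribution in a classical test; the vectors and covariance below are illustrative assumptions, not the lap-joint data:

    ```python
    import numpy as np
    from scipy import stats

    # Model prediction and measurement of m correlated response quantities,
    # with an (assumed known) covariance of the measurement errors.
    y_pred = np.array([1.00, 2.50, 0.80])
    y_meas = np.array([1.10, 2.35, 0.95])
    cov = np.array([[0.010, 0.002, 0.000],
                    [0.002, 0.020, 0.001],
                    [0.000, 0.001, 0.015]])

    r = y_meas - y_pred
    d2 = r @ np.linalg.solve(cov, r)    # squared Mahalanobis distance
    p = stats.chi2.sf(d2, df=len(r))    # classical aggregate significance test

    print(f"d^2 = {d2:.2f}, p-value = {p:.3f}")
    print("model accepted" if p > 0.05 else "model rejected")
    ```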

  17. Validation experiments of the chimney model for the operational simulation of hydrogen recombiners

    International Nuclear Information System (INIS)

    Simon, Berno

    2013-01-01

    The calculation program REKO-DIREKT allows the simulation of the operational behavior of a hydrogen recombiner during accidents with hydrogen release. The interest is focused on the interaction between the catalyst insert and the chimney, which significantly influences the natural draft and thus the throughput through the recombiner. For validation, experiments were performed with a small-scale recombiner model in the test facility REKO-4. The results show the correlation between the hydrogen concentration at the recombiner entrance, the temperature on the catalyst sheets and the entrance velocity for different chimney heights. The entrance velocity increases with the height of the installed chimney, which significantly influences the natural draft. The results allow the generation of a broad database for the validation of the computer code REKO-DIREKT.

  18. An enhanced fire hazard assessment model and validation experiments for vertical cable trays

    International Nuclear Information System (INIS)

    Li, Lu; Huang, Xianjia; Bi, Kun; Liu, Xiaoshuang

    2016-01-01

    Highlights: • An enhanced model was developed for vertical cable fire hazard assessment in NPPs. • Validation experiments on vertical cable tray fires were conducted. • The capability of the model for cable trays with different cable spacing was tested. - Abstract: The model referred to as FLASH-CAT (Flame Spread over Horizontal Cable Trays) was developed to estimate the heat release rate of vertical cable tray fires. The focus of this work is to investigate the application of an enhanced model to single vertical cable tray fires with different cable spacing. Experiments on vertical cable tray fires with three typical cable spacings were conducted. The histories of mass loss rate and flame length were recorded during the cable fires. From the experimental results, it is found that the space between cable lines intensifies the cable combustion and accelerates the flame spread. The predictions of the enhanced model show good agreement with the experimental data. At the same time, it is shown that the enhanced model is capable of predicting the different behaviors of cable fires with different cable spacing by adjusting the flame spread speed only.

  19. An enhanced fire hazard assessment model and validation experiments for vertical cable trays

    Energy Technology Data Exchange (ETDEWEB)

    Li, Lu [State Key Laboratory of Fire Science, University of Science and Technology of China, Hefei 230027 (China); Huang, Xianjia, E-mail: huangxianjia@gziit.ac.cn [Joint Laboratory of Fire Safety in Nuclear Power Plants, Institute of Industry Technology Guangzhou & Chinese Academy of Sciences, Guangzhou 511458 (China); Bi, Kun; Liu, Xiaoshuang [China Nuclear Power Design Co., Ltd., Shenzhen 518045 (China)

    2016-05-15

    Highlights: • An enhanced model was developed for vertical cable fire hazard assessment in NPPs. • Validation experiments on vertical cable tray fires were conducted. • The capability of the model for cable trays with different cable spacing was tested. - Abstract: The model referred to as FLASH-CAT (Flame Spread over Horizontal Cable Trays) was developed to estimate the heat release rate of vertical cable tray fires. The focus of this work is to investigate the application of an enhanced model to single vertical cable tray fires with different cable spacing. Experiments on vertical cable tray fires with three typical cable spacings were conducted. The histories of mass loss rate and flame length were recorded during the cable fires. From the experimental results, it is found that the space between cable lines intensifies the cable combustion and accelerates the flame spread. The predictions of the enhanced model show good agreement with the experimental data. At the same time, it is shown that the enhanced model is capable of predicting the different behaviors of cable fires with different cable spacing by adjusting the flame spread speed only.

  20. Validation and comparison of dispersion models of RTARC DSS

    International Nuclear Information System (INIS)

    Duran, J.; Pospisil, M.

    2004-01-01

    RTARC DSS (Real Time Accident Release Consequences - Decision Support System) is a computer code developed at VUJE Trnava, Inc. (Stubna, M. et al., 1993). The code calculations include atmospheric transport and diffusion, dose assessment, evaluation and display of the affected zones, evaluation of the early health effects, concentration and dose rate time dependence in selected sites, etc. The simulation of protective measures (sheltering, iodine administration) is included. The aim of this paper is to present the process of validation of the RTARC dispersion models. RTARC includes dispersion models for very short distances (Monte Carlo method - MEMOC), short distances (Gaussian straight-line model) and long distances (puff trajectory model - PTM). Validation of the RTARC code was performed using the results of the comparisons and experiments summarized in Table 1 (experiment or comparison - distance - model): wind tunnel experiments (Universitaet der Bundeswehr, Muenchen) - NPP area - Monte Carlo method; INEL (Idaho National Engineering Laboratory) multi-tracer atmospheric experiment - short/medium distances - Gaussian model and PTM; Model Validation Kit - short distances - Gaussian model; STEP II.b 'Realistic Case Studies' - long distances - PTM; ENSEMBLE comparison - long distances - PTM. (orig.)

  1. Validation of the containment code Sirius: interpretation of an explosion experiment on a scale model

    International Nuclear Information System (INIS)

    Blanchet, Y.; Obry, P.; Louvet, J.; Deshayes, M.; Phalip, C.

    1979-01-01

    The explicit 2-D axisymmetric Lagrangian code SIRIUS, developed at the CEA/DRNR, Cadarache, deals with transient compressive flows in deformable primary tanks with more or less complex internal component geometries. This code has been subjected to a two-year intensive validation program on scale-model experiments, and a number of improvements have been incorporated. This paper presents a recent calculation of one of these experiments using the SIRIUS code, and the comparison with experimental results shows the encouraging possibilities of this Lagrangian code.

  2. Numerical studies and metric development for validation of magnetohydrodynamic models on the HIT-SI experiment

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, C., E-mail: hansec@uw.edu [PSI-Center, University of Washington, Seattle, Washington 98195 (United States); Columbia University, New York, New York 10027 (United States); Victor, B.; Morgan, K.; Hossack, A.; Sutherland, D. [HIT-SI Group, University of Washington, Seattle, Washington 98195 (United States); Jarboe, T.; Nelson, B. A. [HIT-SI Group, University of Washington, Seattle, Washington 98195 (United States); PSI-Center, University of Washington, Seattle, Washington 98195 (United States); Marklin, G. [PSI-Center, University of Washington, Seattle, Washington 98195 (United States)

    2015-05-15

    We present the application of three scalar metrics derived from the biorthogonal decomposition (BD) technique to evaluate the level of agreement between macroscopic plasma dynamics in different data sets. BD decomposes large data sets, as produced by distributed diagnostic arrays, into principal mode structures without assumptions on spatial or temporal structure. These metrics have been applied to validation of the Hall-MHD model using experimental data from the Helicity Injected Torus with Steady Inductive helicity injection (HIT-SI) experiment. Each metric provides a measure of correlation between mode structures extracted from experimental data and simulations for an array of 192 surface-mounted magnetic probes. Numerical validation studies have been performed using the NIMROD code, where the injectors are modeled as boundary conditions on the flux conserver, and the PSI-TET code, where the entire plasma volume is treated. Initial results from a comprehensive validation study of high-performance operation with different injector frequencies are presented, illustrating the application of the BD method. Using a simplified (constant, uniform density and temperature) Hall-MHD model, simulation results agree with experimental observation for two of the three defined metrics when the injectors are driven with a frequency of 14.5 kHz.
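
    The biorthogonal decomposition of a probes-by-time data matrix is its singular value decomposition, and a mode-structure correlation metric can be built from the singular vectors; the signals below are synthetic stand-ins, not HIT-SI data:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def bd_modes(data, n_modes):
        """Biorthogonal decomposition = SVD of the (n_probes x n_time) matrix
        data = U S V^T; columns of U are the spatial ('topo') modes."""
        u, s, vt = np.linalg.svd(data, full_matrices=False)
        return u[:, :n_modes], s[:n_modes]

    # Synthetic stand-ins for 192 surface probes over 1000 time samples.
    n_probes, n_time = 192, 1000
    base = np.sin(np.linspace(0, 4 * np.pi, n_probes))
    experiment = np.outer(base, np.sin(np.linspace(0, 50, n_time)))
    experiment += 0.1 * rng.standard_normal((n_probes, n_time))
    simulation = np.outer(base, np.sin(np.linspace(0, 50, n_time) + 0.2))
    simulation += 0.1 * rng.standard_normal((n_probes, n_time))

    u_exp, _ = bd_modes(experiment, 2)
    u_sim, _ = bd_modes(simulation, 2)

    # Scalar metric: |cosine| between matching spatial modes (sign-invariant).
    for k in range(2):
        corr = abs(u_exp[:, k] @ u_sim[:, k])
        print(f"mode {k}: structure correlation = {corr:.3f}")
    ```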

  3. In Situ Experiment and Numerical Model Validation of a Borehole Heat Exchanger in Shallow Hard Crystalline Rock

    Directory of Open Access Journals (Sweden)

    Mateusz Janiszewski

    2018-04-01

    Accurate and fast numerical modelling of the borehole heat exchanger (BHE) is required for simulation of long-term thermal energy storage in rocks using boreholes. The goal of this study was to conduct an in situ experiment to validate the proposed numerical modelling approach. In the experiment, hot water was circulated for 21 days through a single U-tube BHE installed in an underground research tunnel located at a shallow depth in crystalline rock. The results of the simulations using the proposed model were validated against the measurements. The numerical model simulated the BHE’s behaviour accurately and compared well with two other modelling approaches from the literature. The model is capable of replicating the complex geometrical arrangement of the BHE and is considered to be more appropriate for simulations of BHE systems with complex geometries. The results of the sensitivity analysis of the proposed model have shown that low thermal conductivity, high density, and high heat capacity of rock are essential for maximising the storage efficiency of a borehole thermal energy storage system. Other characteristics of BHEs, such as a high thermal conductivity of the grout, a large radius of the pipe, and a large distance between the pipes, are also preferred for maximising efficiency.

  4. A broad view of model validation

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1989-10-01

    The safety assessment of a nuclear waste repository requires the use of models. Such models need to be validated to ensure, as much as possible, that they are a good representation of the actual processes occurring in the real system. In this paper we attempt to take a broad view by reviewing the modeling process step by step and bringing out the need to validate every step of this process. This model validation includes not only comparison of modeling results with data from selected experiments, but also evaluation of procedures for the construction of conceptual models and calculational models, as well as methodologies for studying data and parameter correlation. The need for advancing basic scientific knowledge in related fields, for multiple assessment groups, and for presenting our modeling efforts in the open literature for public scrutiny is also emphasized. 16 refs

  5. Validation of Friction Models in MARS-MultiD Module with Two-Phase Cross Flow Experiment

    International Nuclear Information System (INIS)

    Choi, Chi-Jin; Yang, Jin-Hwa; Cho, Hyoung-Kyu; Park, Goon-Cher; Euh, Dong-Jin

    2015-01-01

    In the downcomer of the Advanced Power Reactor 1400 (APR1400), which has direct vessel injection (DVI) lines as an emergency core cooling system, multidimensional two-phase flow may occur during a loss-of-coolant accident (LOCA). Accurate prediction of this flow is highly relevant to evaluating the integrity of the reactor core. For this reason, Yang performed an experiment to investigate the two-dimensional film flow which simulated the two-phase cross flow in the upper downcomer, and obtained local liquid film velocity and thickness data. These data make it possible to validate the multidimensional modules of system analysis codes. In this study, MARS-MultiD was used to simulate Yang's experiment and to obtain the local variables. The friction models used in MARS-MultiD were then validated by comparing the two-phase flow experimental results with the calculated local variables. Compared with the experimental results, the calculated results properly reproduced mass conservation, as seen from the relation between the liquid film velocity and thickness at the same flow rate. The magnitude and direction of the liquid film, however, did not agree well with the experimental results. According to the results of Case-2, wall friction should be increased and interfacial friction decreased in MARS-MultiD. These results show that the friction models in MARS-MultiD need to be modified to simulate the two-phase cross flow.

  6. CFD validation experiments for hypersonic flows

    Science.gov (United States)

    Marvin, Joseph G.

    1992-01-01

    A roadmap for CFD code validation is introduced. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building-block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation data base are given and gaps identified where future experiments could provide new validation data.

  7. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on the signed directed graph (SDG) and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness is demonstrated by validating a reactor model.
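
    Positive inference on an SDG can be sketched as qualitative trend propagation along signed edges; the graph, node names and initial disturbance below are invented for illustration:

    ```python
    # Minimal sketch of positive inference on a signed directed graph (SDG):
    # node trends are qualitative (+1, -1, 0); an edge (u, v, sign) propagates
    # the trend of u to v multiplied by the sign.
    edges = [("steam_valve", "pressure", -1),
             ("pressure", "temperature", +1),
             ("temperature", "reaction_rate", +1)]

    def propagate(initial_trends, edges):
        """Iterative propagation of qualitative trends through the SDG."""
        trends = dict(initial_trends)
        frontier = list(initial_trends)
        while frontier:
            u = frontier.pop()
            for src, dst, sign in edges:
                if src == u and dst not in trends:
                    trends[dst] = sign * trends[u]
                    frontier.append(dst)
        return trends

    # Testing scenario: the steam valve opening increases (+1).
    print(propagate({"steam_valve": +1}, edges))
    # -> pressure -1, temperature -1, reaction_rate -1
    ```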

  8. Validation of ASTEC core degradation and containment models

    International Nuclear Information System (INIS)

    Kruse, Philipp; Brähler, Thimo; Koch, Marco K.

    2014-01-01

    In a German-funded project, Ruhr-Universitaet Bochum performed validation of in-vessel and containment models of the integral code ASTEC V2, jointly developed by IRSN (France) and GRS (Germany). In this paper selected results of this validation are presented. In the in-vessel part, the main point of interest was the validation of the code capability concerning cladding oxidation and hydrogen generation. The ASTEC calculations of the QUENCH experiments QUENCH-03 and QUENCH-11 show satisfactory results, despite some necessary adjustments in the input deck. Furthermore, the oxidation models based on the Cathcart–Pawel and Urbanic–Heidrick correlations are not suitable for higher temperatures, while the ASTEC model BEST-FIT, based on the Prater–Courtright approach at high temperature, gives sufficiently reliable results. One part of the containment model validation was the assessment of three hydrogen combustion models of ASTEC against the experiment BMC Ix9. The simulation results of these models differ from each other, and therefore the quality of the simulations depends on the characteristics of each model. Accordingly, the CPA FRONT model, requiring the simplest input parameters, provides the best agreement with the experimental data.

  9. The role of CFD combustion modeling in hydrogen safety management – V: Validation for slow deflagrations in homogeneous hydrogen-air experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Holler, Tadej, E-mail: tadej.holler@ijs.si [Jozef Stefan Institute (JSI), Jamova cesta 39, 1000 Ljubljana (Slovenia); Kljenak, Ivo [Jozef Stefan Institute (JSI), Jamova cesta 39, 1000 Ljubljana (Slovenia); Komen, Ed [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands)

    2016-12-15

    Highlights: • Validation of the modeling approach for hydrogen deflagration is presented. • The modeling approach is based on two combustion models implemented in ANSYS Fluent. • Experiments with various initial hydrogen concentrations were used for validation. • The effects of heat transfer mechanism selection were also investigated. • A grid sensitivity analysis was performed as well. - Abstract: The control of hydrogen in the containment is an important safety issue following rapid oxidation of the uncovered reactor core during a severe accident in a Nuclear Power Plant (NPP), because dynamic pressure loads from possible hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. In a set of previous papers, a CFD-based method to assess the consequences of fast combustion of uniform hydrogen-air mixtures was presented, followed by its validation for hydrogen-air mixtures with diluents and for non-uniform hydrogen-air mixtures. In the present paper, the extension of this model to the slow deflagration regime is presented and validated using hydrogen deflagration experiments performed in the medium-scale experimental facility THAI. The proposed method is implemented in the CFD software ANSYS Fluent using user-defined functions. The paper describes the combustion model and the main results of the code validation. It addresses questions regarding turbulence model selection, the effect of heat transfer mechanisms, and grid sensitivity, and provides insights into the importance of combustion model choice for the slow deflagration regime of hydrogen combustion in medium-scale and large-scale experimental vessels mimicking the NPP containment.

  10. SAS validation and analysis of in-pile TUCOP experiments

    International Nuclear Information System (INIS)

    Morman, J.A.; Tentner, A.M.; Dever, D.J.

    1985-01-01

    The validation of the SAS4A accident analysis code centers on its capability to calculate the wide range of tests performed in the TREAT (Transient Reactor Test Facility) in-pile experiment program. This paper presents the SAS4A analysis of a simulated TUCOP (Transient-Under-Cooled-Over-Power) experiment using seven full-length PFR mixed-oxide fuel pins in a flowing sodium loop. Calculations agree well with measured thermal-hydraulic, pin failure time and post-failure fuel motion data. The extent of the agreement confirms the validity of the models used in the SAS4A code to describe TUCOP accidents.

  11. Continuous validation of ASTEC containment models and regression testing

    International Nuclear Information System (INIS)

    Nowack, Holger; Reinke, Nils; Sonnenkalb, Martin

    2014-01-01

    The focus of the ASTEC (Accident Source Term Evaluation Code) development at GRS is primarily on the containment module CPA (Containment Part of ASTEC), whose modelling is to a large extent based on the GRS containment code COCOSYS (COntainment COde SYStem). Validation is usually understood as the approval of the modelling capabilities by calculations of appropriate experiments performed by external users different from the code developers. During the development process of ASTEC CPA, bugs and unintended side effects may occur, which lead to changes in the results of the initially conducted validation. Due to the involvement of a considerable number of developers in the coding of ASTEC modules, validation of the code alone, even if executed repeatedly, is not sufficient. Therefore, a regression testing procedure has been implemented in order to ensure that the initially obtained validation results remain valid in succeeding code versions. Within the regression testing procedure, calculations of experiments and plant sequences are performed with the same input deck but applying two different code versions. For every test case the up-to-date code version is compared to the preceding one on the basis of physical parameters deemed to be characteristic for the test case under consideration. In the case of post-calculations of experiments, a comparison to experimental data is also carried out. Three validation cases from the regression testing procedure are presented within this paper. The very good post-calculation of the HDR E11.1 experiment shows the high quality of the thermal-hydraulics modelling in ASTEC CPA. Aerosol behaviour is validated against the BMC VANAM M3 experiment, and the results also show a very good agreement with experimental data. Finally, iodine behaviour is checked in the validation test case of the THAI IOD-11 experiment. Within this test case, the comparison of the ASTEC versions V2.0r1 and V2.0r2 shows how an error was detected by the regression testing.
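
    The version-to-version comparison can be pictured as a small test harness over characteristic parameters; the names, values and tolerances below are invented illustrations, not the GRS procedure itself:

    ```python
    # Sketch of the regression-testing idea: compare characteristic parameters
    # of a test case between two code versions and against experimental data.
    TOL_VERSIONS = 0.01     # max relative drift allowed between code versions
    TOL_EXPERIMENT = 0.10   # max relative deviation allowed from experiment

    characteristic = {      # e.g. peak pressure (bar), peak temperature (K)
        "peak_pressure": {"v2.0r1": 1.52, "v2.0r2": 1.53, "experiment": 1.47},
        "peak_temperature": {"v2.0r1": 391.0, "v2.0r2": 390.5, "experiment": 402.0},
    }

    for name, vals in characteristic.items():
        drift = abs(vals["v2.0r2"] - vals["v2.0r1"]) / abs(vals["v2.0r1"])
        dev = abs(vals["v2.0r2"] - vals["experiment"]) / abs(vals["experiment"])
        status = "OK" if drift <= TOL_VERSIONS and dev <= TOL_EXPERIMENT else "FLAG"
        print(f"{name:17s} drift={drift:.3%} dev={dev:.3%} {status}")
    ```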

  12. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always tied to an experimental case. Empirical validation has a residual character, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures of the simulation model, and it can also be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: (1) sensitivity analysis, performed by differential sensitivity analysis (DSA) and Monte Carlo sensitivity analysis (MCSA); (2) the search for the optimal domains of the input parameters, for which a procedure based on Monte Carlo methods and cluster techniques has been developed; and (3) residual analysis, carried out in both the time domain and the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings is presented, studying the behavior of building components in a test cell of the LECE at CIEMAT (Spain). (Author) 17 refs

  13. Validation of KENO V.a: Comparison with critical experiments

    International Nuclear Information System (INIS)

    Jordan, W.C.; Landers, N.F.; Petrie, L.M.

    1986-12-01

    Section 1 of this report documents the validation of KENO V.a against 258 critical experiments. Experiments considered were primarily high- or low-enriched uranium systems. The results indicate that the KENO V.a Monte Carlo Criticality Program accurately calculates a broad range of critical experiments. A substantial number of the calculations showed a positive or negative bias in excess of 1 1/2% in k-effective (keff). Classes of criticals which show a bias include 3% enriched green blocks, highly enriched uranyl fluoride slab arrays, and highly enriched uranyl nitrate arrays. If these biases are properly taken into account, the KENO V.a code can be used with confidence for the design and criticality safety analysis of uranium-containing systems. Section 2 of this report documents the results of an investigation into the cause of the bias observed in Sect. 1. The results of this study indicate that the bias seen in Sect. 1 is caused by code bias, cross-section bias, reporting bias, and modeling bias. There is evidence that many of the experiments used in this validation and in previous validations are not adequately documented. The uncertainty in the experimental parameters overshadows bias caused by the code and cross sections and prohibits code validation to better than about 1% in keff. 48 refs., 19 figs., 19 tabs

  14. Site characterization and validation - Tracer migration experiment in the validation drift, report 2, part 1: performed experiments, results and evaluation

    International Nuclear Information System (INIS)

    Birgersson, L.; Widen, H.; Aagren, T.; Neretnieks, I.; Moreno, L.

    1992-01-01

    This report is the second of two reports describing the tracer migration experiment in which water and tracer flow were monitored in a drift at the 385 m level of the Stripa experimental mine. The tracer migration experiment is one of a large number of experiments performed within the Site Characterization and Validation (SCV) project. The upper part of the 50 m long validation drift was covered with approximately 150 plastic sheets, in which the emerging water was collected. The water emerging into the lower part of the drift was collected in short boreholes, sumpholes. Six different tracer mixtures were injected at distances between 10 and 25 m from the drift. The flowrate and tracer monitoring continued for ten months. Tracer breakthrough curves and flowrate distributions were used to study flow paths, velocities, hydraulic conductivities, dispersivities, interaction with the rock matrix and channelling effects within the rock. The present report describes the structure of the observations, the flowrate measurements and the estimated hydraulic conductivities. The main part of this report addresses the interpretation of the tracer movement in fractured rock. The tracer movement, as measured by the more than 150 individual tracer curves, has been analysed with the traditional advection-dispersion model, and a subset of the curves with the advection-dispersion-diffusion model. The tracer experiments have permitted the flow porosity, dispersion and interaction with the rock matrix to be studied. (57 refs.)
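
    The advection-dispersion interpretation rests on analytical breakthrough solutions of the kind sketched below (first term of the Ogata-Banks solution for a step input); the path length, velocity and dispersion coefficient are illustrative, not fitted Stripa values:

    ```python
    import numpy as np
    from scipy.special import erfc

    def breakthrough(t, x, v, D):
        """Ogata-Banks solution (first term) for a step tracer input in a
        1-D advection-dispersion model: relative concentration C/C0."""
        return 0.5 * erfc((x - v * t) / (2.0 * np.sqrt(D * t)))

    x = 20.0                                   # m, injection-to-drift distance
    v = 0.5                                    # m/day, advective velocity
    D = 2.0                                    # m^2/day, dispersion coefficient
    t = np.array([10., 20., 40., 60., 80.])    # days

    for ti, c in zip(t, breakthrough(t, x, v, D)):
        print(f"t = {ti:5.1f} d  C/C0 = {c:.3f}")
    ```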

  15. Contact Modelling in Resistance Welding, Part II: Experimental Validation

    DEFF Research Database (Denmark)

    Song, Quanfeng; Zhang, Wenqi; Bay, Niels

    2006-01-01

    Contact algorithms in resistance welding presented in the previous paper are experimentally validated in the present paper. In order to verify the mechanical contact algorithm, two types of experiments, i.e. sandwich upsetting of circular, cylindrical specimens and compression tests of discs with a solid ring projection towards a flat ring, are carried out at room temperature. The complete algorithm, involving not only the mechanical model but also the thermal and electrical models, is validated by projection welding experiments. The experimental results are in satisfactory agreement.

  16. Direct-contact condensers for open-cycle OTEC applications: Model validation with fresh water experiments for structured packings

    Energy Technology Data Exchange (ETDEWEB)

    Bharathan, D.; Parsons, B.K.; Althof, J.A.

    1988-10-01

    The objective of the reported work was to develop analytical methods for evaluating the design and performance of advanced high-performance heat exchangers for use in open-cycle ocean thermal energy conversion (OC-OTEC) systems. This report describes the progress made in validating a one-dimensional, steady-state analytical computer model against fresh-water experiments for structured packings. The condenser model represents the state of the art in direct-contact heat exchange for condensation for OC-OTEC applications. This is expected to provide a basis for optimizing OC-OTEC plant configurations. Using the model, we examined two condenser geometries, a cocurrent and a countercurrent configuration. This report provides detailed validation results for important condenser parameters for cocurrent and countercurrent flows. Based on the comparisons and the uncertainty overlap between the experimental data and predictions, the model is shown to predict critical condenser performance parameters with an uncertainty acceptable for general engineering design and performance evaluations. 33 refs., 69 figs., 38 tabs.

  17. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high-fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  18. Reconceptualising the external validity of discrete choice experiments.

    Science.gov (United States)

    Lancsar, Emily; Swait, Joffre

    2014-10-01

    External validity is a crucial but under-researched topic when considering using discrete choice experiment (DCE) results to inform decision making in clinical, commercial or policy contexts. We present the theory and tests traditionally used to explore external validity that focus on a comparison of final outcomes and review how this traditional definition has been empirically tested in health economics and other sectors (such as transport, environment and marketing) in which DCE methods are applied. While an important component, we argue that the investigation of external validity should be much broader than a comparison of final outcomes. In doing so, we introduce a new and more comprehensive conceptualisation of external validity, closely linked to process validity, that moves us from the simple characterisation of a model as being or not being externally valid on the basis of predictive performance, to the concept that external validity should be an objective pursued from the initial conceptualisation and design of any DCE. We discuss how such a broader definition of external validity can be fruitfully used and suggest innovative ways in which it can be explored in practice.

  19. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret; Yu, Yi-Hsiang

    2016-08-01

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation is necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.

  20. The concept of validation of numerical models for consequence analysis

    International Nuclear Information System (INIS)

    Borg, Audun; Paulsen Husted, Bjarne; Njå, Ove

    2014-01-01

    Numerical models such as computational fluid dynamics (CFD) models are increasingly used in life safety studies and other types of analyses to calculate the effects of fire and explosions. The validity of these models is usually established by benchmark testing. This is done to quantitatively measure the agreement between the predictions provided by the model and the real world represented by observations in experiments. This approach assumes that all variables in the real world relevant for the specific study are adequately measured in the experiments and in the predictions made by the model. In this paper the various definitions of validation for CFD models used for hazard prediction are investigated to assess their implication for consequence analysis in a design phase. In other words, how is uncertainty in the prediction of future events reflected in the validation process? The sources of uncertainty are viewed from the perspective of the safety engineer. An example of the use of a CFD model is included to illustrate the assumptions the analyst must make and how these affect the prediction made by the model. The assessments presented in this paper are based on a review of standards and best practice guides for CFD modeling and the documentation from two existing CFD programs. Our main thrust has been to assess how validation work is performed and communicated in practice. We conclude that the concept of validation adopted for numerical models is adequate in terms of model performance. However, it does not address the main sources of uncertainty from the perspective of the safety engineer. Uncertainty in the input quantities describing future events, which are determined by the model user, outweighs the inaccuracies in the model as reported in validation studies. - Highlights: • Examine the basic concept of validation applied to models for consequence analysis. • Review standards and guides for validation of numerical models. • Comparison of the validation

  1. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin.

    Science.gov (United States)

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2015-11-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. © The Author(s) 2015.

  2. The role of CFD combustion modeling in hydrogen safety management-II: Validation based on homogeneous hydrogen-air experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap, E-mail: sathiah@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Haren, Steven van, E-mail: vanharen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Komen, Ed, E-mail: komen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Roekaerts, Dirk, E-mail: d.j.e.m.roekaerts@tudelft.nl [Department of Multi-Scale Physics, Delft University of Technology, P.O. Box 5, 2600 AA Delft (Netherlands)

    2012-11-15

    Highlights: • A CFD-based method is proposed for the simulation of hydrogen deflagration. • A dynamic grid adaptation method is proposed to resolve the turbulent flame brush thickness. • The predictions obtained using this method are in good agreement with the static grid method. • TFC model results are in good agreement with large-scale homogeneous hydrogen-air experiments. - Abstract: During a severe accident in a PWR, large quantities of hydrogen can be generated and released into the containment. The generated hydrogen, when mixed with air, can lead to hydrogen combustion. The dynamic pressure loads resulting from hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. Therefore, accurate prediction of these pressure loads is an important safety issue. In a previous article, we presented a CFD-based method to determine these pressure loads. This CFD method is based on the application of a turbulent flame speed closure combustion model. The validation analyses in our previous paper demonstrated that it is of utmost importance to apply successive mesh and time step refinement in order to get reliable results. In this article, we first determined to what extent the computational effort required for our CFD approach can be reduced by the application of adaptive mesh refinement, while maintaining the accuracy requirements. Experiments performed within a small fan-stirred explosion bomb were used for this purpose. It could be concluded that adaptive grid refinement is a reliable and efficient method for use in hydrogen deflagration analyses. For the two-dimensional validation analyses, the application of dynamic grid adaptation resulted in a reduction of the required computational effort by about one order of magnitude. In a second step, the considered CFD approach including adaptive
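
    The turbulent flame speed closure (TFC) model named above typically rests on a Zimont-type correlation, which evaluates the turbulent flame speed from the turbulence intensity, the laminar flame speed, the unburnt-mixture thermal diffusivity, and the integral length scale. A minimal sketch, assuming the commonly quoted model constant A = 0.52 and illustrative mixture properties:

    ```python
    def zimont_turbulent_flame_speed(u_prime, s_l, alpha_u, l_t, A=0.52):
        """Zimont-type correlation:
        U_t = A * u'**(3/4) * S_l**(1/2) * alpha_u**(-1/4) * l_t**(1/4)
        u_prime: turbulent velocity fluctuation [m/s]
        s_l:     laminar flame speed [m/s]
        alpha_u: unburnt-mixture thermal diffusivity [m^2/s]
        l_t:     integral length scale of turbulence [m]"""
        return A * u_prime**0.75 * s_l**0.5 * alpha_u**-0.25 * l_t**0.25

    # Illustrative values for a lean hydrogen-air mixture (assumptions)
    print(zimont_turbulent_flame_speed(u_prime=2.0, s_l=0.8, alpha_u=2.2e-5, l_t=0.05))
    ```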

  3. Groundwater Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process, not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein, and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). A hierarchical approach is proposed to make this determination. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures, with the results indicating they are appropriate measures for evaluating model realizations. The use of validation
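
    The acceptance testing of individual stochastic realizations against validation data can be pictured as below; the single RMSE metric and its threshold are an illustrative stand-in for the five measures and the decision tree described in the report.

    ```python
    import numpy as np

    def acceptable_fraction(realizations, validation_data, rmse_limit):
        """Fraction of stochastic realizations whose RMSE against the
        validation data falls below an acceptance limit."""
        validation_data = np.asarray(validation_data, dtype=float)
        rmse = np.sqrt(np.mean((np.asarray(realizations) - validation_data) ** 2, axis=1))
        return float(np.mean(rmse <= rmse_limit))

    rng = np.random.default_rng(0)
    field_data = np.array([1.2, 0.9, 1.5, 1.1])             # placeholder heads [m]
    ensemble = field_data + rng.normal(0.0, 0.3, (200, 4))  # placeholder realizations
    print(acceptable_fraction(ensemble, field_data, rmse_limit=0.35))
    ```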

  4. ATHLET validation using accident management experiments

    Energy Technology Data Exchange (ETDEWEB)

    Teschendorff, V.; Glaeser, H.; Steinhoff, F. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Garching (Germany)

    1995-09-01

    The computer code ATHLET is being developed as an advanced best-estimate code for the simulation of leaks and transients in PWRs and BWRs including beyond design basis accidents. The code has features that are of special interest for applications to small leaks and transients with accident management, e.g. initialisation by a steady-state calculation, full-range drift-flux model, and dynamic mixture level tracking. The General Control Simulation Module of ATHLET is a flexible tool for the simulation of the balance-of-plant and control systems including the various operator actions in the course of accident sequences with AM measures. The systematic validation of ATHLET is based on a well balanced set of integral and separate effect tests derived from the CSNI proposal emphasising, however, the German combined ECC injection system which was investigated in the UPTF, PKL and LOBI test facilities. PKL-III test B 2.1 simulates a cool-down procedure during an emergency power case with three steam generators isolated. Natural circulation under these conditions was investigated in detail in a pressure range of 4 to 2 MPa. The transient was calculated over 22000 s with complicated boundary conditions including manual control actions. The calculations demonstrate the capability to model the following processes successfully: (1) variation of the natural circulation caused by steam generator isolation, (2) vapour formation in the U-tubes of the isolated steam generators, (3) break-down of circulation in the loop containing the isolated steam generator following controlled cool-down of the secondary side, (4) accumulation of vapour in the pressure vessel dome. One conclusion with respect to the suitability of experiments simulating AM procedures for code validation purposes is that complete documentation of control actions during the experiment must be available. Special attention should be given to the documentation of operator actions in the course of the experiment.
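
    The full-range drift-flux model mentioned above relates void fraction to the phase superficial velocities through a Zuber–Findlay-type relation; a minimal sketch, with the distribution parameter and drift velocity as illustrative constants (in a full-range model both are correlated with flow regime and pressure):

    ```python
    def drift_flux_void_fraction(j_g, j_l, c0=1.13, v_gj=0.25):
        """Zuber-Findlay drift-flux relation:
        alpha = j_g / (C0 * (j_g + j_l) + V_gj)
        j_g, j_l: gas and liquid superficial velocities [m/s]
        c0: distribution parameter, v_gj: drift velocity [m/s]
        (illustrative constants, not ATHLET's correlations)"""
        return j_g / (c0 * (j_g + j_l) + v_gj)

    print(drift_flux_void_fraction(j_g=0.4, j_l=1.0))  # void fraction ~0.22
    ```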

  5. Validation of heat transfer models for gap cooling

    International Nuclear Information System (INIS)

    Okano, Yukimitsu; Nagae, Takashi; Murase, Michio

    2004-01-01

    For severe accident assessment of a light water reactor, models of heat transfer in a narrow annular gap between overheated core debris and a reactor pressure vessel are important for evaluating vessel integrity and accident management. The authors developed and improved the models of heat transfer. However, validation was not sufficient for applicability of the gap heat flux correlation to the debris cooling in the vessel lower head and applicability of the local boiling heat flux correlations to the high-pressure conditions. Therefore, in this paper, we evaluated the validity of the heat transfer models and correlations by analyses of the ALPHA and LAVA experiments, where molten aluminum oxide (Al2O3) at about 2700 K was poured into a high-pressure water pool in a small-scale simulated vessel lower head. In the heating process of the vessel wall, the calculated heating rate and peak temperature agreed well with the measured values, and the validity of the heat transfer models and gap heat flux correlation was confirmed. In the cooling process of the vessel wall, the calculated cooling rate was compared with the measured value, and the validity of the nucleate boiling heat flux correlation was confirmed. The peak temperatures of the vessel wall in the ALPHA and LAVA experiments were lower than the temperature at the minimum heat flux point between film boiling and transition boiling, so the minimum heat flux correlation could not be validated. (author)

  6. Experiments to populate and validate a processing model for polyurethane foam. BKC 44306 PMDI-10

    Energy Technology Data Exchange (ETDEWEB)

    Mondy, Lisa Ann [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rao, Rekha Ranjana [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelden, Bion [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Soehnel, Melissa Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); O'Hern, Timothy J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grillet, Anne [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Celina, Mathias C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wyatt, Nicholas B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Russick, Edward Mark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauer, Stephen J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hileman, Michael Bryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Urquhart, Alexander [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Thompson, Kyle Richard [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, David Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-03-01

    We are developing computational models to elucidate the expansion and dynamic filling process of a polyurethane foam, PMDI. The polyurethane of interest is chemically blown, where carbon dioxide is produced via the reaction of water, the blowing agent, and isocyanate. The isocyanate also reacts with polyol in a competing reaction, which produces the polymer. Here we detail the experiments needed to populate a processing model and provide parameters for the model based on these experiments. The model entails solving the conservation equations, including the equations of motion, an energy balance, and two rate equations for the polymerization and foaming reactions, following a simplified mathematical formalism that decouples these two reactions. Parameters for the polymerization kinetics model are reported based on infrared spectrophotometry. Parameters describing the gas generating reaction are reported based on measurements of volume, temperature and pressure evolution with time. A foam rheology model is proposed and parameters determined through steady-shear and oscillatory tests. Heat of reaction and heat capacity are determined through differential scanning calorimetry. Thermal conductivity of the foam as a function of density is measured using a transient method based on the theory of the transient plane source technique. Finally, density variations of the resulting solid foam in several simple geometries are directly measured by sectioning and sampling mass, as well as through x-ray computed tomography. These density measurements will be useful for model validation once the complete model is implemented in an engineering code.
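
    The decoupled rate equations for the gelling (polymerization) and blowing (gas-generating) reactions described above can be sketched as a pair of Arrhenius-type ODEs. The rate constants, reaction orders, and the isothermal assumption below are placeholders, not the calibrated parameters of the report (the full model couples these equations to an energy balance and the equations of motion).

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    R = 8.314  # gas constant [J/(mol K)]

    def foam_kinetics(t, y, T=330.0):
        """Extents of reaction: x_gel (polyol-isocyanate polymerization) and
        x_blow (water-isocyanate CO2 generation). Arrhenius parameters and
        the fixed temperature are placeholders, not calibrated values."""
        x_gel, x_blow = y
        k_gel = 1.0e5 * np.exp(-5.0e4 / (R * T))   # [1/s]
        k_blow = 5.0e4 * np.exp(-4.5e4 / (R * T))  # [1/s]
        return [k_gel * (1.0 - x_gel) ** 2, k_blow * (1.0 - x_blow)]

    sol = solve_ivp(foam_kinetics, (0.0, 300.0), [0.0, 0.0])
    print(sol.y[:, -1])  # final extents of the gelling and blowing reactions
    ```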

  7. BIOMOVS: an international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1988-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (author)

  8. BIOMOVS: An international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1987-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (orig.)

  9. The role of CFD combustion modeling in hydrogen safety management – IV: Validation based on non-homogeneous hydrogen–air experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap, E-mail: pratap.sathiah78@gmail.com [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Komen, Ed, E-mail: komen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Roekaerts, Dirk, E-mail: d.j.e.m.roekaerts@tudelft.nl [Delft University of Technology, Department of Process and Energy, Section Fluid Mechanics, Mekelweg 2, 2628 CD Delft (Netherlands)

    2016-12-15

    Highlights: • TFC combustion model is further extended to simulate flame propagation in non-homogeneous hydrogen–air mixtures. • TFC combustion model results are in good agreement with large-scale non-homogeneous hydrogen–air experiments. • The model is further extended to account for the non-uniform hydrogen–air–steam mixture for the presence of PARs on hydrogen deflagration. - Abstract: The control of hydrogen in the containment is an important safety issue in NPPs during a loss of coolant accident, because the dynamic pressure loads from hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. In Sathiah et al. (2012b), we presented a computational fluid dynamics based method to assess the consequence of the combustion of uniform hydrogen–air mixtures. In the present article, the extension of this method to and its validation for non-uniform hydrogen–air mixture is described. The method is implemented in the CFD software ANSYS FLUENT using user defined functions. The extended code is validated against non-uniform hydrogen–air experiments in the ENACCEF facility. It is concluded that the maximum pressure and intermediate peak pressure were predicted within 12% and 18% accuracy. The eigen frequencies of the residual pressure wave phenomena were predicted within 4%. It is overall concluded that the current model predicts the considered ENACCEF experiments well.

  10. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  11. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  12. Instrumented anvil-on-rod impact experiments for validating constitutive strength model for simulating transient dynamic deformation response of metals

    International Nuclear Information System (INIS)

    Martin, M.; Shen, T.; Thadhani, N.N.

    2008-01-01

    Instrumented anvil-on-rod impact experiments were performed to assess the applicability of this approach for validating a constitutive strength model for dynamic, transient-state deformation and elastic-plastic wave interactions in vanadium, 21-6-9 stainless steel, titanium, and Ti-6Al-4V. In addition to soft-catching the impacted rod-shaped samples, their transient deformation states were captured by high-speed imaging, and velocity interferometry was used to record the sample back (free) surface velocity and monitor elastic-plastic wave interactions. Simulations utilizing the AUTODYN-2D hydrocode with the Steinberg-Guinan constitutive equation were used to generate simulated free surface velocity traces and final/transient deformation profiles for comparison with experiments. The simulations were observed to under-predict the radial strain for bcc vanadium and fcc steel, but over-predict the radial strain for hcp titanium and Ti-6Al-4V. The correlations illustrate the applicability of the instrumented anvil-on-rod impact test as a method for providing robust model validation based on the entire deformation event, and not just the final deformed state
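
    Free-surface velocity records of the kind described above are commonly reduced to strength quantities; for instance, the Hugoniot elastic limit follows from the amplitude of the elastic precursor in the trace. A sketch under that standard relation, with illustrative (not measured) inputs for a Ti-6Al-4V-like material:

    ```python
    def hugoniot_elastic_limit(rho0, c_l, u_fs):
        """sigma_HEL = 0.5 * rho0 * c_L * u_fs, where u_fs is the free-surface
        velocity at the elastic precursor; the factor 0.5 accounts for
        velocity doubling at a free surface."""
        return 0.5 * rho0 * c_l * u_fs

    rho0 = 4430.0  # initial density [kg/m^3] (illustrative)
    c_l = 6100.0   # longitudinal elastic wave speed [m/s] (illustrative)
    u_fs = 90.0    # elastic precursor free-surface velocity [m/s] (illustrative)
    print(f"HEL ~ {hugoniot_elastic_limit(rho0, c_l, u_fs) / 1e9:.2f} GPa")
    ```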

  13. Validating the BISON fuel performance code to integral LWR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, R.L., E-mail: Richard.Williamson@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gamble, K.A., E-mail: Kyle.Gamble@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Perez, D.M., E-mail: Danielle.Perez@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Novascone, S.R., E-mail: Stephen.Novascone@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Pastore, G., E-mail: Giovanni.Pastore@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Gardner, R.J., E-mail: Russell.Gardner@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Hales, J.D., E-mail: Jason.Hales@inl.gov [Fuel Modeling and Simulation, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3840 (United States); Liu, W., E-mail: Wenfeng.Liu@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States); Mai, A., E-mail: Anh.Mai@anatech.com [ANATECH Corporation, 5435 Oberlin Dr., San Diego, CA 92121 (United States)

    2016-05-15

    Highlights: • The BISON multidimensional fuel performance code is being validated to integral LWR experiments. • Code and solution verification are necessary prerequisites to validation. • Fuel centerline temperature comparisons through all phases of fuel life are very reasonable. • Accuracy in predicting fission gas release is consistent with state-of-the-art modeling and the involved uncertainties. • Rod diameter comparisons are not satisfactory and further investigation is underway. - Abstract: BISON is a modern finite element-based nuclear fuel performance code that has been under development at Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Results demonstrate that (1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments, (2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties and (3) comparison

  14. Validation of scaffold design optimization in bone tissue engineering: finite element modeling versus designed experiments.

    Science.gov (United States)

    Uth, Nicholas; Mueller, Jens; Smucker, Byran; Yousefi, Azizeh-Mitra

    2017-02-21

    This study reports the development of biological/synthetic scaffolds for bone tissue engineering (TE) via 3D bioplotting. These scaffolds were composed of poly(L-lactic-co-glycolic acid) (PLGA), type I collagen, and nano-hydroxyapatite (nHA) in an attempt to mimic the extracellular matrix of bone. The solvent used for processing the scaffolds was 1,1,1,3,3,3-hexafluoro-2-propanol. The produced scaffolds were characterized by scanning electron microscopy, microcomputed tomography, thermogravimetric analysis, and unconfined compression test. This study also sought to validate the use of finite-element optimization in COMSOL Multiphysics for scaffold design. Scaffold topology was simplified to three factors: nHA content, strand diameter, and strand spacing. These factors affect the ability of the scaffold to bear mechanical loads and how porous the structure can be. Twenty four scaffolds were constructed according to an I-optimal, split-plot designed experiment (DE) in order to generate experimental models of the factor-response relationships. Within the design region, the DE and COMSOL models agreed in their recommended optimal nHA (30%) and strand diameter (460 μm). However, the two methods disagreed by more than 30% in strand spacing (908 μm for DE; 601 μm for COMSOL). Seven scaffolds were 3D-bioplotted to validate the predictions of DE and COMSOL models (4.5-9.9 MPa measured moduli). The predictions for these scaffolds showed relative agreement for scaffold porosity (mean absolute percentage error of 4% for DE and 13% for COMSOL), but were substantially poorer for scaffold modulus (51% for DE; 21% for COMSOL), partly due to some simplifying assumptions made by the models. Expanding the design region in future experiments (e.g., higher nHA content and strand diameter), developing an efficient solvent evaporation method, and exerting a greater control over layer overlap could allow developing PLGA-nHA-collagen scaffolds to meet the mechanical requirements for
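
    The mean absolute percentage errors quoted above reduce to a one-line computation; the sketch below uses placeholder moduli, not the published measurements (the paper reports a 4.5-9.9 MPa measured range).

    ```python
    import numpy as np

    def mape(measured, predicted):
        """Mean absolute percentage error between measurements and predictions."""
        measured = np.asarray(measured, dtype=float)
        predicted = np.asarray(predicted, dtype=float)
        return 100.0 * np.mean(np.abs((predicted - measured) / measured))

    # Placeholder scaffold moduli [MPa] for seven validation scaffolds
    measured = [4.5, 5.2, 6.8, 7.4, 8.1, 9.0, 9.9]
    predicted = [6.9, 7.6, 9.5, 10.8, 11.9, 13.2, 15.3]
    print(f"MAPE = {mape(measured, predicted):.0f}%")
    ```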

  15. Validation of the newborn larynx modeling with aerodynamical experimental data.

    Science.gov (United States)

    Nicollas, R; Giordano, J; Garrel, R; Medale, M; Caminat, P; Giovanni, A; Ouaknine, M; Triglia, J M

    2009-06-01

    Many authors have studied modelling of the adult larynx, but the mechanisms of newborn voice production have very rarely been investigated. After validating a numerical model with acoustic data, studies were performed on larynges of human fetuses in order to validate this model with aerodynamical experiments. Anatomical measurements were performed and a simplified numerical model was built using Fluent® with the vocal folds in phonatory position. The results obtained are in good agreement with those obtained by laser Doppler velocimetry (LDV) and high-frame-rate particle image velocimetry (HFR-PIV) on an experimental bench with excised human fetus larynges. It appears that computing with first-cry physiological parameters leads to a model which is close to those obtained in experiments with real organs.

  16. The role of CFD combustion modeling in hydrogen safety management – III: Validation based on homogeneous hydrogen–air–diluent experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap, E-mail: pratap.sathiah78@gmail.com [Shell Global Solutions Ltd., Brabazon House, Concord Business Park, Threapwood Road, Manchester M220RR (United Kingdom); Komen, Ed [Nuclear Research and Consultancy Group – NRG, P.O. Box 25, 1755 ZG Petten (Netherlands); Roekaerts, Dirk [Delft University of Technology, P.O. Box 5, 2600 AA Delft (Netherlands)

    2015-08-15

    Highlights: • A CFD-based method proposed in the previous article is used for the simulation of the effect of CO2–He dilution on hydrogen deflagration. • A theoretical study is presented to verify whether CO2–He diluent can be used as a replacement for H2O as diluent. • The CFD model used for the validation work is described. • TFC combustion model results are in good agreement with large-scale homogeneous hydrogen–air–CO2–He experiments. - Abstract: Large quantities of hydrogen can be generated and released into the containment during a severe accident in a PWR. The generated hydrogen, when mixed with air, can lead to hydrogen combustion. The dynamic pressure loads resulting from hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. Therefore, accurate prediction of these pressure loads is an important safety issue. In our previous article, a CFD-based method to determine these pressure loads was presented. This CFD method is based on the application of a turbulent flame speed closure combustion model. The method was validated against three uniform hydrogen–air deflagration experiments with different blockage ratios performed in the ENACCEF facility. It was concluded that the maximum pressures were predicted within 13% accuracy, while the rate of pressure rise dp/dt was predicted within about 30%. The eigenfrequencies of the residual pressure wave phenomena were predicted within a few percent. In the present article, we perform additional validation of the CFD-based method against three uniform hydrogen–air–CO2–He deflagration experiments with three different concentrations of the CO2–He diluent. The trends of decrease in the flame velocity, the intermediate peak pressure, the rate of pressure rise dp/dt, and the maximum value of the mean pressure with an increase in the CO2–He dilution are captured well in the simulations. From the

  17. Validation of HEDR models

    International Nuclear Information System (INIS)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provide a level of confidence that the HEDR models are valid

  18. BACCHUS 2: an in situ backfill hydration experiment for model validation

    International Nuclear Information System (INIS)

    Volckaert, G.; Bernier, F.; Alonso, E.; Gens, A.

    1995-01-01

    The BACCHUS 2 experiment is an in situ backfill hydration test performed in the HADES underground research facility situated in the plastic Boom clay layer at 220 m depth. The experiment aims at the optimization and demonstration of an installation procedure for a clay-based backfill material. The instrumentation has been optimized in such a way that the results of the experiments can be used for the validation of hydro-mechanical codes such as NOSAT, developed at the Universitat Politècnica de Catalunya (UPC), Spain. The experimental set-up consists of a bottom flange and a central filter around which the backfill material was applied. The backfill material consists of a mixture of high-density clay pellets and clay powder. The experimental set-up and its instrumentation are described in detail. The results of the hydro-mechanical characterization of the backfill material are summarized. (authors). 8 refs., 16 figs., 1 tab

  19. A discussion on validation of hydrogeological models

    International Nuclear Information System (INIS)

    Carrera, J.; Mousavi, S.F.; Usunoff, E.J.; Sanchez-Vila, X.; Galarza, G.

    1993-01-01

    Groundwater flow and solute transport are often driven by heterogeneities that elude easy identification. It is also difficult to select and describe the physico-chemical processes controlling solute behaviour. As a result, definition of a conceptual model involves numerous assumptions both on the selection of processes and on the representation of their spatial variability. Validating a numerical model by comparing its predictions with actual measurements may not be sufficient for evaluating whether or not it provides a good representation of 'reality'. Predictions will be close to measurements, regardless of model validity, if these are taken from experiments that stress well-calibrated model modes. On the other hand, predictions will be far from measurements when model parameters are very uncertain, even if the model is indeed a very good representation of the real system. Hence, we contend that 'classical' validation of hydrogeological models is not possible. Rather, models should be viewed as theories about the real system. We propose to follow a rigorous modeling approach in which different sources of uncertainty are explicitly recognized. The application of one such approach is illustrated by modeling a laboratory uranium tracer test performed on fresh granite, which was used as Test Case 1b in INTRAVAL. (author)

  20. An attempt to calibrate and validate a simple ductile failure model against axial-torsion experiments on Al 6061-T651

    Energy Technology Data Exchange (ETDEWEB)

    Reedlunn, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lu, Wei -Yang [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-01-01

    This report details a work in progress. We have attempted to calibrate and validate a von Mises plasticity model with the Johnson-Cook failure criterion (Johnson & Cook, 1985) against a set of experiments on various specimens of Al 6061-T651. As will be shown, the effort was not successful, despite considerable attention to detail. When the model was compared against axial-torsion experiments on tubes, it over-predicted failure by 3× in tension, and never predicted failure in torsion, even when the tube was twisted 4× further than in the experiment. While this result is unfortunate, it is not surprising. Ductile failure is not well understood. In future work, we will explore whether more sophisticated material models of plasticity and failure will improve the predictions. Selecting the appropriate advanced material model and interpreting the results of said model are not trivial exercises, so it is worthwhile to fully investigate the behavior of a simple plasticity model before moving on to an anisotropic yield surface or a similarly complicated model.
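
    The Johnson-Cook failure criterion named above expresses the equivalent failure strain as a function of stress triaxiality, strain rate, and temperature, with damage accumulated incrementally until it reaches unity. A minimal sketch with placeholder constants (not a calibrated Al 6061-T651 parameter set):

    ```python
    import math

    def jc_failure_strain(triax, eps_rate, t_hom,
                          D=(0.07, 1.0, -1.5, 0.01, 0.0), eps_rate0=1.0):
        """Johnson-Cook failure strain:
        eps_f = [D1 + D2*exp(D3*sigma*)] * [1 + D4*ln(eps_rate/eps_rate0)] * [1 + D5*T*]
        sigma*: stress triaxiality; T*: homologous temperature.
        D1..D5 are placeholders, not a calibration."""
        d1, d2, d3, d4, d5 = D
        return ((d1 + d2 * math.exp(d3 * triax))
                * (1.0 + d4 * math.log(eps_rate / eps_rate0))
                * (1.0 + d5 * t_hom))

    # Damage accumulates as D = sum(d_eps_p / eps_f); failure is declared at D >= 1.
    damage, eps_p, d_eps = 0.0, 0.0, 0.01
    while damage < 1.0:
        damage += d_eps / jc_failure_strain(triax=1.0 / 3.0, eps_rate=1.0, t_hom=0.0)
        eps_p += d_eps
    print(f"accumulated plastic strain at failure: {eps_p:.2f}")
    ```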

  1. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated respectively. Also appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing bench-mark problems, and by analysing experiments. In particular the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  2. Towards a realistic approach to validation of reactive transport models for performance assessment

    International Nuclear Information System (INIS)

    Siegel, M.D.

    1993-01-01

    Performance assessment calculations are based on geochemical models that assume that interactions among radionuclides, rocks and groundwaters under natural conditions can be estimated or bounded by data obtained from laboratory-scale studies. The data include radionuclide distribution coefficients, measured in saturated batch systems of powdered rocks, and retardation factors measured in short-term column experiments. Traditional approaches to model validation cannot be applied in a straightforward manner to the simple reactive transport models that use these data. An approach to model validation in support of performance assessment is described in this paper. It is based on a recognition of different levels of model validity and is compatible with the requirements of current regulations for high-level waste disposal. Activities that are being carried out in support of this approach include (1) laboratory and numerical experiments to test the validity of important assumptions inherent in current performance assessment methodologies, (2) integrated transport experiments, and (3) development of a robust coupled reaction/transport code for sensitivity analyses using massively parallel computers
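
    For linear sorption, the distribution coefficients and retardation factors mentioned above are linked by a standard relation; a sketch with illustrative parameter values:

    ```python
    def retardation_factor(kd, bulk_density, porosity):
        """Linear-sorption retardation: R = 1 + (rho_b / theta) * Kd
        kd: distribution coefficient [mL/g], bulk_density: [g/mL],
        porosity: dimensionless water-filled porosity."""
        return 1.0 + (bulk_density / porosity) * kd

    # Illustrative batch-measurement values (assumptions)
    print(retardation_factor(kd=5.0, bulk_density=1.6, porosity=0.3))  # ~27.7
    ```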

  3. Three-dimensional fuel pin model validation by prediction of hydrogen distribution in cladding and comparison with experiment

    Energy Technology Data Exchange (ETDEWEB)

    Aly, A. [North Carolina State Univ., Raleigh, NC (United States); Avramova, Maria [North Carolina State Univ., Raleigh, NC (United States); Ivanov, Kostadin [Pennsylvania State Univ., University Park, PA (United States); Motta, Arthur [Pennsylvania State Univ., University Park, PA (United States); Lacroix, E. [Pennsylvania State Univ., University Park, PA (United States); Manera, Annalisa [Univ. of Michigan, Ann Arbor, MI (United States); Walter, D. [Univ. of Michigan, Ann Arbor, MI (United States); Williamson, R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gamble, K. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-10-29

    To correctly describe and predict the hydrogen distribution in the cladding, there is a need for multi-physics coupling that provides accurate three-dimensional azimuthal, radial, and axial temperature distributions in the cladding. Coupled high-fidelity reactor-physics codes with a sub-channel code as well as with a computational fluid dynamics (CFD) tool have been used to calculate detailed temperature distributions. These high-fidelity coupled neutronics/thermal-hydraulics code systems are coupled further with the fuel-performance BISON code with a kernel (module) for hydrogen. Both hydrogen migration and precipitation/dissolution are included in the model. Results from this multi-physics analysis are validated against calculations of hydrogen distribution using models informed by data from hydrogen experiments and PIE data.
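
    Hydrogen migration of the kind modeled here is commonly written as Fickian diffusion plus thermo-diffusion (the Soret effect) driven by the temperature gradient. A one-dimensional sketch with placeholder coefficients; the actual BISON hydrogen kernel may differ in form and parameters.

    ```python
    import numpy as np

    R = 8.314  # gas constant [J/(mol K)]

    def hydrogen_flux(c, T, dx, D=1.0e-11, q_star=2.5e4):
        """1D hydrogen flux with Fickian and Soret contributions:
        J = -D * (dC/dx + Q* * C / (R * T**2) * dT/dx)
        D [m^2/s] and heat of transport Q* [J/mol] are placeholder values."""
        return -D * (np.gradient(c, dx) + q_star * c / (R * T**2) * np.gradient(T, dx))

    x = np.linspace(0.0, 6.0e-4, 61)             # across-wall coordinate [m]
    T = 600.0 - 5.0e4 * x                        # placeholder temperature profile [K]
    c = np.full_like(x, 100.0)                   # uniform hydrogen concentration
    print(hydrogen_flux(c, T, x[1] - x[0])[:3])  # flux is directed toward the cold side
    ```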

  4. Fission product release from nuclear fuel II. Validation of ASTEC/ELSA on analytical and large scale experiments

    International Nuclear Information System (INIS)

    Brillant, G.; Marchetto, C.; Plumecocq, W.

    2013-01-01

    Highlights: • A wide range of experiments is presented for the ASTEC/ELSA code validation. • Analytical tests such as AECL, ORNL and VERCORS are considered. • A large-scale experiment, PHEBUS FPT1, is considered. • The good agreement with measurements shows the efficiency of the ASTEC modelling. • Improvements concern the FP release modelling from MOX and high burn-up UO2 fuels. - Abstract: This article is the second of two articles dedicated to the mechanisms of fission product release from a degraded core. The models of fission product release from nuclear fuel in the ASTEC code have been described in detail in the first part of this work (Brillant et al., this issue). In this contribution, the validation of ELSA, the module of ASTEC that deals with fission product and structural material release from a degraded core, is presented. A large range of experimental tests, with various temperatures and conditions for the fuel surrounding atmosphere (oxidising and reducing), is thus simulated with the ASTEC code. The validation database includes several analytical experiments with both bare fuel (e.g. MCE1 experiments) and cladded fuel (e.g. HCE3, VERCORS). Furthermore, the PHEBUS large-scale experiments are used for the validation of ASTEC. The rather satisfactory comparison between ELSA calculations and experimental measurements demonstrates the efficiency of the analytical models to describe fission product release in severe accident conditions

  5. Elements of a pragmatic approach for dealing with bias and uncertainty in experiments through predictions: experiment design and data conditioning; "real space" model validation and conditioning; hierarchical modeling and extrapolative prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose

    2011-11-01

    This report explores some important considerations in devising a practical and consistent framework and methodology for utilizing experiments and experimental data to support modeling and prediction. A pragmatic and versatile 'Real Space' approach is outlined for confronting experimental and modeling bias and uncertainty to mitigate risk in modeling and prediction. The elements of experiment design and data analysis, data conditioning, model conditioning, model validation, hierarchical modeling, and extrapolative prediction under uncertainty are examined. An appreciation can be gained for the constraints and difficulties at play in devising a viable end-to-end methodology. Rationale is given for the various choices underlying the Real Space end-to-end approach. The approach adopts and refines some elements and constructs from the literature and adds pivotal new elements and constructs. Crucially, the approach reflects a pragmatism and versatility derived from working many industrial-scale problems involving complex physics and constitutive models, steady-state and time-varying nonlinear behavior and boundary conditions, and various types of uncertainty in experiments and models. The framework benefits from a broad exposure to integrated experimental and modeling activities in the areas of heat transfer, solid and structural mechanics, irradiated electronics, and combustion in fluids and solids.

  6. Modeling a High Explosive Cylinder Experiment

    Science.gov (United States)

    Zocher, Marvin A.

    2017-06-01

    Cylindrical assemblies constructed from high explosives encased in an inert confining material are often used in experiments aimed at calibrating and validating continuum level models for the so-called equation of state (constitutive model for the spherical part of the Cauchy tensor). Such is the case in the work to be discussed here. In particular, work will be described involving the modeling of a series of experiments involving PBX-9501 encased in a copper cylinder. The objective of the work is to test and perhaps refine a set of phenomenological parameters for the Wescott-Stewart-Davis reactive burn model. The focus of this talk will be on modeling the experiments, which turned out to be non-trivial. The modeling is conducted using ALE methodology.

  7. DebrisInterMixing-2.3: a finite volume solver for three-dimensional debris-flow simulations with two calibration parameters – Part 2: Model validation with experiments

    Directory of Open Access Journals (Sweden)

    A. von Boetticher

    2017-11-01

    Here, we present validation tests of the fluid dynamic solver presented in von Boetticher et al. (2016), simulating both laboratory-scale and large-scale debris-flow experiments. The new solver combines a Coulomb viscoplastic rheological model with a Herschel–Bulkley model based on material properties and rheological characteristics of the analyzed debris flow. For the selected experiments in this study, all necessary material properties were known – the content of sand, clay (including its mineral composition) and gravel, as well as the water content and the angle of repose of the gravel. Given these properties, two model parameters are sufficient for calibration, and a range of experiments with different material compositions can be reproduced by the model without recalibration. One calibration parameter, the Herschel–Bulkley exponent, was kept constant for all simulations. The model validation focuses on different case studies illustrating the sensitivity of debris flows to water and clay content, channel curvature, channel roughness and the angle of repose. We characterize the accuracy of the model using experimental observations of flow head positions, front velocities, run-out patterns and basal pressures.
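
    The Herschel–Bulkley part of the rheology referred to above evaluates the shear stress as a yield contribution plus a rate-dependent term, with the exponent held fixed as noted. A minimal sketch with illustrative parameters; in the solver, the parameters derive from the material composition rather than being set directly.

    ```python
    def herschel_bulkley_stress(gamma_dot, tau_y=50.0, K=20.0, n=1.0):
        """Herschel-Bulkley shear stress: tau = tau_y + K * gamma_dot**n
        tau_y: yield stress [Pa]; K: consistency [Pa s^n]; n: flow index
        (all illustrative values)."""
        return tau_y + K * gamma_dot**n

    for rate in (0.1, 1.0, 10.0):  # shear rates [1/s]
        print(rate, herschel_bulkley_stress(rate))
    ```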

  8. Validity - a matter of resonant experience

    DEFF Research Database (Denmark)

    Revsbæk, Line

    This paper is about doing interview analysis drawing on the researcher's own lived experience concerning the question of inquiry. The paper exemplifies analyzing case study participants' experience from the resonant experience of the researcher's own life, evoked while listening to recorded interviews… across the researcher's past experience from the case study and her own life. The autobiographic way of analyzing conventional interview material is exemplified with the case of a junior researcher researching the newcomer innovation of others, drawing on her own experience of being a newcomer in work community… entry processes. The validity of doing interview analysis drawing on the resonant experience of the researcher is argued from a pragmatist perspective….

  9. Integrated multiscale biomaterials experiment and modelling: a perspective

    Science.gov (United States)

    Buehler, Markus J.; Genin, Guy M.

    2016-01-01

    Advances in multiscale models and computational power have enabled a broad toolset to predict how molecules, cells, tissues and organs behave and develop. A key theme in biological systems is the emergence of macroscale behaviour from collective behaviours across a range of length and timescales, and a key element of these models is therefore hierarchical simulation. However, this predictive capacity has far outstripped our ability to validate predictions experimentally, particularly when multiple hierarchical levels are involved. The state of the art represents careful integration of multiscale experiment and modelling, and yields not only validation, but also insights into deformation and relaxation mechanisms across scales. We present here a sampling of key results that highlight both challenges and opportunities for integrated multiscale experiment and modelling in biological systems. PMID:28981126

  10. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In the philosophy of science, interest in computational models and simulations has increased greatly during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety

  11. CFD validation experiments at the Lockheed-Georgia Company

    Science.gov (United States)

    Malone, John B.; Thomas, Andrew S. W.

    1987-01-01

    Information is given in viewgraph form on computational fluid dynamics (CFD) validation experiments at the Lockheed-Georgia Company. Topics covered include validation experiments on a generic fighter configuration, a transport configuration, and a generic hypersonic vehicle configuration; computational procedures; surface and pressure measurements on wings; laser velocimeter measurements of a multi-element airfoil system; the flowfield around a stiffened airfoil; laser velocimeter surveys of a circulation control wing; circulation control for high lift; and high angle of attack aerodynamic evaluations.

  12. Dynamically Scaled Model Experiment of a Mooring Cable

    Directory of Open Access Journals (Sweden)

    Lars Bergdahl

    2016-01-01

    The dynamic response of mooring cables for marine structures is scale-dependent, and perfect dynamic similitude between full-scale prototypes and small-scale physical model tests is difficult to achieve. The best possible scaling is here sought by means of a specific set of dimensionless parameters, and the model accuracy is also evaluated by two alternative sets of dimensionless parameters. A special feature of the presented experiment is that a chain was scaled to have the correct propagation celerity for longitudinal elastic waves, thus providing perfect geometrical and dynamic scaling in vacuum, which is unique. The scaling error due to the incorrect Reynolds number seemed to be of minor importance. The 33 m experimental chain could then be considered a scaled 76 mm stud chain with a length of 1240 m, i.e., at a length scale of 1:37.6. Due to the correct elastic scale, the physical model was able to reproduce the effect of snatch loads giving rise to tensional shock waves propagating along the cable. The results from the experiment were used to validate the newly developed cable-dynamics code, MooDy, which utilises a discontinuous Galerkin FEM formulation. The validation of MooDy proved to be successful for the presented experiments. The experimental data is made available here for validation of other numerical codes by publishing digitised time series of two of the experiments.
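
    The quoted 1:37.6 scale follows directly from the model and prototype chain lengths, and under the Froude similitude conventionally assumed for mooring tests, velocities and times scale with the square root of the length scale. A sketch under those standard assumptions (the paper's specific set of dimensionless parameters is not reproduced here):

    ```python
    import math

    L_model, L_prototype = 33.0, 1240.0  # chain lengths [m] from the abstract
    lam = L_prototype / L_model          # geometric scale factor
    print(f"length scale 1:{lam:.1f}")   # ~1:37.6

    # Froude similitude (standard assumption; same fluid density in model and
    # prototype): velocities and times scale with sqrt(lambda), forces with lambda**3.
    print("velocity/time scale factor:", round(math.sqrt(lam), 2))
    print("force scale factor:", round(lam**3))
    ```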

  13. The validation of evacuation simulation models through the analysis of behavioural uncertainty

    International Nuclear Information System (INIS)

    Lovreglio, Ruggiero; Ronchi, Enrico; Borri, Dino

    2014-01-01

    Both experimental and simulation data on fire evacuation are influenced by a component of uncertainty caused by the impact of the unexplained variance in human behaviour, namely behavioural uncertainty (BU). Evacuation model validation studies should include the study of this type of uncertainty during the comparison of experiments and simulation results. An evacuation model validation procedure is introduced in this paper to study the impact of BU. This methodology is presented through a case study for the comparison between repeated experimental data and simulation results produced by FDS+Evac, an evacuation model for the simulation of human behaviour in fire, which makes use of distribution laws. - Highlights: • Validation of evacuation models is investigated. • Quantitative evaluation of behavioural uncertainty is performed. • A validation procedure is presented through an evacuation case study

  14. Development and validation of a viscoelastic and nonlinear liver model for needle insertion

    Energy Technology Data Exchange (ETDEWEB)

    Kobayashi, Yo [Waseda University, Consolidated Research Institute for Advanced Science and Medical Care, Shinjuku, Tokyo (Japan); Onishi, Akinori; Hoshi, Takeharu; Kawamura, Kazuya [Waseda University, Graduate School of Science and Engineering, Shinjuku (Japan); Hashizume, Makoto [Kyushu University Hospital, Center for the Integration of Advanced Medicine and Innovative Technology, Fukuoka (Japan); Fujie, Masakatsu G. [Waseda University, Graduate School of Science and Engineering, Faculty of Science and Engineering, Shinjuku (Japan)

    2009-01-15

    The objective of our work is to develop and validate a viscoelastic and nonlinear physical liver model for organ model-based needle insertion, in which the deformation of an organ is estimated and predicted, and the needle path is determined with organ deformation taken into consideration. First, an overview is given of the development of the physical liver model. The material properties of the liver considering viscoelasticity and nonlinearity are modeled based on the measured data collected from a pig's liver. The method to develop the liver model using FEM is also shown. Second, the experimental method to validate the model is explained. Both in vitro and in vivo experiments that made use of a pig's liver were conducted for comparison with the simulation using the model. Results of the in vitro experiment showed that the model reproduces nonlinear and viscoelastic response of displacement at an internally located point with high accuracy. For a force up to 0.45 N, the maximum error is below 1 mm. Results of the in vivo experiment showed that the model reproduces the nonlinear increase of load upon the needle during insertion. Based on these results, the liver model developed and validated in this work reproduces the physical response of a liver in both in vitro and in vivo situations. (orig.)

  15. Lessons learned from recent geomagnetic disturbance model validation activities

    Science.gov (United States)

    Pulkkinen, A. A.; Welling, D. T.

    2017-12-01

    Due to concerns pertaining to geomagnetically induced current impact on ground-based infrastructure, there has been significantly elevated interest in applying models for local geomagnetic disturbance or "delta-B" predictions. Correspondingly there has been elevated need for testing the quality of the delta-B predictions generated by the modern empirical and physics-based models. To address this need, community-wide activities were launched under the GEM Challenge framework and one culmination of the activities was the validation and selection of models that were transitioned into operations at NOAA SWPC. The community-wide delta-B action is continued under the CCMC-facilitated International Forum for Space Weather Capabilities Assessment and its "Ground Magnetic Perturbations: dBdt, delta-B, GICs, FACs" working group. The new delta-B working group builds on the past experiences and expands the collaborations to cover the entire international space weather community. In this paper, we discuss the key lessons learned from the past delta-B validation exercises and lay out the path forward for building on those experience under the new delta-B working group.

  16. Chemometric and biological validation of a capillary electrophoresis metabolomic experiment of Schistosoma mansoni infection in mice.

    Science.gov (United States)

    Garcia-Perez, Isabel; Angulo, Santiago; Utzinger, Jürg; Holmes, Elaine; Legido-Quigley, Cristina; Barbas, Coral

    2010-07-01

    Metabonomic and metabolomic studies are increasingly utilized for biomarker identification in different fields, including biology of infection. The confluence of improved analytical platforms and the availability of powerful multivariate analysis software have rendered the multiparameter profiles generated by these omics platforms a user-friendly alternative to the established analysis methods where the quality and practice of a procedure is well defined. However, unlike traditional assays, validation methods for these new multivariate profiling tools have yet to be established. We propose a validation for models obtained by CE fingerprinting of urine from mice infected with the blood fluke Schistosoma mansoni. We have analysed urine samples from two sets of mice infected in an inter-laboratory experiment where different infection methods and animal husbandry procedures were employed in order to establish the core biological response to a S. mansoni infection. CE data were analysed using principal component analysis. Validation of the scores consisted of permutation scrambling (100 repetitions) and a manual validation method, using a third of the samples (not included in the model) as a test or prediction set. The validation yielded 100% specificity and 100% sensitivity, demonstrating the robustness of these models with respect to deciphering metabolic perturbations in the mouse due to a S. mansoni infection. A total of 20 metabolites across the two experiments were identified that significantly discriminated between S. mansoni-infected and noninfected control samples. Only one of these metabolites, allantoin, was identified as manifesting different behaviour in the two experiments. This study shows the reproducibility of CE-based metabolic profiling methods for disease characterization and screening and highlights the importance of much needed validation strategies in the emerging field of metabolomics.
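
    The permutation scrambling used to validate these models can be pictured as refitting a classifier on label-shuffled data and checking that performance collapses to chance. A generic sketch with synthetic data and a scikit-learn classifier, not the authors' CE/PCA pipeline:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(40, 20))      # placeholder metabolite intensities
    y = np.array([0] * 20 + [1] * 20)  # infected vs. control labels
    X[y == 1] += 0.8                   # inject a separable class difference

    model = LogisticRegression(max_iter=1000)
    true_score = cross_val_score(model, X, y, cv=5).mean()

    # 100 repetitions of label scrambling, as in the paper
    perm_scores = [cross_val_score(model, X, rng.permutation(y), cv=5).mean()
                   for _ in range(100)]

    print(f"true CV accuracy: {true_score:.2f}")
    print(f"mean permuted accuracy: {np.mean(perm_scores):.2f} (near 0.5 = chance)")
    ```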

  17. Argonne Bubble Experiment Thermal Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Buechler, Cynthia Eileen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-03

    This report describes the Computational Fluid Dynamics (CFD) model that was developed to calculate the temperatures and gas volume fractions in the solution vessel during irradiation. It is based on the model used to calculate temperatures and volume fractions in an annular vessel containing an aqueous solution of uranium. The experiment was repeated at several electron beam power levels, but the CFD analysis was performed only for the 12 kW irradiation, because this experiment came the closest to reaching a steady-state condition. The aim of the study is to compare the results of the calculation with experimental measurements to determine the validity of the CFD model.

  18. Comparative calculations and validation studies with atmospheric dispersion models

    International Nuclear Information System (INIS)

    Paesler-Sauer, J.

    1986-11-01

    This report presents the results of an intercomparison of different mesoscale dispersion models and measured data from tracer experiments. The types of models taking part in the intercomparison are Gaussian-type, numerical Eulerian, and Lagrangian dispersion models. They are suited for the calculation of the atmospheric transport of radionuclides released from a nuclear installation. For the model intercomparison, artificial meteorological situations were defined and corresponding computational problems were formulated. For the purpose of model validation, real dispersion situations from tracer experiments were used as input data for model calculations; in these cases calculated and measured time-integrated concentrations close to the ground are compared. Finally, an evaluation of the models' efficiency in solving the problems is carried out with the aid of objective methods. (orig./HP) [de

  19. Developing rural palliative care: validating a conceptual model.

    Science.gov (United States)

    Kelley, Mary Lou; Williams, Allison; DeMiglio, Lily; Mettam, Hilary

    2011-01-01

    The purpose of this research was to validate a conceptual model for developing palliative care in rural communities. This model articulates how local rural healthcare providers develop palliative care services in four sequential phases. The model has roots in concepts of community capacity development, evolves from collaborative, generalist rural practice, and utilizes existing health services infrastructure. It addresses how rural providers manage challenges, specifically those related to lack of resources, minimal community understanding of palliative care, health professionals' resistance, the bureaucracy of the health system, and the obstacles of providing services in rural environments. Seven semi-structured focus groups were conducted with interdisciplinary health providers in seven rural communities in two Canadian provinces. Using a constant comparative analysis approach, focus group data were analyzed by examining participants' statements in relation to the model and comparing emerging themes in the development of rural palliative care to the elements of the model. The data validated the conceptual model, as the model was able to theoretically predict and explain the experiences of the seven rural communities that participated in the study. New themes emerging from the data elaborated existing elements in the model and informed the requirement for minor revisions. The model was validated and slightly revised, as suggested by the data. The model was confirmed as being a useful theoretical tool for conceptualizing the development of rural palliative care that is applicable in diverse rural communities.

  20. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied...

  1. Patient Experiences with the Preoperative Assessment Clinic (PEPAC): validation of an instrument to measure patient experiences

    NARCIS (Netherlands)

    Edward, G. M.; Lemaire, L. C.; Preckel, B.; Oort, F. J.; Bucx, M. J. L.; Hollmann, M. W.; de Haes, J. C. J. M.

    2007-01-01

    Background. Presently, no comprehensive and validated questionnaire to measure patient experiences of the preoperative assessment clinic (PAC) is available. We developed and validated the Patient Experiences with the Preoperative Assessment Clinic (PEPAC) questionnaire, which can be used for

  2. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia

    2017-01-01

    The validation and verification process is a fundamental step for any software like Geant4 and GeantV, which aim to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we face the problem of comparing the results obtained from simulations with what the experiments actually observed. One way to address the problem is to perform a consistency test. Within the Geant group, we developed a compact C++ library which will be added to the automated validation process on the Geant Validation Portal.
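
    The record does not spell the consistency test out; one common choice for comparing a simulated histogram with a measured one is a chi-square test with Poisson errors, sketched below in Python for illustration only (the library described above is C++, and its API is not shown in this record):

        # Chi-square consistency test between two equally binned histograms.
        import numpy as np
        from scipy.stats import chi2

        def chi2_consistency(sim_counts, exp_counts):
            sim = np.asarray(sim_counts, dtype=float)
            exp = np.asarray(exp_counts, dtype=float)
            scale = exp.sum() / sim.sum()            # normalize simulation to data
            mask = (sim + exp) > 0                   # skip empty bins
            var = exp[mask] + scale**2 * sim[mask]   # Poisson variance of difference
            stat = np.sum((exp[mask] - scale * sim[mask])**2 / var)
            dof = int(mask.sum()) - 1                # one constraint from normalization
            return stat, chi2.sf(stat, dof)          # statistic and p-value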

  3. Validation of Hydrodynamic Numerical Model of a Pitching Wave Energy Converter

    DEFF Research Database (Denmark)

    López, Maria del Pilar Heras; Thomas, Sarah; Kramer, Morten Mejlhede

    2017-01-01

    Validation of numerical models is essential in the development of new technologies. Commercial software and codes available for simulating wave energy converters (WECs) have not yet been proven to work for all available and upcoming technologies. The present paper presents the first stages of the validation process of a hydrodynamic numerical model for a pitching wave energy converter. The development of dry tests and of wave flume and wave basin experiments is explained, lessons learned are shared, and results are presented.

  4. Validation of spectral gas radiation models under oxyfuel conditions. Part A: Gas cell experiments

    DEFF Research Database (Denmark)

    Becher, Valentin; Clausen, Sønnik; Fateev, Alexander

    2011-01-01

    … from different databases, two statistical-narrow-band models and the exponential-wide-band model. The two statistical-narrow-band models EM2C and RADCAL showed good agreement, with a maximal band transmissivity deviation of 3%. The exponential-wide-band model showed a deviation of 6%. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was recommended as a reference model for the validation of simplified CFD models.

  5. The Childbirth Experience Questionnaire (CEQ) - validation of its use in a Danish population

    DEFF Research Database (Denmark)

    Boie, Sidsel; Glavind, Julie; Uldbjerg, Niels

    Title: The Childbirth Experience Questionnaire (CEQ) - validation of its use in a Danish population. Introduction: Childbirth experience is arguably as important to measure as birth outcomes such as mode of delivery or perinatal morbidity. A robust, validated, Danish tool for evaluating childbirth experience is lacking. The Childbirth Experience Questionnaire (CEQ) was developed in Sweden in 2010 and validated in Swedish women, but never validated in a Danish setting and population. The purpose of our study was to validate the CEQ as a reliable tool for measuring the childbirth experience in Danish women … index of agreement between the two scores. Results: Face validity: all respondents stated that it was easy to understand and complete the questionnaire. Construct validity: statistically significant higher CEQ scores were …

  6. Validating a perceptual distraction model in a personal two-zone sound system

    DEFF Research Database (Denmark)

    Rämö, Jussi; Christensen, Lasse; Bech, Søren

    2017-01-01

    This paper focuses on validating a perceptual distraction model, which aims to predict a user's perceived distraction caused by audio-on-audio interference, e.g., two competing audio sources within the same listening space. Originally, the distraction model was trained with music-on-music stimuli using a simple loudspeaker setup consisting of only two loudspeakers, one for the target sound source and the other for the interfering sound source. Recently, the model was successfully validated in a complex personal sound-zone system with speech-on-music stimuli. A second round of validations was conducted by physically altering the sound-zone system and running a set of new listening experiments utilizing two sound zones within the system, thus validating the model on a different sound-zone system with both speech-on-music and music-on-speech stimuli sets. Preliminary results show …

  7. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the open issues in the field of verification/validation of model transformations.

  8. Mold-filling experiments for validation of modeling encapsulation. Part 1, "wine glass" mold.

    Energy Technology Data Exchange (ETDEWEB)

    Castaneda, Jaime N.; Grillet, Anne Mary; Altobelli, Stephen A. (New Mexico Resonance, Albuquerque, NM); Cote, Raymond O.; Mondy, Lisa Ann

    2005-06-01

    The C6 project 'Encapsulation Processes' has been designed to obtain experimental measurements for discovery of phenomena critical to improving these processes, as well as data required in the verification and validation plan (Rao et al. 2001) for model validation of flow in progressively complex geometries. We have observed and recorded the flow of clear, Newtonian liquids and opaque, rheologically complex suspensions in two mold geometries. The first geometry is a simple wineglass geometry in a cylinder and is reported here in Part 1. The results in a more realistic encapsulation geometry are reported in Part 2.

  9. A validation study for the gas migration modelling of the compacted bentonite using existing experiment data

    International Nuclear Information System (INIS)

    Tawara, Y.; Mori, K.; Tada, K.; Shimura, T.; Sato, S.; Yamamoto, S.; Hayashi, H.

    2010-01-01

    Document available in extended abstract form only. After the field-scale Gas Migration Test (GMT) was carried out at the Grimsel Test Site (GTS) in Switzerland from 1997 through 2005, a study on advanced gas migration modelling was conducted as part of the R and D programs of the RWMC (Radioactive Waste Management Funding and Research Center) to evaluate the long-term behaviour of the Engineered Barrier System (EBS) for the TRU waste disposal system in Japan. One of the main objectives of this modelling study is to provide qualified models and parameters for predicting long-term gas migration behaviour in compacted bentonite. In addition, from the perspective of coupled THMC (Thermal, Hydrological, Mechanical and Chemical) processes, the specific processes which may have a considerable impact on gas migration behaviour are discussed by means of scoping calculations. A literature survey was conducted to collect experimental data related to gas migration in compacted bentonite in order to discuss the applicability of existing gas migration models in the bentonite. The well-known flow-rate-controlled gas injection experiment by Horseman et al. and the pressure-controlled gas injection tests by Graham et al., covering a wide range of clay densities and water contents, were selected. These studies show the following characteristic behaviour of gas migration in highly compacted and water-saturated bentonite. The gas flow rate observed at the outlet in the experiment by Horseman et al. was numerically reproduced using different conceptual models and computer codes, and the applicability of the models and the identified key parameters, such as relative permeability and capillary pressure, were discussed. Helium gas was repeatedly injected into fully water-saturated and isotropically consolidated MX-80 bentonite (dry density: 1.6 Mg/m³) in the experiment. One of the most important conclusions from this experiment is that it is impossible for …

  10. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort for ER process optimization is made by using computer models. A number of models have been developed for this purpose. But as validation of these models is incomplete and oftentimes problematic, the simulation results from these models are inherently uncertain.

  11. ISOTHERMAL AIR INGRESS VALIDATION EXPERIMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Chang H Oh; Eung S Kim

    2011-09-01

    Idaho National Laboratory carried out air ingress experiments as part of validating computational fluid dynamics (CFD) calculations. An isothermal test loop was designed and set up to understand the stratified-flow phenomenon, which governs the initial air flow into the lower plenum of the very high temperature gas-cooled reactor (VHTR) when a large-break loss-of-coolant accident occurs. The experiment focused on flow characteristics unique to the VHTR air-ingress accident, in particular the visualization of the stratified flow in the inlet pipe to the vessel lower plenum of General Atomics' Gas Turbine-Modular Helium Reactor (GT-MHR). Brine and sucrose solutions were used as the heavy fluids and water as the light fluid, mimicking a counter-current flow driven by the density difference between the simulant fluids. The density ratios were varied between 0.87 and 0.98. This experiment clearly showed that a stratified flow between simulant fluids was established even for very small density differences. The CFD calculations were compared with the experimental data. A grid sensitivity study on the CFD models was also performed using Richardson extrapolation and the grid convergence index (GCI) method to assess the numerical accuracy of the CFD calculations. As a result, the calculated current speed showed very good agreement with the experimental data, indicating that current CFD methods are suitable for predicting density-gradient stratified flow phenomena in the air-ingress accident.
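
    For reference, the grid convergence index mentioned above follows Roache's formulation; a minimal sketch for three solutions obtained on systematically refined grids with a constant refinement ratio (the study's actual inputs are not given in this record):

        import math

        def grid_convergence_index(f_coarse, f_medium, f_fine, r, Fs=1.25):
            """Observed order p and fine-grid GCI for constant refinement ratio r;
            Fs is the customary safety factor for three-grid studies."""
            p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
            e21 = abs((f_medium - f_fine) / f_fine)   # relative error, finest pair
            return p, Fs * e21 / (r**p - 1.0)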

  12. Gamma streaming experiments for validation of Monte Carlo code

    International Nuclear Information System (INIS)

    Thilagam, L.; Mohapatra, D.K.; Subbaiah, K.V.; Iliyas Lone, M.; Balasubramaniyan, V.

    2012-01-01

    Inhomogeneities in shield structures lead to a considerable amount of leakage radiation (streaming), increasing the radiation levels in accessible areas. Development work on experimental as well as computational methods for quantifying this streaming radiation is continuing. The Monte Carlo-based radiation transport code MCNP is the usual tool for modeling and analyzing such problems involving complex geometries. In order to validate this computational method for streaming analysis, it is necessary to carry out experimental measurements simulating inhomogeneities such as ducts and voids present in the bulk shields for typical cases. The data thus generated are analysed by simulating the experimental setup with the MCNP code, and optimized input parameters for solving similar radiation streaming problems are formulated. Comparison of experimental data obtained from radiation streaming experiments through ducts yields a set of thumb rules and analytical fits for total radiation dose rates within and outside the duct. The present study highlights the validation of the MCNP code through gamma streaming experiments carried out with ducts of various shapes and dimensions. Overall, the present study demonstrates the suitability of the MCNP code for the analysis of gamma radiation streaming problems for all duct configurations considered. In the present study, only dose rate comparisons have been made; studies on spectral comparison of the streaming radiation are in progress. It is also planned to repeat the experiments with various shield materials. Since penetrations and ducts through bulk shields are unavoidable in an operating nuclear facility, the results of such radiation streaming simulations and experiments will be very useful for shield structure optimization without compromising radiation safety.

  13. Three phase heat and mass transfer model for unsaturated soil freezing process: Part 2 - model validation

    Science.gov (United States)

    Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin

    2018-04-01

    This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Experimental results from previous studies were used for the validation. The results showed that the correlation coefficients for the simulated and experimental water contents at different soil depths were between 0.83 and 0.92. The correlation coefficients for the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. With these high accuracies, the developed model can be used reliably to predict the water contents at different soil depths and temperatures.

  14. Model validation and calibration based on component functions of model output

    International Nuclear Information System (INIS)

    Wu, Danqing; Lu, Zhenzhou; Wang, Yanping; Cheng, Lei

    2015-01-01

    The goal of this work is to validate the component functions of model output between physical observations and the computational model using the area metric. In the theory of high dimensional model representations (HDMR) of independent input variables, conditional expectations are component functions of model output, and these conditional expectations reflect partial information of the model output. Therefore, validating the conditional expectations quantifies the discrepancy between the partial information of the computational model output and that of the observations. A calibration of the conditional expectations is then carried out to reduce the value of the model validation metric. After that, the model validation metric of the model output is recalculated with the calibrated model parameters, and the result shows that reducing the discrepancy in the conditional expectations helps decrease the difference in the model output. Finally, several examples are employed to demonstrate the rationality and necessity of the methodology in cases of both a single validation site and multiple validation sites. - Highlights: • A validation metric for conditional expectations of model output is proposed. • HDMR explains the relationship between conditional expectations and model output. • An improved approach to parameter calibration updates the computational models. • Validation and calibration are applied at single and multiple sites. • The validation and calibration process shows superiority over existing methods
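
    As an illustration of the area metric referred to above (the area between the empirical CDFs of model output and observations), a short Python sketch under the assumption of scalar samples:

        import numpy as np

        def area_metric(model_samples, observations):
            """Area between two empirical CDFs, integrated exactly over the
            merged sample grid (both CDFs are step functions)."""
            grid = np.sort(np.concatenate([model_samples, observations]))
            F_mod = np.searchsorted(np.sort(model_samples), grid, side="right") / len(model_samples)
            F_obs = np.searchsorted(np.sort(observations), grid, side="right") / len(observations)
            # each CDF is constant on [grid[i], grid[i+1]); use left-endpoint values
            return float(np.sum(np.abs(F_mod[:-1] - F_obs[:-1]) * np.diff(grid)))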

  15. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, an additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both multi-hole and pin-in-hole unit cell geometries of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data such as control rod withdrawal tests and loss-of-forced-convection tests.

  16. Validation of model predictions of pore-scale fluid distributions during two-phase flow

    Science.gov (United States)

    Bultreys, Tom; Lin, Qingyang; Gao, Ying; Raeini, Ali Q.; AlRatrout, Ahmed; Bijeljic, Branko; Blunt, Martin J.

    2018-05-01

    Pore-scale two-phase flow modeling is an important technology to study a rock's relative permeability behavior. To investigate if these models are predictive, the calculated pore-scale fluid distributions which determine the relative permeability need to be validated. In this work, we introduce a methodology to quantitatively compare models to experimental fluid distributions in flow experiments visualized with microcomputed tomography. First, we analyzed five repeated drainage-imbibition experiments on a single sample. In these experiments, the exact fluid distributions were not fully repeatable on a pore-by-pore basis, while the global properties of the fluid distribution were. Then two fractional flow experiments were used to validate a quasistatic pore network model. The model correctly predicted the fluid present in more than 75% of pores and throats in drainage and imbibition. To quantify what this means for the relevant global properties of the fluid distribution, we compare the main flow paths and the connectivity across the different pore sizes in the modeled and experimental fluid distributions. These essential topology characteristics matched well for drainage simulations, but not for imbibition. This suggests that the pore-filling rules in the network model we used need to be improved to make reliable predictions of imbibition. The presented analysis illustrates the potential of our methodology to systematically and robustly test two-phase flow models to aid in model development and calibration.
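
    The pore-by-pore agreement quoted above reduces to a simple fraction once every pore or throat carries a predicted and an observed phase label; a toy sketch with hypothetical array names:

        import numpy as np

        def phase_match_fraction(predicted_phase, observed_phase):
            """Fraction of pores/throats whose predicted occupying phase matches
            the phase segmented from the micro-CT image."""
            return float(np.mean(np.asarray(predicted_phase) == np.asarray(observed_phase)))

        # e.g. a value above 0.75 corresponds to the drainage result quoted above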

  17. Cooling tower plume - model and experiment

    Science.gov (United States)

    Cizek, Jan; Gemperle, Jiri; Strob, Miroslav; Nozicka, Jiri

    The paper describes a simple model of the so-called steam plume, which in many cases forms during the operation of evaporative cooling systems of power plants or large technological units. The model is based on semi-empirical equations that describe the behaviour of a mixture of two gases in the case of a free jet stream. In the conclusion of the paper, a simple experiment is presented through which the results of the designed model will be validated in a subsequent period.

  18. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components, either in service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field strain distributions generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.

  19. Model and experiences of initiating collaboration with traditional healers in validation of ethnomedicines for HIV/AIDS in Namibia

    Directory of Open Access Journals (Sweden)

    Chinsembu Kazhila C

    2009-10-01

    Many people with Human Immunodeficiency Virus/Acquired Immunodeficiency Syndrome (HIV/AIDS) in Namibia have access to antiretroviral drugs but some still use traditional medicines to treat opportunistic infections and offset side-effects from antiretroviral medication. Namibia has a rich biodiversity of indigenous plants that could contain novel anti-HIV agents. However, such medicinal plants have not been identified and properly documented. Various ethnomedicines used to treat HIV/AIDS opportunistic infections have not been scientifically validated for safety and efficacy. These limitations are mostly attributable to the lack of collaboration between biomedical scientists and traditional healers. This paper presents a five-step contextual model for initiating collaboration with Namibian traditional healers so that candidate plants that may contain novel anti-HIV agents are identified, and traditional medicines used to treat HIV/AIDS opportunistic infections are subjected to scientific validation. The model includes key structures and processes used to initiate collaboration with traditional healers in Namibia; namely, the National Biosciences Forum, a steering committee with the University of Namibia (UNAM) as the focal point, a study tour to Zambia and South Africa where other collaborative frameworks were examined, commemorations of the African Traditional Medicine Day (ATMD), and consultations with stakeholders in north-eastern Namibia. Experiences from these structures and processes are discussed. All traditional healers in north-eastern Namibia were willing to collaborate with UNAM so that their traditional medicines could be subjected to scientific validation. The current study provides a framework for future collaboration with traditional healers and the selection of candidate anti-HIV medicinal plants and ethnomedicines for scientific testing in Namibia.

  20. HEDR model validation plan

    International Nuclear Information System (INIS)

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1993-06-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model

  1. An Examination and Validation of an Adapted Youth Experience Scale for University Sport

    Science.gov (United States)

    Rathwell, Scott; Young, Bradley W.

    2016-01-01

    Limited tools assess positive development through university sport. Such a tool was validated in this investigation using two independent samples of Canadian university athletes. In Study 1, 605 athletes completed 99 survey items drawn from the Youth Experience Scale (YES 2.0), and separate a priori measurement models were evaluated (i.e., 99…

  2. The 'Model Omitron' proposed experiment

    International Nuclear Information System (INIS)

    Sestero, A.

    1997-05-01

    The Model Omitron is a compact tokamak experiment designed by the Fusion Engineering Unit of ENEA and CITIF CONSORTIUM. Building Model Omitron would allow for full testing of Omitron engineering and partial testing of Omitron physics, at about 1/20 of the cost estimated for the larger parent machine. In particular, due to the unusually large ohmic power densities (up to 100 times the nominal value in the Frascati FTU experiment), in Model Omitron the radial energy flux reaches values comparable to or higher than those envisaged for the larger ignition experiments Omitron, Ignitor and ITER. Consequently, conditions are expected to occur at the plasma border in the scrape-off layer of Model Omitron that are representative of the quoted larger experiments. Moreover, since all this will occur under ohmic heating alone, one will hopefully be able to derive an energy transport model for the ohmic heating regime that is valid over a wider range of plasma parameters (in particular, of the temperature parameter) than was possible before. Finally, in the Model Omitron experiment - by reducing the plasma current and/or the toroidal field down to, say, 1/3 or 1/4 of the nominal values - additional topics can be tackled, such as: large safety-factor configurations (of interest for improving confinement), large aspect-ratio configurations (of interest for the investigation of advanced concepts in tokamaks), high beta (with RF heating - also of interest for the investigation of advanced concepts in tokamaks), and long-pulse discharges (of interest for demonstrating stationary conditions in the current profile).

  3. Establishment and validation of the model of molten pool in fast reactor

    International Nuclear Information System (INIS)

    Zhou Shufeng; Luo Rui; Wang Zhou; Shi Xiaobo; Yang Xianyong

    2007-01-01

    Under beyond-design-basis accident conditions, sodium boiling and dry-out will soon occur in an LMFBR. If not stopped in time, the fuel pins of the subassembly will melt and break, forming a molten pool at the bottom of the subassembly. To present a reasonable analysis of the molten pool accident, a mechanism-based modelling method is selected, by which an integral model of the molten pool is established. Validated against the three power groups of the BF1 experiments, which belong to the French SCARABEE series of experiments, the model shows good results. After comparison with the models of the GEYSER and BF2 experiments, which had been validated before, some conclusions about the mechanisms of the molten pool are derived. Moreover, by comparing relevant parameters such as the discharged heat and the temperature increase, a reasonable analysis of the type of heat transfer is presented, on the basis of which further conclusions are derived. (authors)

  4. ISOTHERMAL AIR INGRESS VALIDATION EXPERIMENTS AT IDAHO NATIONAL LABORATORY: DESCRIPTION AND SUMMARY OF DATA

    International Nuclear Information System (INIS)

    Oh, Chang H.; Kim, Eung S.

    2010-01-01

    Idaho National Laboratory performed air ingress experiments as part of validating a computational fluid dynamics (CFD) code. An isothermal stratified-flow experiment was designed and set up to understand stratified flow phenomena in the very high temperature gas-cooled reactor (VHTR) and to provide experimental data for validating computer codes. The isothermal experiment focused on three flow characteristics unique to the VHTR air-ingress accident: stratified flow in the horizontal pipe, stratified flow expansion at the pipe and vessel junction, and stratified flow around supporting structures. Brine and sucrose solutions were used as the heavy fluids and water as the light fluid. The density ratios were varied between 0.87 and 0.98. This experiment clearly showed that a stratified flow between heavy and light fluids is generated even for very small density differences. The code was validated by conducting blind CFD simulations and comparing the results to the experimental data. A grid sensitivity study was also performed based on Richardson extrapolation and the grid convergence index method for modeling confidence. As a result, the calculated current speed showed very good agreement with the experimental data, indicating that current CFD methods are suitable for predicting density-gradient stratified flow phenomena in the air-ingress accident.

  5. Modeling validation to structural flaws in the foundations of oil tanks

    International Nuclear Information System (INIS)

    Couto, Larissa Goncalves; Leite, Sandro Passos

    2014-01-01

    This paper presents the modeling of an experiment used to study the application of backscattered neutrons to the identification of structural flaws in the foundations of oil tanks. This modeling was a preliminary validation of the calculation method, performed with the radiation transport code MCNP, for studying the application of backscattered neutrons as an inspection tool. (author)

  6. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Model Driven Engineering (MDE) is an emerging approach in software engineering. MDE emphasizes the construction of models from which the implementation is derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling makes it possible to give a syntactic structure to source and target models. However, semantic requirements also have to be imposed on source and target models. A given transformation is sound when the source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  7. Construct validity of the ovine model in endoscopic sinus surgery training.

    Science.gov (United States)

    Awad, Zaid; Taghi, Ali; Sethukumar, Priya; Tolley, Neil S

    2015-03-01

    To demonstrate construct validity of the ovine model as a tool for training in endoscopic sinus surgery (ESS). Prospective, cross-sectional evaluation study. Over 18 consecutive months, trainees and experts were evaluated on their ability to perform a range of tasks (based on previous face validation and descriptive studies conducted by the same group) relating to ESS on the sheep-head model. Anonymized, randomized video recordings of the above were assessed by two independent and blinded assessors. A validated assessment tool utilizing a five-point Likert scale was employed. Construct validity was calculated by comparing scores across training levels and experts using the mean and interquartile range of global and task-specific scores. Subgroup analysis of the intermediate group ascertained previous experience. Nonparametric descriptive statistics were used, and analysis was carried out using SPSS version 21 (IBM, Armonk, NY). Reliability of the assessment tool was confirmed. The model discriminated well between different levels of expertise in global and task-specific scores. A positive correlation was noted between year in training and both global and task-specific scores (P …); previous experience was variable, and the number of ESS procedures performed under supervision had the highest impact on performance. This study describes an alternative model for ESS training and assessment. It is also the first to demonstrate construct validity of the sheep-head model for ESS training. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  9. A Component-Based Modeling and Validation Method for PLC Systems

    Directory of Open Access Journals (Sweden)

    Rui Wang

    2014-05-01

    Programmable logic controllers (PLCs) are complex embedded systems that are widely used in industry. This paper presents a component-based modeling and validation method for PLC systems using the behavior-interaction-priority (BIP) framework. We designed a general system architecture and a component library for a type of device control system. The control software and the hardware of the environment were all modeled as BIP components. System requirements were formalized as monitors. Simulation was carried out to validate the system model. A realistic example from industry, a gates control system, was employed to illustrate our strategy. We found a couple of design errors during the simulation, which helped us to improve the dependability of the original system. The results of the experiment demonstrated the effectiveness of our approach.

  10. Development and Validation of an Instrument for Assessing Patient Experience of Chronic Illness Care

    Directory of Open Access Journals (Sweden)

    José Joaquín Mira

    2016-08-01

    Introduction: The experience of chronic patients with the care they receive, fuelled by the focus on patient-centeredness and the increasing evidence of its positive relation with other dimensions of quality, is being acknowledged as a key element in improving the quality of care. There is a dearth of accepted tools and metrics to assess patient experience from the patient's perspective that have been adapted to the new chronic care context: continued, systemic, with multidisciplinary teams and new technologies. Methods: Development and validation of a scale by means of a literature review, an expert panel, and pilot and field studies with 356 chronic primary care patients, to assess content and face validity and reliability. Results: IEXPAC is an 11+1 item scale with adequate metric properties measured by Cronbach's alpha, the goodness-of-fit index, and satisfactory convergent validity around three factors named: productive interactions, new relational model and person's self-management. Conclusions: IEXPAC allows measurement of the patient experience of chronic illness care. Together with other indicators, IEXPAC can determine the quality of care provided according to the Triple Aim framework, facilitating health systems' reorientation towards integrated patient-centred care.

  11. A practical guide for operational validation of discrete simulation models

    Directory of Open Access Journals (Sweden)

    Fabiano Leal

    2011-04-01

    As the number of simulation experiments increases, the necessity for validation and verification of these models demands special attention on the part of simulation practitioners. An analysis of the current scientific literature shows that the operational validation described in many papers does not agree on the importance assigned to this process or on the applied techniques, whether subjective or objective. With the aim of orienting professionals, researchers and students in simulation, this article elaborates a practical guide through the compilation of statistical techniques for the operational validation of discrete simulation models. Finally, the guide's applicability was evaluated using two study objects, which represent two manufacturing cells, one from the automobile industry and the other from a Brazilian tech company. For each application, the guide identified distinct steps, due to the different aspects that characterize the analyzed distributions.
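
    As an example of the kind of objective technique such a guide compiles, here is a hedged sketch of a Welch two-sample t-test comparing replicated simulation output with measurements from the real cell; the function and the choice of performance measure are assumptions for illustration:

        from scipy import stats

        def output_difference_test(sim_replications, real_observations, alpha=0.05):
            """Welch's t-test on a performance measure such as hourly throughput;
            True means no difference is detected at significance level alpha."""
            t, p = stats.ttest_ind(sim_replications, real_observations, equal_var=False)
            return p >= alpha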

  12. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    Science.gov (United States)

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criteria, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criteria developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes in to account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be

  13. Dynamic modeling and experimental validation for direct contact membrane distillation (DCMD) process

    KAUST Repository

    Eleiwi, Fadi

    2016-02-01

    This work proposes a mathematical dynamic model for the direct contact membrane distillation (DCMD) process. The model is based on a 2D Advection–Diffusion Equation (ADE), which describes the heat and mass transfer mechanisms that take place inside the DCMD module. The model studies the behavior of the process in the time-varying and steady-state phases, contributing to understanding the process performance, especially when it is driven by an intermittent energy supply, such as solar energy. The model is experimentally validated in the steady-state phase, where the permeate flux is measured for different feed inlet temperatures and the maximum absolute error recorded is 2.78 °C. Moreover, the experimental validation includes the time-variation phase, where the feed inlet temperature ranges from 30 °C to 75 °C with 0.1 °C increments every 2 min. The validation shows a relative error of less than 5%, indicating a strong correlation between the model predictions and the experiments.
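
    For orientation, a generic 2D advection-diffusion equation of the kind referred to here, written in LaTeX for a transported temperature field (the paper's actual coefficients, source terms and boundary conditions are not given in this record):

        \frac{\partial T}{\partial t} + u\,\frac{\partial T}{\partial x} + v\,\frac{\partial T}{\partial y} = \alpha\left(\frac{\partial^{2} T}{\partial x^{2}} + \frac{\partial^{2} T}{\partial y^{2}}\right)

    where (u, v) is the advecting velocity field and \alpha is the thermal diffusivity.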

  15. Validation of fracture flow models in the Stripa project

    International Nuclear Information System (INIS)

    Herbert, A.; Dershowitz, W.; Long, J.; Hodgkinson, D.

    1991-01-01

    One of the objectives of Phase III of the Stripa Project is to develop and evaluate approaches for the prediction of groundwater flow and nuclide transport in a specific unexplored volume of the Stripa granite and make a comparison with data from field measurements. During the first stage of the project, a prediction of inflow to the D-holes, an array of six parallel, closely spaced 100 m boreholes, was made based on data from six other boreholes. These data included fracture geometry, stress, single-borehole geophysical logging, crosshole and reflection radar and seismic tomograms, head monitoring and single-hole packer test measurements. Maps of fracture traces on the drift walls have also been made. The D-holes are located along a future Validation Drift which will be excavated. The water inflow to the D-holes has been measured in an experiment called the Simulated Drift Experiment. The paper reviews the Simulated Drift Experiment validation exercise. Following a discussion of the approach to validation, the characterization data and their preliminary interpretation are summarised and commented upon. The work has proved that it is feasible to carry through all the complex and interconnected tasks associated with the gathering and interpretation of characterization data, the development and application of complex models, and the comparison with measured inflows. This exercise has provided detailed feedback to the experimental and theoretical work required for measurements and predictions of flow into the Validation Drift. Computer codes used: CHANGE, FRACMAN, MAFIC, NAPSAC and TRINET. 2 figs., 2 tabs., 19 refs

  16. Development and validation of a two-dimensional fast-response flood estimation model

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV OF UTAK

    2009-01-01

    A finite difference formulation of the shallow water equations using an upwind differencing method was developed, maintaining computational efficiency and accuracy such that it can be used as a fast-response flood estimation tool. The model was validated using both laboratory-controlled experiments and an actual dam breach. Through the laboratory experiments, the model was shown to give good estimates of depth and velocity when compared to the measured data, as well as when compared to a more complex two-dimensional model. Additionally, the model was compared to high water mark data obtained from the failure of the Taum Sauk dam. The simulated inundation extent agreed well with the observed extent, with the most notable differences resulting from the inability to model sediment transport. The results of these validation studies show that a relatively simple numerical scheme used to solve the complete shallow water equations can be used to accurately estimate flood inundation. Future work will focus on further reducing the computation time needed to provide flood inundation estimates for fast-response analyses. This will be accomplished through the efficient use of multi-core, multi-processor computers coupled with an efficient domain-tracking algorithm, as well as an understanding of the impacts of grid resolution on model results.
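
    The record does not give the scheme itself; a minimal sketch of first-order upwind differencing, shown for 1D linear advection rather than the full shallow water system, illustrates the idea (periodic boundaries assumed):

        import numpy as np

        def upwind_step(u, a, dx, dt):
            """One explicit step of du/dt + a du/dx = 0 with first-order upwinding.
            Stable for |a|*dt/dx <= 1 (CFL condition); periodic boundaries via roll."""
            c = a * dt / dx
            if a >= 0:
                return u - c * (u - np.roll(u, 1))   # backward (upwind) difference
            return u - c * (np.roll(u, -1) - u)      # forward difference for a < 0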

  17. Experimental validation of a mathematical model for seabed liquefaction in waves

    DEFF Research Database (Denmark)

    Sumer, B. Mutlu; Kirca, Özgür; Fredsøe, Jørgen

    2011-01-01

    This paper summarizes the results of an experimental study directed towards the validation of a mathematical model for the buildup of pore water pressure and resulting liquefaction of marine soils under progressive waves. Experiments were conducted under controlled conditions with silt (d50 = 0.070 mm) in a wave flume with a soil pit. Waves with heights in the range 7.7-18 cm, a water depth of 55 cm and a wave period of 1.6 s enabled us to study pore water pressure buildup in both the liquefaction and no-liquefaction regimes. The experimental data was used to validate the model. A numerical …

  18. Validation of a two-fluid model used for the simulation of dense fluidized beds; Validation d'un modele a deux fluides applique a la simulation des lits fluidises denses

    Energy Technology Data Exchange (ETDEWEB)

    Boelle, A.

    1997-02-17

    A two-fluid model applied to the simulation of gas-solid dense fluidized beds is validated at the micro scale and at the macro scale. Phase coupling is carried out in the momentum and energy transport equations of both phases. The modeling is built on the kinetic theory of granular media, in which the gas action has been taken into account in order to obtain correct expressions for the transport coefficients. A description of hydrodynamic interactions between particles in high Stokes number flow is also incorporated in the model. The micro-scale validation uses Lagrangian numerical simulations viewed as numerical experiments. The first validation case refers to a gas-particle simple shear flow; it allows validation of the competition between two dissipation mechanisms: drag and particle collisions. The second validation case is concerned with sedimenting particles in high Stokes number flow; it allows validation of our approach to hydrodynamic interactions. This last case led us to develop an original Lagrangian simulation with two-way coupling between the fluid and the particles. The macro-scale validation uses the results of Eulerian simulations of a dense fluidized bed. Bed height, particle circulation and the characteristics of spontaneously created bubbles are studied and compared to experimental measurements, looking at both physical and numerical parameters. (author) 159 refs.

  19. Ground-water models: Validate or invalidate

    Science.gov (United States)

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  20. Experience with Aero- and Fluid-Dynamic Testing for Engineering and CFD Validation

    Science.gov (United States)

    Ross, James C.

    2016-01-01

    Ever since computations have been used to simulate aerodynamics, the need to ensure that the computations adequately represent real life has followed. Many experiments have been performed specifically for validation, and as computational methods have improved, so have the validation experiments. Validation is also a moving target because computational methods improve, requiring validation for the new aspects of flow physics that the computations aim to capture. Concurrently, new measurement techniques are being developed that can help capture more detailed flow features; pressure-sensitive paint (PSP) and particle image velocimetry (PIV) come to mind. This paper will present various wind-tunnel tests the author has been involved with and how they were used for validation of various kinds of CFD. A particular focus is the application of advanced measurement techniques to flow fields (and geometries) that had proven difficult to predict computationally. Many of these difficult flow problems arose from engineering and development problems that needed to be solved for a particular vehicle or research program. In some cases the experiments required to solve the engineering problems were refined to provide valuable CFD validation data in addition to the primary engineering data. All of these experiments have provided physical insight and validation data for a wide range of aerodynamic and acoustic phenomena for vehicles ranging from tractor-trailers to crewed spacecraft.

  1. Validation Experiments for Spent-Fuel Dry-Cask In-Basket Convection

    International Nuclear Information System (INIS)

    Smith, Barton L.

    2016-01-01

    This work consisted of the following major efforts: 1. Literature survey on validation of external natural convection; 2. Design the experiment; 3. Build the experiment; 4. Run the experiment; 5. Collect results; 6. Disseminate results; and 7. Perform a CFD validation study using the results. We note that while all tasks are complete, some deviations from the original plan were made. Specifically, geometrical changes in the parameter space were skipped in favor of flow condition changes, which were found to be much more practical to implement. Changing the geometry required new as-built measurements, which proved extremely costly and impractical given the time and funds available.

  2. Validation Experiments for Spent-Fuel Dry-Cask In-Basket Convection

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Barton L. [Utah State Univ., Logan, UT (United States). Dept. of Mechanical and Aerospace Engineering

    2016-08-16

    This work consisted of the following major efforts: 1. Literature survey on validation of external natural convection; 2. Design the experiment; 3. Build the experiment; 4. Run the experiment; 5. Collect results; 6. Disseminate results; and 7. Perform a CFD validation study using the results. We note that while all tasks are complete, some deviations from the original plan were made. Specifically, geometrical changes in the parameter space were skipped in favor of flow condition changes, which were found to be much more practical to implement. Changing the geometry required new as-built measurements, which proved extremely costly and impractical given the time and funds available.

  3. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens

    2016-01-01

    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main result presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types of experiments are performed: a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology. The modelling methodology shows promising results for usage with systems composed of reconfigurable underwater modules...
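
    As a minimal illustration of the hydrostatic free-decay tests mentioned above, the sketch below estimates a damping ratio from successive oscillation peaks via the logarithmic decrement; the peak values are hypothetical and the linear second-order assumption is ours, not necessarily the paper's model.

```python
import numpy as np

def log_decrement_damping(peaks):
    """Estimate the damping ratio from successive peak amplitudes
    of a free-decay test using the logarithmic decrement."""
    peaks = np.asarray(peaks, dtype=float)
    delta = np.log(peaks[:-1] / peaks[1:]).mean()   # mean log decrement
    # Damping ratio of an assumed linear second-order system
    return delta / np.sqrt(4.0 * np.pi**2 + delta**2)

# Hypothetical peak amplitudes from a decaying heave oscillation
print(f"damping ratio: {log_decrement_damping([10.0, 7.4, 5.5, 4.1, 3.0]):.3f}")
```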

  4. Assessment for Complex Learning Resources: Development and Validation of an Integrated Model

    Directory of Open Access Journals (Sweden)

    Gudrun Wesiak

    2013-01-01

    Full Text Available Today’s e-learning systems meet the challenge to provide interactive, personalized environments that support self-regulated learning as well as social collaboration and simulation. At the same time, assessment procedures have to be adapted to the new learning environments by moving from isolated summative assessments to integrated assessment forms. Therefore, learning experiences enriched with complex didactic resources - such as virtualized collaborations and serious games - have emerged. In this extension of [1], an integrated model for e-assessment (IMA) is outlined, which incorporates complex learning resources and assessment forms as main components for the development of an enriched learning experience. For validation, the IMA was presented to a group of experts from the fields of cognitive science, pedagogy, and e-learning. The findings from the validation led to several refinements of the model, which mainly concern the component forms of assessment and the integration of social aspects. Both aspects are accounted for in the revised model, the former by providing a detailed sub-model for assessment forms.

  5. Numerical Simulation of Tuff Dissolution and Precipitation Experiments: Validation of Thermal-Hydrologic-Chemical (THC) Coupled-Process Modeling

    Science.gov (United States)

    Dobson, P. F.; Kneafsey, T. J.

    2001-12-01

    As part of an ongoing effort to evaluate THC effects on flow in fractured media, we performed a laboratory experiment and numerical simulations to investigate mineral dissolution and precipitation. To replicate mineral dissolution by condensate in fractured tuff, deionized water equilibrated with carbon dioxide was flowed for 1,500 hours through crushed Yucca Mountain tuff at 94°C. The reacted water was collected and sampled for major dissolved species, total alkalinity, electrical conductivity, and pH. The resulting steady-state fluid composition had a total dissolved solids content of about 140 mg/L; silica was the dominant dissolved constituent. A portion of the steady-state reacted water was flowed at 10.8 mL/hr into a 31.7-cm tall, 16.2-cm wide vertically oriented planar fracture with a hydraulic aperture of 31 microns in a block of welded Topopah Spring tuff that was maintained at 80°C at the top and 130°C at the bottom. The fracture began to seal within five days. A 1-D plug-flow model using the TOUGHREACT code developed at Berkeley Lab was used to simulate mineral dissolution, and a 2-D model was developed to simulate the flow of mineralized water through a planar fracture, where boiling conditions led to mineral precipitation. Predicted concentrations of the major dissolved constituents for the tuff dissolution were within a factor of 2 of the measured average steady-state compositions. The fracture-plugging simulations result in the precipitation of amorphous silica at the base of the boiling front, leading to a hundred-fold decrease in fracture permeability in less than 6 days, consistent with the laboratory experiment. These results help validate the use of the TOUGHREACT code for THC modeling of the Yucca Mountain system. The experiment and simulations indicate that boiling and concomitant precipitation of amorphous silica could cause significant reductions in fracture porosity and permeability on a local scale. The TOUGHREACT code will be used
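
    The reported hundred-fold permeability decrease can be related to aperture reduction with a simple parallel-plate argument. The sketch below is a rough estimate assuming the parallel-plate ("cubic law") fracture model, not anything taken from the TOUGHREACT simulations: the intrinsic fracture permeability k = b^2/12 drops a hundred-fold when the 31-micron aperture narrows tenfold.

```python
def fracture_permeability(aperture_m):
    # Parallel-plate ("cubic law") fracture: intrinsic permeability k = b^2 / 12
    return aperture_m**2 / 12.0

b0 = 31e-6      # initial hydraulic aperture: 31 microns
b1 = b0 / 10.0  # aperture after silica precipitation (assumed for illustration)
print(f"permeability reduction factor: "
      f"{fracture_permeability(b0) / fracture_permeability(b1):.0f}")  # ~100
```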

  6. CFD and FEM modeling of PPOOLEX experiments

    Energy Technology Data Exchange (ETDEWEB)

    Paettikangas, T.; Niemi, J.; Timperi, A. (VTT Technical Research Centre of Finland (Finland))

    2011-01-15

    A large-break LOCA experiment performed with the PPOOLEX experimental facility is analysed with CFD calculations. Simulation of the first 100 seconds of the experiment is performed using the Euler-Euler two-phase model of FLUENT 6.3. In wall condensation, the condensing water forms a film layer on the wall surface, which is modelled by mass transfer from the gas phase to the liquid water phase in the near-wall grid cell. The direct-contact condensation in the wetwell is modelled with simple correlations. The wall condensation and direct-contact condensation models are implemented with user-defined functions in FLUENT. Fluid-Structure Interaction (FSI) calculations of the PPOOLEX experiments and of a realistic BWR containment are also presented. Two-way coupled FSI calculations of the experiments were numerically unstable with explicit coupling. A linear perturbation method is therefore used to prevent the numerical instability. The method is first validated against numerical data and against the PPOOLEX experiments. Preliminary FSI calculations are then performed for a realistic BWR containment by modeling a sector of the containment and one blowdown pipe. For the BWR containment, one- and two-way coupled calculations as well as calculations with LPM are carried out. (Author)

  7. Generation of integral experiment covariance data and their impact on criticality safety validation

    Energy Technology Data Exchange (ETDEWEB)

    Stuke, Maik; Peters, Elisabeth; Sommer, Fabian

    2016-11-15

    The quantification of statistical dependencies in data of critical experiments and how to account for them properly in validation procedures has been discussed in the literature by various groups. However, these subjects are still an active topic in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD-NEA Nuclear Science Committee. The latter compiles and publishes the freely available experimental data collection, the International Handbook of Evaluated Criticality Safety Benchmark Experiments, ICSBEP. Most of the experiments were performed as series and share parts of experimental setups, consequently leading to correlation effects in the results. The correct consideration of correlated data seems to be inevitable if the experimental data in a validation procedure are limited or one cannot rely on a sufficient number of uncorrelated data sets, e.g. from different laboratories using different setups. The general determination of correlations and the underlying covariance data, as well as their consideration in a validation procedure, is the focus of the following work. We discuss and demonstrate possible effects on calculated k_eff values, their uncertainties, and the corresponding covariance matrices due to the interpretation of evaluated experimental data and its translation into calculation models. The work shows the effects of various modeling approaches and varying distribution functions of parameters, and compares and discusses results from the applied Monte-Carlo sampling method with available data on correlations. Our findings indicate that for the reliable determination of integral experimental covariance matrices or correlation coefficients, a detailed study of the underlying experimental data, the modeling approach and assumptions made, and the resulting sensitivity analysis seems to be inevitable. Further, a Bayesian method is discussed to include integral experimental covariance data when estimating an
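
    The effect of shared experimental setups on result correlations can be illustrated with a toy Monte-Carlo sampling exercise in the spirit described above; the sensitivities and uncertainties below are invented for illustration and are not evaluated benchmark data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Two benchmark cases from one series: both k_eff results depend on a shared
# uncertain setup parameter plus a case-specific one (linear toy model).
shared = rng.normal(0.0, 1.0, n)
keff1 = 1.000 + 0.004 * shared + 0.002 * rng.normal(0.0, 1.0, n)
keff2 = 0.998 + 0.004 * shared + 0.002 * rng.normal(0.0, 1.0, n)

print("covariance matrix:\n", np.cov(keff1, keff2))
print("correlation coefficient:", round(np.corrcoef(keff1, keff2)[0, 1], 2))  # ~0.8
```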

  8. Generation of integral experiment covariance data and their impact on criticality safety validation

    International Nuclear Information System (INIS)

    Stuke, Maik; Peters, Elisabeth; Sommer, Fabian

    2016-11-01

    The quantification of statistical dependencies in data of critical experiments and how to account for them properly in validation procedures has been discussed in the literature by various groups. However, these subjects are still an active topic in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD-NEA Nuclear Science Committee. The latter compiles and publishes the freely available experimental data collection, the International Handbook of Evaluated Criticality Safety Benchmark Experiments, ICSBEP. Most of the experiments were performed as series and share parts of experimental setups, consequently leading to correlation effects in the results. The correct consideration of correlated data seems to be inevitable if the experimental data in a validation procedure are limited or one cannot rely on a sufficient number of uncorrelated data sets, e.g. from different laboratories using different setups. The general determination of correlations and the underlying covariance data, as well as their consideration in a validation procedure, is the focus of the following work. We discuss and demonstrate possible effects on calculated k_eff values, their uncertainties, and the corresponding covariance matrices due to the interpretation of evaluated experimental data and its translation into calculation models. The work shows the effects of various modeling approaches and varying distribution functions of parameters, and compares and discusses results from the applied Monte-Carlo sampling method with available data on correlations. Our findings indicate that for the reliable determination of integral experimental covariance matrices or correlation coefficients, a detailed study of the underlying experimental data, the modeling approach and assumptions made, and the resulting sensitivity analysis seems to be inevitable. Further, a Bayesian method is discussed to include integral experimental covariance data when estimating an application

  9. Modeling the Effects of Argument Length and Validity on Inductive and Deductive Reasoning

    Science.gov (United States)

    Rotello, Caren M.; Heit, Evan

    2009-01-01

    In an effort to assess models of inductive reasoning and deductive reasoning, the authors, in 3 experiments, examined the effects of argument length and logical validity on evaluation of arguments. In Experiments 1a and 1b, participants were given either induction or deduction instructions for a common set of stimuli. Two distinct effects were…

  10. Natural analogues and radionuclide transport model validation

    International Nuclear Information System (INIS)

    Lever, D.A.

    1987-08-01

    In this paper, some possible roles for natural analogues are discussed from the point of view of those involved with the development of mathematical models for radionuclide transport and with the use of these models in repository safety assessments. The characteristic features of a safety assessment are outlined in order to address the questions of where natural analogues can be used to improve our understanding of the processes involved and where they can assist in validating the models that are used. Natural analogues have the potential to provide useful information about some critical processes, especially long-term chemical processes and migration rates. There is likely to be considerable uncertainty and ambiguity associated with the interpretation of natural analogues, and thus it is their general features which should be emphasized, and models with appropriate levels of sophistication should be used. Experience gained in modelling the Koongarra uranium deposit in northern Australia is drawn upon. (author)

  11. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    Full Text Available In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques for model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model, using an example taken from a management study.
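
    A minimal sketch of such quantitative performance measures on held-out data is given below, using scikit-learn and synthetic data in place of the management study's sample; discrimination is summarized by the AUC and probability accuracy by the Brier score.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import brier_score_loss, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the study's data set
X, y = make_classification(n_samples=500, n_features=8, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
p = model.predict_proba(X_te)[:, 1]          # predicted event probabilities

print(f"AUC: {roc_auc_score(y_te, p):.3f}")              # discrimination
print(f"Brier score: {brier_score_loss(y_te, p):.3f}")   # probability accuracy
```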

  12. Thermo-mechanical analyses and model validation in the HAW test field. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Heijdra, J J; Broerse, J; Prij, J

    1995-01-01

    An overview is given of the thermo-mechanical analysis work done for the design of the High Active Waste experiment and for the validation of the models used, through comparison with experiments. A brief treatise is given on the problems of validating models used for the prediction of physical behaviour that cannot be determined with experiments. The analysis work encompasses investigations into the initial state of stress in the field, the constitutive relations, the temperature rise, and the pressure on the liner tubes inserted in the field to guarantee the retrievability of the radioactive sources used for the experiment. The measurements of temperatures, deformations, and stresses are described, and an evaluation is given of the comparison of measured and calculated data. An attempt has been made to qualify or even quantify the discrepancies, if any, between measurements and calculations. It was found that the model for the temperature calculations performed adequately. For the stresses the general tendency was good; however, large discrepancies exist, mainly due to inaccuracies in the measurements. For the deformations, the general tendency of the model predictions was again in accordance with the measurements. However, from the evaluation it appears that, in spite of the efforts to estimate the correct initial rock pressure at the location of the experiment, this pressure has been underestimated. The evaluation has contributed to a considerable increase in confidence in the models and gives no reason to question the constitutive model for rock salt. However, due to the quality of the stress measurements and the relatively short duration of the experiments, no quantitatively firm support for the constitutive model was acquired. Collections of graphs giving the measured and calculated data are attached as appendices. (orig.).

  13. Thermo-mechanical analyses and model validation in the HAW test field. Final report

    International Nuclear Information System (INIS)

    Heijdra, J.J.; Broerse, J.; Prij, J.

    1995-01-01

    An overview is given of the thermo-mechanical analysis work done for the design of the High Active Waste experiment and for the validation of the models used, through comparison with experiments. A brief treatise is given on the problems of validating models used for the prediction of physical behaviour that cannot be determined with experiments. The analysis work encompasses investigations into the initial state of stress in the field, the constitutive relations, the temperature rise, and the pressure on the liner tubes inserted in the field to guarantee the retrievability of the radioactive sources used for the experiment. The measurements of temperatures, deformations, and stresses are described, and an evaluation is given of the comparison of measured and calculated data. An attempt has been made to qualify or even quantify the discrepancies, if any, between measurements and calculations. It was found that the model for the temperature calculations performed adequately. For the stresses the general tendency was good; however, large discrepancies exist, mainly due to inaccuracies in the measurements. For the deformations, the general tendency of the model predictions was again in accordance with the measurements. However, from the evaluation it appears that, in spite of the efforts to estimate the correct initial rock pressure at the location of the experiment, this pressure has been underestimated. The evaluation has contributed to a considerable increase in confidence in the models and gives no reason to question the constitutive model for rock salt. However, due to the quality of the stress measurements and the relatively short duration of the experiments, no quantitatively firm support for the constitutive model was acquired. Collections of graphs giving the measured and calculated data are attached as appendices. (orig.)

  14. Contribution to the verification and the validation of an unsteady two-phase flow model

    International Nuclear Information System (INIS)

    Liu, Yujie

    2013-01-01

    This thesis contributes to the verification and validation of the Baer-Nunziato (BN) model for modelling water hammer phenomena in industrial piping systems. It consists of two parts: the first models water hammer flows with the BN model in the Eulerian representation, and the second extends the model to the ALE (Arbitrary Lagrangian Eulerian) formalism so as to take fluid-structure interaction (FSI) into account. To model water hammer flows, closure laws of the BN model concerning the interfacial/source terms and the equations of state (EOS) were first studied. The whole system was then simulated with a fractional step method comprising two steps, one for the resolution of the convective part, the other for the source terms. For the convective part, the Rusanov scheme was first checked, and some stability problems were observed; a more stable fractional step scheme was therefore proposed and verified. Regarding the source terms, four non-instantaneous relaxation schemes, representing the return to equilibrium of pressure and the transfers of momentum, heat and mass, were applied in succession. These schemes were extended to a 'generalized Stiffened Gas' EOS in order to represent phase change. After reproducing some typical phenomena associated with water hammer flows, the BN model was validated against the Simpson experiment, a classical water hammer test case, and the Canon experiment, a rapid decompression of fluid in a high-pressure duct. Moreover, the model was compared with two homogeneous models on these two experiments. Finally, an ALE version of the BN model was implemented and verified on cases of wave propagation in a single-phase flow and a two-phase flow in a flexible pipe. The variation of the wave propagation speed due to the coupling between the fluid and the structure was well recovered. The validation was performed on an experiment which examines the response of a pipe filled with water, subjected to a violent pressure peak (140 bar
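
    The fractional-step structure described above (a convective step followed by non-instantaneous relaxation of the source terms) can be sketched on a toy system; the scalar upwind advection and linear pressure relaxation below only illustrate the splitting idea, not the actual BN closures or schemes of the thesis.

```python
import numpy as np

def convective_step(u, dt, dx, a=1.0):
    # First-order upwind update for du/dt + a du/dx = 0 (a > 0)
    un = u.copy()
    un[1:] -= a * dt / dx * (u[1:] - u[:-1])
    return un

def pressure_relaxation(p1, p2, dt, tau=1e-3):
    # Non-instantaneous return of two phase pressures to a common equilibrium
    p_eq = 0.5 * (p1 + p2)
    w = np.exp(-dt / tau)                   # exact linear-relaxation factor
    return p_eq + (p1 - p_eq) * w, p_eq + (p2 - p_eq) * w

nx = 100
dx, dt = 1.0 / nx, 0.5 / nx                 # CFL = 0.5
p1 = np.where(np.linspace(0.0, 1.0, nx) < 0.5, 2.0, 1.0)  # pressurized left half
p2 = np.ones(nx)
for _ in range(50):                         # Lie splitting: convect, then relax
    p1, p2 = convective_step(p1, dt, dx), convective_step(p2, dt, dx)
    p1, p2 = pressure_relaxation(p1, p2, dt)
print(f"max pressure disequilibrium: {np.abs(p1 - p2).max():.2e}")
```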

  15. Establishing model credibility involves more than validation

    International Nuclear Information System (INIS)

    Kirchner, T.

    1991-01-01

    One widely used definition of validation is the quantitative test of the performance of a model through the comparison of model predictions to independent sets of observations from the system being simulated. The ability to show that the model predictions compare well with observations is often thought to be the most rigorous test that can be used to establish credibility for a model in the scientific community. However, such tests are only part of the process used to establish credibility, and in some cases may be either unnecessary or misleading. Naylor and Finger extended the concept of validation to include the establishment of validity for the postulates embodied in the model and the test of assumptions used to select postulates for the model. Validity of postulates is established through concurrence by experts in the field of study that the mathematical or conceptual model contains the structural components and mathematical relationships necessary to adequately represent the system with respect to the goals for the model. This extended definition of validation provides for consideration of the structure of the model, not just its performance, in establishing credibility. Evaluation of a simulation model should establish the correctness of the code and the efficacy of the model within its domain of applicability. (24 refs., 6 figs.)

  16. The role of CFD combustion modelling in hydrogen safety management – VI: Validation for slow deflagration in homogeneous hydrogen-air-steam experiments

    Energy Technology Data Exchange (ETDEWEB)

    Cutrono Rakhimov, A., E-mail: cutrono@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Visser, D.C., E-mail: visser@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Holler, T., E-mail: tadej.holler@ijs.si [Jožef Stefan Institute (JSI), Jamova cesta 39, 1000 Ljubljana (Slovenia); Komen, E.M.J., E-mail: komen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands)

    2017-01-15

    Highlights: • Deflagration of hydrogen-air-steam homogeneous mixtures is modeled in a medium-scale containment. • Adaptive mesh refinement is applied on flame front positions. • Steam effect influence on combustion modeling capabilities is investigated. • Mean pressure rise is predicted with 18% under-prediction when steam is involved. • Peak pressure is evaluated with 5% accuracy when steam is involved. - Abstract: Large quantities of hydrogen can be generated during a severe accident in a water-cooled nuclear reactor. When released in the containment, the hydrogen can create a potential deflagration risk. The dynamic pressure loads resulting from hydrogen combustion can be detrimental to the structural integrity of the reactor. Therefore, accurate prediction of these pressure loads is an important safety issue. In previous papers, we validated a Computational Fluid Dynamics (CFD) based method to determine the pressure loads from a fast deflagration. The combustion model applied in the CFD method is based on the Turbulent Flame Speed Closure (TFC). In our last paper, we presented the extension of this combustion model, Extended Turbulent Flame Speed Closure (ETFC), and its validation against hydrogen deflagration experiments in the slow deflagration regime. During a severe accident, cooling water will enter the containment as steam. Therefore, the effect of steam on hydrogen deflagration is important to capture in a CFD model. The primary objectives of the present paper are to further validate the TFC and ETFC combustion models, and investigate their capability to predict the effect of steam. The peak pressures, the trends of the flame velocity, and the pressure rise with an increase in the initial steam dilution are captured reasonably well by both combustion models. In addition, the ETFC model appeared to be more robust to mesh resolution changes. The mean pressure rise is evaluated with 18% under-prediction and the peak pressure is evaluated with 5

  17. Approaches to Validation of Models for Low Gravity Fluid Behavior

    Science.gov (United States)

    Chato, David J.; Marchetta, Jeffery; Hochstein, John I.; Kassemi, Mohammad

    2005-01-01

    This paper details the authors' experiences with the validation of computer models to predict low gravity fluid behavior. It reviews the literature of low gravity fluid behavior as a starting point for developing a baseline set of test cases. It examines the authors' attempts to validate their models against these cases and the issues they encountered. The main issues seem to be that: most of the data are described by empirical correlations rather than fundamental relations; detailed measurements of the flow field have not been made; free surface shapes are observed through thick plastic cylinders and are therefore subject to a great deal of optical distortion; and heat transfer process time constants are on the order of minutes to days, while the zero-gravity time available has been only seconds.

  18. Validation results of satellite mock-up capturing experiment using nets

    Science.gov (United States)

    Medina, Alberto; Cercós, Lorenzo; Stefanescu, Raluca M.; Benvenuto, Riccardo; Pesce, Vincenzo; Marcon, Marco; Lavagna, Michèle; González, Iván; Rodríguez López, Nuria; Wormnes, Kjetil

    2017-05-01

    The PATENDER activity (Net parametric characterization and parabolic flight), funded by the European Space Agency (ESA) via its Clean Space initiative, aimed to validate a simulation tool for designing nets for capturing space debris. This validation has been performed through a set of different experiments under microgravity conditions where a net was launched, capturing and wrapping a satellite mock-up. This paper presents the architecture of the thrown-net dynamics simulator together with the set-up of the deployment experiment and its trajectory reconstruction results on a parabolic flight (Novespace A-310, June 2015). The simulator has been implemented within the Blender framework in order to provide a highly configurable tool, able to reproduce different scenarios for Active Debris Removal missions. The experiment was performed over thirty parabolas offering around 22 s of zero-g conditions. A flexible meshed fabric structure (the net), ejected from a container and propelled by corner masses (the bullets) arranged around its circumference, was launched at different initial velocities and launching angles using a pneumatic-based dedicated mechanism (representing the chaser satellite) against a target mock-up (the target satellite). High-speed motion cameras recorded the experiment, allowing 3D reconstruction of the net motion. The net knots were coloured to allow the images to be post-processed using colour segmentation, stereo matching and iterative closest point (ICP) for knot tracking. The final objective of the activity was the validation of the net deployment and wrapping simulator using images recorded during the parabolic flight. The high-resolution images acquired have been post-processed to determine accurately the initial conditions and generate the reference data (position and velocity of all knots of the net along its deployment and wrapping of the target mock-up) for the simulator validation. The simulator has been properly
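
    The knot-tracking chain starts from colour segmentation of each high-speed frame. The sketch below shows that step with OpenCV; the HSV thresholds and the synthetic test frame are illustrative stand-ins, not PATENDER's actual calibration.

```python
import cv2
import numpy as np

def knot_centroids(frame_bgr, hsv_low=(40, 80, 80), hsv_high=(80, 255, 255)):
    """Locate coloured net knots in one camera frame by HSV segmentation
    and return their sub-pixel centroids (illustrative thresholds)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    out = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:                      # skip degenerate blobs
            out.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return out

# Synthetic frame with one green "knot"; per-camera centroids would then feed
# stereo matching and ICP to reconstruct 3D knot trajectories.
frame = np.zeros((120, 160, 3), dtype=np.uint8)
cv2.circle(frame, (80, 60), 4, (0, 255, 0), -1)
print(knot_centroids(frame))                  # approximately [(80.0, 60.0)]
```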

  19. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    Science.gov (United States)

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  20. Modelling the Grimsel migration field experiments at PSI

    International Nuclear Information System (INIS)

    Heer, W.

    1997-01-01

    For several years tracer migration experiments have been performed at Nagra's Grimsel Test Site as a joint undertaking of Nagra, PNC and PSI. The aims of modelling the migration experiments are (1) to better understand the nuclide transport through crystalline rock; (2) to gain information on validity of methods and correlating parameters; (3) to improve models for safety assessments. The PSI modelling results, presented here, show a consistent picture for the investigated tracers (the non-sorbing uranine, the weakly sorbing sodium, the moderately sorbing strontium and the more strongly sorbing cesium). They represent an important step in building up confidence in safety assessments for radioactive waste repositories. (author) 5 figs., 1 tab., 12 refs

  1. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
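
    The sketch below illustrates the recommended procedure with scikit-learn on synthetic data: an inner cross-validation selects the LASSO penalty, an outer loop assesses the selected model (double cross-validation), and a permutation test supplies the significance of the performance; the data set and settings are stand-ins, not the xerostomia cohort.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import (GridSearchCV, cross_val_score,
                                     permutation_test_score)

X, y = make_classification(n_samples=200, n_features=20, n_informative=4,
                           random_state=0)   # stand-in for NTCP data

# Inner loop: tune the L1 (LASSO) penalty; outer loop: unbiased assessment
lasso_logit = GridSearchCV(
    LogisticRegression(penalty="l1", solver="liblinear"),
    param_grid={"C": np.logspace(-2, 2, 9)}, cv=5)
auc = cross_val_score(lasso_logit, X, y, cv=5, scoring="roc_auc")
print(f"double-CV AUC: {auc.mean():.2f} +/- {auc.std():.2f}")

# Permutation test: null distribution from label-shuffled refits
_, _, p_value = permutation_test_score(lasso_logit, X, y, cv=5,
                                       scoring="roc_auc", n_permutations=100,
                                       random_state=0)
print(f"permutation p-value: {p_value:.3f}")
```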

  2. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van 't; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.

  3. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Full Text Available Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  4. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models.
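
    A model validation test differs from an ordinary unit test in that it compares a simulation output against experimental data within a tolerance rather than checking code logic. The pytest sketch below illustrates the pattern; the simulated quantity, reference value and tolerance are hypothetical, not OpenWorm's actual test suite.

```python
import pytest

def simulate_resting_potential():
    # Placeholder for running the model (e.g. a neuron simulation)
    return -60.2   # mV

EXPERIMENTAL_VALUE_MV = -60.0   # published measurement (assumed)
TOLERANCE_MV = 2.0              # acceptable model-data discrepancy

def test_resting_potential_matches_experiment():
    # Validation test: model output must agree with experiment within tolerance
    assert simulate_resting_potential() == pytest.approx(
        EXPERIMENTAL_VALUE_MV, abs=TOLERANCE_MV)
```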

  5. Validation of CTF Droplet Entrainment and Annular/Mist Closure Models using Riso Steam/Water Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Wysocki, Aaron J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Salko, Robert K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-01

    This report summarizes the work done to validate the droplet entrainment and de-entrainment models as well as two-phase closure models in the CTF code by comparison with experimental data obtained at Riso National Laboratory. The Riso data included a series of over 250 steam/water experiments that were performed in both tube and annulus geometries over a range of pressures and outlet qualities. Experimental conditions were set so that the majority of cases were in the annular/mist flow regime. Measurements included liquid film flow rate, droplet flow rate, film thickness, and two-phase pressure drop. CTF was used to model 180 of the tubular geometry cases, matching experimental geometry, outlet pressure, and outlet flow quality to experimental values. CTF results were compared to the experimental data at the outlet of the test section in terms of vapor and entrained liquid flow fractions, pressure drop per unit length, and liquid film thickness. The entire process of generating CTF input decks, running cases, extracting data, and generating comparison plots was scripted using Python and Matplotlib for a completely automated validation process. All test cases and scripting tools have been committed to the COBRA-TF master repository and selected cases have been added to the continuous testing system to serve as regression tests. The differences between the CTF- and experimentally-calculated flow fraction values were consistent with previous calculations by Wurtz, who applied the same entrainment correlation to the same data. It has been found that CTF's entrainment/de-entrainment predictive capability in the annular/mist flow regime for this particular facility is comparable to the licensed industry code, COBRAG. While film and droplet predictions are generally good, it has been found that accuracy is diminished at lower flow qualities. This finding is consistent with the noted deficiencies in the Wurtz entrainment model employed by CTF. The CTF predicted two-phase pressure drop in
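
    The report notes that the comparison pipeline was scripted with Python and Matplotlib; a minimal sketch of such an automated comparison step is given below, with invented data values and a generic parity plot with +/-20% bands, not the actual CTF post-processing scripts.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")          # headless backend, suitable for regression runs
import matplotlib.pyplot as plt

measured = np.array([0.42, 0.55, 0.61, 0.70, 0.78])   # illustrative values
predicted = np.array([0.45, 0.51, 0.66, 0.69, 0.83])

fig, ax = plt.subplots()
ax.scatter(measured, predicted, label="code vs. data")
line = np.linspace(0.0, 1.0, 2)
ax.plot(line, line, "k-", label="parity")
ax.plot(line, 1.2 * line, "k--", label="+/-20%")
ax.plot(line, 0.8 * line, "k--")
ax.set_xlabel("measured film flow fraction")
ax.set_ylabel("predicted film flow fraction")
ax.legend()
fig.savefig("film_fraction_parity.png", dpi=150)
```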

  6. Validation of transport models for use in repository performance assessments: a view illustrated for INTRAVAL test case 1b

    International Nuclear Information System (INIS)

    Jackson, C.P.; Lever, D.A.; Sumner, P.J.

    1991-03-01

    We present our views on validation. We consider that validation is slightly different for general models and specific models. We stress the importance of presenting for review the case for (or against) a model. We outline a formal framework for validation, which helps to ensure that all the issues are addressed. Our framework includes calibration, testing predictions, comparison with alternative models, which we consider particularly important, analysis of discrepancies, presentation, consideration of implications and suggested improved experiments. We illustrate the approach by application to an INTRAVAL test case based on laboratory experiments. Three models were considered: a simple model that included the effects of advection, dispersion and equilibrium sorption, a model that also included the effects of rock-matrix diffusion, and a model with kinetic sorption. We show that the model with rock-matrix diffusion is the only one to provide a good description of the data. We stress the implications of extrapolating to larger length and time scales for repository performance assessments. (author)

  7. A proposed best practice model validation framework for banks

    Directory of Open Access Journals (Sweden)

    Pieter J. (Riaan) de Jongh

    2017-06-01

    Full Text Available Background: With the increasing use of complex quantitative models in applications throughout the financial world, model risk has become a major concern. The credit crisis of 2008–2009 provoked added concern about the use of models in finance. Measuring and managing model risk has subsequently come under scrutiny from regulators, supervisors, banks and other financial institutions. Regulatory guidance indicates that meticulous monitoring of all phases of model development and implementation is required to mitigate this risk. Considerable resources must be mobilised for this purpose. The exercise must embrace model development, assembly, implementation, validation and effective governance. Setting: Model validation practices are generally patchy, disparate and sometimes contradictory, and although the Basel Accord and some regulatory authorities have attempted to establish guiding principles, no definite set of global standards exists. Aim: To assess the available literature for the best validation practices. Methods: This comprehensive literature study provided a background to the complexities of effective model management and focussed on model validation as a component of model risk management. Results: We propose a coherent ‘best practice’ framework for model validation. Scorecard tools are also presented to evaluate whether the proposed best practice model validation framework has been adequately assembled and implemented. Conclusion: The proposed best practice model validation framework is designed to assist firms in the construction of an effective, robust and fully compliant model validation programme and comprises three principal elements: model validation governance, policy and process.

  8. Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes

    Science.gov (United States)

    Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico

    2017-12-01

    Chemical compositional simulation of enhanced oil recovery and surfactant-enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of a newly developed mathematical formulation which satisfies the conservation laws of mass and energy and allows a sequential solution approach in which the governing equations are solved separately and implicitly. Through its application to a numerical experiment using a wettability alteration model, and through comparisons with an existing chemical compositional model's numerical results, the new model has proven to be practical, reliable and stable.

  9. TU-D-201-05: Validation of Treatment Planning Dose Calculations: Experience Working with MPPG 5.a

    Energy Technology Data Exchange (ETDEWEB)

    Xue, J; Park, J; Kim, L; Wang, C [MD Anderson Cancer Center at Cooper, Camden, NJ (United States); Balter, P; Ohrt, J; Kirsner, S; Ibbott, G [UT MD Anderson Cancer Center, Houston, TX (United States)

    2016-06-15

    Purpose: The newly published medical physics practice guideline (MPPG 5.a) has set the minimum requirements for commissioning and QA of treatment planning dose calculations. We present our experience in the validation of a commercial treatment planning system based on MPPG 5.a. Methods: In addition to tests traditionally performed to commission a model-based dose calculation algorithm, extensive tests were carried out at short and extended SSDs, various depths, oblique gantry angles and off-axis conditions to verify the robustness and limitations of the dose calculation algorithm. A comparison between measured and calculated dose was performed based on the validation tests and evaluation criteria recommended by MPPG 5.a. An ion chamber was used for the measurement of dose at points of interest, and diodes were used for photon IMRT/VMAT validations. Dose profiles were measured with a three-dimensional scanning system and calculated in the TPS using a virtual water phantom. Results: Calculated and measured absolute dose profiles were compared at each specified SSD and depth for open fields. Disagreement is easily identifiable from the difference curve. Subtle discrepancies revealed the limitations of the measurements, e.g., a spike at the high dose region and an asymmetrical penumbra observed in the tests with an oblique MLC beam. The excellent results (> 98% pass rate on the 3%/3 mm gamma index) on the end-to-end tests for both IMRT and VMAT are attributed to the quality of the beam data and a good understanding of the modeling. The limitations of the model and the uncertainty of measurement were considered when comparing the results. Conclusion: The extensive tests recommended by the MPPG encourage us to understand the accuracy and limitations of a dose algorithm as well as the uncertainty of measurement. Our experience has shown how the suggested tests can be performed effectively to validate dose calculation models.
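
    The 3%/3 mm gamma criterion used for the end-to-end comparisons can be written down compactly. The sketch below implements a one-dimensional gamma index on invented profiles; clinical evaluations are done in 2-D/3-D with dedicated tools, so this is only the bare definition.

```python
import numpy as np

def gamma_1d(x_meas, d_meas, x_calc, d_calc, dd=0.03, dta=0.3):
    """1-D gamma index: minimum combined dose-difference /
    distance-to-agreement metric; dd = 3% of max dose, dta = 0.3 cm."""
    d_norm = d_meas.max()
    g = np.empty_like(d_meas)
    for i, (xm, dm) in enumerate(zip(x_meas, d_meas)):
        dist2 = ((x_calc - xm) / dta) ** 2
        dose2 = ((d_calc - dm) / (dd * d_norm)) ** 2
        g[i] = np.sqrt((dist2 + dose2).min())
    return g

x = np.linspace(-5.0, 5.0, 201)                   # position, cm
measured = np.exp(-x**2 / 8.0)                    # invented profile
calculated = 1.01 * np.exp(-(x - 0.05)**2 / 8.0)  # small shift and scaling
g = gamma_1d(x, measured, x, calculated)
print(f"gamma pass rate (gamma <= 1): {100.0 * (g <= 1.0).mean():.1f}%")
```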

  10. Validation of single-fluid and two-fluid magnetohydrodynamic models of the helicity injected torus spheromak experiment with the NIMROD code

    International Nuclear Information System (INIS)

    Akcay, Cihan; Victor, Brian S.; Jarboe, Thomas R.; Kim, Charlson C.

    2013-01-01

    We present a comparison study of 3-D pressureless resistive MHD (rMHD) and 3-D pressureless two-fluid MHD models of the Helicity Injected Torus with Steady Inductive helicity injection (HIT-SI). HIT-SI is a current drive experiment that uses two geometrically asymmetric helicity injectors to generate and sustain toroidal plasmas. The comparable size of the collisionless ion skin depth d_i to the resistive skin depth indicates the importance of the Hall term for HIT-SI. The simulations are run with NIMROD, an initial-value, 3-D extended MHD code. The modeled plasma density and temperature are assumed uniform and constant. The helicity injectors are modeled as oscillating normal magnetic and parallel electric field boundary conditions. The simulations use parameters that closely match those of the experiment. The simulation output is compared to the formation time, plasma current, and internal and surface magnetic fields. Results of the study indicate that 2fl-MHD shows quantitative agreement with the experiment while rMHD only captures the qualitative features. The validity of each model is assessed based on how accurately it reproduces the global quantities as well as the temporal and spatial dependence of the measured magnetic fields. 2fl-MHD produces the current amplification (I_tor/I_inj) and formation time τ_f demonstrated by HIT-SI with similar internal magnetic fields. rMHD underestimates (I_tor/I_inj) and exhibits a much longer τ_f. Biorthogonal decomposition (BD), a powerful mathematical tool for reducing large data sets, is employed to quantify how well the simulations reproduce the measured surface magnetic fields without resorting to a probe-by-probe comparison. BD shows that 2fl-MHD captures the dominant surface magnetic structures and the temporal behavior of these features better than rMHD
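
    Biorthogonal decomposition of probe data is, computationally, an SVD of the space-time signal matrix: left singular vectors give spatial structures ("topos"), right singular vectors their time evolution ("chronos"). The sketch below applies it to synthetic surface-magnetics data; the rotating mode and noise level are invented stand-ins for the HIT-SI measurements.

```python
import numpy as np

rng = np.random.default_rng(2)
n_probes, n_times = 32, 2000
theta = np.linspace(0.0, 2.0 * np.pi, n_probes, endpoint=False)
t = np.linspace(0.0, 1.0, n_times)

# Rotating mode sampled by a poloidal probe array, plus broadband noise
omega = 2.0 * np.pi * 10.0
B = (np.outer(np.cos(theta), np.cos(omega * t))
     + np.outer(np.sin(theta), np.sin(omega * t))
     + 0.1 * rng.standard_normal((n_probes, n_times)))

topos, weights, chronos = np.linalg.svd(B, full_matrices=False)
energy = weights**2 / np.sum(weights**2)
print(f"signal energy in first two BD modes: {energy[:2].sum():.2f}")
```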

  11. Proceedings of the workshop on integral experiment covariance data for critical safety validation

    Energy Technology Data Exchange (ETDEWEB)

    Stuke, Maik (ed.)

    2016-04-15

    For some time, attempts to quantify the statistical dependencies of critical experiments and to account for them properly in validation procedures have been discussed in the literature by various groups. Besides the development of suitable methods, the quality and modeling issues of the freely available experimental data in particular are the focus of current discussions, carried out for example in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD-NEA Nuclear Science Committee. The same committee also compiles and publishes the freely available experimental data in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. Most of these experiments were performed as series and might share parts of experimental setups, leading to correlated results. The quality of the determination of these correlations and of the underlying covariance data depends strongly on the quality of the documentation of the experiments.

  12. Proceedings of the workshop on integral experiment covariance data for critical safety validation

    International Nuclear Information System (INIS)

    Stuke, Maik

    2016-04-01

    For some time, attempts to quantify the statistical dependencies of critical experiments and to account for them properly in validation procedures have been discussed in the literature by various groups. Besides the development of suitable methods, the quality and modeling issues of the freely available experimental data in particular are the focus of current discussions, carried out for example in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD-NEA Nuclear Science Committee. The same committee also compiles and publishes the freely available experimental data in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. Most of these experiments were performed as series and might share parts of experimental setups, leading to correlated results. The quality of the determination of these correlations and of the underlying covariance data depends strongly on the quality of the documentation of the experiments.

  13. Roll-up of validation results to a target application.

    Energy Technology Data Exchange (ETDEWEB)

    Hills, Richard Guy

    2013-09-01

    Suites of experiments are performed over a validation hierarchy to test computational simulation models for complex applications. Experiments within the hierarchy can be performed at different conditions and configurations than those for an intended application, with each experiment testing only part of the physics relevant for the application. The purpose of the present work is to develop methodology to roll up validation results to an application, and to assess the impact the validation hierarchy design has on the roll-up results. The roll-up is accomplished through the development of a meta-model that relates validation measurements throughout a hierarchy to the desired response quantities for the target application. The meta-model is developed using the computational simulation models for the experiments and the application. The meta-model approach is applied to a series of example transport problems that represent complete and incomplete coverage of the physics of the target application by the validation experiments.

  14. A model to predict element redistribution in unsaturated soil: Its simplification and validation

    International Nuclear Information System (INIS)

    Sheppard, M.I.; Stephens, M.E.; Davis, P.A.; Wojciechowski, L.

    1991-01-01

    A research model has been developed to predict the long-term fate of contaminants entering unsaturated soil at the surface, through irrigation or atmospheric deposition, and/or at the water table through groundwater. The model, called SCEMR1 (Soil Chemical Exchange and Migration of Radionuclides, Version 1), uses Darcy's law to model water movement and the soil solid/liquid partition coefficient, Kd, to model chemical exchange. SCEMR1 has been validated extensively on controlled field experiments with several soils, aeration statuses and the effects of plants. These validation results show that the model is robust and performs well. Sensitivity analyses identified soil Kd, annual effective precipitation, soil type and soil depth as the four most important model parameters. SCEMR1 consumes too much computer time for incorporation into a probabilistic assessment code. Therefore, we have used SCEMR1 output to derive a simple assessment model. The assessment model reflects the complexity of its parent code and provides a more realistic description of contaminant transport in soils than would a compartment model. Comparison of the performance of the SCEMR1 research model, the simple SCEMR1 assessment model and the TERRA compartment model on a four-year soil-core experiment shows that the SCEMR1 assessment model generally provides conservative soil concentrations. (15 refs., 3 figs.)
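
    The Kd-based chemical exchange enters transport through the retardation factor R = 1 + (rho_b/theta) * Kd, which divides the pore-water velocity. The numbers in the worked example below are assumed, illustrative values, not SCEMR1 parameters.

```python
rho_b = 1.4e3    # soil bulk density, kg/m^3 (assumed)
theta = 0.35     # volumetric water content (assumed)
kd = 1.0e-2      # solid/liquid partition coefficient, m^3/kg (assumed)
v_water = 0.5    # pore-water velocity from Darcy's law, m/yr (assumed)

R = 1.0 + (rho_b / theta) * kd
print(f"retardation factor: {R:.0f}")                   # 41
print(f"contaminant velocity: {v_water / R:.3f} m/yr")  # ~0.012 m/yr
```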

  15. Statistical Validation of Engineering and Scientific Models: Background

    International Nuclear Information System (INIS)

    Hills, Richard G.; Trucano, Timothy G.

    1999-01-01

    A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation of uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented: a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burger's equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation of uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented, including error-band-based, multivariate, sum-of-squares-of-residuals, and optimization methods. After completion of the tutorial, a survey of statistical model validation literature is presented and recommendations for future work are made
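
    Both propagation methods from the tutorial can be shown on the damped spring-mass example's natural frequency f = sqrt(k/m)/(2*pi); the input means and standard deviations below are illustrative, and the inputs are taken as uncorrelated.

```python
import numpy as np

f = lambda k, m: np.sqrt(k / m) / (2.0 * np.pi)   # natural frequency, Hz
k_mean, k_std = 1000.0, 50.0                      # stiffness, N/m (assumed)
m_mean, m_std = 2.0, 0.1                          # mass, kg (assumed)

# Sensitivity (first-order) method: linearize about the mean inputs
df_dk = (f(k_mean + 1e-6, m_mean) - f(k_mean, m_mean)) / 1e-6
df_dm = (f(k_mean, m_mean + 1e-6) - f(k_mean, m_mean)) / 1e-6
std_lin = np.hypot(df_dk * k_std, df_dm * m_std)

# Monte Carlo method: sample the inputs and evaluate the model directly
rng = np.random.default_rng(3)
out = f(rng.normal(k_mean, k_std, 100_000), rng.normal(m_mean, m_std, 100_000))

print(f"sensitivity method std: {std_lin:.4f} Hz")
print(f"Monte Carlo std:        {out.std():.4f} Hz")
```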

  16. Validation analysis of pool fire experiment (Run-F7) using SPHINCS code

    International Nuclear Information System (INIS)

    Yamaguchi, Akira; Tajima, Yuji

    1998-04-01

    SPHINCS (Sodium Fire Phenomenology IN multi-Cell System) code has been developed for the safety analysis of sodium fire accidents in a Fast Breeder Reactor. The main features of the SPHINCS code with respect to sodium pool fire phenomena are multi-dimensional modeling of the thermal behavior in the sodium pool and steel liner, modeling of the extension of the sodium pool area based on sodium mass conservation, and an equilibrium model for the chemical reaction of the pool fire on the flame sheet at the surface of the sodium pool. The SPHINCS code is therefore capable of detailed temperature evaluation of the steel liner during small and/or medium scale sodium leakage accidents. In this study, the Run-F7 experiment, in which the sodium leakage rate was 11.8 kg/hour, has been analyzed. In the experiment the diameter of the sodium pool was approximately 60 cm and the maximum steel liner temperature was 616°C. The analytical results show that the agreement between the SPHINCS analysis and the experiment is excellent with respect to the time history and spatial distribution of the liner temperature, the sodium pool extension behavior, as well as the atmosphere gas temperature. It is concluded that the pool fire modeling of the SPHINCS code has been validated for this experiment. The SPHINCS code is currently applicable to sodium pool fire phenomena and the temperature evaluation of the steel liner. The experiment series is being continued to examine some parameters, i.e., the sodium leakage rate and the height of sodium leakage. Thus, the author will analyze the subsequent experiments to check the influence of these parameters and apply SPHINCS to the sodium fire consequence analysis of fast reactors. (author)

  17. Langmuir probe-based observables for plasma-turbulence code validation and application to the TORPEX basic plasma physics experiment

    International Nuclear Information System (INIS)

    Ricci, Paolo; Theiler, C.; Fasoli, A.; Furno, I.; Labit, B.; Mueller, S. H.; Podesta, M.; Poli, F. M.

    2009-01-01

    The methodology for plasma-turbulence code validation is discussed, with focus on the quantities to use for the simulation-experiment comparison, i.e., the validation observables, and on its application to the TORPEX basic plasma physics experiment [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)]. The considered validation observables are deduced from Langmuir probe measurements and are ordered into a primacy hierarchy, according to the number of model assumptions and to the combinations of measurements needed to form each of them. The lowest levels of the primacy hierarchy correspond to observables that require the lowest number of model assumptions and measurement combinations, such as the statistical and spectral properties of the ion saturation current time trace, while at the highest levels, quantities such as particle transport are considered. The comparison of the observables at the lowest levels in the hierarchy is more stringent than at the highest levels. Examples of the use of the proposed observables are applied to a specific TORPEX plasma configuration characterized by interchange-driven turbulence.
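
    A hedged sketch of the lowest-level observables named in the abstract, computed here for a synthetic ion saturation current trace: statistical moments plus a Welch power spectrum. The sampling rate and signal are stand-ins, not TORPEX data.

```python
# Statistical and spectral properties of a (synthetic) ion saturation
# current time trace, the lowest level of the primacy hierarchy.
import numpy as np
from scipy import signal, stats

fs = 250e3                     # sampling frequency, Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)
rng = np.random.default_rng(1)
isat = (1.0 + 0.2 * np.sin(2 * np.pi * 4e3 * t)
        + 0.05 * rng.standard_normal(t.size))

moments = {
    "mean": isat.mean(),
    "std": isat.std(),
    "skewness": stats.skew(isat),
    "kurtosis": stats.kurtosis(isat),
}
f, psd = signal.welch(isat, fs=fs, nperseg=4096)
print(moments, "spectral peak at %.0f Hz" % f[np.argmax(psd[1:]) + 1])
```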

  18. Validation of infrared thermography in serotonin-induced itch model in rats

    DEFF Research Database (Denmark)

    Dagnæs-Hansen, Frederik; Jasemian, Yousef; Gazerani, Parisa

    The number of scratching bouts is generally used as a standard method in animal models of itch. The aim of the present study was to validate the application of infrared thermography (IR-Th) in a serotonin-induced itch model in rats. Adult Sprague-Dawley male rats (n = 24) were used in 3 consecutive...... experiments. The first experiment evaluated vasomotor response (IR-Th) and scratching behavior (number of bouts) induced by intradermal serotonin (10 μl, 2%). Isotonic saline (control: 10 μl, 0.9%) and Methysergide (antagonist: 10 μl, 0.047 mg/ml) were used. The second experiment evaluated the dose......-response effect of intradermal serotonin (1%, 2% and 4%) on local temperature. The third experiment used anesthetized rats to test the local vasomotor responses in the absence of scratching. Serotonin elicited significant scratching and lowered the local temperature at the site of injection. A dose

  19. Model Validation Using Coordinate Distance with Performance Sensitivity

    Directory of Open Access Journals (Sweden)

    Jiann-Shiun Lew

    2008-01-01

    This paper presents an innovative approach to model validation for a structure with significant parameter variations. Model uncertainty of the structural dynamics is quantified with the use of a singular value decomposition technique to extract the principal components of parameter change, and an interval model is generated to represent the system with parameter uncertainty. The coordinate vector, corresponding to the identified principal directions, of the validation system is computed. The coordinate distance between the validation system and the identified interval model is used as a metric for model validation. A beam structure with an attached subsystem, which has significant parameter uncertainty, is used to demonstrate the proposed approach.
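
    The sketch below illustrates the coordinate-distance idea under stated assumptions: an SVD of an ensemble of parameter vectors extracts the principal directions, the interval model becomes a box in those coordinates, and the metric is the distance of a validation system outside that box. The data are synthetic and the paper's exact interval-model construction is not reproduced.

```python
# SVD-based coordinate distance between a validation system and an
# interval model built from an ensemble of parameter vectors.
import numpy as np

rng = np.random.default_rng(2)
ensemble = rng.normal(size=(50, 6))       # 50 systems x 6 parameters
mean = ensemble.mean(axis=0)
_, _, vt = np.linalg.svd(ensemble - mean, full_matrices=False)

coords = (ensemble - mean) @ vt.T         # ensemble in principal coords
lo, hi = coords.min(axis=0), coords.max(axis=0)  # interval-model box

candidate = rng.normal(size=6)            # the validation system
c = (candidate - mean) @ vt.T
# zero inside the box; otherwise the norm of the per-axis violations
distance = np.linalg.norm(np.maximum(0, np.maximum(lo - c, c - hi)))
print(f"coordinate distance outside interval model: {distance:.3f}")
```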

  20. A Large-Scale Multibody Manipulator Soft Sensor Model and Experiment Validation

    Directory of Open Access Journals (Sweden)

    Wu Ren

    2014-01-01

    Stress signals are difficult to obtain in the health monitoring of multibody manipulators. In order to solve this problem, a soft sensor method is presented. In the method, the stress signal is considered as the dominant variable and the angle signal is regarded as the auxiliary variable. By establishing the mathematical relationship between them, a soft sensor model is proposed. In the model, the stress information can be deduced from angle information, which can be easily measured for such structures in experiments. Finally, tests of ground and wall working conditions were done on a multibody manipulator test rig. The results show that the stress calculated by the proposed method is close to the measured one. Thus, the stress signal is easier to obtain than with the traditional method. All of this proves that the model is correct and the method is feasible.
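
    A minimal soft-sensor sketch of the dominant/auxiliary-variable idea: fit a mapping from the easily measured angle to the hard-to-measure stress on calibration data, then infer stress from angle alone. The quadratic form and all values are assumptions; the paper's actual model form is not given in the abstract.

```python
# Soft sensor sketch: calibrate stress = f(angle), then infer stress.
import numpy as np

angle = np.linspace(0.0, 60.0, 25)             # deg, measured
stress = 80.0 + 2.5 * angle - 0.02 * angle**2  # MPa, from strain gauges (toy)
stress += np.random.default_rng(3).normal(0, 1.0, angle.size)

coeffs = np.polyfit(angle, stress, deg=2)      # calibrate the soft sensor
estimate = np.polyval(coeffs, 35.0)            # infer stress at a new angle
print(f"estimated stress at 35 deg: {estimate:.1f} MPa")
```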

  1. [Caregiver's health: adaptation and validation in a Spanish population of the Experience of Caregiving Inventory (ECI)].

    Science.gov (United States)

    Crespo-Maraver, Mariacruz; Doval, Eduardo; Fernández-Castro, Jordi; Giménez-Salinas, Jordi; Prat, Gemma; Bonet, Pere

    2018-04-04

    To adapt and to validate the Experience of Caregiving Inventory (ECI) in a Spanish population, providing empirical evidence of its internal consistency, internal structure and validity. Psychometric validation of the adapted version of the ECI. One hundred and seventy-two caregivers (69.2% women), mean age 57.51 years (range: 21-89), participated. Demographic and clinical data and standardized measures (ECI, suffering scale of the SCL-90-R, Zarit burden scale) were used. The two scales of negative evaluation of the ECI most related to serious mental disorders (disruptive behaviours [DB] and negative symptoms [NS]) and the two scales of positive appreciation (positive personal experiences [PPE] and good aspects of the relationship [GAR]) were analyzed. Exploratory structural equation modelling was used to analyze the internal structure. The relationship between the ECI scales and the SCL-90-R and Zarit scores was also studied. The four-factor model presented a good fit. Cronbach's alpha (DB: 0.873; NS: 0.825; PPE: 0.720; GAR: 0.578) showed a higher homogeneity in the negative scales. The SCL-90-R scores correlated with the negative ECI scales, and none of the ECI scales correlated with the Zarit scale. The Spanish version of the ECI can be considered a valid, reliable, understandable and feasible self-report measure for administration in the health and community context. Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  2. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  3. First experiments results about the engineering model of Rapsodie

    International Nuclear Information System (INIS)

    Chalot, A.; Ginier, R.; Sauvage, M.

    1964-01-01

    This report deals with the first series of experiments carried out on the engineering model of Rapsodie and on an associated sodium facility set up in a laboratory hall at Cadarache. More precisely, it covers: 1/ The difficulties encountered during the erection and assembly of the engineering model, and a compilation of the results of the first series of experiments and tests carried out on this installation (loading of the subassemblies, preheating, thermal shocks...). 2/ The experiments and tests carried out on the two prototype control rod drive mechanisms, which led to the choice of the design of the definitive drive mechanism. As a whole, the results proved the validity of the general design principles adopted for Rapsodie. (authors) [fr

  4. Bayesian model calibration of computational models in velocimetry diagnosed dynamic compression experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Justin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hund, Lauren [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    Dynamic compression experiments are being performed on complicated materials using increasingly complex drivers. The data produced in these experiments are beginning to reach a regime where traditional analysis techniques break down; requiring the solution of an inverse problem. A common measurement in dynamic experiments is an interface velocity as a function of time, and often this functional output can be simulated using a hydrodynamics code. Bayesian model calibration is a statistical framework to estimate inputs into a computational model in the presence of multiple uncertainties, making it well suited to measurements of this type. In this article, we apply Bayesian model calibration to high pressure (250 GPa) ramp compression measurements in tantalum. We address several issues specific to this calibration including the functional nature of the output as well as parameter and model discrepancy identifiability. Specifically, we propose scaling the likelihood function by an effective sample size rather than modeling the autocorrelation function to accommodate the functional output and propose sensitivity analyses using the notion of 'modularization' to assess the impact of experiment-specific nuisance input parameters on estimates of material properties. We conclude that the proposed Bayesian model calibration procedure results in simple, fast, and valid inferences on the equation of state parameters for tantalum.
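
    The sketch below illustrates the effective-sample-size likelihood scaling described in the abstract, under the assumption of a Gaussian likelihood for a correlated functional residual; the ESS estimator and the data are simplified stand-ins, not the authors' implementation.

```python
# ESS-scaled Gaussian log-likelihood for a functional (time-trace) output.
import numpy as np

def effective_sample_size(residual):
    """n / (1 + 2*sum of positive-lag autocorrelations), truncated
    at the first non-positive lag (a simple, common estimator)."""
    n = residual.size
    r = residual - residual.mean()
    acf = np.correlate(r, r, mode="full")[n - 1:] / (r @ r)
    s = 0.0
    for rho in acf[1:]:
        if rho <= 0:
            break
        s += rho
    return n / (1.0 + 2.0 * s)

def scaled_loglik(observed, simulated, sigma):
    resid = observed - simulated
    ess = effective_sample_size(resid)
    full = -0.5 * np.sum((resid / sigma) ** 2) - resid.size * np.log(sigma)
    return full * ess / resid.size   # temper the likelihood by ESS/n

# Toy usage: a synthetic velocity trace vs. a 'simulator' prediction.
obs = np.sin(np.linspace(0, 3, 200)) + \
    0.05 * np.random.default_rng(4).standard_normal(200)
sim = np.sin(np.linspace(0, 3, 200))
print(scaled_loglik(obs, sim, sigma=0.05))
```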

  5. Benchmarking Multilayer-HySEA model for landslide generated tsunami. HTHMP validation process.

    Science.gov (United States)

    Macias, J.; Escalante, C.; Castro, M. J.

    2017-12-01

    Landslide tsunami hazard may be dominant along significant parts of the coastline around the world, in particular in the USA, as compared to hazards from other tsunamigenic sources. This fact motivated NTHMP to address the need for benchmarking models for landslide generated tsunamis, following the same methodology already used for standard tsunami models when the source is seismic. To perform the above-mentioned validation process, a set of candidate benchmarks was proposed. These benchmarks are based on a subset of available laboratory data sets for solid slide experiments and deformable slide experiments, and include both submarine and subaerial slides. A benchmark based on a historic field event (Valdez, AK, 1964) closes the list of proposed benchmarks, for a total of 7 benchmarks. The Multilayer-HySEA model including non-hydrostatic effects has been used to perform all the benchmarking problems dealing with laboratory experiments proposed in the workshop that was organized at Texas A&M University - Galveston, on January 9-11, 2017 by NTHMP. The aim of this presentation is to show some of the latest numerical results obtained with the Multilayer-HySEA (non-hydrostatic) model in the framework of this validation effort. Acknowledgements: This research has been partially supported by the Spanish Government Research project SIMURISK (MTM2015-70490-C02-01-R) and University of Malaga, Campus de Excelencia Internacional Andalucía Tech. The GPU computations were performed at the Unit of Numerical Methods (University of Malaga).

  6. Validation of RNAi Silencing Efficiency Using Gene Array Data shows 18.5% Failure Rate across 429 Independent Experiments

    Directory of Open Access Journals (Sweden)

    Gyöngyi Munkácsy

    2016-01-01

    No independent cross-validation of the success rate for studies utilizing small interfering RNA (siRNA) for gene silencing has been completed before. To assess the influence of experimental parameters like cell line, transfection technique, validation method, and type of control, we have to validate these in a large set of studies. We utilized gene chip data published for siRNA experiments to assess the success rate and to compare the methods used in these experiments. We searched NCBI GEO for samples with whole transcriptome analysis before and after gene silencing and evaluated the efficiency for the target and off-target genes using the array-based expression data. The Wilcoxon signed-rank test was used to assess silencing efficacy, and Kruskal-Wallis tests and Spearman rank correlation were used to evaluate study parameters. Altogether 1,643 samples representing 429 experiments published in 207 studies were evaluated. The fold change (FC) of down-regulation of the target gene was above 0.7 in 18.5% and above 0.5 in 38.7% of experiments. Silencing efficiency was lowest in MCF7 and highest in SW480 cells (FC = 0.59 and FC = 0.30, respectively, P = 9.3E−06). Studies utilizing Western blot for validation performed better than those with quantitative polymerase chain reaction (qPCR) or microarray (FC = 0.43, FC = 0.47, and FC = 0.55, respectively, P = 2.8E−04). There was no correlation between type of control, transfection method, publication year, and silencing efficiency. Although gene silencing is a robust feature successfully cross-validated in the majority of experiments, efficiency remained insufficient in a significant proportion of studies. Selection of the cell line model and validation method had the highest influence on silencing proficiency.
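
    A small sketch of the study's core test, assuming per-experiment fold changes of the silenced gene: a Wilcoxon signed-rank test against no change, plus the FC > 0.7 failure-rate tally. The FC values are fabricated for illustration.

```python
# Wilcoxon signed-rank test on (toy) target-gene fold changes.
import numpy as np
from scipy.stats import wilcoxon

fc = np.array([0.31, 0.45, 0.52, 0.78, 0.40, 0.66, 0.29, 0.95, 0.58, 0.49])

stat, p = wilcoxon(fc - 1.0)   # H0: median FC == 1 (no silencing)
failures = np.mean(fc > 0.7)   # fraction above the 0.7 threshold
print(f"Wilcoxon p = {p:.4f}, failure rate (FC > 0.7) = {failures:.1%}")
```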

  7. Validating EHR clinical models using ontology patterns.

    Science.gov (United States)

    Martínez-Costa, Catalina; Schulz, Stefan

    2017-12-01

    Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation. Copyright © 2017 Elsevier Inc. All rights reserved.
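
    A hedged sketch of closed-world SHACL validation of an RDF instance, here using the pySHACL library rather than the Java-based implementation mentioned in the paper; the shape and data are toy examples, not CIMI models.

```python
# Validate a toy RDF instance against a SHACL shape with pySHACL.
from pyshacl import validate

shapes = """
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix ex: <http://example.org/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

ex:ObservationShape a sh:NodeShape ;
    sh:targetClass ex:Observation ;
    sh:property [ sh:path ex:hasCode ; sh:minCount 1 ] ;
    sh:property [ sh:path ex:value ; sh:datatype xsd:decimal ] .
"""

data = """
@prefix ex: <http://example.org/> .

ex:obs1 a ex:Observation ;
    ex:value "not-a-number" .
"""

conforms, _, report = validate(data, shacl_graph=shapes,
                               data_graph_format="turtle",
                               shacl_graph_format="turtle")
print(conforms)  # False: ex:hasCode missing and ex:value has wrong datatype
print(report)
```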

  8. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca

    2012-01-01

    Computational models in physiology often integrate functional and structural information from a large range of spatiotemporal scales from the ionic to the whole organ level. Their sophistication raises both expectations and skepticism concerning how computational methods can improve our...... understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present...... that contributes to defining the specific aspects of cardiac electrophysiology the MSE system targets, rather than being only an external test, and that this is driven by advances in experimental and computational methods and the combination of both....

  9. Modelization of ratcheting in biaxial experiments

    International Nuclear Information System (INIS)

    Guionnet, C.

    1989-08-01

    A new unified viscoplastic constitutive equation has been developed in order to interpret ratcheting experiments on mechanical structures of fast reactors. The model is based essentially on a generalized Armstrong-Frederick equation for the kinematic variable; the coefficient of the dynamic recovery term in this equation is a function of both the instantaneous and the accumulated inelastic strain, which is allowed to vary in an appropriate manner in order to reproduce the experimental ratcheting rate. The validity of the model is verified by comparing predictions with experimental results for austenitic stainless steel (17-12 SPH) tubular specimens subjected to cyclic torsional loading under constant tensile stress at 600 °C [fr

  10. Methane emissions from rice paddies. Experiments and modelling

    International Nuclear Information System (INIS)

    Van Bodegom, P.M.

    2000-01-01

    This thesis describes model development and experimentation on the comprehension and prediction of methane (CH4) emissions from rice paddies. The large spatial and temporal variability in CH4 emissions and the dynamic non-linear relationships between the processes underlying CH4 emissions impair the applicability of empirical relations. Mechanistic concepts are therefore the starting point of analysis throughout the thesis. The process of CH4 production was investigated by soil slurry incubation experiments at different temperatures and with additions of different electron donors and acceptors. Temperature influenced conversion rates and the competitiveness of microorganisms. The experiments were used to calibrate and validate a mechanistic model of CH4 production that describes competition for acetate and H2/CO2, inhibition effects and chemolithotrophic reactions. The redox sequence leading eventually to CH4 production was well predicted by the model, calibrating only the maximum conversion rates. Gas transport through paddy soil and rice plants was quantified by experiments in which the transport of SF6 was monitored continuously by photoacoustics. A mechanistic model of gas transport in a flooded rice system based on diffusion equations was validated by these experiments and could explain why most gases are released via plant-mediated transport. Variability in root distribution led to highly variable gas transport. Experiments showed that CH4 oxidation in the rice rhizosphere was oxygen (O2) limited. Rice rhizospheric O2 consumption was dominated by chemical iron oxidation, and heterotrophic and methanotrophic respiration. The most abundant methanotrophs and heterotrophs were isolated and kinetically characterised. Based upon these experiments it was hypothesised that CH4 oxidation mainly occurred at microaerophilic, low-acetate conditions not very close to the root surface. A mechanistic rhizosphere model that combined production and consumption of O2, carbon and iron

  11. Validation of the WATEQ4 geochemical model for uranium

    International Nuclear Information System (INIS)

    Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

    1983-09-01

    As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite [UO2(OH)2·H2O], UO2(OH)2, and rutherfordine (UO2CO3) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions
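
    As an illustration of the speciation-solubility logic such codes apply, the sketch below computes a mineral saturation index SI = log10(IAP/Ksp), which indicates undersaturation (SI < 0) or supersaturation (SI > 0). The activities and log Ksp are invented for the example, not WATEQ4 data.

```python
# Saturation index from an ion activity product and a solubility product.
import math

def saturation_index(ion_activity_product, log_ksp):
    return math.log10(ion_activity_product) - log_ksp

# Hypothetical rutherfordine equilibrium: UO2CO3 <-> UO2^2+ + CO3^2-
a_uo2 = 1e-8   # activity of UO2^2+ (assumed)
a_co3 = 1e-6   # activity of CO3^2- (assumed)
iap = a_uo2 * a_co3
print(f"SI = {saturation_index(iap, log_ksp=-14.5):+.2f}")  # > 0: supersaturated
```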

  12. Two-phase CFD PTS validation in an extended range of thermohydraulics conditions covered by the COSI experiment

    International Nuclear Information System (INIS)

    Coste, P.; Ortolan, A.

    2014-01-01

    Highlights: • Models for large interfaces in two-phase CFD were developed for PTS. • The COSI experiment is used for NEPTUNE_CFD integral validation. • COSI is a PWR cold leg scaled 1/100 for volume. • Fifty runs are calculated, covering a large range of flow configurations. • The CFD predicting capability is analysed using global and local measurements. - Abstract: In the context of Pressurized Water Reactor (PWR) lifetime safety studies, models were developed to address the Pressurized Thermal Shock (PTS) from the two-phase CFD angle, dealing with interfaces much larger than the cell size and with direct contact condensation. Such models were implemented in NEPTUNE_CFD, a 3D transient Eulerian two-fluid model. The COSI experiment is used for its integral validation. It represents a cold leg scaled 1/100 for volume and power from a 900 MW PWR under a large range of LOCA PTS conditions. In this study, the CFD is evaluated over the whole range of parameters and flow configurations covered by the experiment. In a first step, a single choice of mesh and CFD model parameters is fixed and justified. In a second step, fifty runs are calculated. The CFD predicting capability is analysed by comparing the liquid temperature and the total condensation rate with the experiment, discussing their dependency on the inlet cold liquid rate, on the liquid level in the cold leg and on the difference between co-current and counter-current runs. It is shown that NEPTUNE_CFD 1.0.8 calculates with fair agreement a large range of flow configurations related to ECCS injection and steam condensation

  13. Tracer travel time and model validation

    International Nuclear Information System (INIS)

    Tsang, Chin-Fu.

    1988-01-01

    The performance assessment of a nuclear waste repository demands much more than the safety evaluation of civil constructions such as dams, or the resource evaluation of a petroleum or geothermal reservoir. It involves the estimation of the low probability (low concentration) of radionuclide transport extrapolated thousands of years into the future. Thus the models used to make these estimates need to be carefully validated. A number of recent efforts have been devoted to the study of this problem. Some general comments on model validation were given by Tsang. The present paper discusses some issues of validation in regard to radionuclide transport. 5 refs

  14. Model validation: a systemic and systematic approach

    International Nuclear Information System (INIS)

    Sheng, G.; Elzas, M.S.; Cronhjort, B.T.

    1993-01-01

    The term 'validation' is used ubiquitously in association with the modelling activities of numerous disciplines, including the social, political, natural, and physical sciences, and engineering. There is, however, a wide range of definitions which give rise to very different interpretations of what activities the process involves. Analyses of results from the present large international effort in modelling radioactive waste disposal systems illustrate the urgent need to develop a common approach to model validation. Some possible explanations are offered to account for the present state of affairs. The methodology developed treats model validation and code verification in a systematic fashion. In fact, this approach may be regarded as a comprehensive framework to assess the adequacy of any simulation study. (author)

  15. Disruption Tolerant Networking Flight Validation Experiment on NASA's EPOXI Mission

    Science.gov (United States)

    Wyatt, Jay; Burleigh, Scott; Jones, Ross; Torgerson, Leigh; Wissler, Steve

    2009-01-01

    In October and November of 2008, the Jet Propulsion Laboratory installed and tested essential elements of Delay/Disruption Tolerant Networking (DTN) technology on the Deep Impact spacecraft. This experiment, called Deep Impact Network Experiment (DINET), was performed in close cooperation with the EPOXI project which has responsibility for the spacecraft. During DINET some 300 images were transmitted from the JPL nodes to the spacecraft. Then they were automatically forwarded from the spacecraft back to the JPL nodes, exercising DTN's bundle origination, transmission, acquisition, dynamic route computation, congestion control, prioritization, custody transfer, and automatic retransmission procedures, both on the spacecraft and on the ground, over a period of 27 days. All transmitted bundles were successfully received, without corruption. The DINET experiment demonstrated DTN readiness for operational use in space missions. This activity was part of a larger NASA space DTN development program to mature DTN to flight readiness for a wide variety of mission types by the end of 2011. This paper describes the DTN protocols, the flight demo implementation, validation metrics which were created for the experiment, and validation results.

  16. Trailing Edge Noise Model Validation and Application to Airfoil Optimization

    DEFF Research Database (Denmark)

    Bertagnolio, Franck; Aagaard Madsen, Helge; Bak, Christian

    2010-01-01

    The aim of this article is twofold. First, an existing trailing edge noise model is validated by comparing with airfoil surface pressure fluctuations and far field sound pressure levels measured in three different experiments. The agreement is satisfactory in one case but poor in two other cases...... across the boundary layer near the trailing edge and to a lesser extent by a smaller boundary layer displacement thickness. ©2010 American Society of Mechanical Engineers...

  17. CFD Validation Experiment of a Mach 2.5 Axisymmetric Shock-Wave/Boundary-Layer Interaction

    Science.gov (United States)

    Davis, David O.

    2015-01-01

    Experimental investigations of specific flow phenomena, e.g., Shock Wave Boundary-Layer Interactions (SWBLI), provide great insight into the flow behavior but often lack the necessary details to be useful as CFD validation experiments. Reasons include: 1. undefined boundary conditions; 2. inconsistent results; 3. undocumented 3D effects (CL-only measurements); 4. lack of uncertainty analysis. While there are a number of good subsonic experimental investigations that are sufficiently documented to be considered test cases for CFD and turbulence model validation, the number of supersonic and hypersonic cases is much smaller. This was highlighted by Settles and Dodson's [1] comprehensive review of available supersonic and hypersonic experimental studies. In all, several hundred studies were considered for their database. Of these, over a hundred were subjected to rigorous acceptance criteria. Based on their criteria, only 19 (12 supersonic, 7 hypersonic) were considered of sufficient quality to be used for validation purposes. Aeschliman and Oberkampf [2] recognized the need to develop a specific methodology for experimental studies intended specifically for validation purposes.

  18. The EGS Collab Project: Stimulation Investigations for Geothermal Modeling Analysis and Validation

    Science.gov (United States)

    Blankenship, D.; Kneafsey, T. J.

    2017-12-01

    The US DOE's EGS Collab project team is establishing a suite of intermediate-scale (10-20 m) field test beds for coupled stimulation and interwell flow tests. The multiple national laboratory and university team is designing the tests to compare measured data to models to improve measurement and modeling toolsets available for use in field sites and investigations such as DOE's Frontier Observatory for Research in Geothermal Energy (FORGE) Project. Our tests will be well-controlled, in situ experiments focused on rock fracture behavior, seismicity, and permeability enhancement. Pre- and post-test modeling will allow for model prediction and validation. High-quality, high-resolution geophysical and other fracture characterization data will be collected, analyzed, and compared with models and field observations to further elucidate the basic relationships between stress, induced seismicity, and permeability enhancement. Coring through the stimulated zone after tests will provide fracture characteristics that can be compared to monitoring data and model predictions. We will also observe and quantify other key governing parameters that impact permeability, and attempt to understand how these parameters might change throughout the development and operation of an Enhanced Geothermal System (EGS) project with the goal of enabling commercial viability of EGS. The Collab team will perform three major experiments over the three-year project duration. Experiment 1, intended to investigate hydraulic fracturing, will be performed in the Sanford Underground Research Facility (SURF) at 4,850 feet depth and will build on kISMET Project findings. Experiment 2 will be designed to investigate hydroshearing. Experiment 3 will investigate changes in fracturing strategies and will be further specified as the project proceeds. The tests will provide quantitative insights into the nature of stimulation (e.g., hydraulic fracturing, hydroshearing, mixed-mode fracturing, thermal fracturing

  19. Lagrangian Stochastic Dispersion Model IMS Model Suite and its Validation against Experimental Data

    International Nuclear Information System (INIS)

    Bartok, J.

    2010-01-01

    The dissertation presents the IMS Lagrangian Dispersion Model, a 'new generation' Slovak dispersion model of long-range transport developed by MicroStep-MIS. It solves the trajectory equation for a vast number of Lagrangian 'particles' and a stochastic equation that simulates the effects of turbulence. The model contains simulation of radioactive decay (full decay chains of more than 300 nuclides), and dry and wet deposition. The model was integrated into IMS Model Suite, a system in which several models and modules can run and cooperate, e.g. the LAM model WRF preparing fine-resolution meteorological data for dispersion. The main theme of the work is validation of the dispersion model against the large-scale international campaigns CAPTEX and ETEX, which are two of the largest tracer experiments. Validation addressed the treatment of missing data and data interpolation into a comparable temporal and spatial representation. The best model results were observed for ETEX I, standard results for the CAPTEX campaigns and worst results for ETEX II, known in the modelling community for its meteorological conditions that can hardly be resolved by models. The IMS Lagrangian Dispersion Model was identified as a capable long-range dispersion model for slowly reacting or non-reacting chemicals and radioactive matter. The influence of input data on simulation quality is discussed within the work. Additional modules were prepared according to practical requirements: a) recalculation of concentrations of radioactive pollutants into effective doses from inhalation, immersion in the plume and deposition; b) dispersion of mineral dust was added and tested in a desert locality, where wind and soil moisture were first analysed and forecast by WRF. The result was qualitatively verified in a case study against satellite observations. (author)
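
    A minimal sketch of the Lagrangian stochastic particle transport such a model solves: deterministic advection plus a random turbulent displacement at each time step, with ground reflection and first-order radioactive decay. Wind, turbulence and decay parameters are placeholders, not IMS Model Suite values.

```python
# Random-displacement ('particle') dispersion sketch with decay.
import numpy as np

rng = np.random.default_rng(5)
n, dt = 10_000, 60.0                 # particles, time step [s]
pos = np.zeros((n, 3))               # x, y, z [m]
mass = np.ones(n)                    # relative activity per particle
u = np.array([5.0, 0.5, 0.0])        # mean wind [m/s], assumed
sigma = np.array([1.0, 1.0, 0.5])    # turbulent velocity scales [m/s], assumed
half_life = 8.02 * 86400.0           # e.g. I-131 [s]

for _ in range(120):                 # two hours of transport
    pos += u * dt + sigma * np.sqrt(dt) * rng.standard_normal((n, 3))
    pos[:, 2] = np.abs(pos[:, 2])    # reflect particles at the ground
    mass *= 0.5 ** (dt / half_life)  # first-order radioactive decay

print("plume centre [km]:", pos.mean(axis=0) / 1e3)
```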

  20. Assessment model validity document - HYDRASTAR. A stochastic continuum program for groundwater flow

    Energy Technology Data Exchange (ETDEWEB)

    Gylling, B. [Kemakta Konsult AB, Stockholm (Sweden); Eriksson, Lars [Equa Simulation AB, Sundbyberg (Sweden)

    2001-12-01

    The present document addresses validation of the stochastic continuum model HYDRASTAR, designed for Monte Carlo simulations of groundwater flow in fractured rocks. Here, validation is defined as a process to demonstrate that a model concept is fit for its purpose. Preferably, the validation is carried out by comparison of model predictions with independent field observations and experimental measurements. In addition, other sources can also be used to confirm that the model concept gives acceptable results. One method is to compare results with those achieved using other model concepts for the same set of input data. Another method is to compare model results with analytical solutions. The model concept HYDRASTAR has been used in several studies, including performance assessments of hypothetical repositories for spent nuclear fuel. In the performance assessments, the main tasks for HYDRASTAR have been to calculate groundwater travel time distributions, repository flux distributions, path lines and their exit locations. The results have then been used by other model concepts to calculate the near-field release and far-field transport. The aim and framework of the validation process include describing the applicability of the model concept for its purpose in order to build confidence in the concept. Preferably, this is done by comparison of simulation results with corresponding field experiments or field measurements. Here, two comparisons with experimental results are reported. In both cases the agreement was reasonably fair. In the broader and more general context of the validation process, HYDRASTAR results have been compared with other models and analytical solutions. Commonly, the approximation calculations agree well with the medians of model ensemble results. Additional indications that HYDRASTAR is suitable for its purpose were obtained from the comparisons with results from other model concepts. Several verification studies have been made for

  1. Validation of models in an imaging infrared simulation

    CSIR Research Space (South Africa)

    Willers, C

    2007-10-01

    ...three processes for transforming the information between the entities: the Problem Entity (reality), the Conceptual Model and the Computerized Model, linked by Analysis and Modelling, Computer Implementation, and Simulation and Experimentation, with Model Qualification, Model Verification and Model Validation as the transforming processes. [The remainder of the record is garbled figure and reference text; the recoverable citations are: J.C. Refsgaard, Modelling Guidelines - terminology and guiding principles, Advances in Water Resources, Vol. 27, No. 1, January 2004, pp. 71-82, Elsevier; N. Oreskes et al., Verification, Validation, and Confirmation of Numerical Models in the Earth Sciences, Science, Vol. 263, ...]

  2. Validation of ASTEC V2 models for the behaviour of corium in the vessel lower head

    International Nuclear Information System (INIS)

    Carénini, L.; Fleurot, J.; Fichot, F.

    2014-01-01

    The paper is devoted to the presentation of validation cases carried out for the models describing the corium behaviour in the “lower plenum” of the reactor vessel implemented in the V2.0 version of the ASTEC integral code, jointly developed by IRSN (France) and GRS (Germany). In the ASTEC architecture, these models are grouped within the single ICARE module and they are all activated in typical accident scenarios. Therefore, it is important to check the validity of each individual model, as long as experiments are available for which a single physical process is involved. The results of ASTEC applications against the following experiments are presented: FARO (corium jet fragmentation), LIVE (heat transfer between a molten pool and the vessel), MASCA (separation and stratification of corium non miscible phases) and OLHF (mechanical failure of the vessel). Compared to the previous ASTEC V1.3 version, the validation matrix is extended. This work allows determining recommended values for some model parameters (e.g. debris particle size in the fragmentation model and criterion for debris bed liquefaction). Almost all the processes governing the corium behaviour, its thermal interaction with the vessel wall and the vessel failure are modelled in ASTEC and these models have been assessed individually with satisfactory results. The main uncertainties appear to be related to the calculation of transient evolutions

  3. Computational Design and Discovery of Ni-Based Alloys and Coatings: Thermodynamic Approaches Validated by Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zi-Kui [Pennsylvania State University; Gleeson, Brian [University of Pittsburgh; Shang, Shunli [Pennsylvania State University; Gheno, Thomas [University of Pittsburgh; Lindwall, Greta [Pennsylvania State University; Zhou, Bi-Cheng [Pennsylvania State University; Liu, Xuan [Pennsylvania State University; Ross, Austin [Pennsylvania State University

    2018-04-23

    This project developed computational tools that can complement and support experimental efforts in order to enable discovery and more efficient development of Ni-base structural materials and coatings. The project goal was reached through an integrated computation-predictive and experimental-validation approach, including first-principles calculations, thermodynamic CALPHAD (CALculation of PHAse Diagrams), and experimental investigations on compositions relevant to Ni-base superalloys and coatings in terms of oxide layer growth and microstructure stabilities. The developed description includes composition ranges typical for coating alloys and, hence, allows for prediction of thermodynamic properties for these material systems. The calculation of phase compositions, phase fractions, and phase stabilities, which are directly related to properties such as ductility and strength, was a valuable contribution, along with the collection of computational tools that are required to meet the increasing demands for strong, ductile and environmentally-protective coatings. Specifically, a suitable thermodynamic description for the Ni-Al-Cr-Co-Si-Hf-Y system was developed for bulk alloy and coating compositions. Experiments were performed to validate and refine the thermodynamics from the CALPHAD modeling approach. Additionally, alloys produced using predictions from the current computational models were studied in terms of their oxidation performance. Finally, results obtained from experiments aided in the development of a thermodynamic modeling automation tool called ESPEI/pycalphad for more rapid discovery and development of new materials.

  4. Selection, calibration, and validation of models of tumor growth.

    Science.gov (United States)

    Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C

    2016-11-01

    This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth: first, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous, macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes; and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors is presented, which leads to many potentially predictive models. Representative classes are then identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine if the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory

  5. Experimental validation of TASS/SMR-S critical flow model for the integral reactor SMART

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Si Won; Ra, In Sik; Kim, Kun Yeup [ACT Co., Daejeon (Korea, Republic of); Chung, Young Jong [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-05-15

    An advanced integral PWR, SMART (System-Integrated Modular Advanced ReacTor), is being developed at KAERI. It has a compact size and a relatively small power rating (330 MWt) compared to a conventional reactor. Because new concepts are applied to SMART, experimental and analytical validation is necessary for its safety evaluation. The analytical safety validation is being accomplished with TASS/SMR-S, a safety analysis code for an integral reactor developed by KAERI. TASS/SMR-S uses lumped-parameter, one-dimensional node-and-path modeling for the thermal-hydraulic calculation and point kinetics for the reactor power calculation. It has models for general usage, such as a core heat transfer model, a wall heat structure model, a critical flow model and component models, and it also has many SMART-specific models, such as a once-through helically coiled steam generator model and a condensate heat transfer model. To ensure that the TASS/SMR-S code has the calculation capability for the safety evaluation of SMART, the code should be validated for the specific models against separate effect test results. In this study, the TASS/SMR-S critical flow model is evaluated by comparison with the SMD (Super Moby Dick) experiment
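
    The abstract does not give the TASS/SMR-S critical flow correlation, so as a stand-in the sketch below evaluates the classical single-phase choked (critical) mass flux for an ideal gas, the limit any critical flow model should recover for pure steam; two-phase models of the kind validated against Super Moby Dick data are considerably more involved.

```python
# Single-phase (ideal gas) choked mass flux, an illustrative stand-in
# for a two-phase critical flow model; all inputs are assumptions.
import math

def choked_mass_flux(p0, t0, gamma=1.3, r_gas=461.5):
    """G_c [kg/m^2/s] for stagnation pressure p0 [Pa] and temperature
    t0 [K]; gamma and r_gas roughly describe steam."""
    term = (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))
    return p0 * math.sqrt(gamma / (r_gas * t0)) * term

print(f"G_c = {choked_mass_flux(7.0e6, 559.0):.0f} kg/m^2/s")
```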

  6. Validation experiments of nuclear characteristics of the fast-thermal system HERBE

    International Nuclear Information System (INIS)

    Pesic, M.; Zavaljevski, N.; Marinkovic, P.; Stefanovis, D.; Nikolic, D.; Avdic, S.

    1992-01-01

    In 1988/90 a coupled fast-thermal system, HERBE, based on similar facilities, was designed and realized at the RB reactor. The fast core of HERBE is built of natural U fuel in the RB reactor center, surrounded by the neutron filter and neutron converter located in an independent Al tank. The fast zone is surrounded by a thermal neutron core driver. The designed nuclear characteristics of the HERBE core are validated in the experiments described in the paper. HERBE cell parameters were calculated with the developed computer codes VESNA and DENEB. HERBE system criticality calculations were performed with the 4-group 2D RZ computer codes GALER and TWENTY GRAND, the 1D multi-group AVERY code and the 3D XYZ few-group TRITON computer code. Experiments for the determination of the critical level, dρ/dH, and the reactivity of the safety rods were accomplished in order to validate the calculation results. A specific safety experiment was performed with the aim of determining the reactivity of the flooded fast zone in a possible accident. Very good agreement with the calculation results was obtained, and the validation procedures are presented. It is expected that HERBE will offer qualitatively new opportunities for work with fast neutrons at the RB reactor, including nuclear data determination. (author)

  7. Turbulence Models: Data from Other Experiments: FAITH Hill 3-D Separated Flow

    Data.gov (United States)

    National Aeronautics and Space Administration — Exp: FAITH Hill 3-D Separated Flow. This web page provides data from experiments that may be useful for the validation of turbulence models. This resource is...

  8. Validation of ASTECV2.1 based on the QUENCH-08 experiment

    Energy Technology Data Exchange (ETDEWEB)

    Gómez-García-Toraño, Ignacio, E-mail: ignacio.torano@kit.edu [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology (INR), Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Sánchez-Espinoza, Víctor-Hugo; Stieglitz, Robert [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology (INR), Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Stuckert, Juri [Karlsruhe Institute of Technology, Institute for Applied Materials-Applied Materials Physics (IAM-AWP), Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Laborde, Laurent; Belon, Sébastien [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), Nuclear Safety Division/Safety Research/Severe Accident Department, Saint Paul Lez Durance 13115 (France)

    2017-04-01

    Highlights: • ASTECV2.1 can reproduce QUENCH-08 experimental trends, e.g. hydrogen generation. • Radial temperature gradient and heat transfer through argon gap are underestimated. • Mesh sizes lower than 55 mm needed to capture the strong axial temperature gradient. • Minor variations of external electrical resistance strongly affect bundle heat-up. • Modelling of a bypass and inclusion of currents partially overcome discrepancies. - Abstract: The Fukushima accidents have shown that further improvements of Severe Accident Management Guidelines (SAMGs) are still necessary. Hence, the enhancement of severe accident codes and their validation based on integral experiments are pursued worldwide. In particular, the capabilities of the European integral severe accident code ASTECV2.1 are being extended within the CESAM project through the improvement of physical models, code numerics and an extensive code validation. Among the different strategies encompassed in the plant SAMGs, one of the most important ones to prevent core damage is the injection of water into the overheated core (reflooding). However, under certain conditions, reflooding may trigger a sharp hydrogen generation that may jeopardize the containment. Within this work, ASTECV2.1 models describing the early in-vessel phase of the severe accident and its termination by core reflooding are validated against data from the QUENCH test facility. The QUENCH-08 test, involving the injection of 15 g/s (about 0.6 g/s/rod) of saturated steam at a bundle temperature of 2073 K, has been selected for this comparison. Results show that ASTECV2.1 is able to reproduce the experimental temperatures and oxide thicknesses at representative bundle locations. The predicted total hydrogen generation (76 g) is similar to the experimental one (84 g). In addition, the choices of an axial mesh size lower than 55 mm and of an external electrical resistance of 7 mΩ/rod have been justified with parametric analyses. Finally, new

  9. Validation process of simulation model; Proceso de validacion de modelos de simulacion

    Energy Technology Data Exchange (ETDEWEB)

    San Isidro Pindado, M J

    1998-12-31

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always related to an experimental case. The empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. This methodology will guide us in detecting the failures of the simulation model. Furthermore, it can be used as a guide in the design of a posteriori experiments. Three steps can be well differentiated: - Sensitivity analysis. It can be made with a DSA, differential sensitivity analysis, and with a MCSA, Monte-Carlo sensitivity analysis. - Finding the optimal domains of the input parameters. A procedure based on Monte-Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. - Residual analysis. This analysis has been made in the time domain and in the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings, ESP, is presented, studying the behavior of building components in a Test Cell of the LECE of CIEMAT. (Author)

  10. Validation process of simulation model; Proceso de validacion de modelos de simulacion

    Energy Technology Data Exchange (ETDEWEB)

    San Isidro Pindado, M.J.

    1997-12-31

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always related to an experimental case. The empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. This methodology will guide us in detecting the failures of the simulation model. Furthermore, it can be used as a guide in the design of a posteriori experiments. Three steps can be well differentiated: - Sensitivity analysis. It can be made with a DSA, differential sensitivity analysis, and with a MCSA, Monte-Carlo sensitivity analysis. - Finding the optimal domains of the input parameters. A procedure based on Monte-Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. - Residual analysis. This analysis has been made in the time domain and in the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings, ESP, is presented, studying the behavior of building components in a Test Cell of the LECE of CIEMAT. (Author)

  11. Calibration and validation of a model describing complete autotrophic nitrogen removal in a granular SBR system

    DEFF Research Database (Denmark)

    Vangsgaard, Anna Katrine; Mutlu, Ayten Gizem; Gernaey, Krist

    2013-01-01

    BACKGROUND: A validated model describing the nitritation-anammox process in a granular sequencing batch reactor (SBR) system is an important tool for: a) design of future experiments and b) prediction of process performance during optimization, while applying process control, or during system scale......-up. RESULTS: A model was calibrated using a step-wise procedure customized for the specific needs of the system. The important steps in the procedure were initialization, steady-state and dynamic calibration, and validation. A fast and effective initialization approach was developed to approximate pseudo...... screening of the parameter space proposed by Sin et al. (2008) - to find the best fit of the model to dynamic data. Finally, the calibrated model was validated with an independent data set. CONCLUSION: The presented calibration procedure is the first customized procedure for this type of system...
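
    As a sketch of the Monte Carlo screening step cited from Sin et al. (2008), the code below samples candidate parameter sets, scores each against dynamic calibration data by RMSE, and keeps the best fit. The stand-in model and parameter ranges are invented, not the nitritation-anammox model.

```python
# Monte Carlo screening of a parameter space against dynamic data.
import numpy as np

rng = np.random.default_rng(6)

def model(theta, t):
    # placeholder for the nitritation-anammox simulator
    return theta[0] * (1 - np.exp(-theta[1] * t))

t_obs = np.linspace(0, 10, 30)
y_obs = model(np.array([0.8, 0.5]), t_obs) + rng.normal(0, 0.02, t_obs.size)

candidates = rng.uniform([0.1, 0.05], [2.0, 2.0], size=(1000, 2))
rmse = [np.sqrt(np.mean((model(c, t_obs) - y_obs) ** 2)) for c in candidates]
best = candidates[int(np.argmin(rmse))]
print("best-fit parameters:", best)
```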

  12. Computational Fluid Dynamics Modeling of the Human Pulmonary Arteries with Experimental Validation.

    Science.gov (United States)

    Bordones, Alifer D; Leroux, Matthew; Kheyfets, Vitaly O; Wu, Yu-An; Chen, Chia-Yuan; Finol, Ender A

    2018-05-21

    Pulmonary hypertension (PH) is a chronic progressive disease characterized by elevated pulmonary arterial pressure, caused by an increase in pulmonary arterial impedance. Computational fluid dynamics (CFD) can be used to identify metrics representative of the stage of PH disease. However, experimental validation of CFD models is often not pursued due to the geometric complexity of the model or uncertainties in the reproduction of the required flow conditions. The goal of this work is to validate experimentally a CFD model of a pulmonary artery phantom using a particle image velocimetry (PIV) technique. Rapid prototyping was used for the construction of the patient-specific pulmonary geometry, derived from chest computed tomography angiography images. CFD simulations were performed with the pulmonary model with a Reynolds number matching those of the experiments. Flow rates, the velocity field, and shear stress distributions obtained with the CFD simulations were compared to their counterparts from the PIV flow visualization experiments. Computationally predicted flow rates were within 1% of the experimental measurements for three of the four branches of the CFD model. The mean velocities in four transversal planes of study were within 5.9 to 13.1% of the experimental mean velocities. Shear stresses were qualitatively similar between the two methods with some discrepancies in the regions of high velocity gradients. The fluid flow differences between the CFD model and the PIV phantom are attributed to experimental inaccuracies and the relative compliance of the phantom. This comparative analysis yielded valuable information on the accuracy of CFD predicted hemodynamics in pulmonary circulation models.

  13. Rate-based modelling and validation of a pilot absorber using MDEA enhanced with carbonic anhydrase (CA)

    DEFF Research Database (Denmark)

    Gaspar, Jozsef; Gladis, Arne; Woodley, John

    2017-01-01

    solvent-regeneration energy demand. The focus of this work is to develop a rate-based model for CO2 absorption using MDEA enhanced with CA and to validate it against pilot-scale absorption experiments. In this work, we compare model predictions to measured temperature and CO2 concentration profiles

  14. Modeling and experiments of biomass combustion in a large-scale grate boiler

    DEFF Research Database (Denmark)

    Yin, Chungen; Rosendahl, Lasse; Kær, Søren Knudsen

    2007-01-01

    is inherently more difficult due to the complexity of the solid biomass fuel bed on the grate, the turbulent reacting flow in the combustion chamber and the intensive interaction between them. This paper presents the CFD validation efforts for a modern large-scale biomass-fired grate boiler. Modeling...... and experiments are both done for the grate boiler. The comparison between them shows an overall acceptable agreement in tendency. However at some measuring ports, big discrepancies between the modeling and the experiments are observed, mainly because the modeling-based boundary conditions (BCs) could differ...

  15. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  16. Stratification issues in the primary system. Review of available validation experiments and State-of-the-Art in modelling capabilities (StratRev)

    International Nuclear Information System (INIS)

    Westin, J.; Henriksson, M.; Paettikangas, T.; Toppila, T.; Raemae, T.; Kudinov, P.; Anglart, H.

    2009-08-01

    The objective of the present report is to review available validation experiments and the state of the art in modelling of stratification and mixing in the primary system of Light Water Reactors. A topical workshop was arranged in Älvkarleby in June 2008 within the framework of BWR-OG, and the presentations from various utilities showed that stratification issues are not unusual and can cause costly production stops. It is desirable to take actions in order to reduce the probability of stratification occurring, and to develop well-validated and accepted tools and procedures for analyzing upcoming stratification events. A research plan covering the main questions is outlined, and a few suggestions regarding more limited research activities are given. Since many of the stratification events result in thermal loads that are localized in time and space, CFD is a suitable tool. However, the often very large and complex geometry poses a great challenge to CFD, and it is important to perform a step-by-step increase in complexity with intermediate validation against relevant experimental data. The ultimate goal is to establish Best Practice Guidelines that can be followed both by utilities and authorities in case of an event including stratification and thermal loads. An extension of the existing Best Practice Guidelines for CFD in nuclear safety applications developed by OECD/NEA is thus suggested as a relevant target for a continuation project. (au)

  17. Stratification issues in the primary system. Review of available validation experiments and State-of-the-Art in modelling capabilities (StratRev)

    Energy Technology Data Exchange (ETDEWEB)

    Westin, J.; Henriksson, M. (Vattenfall Research and Development AB (Sweden)); Paettikangas, T. (VTT (Finland)); Toppila, T.; Raemae, T. (Fortum Nuclear Services Ltd (Finland)); Kudinov, P. (KTH Nuclear Power Safety (Sweden)); Anglart, H. (KTH Nuclear Reactor Technology (Sweden))

    2009-08-15

    The objective of the present report is to review available validation experiments and the state of the art in modelling of stratification and mixing in the primary system of Light Water Reactors. A topical workshop was arranged in Älvkarleby in June 2008 within the framework of BWR-OG, and the presentations from various utilities showed that stratification issues are not unusual and can cause costly production stops. It is desirable to take actions in order to reduce the probability of stratification occurring, and to develop well-validated and accepted tools and procedures for analyzing upcoming stratification events. A research plan covering the main questions is outlined, and a few suggestions regarding more limited research activities are given. Since many of the stratification events result in thermal loads that are localized in time and space, CFD is a suitable tool. However, the often very large and complex geometry poses a great challenge to CFD, and it is important to perform a step-by-step increase in complexity with intermediate validation against relevant experimental data. The ultimate goal is to establish Best Practice Guidelines that can be followed both by utilities and authorities in case of an event including stratification and thermal loads. An extension of the existing Best Practice Guidelines for CFD in nuclear safety applications developed by OECD/NEA is thus suggested as a relevant target for a continuation project. (au)

  18. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; this would estimate the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern, showing how it can be used: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank candidate approximations by quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer-scale models. Copyright © 2010 Elsevier Inc. All rights reserved.

  19. The structural validity of the Experience of Work and Life Circumstances Questionnaire (WLQ)

    Directory of Open Access Journals (Sweden)

    Pieter Schaap

    2016-09-01

    Orientation: Best practice frameworks suggest that an assessment practitioner’s choice of an assessment tool should be based on scientific evidence that underpins the appropriate and just use of the instrument. This is a context-specific validity study involving a classified psychological instrument against the background of South African regulatory frameworks and contemporary validity theory principles. Research purpose: The aim of the study was to explore the structural validity of the Experience of Work and Life Circumstances Questionnaire (WLQ) administered to employees in the automotive assembly plant of a South African automotive manufacturing company. Motivation for the study: Although the WLQ has been used by registered health practitioners and numerous researchers, evidence to support its structural validity is lacking. This study, therefore, addressed the need for context-specific empirical support for the validity of score inferences in respect of employees in a South African automotive manufacturing plant. Research design, approach and method: The research was conducted using a convenience sample (N = 217) taken from the automotive manufacturing company where the instrument was used. Reliability and factor analyses were carried out to explore the structural validity of the WLQ. Main findings: The reliability of the WLQ appeared to be acceptable, and the assumptions made about unidimensionality were mostly confirmed. One of the proposed higher-order structural models of the said questionnaire administered to the sample group was confirmed, whereas the other one was partially confirmed. Practical/managerial implications: The conclusion reached was that preliminary empirical grounds existed for considering the continued use of the WLQ (with some suggested refinements) by the relevant company, provided the process of accumulating a body of validity evidence continued. Contribution/value-add: This study identified some of the difficulties

  20. The Role of Laboratory Experiments in the Validation of Field Data

    DEFF Research Database (Denmark)

    Mouneyrac, Catherine; Lagarde, Fabienne; Chatel, Amelie

    2017-01-01

    The ubiquitous presence and persistency of microplastics (MPs) in aquatic environments are of particular concern, since they constitute a potential threat to marine organisms and ecosystems. However, evaluating this threat and the impacts of MP on aquatic organisms is challenging. MPs form a very...... and to what degree these complexities are addressed in the current literature, to: (1) evaluate how well laboratory studies, investigated so far, represent environmentally relevant processes and scenarios and (2) suggest directions for future research....

  1. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal models. A survey of current practices and techniques was undertaken and evaluated using these criteria, with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made.

  2. Construction and Initial Validation of the Multiracial Experiences Measure (MEM)

    Science.gov (United States)

    Yoo, Hyung Chol; Jackson, Kelly; Guevarra, Rudy P.; Miller, Matthew J.; Harrington, Blair

    2015-01-01

    This article describes the development and validation of the Multiracial Experiences Measure (MEM): a new measure that assesses uniquely racialized risks and resiliencies experienced by individuals of mixed racial heritage. Across two studies, there was evidence for the validation of the 25-item MEM with 5 subscales including Shifting Expressions, Perceived Racial Ambiguity, Creating Third Space, Multicultural Engagement, and Multiracial Discrimination. The 5-subscale structure of the MEM was supported by a combination of exploratory and confirmatory factor analyses. Evidence of criterion-related validity was partially supported with MEM subscales correlating with measures of racial diversity in one’s social network, color-blind racial attitude, psychological distress, and identity conflict. Evidence of discriminant validity was supported with MEM subscales not correlating with impression management. Implications for future research and suggestions for utilization of the MEM in clinical practice with multiracial adults are discussed. PMID:26460977

  3. Off-take Model of the SPACE Code and Its Validation

    International Nuclear Information System (INIS)

    Oh, Myung Taek; Park, Chan Eok; Sohn, Jong Joo

    2011-01-01

    Liquid entrainment and vapor pull-through models of a horizontal pipe have been implemented in the SPACE code. The model of SPACE accounts for the phase separation phenomena and computes the flux of mass and energy through an off-take attached to a horizontal pipe when stratified conditions occur in the horizontal pipe. This model is referred to as the off-take model. The importance of predicting the fluid conditions through an off-take in a small-break LOCA is well known. In this case, the occurrence of stratification can affect the break node void fraction and thus the break flow discharged from the primary system. In order to validate the off-take model newly developed for the SPACE code, a simulation of the HDU experiments has been performed. The main features of the off-take model and its application results are presented in this paper

  4. Validation of MORET 4 perturbation against 'physical' type fission products experiments

    International Nuclear Information System (INIS)

    Anno, Jacques; Jacquet, Olivier; Miss, Joachim

    2003-01-01

    After briefly recalling one of the many pertinent recent features of the French criticality CRISTAL package, i.e. the perturbation algorithm (so-called MORET 4 'Perturbation' or MP), this paper presents original MP validations. Numerical and experimental validations are performed using closely related fission product (FP) experiments. As a result, it is shown that, all else being equal, MP can detect FP absorption cross-section variations in the range 0.3-1.2%. (author)

  5. Study of atmospheric stratification influence on pollutants dispersion using a numerical fluid mechanics model. Code-Saturne validation with the Prairie Grass experiment

    International Nuclear Information System (INIS)

    Coulon, Fanny

    2010-09-01

    A validation of Code-Saturne, a computational fluid dynamics code developed by EDF, is proposed for stable atmospheric conditions. The goal is to guarantee the performance of the model in order to use it for impact studies. A comparison is made with data from the Prairie Grass field experiment and with two Gaussian plume models.

  6. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen...... matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade...

  7. Modelling of the photooxidation of toluene: conceptual ideas for validating detailed mechanisms

    Directory of Open Access Journals (Sweden)

    V. Wagner

    2003-01-01

    Toluene photooxidation is chosen as an example to examine how simulations of smog-chamber experiments can be used to unravel shortcomings in detailed mechanisms and to provide information on complex reaction systems that will be crucial for the design of future validation experiments. The mechanism used in this study is extracted from the Master Chemical Mechanism Version 3 (MCM v3) and has been updated with new modules for cresol and γ-dicarbonyl chemistry. Model simulations are carried out for a toluene-NOx experiment undertaken at the European Photoreactor (EUPHORE). The comparison of the simulation with the experimental data reveals two fundamental shortcomings in the mechanism: OH production is too low by about 80%, and the ozone concentration at the end of the experiment is over-predicted by 55%. The radical budget was analysed to identify the key intermediates governing the radical transformation in the toluene system. Ring-opening products, particularly conjugated γ-dicarbonyls, were identified as dominant radical sources in the early stages of the experiment. The analysis of the time evolution of radical production points to a missing OH source that peaks when the system reaches highest reactivity. First-generation products are also of major importance for the ozone production in the system. The analysis of the radical budget suggests two options to explain the concurrent under-prediction of OH and over-prediction of ozone in the model: (1) missing oxidation processes that produce or regenerate OH without or with little NO to NO2 conversion, or (2) NO3 chemistry that sequesters reactive nitrogen oxides into stable nitrogen compounds and at the same time produces peroxy radicals. Sensitivity analysis was employed to identify significant contributors to ozone production and it is shown how this technique, in combination with ozone isopleth plots, can be used for the design of validation experiments.

  8. Improving the quality of discrete-choice experiments in health: how can we assess validity and reliability?

    Science.gov (United States)

    Janssen, Ellen M; Marshall, Deborah A; Hauber, A Brett; Bridges, John F P

    2017-12-01

    The recent endorsement of discrete-choice experiments (DCEs) and other stated-preference methods by regulatory and health technology assessment (HTA) agencies has placed a greater focus on demonstrating the validity and reliability of preference results. Areas covered: We present a practical overview of tests of validity and reliability that have been applied in the health DCE literature and explore other study qualities of DCEs. From the published literature, we identify a variety of methods to assess the validity and reliability of DCEs. We conceptualize these methods to create a conceptual model with four domains: measurement validity, measurement reliability, choice validity, and choice reliability. Each domain consists of three categories that can be assessed using one to four procedures (for a total of 24 tests). We present how these tests have been applied in the literature and direct readers to applications of these tests in the health DCE literature. Based on a stakeholder engagement exercise, we consider the importance of study characteristics beyond traditional concepts of validity and reliability. Expert commentary: We discuss study design considerations to assess the validity and reliability of a DCE, consider limitations to the current application of tests, and discuss future work to consider the quality of DCEs in healthcare.

  9. Bayesian risk-based decision method for model validation under uncertainty

    International Nuclear Information System (INIS)

    Jiang Xiaomo; Mahadevan, Sankaran

    2007-01-01

    This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs, and the likelihood and prior of each hypothesis. The risk is minimized through correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment
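
    A minimal sketch of the likelihood-ratio decision rule described above, assuming Gaussian likelihoods under the "model valid" and "model biased" hypotheses; the data, costs, and priors are illustrative, not taken from the paper:

        import numpy as np
        from scipy.stats import norm

        # Hypothetical validation measurements and the two hypotheses:
        # H0: model prediction mu0 is valid; H1: response is biased to mu1
        y = np.array([1.02, 0.97, 1.05, 0.99, 1.08])
        mu0, mu1, sigma = 1.0, 1.1, 0.05

        # Likelihood ratio LR = p(y|H0) / p(y|H1)
        lr = np.prod(norm.pdf(y, mu0, sigma)) / np.prod(norm.pdf(y, mu1, sigma))

        # Risk-minimizing threshold: accept H0 when
        # LR > (c01 - c11) * p1 / ((c10 - c00) * p0), with c_ij the cost of
        # deciding H_i when H_j is true and p0, p1 the prior probabilities.
        c00, c01, c10, c11 = 0.0, 5.0, 1.0, 0.0  # accepting a bad model is costly
        p0, p1 = 0.5, 0.5
        threshold = (c01 - c11) * p1 / ((c10 - c00) * p0)

        print(f"LR = {lr:.3g}, threshold = {threshold:.3g}")
        print("accept model" if lr > threshold else "reject model")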

  10. Developing and validating a model to predict the success of an IHCS implementation: the Readiness for Implementation Model

    Science.gov (United States)

    Gustafson, David H; Hawkins, Robert P; Brennan, Patricia F; Dinauer, Susan; Johnson, Pauley R; Siegler, Tracy

    2010-01-01

    Objective To develop and validate the Readiness for Implementation Model (RIM). This model predicts a healthcare organization's potential for success in implementing an interactive health communication system (IHCS). The model consists of seven weighted factors, with each factor containing five to seven elements. Design Two decision-analytic approaches, self-explicated and conjoint analysis, were used to measure the weights of the RIM with a sample of 410 experts. The RIM model with weights was then validated in a prospective study of 25 IHCS implementation cases. Measurements Orthogonal main effects design was used to develop 700 conjoint-analysis profiles, which varied on seven factors. Each of the 410 experts rated the importance and desirability of the factors and their levels, as well as a set of 10 different profiles. For the prospective 25-case validation, three time-repeated measures of the RIM scores were collected for comparison with the implementation outcomes. Results Two of the seven factors, ‘organizational motivation’ and ‘meeting user needs,’ were found to be most important in predicting implementation readiness. No statistically significant difference was found in the predictive validity of the two approaches (self-explicated and conjoint analysis). The RIM was a better predictor for the 1-year implementation outcome than the half-year outcome. Limitations The expert sample, the order of the survey tasks, the additive model, and basing the RIM cut-off score on experience are possible limitations of the study. Conclusion The RIM needs to be empirically evaluated in institutions adopting IHCS and sustaining the system in the long term. PMID:20962135

  11. Validation Study of Unnotched Charpy and Taylor-Anvil Impact Experiments using Kayenta

    Energy Technology Data Exchange (ETDEWEB)

    Kamojjala, Krishna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Lacy, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chu, Henry S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Brannon, Rebecca [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

    Validation of a single computational model with multiple available strain-to-failure fracture theories is presented through experimental tests and numerical simulations of the standardized unnotched Charpy and Taylor-anvil impact tests, both run using the same material model (Kayenta). Unnotched Charpy tests are performed on rolled homogeneous armor steel. The fracture patterns using Kayenta’s various failure options that include aleatory uncertainty and scale effects are compared against the experiments. Other quantities of interest include the average value of the absorbed energy and bend angle of the specimen. Taylor-anvil impact tests are performed on Ti6Al4V titanium alloy. The impact speeds of the specimen are 321 m/s and 393 m/s. The goal of the numerical work is to reproduce the damage patterns observed in the laboratory. For the numerical study, the Johnson-Cook failure model is used as the ductile fracture criterion, and aleatory uncertainty is applied to rate-dependence parameters to explore its effect on the fracture patterns.
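
    For reference, the Johnson-Cook failure criterion named above evaluates an equivalent plastic strain at failure from stress triaxiality, strain rate, and homologous temperature. A sketch using the classic Johnson-Cook (1985) constants for 4340 steel as placeholders, not the Ti6Al4V calibration used in the report:

        import math

        def jc_failure_strain(triax, strain_rate, t_hom,
                              d1=0.05, d2=3.44, d3=-2.12, d4=0.002, d5=0.61,
                              ref_rate=1.0):
            """eps_f = [d1 + d2*exp(d3*triax)] * [1 + d4*ln(rate*)] * [1 + d5*T*],
            with triax the stress triaxiality, rate* the normalized strain rate
            and T* the homologous temperature (0 at room temperature)."""
            rate_term = 1.0 + d4 * math.log(max(strain_rate / ref_rate, 1e-12))
            return (d1 + d2 * math.exp(d3 * triax)) * rate_term * (1.0 + d5 * t_hom)

        # Example: uniaxial-tension triaxiality, high rate, room temperature
        print(jc_failure_strain(triax=0.33, strain_rate=1e4, t_hom=0.0))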

  12. Validation of TGLF in C-Mod and DIII-D using machine learning and integrated modeling tools

    Science.gov (United States)

    Rodriguez-Fernandez, P.; White, Ae; Cao, Nm; Creely, Aj; Greenwald, Mj; Grierson, Ba; Howard, Nt; Meneghini, O.; Petty, Cc; Rice, Je; Sciortino, F.; Yuan, X.

    2017-10-01

    Predictive models for steady-state and perturbative transport are necessary to support burning plasma operations. A combination of machine learning algorithms and integrated modeling tools is used to validate TGLF in C-Mod and DIII-D. First, a new code suite, VITALS, is used to compare the SAT1 and SAT0 models in C-Mod. VITALS exploits machine learning and optimization algorithms for the validation of transport codes. Unlike SAT0, the SAT1 saturation rule contains a model to capture cross-scale turbulence coupling. Results show that SAT1 agrees better with experiments, further confirming that multi-scale effects are needed to model heat transport in C-Mod L-modes. VITALS will next be used to analyze past data from DIII-D: L-mode "Shortfall" plasmas and ECH swing experiments. A second code suite, PRIMA, allows for integrated modeling of the plasma response to Laser Blow-Off (LBO) cold pulses. Preliminary results show that SAT1 qualitatively reproduces the propagation of cold pulses after LBO injections and SAT0 does not, indicating that cross-scale coupling effects play a role in the plasma response. PRIMA will be used to "predict-first" cold pulse experiments using the new LBO system at DIII-D, and to analyze existing ECH heat pulse data. Work supported by DE-FC02-99ER54512, DE-FC02-04ER54698.

  13. Preliminary - discrete fracture network modelling of tracer migration experiments at the SCV site

    International Nuclear Information System (INIS)

    Dershowitz, W.S.; Wallmann, P.; Geier, J.E.; Lee, G.

    1991-09-01

    This report describes a numerical modelling study of solute transport within the Site Characterization and Validation (SCV) block at the Stripa site. The study was carried out with the FracMan/MAFIC package, utilizing statistics from stages 3 and 4 of the Stripa phase 3 Site Characterization and Validation project. Simulations were carried out to calibrate fracture solute transport properties against observations in the first stage of saline injection radar experiments. These results were then used to predict the performance of planned tracer experiments, using both particle tracking network solute transport, and pathways analysis approaches. Simulations were also carried out to predict results of the second stage of saline injection radar experiments. (au) (34 refs.)

  14. Five year experience in management of perforated peptic ulcer and validation of common mortality risk prediction models - are existing models sufficient? A retrospective cohort study.

    Science.gov (United States)

    Anbalakan, K; Chua, D; Pandya, G J; Shelat, V G

    2015-02-01

    Emergency surgery for perforated peptic ulcer (PPU) is associated with significant morbidity and mortality. Accurate and early risk stratification is important. The primary aim of this study is to validate the various existing mortality risk prediction models (MRPMs); the secondary aim is to audit our experience of managing PPU. 332 patients who underwent emergency surgery for PPU at a single institution from January 2008 to December 2012 were studied. Clinical and operative details were collected. Four MRPMs were validated: the American Society of Anesthesiology (ASA) score, Boey's score, the Mannheim peritonitis index (MPI) and the Peptic ulcer perforation (PULP) score. Median age was 54.7 years (range 17-109 years) with male predominance (82.5%). 61.7% presented within 24 h of onset of abdominal pain. Median length of stay was 7 days (range 2-137 days). Intra-abdominal collection, leakage, re-operation and 30-day mortality rates were 8.1%, 2.1%, 1.2% and 7.2% respectively. All four MRPMs predicted intra-abdominal collection and mortality; however, only the MPI predicted leak (p = 0.01) and re-operation (p = 0.02) rates. The area under the curve for predicting mortality was 75%, 72%, 77.2% and 75% for the ASA score, Boey's score, MPI and PULP score respectively. Emergency surgery for PPU has low morbidity and mortality in our experience. The MPI is the only scoring system that predicts all four outcomes: intra-abdominal collection, leak, re-operation and mortality. All four MRPMs had similar and fair accuracy in predicting mortality; however, due to geographic and demographic diversity and the inherent weaknesses of existing MRPMs, the quest for the development of an ideal model should continue. Copyright © 2015 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
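
    A minimal sketch of the discrimination statistic reported above (area under the ROC curve), computed in its rank-statistic (Mann-Whitney) form on hypothetical risk scores and outcomes:

        import numpy as np

        def auc(scores, outcomes):
            """AUC as the probability that a randomly chosen death carries a
            higher risk score than a randomly chosen survivor (ties count 1/2)."""
            s = np.asarray(scores, float)
            d = np.asarray(outcomes, bool)
            pos, neg = s[d], s[~d]
            greater = (pos[:, None] > neg[None, :]).sum()
            ties = (pos[:, None] == neg[None, :]).sum()
            return (greater + 0.5 * ties) / (len(pos) * len(neg))

        # Hypothetical MPI-like scores and 30-day mortality outcomes
        scores = [29, 21, 17, 25, 30, 13, 18, 27, 22, 15]
        died = [1, 0, 0, 1, 1, 0, 1, 0, 0, 0]
        print(f"AUC = {auc(scores, died):.2f}")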

  15. Validation of mentorship model for newly qualified professional ...

    African Journals Online (AJOL)

    Newly qualified professional nurses (NQPNs) allocated to community health care services require the use of a validated model to practice independently. Validation was done to adapt the model and to assess whether it is understood and can be implemented by NQPNs and mentors employed in community health care services.

  16. Validation and Adaptation of Router and Switch Models

    NARCIS (Netherlands)

    Boltjes, B.; Fernandez Diaz, I.; Kock, B.A.; Langeveld, R.J.G.M.; Schoenmaker, G.

    2003-01-01

    This paper describes validating OPNET models of key devices for the next generation IP-based tactical network of the Royal Netherlands Army (RNLA). The task of TNO-FEL is to provide insight in scalability and performance of future deployed networks. Because validated models of key Cisco equipment

  17. CFD Simulation and Experimental Validation of Fluid Flow and Particle Transport in a Model of Alveolated Airways.

    Science.gov (United States)

    Ma, Baoshun; Ruwet, Vincent; Corieri, Patricia; Theunissen, Raf; Riethmuller, Michel; Darquenne, Chantal

    2009-05-01

    Accurate modeling of air flow and aerosol transport in the alveolated airways is essential for quantitative predictions of pulmonary aerosol deposition. However, experimental validation of such modeling studies has been scarce. The objective of this study is to validate CFD predictions of flow field and particle trajectory with experiments within a scaled-up model of alveolated airways. Steady flow (Re = 0.13) of silicone oil was captured by particle image velocimetry (PIV), and the trajectories of 0.5 mm and 1.2 mm spherical iron beads (representing 0.7 to 14.6 μm aerosol in vivo) were obtained by particle tracking velocimetry (PTV). At twelve selected cross sections, the velocity profiles obtained by CFD matched well with those by PIV (within 1.7% on average). The CFD predicted trajectories also matched well with PTV experiments. These results showed that air flow and aerosol transport in models of human alveolated airways can be simulated by CFD techniques with reasonable accuracy.

  18. Alteration of 'R7T7' type nuclear glasses: statistical approach, experimental validation, local evolution model

    International Nuclear Information System (INIS)

    Thierry, F.

    2003-02-01

    The aim of this work is to propose an evolution of nuclear (R7T7-type) glass alteration modeling. The first part of this thesis concerns the development and validation of the 'r(t)' model. This model, which predicts the decrease of alteration rates in confined conditions, is based upon a coupling between a first-order dissolution law and a diffusion-barrier effect of the alteration gel layer. The values and the uncertainties of the main adjustable parameters of the model (α, Dg and C*) have been determined from a systematic study of the available experimental data. A program called INVERSION has been written for this purpose. This work led to the characterization of the validity domain of the 'r(t)' model and to its parametrization. Validation experiments have been undertaken, confirming the validity of the parametrization over 200 days. A new model is proposed in the second part of this thesis. It is based on an inhibition of the glass dissolution reaction by silicon, coupled with a local description of silicon retention in the alteration gel layer. This model predicts the evolution of boron and silicon concentrations in solution as well as the concentration and retention profiles in the gel layer. These predictions have been compared to measurements of retention profiles by secondary ion mass spectrometry (SIMS). The model has been validated on fractions of gel layer whose reactivity presents low or moderate disparities. (author)

  19. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology that is implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD is based on the direct observation of occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal-amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so that multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
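
    The 90/95 criterion can be checked directly from hit/miss data with a one-sided binomial lower confidence bound; a sketch using the Clopper-Pearson bound, with the classic 29-of-29 benchmark counts rather than DOEPOD output:

        from scipy.stats import beta

        def pod_lower_bound(hits, trials, confidence=0.95):
            """One-sided Clopper-Pearson lower confidence bound on POD."""
            if hits == 0:
                return 0.0
            return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

        # 29 hits in 29 trials is the smallest all-hit sample whose 95%
        # lower bound exceeds 0.90, i.e. it demonstrates 90/95 POD.
        print(pod_lower_bound(29, 29))   # ~0.902
        print(pod_lower_bound(28, 29))   # below 0.90: fails the criterion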

  20. Validation of a Mathematical Model for Green Algae (Raphidocelis subcapitata) Growth and Implications for a Coupled Dynamical System with Daphnia magna

    Directory of Open Access Journals (Sweden)

    Michael Stemkovski

    2016-05-01

    Toxicity testing in populations probes for responses in demographic variables to anthropogenic or natural chemical changes in the environment. Importantly, these tests are primarily performed on species in isolation from adjacent trophic levels in their ecosystem. The development and validation of coupled species models may aid in predicting adverse outcomes at the ecosystem level. Here, we aim to validate a model for the population dynamics of the green alga Raphidocelis subcapitata, a planktonic species that is often used as a primary food source in toxicity experiments for the freshwater crustacean Daphnia magna. We collected longitudinal data from three replicate population experiments of R. subcapitata. We used these data with statistical model comparison tests and uncertainty quantification techniques to compare the performance of four models: the logistic model, the Bernoulli model, the Gompertz model, and a discretization of the logistic model. Overall, our results suggest that the logistic model is the most accurate continuous model for R. subcapitata population growth. We then implement the numerical discretization, showing how the continuous logistic model for algae can be coupled to a previously validated discrete-time population model for D. magna.
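
    A minimal sketch of fitting the continuous logistic model named above to longitudinal algae counts; the data are hypothetical stand-ins for the R. subcapitata cultures:

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, K, r, x0):
            """Closed-form solution of dx/dt = r*x*(1 - x/K) with x(0) = x0."""
            return K / (1.0 + (K / x0 - 1.0) * np.exp(-r * t))

        # Hypothetical cell densities (10^4 cells/mL) over 14 days
        t = np.array([0, 2, 4, 6, 8, 10, 12, 14], dtype=float)
        x = np.array([2.1, 5.0, 11.8, 24.0, 39.5, 52.0, 58.6, 61.0])

        (K, r, x0), _ = curve_fit(logistic, t, x, p0=[60.0, 0.5, 2.0])
        print(f"carrying capacity K = {K:.1f}, growth rate r = {r:.2f}/day")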

  1. Grimsel Test Site: modelling radionuclide migration field experiments

    International Nuclear Information System (INIS)

    Heer, W.; Hadermann, J.

    1994-09-01

    In the migration field experiments at Nagra's Grimsel Test Site, the processes of nuclide transport through a well-defined fractured shear zone in crystalline rock are being investigated. For these experiments, model calculations have been performed to obtain indications of the validity and limitations of the model applied and the data deduced under field conditions. The model consists of a hydrological part, where the dipole flow fields of the experiments are determined, and a nuclide transport part, where the flow-field-driven nuclide propagation through the shear zone is calculated. In addition to the description of the model, analytical expressions are given to guide the interpretation of experimental results. From the analysis of experimental breakthrough curves for the conservative uranine, weakly sorbing sodium and more strongly sorbing strontium tracers, the following main results can be derived: i) The model is able to represent the breakthrough curves of the migration field experiments to a high degree of accuracy. ii) The process of matrix diffusion is manifest through the tails of the breakthrough curves decreasing with time as t^(-3/2) and through the special shape of the tail ends, both confirmed by the experiments. iii) For nuclides sorbing rapidly, not too strongly, linearly, and exhibiting a reversible cation exchange process on fault gouge, the laboratory sorption coefficient can reasonably well be extrapolated to field conditions. Adequate care in selecting and preparing the rock samples is, of course, a necessary requirement. Using the parameters determined in the previous analysis, predictions are made for experiments in a smaller and faster flow field. For the conservative uranine and weakly sorbing sodium, the agreement of predicted and measured breakthrough curves is good, and for the more strongly sorbing strontium reasonable, confirming that the model describes the main nuclide transport processes adequately. (author) figs., tabs., 29 refs

  2. A Validation Study of the Adolescent Dissociative Experiences Scale

    Science.gov (United States)

    Keck Seeley, Susan. M.; Perosa, Sandra, L.; Perosa, Linda, M.

    2004-01-01

    Objective: The purpose of this study was to further the validation process of the Adolescent Dissociative Experiences Scale (A-DES). In this study, a 6-item Likert response format with descriptors was used when responding to the A-DES rather than the 11-item response format used in the original A-DES. Method: The internal reliability and construct…

  3. Fast Running Urban Dispersion Model for Radiological Dispersal Device (RDD) Releases: Model Description and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Gowardhan, Akshay [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Neuscamman, Stephanie [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Donetti, John [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Belles, Rich [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Eme, Bill [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Homann, Steven [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Simpson, Matthew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Nasstrom, John [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC)

    2017-05-24

    Aeolus is an efficient three-dimensional computational fluid dynamics code based on the finite volume method, developed for predicting transport and dispersion of contaminants in a complex urban area. It solves the time-dependent incompressible Navier-Stokes equation on a regular Cartesian staggered grid using a fractional step method. It also solves a scalar transport equation for temperature, with buoyancy treated using the Boussinesq approximation. The model also includes a Lagrangian dispersion model for predicting the transport and dispersion of atmospheric contaminants. The model can be run in an efficient Reynolds Averaged Navier-Stokes (RANS) mode with a run time of several minutes, or a more detailed Large Eddy Simulation (LES) mode with a run time of hours for a typical simulation. This report describes the model components, including details on the physics models used in the code, as well as several model validation efforts. Aeolus wind and dispersion predictions are compared to field data from the Joint Urban Field Trials 2003 conducted in Oklahoma City (Allwine et al 2004), including both continuous and instantaneous releases. Newly implemented Aeolus capabilities include a decay chain model and an explosive Radiological Dispersal Device (RDD) source term; these capabilities are described. Aeolus predictions using the buoyant explosive RDD source are validated against two experimental data sets: the Green Field explosive cloud rise experiments conducted in Israel (Sharon et al 2012) and the Full-Scale RDD Field Trials conducted in Canada (Green et al 2016).
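
    The Lagrangian dispersion component described above advances particles with the resolved wind plus a stochastic turbulent increment; a minimal random-walk sketch of that idea (not the Aeolus implementation; the wind and diffusivity values are illustrative):

        import numpy as np

        rng = np.random.default_rng(0)

        def lagrangian_step(pos, wind, k_turb, dt):
            """Advect particles by the local wind and add a random-walk
            displacement of standard deviation sqrt(2*K*dt) per axis,
            the diffusion-equivalent turbulent increment."""
            drift = wind * dt
            diffusion = rng.normal(0.0, np.sqrt(2.0 * k_turb * dt), pos.shape)
            return pos + drift + diffusion

        # 1000 particles released at the origin, uniform 3 m/s wind along x
        pos = np.zeros((1000, 3))
        wind = np.array([3.0, 0.0, 0.0])
        for _ in range(600):                      # 600 one-second steps
            pos = lagrangian_step(pos, wind, k_turb=5.0, dt=1.0)
        print(pos.mean(axis=0), pos.std(axis=0))  # plume centre and spread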

  4. The Development and Validation of the Game User Experience Satisfaction Scale (GUESS).

    Science.gov (United States)

    Phan, Mikki H; Keebler, Joseph R; Chaparro, Barbara S

    2016-12-01

    The aim of this study was to develop and psychometrically validate a new instrument that comprehensively measures video game satisfaction based on key factors. Playtesting is often conducted in the video game industry to help game developers build better games by providing insight into the players' attitudes and preferences. However, quality feedback is difficult to obtain from playtesting sessions without a quality gaming assessment tool. There is a need for a psychometrically validated and comprehensive gaming scale that is appropriate for playtesting and game evaluation purposes. The process of developing and validating this new scale followed current best practices of scale development and validation. As a result, a mixed-method design that consisted of item pool generation, expert review, questionnaire pilot study, exploratory factor analysis (N = 629), and confirmatory factor analysis (N = 729) was implemented. A new instrument measuring video game satisfaction, called the Game User Experience Satisfaction Scale (GUESS), with nine subscales emerged. The GUESS was demonstrated to have content validity, internal consistency, and convergent and discriminant validity. The GUESS was developed and validated based on the assessments of over 450 unique video game titles across many popular genres. Thus, it can be applied across many types of video games in the industry both as a way to assess what aspects of a game contribute to user satisfaction and as a tool to aid in debriefing users on their gaming experience. The GUESS can be administered to evaluate user satisfaction of different types of video games by a variety of users. © 2016, Human Factors and Ergonomics Society.

  5. Validation of a CFD analysis model for the calculation of CANDU6 moderator temperature distribution

    International Nuclear Information System (INIS)

    Yoon, Churl; Rhee, Bo Wook; Min, Byung Joo

    2001-01-01

    A validation of a 3D CFD model for predicting local subcooling of the moderator in the vicinity of calandria tubes in a CANDU reactor is performed. The small-scale moderator experiments performed at the Sheridan Park Experimental Laboratory (SPEL) in Ontario, Canada are used for the validation. Also, a comparison is made between previous CFD analyses based on 2DMOTH and PHOENICS and the current model analysis for the same SPEL experiment. For the current model, a set of grid structures for the same geometry as the experimental test section is generated and the momentum, heat and continuity equations are solved by CFX-4.3, a CFD code developed by AEA Technology. The matrix of calandria tubes is simplified by the porous media approach. The standard κ-ε turbulence model associated with logarithmic wall treatment and the SIMPLEC algorithm on the body-fitted grid are used, and buoyancy effects are accounted for by the Boussinesq approximation. For the test conditions simulated in this study, the flow pattern identified is a buoyancy-dominated flow, which is generated by the interaction between the dominant buoyancy force by heating and the inertial momentum forces of the inlet jets. As a result, the current CFD moderator analysis model predicts the moderator temperature reasonably well, and the maximum error against the experimental data is kept at less than 2.0 °C over the whole domain. The simulated velocity field matches the flow visualization of the SPEL experiments quite well.

  6. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    [DRAFT] DETC2015-46982, IDETC/CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA. Development of a conservative model validation approach for reliable analysis: the aim is to obtain a conservative simulation model for reliable design even with limited experimental data, an aspect that very little research has taken into account. The proposed conservative model validation is briefly compared to the conventional model validation approach.

  7. Intercomparison and validation of operational coastal-scale models, the experience of the project MOMAR.

    Science.gov (United States)

    Brandini, C.; Coudray, S.; Taddei, S.; Fattorini, M.; Costanza, L.; Lapucci, C.; Poulain, P.; Gerin, R.; Ortolani, A.; Gozzini, B.

    2012-04-01

    The need for regional governments to implement operational systems for the sustainable management of coastal waters, in order to meet the requirements imposed by legislation (e.g. EU directives such as WFD, MSFD, BD and relevant national legislation), often leads to the implementation of coastal measurement networks and to the construction of computational models that surround and describe parts of regional seas without falling within the classic definition of regional/coastal models. Although these operational models may be structured to cover parts of different oceanographic basins, they can have considerable advantages and highlight relevant issues, such as the role of narrow channels, straits and islands in coastal circulation, in both physical and biogeochemical processes such as the exchange of water masses among basins. Two models of this type were made in the context of the cross-border European project MOMAR: an operational model of the Tuscan Archipelago sea and one of the Corsica coastal waters, both located between the Tyrrhenian and the Algerian-Ligurian-Provençal basins. Although these two models were based on different computer codes (MARS3D and ROMS), they have several elements in common, such as a 400 m resolution, boundary conditions from the same "father" model, and an important area of overlap, the Corsica channel, which has a key role in the exchange of water masses between the two oceanographic basins. In this work we present the results of the comparison of these two ocean forecasting systems in response to different weather and oceanographic forcing. In particular, we discuss aspects related to the validation of the two systems and a systematic comparison between the forecasts/hindcasts based on such hydrodynamic models, with regard both to operational models available at larger scale and to in-situ measurements made by fixed or mobile platforms. In this context we also present the results of two oceanographic cruises in the

  8. Validation of MCCI models implemented in ASTEC MEDICIS on OECD CCI-2 and CCI-3 experiments and further consideration on reactor cases

    Energy Technology Data Exchange (ETDEWEB)

    Agethen, K.; Koch, M.K., E-mail: agethen@lee.rub.de, E-mail: koch@lee.rub.de [Ruhr-Universitat Bochum, Energy Systems and Energy Economics, Reactor Simulation and Safety Group, Bochum (Germany)

    2014-07-01

    During a severe accident in a light water reactor, a loss of coolant can result in core melting and vessel failure. Afterwards, molten core material may discharge into the containment cavity and interact with the concrete basemat. Due to concrete erosion, gases are released, which lead to exothermic oxidation reactions with the metals in the corium and to the formation of combustible mixtures. In this work the MEDICIS module of the Accident Source Term Evaluation Code (ASTEC) is validated against experiments of the OECD CCI programme. The primary focus is set on the CCI-2 experiment with limestone common sand (LCS) concrete, in which nearly homogeneous erosion appeared, and the CCI-3 experiment with siliceous concrete, in which increased lateral erosion occurred. These experiments enable the analysis of heat transfer depending on the axial and radial orientation from the interior of the melt to the surrounding surfaces, and of the impact of top flooding. For the simulation of both tests, two existing models in MEDICIS are used and analysed. The simulation results show a good agreement of ablation behaviour, layer temperature and energy balance with the experimental results. Furthermore, a quasi-steady state appeared in the long-term energy balance. Finally, the basic data are scaled up to a generic reactor scenario, which shows that this quasi-steady state occurs similarly. (author)

  9. Validation of ecological state space models using the Laplace approximation

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Albertsen, Christoffer Moesgaard; Berg, Casper Willestofte

    2017-01-01

    Many statistical models in ecology follow the state space paradigm. For such models, the important step of model validation rarely receives as much attention as estimation or hypothesis testing, perhaps due to lack of available algorithms and software. Model validation is often based on a naive...... for estimation in general mixed effects models. Implementing one-step predictions in the R package Template Model Builder, we demonstrate that it is possible to perform model validation with little effort, even if the ecological model is multivariate, has non-linear dynamics, and whether observations...... useful directions in which the model could be improved....
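
    The one-step-prediction idea above (implemented by the authors in the R package Template Model Builder) can be illustrated on a linear-Gaussian state space model, where the Kalman filter yields standardized one-step prediction residuals that should be approximately standard normal when the model is adequate; a sketch with simulated data:

        import numpy as np

        rng = np.random.default_rng(1)

        # Simulate an AR(1) state with observation noise:
        # x_t = a*x_{t-1} + w_t,  y_t = x_t + v_t
        a, q, r, n = 0.8, 0.1, 0.2, 500
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = a * x[t - 1] + rng.normal(0.0, np.sqrt(q))
        y = x + rng.normal(0.0, np.sqrt(r), n)

        # Kalman filter, collecting standardized one-step prediction residuals
        m, P, resid = 0.0, 1.0, []
        for t in range(n):
            m_pred, P_pred = a * m, a * a * P + q     # predict
            S = P_pred + r                            # innovation variance
            resid.append((y[t] - m_pred) / np.sqrt(S))
            K = P_pred / S                            # update
            m = m_pred + K * (y[t] - m_pred)
            P = (1.0 - K) * P_pred

        resid = np.array(resid)
        print(f"mean {resid.mean():.3f}, std {resid.std():.3f}")  # ~0 and ~1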

  10. An intercomparison of a large ensemble of statistical downscaling methods for Europe: Overall results from the VALUE perfect predictor cross-validation experiment

    Science.gov (United States)

    Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors -aligned with the EURO-CORDEX experiment- and 3) pseudo-reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated and publicly distributed through the VALUE validation portal, allowing for a comprehensive community-open downscaling intercomparison study. In this contribution we describe the overall results from Experiment 1, consisting of a Europe-wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. As a result of the open call for contribution to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) were submitted, including information both data
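
    A minimal sketch of the experiment's cross-validation protocol (5 folds of consecutive 6-year blocks over 1979-2008) applied to a toy downscaling regression; the predictor and predictand are synthetic placeholders:

        import numpy as np

        years = np.arange(1979, 2009)                   # 30 years, 5 blocks of 6
        rng = np.random.default_rng(2)
        x = rng.normal(size=years.size)                 # toy large-scale predictor
        y = 2.0 * x + rng.normal(0.0, 0.5, years.size)  # toy local predictand

        folds = np.array_split(np.arange(years.size), 5)  # consecutive blocks
        rmse = []
        for test in folds:
            train = np.setdiff1d(np.arange(years.size), test)
            slope, intercept = np.polyfit(x[train], y[train], 1)
            pred = slope * x[test] + intercept
            rmse.append(np.sqrt(np.mean((y[test] - pred) ** 2)))
        print(["%.2f" % e for e in rmse])               # per-fold RMSE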

  11. Blast Load Simulator Experiments for Computational Model Validation: Report 2

    Science.gov (United States)

    2017-02-01

    Codes simulating these explosive events and their effects are continuously improving, but still require validation against experimental data. Report figures include ninety-five percent confidence intervals on measured peak pressure (Figures 18 and 19).

  12. Modeling and conduct of turbine missile concrete impact experiments

    International Nuclear Information System (INIS)

    Woodfin, R.L.

    1981-01-01

    The overall objective of the subject experiments was to provide full scale data on the response of reinforced concrete containment walls to impact and penetration by postulated turbine-produced missiles. These data can be used to validate analytical or scale modeling methods and to assess the applicability of current design formulas to penetration by large, irregularly shaped missiles. These data and results will be used in providing more realistic estimates of turbine missile damage probability in nuclear power plants with a non-peninsula arrangement. This paper describes the derivation of the test matrix and the method of conducting the experiments. (orig./HP)

  13. Development and validation of the Bullying and Cyberbullying Scale for Adolescents: A multi-dimensional measurement model.

    Science.gov (United States)

    Thomas, Hannah J; Scott, James G; Coates, Jason M; Connor, Jason P

    2018-05-03

    Intervention on adolescent bullying is reliant on valid and reliable measurement of victimization and perpetration experiences across different behavioural expressions. This study developed and validated a survey tool that integrates measurement of both traditional and cyber bullying to test a theoretically driven multi-dimensional model. Adolescents from 10 mainstream secondary schools completed a baseline and follow-up survey (N = 1,217; M age = 14 years; 66.2% male). The Bullying and Cyberbullying Scale for Adolescents (BCS-A) developed for this study comprised parallel victimization and perpetration subscales, each with 20 items. Additional measures of bullying (Olweus Global Bullying and the Forms of Bullying Scale [FBS]), as well as measures of internalizing and externalizing problems, school connectedness, social support, and personality, were used to further assess validity. Factor structure was determined, and then the suitability of items was assessed according to the following criteria: (1) factor interpretability, (2) item correlations, (3) model parsimony, and (4) measurement equivalence across victimization and perpetration experiences. The final models comprised four factors: physical, verbal, relational, and cyber. The final scale was revised to two 13-item subscales. The BCS-A demonstrated acceptable concurrent and convergent validity (internalizing and externalizing problems, school connectedness, social support, and personality), as well as predictive validity over 6 months. The BCS-A has sound psychometric properties. This tool establishes measurement equivalence across types of involvement and behavioural forms common among adolescents. An improved measurement method could add greater rigour to the evaluation of intervention programmes and also enable interventions to be tailored to subscale profiles. © 2018 The British Psychological Society.

  14. Validation of a multiparameter model to investigate torrefied biomass pelletization behavior

    DEFF Research Database (Denmark)

    Puig Arnavat, Maria; Ahrenfeldt, Jesper; Henriksen, Ulrik Birk

    2017-01-01

    The present study aims to apply and validate, for the pelletization of torrefied biomass, a simple model describing the forces that build up along the dies of a pellet press matrix. The model combines a theoretical background with the use of a single pellet press to describe the pelletizing...... behavior of torrefied material in an industrial-scale pellet mill. Wet torrefaction and dry torrefaction pretreatments are considered in the study. Both torrefaction concepts produce a fuel with enhanced properties, including a lower moisture content, higher calorific value, and better friability. The fuel...... and to avoid time-consuming as well as expensive trial-and-error experiments....

  15. Advanced training simulator models. Implementation and validation

    International Nuclear Information System (INIS)

    Borkowsky, Jeffrey; Judd, Jerry; Belblidia, Lotfi; O'farrell, David; Andersen, Peter

    2008-01-01

    Modern training simulators are required to replicate plant data for both thermal-hydraulic and neutronic response. Replication is required such that reactivity manipulation on the simulator properly trains the operator for reactivity manipulation at the plant. This paper discusses advanced models which perform this function in real time using the coupled code system THOR/S3R. This code system models all fluid systems in detail using an advanced two-phase thermal-hydraulic model. The nuclear core is modeled using an advanced three-dimensional nodal method together with cycle-specific nuclear data. These models are configured to run interactively from a graphical instructor station or hardware operation panels. The simulator models are theoretically rigorous and are expected to replicate the physics of the plant. However, to verify replication, the models must be independently assessed. Plant data is the preferred validation method, but plant data is often not available for many important training scenarios. In the absence of data, validation may be obtained by slower-than-real-time transient analysis. This analysis can be performed by coupling a safety analysis code and a core design code. Such a coupling exists between the codes RELAP5 and SIMULATE-3K (S3K). RELAP5/S3K is used to validate the real-time model for several postulated plant events. (author)

  16. Utilisation of real-scale renewable energy test facility for validation of generic wind turbine and wind power plant controller models

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Gevorgian, Vahan; Wallen, Robb

    2016-01-01

    This article presents an example of application of a modern test facility conceived for experiments regarding the integration of renewable energy in the power system. The capabilities of the test facility are used to validate dynamic simulation models of wind power plants and their controllers....... The models are based on standard and generic blocks. The successful validation of events related to the control of active power (control phenomena in...

  17. Bayesian model calibration of ramp compression experiments on Z

    Science.gov (United States)

    Brown, Justin; Hund, Lauren

    2017-06-01

    Bayesian model calibration (BMC) is a statistical framework to estimate inputs for a computational model in the presence of multiple uncertainties, making it well suited to dynamic experiments which must be coupled with numerical simulations to interpret the results. Often, dynamic experiments are diagnosed using velocimetry and this output can be modeled using a hydrocode. Several calibration issues unique to this type of scenario including the functional nature of the output, uncertainty of nuisance parameters within the simulation, and model discrepancy identifiability are addressed, and a novel BMC process is proposed. As a proof of concept, we examine experiments conducted on Sandia National Laboratories' Z-machine which ramp compressed tantalum to peak stresses of 250 GPa. The proposed BMC framework is used to calibrate the cold curve of Ta (with uncertainty), and we conclude that the procedure results in simple, fast, and valid inferences. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
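
    The essence of the approach can be illustrated with a toy calibration. A minimal sketch follows, assuming a simple analytic stand-in for the hydrocode and a Gaussian likelihood over the velocity trace; the forward model, prior range and noise level are all illustrative, not those of the Z experiments.

```python
import numpy as np

# Toy stand-in for the hydrocode: maps a stiffness-like cold-curve
# parameter "theta" to a velocity trace (the functional output).
def forward_model(theta, t):
    return theta * np.tanh(2.0 * t)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
data = forward_model(1.3, t) + rng.normal(0.0, 0.05, t.size)  # synthetic "velocimetry"

def log_posterior(theta, sigma=0.05):
    if not 0.0 < theta < 10.0:            # flat prior on an assumed physical range
        return -np.inf
    resid = data - forward_model(theta, t)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis: the simplest route to a posterior with uncertainty.
theta, lp, samples = 1.0, log_posterior(1.0), []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.05)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5000:])           # discard burn-in
print(f"calibrated theta = {post.mean():.3f} +/- {post.std():.3f}")
```

    The issues the abstract raises (nuisance parameters, model discrepancy) would enter as additional sampled dimensions and an additive discrepancy term in the likelihood; they are omitted here for brevity.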

  18. Blast Load Simulator Experiments for Computational Model Validation Report 3

    Science.gov (United States)

    2017-07-01

    the effect of the contact surface on the measurement. For gauge locations where a clearly defined initial peak is not present, Figure 24 for example...these explosive events and their effects. These codes are continuously improving, but still require validation against experimental data to...DISCLAIMER: The contents of this report are not to be used for advertising, publication, or promotional purposes. Citation of trade names does not

  19. Non-destructive measurements of nuclear wastes. Validation and industrial operating experience

    International Nuclear Information System (INIS)

    Saas, A.; Tchemitciieff, E.

    1993-01-01

    After a short survey of the means employed for the non-destructive measurement of specific activities (γ and X-ray) in waste packages and raw waste, the performances of the device and the ANDRA requirements are presented. The validation of the γ and X-ray measurements on packages is obtained by determining, by destructive means, the same activity on coring samples. The same procedure is used for validating the homogeneity measurements on packages (either homogeneous or heterogeneous). Operating experience is then presented for several kinds of packages and waste. Up to now, about twenty different types of packages have been examined, and more than 200 packages have allowed the calibration, validation, and control.

  20. Explicating Experience: Development of a Valid Scale of Past Hazard Experience for Tornadoes.

    Science.gov (United States)

    Demuth, Julie L

    2018-03-23

    People's past experiences with a hazard theoretically influence how they approach future risks. Yet past hazard experience has been conceptualized and measured in wide-ranging, often simplistic, ways, resulting in mixed findings about its relationship with risk perception. This study develops a scale of past hazard experiences, in the context of tornadoes, that is content and construct valid. A conceptual definition was developed, a set of items was created to measure one's most memorable and multiple tornado experiences, and the measures were evaluated through two surveys of the public who reside in tornado-prone areas. Four dimensions emerged of people's most memorable experience, reflecting their awareness of the tornado risk that day, their personalization of the risk, the intrusive impacts on them personally, and impacts experienced vicariously through others. Two dimensions emerged of people's multiple experiences, reflecting common types of communication received and negative emotional responses. These six dimensions are novel in that they capture people's experience across the timeline of a hazard as well as intangible experiences that are both direct and indirect. The six tornado experience dimensions were correlated with tornado risk perceptions measured both as cognitive-affective and as perceived probability of consequences. The varied experience-risk perception results suggest that it is important to understand the nuances of these concepts and their relationships. This study provides a foundation for future work to continue explicating past hazard experience across different risk contexts and for understanding its effect on risk assessment and responses. © 2018 Society for Risk Analysis.

  1. Development, Validation and Parametric study of a 3-Year-Old Child Head Finite Element Model

    Science.gov (United States)

    Cui, Shihai; Chen, Yue; Li, Haiyan; Ruan, ShiJie

    2015-12-01

    Traumatic brain injury caused by falls and traffic accidents is an important cause of death and disability in children. Recently, finite element (FE) head models have been developed to investigate brain injury mechanisms and biomechanical responses. Based on CT data of a healthy 3-year-old child head, an FE head model with detailed anatomical structure was developed. Deep brain structures such as the white matter, gray matter, cerebral ventricles, and hippocampus were created in this FE model for the first time. The FE model was validated by comparing the simulation results with cadaver data obtained by reconstructing child and adult cadaver experiments. In addition, the effects of skull stiffness on the dynamic responses of the child head were further investigated. All the simulation results confirmed the good biofidelity of the FE model.

  2. A Validation of Subchannel Based CHF Prediction Model for Rod Bundles

    International Nuclear Information System (INIS)

    Hwang, Dae-Hyun; Kim, Seong-Jin

    2015-01-01

    is concerned, however, the experimental uncertainty should be reflected in evaluating the subchannel thermal-hydraulic parameters which are not measured during CHF experiments. In the traditional design of PWR cores, the influence of CHF experiment uncertainty is not explicitly considered in the limit DNBR. This may be acceptable when the uncertainty of an empirical CHF correlation is considerably larger than the experimental uncertainty. However, it should be noted that the influence of experimental uncertainty may depend on various factors such as the accuracy of the CHF model, the quality of the test facility, the uncertainty of the subchannel analysis code, and the number of available CHF data. A validation procedure for a subchannel-based CHF prediction model was examined by employing a CHF lookup table method and rod bundle CHF data simulating SMART fuel bundles.

  3. Contaminant transport model validation: The Oak Ridge Reservation

    International Nuclear Information System (INIS)

    Lee, R.R.; Ketelle, R.H.

    1988-09-01

    In the complex geologic setting of the Oak Ridge Reservation, hydraulic conductivity is anisotropic and flow is strongly influenced by an extensive and largely discontinuous fracture network. Difficulties in describing and modeling the aquifer system prompted a study to obtain aquifer property data to be used in a groundwater flow model validation experiment. Characterization studies included the performance of an extensive suite of aquifer tests within a 600-square-meter area to obtain aquifer property values describing the flow field in detail. Following the aquifer tests, a groundwater tracer test was performed under ambient conditions to verify the aquifer analysis. Tracer migration data in the near field were used in model calibration to predict tracer arrival time and concentration in the far field. Despite the extensive aquifer testing, initial modeling inaccurately predicted the tracer migration direction. Initial tracer migration rates were consistent with those predicted by the model; however, changing environmental conditions resulted in an unanticipated decay in tracer movement. Evaluation of the predictive accuracy of groundwater flow and contaminant transport models on the Oak Ridge Reservation depends on defining the resolution required, followed by field testing and model grid definition at compatible scales. The use of tracer tests, both as a characterization method and to verify model results, provides the highest level of resolution of groundwater flow characteristics. 3 refs., 4 figs.

  4. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Calderer, Antoni [Univ. of Minnesota, Minneapolis, MN (United States); Yang, Xiaolei [Stony Brook Univ., NY (United States); Angelidis, Dionysios [Univ. of Minnesota, Minneapolis, MN (United States); Feist, Chris [Univ. of Minnesota, Minneapolis, MN (United States); Guala, Michele [Univ. of Minnesota, Minneapolis, MN (United States); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guo, Xin [Univ. of Minnesota, Minneapolis, MN (United States); Boomsma, Aaron [Univ. of Minnesota, Minneapolis, MN (United States); Shen, Lian [Univ. of Minnesota, Minneapolis, MN (United States); Sotiropoulos, Fotis [Stony Brook Univ., NY (United States)

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  5. Alteration of 'R7T7' type nuclear glasses: statistical approach, experimental validation, local evolution model; Alteration des verres nucleaires de type 'R7T7': demarche statistique, validation experimentale, modele local d'evolution

    Energy Technology Data Exchange (ETDEWEB)

    Thierry, F

    2003-02-01

    The aim of this work is to propose an evolution of nuclear (R7T7-type) glass alteration modeling. The first part of this thesis concerns the development and validation of the 'r(t)' model. This model, which predicts the decrease of alteration rates in confined conditions, is based upon a coupling between a first-order dissolution law and a diffusion barrier effect of the alteration gel layer. The values and uncertainties of the main adjustable parameters of the model (α, Dg and C*) have been determined from a systematic study of the available experimental data; a program called INVERSION was written for this purpose. This work led to the characterization of the validity domain of the 'r(t)' model and to its parametrization. Validation experiments have been undertaken, confirming the validity of the parametrization over 200 days. A new model is proposed in the second part of this thesis. It is based on an inhibition of the glass dissolution reaction by silicon, coupled with a local description of silicon retention in the alteration gel layer. This model predicts the evolution of boron and silicon concentrations in solution as well as the concentration and retention profiles in the gel layer. These predictions have been compared to retention profiles measured by secondary ion mass spectrometry (SIMS). The model has been validated on fractions of the gel layer whose reactivity presents low or moderate disparities. (author)
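
    A minimal numerical sketch of the kind of coupling the 'r(t)' model describes is given below: a first-order dissolution law throttled by diffusion through the growing gel layer, treated here as two resistances in series. The parameter values and the series-resistance closure are schematic assumptions for illustration, not the fitted α, Dg and C* of the thesis.

```python
import numpy as np

# Schematic stand-ins for the model's adjustable parameters (alpha, Dg, C*);
# the values and the series-resistance closure are illustrative assumptions.
r0, Dg, Cstar, SV = 1.0, 1e-3, 1.0, 0.5   # rate, gel diffusivity, saturation, S/V ratio

dt, T = 0.1, 200.0
C, e = 0.0, 1e-6                          # solution concentration, gel thickness
for _ in np.arange(0.0, T, dt):
    r_chem = r0 * max(1.0 - C / Cstar, 0.0)   # first-order dissolution law
    r_diff = Dg * (Cstar - C) / e             # transport through the gel layer
    # Two resistances in series: whichever step is slower throttles the rate.
    r = 1.0 / (1.0 / r_chem + 1.0 / r_diff) if r_chem > 0.0 else 0.0
    e += r * dt                               # gel growth tracks glass alteration
    C += SV * r * dt                          # silicon released into solution

print(f"altered thickness after {T:g} time units: {e:.3f} (arbitrary units)")
```

    Early on the chemical step limits the rate; as the gel thickens and the solution approaches saturation, the diffusion barrier takes over and the rate collapses, which is the qualitative behaviour the 'r(t)' model captures.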

  6. Gas transfer under breaking waves: experiments and an improved vorticity-based model

    Directory of Open Access Journals (Sweden)

    V. K. Tsoukala

    2008-07-01

    Full Text Available In the present paper a modified vorticity-based model for gas transfer under breaking waves in the absence of significant wind forcing is presented. A theoretically valid and practically applicable mathematical expression is suggested for the assessment of the oxygen transfer coefficient in the area of wave breaking. The proposed model is based on the theory of surface renewal, which expresses the oxygen transfer coefficient as a function of both the wave vorticity and the Reynolds wave number for breaking waves. Experimental data were collected in wave flumes of various scales: (a) small-scale experiments were carried out using both a sloping beach and a rubble-mound breakwater in the wave flume of the Laboratory of Harbor Works, NTUA, Greece; (b) large-scale experiments were carried out with a sloping beach in the wind-wave flume of Delft Hydraulics, the Netherlands, and with a three-layer rubble-mound breakwater in the Schneideberg Wave Flume of the Franzius Institute, University of Hannover, Germany. The experimental data acquired from both the small- and large-scale experiments were in good agreement with the proposed model. Although the apparent transfer coefficients from the large-scale experiments were lower than those determined from the small-scale experiments, the actual oxygen transfer coefficients, as calculated using a discretized form of the transport equation, are of the same order of magnitude for both the small- and large-scale experiments. The predictions of the proposed model are also compared with experimental results from other researchers. Although the results are encouraging, additional research is needed to incorporate the influence of bubble-mediated gas exchange before these results are used for an environmentally friendly design of harbor works or for projects involving waste disposal at sea.

  7. Evaluation of three atmospheric dispersion models using tracer release experiment data

    International Nuclear Information System (INIS)

    Daoo, V.J.; Oza, R.B.; Pandit, G.G.; Sadasivan, S.; Venkat Raj, V.

    2004-01-01

    Performance of three atmospheric dispersion models, viz. (1) the Gaussian Plume Model (GPM), (2) the Equi-Distance PUFF Model (EDPUFFM) and (3) the Particle Trajectory Model (PTM), is evaluated using field data collected from a tracer (SF6) release experiment. The experiment was conducted within the campus of the Bhabha Atomic Research Centre (BARC), located at Trombay, Mumbai, India. The three models are currently in operation at BARC: the first is a standard, well-documented empirical model, while the other two were developed at BARC itself. The PTM is a numerical model, while the EDPUFFM is a hybrid model combining numerical and analytical techniques. The evaluation procedure follows the recommendations of the 1980 AMS (American Meteorological Society) workshop on the performance evaluation of atmospheric dispersion models. In addition, linear regression analysis has been carried out. The regression analysis reveals that, on average, the EDPUFFM and GPM predictions are higher by a factor of about 1.5, while the PTM predictions are lower by a factor of about 4. Comparison of the various performance measures reveals that the performance of the EDPUFFM is marginally better than that of the GPM, while the PTM performance is comparatively poor. The uncertainty factors obtained in this study, especially for the higher concentration range (> 100 ppt), are similar to those obtained in other validation studies carried out elsewhere to validate GPM predictions. However, for the lower concentration range, and for the conditions after the source is switched off, all three models perform poorly in predicting the concentration. (author)
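
    Evaluations of this kind typically reduce to a handful of residual statistics over paired observed/predicted concentrations. The sketch below computes a conventional set (fractional bias, normalized mean square error, factor-of-two fraction, geometric mean bias); this is the standard dispersion-model metric set, not necessarily the exact list from the 1980 AMS workshop, and the example numbers are made up.

```python
import numpy as np

def performance_measures(obs, pred):
    """Standard air-quality model evaluation metrics (illustrative set)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    fb = 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())  # fractional bias
    nmse = np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())      # normalized MSE
    fac2 = np.mean((pred >= 0.5 * obs) & (pred <= 2.0 * obs))           # fraction within 2x
    mg = np.exp(np.mean(np.log(obs)) - np.mean(np.log(pred)))           # geometric mean bias
    return {"FB": fb, "NMSE": nmse, "FAC2": fac2, "MG": mg}

# Example with made-up tracer concentrations (ppt)
obs  = [120.0, 300.0, 55.0, 410.0, 95.0]
pred = [180.0, 420.0, 40.0, 600.0, 160.0]   # an over-predicting model, like the GPM/EDPUFFM
print(performance_measures(obs, pred))
```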

  8. Panamanian women's experience of vaginal examination in labour: A questionnaire validation.

    Science.gov (United States)

    Bonilla-Escobar, Francisco J; Ortega-Lenis, Delia; Rojas-Mirquez, Johanna C; Ortega-Loubon, Christian

    2016-05-01

    to validate a tool that allows healthcare providers to obtain accurate information regarding Panamanian women's thoughts and feelings about vaginal examination during labour and that can be used in other Latin-American countries. A validation study based on a database from a cross-sectional study carried out in two tertiary care hospitals in Panama City, Panama. Women in the immediate postpartum period who had spontaneous labour onset and uncomplicated deliveries were included in the study from April to August 2008. Researchers used a survey designed by Lewin et al. that included 20 questions related to a patient's experience during a vaginal examination. Five constructs (factors) related to a patient's experience of vaginal examination during labour were identified: Approval (Cronbach's alpha 0.72), Perception (0.67), Rejection (0.40), Consent (0.51), and Stress (0.20). The validity of the scale and its constructs was demonstrated for obtaining information related to vaginal examination during labour, including patients' experiences with examination and healthcare staff performance. Utilisation of the scale will allow institutions to identify items that need improvement and to address these areas in order to promote the best care for patients in labour. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    Science.gov (United States)

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including information on whether this validation targets transferability or reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it would suggest a large difference in cohort characteristics, meaning we would validate the transferability of the model rather than its reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between the training and validation cohorts for one of the three tested models [Area under the Receiver Operating Curve (AUC) of the cohort differences model: 0.85], signaling that the validation leans towards transferability. Two of the three models had a lower AUC at validation (0.66 and 0.58); one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate whether a validation targeted transferability (large differences between training/validation cohorts) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.
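
    The cohort differences model can be sketched as a classifier trained to tell the two cohorts apart from patient characteristics: an AUC near 0.5 means similar cohorts (the validation tests reproducibility), while a high AUC means a shifted case-mix (it tests transferability). A minimal sketch with synthetic cohorts and hypothetical features, not the actual patient variables of the study:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

# X_train / X_valid: patient characteristics from the two cohorts
# (synthetic stand-ins for real case-mix variables).
rng = np.random.default_rng(1)
X_train = rng.normal(0.0, 1.0, size=(200, 5))
X_valid = rng.normal(0.4, 1.2, size=(154, 5))     # deliberately shifted case-mix

X = np.vstack([X_train, X_valid])
cohort = np.r_[np.zeros(len(X_train)), np.ones(len(X_valid))]

# Out-of-fold predictions avoid an optimistically biased AUC.
p = cross_val_predict(LogisticRegression(max_iter=1000), X, cohort,
                      cv=5, method="predict_proba")[:, 1]
print(f"cohort-differences AUC = {roc_auc_score(cohort, p):.2f}")
# ~0.5 -> similar cohorts (validating reproducibility);
# high (e.g. 0.85) -> different case-mix (validating transferability).
```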

  10. Regulatory perspectives on model validation in high-level radioactive waste management programs: A joint NRC/SKI white paper

    Energy Technology Data Exchange (ETDEWEB)

    Wingefors, S.; Andersson, J.; Norrby, S. [Swedish Nuclear Power lnspectorate, Stockholm (Sweden). Office of Nuclear Waste Safety; Eisenberg, N.A.; Lee, M.P.; Federline, M.V. [U.S. Nuclear Regulatory Commission, Washington, DC (United States). Office of Nuclear Material Safety and Safeguards; Sagar, B.; Wittmeyer, G.W. [Center for Nuclear Waste Regulatory Analyses, San Antonio, TX (United States)

    1999-03-01

    experience in this area, this White Paper presents the views of members of the two organisations regarding how, and to what degree, validation might be accomplished in the models used to estimate the performance of HLW repositories.

  11. Regulatory perspectives on model validation in high-level radioactive waste management programs: A joint NRC/SKI white paper

    International Nuclear Information System (INIS)

    Wingefors, S.; Andersson, J.; Norrby, S.

    1999-03-01

    experience in this area, this White Paper presents the views of members of the two organisations regarding how, and to what degree, validation might be accomplished in the models used to estimate the performance of HLW repositories

  12. A methodology for PSA model validation

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative real-time process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 9 tabs., 30 refs.

  13. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    Directory of Open Access Journals (Sweden)

    Daan Nieboer

    Full Text Available External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in a validation set. We concentrated on two scenarios: (1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set than in the development set, and (2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the two sets. Furthermore, we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development sets were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.
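
    A permutation test of this kind can be sketched as follows: pool the two cohorts, repeatedly shuffle the cohort labels, and compare the observed drop in c-statistic with its permutation distribution. The code below is a generic reconstruction under the assumption that both outcome classes appear in every permuted split; it is not necessarily the exact test evaluated in the paper.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def c_stat_permutation_test(lp_dev, y_dev, lp_val, y_val, n_perm=2000, seed=0):
    """Permutation test for the difference in c-statistic between cohorts.
    lp_*: linear predictors of the development model; y_*: observed outcomes.
    Assumes both outcome classes appear in every permuted split."""
    rng = np.random.default_rng(seed)
    observed = roc_auc_score(y_dev, lp_dev) - roc_auc_score(y_val, lp_val)
    lp, y, n_dev = np.r_[lp_dev, lp_val], np.r_[y_dev, y_val], len(y_dev)
    diffs = np.empty(n_perm)
    for i in range(n_perm):
        idx = rng.permutation(len(y))            # shuffle cohort membership
        d, v = idx[:n_dev], idx[n_dev:]
        diffs[i] = roc_auc_score(y[d], lp[d]) - roc_auc_score(y[v], lp[v])
    return observed, np.mean(np.abs(diffs) >= abs(observed))   # difference, p-value
```

    The paper's caveat then falls out naturally: genuine case-mix differences also shift the permutation distribution, so a significant result does not by itself separate a miscalibrated model from a changed population.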

  14. A model of fluid and solute exchange in the human: validation and implications.

    Science.gov (United States)

    Bert, J L; Gyenge, C C; Bowen, B D; Reed, R K; Lund, T

    2000-11-01

    In order to understand better the complex, dynamic behaviour of the redistribution and exchange of fluid and solutes administered to normal individuals or to those with acute hypovolemia, mathematical models are used in addition to direct experimental investigation. Initial validation of a model developed by our group involved data from animal experiments (Gyenge, C.C., Bowen, B.D., Reed, R.K. & Bert, J.L. 1999b. Am J Physiol 277 (Heart Circ Physiol 46), H1228-H1240). For a first validation involving humans, we compare the results of simulations with a wide range of different types of data from two experimental studies. These studies involved administration of normal saline, or of hypertonic saline with Dextran, to both normal and 10% haemorrhaged subjects. We compared simulations with data including the dynamic changes in plasma and interstitial fluid volumes (VPL and VIT), plasma and interstitial colloid osmotic pressures (πPL and πIT), haematocrit (Hct), plasma solute concentrations and transcapillary flow rates. The model predictions were overall in very good agreement with the wide range of experimental results considered. Based on the conditions investigated, the model was also validated for humans. We used the model both to investigate mechanisms associated with the redistribution and transport of fluid and solutes administered following a mild haemorrhage and to speculate on the relationship between the timing and amount of fluid infusions and subsequent blood volume expansion.

  15. Site selection and directional models of deserts used for ERBE validation targets

    Science.gov (United States)

    Staylor, W. F.

    1986-01-01

    Broadband shortwave and longwave radiance measurements obtained from the Nimbus 7 Earth Radiation Budget scanner were used to develop reflectance and emittance models for the Sahara, Gibson, and Saudi Deserts. These deserts will serve as in-flight validation targets for the Earth Radiation Budget Experiment being flown on the Earth Radiation Budget Satellite and two National Oceanic and Atmospheric Administration polar satellites. The directional reflectance model derived for the deserts was a function of the sum and product of the cosines of the solar and viewing zenith angles, and thus reciprocity existed between these zenith angles. The emittance model was related by a power law of the cosine of the viewing zenith angle.
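
    The functional forms described above can be written down directly; in the sketch below the coefficients a0-a2, eps0 and the emittance exponent b are hypothetical placeholders rather than the fitted Nimbus 7 values. Because the reflectance depends only on the sum and product of the two cosines, swapping the solar and viewing angles leaves it unchanged, which is the reciprocity noted in the abstract.

```python
import numpy as np

# Hypothetical coefficients; the fitted Nimbus 7 values are not reproduced here.
def desert_reflectance(mu_s, mu_v, a0=0.35, a1=-0.15, a2=0.10):
    """rho as a function of the sum and product of the two cosines."""
    return a0 + a1 * (mu_s + mu_v) + a2 * (mu_s * mu_v)

def desert_emittance(mu_v, eps0=0.95, b=0.05):
    """Power law in the cosine of the viewing zenith angle."""
    return eps0 * mu_v ** b

mu_s, mu_v = np.cos(np.radians(30.0)), np.cos(np.radians(45.0))
print(desert_reflectance(mu_s, mu_v))
print(desert_reflectance(mu_v, mu_s))   # identical: reciprocity between zenith angles
print(desert_emittance(mu_v))
```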

  16. Validating and Verifying Biomathematical Models of Human Fatigue

    Science.gov (United States)

    Martinez, Siera Brooke; Quintero, Luis Ortiz; Flynn-Evans, Erin

    2015-01-01

    Airline pilots experience acute and chronic sleep deprivation, sleep inertia, and circadian desynchrony due to the need to schedule flight operations around the clock. This sleep loss and circadian desynchrony give rise to cognitive impairments, reduced vigilance and inconsistent performance. Several biomathematical models, based principally on patterns observed in circadian rhythms and homeostatic drive, have been developed to predict a pilot's level of fatigue or alertness. These models allow the Federal Aviation Administration (FAA) and commercial airlines to make decisions about pilot capabilities and flight schedules. Although these models have been validated in a laboratory setting, they have not been thoroughly tested in operational environments, where uncontrolled factors such as environmental sleep disrupters, caffeine use and napping may affect actual pilot alertness and performance. We will compare the predictions of three prominent biomathematical fatigue models (the McCauley Model, the Harvard Model, and the privately sold SAFTE-FAST Model) to actual measures of alertness and performance. We collected sleep logs, movement and light recordings, psychomotor vigilance task (PVT) data, and urinary melatonin (a marker of circadian phase) from 44 pilots in a short-haul commercial airline over one month. We will statistically compare the model predictions with lapses on the PVT and with circadian phase, and will calculate the sensitivity and specificity of each model's predictions under different scheduling conditions. Our findings will aid operational decision-makers in determining the reliability of each model under real-world scheduling situations.
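
    Models of this family generally combine a homeostatic sleep-pressure term with a circadian oscillation. A minimal two-process sketch is shown below; the time constant, amplitude and phase are illustrative values, not parameters of the McCauley, Harvard or SAFTE-FAST models.

```python
import numpy as np

# Minimal two-process alertness sketch: homeostatic pressure S + circadian drive C.
# Time constants, amplitude and phase are illustrative, not the cited models' values.
def alertness(hours_awake, clock_hour, tau_rise=18.2, amp=0.12, phase=16.8):
    S = 1.0 - np.exp(-hours_awake / tau_rise)                    # sleep pressure builds
    C = amp * np.cos(2.0 * np.pi * (clock_hour - phase) / 24.0)  # circadian oscillation
    return 1.0 - S + C                                           # higher = more alert

t_awake = np.arange(0.0, 24.0, 2.0)
clock = (8.0 + t_awake) % 24                                     # subject woke at 08:00
for ta, ck in zip(t_awake, clock):
    print(f"awake {ta:4.1f} h, clock {ck:5.1f}: alertness {alertness(ta, ck):+.2f}")
```

    Operational validation then amounts to correlating a score like this with observed PVT lapses across real duty schedules, which is exactly where uncontrolled factors such as caffeine and napping enter.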

  17. Development of the Galaxy Chronic Obstructive Pulmonary Disease (COPD) Model Using Data from ECLIPSE: Internal Validation of a Linked-Equations Cohort Model.

    Science.gov (United States)

    Briggs, Andrew H; Baker, Timothy; Risebrough, Nancy A; Chambers, Mike; Gonzalez-McQuire, Sebastian; Ismaila, Afisi S; Exuzides, Alex; Colby, Chris; Tabberer, Maggie; Muellerova, Hana; Locantore, Nicholas; Rutten van Mölken, Maureen P M H; Lomas, David A

    2017-05-01

    The recent joint International Society for Pharmacoeconomics and Outcomes Research / Society for Medical Decision Making Modeling Good Research Practices Task Force emphasized the importance of conceptualizing and validating models. We report a new model of chronic obstructive pulmonary disease (COPD) (part of the Galaxy project) founded on a conceptual model, implemented using a novel linked-equation approach, and internally validated. An expert panel developed a conceptual model including causal relationships between disease attributes, progression, and final outcomes. Risk equations describing these relationships were estimated using data from the Evaluation of COPD Longitudinally to Identify Predictive Surrogate Endpoints (ECLIPSE) study, with costs estimated from the TOwards a Revolution in COPD Health (TORCH) study. Implementation as a linked-equation model enabled direct estimation of health service costs and quality-adjusted life years (QALYs) for COPD patients over their lifetimes. Internal validation compared 3 years of predicted cohort experience with ECLIPSE results. At 3 years, the Galaxy COPD model predictions of annual exacerbation rate and annual decline in forced expiratory volume in 1 second fell within the ECLIPSE data confidence limits, although 3-year overall survival was outside the observed confidence limits. Projection of the risk equations over time permitted extrapolation to patient lifetimes. Averaging the predicted cost/QALY outcomes for the different patients within the ECLIPSE cohort gives an estimated lifetime cost of £25,214 (undiscounted) / £20,318 (discounted) and lifetime QALYs of 6.45 (undiscounted) / 5.24 (discounted) per ECLIPSE patient. A new form of model for COPD was conceptualized, implemented, and internally validated, based on a series of linked equations using epidemiological data (ECLIPSE) and cost data (TORCH). This Galaxy model predicts COPD outcomes from treatment effects on disease attributes such as lung function

  18. Validity And Practicality of Experiment Integrated Guided Inquiry-Based Module on Topic of Colloidal Chemistry for Senior High School Learning

    Science.gov (United States)

    Andromeda, A.; Lufri; Festiyed; Ellizar, E.; Iryani, I.; Guspatni, G.; Fitri, L.

    2018-04-01

    This Research & Development study aims to produce a valid and practical experiment-integrated, guided inquiry-based module on the topic of colloidal chemistry. The 4D instructional design model was selected for this study. A limited trial of the product was conducted at SMAN 7 Padang. The instruments used were validity and practicality questionnaires, and the validity and practicality data were analyzed using the Kappa moment. Analysis of the data shows that the Kappa moment for validity was 0.88, indicating a very high degree of validity. The Kappa moments for practicality, from students and teachers, were 0.89 and 0.95 respectively, indicating a high degree of practicality. Analysis of the modules filled in by students shows that 91.37% of students could correctly answer the critical thinking, exercise, prelab, postlab and worksheet questions asked in the module. These findings indicate that the experiment-integrated guided inquiry-based module on the topic of colloidal chemistry is valid and practical for chemistry learning in senior high school.
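
    The Kappa moment used here is read as a kappa-type chance-corrected agreement statistic, κ = (ρ − ρe)/(1 − ρe); the sketch below computes Cohen's kappa for two validators' binary ratings. This interpretation and the ratings are assumptions for illustration, not data from the study.

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical judgements (e.g. valid/invalid)."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    po = np.mean(r1 == r2)                                       # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)  # chance agreement
    return (po - pe) / (1.0 - pe)

# Two hypothetical validators rating 10 module items on a binary valid/invalid scale
a = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]
b = [1, 1, 1, 0, 1, 1, 0, 1, 0, 1]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```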

  19. Cross-validation pitfalls when selecting and assessing regression and classification models.

    Science.gov (United States)

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications, including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing, which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning, we define two algorithms (repeated grid-search cross-validation and double cross-validation) and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation in prediction performance that results from choosing different splits of the dataset in V-fold cross-validation needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
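
    A compact illustration of the nested scheme (inner loop for hyperparameter tuning, outer loop for assessment, the whole procedure repeated over different random splits) can be written with scikit-learn. The estimator, grid and synthetic data below are placeholders, not the QSAR setup of the paper.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

X, y = make_regression(n_samples=200, n_features=50, noise=10.0, random_state=0)
param_grid = {"alpha": [0.01, 0.1, 1.0, 10.0]}      # illustrative grid

scores = []
for rep in range(5):                                # repeat over different splits
    inner = KFold(n_splits=5, shuffle=True, random_state=rep)
    outer = KFold(n_splits=5, shuffle=True, random_state=100 + rep)
    model = GridSearchCV(Ridge(), param_grid, cv=inner)    # tuning: inner loop
    scores.append(cross_val_score(model, X, y, cv=outer))  # assessment: outer loop

scores = np.concatenate(scores)
print(f"R^2 = {scores.mean():.3f} +/- {scores.std():.3f} over {scores.size} outer folds")
```

    The spread of the outer-fold scores across repetitions is precisely the split-induced variance the authors argue must be reported alongside the point estimate.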

  20. Validation method training: nurses' experiences and ratings of work climate.

    Science.gov (United States)

    Söderlund, Mona; Norberg, Astrid; Hansebo, Görel

    2014-03-01

    Training nursing staff in communication skills can impact on the quality of care for residents with dementia and contributes to nurses' job satisfaction. Changing attitudes and practices takes time and energy and can affect the entire nursing staff, not just the nurses directly involved in a training programme. Therefore, it seems important to study nurses' experiences of a training programme and any influence of the programme on work climate among the entire nursing staff. To explore nurses' experiences of a 1-year validation method training programme conducted in a nursing home for residents with dementia and to describe ratings of work climate before and after the programme. A mixed-methods approach. Twelve nurses participated in the training and were interviewed afterwards. These individual interviews were tape-recorded and transcribed, then analysed using qualitative content analysis. The Creative Climate Questionnaire was administered before (n = 53) and after (n = 56) the programme to the entire nursing staff in the participating nursing home wards and analysed with descriptive statistics. Analysis of the interviews resulted in four categories: being under extra strain, sharing experiences, improving confidence in care situations and feeling uncertain about continuing the validation method. The results of the questionnaire on work climate showed higher mean values in the assessment after the programme had ended. The training strengthened the participating nurses in caring for residents with dementia, but posed an extra strain on them. These nurses also described an extra strain on the entire nursing staff that was not reflected in the results from the questionnaire. The work climate at the nursing home wards might have made it easier to conduct this extensive training programme. Training in the validation method could develop nurses' communication skills and improve their handling of complex care situations. © 2013 Blackwell Publishing Ltd.

  1. The Mistra experiment for field containment code validation: first results

    International Nuclear Information System (INIS)

    Caron-Charles, M.; Blumenfeld, L.

    2001-01-01

    The MISTRA facility is a large-scale experiment designed for the validation of multi-dimensional thermal-hydraulics codes. A short description of the facility, the set-up of the instrumentation and the test program are presented. Then the first experimental results, studying helium injection into the containment, and their calculations are detailed. (author)

  2. Developing a model for validation and prediction of bank customer ...

    African Journals Online (AJOL)

    Credit risk is the most important risk faced by banks. The main approaches of a bank to reducing credit risk are correct validation using the final status and the validation of model parameters. High levels of bank reserves and lost or outstanding facilities indicate the lack of appropriate validation models in the banking network.

  3. INPUT DATA OF BURNING WOOD FOR CFD MODELLING USING SMALL-SCALE EXPERIMENTS

    Directory of Open Access Journals (Sweden)

    Petr Hejtmánek

    2017-12-01

    Full Text Available The paper presents an option for acquiring simplified input data for the modelling of burning wood in CFD programmes. The option lies in combining data from small- and molecular-scale experiments in order to describe the material with a one-reaction material property. Such a virtual material would spread fire and develop the fire according to the surrounding environment, and it could be extinguished, without using a complex molecular reaction description. A series of experiments including elemental analysis, thermogravimetric analysis, differential thermal analysis and combustion analysis was performed. Then an FDS model of burning pine wood in a cone calorimeter was built, in which those values were used. The model was validated against the HRR (Heat Release Rate) from the real cone calorimeter experiment. The results show that for the purpose of CFD modelling the effective heat of combustion, which is one of the basic material properties for fire modelling and affects the total intensity of burning, should be used. Using the net heat of combustion in the model leads to higher values of HRR in comparison with the real experiment data. Considering all the results shown in this paper, it is possible to simulate the burning of wood using the extrapolated data obtained in small-scale experiments.
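
    The distinction the authors draw matters because the effective heat of combustion can be extracted from the cone calorimeter record itself: the integral of the heat release rate divided by the total mass lost. A sketch with a synthetic HRR curve and made-up numbers:

```python
import numpy as np

# Synthetic cone-calorimeter record; all numbers are made up for illustration.
t = np.arange(0.0, 600.0, 5.0)                     # time, s
hrr = 120.0 * np.exp(-((t - 120.0) / 90.0) ** 2)   # heat release rate, kW/m^2
mass_loss = 1.3                                    # total mass lost, kg/m^2

# Trapezoidal integration of HRR gives total heat released per unit area.
total_heat = float(np.sum(0.5 * (hrr[1:] + hrr[:-1]) * np.diff(t)))  # kJ/m^2
dhc_eff = total_heat / mass_loss / 1000.0          # effective heat of combustion, MJ/kg
print(f"effective heat of combustion ~ {dhc_eff:.1f} MJ/kg")
```

    Using this measured quotient rather than the (higher) net heat of combustion is what keeps the simulated burning intensity in line with the experiment.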

  4. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  5. A validation of DRAGON based on lattice experiments

    International Nuclear Information System (INIS)

    Marleau, G.

    1996-01-01

    Here we address the validation of DRAGON using the Chalk River Laboratory experimental database, which has already been used for the validation of other codes. Because of the large variety of information for different fuel and moderator types compiled in this database, the most basic modules of DRAGON are thoroughly tested. The general behaviour observed with DRAGON is very good. Its main weakness is seen in the self-shielding calculation, where the correction applied to the inner fuel pin seems to be overestimated with respect to the outer fuel pins. One question left open in this paper concerns the need for inserting end-regions in the DRAGON cells when the heterogeneous B1 leakage model is used. (author)

  6. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the 'philosophy' behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  7. PIV Validation of 3D Multicomponent Model for Cold Spray Within Nitrogen and Helium Supersonic Flow Field

    Science.gov (United States)

    Faizan-Ur-Rab, M.; Zahiri, S. H.; Masood, S. H.; Jahedi, M.; Nagarajah, R.

    2017-06-01

    This study presents the validation of a three-dimensional multicomponent model of the cold spray process using two particle image velocimetry (PIV) experiments. The k-ε type 3D model developed for spherical titanium particles was validated against the measured titanium particle velocities within nitrogen and helium supersonic jets. The 3D model predicted lower values of particle velocity than the PIV experimental study that used irregularly shaped titanium particles, while its results were consistent with the PIV experiment that used spherical titanium powder. The 3D model simulation of particle velocity within the helium and nitrogen jets was coupled with an estimation of titanium particle temperature, in consideration of the fact that cold spray particle temperature is difficult and expensive to measure owing to the considerably lower temperature of the particles compared with thermal spray. The model predicted an interesting pattern of particle size distribution with respect to the location of impact, with a concentration of finer particles close to the jet center. It is believed that the 3D model outcomes for particle velocity, temperature and impact location could be a useful tool to optimize the system design, the deposition process and the mechanical properties of additively manufactured cold spray structures.
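
    The particle velocities such a model predicts are governed, to first order, by gas drag on a sphere. A heavily simplified sketch follows, holding the gas state constant and using a fixed drag coefficient, both of which the full 3D multicomponent model resolves properly; all numbers are illustrative.

```python
import numpy as np

# Minimal drag model for a spherical Ti particle in a cold-spray gas jet.
# Gas conditions are held constant and CD is fixed -- both simplifications.
rho_p, d_p = 4500.0, 20e-6        # titanium density (kg/m^3), particle diameter (m)
rho_g, u_g = 0.5, 1000.0          # gas density (kg/m^3) and velocity (m/s), assumed
CD = 0.5                          # drag coefficient, assumed constant

m = rho_p * np.pi * d_p**3 / 6.0  # particle mass
A = np.pi * d_p**2 / 4.0          # frontal area

u, x, dt = 0.0, 0.0, 1e-7
while x < 0.1:                    # integrate over a 100 mm acceleration length
    drag = 0.5 * rho_g * CD * A * (u_g - u) * abs(u_g - u)
    u += drag / m * dt
    x += u * dt
print(f"impact velocity ~ {u:.0f} m/s")
```

    Because the drag force scales with d_p squared while mass scales with d_p cubed, finer particles follow the gas more closely, which is consistent with the size segregation toward the jet center noted in the abstract.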

  8. Experimental Validation of Flow Force Models for Fast Switching Valves

    DEFF Research Database (Denmark)

    Bender, Niels Christian; Pedersen, Henrik Clemmensen; Nørgård, Christian

    2017-01-01

    This paper comprises a detailed study of the forces acting on a Fast Switching Valve (FSV) plunger. The objective is to investigate to what extent different models are valid for design purposes. These models depend on the geometry of the moving plunger and the properties of the surroun...... to compare and validate different models, where an effort is directed towards capturing the fluid squeeze effect just before material-on-material contact. The test data is compared with simulation data relying solely on analytic formulations. The general dynamics of the plunger is validated......

  9. Modelling of macrosegregation in steel ingots: benchmark validation and industrial application

    International Nuclear Information System (INIS)

    Li Wensheng; Shen Houfa; Liu Baicheng; Shen Bingzhen

    2012-01-01

    The paper presents the recent progress made by the authors on the modelling of macrosegregation in steel ingots. A two-phase macrosegregation model was developed that incorporates descriptions of heat transfer, melt convection, solute transport, and solid movement on the process scale, with microscopic relations for grain nucleation and growth. The formation of pipe shrinkage at the ingot top is also taken into account in the model. Firstly, a recently proposed numerical benchmark test of macrosegregation was used to verify the model. Then, the model was applied to predict the macrosegregation in a benchmark industrial-scale steel ingot, and the predictions were validated against experimental data from the literature. Furthermore, a macrosegregation experiment on an industrial 53-t steel ingot was performed and the simulation results were compared with the measurements. It is indicated that the typical macrosegregation patterns encountered in steel ingots, including a positively segregated zone in the hot top and negative segregation in the bottom part of the ingot, are well reproduced by the model.

  10. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage

  11. Preliminary validation of a Monte Carlo model for IMRT fields

    International Nuclear Information System (INIS)

    Wright, Tracy; Lye, Jessica; Mohammadi, Mohammad

    2011-01-01

    Full text: A Monte Carlo model of an Elekta linac, validated for medium to large (10-30 cm) symmetric fields, has been investigated for small, irregular and asymmetric fields suitable for IMRT treatments. The model has been validated with field segments using radiochromic film in solid water. The modelled positions of the multileaf collimator (MLC) leaves have been validated using EBT film. In the model, electrons with a narrow energy spectrum are incident on the target, and all components of the linac head are included. The MLC is modelled using the EGSnrc MLCE component module. For the validation, a number of single complex IMRT segments with dimensions of approximately 1-8 cm were delivered to film in solid water. The same segments were modelled with EGSnrc by adjusting the MLC leaf positions in the model validated for 10 cm symmetric fields. Dose distributions along the centre of each MLC leaf, as determined by both methods, were compared. A picket fence test was also performed to confirm the MLC leaf positions. 95% of the points in the modelled dose distribution along the leaf axis agree with the film measurement to within 1%/1 mm for dose difference and distance to agreement. Areas of most deviation occur in the penumbra region. A system has been developed to calculate the MLC leaf positions in the model for any planned field size.
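
    The 1%/1 mm comparison can be sketched as a composite dose-difference / distance-to-agreement test on one-dimensional profiles. The implementation below is a generic, simplified criterion with synthetic penumbra profiles, not the exact analysis software used in the study.

```python
import numpy as np

def dd_dta_pass_rate(x, measured, computed, dose_tol=0.01, dist_tol=1.0):
    """Fraction of points passing 1%/1 mm: dose difference within 1% of the
    maximum dose OR a computed point with matching dose within 1 mm (DTA).
    Simplified 1-D composite criterion; x in mm, doses in arbitrary units."""
    d_max = measured.max()
    n_pass = 0
    for xi, mi in zip(x, measured):
        dd = abs(computed[np.argmin(np.abs(x - xi))] - mi) <= dose_tol * d_max
        close = np.abs(computed - mi) <= dose_tol * d_max        # dose-matched points
        dta = close.any() and np.min(np.abs(x[close] - xi)) <= dist_tol
        if dd or dta:
            n_pass += 1
    return n_pass / len(x)

x = np.arange(0.0, 80.0, 0.5)                       # mm along an MLC leaf
measured = 1.0 / (1.0 + np.exp((x - 40.0) / 2.0))   # synthetic penumbra profile
computed = 1.0 / (1.0 + np.exp((x - 40.6) / 2.0))   # model shifted by 0.6 mm
print(f"pass rate: {100 * dd_dta_pass_rate(x, measured, computed):.1f}%")
```

    In the flat regions the dose-difference criterion carries the test, while in the steep penumbra the DTA branch does, mirroring where the paper reports its largest deviations.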

  12. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible; therefore, miniaturization of world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  13. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible; therefore, miniaturization of world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  14. Study of archaeological analogs for the validation of nuclear glass long-term behavior models

    International Nuclear Information System (INIS)

    Verney-Carron, A.

    2008-10-01

    Fractured archaeological glass blocks collected from a shipwreck discovered in the Mediterranean Sea near Embiez Island (Var) were investigated because of their morphological analogy with vitrified nuclear waste and their known and stable environment. These glasses are fractured due to fast cooling after melting (like nuclear glass) and have been altered for 1800 years in seawater. This work resulted in the development and validation of a geochemical model able to simulate the alteration of a fractured archaeological glass block over 1800 years. The kinetics associated with the different mechanisms (interdiffusion and dissolution) and the thermodynamic parameters of the model were determined by leaching experiments. The model, implemented in the HYTEC software, was used to simulate crack alteration over 1800 years. The consistency between simulated alteration thicknesses and the data measured on the glass blocks validates the capacity of the model to predict long-term alteration. The model is able to account for the results of the characterization of the crack network and its state of alteration: the cracks in the border zone are the most altered, due to fast renewal of the leaching solution, whereas internal cracks are thin because of complex interactions between glass alteration and the transport of elements in solution (the influence of initial crack aperture and of crack sealing). The lowest alteration thicknesses, as well as their variability, can thus be explained. The analogous behaviour of archaeological and nuclear glasses in leaching experiments makes possible the transposition of the model to nuclear glass in a geological repository. (author)

  15. Design and experiments with scale model of a ship with dynamic positioning system

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Carlos Eduardo S.; Morishita, Helio M.; Moratelli Junior, Lazaro; Lago, Glenan A.; Tannuri, Eduardo A. [Universidade de Sao Paulo (USP), SP (Brazil)

    2008-07-01

    Dynamic Positioning Systems (DPS) are used to keep a floating vessel at a specific position, or to follow a pre-defined path, through the action of controlled propellers. This paper describes a facility used to experimentally analyze DPS and to validate a numerical simulator. It is composed of a scale model of a DP tanker with 3 thrusters, a measurement system based on computational vision, and control software with the same DP algorithms used in industrial systems. Simple wind and current generators were also implemented. This work shows preliminary results of the experiments, which have been useful to calibrate the simulator and to validate the mathematical model. (author)

  16. Experiments to Populate and Validate a Processing Model for Polyurethane Foam: Additional Data for Structural Foams

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Rekha R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Celina, Mathias C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Giron, Nicholas Henry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Long, Kevin Nicholas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Russick, Edward M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    We are developing computational models to help understand the manufacturing processes, final properties and aging of the structural foam polyurethane PMDI. The resulting model predictions of density and cure gradients from the manufacturing process will be used as input to foam heat transfer and mechanical models. BKC 44306 PMDI-10 and BKC 44307 PMDI-18 are the most prevalent foams used in structural parts. Experiments needed to parameterize models of the reaction kinetics and the equations of motion during the foam blowing stages were described for BKC 44306 PMDI-10 in the first of this report series (Mondy et al. 2014). BKC 44307 PMDI-18 is a new foam that will be used to make relatively dense structural supports via overpacking. It uses a different catalyst than those in the BKC 44306 family of foams; hence, we expect that the reaction kinetics models must be modified. Here we detail the experiments needed to characterize the reaction kinetics of BKC 44307 PMDI-18 and suggest parameters for the model based on these experiments. In addition, the second part of this report describes data taken to provide input to the preliminary nonlinear viscoelastic structural response model developed for BKC 44306 PMDI-10 foam. We show that the standard cure schedule used by KCP does not fully cure the material, and that upon temperature elevation above 150°C, oxidation or decomposition reactions occur that alter the composition of the foam. These findings suggest that achieving a fully cured foam part with this formulation may not be possible through thermal curing. As such, viscoelastic characterization procedures developed for curing thermosets can provide only approximate material properties, since the state of the material continuously evolves during tests.

  17. Application of a computational situation assessment model to human system interface design and experimental validation of its effectiveness

    International Nuclear Information System (INIS)

    Lee, Hyun-Chul; Koh, Kwang-Yong; Seong, Poong-Hyun

    2013-01-01

    Highlights: ► We validate the effectiveness of a proposed procedure through an experiment. ► The proposed procedure addresses the salient coding of key information. ► Salience coding was found to affect operators' attention significantly. ► First observation of the key information quickly guided operators to correct situation awareness. ► The proposed procedure was validated as effective for better situation awareness. - Abstract: To evaluate the effects of human cognitive characteristics on situation awareness, a computational situation assessment model of nuclear power plant operators has been developed, as well as a procedure for applying the developed model to the design of human system interfaces (HSIs). The concept of the proposed procedure is to identify the key information source that is expected to guarantee fast and accurate diagnosis when operators attend to it. The developed computational model is used to search the diagnostic paths and the key information source. In this study, an experiment with twelve trained participants was executed to validate the effectiveness of the proposed procedure. Eighteen scenarios covering various accidents were administered twice for each subject, and experimental data were collected and analyzed. The data analysis validated that the salience level of information sources significantly influences the attention of operators, and that first observation of the key information sources leads operators to a quick and correct situation assessment. Therefore, we conclude that the proposed procedure for applying the developed model to HSI design is effective.

  18. A Practical Approach to Validating a PD Model

    NARCIS (Netherlands)

    Medema, L.; Koning, de R.; Lensink, B.W.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  19. A practical approach to validating a PD model

    NARCIS (Netherlands)

    Medema, Lydian; Koning, Ruud H.; Lensink, Robert; Medema, M.

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  20. Base Flow Model Validation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  1. Applicability of U.S. Army tracer test data to model validation needs of ERDA

    International Nuclear Information System (INIS)

    Shearer, D.L.; Minott, D.H.

    1976-06-01

    This report covers the first phase of an atmospheric dispersion model validation project sponsored by the Energy Research and Development Administration (ERDA). The project will employ dispersion data generated during an extensive series of field tracer experiments that were part of a meteorological research program conducted by the U.S. Army Dugway Proving Ground, Utah, from the late 1950s to the early 1970s. The tests were conducted at several locations in the U.S., South America, Germany, and Norway, chosen to typify the effects of certain environmental factors on atmospheric dispersion. The purpose of the Phase I work of this project was to identify applicable portions of the Army data, obtain and review that data, and make recommendations for its use in atmospheric dispersion model validation. This report presents key information in three formats. The first is a tabular listing of the Army dispersion test reports summarizing the test data contained in each report. This listing is presented in six separate tables, with each tabular list representing a different topical area based on model validation requirements and the nature of the Army data base. The second format is a series of discussions of the Army test information assigned to each of the six topical areas. These discussions relate the extent and quality of the available data, as well as its prospective use for model validation. The third format is a series of synopses of each Army test report.

  2. The turbulent viscosity models and their experimental validation; Les modeles de viscosite turbulente et leur validation experimentale

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This workshop on turbulent viscosity models and their experimental validation was organized by the 'Convection' section of the French Society of Thermal Engineers. Of the 9 papers presented during this workshop, 8 deal with the modeling of turbulent flows inside combustion chambers, turbo-machineries or other energy-related applications, and have been selected for ETDE. (J.S.)

  3. Characterization of a CLYC detector and validation of the Monte Carlo Simulation by measurement experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Suk; Ye, Sung Joon [Seoul National University, Seoul (Korea, Republic of); Smith, Martin B.; Koslowsky, Martin R. [Bubble Technology Industries Inc., Chalk River (Canada); Kwak, Sung Woo [Korea Institute of Nuclear Nonproliferation And Control (KINAC), Daejeon (Korea, Republic of); Kim Gee Hyun [Sejong University, Seoul (Korea, Republic of)

    2017-03-15

    Simultaneous detection of neutrons and gamma rays has become much more practicable by taking advantage of the good gamma-ray discrimination afforded by the pulse shape discrimination (PSD) technique. Recently, we introduced a commercial CLYC system in Korea and performed initial characterization and simulation studies of the CLYC detector system to provide references for the future implementation of the dual-mode scintillator system in various studies and applications. We evaluated a CLYC detector with 95% 6Li enrichment using various gamma-ray sources and a 252Cf neutron source, validating our Monte Carlo simulation results against measurement experiments. Absolute full-energy peak efficiency values were calculated for the gamma-ray sources and the neutron source using MCNP6 and compared with measurements of the calibration sources. In addition, the behavioral characteristics of neutrons were validated by comparing simulations and experiments on neutron moderation with various polyethylene (PE) moderator thicknesses. Both results showed good agreement in the overall characteristics of the gamma and neutron detection efficiencies, with a consistent ~20% discrepancy. Furthermore, moderation of the neutrons emitted from 252Cf showed similarities between the simulation and the experiment in terms of their relative ratios as a function of PE moderator thickness. The CLYC detector system was thus characterized for its energy resolution and detection efficiency, and Monte Carlo simulations of the detector system were validated experimentally. Validation of the simulation results in the overall trend of the CLYC detector behavior provides the fundamental basis and justification for follow-up Monte Carlo simulation studies for the development of our dual-particle imager using a rotational modulation collimator.

  4. Measures of agreement between computation and experiment: validation metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin; Oberkampf, William Louis

    2005-08-01

    With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
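
    The abstract's central construction, an error measure accompanied by a confidence interval inferred from experimental replicates, can be illustrated compactly. The following is a minimal sketch in that spirit, not the authors' implementation; the function name and the choice of a pointwise Student-t interval on the experimental mean are assumptions made here for illustration.

    ```python
    import numpy as np
    from scipy import stats

    def validation_metric(y_model, y_exp_replicates, confidence=0.90):
        """Pointwise model-error estimate with a confidence interval derived
        from replicate experiments (in the spirit of a CI-based validation
        metric).  y_exp_replicates has shape (n_replicates, n_points)."""
        y_exp = np.asarray(y_exp_replicates)
        n = y_exp.shape[0]
        mean_exp = y_exp.mean(axis=0)
        s = y_exp.std(axis=0, ddof=1)                   # sample std of replicates
        t = stats.t.ppf(0.5 + confidence / 2.0, df=n - 1)
        half_width = t * s / np.sqrt(n)                 # CI on the true exp. mean
        error = np.asarray(y_model) - mean_exp          # estimated model error
        return error, error - half_width, error + half_width
    ```

    Reporting an error band rather than a single number keeps the effect of experimental measurement uncertainty visible in the accuracy assessment, which is the point the abstract emphasizes.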

  5. Development and validation of a 10-year-old child ligamentous cervical spine finite element model.

    Science.gov (United States)

    Dong, Liqiang; Li, Guangyao; Mao, Haojie; Marek, Stanley; Yang, King H

    2013-12-01

    Although a number of finite element (FE) adult cervical spine models have been developed to understand the injury mechanisms of the neck in automotive related crash scenarios, there have been fewer efforts to develop a child neck model. In this study, a 10-year-old ligamentous cervical spine FE model was developed for application in the improvement of pediatric safety related to motor vehicle crashes. The model geometry was obtained from medical scans and meshed using a multi-block approach. Appropriate properties based on review of literature in conjunction with scaling were assigned to different parts of the model. Child tensile force-deformation data in three segments, Occipital-C2 (C0-C2), C4-C5 and C6-C7, were used to validate the cervical spine model and predict failure forces and displacements. Design of computer experiments was performed to determine failure properties for intervertebral discs and ligaments needed to set up the FE model. The model-predicted ultimate displacements and forces were within the experimental range. The cervical spine FE model was validated in flexion and extension against the child experimental data in three segments, C0-C2, C4-C5 and C6-C7. Other model predictions were found to be consistent with the experimental responses scaled from adult data. The whole cervical spine model was also validated in tension, flexion and extension against the child experimental data. This study provided methods for developing a child ligamentous cervical spine FE model and to predict soft tissue failures in tension.

  6. First experience from in-core sensor validation based on correlation and neuro-fuzzy techniques

    International Nuclear Information System (INIS)

    Figedy, S.

    2011-01-01

    In this work, new types of nuclear reactor in-core sensor validation methods are outlined. The first is based on a combination of correlation coefficients and mutual information indices, which reflect the correlation of signals in the linear and nonlinear regions. This method may be supplemented by wavelet-transform-based signal feature extraction, pattern recognition by artificial neural networks, and fuzzy-logic-based decision making. The second is based on neuro-fuzzy modeling of the residuals between experimental values and their theoretical counterparts obtained from reactor core simulator calculations. The first experience with this approach is described and further improvements to enhance the reliability of the outcome are proposed (Author)
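
    As a concrete illustration of the first method's signal-consistency indices, the short sketch below computes a linear correlation coefficient and a histogram-based mutual information estimate for a pair of in-core sensor signals; a redundant channel whose indices drift from their reference values would be a candidate for flagging. This is an illustrative reading of the abstract, not the author's code, and the bin count is an assumed tuning parameter.

    ```python
    import numpy as np

    def signal_consistency(x, y, bins=32):
        """Linear (Pearson correlation) and nonlinear (mutual information)
        association indices between two sensor signals x and y."""
        r = np.corrcoef(x, y)[0, 1]                  # linear-region index
        joint = np.histogram2d(x, y, bins=bins)[0]
        p = joint / joint.sum()                      # joint probability estimate
        px = p.sum(axis=1, keepdims=True)            # marginal of x
        py = p.sum(axis=0, keepdims=True)            # marginal of y
        nz = p > 0                                   # avoid log(0)
        mi = (p[nz] * np.log(p[nz] / (px @ py)[nz])).sum()  # nonlinear index
        return r, mi
    ```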

  7. ASTEC V2 severe accident integral code: Fission product modelling and validation

    International Nuclear Information System (INIS)

    Cantrel, L.; Cousin, F.; Bosland, L.; Chevalier-Jabet, K.; Marchetto, C.

    2014-01-01

    One main goal of the severe accident integral code ASTEC V2, jointly developed by IRSN and GRS for more than 15 years, is to simulate the overall behaviour of fission products (FP) in a damaged nuclear facility. ASTEC applications include source term determination, level 2 Probabilistic Safety Assessment (PSA2) studies including the determination of uncertainties, accident management studies, and physical analyses of FP experiments to improve the understanding of the phenomenology. ASTEC is a modular code in which each module implements the models for one part of the phenomenology: the release of FPs and structural materials from degraded fuel in the ELSA module; the transport through the reactor coolant system, approximated as a sequence of control volumes, in the SOPHAEROS module; and the radiochemistry inside the containment building in the IODE module. Three other modules, CPA, ISODOP and DOSE, compute, respectively, the deposition rate of aerosols inside the containment, the activities of the isotopes as a function of time, and the gaseous dose rate, which is needed to model radiochemistry in the gaseous phase. In ELSA, release models are semi-mechanistic and have been validated against a wide range of experimental data, notably the VERCORS experiments. In SOPHAEROS, the models can be divided into two parts: vapour-phase phenomena and aerosol-phase phenomena. In IODE, iodine and ruthenium chemistry are modelled with a semi-mechanistic approach; these FPs can form volatile species and are particularly important in terms of potential radiological consequences. The models in these 3 modules are based on a wide experimental database, resulting in large part from international programmes, and are considered to represent the state of the art of R&D knowledge. This paper illustrates some FP modelling capabilities of ASTEC, and computed values are compared with experimental results that form part of the validation matrix.

  8. Validation of mathematical models to describe fluid dynamics of a cold riser by gamma ray attenuation

    International Nuclear Information System (INIS)

    Melo, Ana Cristina Bezerra Azedo de

    2004-12-01

    The fluid dynamic behavior of a riser in a cold FCC model was investigated by means of catalyst concentration distributions measured by gamma attenuation and simulated with a mathematical model. In the riser of the cold model (MEF), 0.032 m in diameter and 2.30 m in length, a fluidized bed of air and FCC catalyst circulates. The MEF is operated under automatic control, with instruments for measuring the fluid dynamic variables. The axial catalyst concentration distribution was measured using an Am-241 gamma source and a NaI detector coupled to a multichannel analyzer provided with software for data acquisition and evaluation. The MEF was adapted for validating fluid dynamic models that describe the flow in the riser, for example by introducing an injector for controlling the circulating solid flow. Mathematical models were selected from the literature, analyzed and tested to simulate the fluid dynamics of the riser. A methodology for validating fluid dynamic models was studied and implemented. The stages of the work were developed according to the validation methodology: planning of experiments, study of the equations that describe the fluid dynamics, application of computational solvers, and comparison with experimental data. Operational sequences were carried out keeping the MEF conditions constant while measuring the catalyst concentration and simultaneously measuring the fluid dynamic variables, the velocity of the components, and the pressure drop in the riser. Following this, simulated and experimental values were compared and a statistical data treatment was performed, aiming at the precision required to validate the fluid dynamic model. The comparison tests between experimental and simulated data were carried out under the validation criteria. The fluid dynamic behavior of the riser was analyzed and the results and their agreement with the literature were discussed. The adopted model was validated under the MEF operational conditions, for a 3 to 6 m/s gas velocity in the riser and a slip
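
    The physical link between the measured gamma intensities and the catalyst concentration in the riser is the Beer-Lambert attenuation law. A hedged sketch of that conversion follows; it is illustrative only, assumes gas-phase attenuation is negligible and that an empty-riser reference intensity I0 is available, and the function name is hypothetical.

    ```python
    import numpy as np

    def solids_concentration(I, I0, mu_m, path_length):
        """Chordal-average catalyst concentration (kg/m^3) from gamma-ray
        attenuation, inverting I = I0 * exp(-mu_m * rho * L), where mu_m is
        the catalyst mass attenuation coefficient (m^2/kg) and L the chord
        length through the riser (m)."""
        return np.log(np.asarray(I0) / np.asarray(I)) / (mu_m * path_length)
    ```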

  9. Non-Linear Slosh Damping Model Development and Validation

    Science.gov (United States)

    Yang, H. Q.; West, Jeff

    2015-01-01

    Propellant tank slosh dynamics are typically represented by a mechanical spring-mass-damper model. This mechanical model is then included in the equations of motion of the entire vehicle for Guidance, Navigation and Control (GN&C) analysis. For a partially filled smooth-wall propellant tank, the critical damping based on the classical empirical correlation is as low as 0.05%. Due to this low value of damping, propellant slosh is a potential source of disturbance critical to the stability of launch and space vehicles. It is postulated that the commonly quoted slosh damping is valid only in the linear regime, where the slosh amplitude is small. With an increase of slosh amplitude, the critical damping value should also increase. If this nonlinearity can be verified and validated, the slosh stability margin can be significantly improved, and the level of conservatism maintained in the GN&C analysis can be lessened. The purpose of this study is to explore and quantify the dependence of slosh damping on slosh amplitude. Accurately predicting the extremely low damping value of a smooth-wall tank is very challenging for any Computational Fluid Dynamics (CFD) tool: one must resolve the thin boundary layers near the wall and keep numerical damping to a minimum. This computational study demonstrates that, with proper grid resolution, CFD can indeed accurately predict the low-damping physics of smooth walls in the linear regime. Comparisons of extracted damping values with experimental data for different tank sizes show very good agreement. Numerical simulations confirm that slosh damping is indeed a function of slosh amplitude. When the slosh amplitude is low, the damping ratio is essentially constant, which is consistent with the empirical correlation. Once the amplitude reaches a critical value, the damping ratio becomes a linearly increasing function of the slosh amplitude. A follow-on experiment validated the developed nonlinear damping relationship. This discovery can
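
    The amplitude dependence described above amounts to a piecewise damping law: constant below a critical slosh amplitude, then linearly increasing. A minimal sketch of such a relationship follows; the 0.05% baseline echoes the value quoted in the abstract, while the critical amplitude and slope are placeholder values, since the fitted constants are not given here.

    ```python
    def damping_ratio(amplitude, zeta0=0.0005, a_crit=0.1, slope=0.004):
        """Piecewise slosh damping ratio vs. normalized slosh amplitude:
        constant in the linear regime, linearly increasing beyond a critical
        amplitude.  Parameter values are illustrative placeholders."""
        if amplitude <= a_crit:
            return zeta0                               # linear regime
        return zeta0 + slope * (amplitude - a_crit)    # amplitude-dependent regime
    ```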

  10. Using plot experiments to test the validity of mass balance models employed to estimate soil redistribution rates from 137Cs and 210Pbex measurements

    International Nuclear Information System (INIS)

    Porto, Paolo; Walling, Des E.

    2012-01-01

    Information on rates of soil loss from agricultural land is a key requirement for assessing both on-site soil degradation and potential off-site sediment problems. Many models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution as a function of the local topography, hydrometeorology, soil type and land management, but empirical data remain essential for validating and calibrating such models and prediction procedures. Direct measurements using erosion plots are, however, costly and the results obtained relate to a small enclosed area, which may not be representative of the wider landscape. In recent years, the use of fallout radionuclides and more particularly caesium-137 (137Cs) and excess lead-210 (210Pbex) has been shown to provide a very effective means of documenting rates of soil loss and soil and sediment redistribution in the landscape. Several of the assumptions associated with the theoretical conversion models used with such measurements remain essentially unvalidated. This contribution describes the results of a measurement programme involving five experimental plots located in southern Italy, aimed at validating several of the basic assumptions commonly associated with the use of mass balance models for estimating rates of soil redistribution on cultivated land from 137Cs and 210Pbex measurements. Overall, the results confirm the general validity of these assumptions and the importance of taking account of the fate of fresh fallout. However, further work is required to validate the conversion models employed in using fallout radionuclide measurements to document soil redistribution in the landscape and this could usefully direct attention to different environments and to the validation of the final estimates of soil redistribution rate as well as the assumptions of the models employed. - Highlights: ► Soil erosion is an important threat to the long-term sustainability of agriculture.
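
    For orientation, the simplest member of the family of conversion models being tested here is the proportional model, which converts the fractional loss of 137Cs inventory at a sampling point directly into a mean erosion rate. The sketch below implements that simple case only; the mass balance models validated in the study add terms for fallout timing and the fate of fresh fallout that are deliberately omitted here.

    ```python
    def proportional_model_erosion(A_ref, A_meas, bulk_density, plough_depth, years):
        """Proportional model: mean soil loss (t/ha/yr) from the fractional
        reduction of the 137Cs inventory A (Bq/m^2) relative to an
        undisturbed reference site.  bulk_density in kg/m^3, plough_depth
        in m, years elapsed since the main fallout period."""
        X = (A_ref - A_meas) / A_ref                        # fractional inventory loss
        plough_mass = bulk_density * plough_depth * 10.0    # plough-layer mass, t/ha
        return plough_mass * X / years
    ```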

  11. Examining students' views about validity of experiments: From introductory to Ph.D. students

    Science.gov (United States)

    Hu, Dehui; Zwickl, Benjamin M.

    2018-06-01

    We investigated physics students' epistemological views on measurements and validity of experimental results. The roles of experiments in physics have been underemphasized in previous research on students' personal epistemology, and there is a need for a broader view of personal epistemology that incorporates experiments. An epistemological framework incorporating the structure, methodology, and validity of scientific knowledge guided the development of an open-ended survey. The survey was administered to students in algebra-based and calculus-based introductory physics courses, to students in upper-division physics labs, and to physics Ph.D. students. Within our sample, we identified several differences in students' ideas about validity and uncertainty in measurement. The majority of introductory students justified the validity of results through agreement with theory or with results from others. Alternatively, Ph.D. students frequently justified the validity of results based on the quality of the experimental process and repeatability of results. When asked about the role of uncertainty analysis, introductory students tended to focus on the representational roles (e.g., describing imperfections, data variability, and human mistakes). However, advanced students focused on the inferential roles of uncertainty analysis (e.g., quantifying reliability, making comparisons, and guiding refinements). The findings suggest that lab courses could emphasize a variety of approaches to establish validity, such as by valuing documentation of the experimental process when evaluating the quality of student work. In order to emphasize the role of uncertainty in an authentic way, labs could provide opportunities to iterate, make repeated comparisons, and make decisions based on those comparisons.

  12. International integral experiments databases in support of nuclear data and code validation

    International Nuclear Information System (INIS)

    Briggs, J. Blair; Gado, Janos; Hunter, Hamilton; Kodeli, Ivan; Salvatores, Massimo; Sartori, Enrico

    2002-01-01

    The OECD/NEA Nuclear Science Committee (NSC) has identified the need to establish international databases containing all the important experiments that are available for sharing among specialists, and has set up or sponsored specific activities to achieve this. The aim is to preserve the experiments in an agreed standard format in computer-accessible form, to use them for international activities involving validation of current and new calculational schemes, including computer codes and nuclear data libraries, for assessing uncertainties, confidence bounds and safety margins, and to record measurement methods and techniques. The databases established or in preparation so far for nuclear data validation cover the following areas: SINBAD - a radiation shielding experiments database encompassing reactor shielding, fusion blanket neutronics, and accelerator shielding; ICSBEP - the International Criticality Safety Benchmark Experiments Project Handbook, with more than 2500 critical configurations with different combinations of materials and spectral indices; IRPhEP - the International Reactor Physics Experimental Benchmarks Evaluation Project. The different projects are described below, including results achieved, work in progress, and planned activities. (author)

  13. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. Guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  14. Toward Development of a Stochastic Wake Model: Validation Using LES and Turbine Loads

    Directory of Open Access Journals (Sweden)

    Jae Sang Moon

    2017-12-01

    Full Text Available Wind turbines within an array do not experience free-stream undisturbed flow fields. Rather, the flow fields at internal turbines are influenced by wakes generated by upwind units and exhibit different dynamic characteristics relative to the free stream. The International Electrotechnical Commission (IEC) standard 61400-1 for the design of wind turbines considers only a deterministic wake model for the design of a wind plant. This study is focused on the development of a stochastic model for waked wind fields. First, high-fidelity physics-based waked wind velocity fields are generated using Large-Eddy Simulation (LES). Stochastic characteristics of these LES waked wind velocity fields, including mean and turbulence components, are analyzed. Wake-related mean and turbulence field parameters are then estimated for use with a stochastic model, using Multivariate Multiple Linear Regression (MMLR) with the LES data. To validate the wind fields simulated by the stochastic model, wind turbine tower and blade loads are generated using aeroelastic simulation for utility-scale wind turbine models and compared with those based directly on the LES inflow. The study's overall objective is to offer efficient and validated stochastic approaches that are computationally tractable for assessing the performance and loads of turbines operating in wakes.
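
    The parameter-estimation step described above, regressing several wake-field parameters jointly on a set of predictors, reduces to ordinary least squares with a matrix of responses. A minimal MMLR sketch follows; the choice of predictors (e.g. turbine spacing, ambient turbulence intensity) is assumed for illustration and is not specified by the abstract.

    ```python
    import numpy as np

    def fit_mmlr(X, Y):
        """Multivariate multiple linear regression: each column of Y (a wake
        mean- or turbulence-field parameter) is regressed on the columns of
        X (flow/layout predictors).  Returns B in Y ~ [1, X] @ B."""
        X1 = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
        B, *_ = np.linalg.lstsq(X1, Y, rcond=None)   # joint least-squares fit
        return B
    ```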

  15. Generator Dynamic Model Validation and Parameter Calibration Using Phasor Measurements at the Point of Connection

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Zhenyu; Du, Pengwei; Kosterev, Dmitry; Yang, Steve

    2013-05-01

    Disturbance data recorded by phasor measurement units (PMUs) offer opportunities to improve the integrity of dynamic models. However, manually tuning parameters through play-back of events demands significant effort and engineering experience. In this paper, a calibration method using the extended Kalman filter (EKF) technique is proposed. The formulation of the EKF with parameter calibration is discussed. Case studies are presented to demonstrate its validity. The proposed calibration method is cost-effective and complementary to traditional equipment testing for improving dynamic model quality.
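
    A compact way to see the proposed calibration is as an EKF in which the uncertain model parameters are treated as a slowly varying (random-walk) state, corrected each time a PMU sample arrives. The sketch below shows one such step under those assumptions; the measurement function h and Jacobian H_jac stand in for the dynamic-model playback and are hypothetical interfaces, not the paper's code.

    ```python
    import numpy as np

    def ekf_parameter_step(theta, P, z, h, H_jac, Q, R):
        """One EKF step for parameter calibration.  theta: parameter vector,
        P: its covariance, z: PMU measurement at the point of connection,
        h(theta): simulated response, H_jac(theta): Jacobian of h."""
        P = P + Q                                   # random-walk prediction
        H = H_jac(theta)
        S = H @ P @ H.T + R                         # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
        theta = theta + K @ (z - h(theta))          # correct the parameters
        P = (np.eye(len(theta)) - K @ H) @ P
        return theta, P
    ```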

  16. Time Sharing Between Robotics and Process Control: Validating a Model of Attention Switching.

    Science.gov (United States)

    Wickens, Christopher Dow; Gutzwiller, Robert S; Vieane, Alex; Clegg, Benjamin A; Sebok, Angelia; Janes, Jess

    2016-03-01

    The aim of this study was to validate the strategic task overload management (STOM) model, which predicts task switching when concurrence is impossible. The STOM model predicts that in overload, tasks will be switched to, to the extent that they are attractive on the task attributes of high priority, interest, and salience and low difficulty; but more-difficult tasks are less likely to be switched away from once they are being performed. In Experiment 1, participants performed four tasks of the Multi-Attribute Task Battery and provided task-switching data to inform the roles of difficulty and priority. In Experiment 2, participants concurrently performed an environmental control task and a robotic arm simulation. Workload was varied through automation of arm movement and, for environmental control, both the task phase and the existence of decision support for fault management. Attention to the two tasks was measured using a head tracker. Experiment 1 revealed the lack of influence of task priority and confirmed the differing roles of task difficulty. In Experiment 2, the percentage of attention allocated across the eight conditions was predicted by the STOM model from participants' ratings of the four attributes. Model predictions were compared against empirical data and accounted for over 95% of the variance in task allocation. More-difficult tasks were performed longer than easier tasks. Task priority does not influence allocation. The multiattribute decision model provided a good fit to the data. The STOM model is useful for predicting cognitive tunneling, given that human-in-the-loop simulation is time-consuming and expensive. © 2016, Human Factors and Ergonomics Society.
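
    The switching rule at the heart of STOM can be expressed as a simple multiattribute attractiveness score. The sketch below is one plausible rendering, not the published model equations: the attribute weights are placeholders, and Experiment 1's finding suggests the priority weight may in practice be near zero.

    ```python
    def switch_attractiveness(priority, interest, salience, difficulty,
                              weights=(1.0, 1.0, 1.0, 1.0)):
        """Attractiveness of switching *to* a task: high priority, interest
        and salience attract; high difficulty repels.  (Difficulty works the
        other way for the task currently being performed, making it less
        likely to be abandoned.)  Weights are illustrative placeholders."""
        wp, wi, ws, wd = weights
        return wp * priority + wi * interest + ws * salience - wd * difficulty
    ```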

  17. Novel intrinsic-based submodel for char particle gasification in entrained-flow gasifiers: Model development, validation and illustration

    International Nuclear Information System (INIS)

    Schulze, S.; Richter, A.; Vascellari, M.; Gupta, A.; Meyer, B.; Nikrityuk, P.A.

    2016-01-01

    Highlights: • A model resolving intra-particle species transport for char conversion was formulated. • TGA experiments on char particle conversion in gas flow were conducted. • The experimental results for char conversion validated the model. • CFD simulations of an endothermic reactor with the developed model were carried out. - Abstract: The final carbon conversion rate is of critical importance for the efficiency of gasifiers. Therefore, comprehensive modeling of char particle conversion is of primary interest for designing new gasifiers. This work presents a novel intrinsic-based submodel for the gasification of a char particle moving in a hot flue gas environment, considering CO2 and H2O as inlet species. The first part of the manuscript describes the model and its derivation. Validation against experiments carried out in this work on German lignite char is reported in the second part. The comparison between submodel predictions and experimental data shows good agreement. The importance of the change in char porosity during gasification is demonstrated. The third part presents the results of CFD simulations using the new submodel and a surface-based submodel for a generic endothermic gasifier. The focus of the CFD simulations is to demonstrate the crucial role of intrinsic-based heterogeneous reactions in the adequate prediction of carbon conversion rates.

  18. Hohlraum modeling for opacity experiments on the National Ignition Facility

    Science.gov (United States)

    Dodd, E. S.; DeVolder, B. G.; Martin, M. E.; Krasheninnikova, N. S.; Tregillis, I. L.; Perry, T. S.; Heeter, R. F.; Opachich, Y. P.; Moore, A. S.; Kline, J. L.; Johns, H. M.; Liedahl, D. A.; Cardenas, T.; Olson, R. E.; Wilde, B. H.; Urbatsch, T. J.

    2018-06-01

    This paper discusses the modeling of experiments that measure iron opacity in local thermodynamic equilibrium (LTE) using laser-driven hohlraums at the National Ignition Facility (NIF). A previous set of experiments fielded at Sandia's Z facility [Bailey et al., Nature 517, 56 (2015)] have shown up to factors of two discrepancies between the theory and experiment, casting doubt on the validity of the opacity models. The purpose of the new experiments is to make corroborating measurements at the same densities and temperatures, with the initial measurements made at a temperature of 160 eV and an electron density of 0.7 × 10²² cm⁻³. The X-ray hot spots of a laser-driven hohlraum are not in LTE, and the iron must be shielded from a direct line-of-sight to obtain the data [Perry et al., Phys. Rev. B 54, 5617 (1996)]. This shielding is provided either with the internal structure (e.g., baffles) or external wall shapes that divide the hohlraum into a laser-heated portion and an LTE portion. In contrast, most inertial confinement fusion hohlraums are simple cylinders lacking complex gold walls, and the design codes are not typically applied to targets like those for the opacity experiments. We will discuss the initial basis for the modeling using LASNEX, and the subsequent modeling of five different hohlraum geometries that have been fielded on the NIF to date. This includes a comparison of calculated and measured radiation temperatures.

  19. Design and implementation of new design of numerical experiments for non linear models

    International Nuclear Information System (INIS)

    Gazut, St.

    2007-03-01

    This thesis addresses the problem of constructing surrogate models in numerical simulation. Whenever numerical experiments are costly and the simulation model is complex and difficult to use, it is important to select the numerical experiments as efficiently as possible in order to minimize their number. In statistics, the selection of experiments is known as optimal experimental design. In the context of numerical simulation, where no measurement uncertainty is present, we describe an alternative approach based on statistical learning theory and resampling techniques. The surrogate models are constructed using neural networks, and the generalization error is estimated by leave-one-out, cross-validation and bootstrap. It is shown that the bootstrap can control over-fitting and extends the concept of leverage to surrogate models that are nonlinear in their parameters. The thesis describes an iterative method called LDR (Learner Disagreement from experiment Re-sampling), based on active learning using several surrogate models constructed on bootstrap samples. The method consists of adding new experiments where the predictors constructed from bootstrap samples disagree most. We compare the LDR method with other methods of experimental design such as D-optimal selection. (author)
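
    The LDR loop lends itself to a very short sketch: train surrogate models on bootstrap resamples, then add the candidate experiment where their predictions disagree most. The version below assumes scikit-learn-style regressors already fitted on bootstrap samples; it is an illustrative reconstruction of the idea, not the thesis implementation.

    ```python
    import numpy as np

    def ldr_select(candidates, models):
        """One LDR iteration: score candidate experiments by the disagreement
        (prediction variance) of surrogates trained on bootstrap samples and
        return the candidate where they disagree most."""
        preds = np.array([m.predict(candidates) for m in models])  # (n_models, n_cand)
        disagreement = preds.var(axis=0)            # committee disagreement
        return candidates[np.argmax(disagreement)]
    ```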

  20. Mathematical model formulation and validation of water and solute transport in whole hamster pancreatic islets.

    Science.gov (United States)

    Benson, James D; Benson, Charles T; Critser, John K

    2014-08-01

    Optimization of cryopreservation protocols for cells and tissues requires accurate models of heat and mass transport. Model selection often depends on the configuration of the tissue. Here, a mathematical and conceptual model of water and solute transport for whole hamster pancreatic islets has been developed and experimentally validated incorporating fundamental biophysical data from previous studies on individual hamster islet cells while retaining whole-islet structural information. It describes coupled transport of water and solutes through the islet by three methods: intracellularly, intercellularly, and in combination. In particular we use domain decomposition techniques to couple a transmembrane flux model with an interstitial mass transfer model. The only significant undetermined variable is the cellular surface area which is in contact with the intercellularly transported solutes, Ais. The model was validated and Ais determined using a 3×3 factorial experimental design blocked for experimental day. Whole islet physical experiments were compared with model predictions at three temperatures, three perfusing solutions, and three islet size groups. A mean of 4.4 islets were compared at each of the 27 experimental conditions and found to correlate with a coefficient of determination of 0.87±0.06 (mean ± SD). Only the treatment variable of perfusing solution was found to be significant (p<0.05). We have devised a model that retains much of the intrinsic geometric configuration of the system, and thus fewer laboratory experiments are needed to determine model parameters and thus to develop new optimized cryopreservation protocols. Additionally, extensions to ovarian follicles and other concentric tissue structures may be made. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Refinement, Validation and Benchmarking of a Model for E-Government Service Quality

    Science.gov (United States)

    Magoutas, Babis; Mentzas, Gregoris

    This paper presents the refinement and validation of a model for the Quality of e-Government Services (QeGS). We built upon our previous work, in which a conceptual model was identified, and focus here on the confirmatory phase of the model development process, in order to arrive at a valid and reliable QeGS model. The validated model, which was benchmarked with very positive results against similar models found in the literature, can be used for measuring QeGS in a reliable and valid manner. This forms the basis for a continuous quality improvement process, unleashing the full potential of e-government services for both citizens and public administrations.

  2. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  3. Geochemistry Model Validation Report: Material Degradation and Release Model

    International Nuclear Information System (INIS)

    Stockman, H.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  4. Ceramic bar impact experiments for improved material model

    International Nuclear Information System (INIS)

    Brar, N.S.; Proud, W.G.; Rajendran, A.M.

    2004-01-01

    Ceramic bar-on-bar (uniaxial stress) experiments are performed to extend the uniaxial strain deformation states imposed in flyer plate impact experiments. A number of investigators engaged in modeling bar-on-bar experiments have had varying degrees of success in capturing the observed fracture modes in the bars and in correctly simulating the measured in-situ axial stress or free-surface velocity histories. The difficulties encountered are related to uncertainties in understanding the dominant failure mechanisms as a function of the different stress states imposed in bar impacts. The free-surface velocity of the far end of the target AD998 bar was measured using a VISAR in a series of bar-on-bar impact experiments at nominal impact speeds of 100 m/s, 220 m/s, and 300 m/s. Velocity history data at an impact speed of 100 m/s show the material response to be elastic. At the higher impact velocities of 200 m/s and 300 m/s, the velocity history data suggest an inelastic material response. A high-speed (Imacon) camera was employed to examine the fracture and failure of the impactor and target bars. The high-speed photographs provide comprehensive data on the geometry of damage and failure patterns as a function of time, against which to check the validity of a particular constitutive material model for AD998 alumina used in numerical simulations of fracture and failure of the bars on impact.

  5. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen when testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that had been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
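
    The core balance behind such a model is elementary: the wellhead pressure is the bottom-hole (cavern) pressure minus the hydrostatic head of the stacked fluid columns in the well. A minimal sketch under that assumption follows; the layer list and numbers are illustrative only, and the sketch ignores gas compressibility and temperature effects that a production model would include.

    ```python
    G = 9.81  # gravitational acceleration, m/s^2

    def wellhead_pressure(p_bottom, layers):
        """Wellhead pressure (Pa) from the bottom-hole pressure and the
        fluid layers in the well, each given as (density kg/m^3, height m)."""
        return p_bottom - sum(rho * G * h for rho, h in layers)

    # Illustrative: brine, crude oil and nitrogen columns stacked in the well.
    p_wh = wellhead_pressure(12.0e6, [(1200.0, 300.0), (850.0, 400.0), (180.0, 300.0)])
    ```

    Tracking how the predicted wellhead pressure responds as the interface heights shift is what lets such a model separate normal tight-well pressure drift from small-leak behavior.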

  6. Development and validation of a mass casualty conceptual model.

    Science.gov (United States)

    Culley, Joan M; Effken, Judith A

    2010-03-01

    To develop and validate a conceptual model that provides a framework for the development and evaluation of information systems for mass casualty events. The model was designed based on extant literature and existing theoretical models. A purposeful sample of 18 experts validated the model. Open-ended questions, as well as a 7-point Likert scale, were used to measure expert consensus on the importance of each construct and its relationship in the model and the usefulness of the model to future research. Computer-mediated applications were used to facilitate a modified Delphi technique through which a panel of experts provided validation for the conceptual model. Rounds of questions continued until consensus was reached, as measured by an interquartile range (no more than 1 scale point for each item); stability (change in the distribution of responses less than 15% between rounds); and percent agreement (70% or greater) for indicator questions. Two rounds of the Delphi process were needed to satisfy the criteria for consensus or stability related to the constructs, relationships, and indicators in the model. The panel reached consensus or sufficient stability to retain all 10 constructs, 9 relationships, and 39 of 44 indicators. Experts viewed the model as useful (mean of 5.3 on a 7-point scale). Validation of the model provides the first step in understanding the context in which mass casualty events take place and identifying variables that impact outcomes of care. This study provides a foundation for understanding the complexity of mass casualty care, the roles that nurses play in mass casualty events, and factors that must be considered in designing and evaluating information-communication systems to support effective triage under these conditions.
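
    The three stopping criteria reported (interquartile range, round-to-round stability, percent agreement) are easy to make concrete. The sketch below is one plausible reading rather than the authors' procedure: the abstract does not state how the distribution change is computed, so total-variation distance is assumed here, as is the convention that ratings of 5-7 on the 7-point scale count as agreement.

    ```python
    import numpy as np

    def delphi_consensus(prev_scores, scores, agree_levels=(5, 6, 7)):
        """Consensus checks for one Delphi item on a 7-point scale:
        IQR <= 1 point, < 15% change in the response distribution between
        rounds, and >= 70% agreement."""
        q1, q3 = np.percentile(scores, [25, 75])
        iqr_ok = (q3 - q1) <= 1.0
        bins = np.arange(0.5, 8.5)                          # bins for ratings 1..7
        p_prev = np.histogram(prev_scores, bins)[0] / len(prev_scores)
        p_now = np.histogram(scores, bins)[0] / len(scores)
        stable = 0.5 * np.abs(p_now - p_prev).sum() < 0.15  # total-variation change
        agreement = np.isin(scores, agree_levels).mean() >= 0.70
        return iqr_ok, stable, agreement
    ```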

  7. Modeling Root Growth, Crop Growth and N Uptake of Winter Wheat Based on SWMS_2D: Model and Validation

    Directory of Open Access Journals (Sweden)

    Dejun Yang

    Full Text Available ABSTRACT Simulations of root growth, crop growth, and N uptake in agro-hydrological models are of significant concern to researchers. SWMS_2D is one of the most widely used physically based hydrological models. It solves the equations that govern soil-water movement by the finite element method and has publicly accessible source code. Incorporating key agricultural components into the SWMS_2D model is of practical importance, especially for modeling critical cereal crops such as winter wheat. We added root growth, crop growth, and N uptake modules to SWMS_2D. The root growth model had two sub-models, one for root penetration and the other for root length distribution. The crop growth model was adapted from EU-ROTATE_N and linked to the N uptake model. Soil-water limitation, nitrogen limitation, and temperature effects were all considered in the dry-weight modeling. Field experiments on winter wheat at Bouwing, the Netherlands, in 1983-1984 were selected for validation. Good agreement was achieved between simulations and measurements, including soil water content at different depths, normalized root length distribution, dry weight, and nitrogen uptake. This indicates that the proposed new modules in the SWMS_2D model are robust and reliable. In the future, more rigorous validation should be carried out, ideally under 2D situations, and attention should be paid to improving some modules, including the module simulating soil N mineralization.

  8. Geochemistry Model Validation Report: External Accumulation Model

    International Nuclear Information System (INIS)

    Zarrabi, K.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model, which predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop it, (2) provide the rationale for the selection of the parameters by comparison with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations, the primary mechanism for accumulation is mixing of the high-pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation

  9. Model-based verification and validation of the SMAP uplink processes

    Science.gov (United States)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  10. A comparison of measurements and calculations for the Stripa validation drift inflow experiment

    International Nuclear Information System (INIS)

    Hodgkinson, D.P.; Cooper, N.S.

    1992-01-01

    This paper presents a comparison of measurements and predictions for groundwater flow to the validation drift and remaining portions of the D-holes in the Site Characterisation and Validation (SCV) block. The comparison was carried out on behalf of the Stripa task force on fracture flow modelling. The paper summarises the characterisation data and their preliminary interpretation, and reviews the fracture flow modelling approaches and predictions made by teams from AEA Technology/Fracflow, Golder Associates and Lawrence Berkeley Laboratory. The predictions are compared with the inflow measurements on the basis of the validation process and criteria defined by the Task Force. The results of all three modelling groups meet the validation criteria, with the predictions of the inflow being of the same order of magnitude as the observations. In addition, the AEA/Fracflow and Golder approaches allow the inflow pattern to be predicted, and this too is reproduced with reasonable accuracy. The successful completion of this project demonstrates the feasibility of discrete fracture flow modelling, and in particular the ability to collect and analyse all the necessary characterisation data in a timely and economic manner. (32 refs.) (au)

  11. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    D.M. Jolley

    2001-12-18

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  12. In-Drift Microbial Communities Model Validation Calculations

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-09-24

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code against both scientific measurements of microbial populations at the site and in the laboratory, and against natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that will eventually feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

  13. In-Drift Microbial Communities Model Validation Calculation

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-10-31

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code against both scientific measurements of microbial populations at the site and in the laboratory, and against natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that will eventually feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

  14. In-Drift Microbial Communities Model Validation Calculations

    International Nuclear Information System (INIS)

    Jolley, D.M.

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code against both scientific measurements of microbial populations at the site and in the laboratory, and against natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that will eventually feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

  15. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    International Nuclear Information System (INIS)

    D.M. Jolley

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code against both scientific measurements of microbial populations at the site and in the laboratory, and against natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that will eventually feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

  16. Validation of Computer Models for Homeland Security Purposes

    International Nuclear Information System (INIS)

    Schweppe, John E.; Ely, James; Kouzes, Richard T.; McConn, Ronald J.; Pagh, Richard T.; Robinson, Sean M.; Siciliano, Edward R.; Borgardt, James D.; Bender, Sarah E.; Earnhart, Alison H.

    2005-01-01

    At Pacific Northwest National Laboratory, we are developing computer models of radiation portal monitors for screening vehicles and cargo. Detailed models of the radiation detection equipment, vehicles, cargo containers, cargos, and radioactive sources have been created. These are used to determine the optimal configuration of detectors and the best alarm algorithms for the detection of items of interest while minimizing nuisance alarms due to the presence of legitimate radioactive material in the commerce stream. Most of the modeling is done with the Monte Carlo code MCNP to describe the transport of gammas and neutrons from extended sources through large, irregularly shaped absorbers to large detectors. A fundamental prerequisite is the validation of the computational models against field measurements. We describe the first step of this validation process, the comparison of the models to measurements with bare static sources.

  17. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    International Nuclear Information System (INIS)

    Lee, C; Lin, H; Chao, T; Hsiao, I; Chuang, K

    2015-01-01

    Purpose: A predicted-PET approach based on analytical filtering for proton range verification has been successfully developed and validated using the FLUKA Monte Carlo (MC) code and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification against the GATE/GEANT4 Monte Carlo simulation code. Methods: We performed two experiments to validate the β+-isotope yields predicted by the analytical model against GATE/GEANT4 simulations. The first experiment evaluates the accuracy of the predicted β+ yields as a function of irradiated proton energy. In the second experiment, we simulate homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam, and the β+-yield distributions filtered by the analytical model are compared with the MC-simulated β+ yields in the proximal and distal fall-off regions. Results: First, we found that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range differences between the filtered β+ yields and the MC-simulated β+ yields at the distal fall-off region are within 1.5 mm for all materials used. These findings validate the usefulness of the analytical filtering model for range verification of proton therapy with GATE Monte Carlo simulations. However, there is a larger discrepancy between the filtered prediction and the MC-simulated β+ yields with the GATE code, especially in the proximal region; this discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the large discrepancies observed between the MC-simulated and predicted β+-yield distributions, the study proves the effectiveness of the analytical filtering model for proton range verification using GATE Monte Carlo simulations.
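
    The distal fall-off comparison above comes down to extracting a "range" from two β+ depth profiles and differencing them. Below is a minimal sketch under the assumption that the range is taken as the depth where the distal edge falls to 50% of the profile maximum; the Gaussian profiles and the 50% convention are illustrative stand-ins, not the paper's exact definitions.

```python
# Range extraction from two beta+ activity depth profiles (illustrative only).
import numpy as np

def distal_falloff_depth(depth_mm, activity, level=0.5):
    """Depth where the distal edge crosses `level` * max(activity)."""
    threshold = level * activity.max()
    peak = activity.argmax()
    # Walk the distal side of the peak and linearly interpolate the crossing.
    for i in range(peak, len(activity) - 1):
        if activity[i] >= threshold > activity[i + 1]:
            f = (activity[i] - threshold) / (activity[i] - activity[i + 1])
            return depth_mm[i] + f * (depth_mm[i + 1] - depth_mm[i])
    return depth_mm[-1]

# Toy profiles standing in for MC-simulated and analytically filtered yields.
depth = np.linspace(0.0, 150.0, 301)                      # mm
mc        = np.exp(-0.5 * ((depth - 100.0) / 8.0) ** 2)
predicted = np.exp(-0.5 * ((depth - 101.0) / 8.0) ** 2)

dr = distal_falloff_depth(depth, predicted) - distal_falloff_depth(depth, mc)
print(f"range difference: {dr:.2f} mm")  # compare against the ~1.5 mm criterion
```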

  18. Hypergraph-Based Recognition Memory Model for Lifelong Experience

    Science.gov (United States)

    2014-01-01

    Cognitive agents are expected to interact with and adapt to a nonstationary dynamic environment. As an initial step of decision making in real-world agent interaction, familiarity judgment guides the processes that follow it. Familiarity judgment includes both knowing previously encoded data and completing original patterns from partial information, which are fundamental functions of recognition memory. Although previous computational memory models have attempted to reflect human behavioral properties of recognition memory, they have focused on static conditions without considering temporal changes in terms of lifelong learning. To provide temporal adaptability to an agent, in this paper we propose a computational model for recognition memory that enables lifelong learning. The proposed model is based on a hypergraph structure, which allows high-order relationships between contextual nodes and enables incremental learning. Through a simulated experiment, we investigate the optimal conditions of the memory model and validate the consistency of memory performance for lifelong learning. PMID:25371665
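
    To make the hypergraph idea concrete, here is a minimal sketch of my own construction (not the paper's algorithm): experiences are stored incrementally as hyperedges over feature nodes, and familiarity is judged by the best overlap between a cue and any stored hyperedge.

```python
# A toy hypergraph memory for familiarity judgment (illustrative construction).
from typing import FrozenSet, List

class HypergraphMemory:
    def __init__(self) -> None:
        self.hyperedges: List[FrozenSet[str]] = []

    def encode(self, features: set) -> None:
        self.hyperedges.append(frozenset(features))   # incremental learning

    def familiarity(self, cue: set) -> float:
        """Best Jaccard overlap between the cue and any stored hyperedge."""
        if not self.hyperedges:
            return 0.0
        return max(len(cue & h) / len(cue | h) for h in self.hyperedges)

mem = HypergraphMemory()
mem.encode({"red", "round", "sweet", "apple"})
mem.encode({"yellow", "long", "banana"})
print(mem.familiarity({"red", "round", "apple"}))     # high: known pattern
print(mem.familiarity({"blue", "square"}))            # low: novel pattern
```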

  19. Models, validation, and applied geochemistry: Issues in science, communication, and philosophy

    International Nuclear Information System (INIS)

    Kirk Nordstrom, D.

    2012-01-01

    Models have become so fashionable that many scientists and engineers cannot imagine working without them. The predominant use of computer codes to execute model calculations has blurred the distinction between code and model. The recent controversy regarding model validation has brought into question what we mean by a ‘model’ and by ‘validation.’ It has become apparent that the usual meaning of validation may be common in engineering practice and seems useful in legal practice but it is contrary to scientific practice and brings into question our understanding of science and how it can best be applied to such problems as hazardous waste characterization, remediation, and aqueous geochemistry in general. This review summarizes arguments against using the phrase model validation and examines efforts to validate models for high-level radioactive waste management and for permitting and monitoring open-pit mines. Part of the controversy comes from a misunderstanding of ‘prediction’ and the need to distinguish logical from temporal prediction. Another problem stems from the difference in the engineering approach contrasted with the scientific approach. The reductionist influence on the way we approach environmental investigations also limits our ability to model the interconnected nature of reality. Guidelines are proposed to improve our perceptions and proper utilization of models. Use of the word ‘validation’ is strongly discouraged when discussing model reliability.

  20. EXQ: development and validation of a multiple-item scale for assessing customer experience quality

    OpenAIRE

    Klaus, Philipp

    2010-01-01

    Positioned in the deliberations related to service marketing, the conceptualisation of service quality, current service quality measurements, and the importance of the evolving construct of customer experience, this thesis develops and validates a measurement for customer experience quality (EXQ) in the context of repeat purchases of mortgage buyers in the United Kingdom. The thesis explores the relationship between the customer experience quality and the important marketing ou...

  1. An independent verification and validation of the Future Theater Level Model conceptual model

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not been previously reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  2. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    NARCIS (Netherlands)

    P.C. Austin (Peter); D. van Klaveren (David); Y. Vergouwe (Yvonne); D. Nieboer (Daan); D.S. Lee (Douglas); E.W. Steyerberg (Ewout)

    2016-01-01

    textabstractObjective: Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting: We

  3. Fundamental validation of simulation method for thermal stratification in upper plenum of fast reactors. Analysis of sodium experiment

    International Nuclear Information System (INIS)

    Ohno, Shuji; Ohshima, Hiroyuki; Sugahara, Akihiro; Ohki, Hiroshi

    2010-01-01

    Three-dimensional thermal-hydraulic analyses have been carried out for a sodium experiment in a relatively simple axisymmetric geometry using a commercial CFD code, in order to validate simulation methods for thermal stratification behavior in the upper plenum of a sodium-cooled fast reactor. Detailed comparison between the simulated results and the experimental measurements has demonstrated that the code reproduces fairly well the fundamental thermal stratification behaviors, such as the vertical temperature gradient and the upward movement of the stratification interface, when a high-order discretization scheme and an appropriate mesh size are used. Furthermore, the investigation has clarified the influence of RANS-type turbulence models (the standard k-ε model, the RNG k-ε model and the Reynolds stress model) on the predictability of the phenomena. (author)

  4. Validation of the Colorado Retinopathy of Prematurity Screening Model.

    Science.gov (United States)

    McCourt, Emily A; Ying, Gui-Shuang; Lynch, Anne M; Palestine, Alan G; Wagner, Brandie D; Wymore, Erica; Tomlinson, Lauren A; Binenbaum, Gil

    2018-04-01

    The Colorado Retinopathy of Prematurity (CO-ROP) model uses birth weight, gestational age, and weight gain at the first month of life (WG-28) to predict risk of severe retinopathy of prematurity (ROP). In previous validation studies, the model performed very well, predicting virtually all cases of severe ROP and potentially reducing the number of infants who need ROP examinations, warranting validation in a larger, more diverse population. To validate the performance of the CO-ROP model in a large multicenter cohort. This study is a secondary analysis of data from the Postnatal Growth and Retinopathy of Prematurity (G-ROP) Study, a retrospective multicenter cohort study conducted in 29 hospitals in the United States and Canada between January 2006 and June 2012 of 6351 premature infants who received ROP examinations. Outcomes were sensitivity and specificity for severe (early treatment of ROP [ETROP] type 1 or 2) ROP, and the reduction in infants receiving examinations. The CO-ROP model was applied to the infants in the G-ROP data set with all 3 data points (infants would have received examinations only if they met all 3 criteria, i.e., thresholds on birth weight, gestational age, and WG-28) in this large validation cohort. The model requires all 3 criteria to be met to signal a need for examinations, but some infants with a birth weight or gestational age above the thresholds developed severe ROP. Most of these infants who were not detected by the CO-ROP model had obvious deviation in expected weight trajectories or nonphysiologic weight gain. These findings suggest that the CO-ROP model needs to be revised before considering implementation into clinical practice.
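
    A minimal sketch of how sensitivity, specificity, and the reduction in examinations are computed for a rule that screens only when all three criteria are met. The cohort, the outcome labels, and the cutoff values below are illustrative placeholders, not the published CO-ROP thresholds.

```python
# Evaluating a three-criterion screening rule on a synthetic cohort.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
bw     = rng.normal(1100, 350, n)       # birth weight (g), hypothetical cohort
ga     = rng.normal(28, 2.5, n)         # gestational age (weeks)
wg28   = rng.normal(500, 150, n)        # weight gain at day 28 (g)
severe = rng.random(n) < 0.05           # hypothetical severe-ROP labels

# Screen only if ALL three criteria are met (cutoffs are placeholders).
screened = (bw < 1501) & (ga < 30) & (wg28 < 650)

sens = (screened & severe).sum() / severe.sum()
spec = (~screened & ~severe).sum() / (~severe).sum()
print(f"sensitivity={sens:.3f}  specificity={spec:.3f}")
print(f"examinations avoided: {100 * (~screened).mean():.1f}%")
```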

  5. Validation of limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring - is a separate validation group required?

    NARCIS (Netherlands)

    Proost, J. H.

    Objective: Limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring are usually validated in a separate group of patients, according to published guidelines. The aim of this study is to evaluate the validation of LSM by comparing independent validation with cross-validation

  6. Validation of two-phase flow code THYC on VATICAN experiment

    International Nuclear Information System (INIS)

    Maurel, F.; Portesse, A.; Rimbert, P.; Thomas, B.

    1997-01-01

    As part of a comprehensive program for THYC validation (THYC is a 3-dimensional two-phase flow computer code for PWR core configurations), an experimental project ''VATICAN'' has been initiated by the Direction des Etudes et Recherches of Electricite de France. Two mock-ups tested in Refrigerant-114, VATICAN-1 (with simple spacer grids) and VATICAN-2 (with mixing grids), were set up to investigate void fraction distributions using a single-beam gamma densitometer. First, experiments were conducted with the VATICAN-1 mock-up. A set of constitutive laws to be used in rod bundles was determined, but some doubts still remain about the friction-loss closure laws for oblique flow over tubes. From the VATICAN-2 tests, calculations were performed using the standard set of correlations. Comparison with the experimental data shows an underprediction of the void fraction by THYC in disturbed regions. Analyses highlight the poor treatment of the axial relative velocity in these regions. Fitting the radial and axial relative velocity values in the disturbed region improves the prediction of void fraction by the code, but without any physical explanation. More analytical experiments should be carried out to validate friction-loss closure laws for oblique flows and the relative velocity downstream of a mixing grid. (author)

  7. Validation of two-phase flow code THYC on VATICAN experiment

    Energy Technology Data Exchange (ETDEWEB)

    Maurel, F.; Portesse, A.; Rimbert, P.; Thomas, B. [EDF/DER, Dept. TTA, 78 - Chatou (France)

    1997-12-31

    As part of a comprehensive program for THYC validation (THYC is a 3-dimensional two-phase flow computer code for PWR core configurations), an experimental project ''VATICAN'' has been initiated by the Direction des Etudes et Recherches of Electricite de France. Two mock-ups tested in Refrigerant-114, VATICAN-1 (with simple spacer grids) and VATICAN-2 (with mixing grids), were set up to investigate void fraction distributions using a single-beam gamma densitometer. First, experiments were conducted with the VATICAN-1 mock-up. A set of constitutive laws to be used in rod bundles was determined, but some doubts still remain about the friction-loss closure laws for oblique flow over tubes. From the VATICAN-2 tests, calculations were performed using the standard set of correlations. Comparison with the experimental data shows an underprediction of the void fraction by THYC in disturbed regions. Analyses highlight the poor treatment of the axial relative velocity in these regions. Fitting the radial and axial relative velocity values in the disturbed region improves the prediction of void fraction by the code, but without any physical explanation. More analytical experiments should be carried out to validate friction-loss closure laws for oblique flows and the relative velocity downstream of a mixing grid. (author)

  8. Effect of Drawer Master Modeling of ZPPR15 Phase A Reactor Physics Experiment on Integral Parameter

    International Nuclear Information System (INIS)

    Yoo, Jae Woon; Kim, Sang Ji

    2011-01-01

    As a part of an International Nuclear Energy Research Initiative (I-NERI) project, KAERI and ANL are analyzing the ZPPR-15 reactor physics experiments. The ZPPR-15 experiments were carried out in support of the Integral Fast Reactor (IFR) project. Because of the lack of experimental data, verifying and validating core neutronics analysis codes for metal-fueled sodium-cooled fast reactors (SFRs) has been a major concern. KAERI is developing a metal-fueled SFR and plans to construct a demonstration SFR by around 2028. The database built through this project and the results of its analysis will play an important role in validating SFR neutronics characteristics. As the first-year work of the I-NERI project, KAERI analyzed the ZPPR-15 Phase A experiment among the four phases (Phase A to D). The effect of the drawer master modeling on the integral parameters was investigated. Approximated benchmark configurations for each loading were constructed to be used for validating a deterministic code.

  9. Validation of a phytoremediation computer model

    Energy Technology Data Exchange (ETDEWEB)

    Corapcioglu, M Y; Sung, K; Rhykerd, R L; Munster, C; Drew, M [Texas A and M Univ., College Station, TX (United States)

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹

  10. Traffic modelling validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Tongeren, R. van; Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2007-01-01

    This paper presents a microscopic traffic model for the validation of advanced driver assistance systems. This model describes single-lane traffic and is calibrated with data from a field operational test. To illustrate the use of the model, a Monte Carlo simulation of single-lane traffic scenarios

  11. Verification and Validation of Tropospheric Model/Database

    National Research Council Canada - National Science Library

    Choi, Junho

    1998-01-01

    A verification and validation of tropospheric models and databases has been performed based on a ray-tracing algorithm, statistical analysis, tests of real-time system operation, and other technical evaluation processes...

  12. Validation of numerical model for cook stove using Reynolds averaged Navier-Stokes based solver

    Science.gov (United States)

    Islam, Md. Moinul; Hasan, Md. Abdullah Al; Rahman, Md. Mominur; Rahaman, Md. Mashiur

    2017-12-01

    Biomass-fired cook stoves have, for many years, been the main cooking appliance for the rural people of developing countries. Several studies have been carried out to find efficient stove designs. In the present study, a numerical model of an improved household cook stove is developed to analyze the heat transfer and flow behavior of the gas during operation. The numerical model is validated against experimental results. Computation of the numerical model is executed using the non-premixed combustion model. The Reynolds-averaged Navier-Stokes (RANS) equations, along with the κ-ε model, govern the turbulent flow within the computational domain. The computational results are in good agreement with the experiment. The developed numerical model can be used to predict the effect of different biomasses on the efficiency of the cook stove.

  13. Assessing decentering: validation, psychometric properties, and clinical usefulness of the Experiences Questionnaire in a Spanish sample.

    Science.gov (United States)

    Soler, Joaquim; Franquesa, Alba; Feliu-Soler, Albert; Cebolla, Ausias; García-Campayo, Javier; Tejedor, Rosa; Demarzo, Marcelo; Baños, Rosa; Pascual, Juan Carlos; Portella, Maria J

    2014-11-01

    Decentering is defined as the ability to observe one's thoughts and feelings in a detached manner. The Experiences Questionnaire (EQ) is a self-report instrument that originally assessed decentering and rumination. The purpose of this study was to evaluate the psychometric properties of the Spanish version of the EQ-Decentering subscale and to explore its clinical usefulness. The 11-item EQ-Decentering subscale was translated into Spanish and its psychometric properties were examined in a sample of 921 adult individuals, 231 with psychiatric disorders and 690 without. The subsample of nonpsychiatric participants was also split according to previous meditative experience (meditative participants, n=341; nonmeditative participants, n=349). Additionally, differences among these three subgroups were explored to determine the clinical validity of the scale. Finally, the EQ-Decentering was administered twice to a group of patients with borderline personality disorder, before and after a 10-week mindfulness intervention. Confirmatory factor analysis indicated acceptable model fit (S-B χ²=243.8836; convergent validity: r>.46; divergent validity: r<-.35). The scale detected changes in decentering after a 10-session intervention in mindfulness (t=-4.692, p<.00001). Differences among groups were significant (F=134.8, p<.000001), where psychiatric participants showed the lowest scores compared to nonpsychiatric meditative and nonmeditative participants. The Spanish version of the EQ-Decentering is a valid and reliable instrument to assess decentering in both clinical and nonclinical samples. In addition, the findings show that the EQ-Decentering seems an adequate outcome instrument to detect changes after mindfulness-based interventions. Copyright © 2014. Published by Elsevier Ltd.

  14. Development and validation of a CFD model predicting the backfill process of a nuclear waste gallery

    International Nuclear Information System (INIS)

    Gopala, Vinay Ramohalli; Lycklama a Nijeholt, Jan-Aiso; Bakker, Paul; Haverkate, Benno

    2011-01-01

    Research highlights: → This work presents the CFD simulation of the backfill process of Supercontainers with nuclear waste emplaced in a disposal gallery. → The cement-based material used for backfill is grout, and the flow of grout is modelled as a Bingham fluid. → The model is verified against an analytical solution and validated against flowability tests for concrete. → Comparison between a backfill Plexiglas experiment and the simulation shows a distinct difference in the filling pattern. → The numerical model needs to be further developed to include segregation effects and the thixotropic behavior of grout. - Abstract: Nuclear waste material may be stored in underground tunnels for long term storage. The example treated in this article is based on the current Belgian disposal concept for High-Level Waste (HLW), in which the nuclear waste material is packed in concrete shielded packages, called Supercontainers, which are inserted into these tunnels. After placement of the packages in the underground tunnels, the remaining voids between the packages and the tunnel lining are filled up with a cement-based material called grout in order to encase the stored containers in the underground spacing. This encasement of the stored containers inside the tunnels is known as the backfill process. A good backfill process is necessary to stabilize the waste gallery against ground settlements. A numerical model to simulate the backfill process can help to improve and optimize the process by ensuring a homogeneous filling with no air voids and by optimizing the injection positions to achieve a homogeneous filling. The objective of the present work is to develop such a numerical code that can predict the backfill process well, and to validate the model against the available experiments and analytical solutions. In the present work the rheology of grout is modelled as a Bingham fluid, which is implemented in OpenFOAM - a finite volume-based open source computational fluid dynamics code.
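
    The grout rheology named above is a Bingham law: no flow below the yield stress, Newtonian-like above it. Here is a minimal sketch of the effective-viscosity form commonly used to implement it in finite-volume codes, with a Papanastasiou-style regularization so the viscosity stays finite at vanishing shear rate; the parameter values are illustrative, not measured grout properties.

```python
# Regularized Bingham effective viscosity (illustrative parameters).
import numpy as np

def bingham_effective_viscosity(gamma_dot, mu_p, tau_y, m=1000.0):
    """mu_eff = mu_p + tau_y * (1 - exp(-m*gamma_dot)) / gamma_dot."""
    gamma_dot = np.maximum(gamma_dot, 1e-12)      # avoid division by zero
    return mu_p + tau_y * (1.0 - np.exp(-m * gamma_dot)) / gamma_dot

shear_rates = np.logspace(-3, 2, 6)               # 1/s
mu = bingham_effective_viscosity(shear_rates, mu_p=0.05, tau_y=10.0)
for g, m_eff in zip(shear_rates, mu):
    print(f"gamma_dot={g:8.3f} 1/s  mu_eff={m_eff:10.2f} Pa.s")
```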

  15. Cost model validation: a technical and cultural approach

    Science.gov (United States)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  16. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models

    Science.gov (United States)

    van der Wijk, Lars; Proost, Johannes H.; Sinha, Bhanu; Touw, Daan J.

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to avoid
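
    A minimal sketch of the two error metrics used above, as they are commonly defined in pharmacokinetic validation: per-sample prediction error as a percentage of the observed concentration, with the median giving MDPE (bias) and the median absolute value giving MDAPE (precision). The serum levels below are made up for illustration.

```python
# Median Prediction Error and Median Absolute Prediction Error.
import numpy as np

def mdpe_mdape(observed, predicted):
    pe = 100.0 * (predicted - observed) / observed   # per-sample % error
    return np.median(pe), np.median(np.abs(pe))

# Toy gentamicin serum levels (mg/L); values are illustrative only.
obs  = np.array([8.1, 6.4, 10.2, 7.5, 9.0])
pred = np.array([7.9, 6.6,  9.8, 7.7, 8.7])
mdpe, mdape = mdpe_mdape(obs, pred)
print(f"MDPE={mdpe:.2f}%  MDAPE={mdape:.2f}%")
```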

  17. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models.

    Directory of Open Access Journals (Sweden)

    Anna Gomes

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to avoid

  18. Validating neural-network refinements of nuclear mass models

    Science.gov (United States)

    Utama, R.; Piekarewicz, J.

    2018-01-01

    Background: Nuclear astrophysics centers on the role of nuclear physics in the cosmos. In particular, nuclear masses at the limits of stability are critical in the development of stellar structure and the origin of the elements. Purpose: We aim to test and validate the predictions of recently refined nuclear mass models against the newly published AME2016 compilation. Methods: The basic paradigm underlying the recently refined nuclear mass models is based on existing state-of-the-art models that are subsequently refined through the training of an artificial neural network. Bayesian inference is used to determine the parameters of the neural network so that statistical uncertainties are provided for all model predictions. Results: We observe a significant improvement in the Bayesian neural network (BNN) predictions relative to the corresponding "bare" models when compared to the nearly 50 new masses reported in the AME2016 compilation. Further, AME2016 estimates for the handful of impactful isotopes in the determination of r-process abundances are found to be in fairly good agreement with our theoretical predictions. Indeed, the BNN-improved Duflo-Zuker model predicts a root-mean-square deviation relative to experiment of σrms≃400 keV. Conclusions: Given the excellent performance of the BNN refinement in confronting the recently published AME2016 compilation, we are confident of its critical role in our quest for mass models of the highest quality. Moreover, as uncertainty quantification is at the core of the BNN approach, the improved mass models are in a unique position to identify those nuclei that will have the strongest impact in resolving some of the outstanding questions in nuclear astrophysics.
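
    A schematic stand-in for the residual-refinement idea: train a network on the differences between measured masses and a bare model, and add the learned correction back. The paper uses a Bayesian neural network with uncertainty quantification; the ordinary small network and the synthetic data below are illustrative only.

```python
# Residual refinement of a "bare" mass model with a small neural network.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
ZN = rng.integers(20, 120, size=(400, 2)).astype(float)   # (Z, N) pairs
bare = 8.0 * (ZN[:, 0] + ZN[:, 1])                        # toy "bare" model (MeV)
true = bare + 2.0 * np.sin(ZN[:, 0] / 7.0)                # hidden structure
exp_mass = true + rng.normal(0.0, 0.2, len(ZN))           # "measured" masses

residual = exp_mass - bare                                 # what the net learns
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
net.fit(ZN, residual)

refined = bare + net.predict(ZN)                           # bare + correction
rms = np.sqrt(np.mean((refined - exp_mass) ** 2))
print(f"sigma_rms of refined model: {rms * 1000:.0f} keV")
```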

  19. Validation and calibration of structural models that combine information from multiple sources.

    Science.gov (United States)

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A

    2017-02-01

    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
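
    A minimal sketch of "calibration viewed as estimation" as described above: parameters are chosen to minimize the discrepancy between model outputs and calibration targets. The toy structural model and the targets are invented for illustration.

```python
# Calibration as estimation: fit parameters to reproduce calibration targets.
from scipy.optimize import minimize

targets = {"prevalence_2010": 0.12, "mortality_2010": 0.03}  # hypothetical data

def model_outputs(theta):
    infection_rate, case_fatality = theta
    prevalence = infection_rate / (infection_rate + 0.5)     # toy structural model
    mortality = prevalence * case_fatality
    return {"prevalence_2010": prevalence, "mortality_2010": mortality}

def objective(theta):
    out = model_outputs(theta)
    return sum((out[k] - v) ** 2 for k, v in targets.items())

fit = minimize(objective, x0=[0.1, 0.1], bounds=[(0, 1), (0, 1)])
print("calibrated parameters:", fit.x)
```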

  20. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, have been presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour was finally obtained by optimisation techniques. This example of application shows how analysis of the model parameter space is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residual analysis techniques. (author)

  1. Cylindrical magnetization model for glass-coated microwires with circumferential anisotropy: Comparison with experiments and skin effect

    Energy Technology Data Exchange (ETDEWEB)

    Torrejon, J., E-mail: torrejondiaz.jacob@nims.go.jp [Laboratoire de Physique des Solides, Univ. Paris-Sud, CNRS UMR 8502, 91405 Orsay (France); Instituto de Ciencia de Materiales, CSIC, 28049 Madrid (Spain); Thiaville, A. [Laboratoire de Physique des Solides, Univ. Paris-Sud, CNRS UMR 8502, 91405 Orsay (France); Adenot-Engelvin, A.-L. [CEA, DAM, Le Ripault, 37260 Monts (France); Vazquez, M. [Instituto de Ciencia de Materiales, CSIC, 28049 Madrid (Spain)

    2014-05-01

    The present manuscript represents the third part of a series of studies about a continuous micromagnetic model for amorphous microwires with non-uniform magnetic structure (Torrejon et al., J. Magn. Magn. Mater. 323 (2011) 283; Torrejon et al., J. Magn. Magn. Mater. 333 (2013) 144). Here we compare the predictions of this model with experiments, and show the validity of this approach when a uniform magnetic structure in the microwire cannot be considered. The analyzed microwires exhibit ultrasoft magnetic behaviour and negative magnetostriction, with a non-uniform magnetic structure composed of an axially magnetized inner core exchange-coupled with a circumferentially magnetized outer shell. The static properties were obtained by magnetometry. The high frequency response, axial permeability, was measured from a conventional single coil permeameter connected to a network analyzer. The microwave response is strongly affected by skin effect, which therefore needs to be taken into account for comparison with theory. The validity of the continuous model is proved through the experimental dependence of the permeability on axial static field. Finally, the efficient dynamic magnetization is evaluated from the imaginary component of permeability. - Highlights: • We model magnetic properties of microwires with circumferential anisotropy. • Skin effect correction has to be considered for small microwires. • Validity of model is proved by permeability dependence on axial static field. • Wires with small volume of the core can be well described by macrospin approach. • The exchange-coupled continuous core-shell model is compared to experiments.
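
    The skin-effect correction mentioned above hinges on the classical skin depth becoming comparable to the wire radius at microwave frequencies. A minimal sketch of that estimate follows; the material constants are generic amorphous-alloy values, not those of the wires studied.

```python
# Classical skin depth: delta = sqrt(2*rho / (omega * mu0 * mu_r)).
import numpy as np

def skin_depth(freq_hz, resistivity, mu_r):
    mu0 = 4e-7 * np.pi
    omega = 2.0 * np.pi * freq_hz
    return np.sqrt(2.0 * resistivity / (omega * mu0 * mu_r))

for f in (1e6, 1e8, 1e10):                                  # 1 MHz .. 10 GHz
    d = skin_depth(f, resistivity=1.3e-6, mu_r=1000.0)      # generic amorphous alloy
    print(f"f={f:.0e} Hz  skin depth={d * 1e6:.2f} um")
```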

  2. Influence of delayed neutron parameter calculation accuracy on results of modeled WWER scram experiments

    International Nuclear Information System (INIS)

    Artemov, V.G.; Gusev, V.I.; Zinatullin, R.E.; Karpov, A.S.

    2007-01-01

    Using modeled WWER scram rod drop experiments performed at the Rostov NPP as an example, the influence of delayed neutron parameters on the modeling results was investigated. The delayed neutron parameter values were taken from both domestic and foreign nuclear databases. Numerical modeling was carried out on the basis of the SAPFIR_95&WWER program package. Parameters of delayed neutrons were acquired from the ENDF/B-VI and BNAB-78 validated data files. It was demonstrated that using delayed neutron fraction data from different databases in reactivity meters leads to significantly different reactivity results. Based on the results of the numerically modeled experiments, delayed neutron parameters providing the best agreement between calculated and measured data were selected and recommended for use in reactor calculations (Authors)
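
    The sensitivity to delayed neutron data arises because a reactivity meter solves the inverse point-kinetics equations, in which the group fractions β_i and decay constants λ_i enter directly. A minimal sketch with illustrative six-group constants and an artificial flux trace (not the WWER data of the paper):

```python
# Inverse point kinetics: infer reactivity from a measured flux trace n(t).
import numpy as np

beta_i   = np.array([2.3e-4, 1.2e-3, 1.2e-3, 2.6e-3, 8.2e-4, 1.7e-4])
lambda_i = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])   # 1/s
beta, Lam = beta_i.sum(), 2.0e-5                 # Lambda: prompt generation time (s)

dt = 0.001
t = np.arange(0.0, 5.0, dt)
n = np.where(t < 1.0, 1.0, np.exp(-0.5 * (t - 1.0)))   # toy flux drop after scram

c = beta_i / (Lam * lambda_i) * n[0]             # equilibrium precursor densities
rho = np.empty_like(n)
for k in range(len(t)):
    dndt = (n[min(k + 1, len(t) - 1)] - n[max(k - 1, 0)]) / (2 * dt)
    # rho = beta + Lam*(dn/dt)/n - (Lam/n) * sum(lambda_i * c_i)
    rho[k] = Lam / n[k] * dndt + beta - Lam / n[k] * (lambda_i * c).sum()
    c += (beta_i * n[k] / Lam - lambda_i * c) * dt   # advance precursors

print(f"inferred reactivity at t=3 s: {rho[t.searchsorted(3.0)] / beta:.2f} $")
```

    Because the inferred reactivity in dollars divides by the total delayed fraction, swapping in a different library's beta_i values shifts the reading directly, which is the effect the abstract describes.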

  3. Validation analysis of probabilistic models of dietary exposure to food additives.

    Science.gov (United States)

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent brands or the per cent eating occasions within a food group that contained an additive. Since the three model components assumed two possible modes of input, the validity of eight (2³) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of full conceptual models. While the distribution of intake estimates from models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty.
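
    A minimal sketch of the model structure described above: per-occasion intake is the product of food intake, the probability that the additive is present, and the additive concentration, with the continuous inputs drawn from lognormal distributions. All parameter values are illustrative.

```python
# Probabilistic (Monte Carlo) food additive intake model.
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

food_g   = rng.lognormal(mean=4.0, sigma=0.5, size=n_sim)   # g food eaten
present  = rng.random(n_sim) < 0.30                          # P(additive present)
conc_ppm = rng.lognormal(mean=3.0, sigma=0.7, size=n_sim)    # mg additive / kg food

intake_mg = food_g / 1000.0 * present * conc_ppm             # mg per occasion
print(f"mean intake : {intake_mg.mean():.3f} mg")
print(f"P97.5 intake: {np.percentile(intake_mg, 97.5):.3f} mg")
```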

  4. First approximations in avalanche model validations using seismic information

    Science.gov (United States)

    Roig Lafon, Pere; Suriñach, Emma; Bartelt, Perry; Pérez-Guillén, Cristina; Tapia, Mar; Sovilla, Betty

    2017-04-01

    Avalanche dynamics modelling is an essential tool for snow hazard management. Scenario-based numerical modelling provides quantitative arguments for decision-making. The software tool RAMMS (WSL Institute for Snow and Avalanche Research SLF) is one such tool, often used by government authorities and geotechnical offices. As avalanche models improve, the quality of the numerical results will depend increasingly on user experience in the specification of input (e.g. release and entrainment volumes, secondary releases, snow temperature and quality). New model developments must continue to be validated against data from real events, to improve performance and reliability. The avalanche group of the University of Barcelona (RISKNAT - UB) has studied the seismic signals generated by avalanches since 1994. Presently, the group manages the seismic installation at SLF's Vallée de la Sionne experimental site (VDLS). At VDLS the recorded seismic signals can be correlated with other avalanche measurement techniques, including both advanced remote sensing methods (radars, videogrammetry) and obstacle-based sensors (pressure, capacitance, optical sender-reflector barriers). This comparison between different measurement techniques allows the group to address the question of whether seismic analysis can be used alone, on additional avalanche tracks, to gain insight into and validate numerical avalanche dynamics models in different terrain conditions. In this study, we aim to add the seismic data as an external record of the phenomena, able to validate RAMMS models. The seismic sensors are considerably easier and cheaper to install than other physical measuring tools, and are able to record data from the phenomena in all atmospheric conditions (e.g. bad weather, low light or freezing, which make photography and other kinds of sensors unusable). With seismic signals, we record the temporal evolution of the inner and denser parts of the avalanche. We are able to recognize the approximate position

  5. Modelling and validation of Proton exchange membrane fuel cell (PEMFC)

    Science.gov (United States)

    Mohiuddin, A. K. M.; Basran, N.; Khan, A. A.

    2018-01-01

    This paper is the outcome of a small-scale fuel cell project. A fuel cell is an electrochemical device that converts energy from a chemical reaction into electrical work. The Proton Exchange Membrane Fuel Cell (PEMFC) is one type of fuel cell that is comparatively efficient; its low operating temperature and fast start-up capability result in a high energy density. In this study, a mathematical model of a 1.2 W PEMFC is developed and simulated using MATLAB software. The model describes the PEMFC behaviour under steady-state conditions. The mathematical modeling of the PEMFC determines the polarization curve, the power generated, and the efficiency of the fuel cell. Simulation results were validated by comparison with experimental results obtained from testing a single PEMFC with a 3 V motor. The performance of the experimental PEMFC is slightly lower than that of the simulated PEMFC; however, both results are in good agreement. Experiments on the hydrogen flow rate have also been conducted to obtain the amount of hydrogen consumed to produce electrical work in the PEMFC.
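
    A minimal sketch of the kind of steady-state polarization-curve relation such models evaluate: cell voltage equals the open-circuit value minus activation, ohmic, and concentration losses. The coefficients below are generic textbook-style values, not the ones fitted in the paper.

```python
# Empirical PEMFC polarization curve: V(i) = E0 - activation - ohmic - concentration.
import numpy as np

def cell_voltage(i):
    """Cell voltage for current density i in mA/cm^2 (illustrative constants)."""
    E0, A, i0 = 1.031, 0.06, 0.04      # open-circuit V, Tafel slope, exchange c.d.
    r, m, k = 2.45e-4, 2.11e-5, 8.0e-3 # ohmic and concentration-loss coefficients
    i = np.maximum(i, 1e-9)
    return E0 - A * np.log(i / i0) - r * i - m * np.exp(k * i)

for i in (10.0, 100.0, 500.0, 1000.0):          # mA/cm^2
    print(f"i={i:7.1f} mA/cm2  V={cell_voltage(i):.3f} V")
```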

  6. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano & Primiceri (American Economic Review, forthcoming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation in these models is called into question.
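
    A minimal illustration of why trends can invalidate the micro foundation, for the simplest case of CRRA utility with deterministic trend growth in consumption (this special case is my own; the paper's sufficient conditions are more general):

```latex
% Toy special case: CRRA period utility, consumption growing at gross rate gamma.
% Lifetime utility is then a geometric series, finite iff beta*gamma^(1-sigma) < 1.
\[
  \sum_{t=0}^{\infty} \beta^{t}\,\frac{c_t^{\,1-\sigma}}{1-\sigma}
  \;=\; \frac{c_0^{\,1-\sigma}}{1-\sigma}
        \sum_{t=0}^{\infty} \bigl(\beta\,\gamma^{\,1-\sigma}\bigr)^{t},
  \qquad c_t = \gamma^{t} c_0,
\]
\[
  \text{which is finite if and only if}\quad \beta\,\gamma^{\,1-\sigma} < 1 .
\]
```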

  7. Cultural consensus modeling to measure transactional sex in Swaziland: Scale building and validation.

    Science.gov (United States)

    Fielding-Miller, Rebecca; Dunkle, Kristin L; Cooper, Hannah L F; Windle, Michael; Hadley, Craig

    2016-01-01

    Transactional sex is associated with increased risk of HIV and gender-based violence in southern Africa and around the world. However, the typical quantitative operationalization, "the exchange of gifts or money for sex," can be at odds with a wide array of relationship types and motivations described in qualitative explorations. To build on the strengths of both qualitative and quantitative research streams, we used cultural consensus models to identify distinct models of transactional sex in Swaziland. The process allowed us to build and validate emic scales of transactional sex, while identifying key informants for qualitative interviews within each model to contextualize women's experiences and risk perceptions. We used logistic and multinomial logistic regression models to measure associations with condom use and social status outcomes. Fieldwork was conducted between November 2013 and December 2014 in the Hhohho and Manzini regions. We identified three distinct models of transactional sex in Swaziland based on 124 Swazi women's emic valuation of what they hoped to receive in exchange for sex with their partners. In a clinic-based survey (n = 406), consensus model scales were more sensitive to condom use than the etic definition. Model consonance had distinct effects on social status for the three different models. Transactional sex is better measured as an emic spectrum of expectations within a relationship, rather than an etic binary relationship type. Cultural consensus models allowed us to blend qualitative and quantitative approaches to create an emically valid quantitative scale grounded in qualitative context. Copyright © 2015 Elsevier Ltd. All rights reserved.
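
    A minimal sketch of the standard cultural consensus check underlying such models: factor the informant-by-informant agreement matrix and require the first eigenvalue to dominate the second (a ratio of roughly 3:1 is the usual rule of thumb for a single shared model). The informant responses below are synthetic.

```python
# Cultural consensus analysis: eigenvalue-ratio check on an agreement matrix.
import numpy as np

rng = np.random.default_rng(7)
n_informants, n_items = 30, 40
truth = rng.integers(0, 2, n_items)                 # shared "culturally correct" answers
competence = rng.uniform(0.5, 0.9, n_informants)
answers = np.array([np.where(rng.random(n_items) < c, truth,
                             rng.integers(0, 2, n_items))
                    for c in competence])

# Match-based agreement between informants, corrected for guessing (2 options).
match = (answers[:, None, :] == answers[None, :, :]).mean(axis=2)
agree = 2.0 * match - 1.0
np.fill_diagonal(agree, 1.0)

eigvals = np.sort(np.linalg.eigvalsh(agree))[::-1]
print(f"eigenvalue ratio lambda1/lambda2 = {eigvals[0] / eigvals[1]:.1f}")
```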

  8. Benchmark validation of statistical models: Application to mediation analysis of imagery and memory.

    Science.gov (United States)

    MacKinnon, David P; Valente, Matthew J; Wurpts, Ingrid C

    2018-03-29

    This article describes benchmark validation, an approach to validating a statistical model. According to benchmark validation, a valid model generates estimates and research conclusions consistent with a known substantive effect. Three types of benchmark validation-(a) benchmark value, (b) benchmark estimate, and (c) benchmark effect-are described and illustrated with examples. Benchmark validation methods are especially useful for statistical models with assumptions that are untestable or very difficult to test. Benchmark effect validation methods were applied to evaluate statistical mediation analysis in eight studies using the established effect that increasing mental imagery improves recall of words. Statistical mediation analysis led to conclusions about mediation that were consistent with established theory that increased imagery leads to increased word recall. Benchmark validation based on established substantive theory is discussed as a general way to investigate characteristics of statistical models and a complement to mathematical proof and statistical simulation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
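
    A minimal sketch of the benchmark effect evaluated above, cast as a single-mediator model X (imagery instruction) → M (imagery use) → Y (recall), with the indirect effect estimated as the product of the two regression paths. The data are simulated so the benchmark conclusion (a positive indirect effect) is known to hold.

```python
# Mediation analysis: indirect effect a*b from two least-squares regressions.
import numpy as np

rng = np.random.default_rng(3)
n = 500
x = rng.integers(0, 2, n).astype(float)        # imagery vs. control condition
m = 0.8 * x + rng.normal(0, 1, n)              # mediator: imagery use
y = 0.6 * m + 0.1 * x + rng.normal(0, 1, n)    # outcome: words recalled (std.)

def ols(X, y):
    X = np.column_stack([np.ones(len(y))] + list(X))
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols([x], m)[1]              # X -> M path
b = ols([x, m], y)[2]           # M -> Y path, adjusting for X
print(f"a={a:.3f}  b={b:.3f}  indirect effect a*b={a * b:.3f}")
```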

  9. Validation of elastic cross section models for space radiation applications

    Energy Technology Data Exchange (ETDEWEB)

    Werneth, C.M., E-mail: charles.m.werneth@nasa.gov [NASA Langley Research Center (United States); Xu, X. [National Institute of Aerospace (United States); Norman, R.B. [NASA Langley Research Center (United States); Ford, W.P. [The University of Tennessee (United States); Maung, K.M. [The University of Southern Mississippi (United States)

    2017-02-01

    The space radiation field is composed of energetic particles that pose both acute and long-term risks for astronauts in low earth orbit and beyond. In order to estimate radiation risk to crew members, the fluence of particles and biological response to the radiation must be known at tissue sites. Given that the spectral fluence at the boundary of the shielding material is characterized, radiation transport algorithms may be used to find the fluence of particles inside the shield and body, and the radio-biological response is estimated from experiments and models. The fidelity of the radiation spectrum inside the shield and body depends on radiation transport algorithms and the accuracy of the nuclear cross sections. In a recent study, self-consistent nuclear models based on multiple scattering theory that include the option to study relativistic kinematics were developed for the prediction of nuclear cross sections for space radiation applications. The aim of the current work is to use uncertainty quantification to ascertain the validity of the models as compared to a nuclear reaction database and to identify components of the models that can be improved in future efforts.

  10. Cross-validation of an employee safety climate model in Malaysia.

    Science.gov (United States)

    Bahari, Siti Fatimah; Clarke, Sharon

    2013-06-01

    Whilst substantial research has investigated the nature of safety climate, and its importance as a leading indicator of organisational safety, much of this research has been conducted with Western industrial samples. The current study focuses on the cross-validation of a safety climate model in the non-Western industrial context of Malaysian manufacturing. The first-order factorial validity of Cheyne et al.'s (1998) [Cheyne, A., Cox, S., Oliver, A., Tomas, J.M., 1998. Modelling safety climate in the prediction of levels of safety activity. Work and Stress, 12(3), 255-271] model was tested, using confirmatory factor analysis, in a Malaysian sample. Results showed that the model fit indices were below accepted levels, indicating that the original Cheyne et al. (1998) safety climate model was not supported. An alternative three-factor model was developed using exploratory factor analysis. Although these findings are not consistent with previously reported cross-validation studies, we argue that previous studies have focused on validation across Western samples, and that the current study demonstrates the need to take account of cultural factors in the development of safety climate models intended for use in non-Western contexts. The results have important implications for the transferability of existing safety climate models across cultures (for example, in global organisations) and highlight the need for future research to examine cross-cultural issues in relation to safety climate. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.

  11. Development and validation of a model for CANDU-6 SDS2 poison injection analysis

    International Nuclear Information System (INIS)

    Lee, B. W.; Jung, C. J.; Min, B. J.; Yoon, H. J.; Choi, J. H.; Jang, D. S.

    2002-01-01

    In the CANDU-6 reactor there are two independent reactor shutdown systems. Shutdown system no. 2 (SDS2) injects liquid poison into the moderator tank at high pressure, via small holes in the six nozzle pipes, to stop the nuclear chain reaction. To ensure the safe shutdown of a reactor loaded with either DUPIC or SEU fuels, the poison curtains generated by the jets must provide quick and sufficient negative reactivity during the early stage of the accident. To produce the neutron cross sections needed for this work, the poison concentration distribution during the transient is required. The motivation for this work arose from the fact that the computer code package for performing this task has not yet been transferred to Korea. In this study, a set of models for analyzing the transient poison concentration induced by the high-pressure injection jets activated upon reactor trip in a CANDU-6 moderator tank has been developed and used to generate the concentration distribution of the poison curtains injected into the vacant region between the pressure tube banks. The poison injection rate through the jet holes drilled in the nozzle pipes is obtained from a 1-D transient hydrodynamic code, ALITRIG, and this injection rate provides the inlet boundary condition to a 3-D CFD model of the moderator tank based on CFX4.3, a commercial CFD code developed by AEA Technology, to simulate the formation of the poison jet curtain inside the moderator tank. For validation, a generic CANDU-6 SDS2 poison jet growth experiment was simulated to evaluate the model's capability against experiment. As no concentration field was measured and only the growth of the poison jet height was recorded by high-speed camera, the validation was limited in scope. The results showed that, if the jet front is assumed to correspond to 200 ppm of poison, the model succeeds in reproducing the observed jet growth.

  12. Development and validation of a critical gradient energetic particle driven Alfven eigenmode transport model for DIII-D tilted neutral beam experiments

    Science.gov (United States)

    Waltz, R. E.; Bass, E. M.; Heidbrink, W. W.; VanZeeland, M. A.

    2015-11-01

    Recent experiments with the DIII-D tilted neutral beam injection (NBI) varying the beam energetic particle (EP) source profiles have provided strong evidence that unstable Alfven eigenmodes (AE) drive stiff EP transport at a critical EP density gradient [Heidbrink et al 2013 Nucl. Fusion 53 093006]. Here the critical gradient is identified by the local AE growth rate being equal to the local ITG/TEM growth rate at the same low toroidal mode number. The growth rates are taken from the gyrokinetic code GYRO. Simulations show that the slowing-down beam-like EP distribution has a slightly lower critical gradient than the Maxwellian. The ALPHA EP density transport code [Waltz and Bass 2014 Nucl. Fusion 54 104006], used to validate the model, combines the low-n stiff EP critical density gradient AE mid-core transport with the Angioni et al (2009 Nucl. Fusion 49 055013) energy-independent high-n ITG/TEM density transport model controlling the central core EP density profile. For the on-axis NBI heated DIII-D shot 146102, while the net loss to the edge is small, about half the birth fast ions are transported from the central core r/a < 0.5 and the central density is about half the slowing-down density. These results are in good agreement with experimental fast ion pressure profiles inferred from MSE-constrained EFIT equilibria.
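
    The critical-gradient concept can be caricatured in a few lines: below the critical density gradient the EP diffusivity is small, above it the transport becomes stiff, and the profile relaxes to hover near the critical gradient. The toy model below is a generic illustration of that mechanism with invented parameters; it is not the ALPHA code or a GYRO calculation.

```python
import numpy as np

nr = 100
r = np.linspace(0.0, 1.0, nr)            # normalized radius r/a
dr = r[1] - r[0]
n_ep = np.exp(-4.0 * r**2)               # initial EP density (arbitrary units)
source = 2.0 * np.exp(-8.0 * r**2)       # peaked on-axis beam source
grad_crit, D0, D_stiff = 2.0, 0.01, 1.0  # critical gradient, base/stiff diffusivity
dt = 2.0e-5                              # explicit time step chosen for stability

for _ in range(50000):
    grad = np.gradient(n_ep, dr)
    D = np.where(np.abs(grad) > grad_crit, D_stiff, D0)  # stiff above critical
    flux = -D * grad
    n_ep += dt * (source - np.gradient(flux, dr))        # continuity equation
    n_ep[-1] = 0.0                                       # fixed edge density

# the mid-core gradient pins near grad_crit, mimicking stiff EP transport
print(f"relaxed central EP density: {n_ep[0]:.2f}")
```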

  13. SCALE Validation Experience Using an Expanded Isotopic Assay Database for Spent Nuclear Fuel

    International Nuclear Information System (INIS)

    Gauld, Ian C.; Radulescu, Georgeta; Ilas, Germina

    2009-01-01

    The availability of measured isotopic assay data to validate computer code predictions of spent fuel compositions applied in burnup-credit criticality calculations is an essential component for bias and uncertainty determination in safety and licensing analyses. In recent years, as many countries move closer to implementing or expanding the use of burnup credit in criticality safety for licensing, there has been growing interest in acquiring additional high-quality assay data. The well-known open sources of assay data are viewed as potentially limiting for validating depletion calculations for burnup credit due to the relatively small number of isotopes measured (primarily actinides with relatively few fission products), sometimes large measurement uncertainties, incomplete documentation, and the limited burnup and enrichment range of the fuel samples. Oak Ridge National Laboratory (ORNL) recently initiated an extensive isotopic validation study that includes most of the public data archived in the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) electronic database, SFCOMPO, and new datasets obtained through participation in commercial experimental programs. To date, ORNL has analyzed approximately 120 different spent fuel samples from pressurized-water reactors that span a wide enrichment and burnup range and represent a broad class of assembly designs. The validation studies, completed using SCALE 5.1, are being used to support a technical basis for expanded implementation of burnup credit for spent fuel storage facilities, and other spent fuel analyses including radiation source term, dose assessment, decay heat, and waste repository safety analyses. This paper summarizes the isotopic assay data selected for this study, presents validation results obtained with SCALE 5.1, and discusses some of the challenges and experience associated with evaluating the results. Preliminary results obtained using SCALE 6 and ENDF/B-VII data are also discussed.
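
    A sketch of the basic statistic such isotopic validation studies produce: measured-to-calculated (M/C) ratios for one nuclide across several fuel samples are reduced to a mean bias and an uncertainty that burnup-credit analyses can then apply. The M/C values below are placeholders, not SCALE results.

```python
import numpy as np

# hypothetical M/C ratios for one nuclide across several assay samples
m_over_c = np.array([0.98, 1.01, 0.97, 1.03, 0.99, 1.02, 0.96, 1.00])

bias = m_over_c.mean() - 1.0   # relative bias of the depletion calculation
unc = m_over_c.std(ddof=1)     # sample standard deviation (1 sigma)
print(f"bias = {100 * bias:+.1f}%, uncertainty = {100 * unc:.1f}%")
# burnup-credit applications apply such bias/uncertainty pairs, nuclide by
# nuclide, to penalize calculated spent-fuel compositions conservatively
```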

  14. Experiment of Laser Pointing Stability on Different Surfaces to validate Micrometric Positioning Sensor

    CERN Document Server

    AUTHOR|(SzGeCERN)721924; Mainaud Durand, Helene; Piedigrossi, Didier; Sandomierski, Jacek; Sosin, Mateusz; Geiger, Alain; Guillaume, Sebastien

    2014-01-01

    CLIC requires 10 μm precision and accuracy over 200 m for the pre-alignment of beam-related components. A solution based on a laser beam as a straight-line reference is being studied at CERN. It involves camera/shutter assemblies as micrometric positioning sensors. To validate the sensors, it is necessary to determine an appropriate material for the shutter in terms of laser pointing stability. Experiments were carried out with paper, metal and ceramic surfaces. This paper presents the standard deviations of the laser spot coordinates obtained on the different surfaces, as well as the measurement error. The experiments validate the choice of paper and ceramic for the shutter of the micrometric positioning sensor, and provide an estimate of the achievable precision and accuracy of the determination of the laser spot centre with respect to the shutter coordinate system defined by reference targets.

  15. Characterization and validation of an in silico toxicology model to predict the mutagenic potential of drug impurities*

    Energy Technology Data Exchange (ETDEWEB)

    Valerio, Luis G., E-mail: luis.valerio@fda.hhs.gov [Science and Research Staff, Office of Pharmaceutical Science, Center for Drug Evaluation and Research, U.S. Food and Drug Administration, 10903 New Hampshire Avenue, Silver Spring, MD 20993–0002 (United States); Cross, Kevin P. [Leadscope, Inc., 1393 Dublin Road, Columbus, OH, 43215–1084 (United States)

    2012-05-01

    Control and minimization of human exposure to potential genotoxic impurities found in drug substances and products is an important part of preclinical safety assessments of new drug products. The FDA's 2008 draft guidance on genotoxic and carcinogenic impurities in drug substances and products allows use of computational quantitative structure–activity relationships (QSAR) to identify structural alerts for known and expected impurities present at levels below qualified thresholds. This study provides the information necessary to establish the practical use of a new in silico toxicology model for predicting Salmonella t. mutagenicity (Ames assay outcome) of drug impurities and other chemicals. We describe the model's chemical content and toxicity fingerprint in terms of compound space, molecular and structural toxicophores, and have rigorously tested its predictive power using both cross-validation and external validation experiments, as well as case studies. Consistent with desired regulatory use, the model performs with high sensitivity (81%) and high negative predictivity (81%) based on external validation with 2368 compounds foreign to the model and having known mutagenicity. A database of drug impurities was created from proprietary FDA submissions and the public literature which found significant overlap between the structural features of drug impurities and training set chemicals in the QSAR model. Overall, the model's predictive performance was found to be acceptable for screening drug impurities for Salmonella mutagenicity. -- Highlights: ► We characterize a new in silico model to predict mutagenicity of drug impurities. ► The model predicts Salmonella mutagenicity and will be useful for safety assessment. ► We examine toxicity fingerprints and toxicophores of this Ames assay model. ► We compare these attributes to those found in drug impurities known to FDA/CDER. ► We validate the model and find it has a desired predictive performance.

  16. Characterization and validation of an in silico toxicology model to predict the mutagenic potential of drug impurities*

    International Nuclear Information System (INIS)

    Valerio, Luis G.; Cross, Kevin P.

    2012-01-01

    Control and minimization of human exposure to potential genotoxic impurities found in drug substances and products is an important part of preclinical safety assessments of new drug products. The FDA's 2008 draft guidance on genotoxic and carcinogenic impurities in drug substances and products allows use of computational quantitative structure–activity relationships (QSAR) to identify structural alerts for known and expected impurities present at levels below qualified thresholds. This study provides the information necessary to establish the practical use of a new in silico toxicology model for predicting Salmonella t. mutagenicity (Ames assay outcome) of drug impurities and other chemicals. We describe the model's chemical content and toxicity fingerprint in terms of compound space, molecular and structural toxicophores, and have rigorously tested its predictive power using both cross-validation and external validation experiments, as well as case studies. Consistent with desired regulatory use, the model performs with high sensitivity (81%) and high negative predictivity (81%) based on external validation with 2368 compounds foreign to the model and having known mutagenicity. A database of drug impurities was created from proprietary FDA submissions and the public literature which found significant overlap between the structural features of drug impurities and training set chemicals in the QSAR model. Overall, the model's predictive performance was found to be acceptable for screening drug impurities for Salmonella mutagenicity. -- Highlights: ► We characterize a new in silico model to predict mutagenicity of drug impurities. ► The model predicts Salmonella mutagenicity and will be useful for safety assessment. ► We examine toxicity fingerprints and toxicophores of this Ames assay model. ► We compare these attributes to those found in drug impurities known to FDA/CDER. ► We validate the model and find it has a desired predictive performance.
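
    The quoted external-validation statistics follow directly from a confusion matrix. A small sketch with invented counts (chosen only for illustration, not taken from the 2368-compound set):

```python
def screening_stats(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)  # fraction of true mutagens flagged
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictivity
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = screening_stats(tp=810, fp=350, tn=1020, fn=188)
print(f"sensitivity={sens:.0%} specificity={spec:.0%} PPV={ppv:.0%} NPV={npv:.0%}")
# for regulatory screening, high sensitivity and high NPV matter most: few
# true mutagenic impurities are missed, and negative predictions can be trusted
```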

  17. The COSIMA-experiments, a data base for validation of two-phase flow computer codes

    International Nuclear Information System (INIS)

    Class, G.; Meyder, R.; Stratmanns, E.

    1985-12-01

    The report presents an overview of the large database generated with COSIMA. The database is to be used to validate and develop computer codes for two-phase flow. In terms of fuel rod behavior, it was found that during blowdown under realistic conditions only small strains are reached; clad rupture requires extremely high rod internal pressure. Important results were also found on the behavior of a fuel rod simulator and on the effect of thermocouples attached to the cladding outer surface. Post-test calculations performed with the codes RELAP and DRUFAN show good agreement with the experiments, which could however be improved if the phase separation models in the codes were updated. (orig./HP)

  18. Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences.

    Science.gov (United States)

    Harman, Elena; Azzam, Tarek

    2018-02-01

    This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern-matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants is asked to read an interview transcript, identify whether program theory components (Activities and Outcomes) are discussed, and highlight the most relevant passage about each component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Statistical methods for mechanistic model validation: Salt Repository Project

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1988-07-01

    As part of the Department of Energy's Salt Repository Program, Pacific Northwest Laboratory (PNL) is studying the emplacement of nuclear waste containers in a salt repository. One objective of the SRP is to develop an overall waste package component model which adequately describes such phenomena as container corrosion, waste form leaching, spent fuel degradation, etc., which are possible in the salt repository environment. The form of this model will be proposed, based on scientific principles and relevant salt repository conditions, with supporting data. The model will be used to predict the future characteristics of the near-field environment. This involves several different submodels, such as the amount of time it takes a brine solution to contact a canister in the repository, how long it takes a canister to corrode and expose its contents to the brine, the leach rate of the contents of the canister, etc. These submodels are often tested in a laboratory and should be statistically validated (in this context, to validate means to demonstrate that the model adequately describes the data) before they can be incorporated into the waste package component model. This report describes statistical methods for validating these models. 13 refs., 1 fig., 3 tabs

  20. Using plot experiments to test the validity of mass balance models employed to estimate soil redistribution rates from 137Cs and 210Pb(ex) measurements.

    Science.gov (United States)

    Porto, Paolo; Walling, Des E

    2012-10-01

    Information on rates of soil loss from agricultural land is a key requirement for assessing both on-site soil degradation and potential off-site sediment problems. Many models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution as a function of the local topography, hydrometeorology, soil type and land management, but empirical data remain essential for validating and calibrating such models and prediction procedures. Direct measurements using erosion plots are, however, costly, and the results obtained relate to a small enclosed area which may not be representative of the wider landscape. In recent years, the use of fallout radionuclides, particularly caesium-137 ((137)Cs) and excess lead-210 ((210)Pb(ex)), has been shown to provide a very effective means of documenting rates of soil loss and soil and sediment redistribution in the landscape. Several of the assumptions associated with the theoretical conversion models used with such measurements remain essentially unvalidated. This contribution describes the results of a measurement programme involving five experimental plots located in southern Italy, aimed at validating several of the basic assumptions commonly associated with the use of mass balance models for estimating rates of soil redistribution on cultivated land from (137)Cs and (210)Pb(ex) measurements. Overall, the results confirm the general validity of these assumptions and the importance of taking account of the fate of fresh fallout. However, further work is required to validate the conversion models employed when using fallout radionuclide measurements to document soil redistribution in the landscape. Such work could usefully direct attention to different environments, to the validation of the final estimates of soil redistribution rate, and to the assumptions of the models employed. Copyright © 2012 Elsevier Ltd. All rights reserved.
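
    A minimal sketch of the kind of mass balance such conversion models implement for cultivated soil: the (137)Cs inventory of the plough layer gains fallout, loses activity to decay, and loses a fraction to erosion each year. All parameter values are hypothetical and the deposition history is deliberately crude; the point is the structure being validated, not the plot results.

```python
import numpy as np

lam = np.log(2) / 30.17  # 137Cs decay constant (1/yr), half-life 30.17 yr
d = 250.0                # plough-layer mass depth (kg/m^2)
R = 2.5                  # candidate soil loss rate to test (kg/m^2/yr)
years = np.arange(1954, 2013)
fallout = np.where(years < 1964, 100.0, 5.0)  # crude deposition (Bq/m^2/yr)

A = 0.0                  # 137Cs areal activity (Bq/m^2)
for dep in fallout:
    A += dep             # fresh fallout assumed fully mixed into plough layer
    A -= (lam + R / d) * A   # decay plus erosional removal, yearly step

print(f"predicted 2012 inventory for R={R} kg/m2/yr: {A:.0f} Bq/m2")
# comparing such predictions with measured plot inventories tests the mass
# balance assumptions (e.g., the fate of fresh fallout) discussed above
```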

  1. The cross-cultural validity of the Caregiving Experiences Questionnaire (CEQ) among Danish mothers with preschool children

    DEFF Research Database (Denmark)

    Røhder, Katrine; George, Carol; Brennan, Jessica

    2018-01-01

    The present study explored the Danish cross-cultural validity of the Caregiving Experiences Questionnaire (CEQ), a new measure of caregiving representations in parent-child relationships. Low-risk Danish mothers (N = 159) with children aged 1.5–5 years completed the CEQ and predictive validity...

  2. Validation and sensitivity tests on improved parametrizations of a land surface process model (LSPM) in the Po Valley

    International Nuclear Information System (INIS)

    Cassardo, C.; Carena, E.; Longhetto, A.

    1998-01-01

    The Land Surface Process Model (LSPM) has been improved with respect to the first version of 1994. The modifications involve the parametrizations of the radiation terms and of the turbulent heat fluxes; a parametrization of runoff has also been developed in order to close the hydrologic balance. This second version of LSPM has been validated against experimental data gathered at Mottarone (Verbania, Northern Italy) during a field experiment. The results of this validation show that the new version is able to apportion the energy into sensible and latent heat fluxes. LSPM has also been subjected to a series of sensitivity tests investigating the hydrological part of the model; the physical quantities selected for these sensitivity experiments were the initial soil moisture content and the rainfall intensity. In each experiment, the model was forced with observations from the synoptic station of San Pietro Capofiume (Po Valley, Italy), and the observed characteristics of soil and vegetation (not involved in the sensitivity tests) were used as initial and boundary conditions. The simulations show that LSPM reproduces the energy, heat and water budgets well as the selected parameters vary. A careful analysis of the LSPM output also shows the importance of identifying the effective soil type.

  3. Predicting the ungauged basin: Model validation and realism assessment

    Directory of Open Access Journals (Sweden)

    Tim van Emmerik

    2015-10-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to the limited number of published studies on genuinely ungauged basins, model validation and realism assessment of model outcomes have not been discussed to a great extent. With this paper we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics and revealed the different hydrological characteristics of the two topographical classes. This paper does not present a generic approach that can be transferred to other ungauged catchments, but it aims to show how clever model design and alternative data acquisition can result in a valuable hydrological model for an ungauged catchment.

  4. Validating firn compaction model with remote sensing data

    DEFF Research Database (Denmark)

    Simonsen, S. B.; Stenseng, Lars; Sørensen, Louise Sandberg

    A comprehensive understanding of firn processes is of utmost importance when estimating present and future changes of the Greenland Ice Sheet. Especially when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction ... models have been shown to be a key component. Now, remote sensing data can also be used to validate the firn models. Radar penetrating the upper part of the firn column in the interior part of Greenland shows a clear layering. The observed layers from the radar data can be used as an in-situ validation ... correction relative to the changes in the elevation of the surface observed with remote sensing altimetry? What model time resolution is necessary to resolve the observed layering? What model refinements are necessary to give better estimates of the surface mass balance of the Greenland ice sheet from ...

  5. A clinical reasoning model focused on clients' behaviour change with reference to physiotherapists: its multiphase development and validation.

    Science.gov (United States)

    Elvén, Maria; Hochwälder, Jacek; Dean, Elizabeth; Söderlund, Anne

    2015-05-01

    A biopsychosocial approach and behaviour change strategies have long been proposed to serve as a basis for addressing current multifaceted health problems. This emphasis has implications for clinical reasoning of health professionals. This study's aim was to develop and validate a conceptual model to guide physiotherapists' clinical reasoning focused on clients' behaviour change. Phase 1 consisted of the exploration of existing research and the research team's experiences and knowledge. Phases 2a and 2b consisted of validation and refinement of the model based on input from physiotherapy students in two focus groups (n = 5 per group) and from experts in behavioural medicine (n = 9). Phase 1 generated theoretical and evidence bases for the first version of a model. Phases 2a and 2b established the validity and value of the model. The final model described clinical reasoning focused on clients' behaviour change as a cognitive, reflective, collaborative and iterative process with multiple interrelated levels that included input from the client and physiotherapist, a functional behavioural analysis of the activity-related target behaviour and the selection of strategies for behaviour change. This unique model, theory- and evidence-informed, has been developed to help physiotherapists to apply clinical reasoning systematically in the process of behaviour change with their clients.

  6. Thermal fluid-solid interaction model and experimental validation for hydrostatic mechanical face seals

    Science.gov (United States)

    Huang, Weifeng; Liao, Chuanjun; Liu, Xiangfeng; Suo, Shuangfu; Liu, Ying; Wang, Yuming

    2014-09-01

    Hydrostatic mechanical face seals for reactor coolant pumps are very important for the safety and reliability of pressurized-water reactor power plants. More accurate models of the operating mechanism of the seals are needed to help improve their performance. The thermal fluid-solid interaction (TFSI) mechanism of the hydrostatic seal is investigated in this study. Numerical models of the flow field and seal assembly are developed. Based on the continuity condition of the physical quantities at the fluid-solid interface, an on-line numerical TFSI model for the hydrostatic mechanical seal is proposed using an iterative coupling method. Dynamic mesh technology is adopted to adapt to the changing boundary shape. Experiments were performed on a test rig using a full-size test seal to obtain the leakage rate as a function of the differential pressure. The effectiveness and accuracy of the TFSI model were verified by comparing the simulation results with the experimental data. Using the TFSI model, the behavior of the seal is presented, including mechanical and thermal deformation and the temperature field. The influences of the rotating speed and the differential pressure of the sealing device on the temperature field, both of which vary widely in actual use of the seal, are studied. This research proposes an on-line, assembly-based TFSI model for hydrostatic mechanical face seals, and the model is validated by full-size experiments.
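
    The iterative coupling idea can be shown schematically: the fluid solve and the solid solve exchange interface data (film pressure out, deformed gap back) until the sealing gap stops changing, with under-relaxation for stability. The two "solvers" below are toy algebraic stand-ins, not CFD or FEM calls, and all constants are invented.

```python
def fluid_solver(gap_um):
    """Toy stand-in: film pressure rises as the sealing gap closes."""
    return 50.0 / gap_um                   # pressure (arbitrary units)

def solid_solver(pressure):
    """Toy stand-in: faces deform apart under film pressure and heating."""
    return 2.0 + 0.05 * pressure           # equilibrium gap (micrometres)

gap, relax = 3.0, 0.5                      # initial guess, under-relaxation factor
for it in range(100):
    p = fluid_solver(gap)                  # fluid step on current geometry
    gap_new = solid_solver(p)              # thermal/mechanical deformation step
    if abs(gap_new - gap) < 1e-6:
        break
    gap += relax * (gap_new - gap)         # relax the update to stabilize

print(f"converged after {it} iterations: gap = {gap:.3f} um, pressure = {p:.2f}")
```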

  7. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments

    Science.gov (United States)

    Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu

    2018-01-01

    Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases were established for experimentally validated lncRNAs. However, these databases are small in scale (with a few hundred lncRNAs only) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNIncRBase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, which is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared to all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. PMID:28985416

  8. Reactivity worth measurements on fast burst reactor Caliban - description and interpretation of integral experiments for the validation of nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Richard, B. [Commissariat a l' Energie Atomique et Aux Energies Alternatives CEA, DAM, VALDUC, F-21120 Is-sur-Tille (France)

    2012-07-01

    Reactivity perturbation experiments using various materials are being performed on the HEU fast core CALIBAN, an experimental device operated by the CEA VALDUC Criticality and Neutron Transport Research Laboratory. These experiments provide valuable information contributing to the validation of nuclear data for the materials used in such measurements. This paper presents the results obtained in a first series of measurements performed with Au-197 samples. Experiments conducted to improve the characterization of the core are also described and discussed. The experimental results have been compared to numerical calculations using both deterministic and Monte Carlo neutron transport codes with a simplified model of the reactor. This early work led to a methodology which will be applied to future experiments concerning other materials of interest. (authors)

  9. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering], E-mail: jvargas@demec.ufpr.br; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems], Emails: ordonez@caps.fsu.edu, martins@caps.fsu.edu

    2009-07-01

    In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels and for spatial temperature gradients in the flow direction, which are investigated by direct infrared imaging. The imaging shows that such gradients are present even at low-current operation and therefore should be considered by a PEMFC model, since large coolant flow rates are precluded by the high pressure drops they induce in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones, with good qualitative and quantitative agreement. The combination of accuracy and low computational time allows for the future use of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)
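
    Polarization-curve validation of this kind is often summarized with a semi-empirical voltage expression combining activation, ohmic and concentration losses. The sketch below fits such a textbook form to invented measurements; it is a generic illustration, not the specific model validated in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def cell_voltage(i, e0, b, r, m, n):
    """V(i) = E0 - b*log10(i) - R*i - m*exp(n*i), i in A/cm^2."""
    return e0 - b * np.log10(i) - r * i - m * np.exp(n * i)

# hypothetical measured polarization data: current density (A/cm^2), voltage (V)
i_meas = np.array([0.05, 0.10, 0.20, 0.40, 0.60, 0.80, 1.00])
v_meas = np.array([0.82, 0.78, 0.73, 0.67, 0.61, 0.54, 0.45])

popt, _ = curve_fit(cell_voltage, i_meas, v_meas,
                    p0=[0.9, 0.05, 0.15, 1e-3, 3.0],
                    bounds=([0.7, 0.0, 0.0, 0.0, 0.0],
                            [1.2, 0.2, 0.5, 0.1, 10.0]))
e0, b, r, m, n = popt
print(f"E0={e0:.3f} V, Tafel slope b={b:.3f} V/decade, R={r:.3f} ohm*cm^2")

power = i_meas * cell_voltage(i_meas, *popt)  # power density curve (W/cm^2)
print(f"peak power density ~ {power.max():.2f} W/cm^2")
```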

  10. Some guidance on preparing validation plans for the DART Full System Models.

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Genetha Anne; Hough, Patricia Diane; Hills, Richard Guy (Sandia National Laboratories, Albuquerque, NM)

    2009-03-01

    Planning is an important part of computational model verification and validation (V&V) and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.

  11. Studying Validity of Single-Fluid Model in Inertial Confinement Fusion

    International Nuclear Information System (INIS)

    Gu Jian-Fa; Fan Zheng-Feng; Dai Zhen-Sheng; Ye Wen-Hua; Pei Wen-Bing; Zhu Shao-Ping

    2014-01-01

    The validity of the single-fluid model in inertial confinement fusion simulations is studied by comparing the results of the multi- and single-fluid models. The multi-fluid model includes the effects of collision and interpenetration between fluid species. By simulating the collision of fluid species, steady-state shock propagation into thin DT gas, and the expansion of a hohlraum Au wall heated by lasers, the results show that the validity of the single-fluid model depends strongly on the ratio of the characteristic length of the simulated system to the particle mean free path. When the characteristic length L is one order of magnitude larger than the mean free path λ, the single-fluid results are found to be in good agreement with the multi-fluid simulations, and the single-fluid modeling remains valid. If the value of L/λ is lower than 10, the interpenetration between fluid species is significant and the single-fluid simulations show some unphysical results, while the multi-fluid model can describe the interpenetration and mix phenomena well and gives more reasonable results. (physics of gases, plasmas, and electric discharges)
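
    The L/λ criterion is easy to evaluate in practice: estimate the mean free path from the density and an effective cross section, then compare it with the characteristic length of the simulated region. A back-of-the-envelope sketch with illustrative values (not taken from the paper's simulations):

```python
import math

n = 1.0e26       # number density (m^-3), illustrative DT gas value
sigma = 1.0e-19  # effective collision cross section (m^2), illustrative
L = 1.0e-4       # characteristic length of the simulated region (m)

mfp = 1.0 / (math.sqrt(2.0) * n * sigma)  # hard-sphere mean free path
ratio = L / mfp
print(f"lambda = {mfp:.2e} m, L/lambda = {ratio:.1f}")
if ratio >= 10.0:
    print("single-fluid treatment expected to be adequate")
else:
    print("interpenetration matters: a multi-fluid model is needed")
```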

  12. ADVISHE: A new tool to report validation of health-economic decision models

    NARCIS (Netherlands)

    Vemer, P.; Corro Ramos, I.; Van Voorn, G.; Al, M.J.; Feenstra, T.L.

    2014-01-01

    Background: Modelers and reimbursement decision makers could both profit from a more systematic reporting of the efforts to validate health-economic (HE) models. Objectives: Development of a tool to systematically report validation efforts of HE decision models and their outcomes. Methods: A gross

  13. A Phenomenological Model and Validation of Shortening Induced Force Depression during Muscle Contractions

    Science.gov (United States)

    McGowan, C.P.; Neptune, R.R.; Herzog, W.

    2009-01-01

    History-dependent effects on muscle force development following active changes in length have been measured in a number of experimental studies. However, few muscle models have included these properties or examined their impact on force and power output in dynamic cyclic movements. The goal of this study was to develop and validate a modified Hill-type muscle model that includes shortening-induced force depression and to assess its influence on locomotor performance. The magnitude of force depression was defined by empirical relationships based on muscle mechanical work. To validate the model, simulations incorporating force depression were developed to emulate single-muscle in situ and whole-muscle-group leg extension experiments. There was excellent agreement between simulated and experimental values, with in situ force patterns closely matching the experimental data. Simulations of pedaling with and without force depression were then generated. Force depression decreased maximum crank power by 20%-40%, depending on the relationship between force depression and muscle work used. These results indicate that force depression has the potential to substantially influence muscle power output in dynamic cyclic movements. However, to fully understand the impact of this phenomenon on human movement, more research is needed to characterize the relationship between force depression and mechanical work in large muscles with different morphologies. PMID:19879585
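
    The work-based depression rule can be sketched generically: after an active shortening phase, the available isometric force is scaled down in proportion to the mechanical work done during shortening. The coefficient and the force/velocity histories below are hypothetical, not the empirical relationships fitted in the study.

```python
import numpy as np

def depressed_force(f_iso, force_hist, velocity_hist, dt, k=0.08):
    """Scale isometric force by depression proportional to shortening work."""
    shortening = np.clip(-velocity_hist, 0.0, None)  # only shortening counts
    work = np.sum(force_hist * shortening) * dt      # mechanical work (J)
    return f_iso * max(1.0 - k * work, 0.0)

t = np.linspace(0.0, 0.5, 501)
dt = t[1] - t[0]
force = np.full_like(t, 100.0)             # N, constant active force (toy)
velocity = np.where(t < 0.25, -0.04, 0.0)  # m/s, shorten for the first 250 ms

print(f"post-shortening force: {depressed_force(100.0, force, velocity, dt):.1f} N")
# work = 100 N * 0.04 m/s * 0.25 s = 1 J -> ~8% depression with k = 0.08
```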

  14. Busbar arcs at large fusion magnets: Conductor to feeder tube arcing model experiments with the LONGARC device

    Energy Technology Data Exchange (ETDEWEB)

    Klimenko, Dmitry, E-mail: dmitry.klimenko@kit.edu; Pasler, Volker

    2014-10-15

    Highlights: •The LONGARC device was successfully implemented for busbar-to-feeder-tube arcing model experiments. •Arcing at an ITER busbar inside its feeder tube was simulated in scaled model experiments. •The narrower half tubes imply a slight increase in arc propagation speed compared to full-tube experiments. •All simulated half-tube experiments show severe damage, indicating that the ITER inner feeder tube will not withstand a busbar arc. -- Abstract: Electric arcs moving along the power cables (the so-called busbars) of the toroidal field (TF) coils of ITER may reach and penetrate the cryostat wall. Model experiments with the new LONGARC device continue the VACARC (VACuum ARC) experiments that were initiated to investigate the propagation and destruction mechanisms of busbar arcs at small scale [1]. The experiments are intended to support the development and validation of a numerical model. LONGARC overcomes the space limitations of VACARC and also allows for advanced 1:3 (vs. ITER full scale) model setups. The LONGARC device and first results are presented below.

  15. Validating a continental-scale groundwater diffuse pollution model using regional datasets.

    Science.gov (United States)

    Ouedraogo, Issoufou; Defourny, Pierre; Vanclooster, Marnik

    2017-12-11

    In this study, we assess the validity of an African-scale groundwater pollution model for nitrates. In a previous study, we identified a statistical continental-scale groundwater pollution model for nitrate. The model was identified using a pan-African meta-analysis of available nitrate groundwater pollution studies. The model was implemented in both Random Forest (RF) and multiple regression formats. For both approaches, we collected as predictors a comprehensive GIS database of 13 spatial attributes related to land use, soil type, hydrogeology, topography, climatology, region typology, nitrogen fertiliser application rate, and population density. In this paper, we validate the continental-scale model of groundwater contamination using a nitrate measurement dataset from three African countries. We discuss data availability, data quality and scale issues as challenges in validation. Although the modelling procedure was very successful with the continental-scale dataset (e.g. R² = 0.97 for the RF format under a cross-validation approach), the continental-scale model could not be used without recalibration to predict nitrate pollution at the country scale using regional data. In addition, when recalibrating the model using country-scale datasets, the order of the model exploratory factors changes. This suggests that the structure and the parameters of a statistical spatially distributed groundwater degradation model for the African continent are strongly scale dependent.
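
    The cross-validation step reads, in scikit-learn terms, roughly as follows; the 13 predictors and the synthetic nitrate response are placeholders for the actual GIS attributes and measurements.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.random((500, 13))                               # 13 spatial predictors (placeholder)
y = 10 * X[:, 0] + 5 * X[:, 1] + rng.normal(0, 1, 500)  # synthetic nitrate (mg/L)

rf = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(rf, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")

# the scale-dependence finding: a model tuned on continental data must be
# re-fitted before country-scale use, e.g.
rf_country = RandomForestRegressor(n_estimators=300, random_state=0)
# rf_country.fit(X_country, y_country)  # regional dataset, when available
```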

  16. Are screening instruments valid for psychotic-like experiences? A validation study of screening questions for psychotic-like experiences using in-depth clinical interview.

    LENUS (Irish Health Repository)

    Kelleher, Ian

    2012-02-01

    Individuals who report psychotic-like experiences are at increased risk of future clinical psychotic disorder. They constitute a unique "high-risk" group for studying the developmental trajectory to schizophrenia and related illnesses. Previous research has used screening instruments to identify this high-risk group, but the validity of these instruments has not yet been established. We administered a screening questionnaire with 7 items designed to assess psychotic-like experiences to 334 adolescents aged 11-13 years. Detailed clinical interviews were subsequently carried out with a sample of these adolescents. We calculated sensitivity and specificity and positive predictive value (PPV) and negative predictive value (NPV) for each screening question for the specific symptom it enquired about and also in relation to any psychotic-like experience. The predictive power varied substantially between items, with the question on auditory hallucinations ("Have you ever heard voices or sounds that no one else can hear?") providing the best predictive power. For interview-verified auditory hallucinations specifically, this question had a PPV of 71.4% and an NPV of 90.4%. When assessed for its predictive power for any psychotic-like experience (including, but not limited to, auditory hallucinations), it provided a PPV of 100% and an NPV of 88.4%. Two further questions, relating to visual hallucinations and paranoid thoughts, also demonstrated good predictive power for psychotic-like experiences. Our results suggest that it may be possible to screen the general adolescent population for psychotic-like experiences with a high degree of accuracy using a short self-report questionnaire.

  17. Are screening instruments valid for psychotic-like experiences? A validation study of screening questions for psychotic-like experiences using in-depth clinical interview.

    LENUS (Irish Health Repository)

    Kelleher, Ian

    2011-03-01

    Individuals who report psychotic-like experiences are at increased risk of future clinical psychotic disorder. They constitute a unique "high-risk" group for studying the developmental trajectory to schizophrenia and related illnesses. Previous research has used screening instruments to identify this high-risk group, but the validity of these instruments has not yet been established. We administered a screening questionnaire with 7 items designed to assess psychotic-like experiences to 334 adolescents aged 11-13 years. Detailed clinical interviews were subsequently carried out with a sample of these adolescents. We calculated sensitivity and specificity and positive predictive value (PPV) and negative predictive value (NPV) for each screening question for the specific symptom it enquired about and also in relation to any psychotic-like experience. The predictive power varied substantially between items, with the question on auditory hallucinations ("Have you ever heard voices or sounds that no one else can hear?") providing the best predictive power. For interview-verified auditory hallucinations specifically, this question had a PPV of 71.4% and an NPV of 90.4%. When assessed for its predictive power for any psychotic-like experience (including, but not limited to, auditory hallucinations), it provided a PPV of 100% and an NPV of 88.4%. Two further questions, relating to visual hallucinations and paranoid thoughts, also demonstrated good predictive power for psychotic-like experiences. Our results suggest that it may be possible to screen the general adolescent population for psychotic-like experiences with a high degree of accuracy using a short self-report questionnaire.

  18. Validation of spectral gas radiation models under oxyfuel conditions

    Energy Technology Data Exchange (ETDEWEB)

    Becher, Johann Valentin

    2013-05-15

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard computational fluid dynamics (CFD) spectral gas radiation models for air combustion are therefore outside their validity range in oxyfuel combustion. This thesis provides a common spectral basis for the validation of new spectral models. A literature review of fundamental gas radiation theory, spectral modeling and experimental methods provides the reader with a basic understanding of the topic. In the first results section, this thesis validates detailed spectral models against high-resolution spectral measurements in a gas cell, with the aim of recommending one model as the best benchmark model. In the second results section, spectral measurements from a turbulent natural gas flame, as an example of a technical combustion process, are compared to simulated spectra based on measured gas atmospheres. The third results section compares simplified spectral models to the benchmark model recommended in the first results section and ranks the proposed models by their accuracy. A concluding section gives recommendations for the selection and further development of simplified spectral radiation models. Gas cell transmissivity spectra in the spectral range of 2.4-5.4 µm of water vapor and carbon dioxide, in the temperature range from 727 °C to 1500 °C and at different concentrations, were compared in the first results section at a nominal resolution of 32 cm⁻¹ to line-by-line models from different databases, two statistical-narrow-band models and the exponential-wide-band model. The two statistical-narrow-band models EM2C and RADCAL showed good agreement, with a maximal band transmissivity deviation of 3%. The exponential-wide-band model showed a deviation of 6%. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was therefore recommended as the benchmark model.

  19. The international intraval project to study validation of geosphere transport models for performance assessment of nuclear waste disposal

    International Nuclear Information System (INIS)

    1990-01-01

    INTRAVAL is an international project concerned with the use of mathematical models for predicting the potential transport of radioactive substances in the geosphere. Such models are used to help assess the long-term safety of radioactive waste disposal systems. The INTRAVAL project was established to evaluate the validity of these models. Results from a set of selected laboratory and field experiments, as well as studies of occurrences of radioactive substances in nature (natural analogues), are compared in a systematic way with model predictions. Discrepancies between observations and predictions are discussed and analyzed.

  20. Validation experiment of a numerically processed millimeter-wave interferometer in a laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Kogi, Y., E-mail: kogi@fit.ac.jp; Higashi, T.; Matsukawa, S. [Department of Information Electronics, Fukuoka Institute of Technology, Fukuoka 811-0295 (Japan); Mase, A. [Art, Science and Technology Center for Cooperative Research, Kyushu University, Kasuga, Fukuoka 816-0811 (Japan); Kohagura, J.; Yoshikawa, M. [Plasma Research Center, University of Tsukuba, Tsukuba, Ibaraki 305-8577 (Japan); Nagayama, Y.; Kawahata, K. [National Institute for Fusion Science, Toki, Gifu 509-5202 (Japan); Kuwahara, D. [Tokyo University of Agriculture and Technology, Koganei, Tokyo 184-8588 (Japan)

    2014-11-15

    We propose a new interferometer system for density profile measurements. This system produces multiple measurement chords by a leaky-wave antenna driven by multiple frequency inputs. The proposed system was validated in laboratory evaluation experiments. We confirmed that the interferometer generates a clear image of a Teflon plate as well as the phase shift corresponding to the plate thickness. In another experiment, we confirmed that quasi-optical mirrors can produce multiple measurement chords; however, the finite spot size of the probe beam degrades the sharpness of the resulting image.

  1. Validation of the ABBN/CONSYST constants system. Part 1: Validation through the critical experiments on compact metallic cores

    International Nuclear Information System (INIS)

    Ivanova, T.T.; Manturov, G.N.; Nikolaev, M.N.; Rozhikhin, E.V.; Semenov, M.Yu.; Tsiboulia, A.M.

    1999-01-01

    The worldwide compilation of criticality safety benchmark experiments evaluated through the activity of the International Criticality Safety Benchmark Evaluation Project (ICSBEP) opens new possibilities for validation of the ABBN-93.1 cross section library for criticality safety analysis. Results of calculations of small assemblies with metal-fuelled cores are presented in this paper. It is concluded that ABBN-93.1 predicts the criticality of such systems with the required accuracy.

  2. Analysis of Fresh Fuel Critical Experiments Appropriate for Burnup Credit Validation

    International Nuclear Information System (INIS)

    DeHart, M.D.

    1995-01-01

    The ANSI/ANS-8.1 standard requires that calculational methods used in determining criticality safety limits for applications outside reactors be validated by comparison with appropriate critical experiments. This report provides a detailed description of 34 fresh-fuel critical experiments and their analyses using the SCALE-4.2 code system and the 27-group ENDF/B-IV cross-section library. The 34 critical experiments were selected based on geometry, material, and neutron interaction characteristics that are applicable to a transportation cask loaded with pressurized-water-reactor spent fuel. These 34 experiments are a representative subset of a much larger database of low-enriched uranium and mixed-oxide critical experiments. A statistical approach is described and used to obtain an estimate of the bias and uncertainty in the calculational methods and to predict a confidence limit for a calculated neutron multiplication factor. The SCALE-4.2 results for a superset of approximately 100 criticals are included in the uncertainty analyses, but descriptions of the individual criticals are not included.
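
    The statistical treatment reduces, in its simplest form, to computing a bias and an uncertainty from calculated k_eff values for experiments that should be exactly critical, then setting a lower acceptance limit. The sketch below uses a one-sided prediction limit as a simplified stand-in for the full tolerance-limit methodology; the k_eff values are invented for illustration.

```python
import numpy as np
from scipy import stats

# calculated k_eff for benchmark criticals (true value is 1.0 by construction)
keff = np.array([0.9951, 0.9987, 1.0012, 0.9942, 0.9968,
                 0.9990, 0.9935, 1.0005, 0.9960, 0.9978])

bias = keff.mean() - 1.0  # negative bias => systematic underprediction
s = keff.std(ddof=1)
n = len(keff)

# one-sided 95% lower prediction limit for a future calculated k_eff
t95 = stats.t.ppf(0.95, df=n - 1)
lower_limit = keff.mean() - t95 * s * np.sqrt(1.0 + 1.0 / n)
print(f"bias = {bias:+.4f}, sigma = {s:.4f}, lower limit = {lower_limit:.4f}")
# a safety analysis then requires the calculated cask k_eff, plus margins,
# to stay below such an experimentally anchored limit
```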

  3. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    ... There are, however, many advantages that could be harvested from such knowledge, like size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated and the model is expanded to include the closed and vented type enclosures...

  4. Validation of Neutron Calculation Codes and Models by means of benchmark cases in the frame of the Binational Commission of Nuclear Energy. Criticality Experiments

    International Nuclear Information System (INIS)

    Dos Santos, Adimir; Siqueira, Paulo de Tarso D.; Andrade e Silva, Graciete Simões; Grant, Carlos; Tarazaga, Ariel E.; Barberis, Claudia

    2013-01-01

    In 2008 the National Atomic Energy Commission (CNEA) of Argentina and the Brazilian Institute of Energetic and Nuclear Research (IPEN), within the framework of the Argentine-Brazilian Nuclear Energy Agreement (COBEN), included, among many others, the project “Validation and Verification of Calculation Methods Used for Research and Experimental Reactors”. It was established that the validation was to be performed with models implemented in the deterministic codes HUEMUL and PUMA (cell and reactor codes) developed by CNEA, and with those implemented in MCNP by CNEA and IPEN. The data for these validations correspond to theoretical-experimental reference cases in the IPEN/MB-01 research reactor located in São Paulo, Brazil. On the Argentine side, the staff of the Reactor and Nuclear Power Studies group (SERC) of CNEA performed calculations with deterministic models (HUEMUL-PUMA) and probabilistic methods (MCNP), modeling a great number of physical situations of the reactor that had previously been studied and modeled by members of the Center of Nuclear Engineering of IPEN, whose results were extensively provided to CNEA. In this paper results for critical configurations are shown. (author)

  5. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    International Nuclear Information System (INIS)

    Apostolakis, J; Burkhardt, H; Ivanchenko, V N; Asai, M; Bagulya, A; Grichine, V; Brown, J M C; Chikuma, N; Cortes-Giraldo, M A; Elles, S; Jacquemier, J; Guatelli, S; Incerti, S; Kadri, O; Maire, M; Urban, L; Pandola, L; Sawkey, D; Toshito, T; Yamashita, T

    2015-01-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed. (paper)

  6. Accounting for treatment use when validating a prognostic model: a simulation study.

    Science.gov (United States)

    Pajouheshnia, Romin; Peelen, Linda M; Moons, Karel G M; Reitsma, Johannes B; Groenwold, Rolf H H

    2017-07-14

    Prognostic models often show poor performance when applied to independent validation data sets. We illustrate how treatment use in a validation set can affect measures of model performance and present the uses and limitations of available analytical methods to account for this using simulated data. We outline how the use of risk-lowering treatments in a validation set can lead to an apparent overestimation of risk by a prognostic model that was developed in a treatment-naïve cohort to make predictions of risk without treatment. Potential methods to correct for the effects of treatment use when testing or validating a prognostic model are discussed from a theoretical perspective. Subsequently, we assess, in simulated data sets, the impact of excluding treated individuals and of inverse probability weighting (IPW) on the estimated model discrimination (c-index) and calibration (observed:expected ratio and calibration plots) in scenarios with different patterns and effects of treatment use. Ignoring the use of effective treatments in a validation data set leads to poorer model discrimination and calibration than would be observed in the untreated target population for the model. Excluding treated individuals provided correct estimates of model performance only when treatment was randomly allocated, although this reduced the precision of the estimates. IPW followed by exclusion of the treated individuals provided correct estimates of model performance in data sets where treatment use was either random or moderately associated with an individual's risk when the assumptions of IPW were met, but yielded incorrect estimates in the presence of non-positivity or an unobserved confounder. When validating a prognostic model developed to make predictions of risk without treatment, treatment use in the validation set can bias estimates of the performance of the model in future targeted individuals and should not be ignored. When treatment use is random, treated individuals can be excluded from the validation set, although this reduces the precision of the performance estimates.
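
    The IPW correction has a compact core: model the probability of treatment from the covariates, then weight each untreated individual by the inverse probability of being untreated, so the untreated subset stands in for the full population. A sketch with a hypothetical data-generating step (not the simulation design of the paper):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
x = rng.normal(size=(n, 3))                  # covariates used by the model
risk_score = x @ np.array([0.8, 0.5, 0.3])   # linear predictor of outcome risk
p_treat = 1 / (1 + np.exp(-(0.7 * risk_score - 1.0)))  # higher risk -> treated
treated = rng.random(n) < p_treat

# model the treatment mechanism, then weight the untreated individuals
ps_model = LogisticRegression().fit(x, treated)
p_untreated = 1.0 - ps_model.predict_proba(x)[:, 1]
weights = 1.0 / p_untreated[~treated]        # IPW for the untreated subset

# these weights would feed weighted performance measures on x[~treated],
# e.g. a weighted observed:expected ratio or a weighted c-index
print(f"untreated n = {(~treated).sum()}, "
      f"weight range = [{weights.min():.2f}, {weights.max():.2f}]")
```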

  7. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Fabian F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Yu, Yi-Hsiang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nielsen, Kim [Ramboll, Copenhagen (Denmark); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bunnik, Tim [MARIN (Netherlands); Touzon, Imanol [Tecnalia (Spain); Nam, Bo Woo [KRISO (Korea, Rep. of); Kim, Jeong Seok [KRISO (Korea, Rep. of); Janson, Carl Erik [Chalmers University (Sweden); Jakobsen, Ken-Robert [EDRMedeso (Norway); Crowley, Sarah [WavEC (Portugal); Vega, Luis [Hawaii Natural Energy Institute (United States); Rajagopalan, Krishnakimar [Hawaii Natural Energy Institute (United States); Mathai, Thomas [Glosten (United States); Greaves, Deborah [Plymouth University (United Kingdom); Ransley, Edward [Plymouth University (United Kingdom); Lamont-Kane, Paul [Queen' s University Belfast (United Kingdom); Sheng, Wanan [University College Cork (Ireland); Costello, Ronan [Wave Venture (United Kingdom); Kennedy, Ben [Wave Venture (United Kingdom); Thomas, Sarah [Floating Power Plant (Denmark); Heras, Pilar [Floating Power Plant (Denmark); Bingham, Harry [Technical University of Denmark (Denmark); Kurniawan, Adi [Aalborg University (Denmark); Kramer, Morten Mejlhede [Aalborg University (Denmark); Ogden, David [INNOSEA (France); Girardin, Samuel [INNOSEA (France); Babarit, Aurelien [EC Nantes (France); Wuillaume, Pierre-Yves [EC Nantes (France); Steinke, Dean [Dynamic Systems Analysis (Canada); Roy, Andre [Dynamic Systems Analysis (Canada); Beatty, Scott [Cascadia Coast Research (Canada); Schofield, Paul [ANSYS (United States); Kim, Kyong-Hwan [KRISO (Korea, Rep. of); Jansson, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden); BCAM (Spain); Hoffman, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden)

    2017-10-16

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.

  8. HTC Experimental Program: Validation and Calculational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fernex, F.; Ivanova, T.; Bernard, F.; Letang, E. [Inst Radioprotect and Surete Nucl, F-92262 Fontenay Aux Roses (France); Fouillaud, P. [CEA Valduc, Serv Rech Neutron and Critcite, 21 - Is-sur-Tille (France); Thro, J. F. [AREVA NC, F-78000 Versailles (France)

    2009-05-15

    In the 1980s a series of Haut Taux de Combustion (HTC) critical experiments with fuel pins in a water-moderated lattice was conducted at the Apparatus B experimental facility in Valduc (Commissariat a l'Energie Atomique, France) with the support of the Institut de Radioprotection et de Surete Nucleaire and AREVA NC. Four series of experiments were designed to assess the benefit associated with actinide-only burnup credit in the criticality safety evaluation for fuel handling, pool storage, and spent-fuel cask conditions. The HTC rods, specifically fabricated for the experiments, simulated typical pressurized water reactor uranium oxide spent fuel that had an initial enrichment of 4.5 wt% {sup 235}U and was burned to 37.5 GWd/tonne U. The configurations have been modeled with the CRISTAL criticality package and the SCALE 5.1 code system. Sensitivity/uncertainty analysis has been employed to evaluate the HTC experiments and to study their applicability for validation of burnup credit calculations. This paper presents the experimental program, the principal results of the experiment evaluation, and the modeling. The HTC data applicability to burnup credit validation is demonstrated with an example of spent-fuel storage models. (authors)

  9. Validation of nuclear models in Geant4 using the dose distribution of a 177 MeV proton pencil beam

    International Nuclear Information System (INIS)

    Hall, David C; Paganetti, Harald; Makarova, Anastasia; Gottschalk, Bernard

    2016-01-01

    A proton pencil beam is associated with a surrounding low-dose envelope, originating from nuclear interactions. It is important for treatment planning systems to accurately model this envelope when performing dose calculations for pencil beam scanning treatments, and Monte Carlo (MC) codes are commonly used for this purpose. This work aims to validate the nuclear models employed by the Geant4 MC code, by comparing the simulated absolute dose distribution to a recent experiment of a 177 MeV proton pencil beam stopping in water. Striking agreement is observed over five orders of magnitude, with both the shape and normalisation well modelled. The normalisations of two depth dose curves are lower than experiment, though this could be explained by an experimental positioning error. The Geant4 neutron production model is also verified in the distal region. The entrance dose is poorly modelled, suggesting an unaccounted upstream source of low-energy protons. Recommendations are given for a follow-up experiment which could resolve these issues. (note)

  10. Colloid-Facilitated Transport of 137Cs in Fracture-Fill Material. Experiments and Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Dittrich, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Reimus, Paul William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-10-29

    In this study, we demonstrate how a combination of batch sorption/desorption experiments and column transport experiments were used to effectively parameterize a model describing the colloid-facilitated transport of Cs in the Grimsel granodiorite/FFM system. Cs partition coefficient estimates onto both the colloids and the stationary media obtained from the batch experiments were used as initial estimates of partition coefficients in the column experiments, and then the column experiment results were used to obtain refined estimates of the number of different sorption sites and the adsorption and desorption rate constants of the sites. The desorption portion of the column breakthrough curves highlighted the importance of accounting for adsorption-desorption hysteresis (or a very nonlinear adsorption isotherm) of the Cs on the FFM in the model, and this portion of the breakthrough curves also dictated that there be at least two different types of sorption sites on the FFM. In the end, the two-site model parameters estimated from the column experiments provided excellent matches to the batch adsorption/desorption data, which provided a measure of assurance in the validity of the model.
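
    A minimal sketch of the kind of two-site kinetic sorption model described above, written for a batch system: the aqueous Cs concentration exchanges with a fast, reversible site and a slow, nearly irreversible site, which is one way to reproduce the adsorption-desorption hysteresis noted. The rate constants, site capacities and solid-to-water ratio are hypothetical, not the fitted values from the study.

```python
# Two-site kinetic sorption sketch for a batch system: solute C exchanges with
# site types S1 (fast, reversible) and S2 (slow, nearly irreversible).
# All rate constants and capacities are hypothetical illustration values.
import numpy as np
from scipy.integrate import solve_ivp

ka1, kd1 = 0.50, 0.010     # fast site: adsorption [L/(g*h)], desorption [1/h]
ka2, kd2 = 0.05, 0.0005    # slow, nearly irreversible site
Smax1, Smax2 = 0.2, 0.1    # site capacities [mg/g]
rho_sw = 50.0              # solid-to-water ratio [g/L]

def rhs(t, y):
    C, S1, S2 = y                              # aqueous [mg/L], sorbed [mg/g]
    r1 = ka1 * C * (Smax1 - S1) - kd1 * S1     # net sorption rate, site 1
    r2 = ka2 * C * (Smax2 - S2) - kd2 * S2     # net sorption rate, site 2
    return [-rho_sw * (r1 + r2), r1, r2]

sol = solve_ivp(rhs, (0.0, 200.0), [1.0, 0.0, 0.0], max_step=0.5)
C, S1, S2 = sol.y[:, -1]
print(f"C = {C:.4f} mg/L, S1 = {S1:.4f}, S2 = {S2:.4f} mg/g after 200 h")
```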

  11. Reactivity worth measurements on the CALIBAN reactor: interpretation of integral experiments for the nuclear data validation

    International Nuclear Information System (INIS)

    Richard, B.

    2012-01-01

    Good knowledge of nuclear data, the input parameters for neutron transport calculation codes, is necessary to support the advances of the nuclear industry. The purpose of this work is to provide pertinent information for the nuclear data integral validation process. Reactivity worth measurements have been performed on the Caliban reactor; they concern four materials of interest for the nuclear industry: gold, lutetium, plutonium and uranium-238. Experiments conducted to improve the characterization of the core, which are necessary for the correct interpretation of the reactivity worth measurements, are also described and discussed. The experimental procedures are described with their associated uncertainties, and the measurements are then compared to numerical results. The methods used in the numerical calculations are reported, especially the generation of multigroup cross sections for deterministic codes. The modeling of the experiments is presented along with the associated uncertainties. This comparison led to an interpretation concerning the qualification of nuclear data libraries. Discrepancies are reported and discussed, and they justify the need for such experiments. (author)

  12. Development and demonstration of a validation methodology for vehicle lateral dynamics simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Kutluay, Emir

    2013-02-01

    In this thesis a validation methodology to be used in the assessment of vehicle dynamics simulation models is presented. Simulation of vehicle dynamics is used to estimate the dynamic responses of existing or proposed vehicles and has a wide array of applications in the development of vehicle technologies. Although simulation environments, measurement tools and mathematical theories on vehicle dynamics are well established, the methodical link between the experimental test data and the validity analysis of the simulation model is still lacking. The developed validation paradigm has a top-down approach to the problem. It is ascertained that vehicle dynamics simulation models can only be validated using test maneuvers, even though they are intended for real-world maneuvers. Test maneuvers are determined according to the requirements of the real event at the start of the model development project, and data handling techniques, validation metrics and criteria are declared for each of the selected maneuvers. If the simulation results satisfy these criteria, the simulation is deemed ''not invalid''. If the simulation model fails to meet the criteria, the model is deemed invalid, and a model iteration should be performed. The results are analyzed to determine whether they indicate a modeling error or a modeling inadequacy, and whether a conditional validity in terms of system variables can be defined. Three test cases are used to demonstrate the application of the methodology. The developed methodology successfully identified the shortcomings of the tested simulation model and defined its limits of application. The tested simulation model is found to be acceptable, but valid only in a certain dynamical range. Several insights into the deficiencies of the model are reported in the analysis, but the iteration step of the methodology is not demonstrated. Utilizing the proposed methodology will help to achieve more time and cost efficient simulation projects with

  13. PIV-validated numerical modeling of pulsatile flows in distal coronary end-to-side anastomoses.

    Science.gov (United States)

    Xiong, F L; Chong, C K

    2007-01-01

    This study employed particle image velocimetry (PIV) to validate a numerical model in a complementary approach to quantify hemodynamic factors in distal coronary anastomoses and to gain more insights on their relationship with anastomotic geometry. Instantaneous flow fields and wall shear stresses (WSS) were obtained from PIV measurement in a modified life-size silastic anastomosis model adapted from a conventional geometry by incorporating a smooth graft-artery transition. The results were compared with those predicted by a concurrent numerical model. The numerical method was then used to calculate cycle-averaged WSS (WSS(cyc)) and spatial wall shear stress gradient (SWSSG), two critical hemodynamic factors in the pathogenesis of intimal thickening (IT), to compare the conventional and modified geometries. Excellent qualitative agreement and satisfactory quantitative agreement with averaged normalized error in WSS between 0.8% and 8.9% were achieved between the PIV experiment and numerical model. Compared to the conventional geometry, the modified geometry produces a more uniform WSS(cyc) distribution eliminating both high and low WSS(cyc) around the toe, critical in avoiding IT. Peak SWSSG on the artery floor of the modified model is less than one-half that in the conventional case, and high SWSSG at the toe is eliminated. The validated numerical model is useful for modeling unsteady coronary anastomotic flows and elucidating the significance of geometry regulated hemodynamics. The results suggest the clinical relevance of constructing smooth graft-artery transition in distal coronary anastomoses to improve their hemodynamic performance.
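
    The two hemodynamic metrics compared here, cycle-averaged WSS and the spatial WSS gradient, can be computed directly from a time-resolved WSS field. A minimal sketch, assuming WSS sampled uniformly in time on a one-dimensional wall coordinate; the pulsatile field itself is synthetic, purely for illustration.

```python
# Cycle-averaged WSS and spatial WSS gradient from a sampled WSS(t, s) field.
# The synthetic pulsatile field below is illustrative, not PIV or CFD output.
import numpy as np

s = np.linspace(0.0, 0.05, 200)                   # wall coordinate along the artery floor [m]
t = np.linspace(0.0, 1.0, 100, endpoint=False)    # one cardiac cycle [s]
# Illustrative WSS [Pa]: a spatial bump modulated by a pulsatile waveform
wss = (1.5 + 2.0 * np.exp(-((s - 0.02) / 0.005) ** 2))[None, :] \
      * (1.0 + 0.5 * np.sin(2.0 * np.pi * t))[:, None]

wss_cyc = wss.mean(axis=0)        # cycle-averaged WSS(s); mean is exact for uniform sampling
swssg = np.gradient(wss_cyc, s)   # spatial gradient dWSS_cyc/ds [Pa/m]

print(f"peak WSS_cyc = {wss_cyc.max():.2f} Pa, peak |SWSSG| = {np.abs(swssg).max():.1f} Pa/m")
```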

  14. Validation of a 2-D semi-coupled numerical model for fluid-structure-seabed interaction

    Science.gov (United States)

    Ye, Jianhong; Jeng, Dongsheng; Wang, Ren; Zhu, Changqi

    2013-10-01

    A 2-D semi-coupled model PORO-WSSI 2D (also referred to as FSSI-CAS 2D) for Fluid-Structure-Seabed Interaction (FSSI) has been developed by employing the RANS equations for wave motion in the fluid domain and the VARANS equations for porous flow in porous structures, and by taking the dynamic Biot's equations (known as the "u - p" approximation) as the governing equations for the soil. The finite difference two-step projection method and the forward time difference method are adopted to solve the RANS and VARANS equations, and the finite element method is adopted to solve the "u - p" approximation. A data exchange port is developed to couple the RANS and VARANS equations with the dynamic Biot's equations. The analytical solution proposed by Hsu and Jeng (1994) and several experiments conducted in wave flumes or geotechnical centrifuges involving various waves are used to validate the developed semi-coupled numerical model. The sandy bed involved in these experiments is poro-elastic or poro-elastoplastic. The inclusion of the interaction between fluid, marine structures and a poro-elastoplastic seabed foundation is a special point and highlight of this paper, and is essentially different from previous coupled models. The excellent agreement between the numerical results and the experimental data indicates that the developed coupled model is highly reliable for the FSSI problem.

  15. Dynamic crack initiation toughness : experiments and peridynamic modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Foster, John T.

    2009-10-01

    This dissertation describes research on the dynamic crack initiation toughness of a 4340 steel. Researchers have been conducting experimental testing of dynamic crack initiation toughness, K{sub Ic}, for many years, using many experimental techniques, with vastly different trends in the results when reporting K{sub Ic} as a function of loading rate. The dissertation describes a novel experimental technique for measuring K{sub Ic} in metals using the Kolsky bar. The method borrows from improvements made in recent years in traditional Kolsky bar testing by using pulse shaping techniques to ensure a constant loading rate applied to the sample before crack initiation. Dynamic crack initiation measurements were reported on a 4340 steel at two different loading rates. The steel was shown to exhibit a rate dependence, with the recorded values of K{sub Ic} being much higher at the higher loading rate. Using this rate dependence as motivation for modeling the fracture events, a viscoplastic constitutive model was implemented into a peridynamic computational mechanics code. Peridynamics is a newly developed theory in solid mechanics that replaces the classical partial differential equations of motion with integro-differential equations which do not require the existence of spatial derivatives in the displacement field. This allows for the straightforward modeling of unguided crack initiation and growth. To date, peridynamic implementations have used severely restricted constitutive models. This research represents the first implementation of a complex material model and its validation. After comparing computed deformations with experimental Taylor anvil impact tests for the viscoplastic material model, a novel failure criterion is introduced to model the dynamic crack initiation toughness experiments. The failure model is based on an energy criterion and uses the K{sub Ic} values recorded experimentally as an input. The failure model

  16. Validating modeled turbulent heat fluxes across large freshwater surfaces

    Science.gov (United States)

    Lofgren, B. M.; Fujisaki-Manome, A.; Gronewold, A.; Anderson, E. J.; Fitzpatrick, L.; Blanken, P.; Spence, C.; Lenters, J. D.; Xiao, C.; Charusambot, U.

    2017-12-01

    Turbulent fluxes of latent and sensible heat are important physical processes that influence the energy and water budgets of the Great Lakes. Validation and improvement of bulk flux algorithms to simulate these turbulent heat fluxes are critical for accurate prediction of hydrodynamics, water levels, weather, and climate over the region. Here we consider five heat flux algorithms from several model systems: the Finite-Volume Community Ocean Model (FVCOM), the Weather Research and Forecasting model, and the Large Lake Thermodynamics Model. These are used in research and operational environments, concentrate on different aspects of the Great Lakes' physical system, and interface at the lake surface. The heat flux algorithms were isolated from each model and driven by meteorological data from over-lake stations in the Great Lakes Evaporation Network. The simulation results were compared with eddy covariance flux measurements at the same stations. All models show the capacity to capture the seasonal cycle of the turbulent heat fluxes. Overall, the Coupled Ocean Atmosphere Response Experiment (COARE) algorithm in FVCOM has the best agreement with the eddy covariance measurements. Simulations with the other four algorithms are improved overall by updating the parameterization of the roughness length scales for temperature and humidity. Agreement between modelled and observed fluxes varied notably with the geographical locations of the stations. For example, at the Long Point station in Lake Erie, the observed fluxes are likely influenced by the upwind land surface; the simulations do not take account of this land-surface influence, and therefore the agreement there is generally worse.
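
    All bulk flux algorithms of the kind compared here reduce to variants of the same aerodynamic formulas for sensible and latent heat; what distinguishes algorithms such as COARE is how the exchange coefficients depend on stability and on the roughness lengths for temperature and humidity. A neutral-stability sketch with constant, assumed exchange coefficients:

```python
# Bulk aerodynamic turbulent heat fluxes over water (neutral-stability sketch).
# Real algorithms (e.g., COARE) make CH and CE functions of stability and of the
# roughness lengths for temperature/humidity; the constants here are illustrative.
import numpy as np

def saturation_q(T_c, p=101325.0):
    """Saturation specific humidity [kg/kg] at T_c [deg C] (Tetens formula)."""
    e_s = 610.78 * np.exp(17.27 * T_c / (T_c + 237.3))   # saturation vapor pressure [Pa]
    return 0.622 * e_s / (p - 0.378 * e_s)

rho, cp, Lv = 1.2, 1004.0, 2.5e6      # air density, heat capacity, latent heat of vaporization
CH = CE = 1.3e-3                       # bulk exchange coefficients (assumed, neutral)

U, Ta, Ts, RH = 8.0, 10.0, 6.0, 0.7    # wind [m/s], air/surface temperature [C], rel. humidity
qa = RH * saturation_q(Ta)             # air specific humidity
qs = saturation_q(Ts)                  # saturated at the water surface temperature

H  = rho * cp * CH * U * (Ts - Ta)     # sensible heat flux [W/m^2], positive upward
LE = rho * Lv * CE * U * (qs - qa)     # latent heat flux [W/m^2], positive upward
print(f"H = {H:.1f} W/m^2, LE = {LE:.1f} W/m^2")
```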

  17. NUMERICAL MODELLING AND EXPERIMENTAL INFLATION VALIDATION OF A BIAS TWO-WHEEL TIRE

    Directory of Open Access Journals (Sweden)

    CHUNG KET THEIN

    2016-02-01

    Full Text Available This paper presents a parametric study on the development of a computational model for a bias two-wheel tire through finite element analysis (FEA). An 80/90-17 bias two-wheel tire was adopted, which is made up of four major layers of rubber compound with different material properties to strengthen the structure. The Mooney-Rivlin hyperelastic model was applied to represent the behaviour of the incompressible rubber compound. A 3D tire model was built for structural static finite element analysis, and the result was validated against the inflation analysis. The structural static finite element analysis method is suitable for evaluating the tire design and improving the tire behaviour towards the desired performance. The experimental tire was inflated at various pressures and the geometries of the numerical and experimental tires were compared. There is good agreement between the numerical simulation model and the experimental results. This indicates that the simulation model can be applied to bias two-wheel tire design in order to predict the tire behaviour and improve its mechanical characteristics.
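
    For an incompressible Mooney-Rivlin solid, the uniaxial response has a closed form that is commonly used to fit the model constants to rubber test data. A sketch with illustrative constants, not the values fitted for the tire compounds:

```python
# Uniaxial response of an incompressible Mooney-Rivlin solid:
#   W = C10*(I1 - 3) + C01*(I2 - 3)
#   nominal (engineering) stress P(lam) = 2*(lam - lam**-2) * (C10 + C01/lam)
# C10 and C01 below are illustrative, not the fitted tire-compound values.
C10, C01 = 0.30e6, 0.05e6            # material constants [Pa] (assumed)

def nominal_stress(lam):
    """Nominal uniaxial stress [Pa] at stretch ratio lam."""
    return 2.0 * (lam - lam**-2) * (C10 + C01 / lam)

for lam in (1.1, 1.5, 2.0):
    print(f"stretch {lam:.1f}: P = {nominal_stress(lam) / 1e6:.3f} MPa")

# Small-strain consistency check: initial shear modulus mu = 2*(C10 + C01)
print(f"mu = {2.0 * (C10 + C01) / 1e6:.2f} MPa")
```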

  18. Use of Synchronized Phasor Measurements for Model Validation in ERCOT

    Science.gov (United States)

    Nuthalapati, Sarma; Chen, Jian; Shrestha, Prakash; Huang, Shun-Hsien; Adams, John; Obadina, Diran; Mortensen, Tim; Blevins, Bill

    2013-05-01

    This paper discusses experiences in the use of synchronized phasor measurement technology in Electric Reliability Council of Texas (ERCOT) interconnection, USA. Implementation of synchronized phasor measurement technology in the region is a collaborative effort involving ERCOT, ONCOR, AEP, SHARYLAND, EPG, CCET, and UT-Arlington. As several phasor measurement units (PMU) have been installed in ERCOT grid in recent years, phasor data with the resolution of 30 samples per second is being used to monitor power system status and record system events. Post-event analyses using recorded phasor data have successfully verified ERCOT dynamic stability simulation studies. Real time monitoring software "RTDMS"® enables ERCOT to analyze small signal stability conditions by monitoring the phase angles and oscillations. The recorded phasor data enables ERCOT to validate the existing dynamic models of conventional and/or wind generator.

  19. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
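
    The manufactured-solution verification the authors recommend can be compressed into a few lines: choose an exact solution, derive the forcing it implies, and confirm the solver's observed order of accuracy under mesh refinement. A sketch for a second-order finite-difference Poisson solver:

```python
# Code verification by manufactured solutions: pick u_exact = sin(pi x) on (0,1),
# so -u'' = pi^2 sin(pi x) is the manufactured source. A second-order scheme
# should show errors falling ~4x per mesh halving (observed order ~2).
import numpy as np

def solve_poisson(n):
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    f = np.pi**2 * np.sin(np.pi * x[1:-1])              # manufactured source term
    A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h**2          # discrete -u'' with u(0)=u(1)=0
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x)))        # max-norm error vs exact solution

e1, e2 = solve_poisson(32), solve_poisson(64)
print(f"observed order = {np.log(e1 / e2) / np.log(2.0):.2f}")   # expect ~2
```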

  20. Development and validation of corium oxidation model for the VAPEX code

    International Nuclear Information System (INIS)

    Blinkov, V.N.; Melikhov, V.I.; Davydov, M.V.; Melikhov, O.I.; Borovkova, E.M.

    2011-01-01

    In light water reactor core melt accidents, the molten fuel (corium) can be brought into contact with coolant water in the course of the melt relocation in-vessel and ex-vessel, as well as during the accident mitigation action of water addition. The mechanical energy release from such an interaction is of interest in evaluating the structural integrity of the reactor vessel as well as of the containment. Usually, the source of the energy release is considered to be the rapid transfer of heat from the molten fuel to the water ('vapor explosion'). When the fuel contains a chemically reactive metal component, there can be an additional source of energy release: the heat release and hydrogen production due to the metal-water chemical reaction. At the Electrogorsk Research and Engineering Center the computer code VAPEX (VAPor EXplosion) has been developed for analysis of molten fuel coolant interaction. A multifield approach is used for modeling the dynamics of the following phases: water, steam, melt jet, melt droplets and debris. The VAPEX code was successfully validated on FARO experimental data. Hydrogen generation was observed in the FARO tests even though the corium did not contain a metal component. The reason for the hydrogen generation was not clear, so a simplified empirical model of hydrogen generation was implemented in the VAPEX code to take into account the contribution of hydrogen to the pressure increase. This paper describes a new, more detailed model of hydrogen generation due to the metal-water chemical reaction and the results of its validation on the ZREX experiments. (orig.)

  1. Dynamic experiments with high bisphenol-A concentrations modelled with an ASM model extended to include a separate XOC degrading microorganism

    DEFF Research Database (Denmark)

    Lindblom, Erik Ulfson; Press-Kristensen, Kåre; Vanrolleghem, P.A.

    2009-01-01

    The perspective of this work is to develop a model, which can be used to better understand and optimize wastewater treatment plants that are able to remove xenobiotic organic compounds (XOCs) in combination with removal of traditional pollutants. Results from dynamic experiments conducted...... with the endocrine disrupting XOC bisphenol-A (BPA) in an activated sludge process with real wastewater were used to hypothesize an ASM-based process model including aerobic growth of a specific BPA-degrading microorganism and sorption of BPA to sludge. A parameter estimation method was developed, which...... simultaneously utilizes steady-state background concentrations and dynamic step response data, as well as conceptual simplifications of the plant configuration. Validation results show that biodegradation of BPA is sensitive to operational conditions before and during the experiment and that the proposed model...

  2. Validating a Technology Enhanced Student-Centered Learning Model

    Science.gov (United States)

    Kang, Myunghee; Hahn, Jungsun; Chung, Warren

    2015-01-01

    The Technology Enhanced Student Centered Learning (TESCL) Model in this study presents the core factors that ensure the quality of learning in a technology-supported environment. Although the model was conceptually constructed using a student-centered learning framework and drawing upon previous studies, it should be validated through real-world…

  3. Validation through model testing

    International Nuclear Information System (INIS)

    1995-01-01

    Geoval-94 is the third Geoval symposium arranged jointly by the OECD/NEA and the Swedish Nuclear Power Inspectorate. Earlier symposia in this series took place in 1987 and 1990. In many countries, the ongoing programmes to site and construct deep geological repositories for high and intermediate level nuclear waste are close to realization. A number of studies demonstrate the potential barrier function of the geosphere, but also that there are many unresolved issues. A key to these problems is the possibility of gaining knowledge by testing models against experiments and of increasing confidence in the models used for prediction. The sessions cover conclusions from the INTRAVAL project, experiences from integrated experimental programs and underground research laboratories, as well as the integration between performance assessment and site characterisation. Technical issues ranging from waste and buffer interactions with the rock to radionuclide migration in different geological media are addressed. (J.S.)

  4. Numerical modelling of transdermal delivery from matrix systems: parametric study and experimental validation with silicone matrices.

    Science.gov (United States)

    Snorradóttir, Bergthóra S; Jónsdóttir, Fjóla; Sigurdsson, Sven Th; Másson, Már

    2014-08-01

    A model is presented for transdermal drug delivery from single-layered silicone matrix systems. The work is based on our previous results that, in particular, extend the well-known Higuchi model. Recently, we have introduced a numerical transient model describing matrix systems where the drug dissolution can be non-instantaneous. Furthermore, our model can describe complex interactions within a multi-layered matrix and the matrix to skin boundary. The power of the modelling approach presented here is further illustrated by allowing the possibility of a donor solution. The model is validated by a comparison with experimental data, as well as validating the parameter values against each other, using various configurations with donor solution, silicone matrix and skin. Our results show that the model is a good approximation to real multi-layered delivery systems. The model offers the ability of comparing drug release for ibuprofen and diclofenac, which cannot be analysed by the Higuchi model because the dissolution in the latter case turns out to be limited. The experiments and numerical model outlined in this study could also be adjusted to more general formulations, which enhances the utility of the numerical model as a design tool for the development of drug-loaded matrices for trans-membrane and transdermal delivery. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
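
    The classical Higuchi limit that this model extends is worth keeping in view as the reference case. A sketch under its standard assumptions (drug loading well above solubility, instantaneous dissolution, semi-infinite planar matrix), with illustrative parameter values rather than the ibuprofen or diclofenac fits:

```python
# Classical Higuchi release from a planar matrix (per unit area):
#   Q(t) = sqrt(D * (2*A - Cs) * Cs * t),  valid while A >> Cs and drug remains.
# Parameter values are illustrative, not the fitted silicone-matrix values.
import numpy as np

D  = 1.0e-7    # drug diffusivity in the matrix [cm^2/s] (assumed)
A  = 50.0      # initial drug loading [mg/cm^3] (assumed)
Cs = 2.0       # drug solubility in the matrix [mg/cm^3] (assumed)

t = np.array([1.0, 4.0, 9.0, 24.0]) * 3600.0     # sampling times [s]
Q = np.sqrt(D * (2.0 * A - Cs) * Cs * t)         # cumulative release [mg/cm^2]
for ti, qi in zip(t / 3600.0, Q):
    print(f"t = {ti:4.0f} h: Q = {qi:.3f} mg/cm^2")   # release scales as sqrt(t)
```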

  5. Polarographic validation of chemical speciation models

    International Nuclear Information System (INIS)

    Duffield, J.R.; Jarratt, J.A.

    2001-01-01

    It is well established that the chemical speciation of an element in a given matrix, or system of matrices, is of fundamental importance in controlling the transport behaviour of the element. Therefore, to accurately understand and predict the transport of elements and compounds in the environment it is a requirement that both the identities and concentrations of trace element physico-chemical forms can be ascertained. These twin requirements present the analytical scientist with considerable challenges given the labile equilibria, the range of time scales (from nanoseconds to years) and the range of concentrations (ultra-trace to macro) that may be involved. As a result of this analytical variability, chemical equilibrium modelling has become recognised as an important predictive tool in chemical speciation analysis. However, this technique requires firm underpinning by the use of complementary experimental techniques for the validation of the predictions made. The work reported here has been undertaken with the primary aim of investigating possible methodologies that can be used for the validation of chemical speciation models. However, in approaching this aim, direct chemical speciation analyses have been made in their own right. Results will be reported and analysed for the iron(II)/iron(III)-citrate proton system (pH 2 to 10; total [Fe] = 3 mmol dm -3 ; total [citrate 3- ] = 10 mmol dm -3 ) in which equilibrium constants have been determined using glass electrode potentiometry, speciation is predicted using the PHREEQE computer code, and validation of predictions is achieved by determination of iron complexation and redox state with associated concentrations. (authors)

  6. Systematic validation of non-equilibrium thermochemical models using Bayesian inference

    KAUST Repository

    Miki, Kenji

    2015-10-01

    © 2015 Elsevier Inc. The validation process proposed by Babuška et al. [1] is applied to thermochemical models describing post-shock flow conditions. In this validation approach, experimental data is involved only in the calibration of the models, and the decision process is based on quantities of interest (QoIs) predicted on scenarios that are not necessarily amenable experimentally. Moreover, uncertainties present in the experimental data, as well as those resulting from an incomplete physical model description, are propagated to the QoIs. We investigate four commonly used thermochemical models: a one-temperature model (which assumes thermal equilibrium among all inner modes), and two-temperature models developed by Macheret et al. [2], Marrone and Treanor [3], and Park [4]. Up to 16 uncertain parameters are estimated using Bayesian updating based on the latest absolute volumetric radiance data collected at the Electric Arc Shock Tube (EAST) installed inside the NASA Ames Research Center. Following the solution of the inverse problems, the forward problems are solved in order to predict the radiative heat flux, QoI, and examine the validity of these models. Our results show that all four models are invalid, but for different reasons: the one-temperature model simply fails to reproduce the data while the two-temperature models exhibit unacceptably large uncertainties in the QoI predictions.
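
    The Bayesian updating step described can be illustrated with a minimal random-walk Metropolis sampler calibrating a single uncertain rate parameter against noisy observations. Everything below is synthetic and stands in for the EAST radiance data and the thermochemical models; it shows only the mechanics of the inverse problem:

```python
# Minimal Metropolis sketch of Bayesian parameter calibration: infer a decay
# rate k from noisy observations y = exp(-k t) + noise. Synthetic stand-in for
# the radiance data and thermochemical models discussed above.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 20)
k_true, sigma = 1.3, 0.05
y = np.exp(-k_true * t) + rng.normal(0.0, sigma, t.size)   # synthetic data

def log_post(k):
    if k <= 0.0:
        return -np.inf                        # flat prior on k > 0
    r = y - np.exp(-k * t)
    return -0.5 * np.sum((r / sigma) ** 2)    # Gaussian likelihood

k, lp = 0.5, log_post(0.5)
chain = []
for _ in range(20000):
    k_new = k + rng.normal(0.0, 0.1)          # random-walk proposal
    lp_new = log_post(k_new)
    if np.log(rng.uniform()) < lp_new - lp:   # Metropolis accept/reject
        k, lp = k_new, lp_new
    chain.append(k)

post = np.array(chain[5000:])                 # discard burn-in
print(f"posterior k = {post.mean():.3f} +/- {post.std():.3f} (true {k_true})")
```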

  7. Stochastic modeling of oligodendrocyte generation in cell culture: model validation with time-lapse data

    Directory of Open Access Journals (Sweden)

    Noble Mark

    2006-05-01

    Full Text Available Abstract Background The purpose of this paper is two-fold. The first objective is to validate the assumptions behind a stochastic model developed earlier by these authors to describe oligodendrocyte generation in cell culture. The second is to generate time-lapse data that may help biomathematicians to build stochastic models of cell proliferation and differentiation under other experimental scenarios. Results Using time-lapse video recording it is possible to follow the individual evolutions of different cells within each clone. This experimental technique is very laborious and cannot replace model-based quantitative inference from clonal data. However, it is unrivalled in validating the structure of a stochastic model intended to describe cell proliferation and differentiation at the clonal level. In this paper, such data are reported and analyzed for oligodendrocyte precursor cells cultured in vitro. Conclusion The results strongly support the validity of the most basic assumptions underpinning the previously proposed model of oligodendrocyte development in cell culture. However, there are some discrepancies; the most important is that the contribution of progenitor cell death to cell kinetics in this experimental system has been underestimated.

  8. Validation of the MC{sup 2}-3/DIF3D Code System for Control Rod Worth via the BFS-75-1 Reactor Physics Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Sunghwan; Kim, Sang Ji [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this paper, control rod worths of the BFS-75-1 reactor physics experiments were examined using continuous energy MCNP models and deterministic MC2-3/DIF3D models based on the ENDF/B-VII.0 library. We can conclude that the ENDF/B-VII.0 library shows very good agreement for a small-size metal uranium fueled core surrounded by a depleted uranium blanket. However, the control rod heterogeneity effect reported in the reference is not significant in this problem because the tested control rod models were configured as single rods. Hence a comparison with other control rod worth measurement data, such as the BFS-109-2A reactor physics experiment, is planned as a future study. The BFS-75-1 critical experiment was carried out in the BFS-1 facility of IPPE in Russia within the framework of validating an early phase of the KALIMER-150 design. A Monte-Carlo model of the BFS-75-1 critical experiment had been developed. However, due to incomplete information for the BFS-75-1 experiments, Monte-Carlo models had been generated for the reference criticality and sodium void reactivity measurements with a disk-wise homogeneous model. Recently, KAERI performed another physics experiment, BFS-109-2A, in collaboration with the Russian IPPE. During the review process of the experimental report of the BFS-109-2A critical experiments, valuable information on the BFS-1 facility, which can also be used for the BFS-75-1 experiments, was discovered.
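
    Control rod worth in such code-to-experiment comparisons is conventionally reported as the reactivity difference between the unrodded and rodded eigenvalue calculations, quoted in pcm. A one-line sketch with hypothetical k-effective values:

```python
# Control rod worth from two eigenvalue calculations, in pcm:
#   rho_inserted = 1/k_rodded - 1/k_unrodded (negative reactivity inserted);
#   the worth is usually quoted as its magnitude. k values are hypothetical.
def rod_worth_pcm(k_unrodded, k_rodded):
    return (1.0 / k_rodded - 1.0 / k_unrodded) * 1.0e5

print(f"worth = {rod_worth_pcm(1.00250, 0.99180):.0f} pcm")   # ~1076 pcm
```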

  9. Mini-channel flow experiments and CFD validation analyses with the IFMIF Thermo- Hydraulic Experimental facility (ITHEX)

    International Nuclear Information System (INIS)

    Arbeiter, F.; Heinzel, V.; Leichtle, D.; Stratmanns, E.; Gordeev, S.

    2006-01-01

    The design of the IFMIF High Flux Test Module (HFTM) is based on predictions for the heat transfer in narrow channels conducting helium flow at a 50 °C inlet temperature and 0.3 MPa. The resulting helium flow conditions are in the laminar-to-turbulent transition regime. The rectangular cooling channels are too short for full development of the coolant flow, and relaminarization along the cooling passage is expected. At the shorter sides of the channels secondary flow occurs, which may have an impact on the temperature field inside the irradiation specimen stack. As those conditions are not covered by available experimental data, the dedicated gas loop ITHEX has been constructed to operate at pressures up to 0.42 MPa and temperatures up to 200 °C. Its objective is to conduct experiments for the validation of the STAR-CD CFD code used for the design of the HFTM. As a first stage, two annular test-sections with a hydraulic diameter of 1.2 mm have been used, where the experiments were varied with respect to gas species (N2, He), inlet pressure, dimensionless heating span and Reynolds number, encompassing the range of operational parameters of the HFTM. Local friction factors and Nusselt numbers have been obtained, giving evidence that the transition regime extends to a Reynolds number of 10,000. For heating rates comparable to the HFTM filled with RAFM steels, the local heat transfer coefficients are consistent with the measured friction data. To validate local velocity profiles the ITHEX facility was further equipped with a flat rectangular test-section and a Laser Doppler Anemometry (LDA) system. An appropriate optical system has been developed and tested for the tiny observation volume of 40 μm diameter. Velocity profiles induced by the transition from a wide inlet plenum to the flat mini-channels have been measured. Whereas the CFD models were able to reproduce the patterns far away from the nozzle, they show some disagreement for the conditions at the
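
    The local friction factors and Nusselt numbers reported are reduced from pressure-drop and wall-temperature measurements. A sketch of that data reduction for a heated mini-channel; all "measured" numbers and gas properties below are made up for illustration:

```python
# Data reduction sketch for a heated mini-channel: Reynolds number, Darcy
# friction factor from the pressure drop, Nusselt number from the wall superheat.
# All "measured" values and properties are illustrative, not ITHEX data.
rho, mu, k_gas = 0.45, 2.0e-5, 0.152   # helium-like density, viscosity, conductivity (assumed)
Dh, L = 1.2e-3, 0.08                   # hydraulic diameter and channel length [m]
u = 150.0                              # bulk velocity [m/s] (assumed)
dp = 1.35e4                            # measured pressure drop over L [Pa] (made up)
q_wall, Tw, Tb = 1.0e5, 110.0, 70.0    # wall heat flux [W/m^2], wall/bulk temperature [C]

Re = rho * u * Dh / mu                              # Reynolds number (~4000, transitional)
f_darcy = dp * (Dh / L) / (0.5 * rho * u**2)        # Darcy friction factor
h = q_wall / (Tw - Tb)                              # local heat transfer coefficient
Nu = h * Dh / k_gas                                 # Nusselt number
print(f"Re = {Re:.0f}, f = {f_darcy:.4f}, Nu = {Nu:.1f}")
```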

  10. Validation Hydrodynamic Models of Three Topological Models of Secondary Facultative Ponds

    OpenAIRE

    Aponte-Reyes Alxander

    2014-01-01

    A methodology was developed to analyze the boundary conditions, mesh size and turbulence of a mathematical CFD model that could explain the hydrodynamic behavior of facultative stabilization ponds (FSP) built at pilot scale: a conventional pond (CP), a baffled pond (BP), and a baffled-mesh pond (BMP). Dispersion studies were performed in the field for model validation, taking samples at the inlet and outlet of the FSP; this information was used to carry out CFD model simulations of the three topologies. ...

  11. Gravitational Acceleration Effects on Macrosegregation: Experiment and Computational Modeling

    Science.gov (United States)

    Leon-Torres, J.; Curreri, P. A.; Stefanescu, D. M.; Sen, S.

    1999-01-01

    Experiments were performed under terrestrial gravity (1g) and during parabolic flights (10^-2 g) to study the solidification and macrosegregation patterns of Al-Cu alloys. Alloys having 2% and 5% Cu were solidified against a chill at two different cooling rates. Microscopic and Electron Microprobe characterization was used to produce microstructural and macrosegregation maps. In all cases positive segregation occurred next to the chill because of shrinkage flow, as expected. This positive segregation was higher in the low-g samples, apparently because of the higher heat transfer coefficient. A 2-D computational model was used to explain the experimental results. The continuum formulation was employed to describe the macroscopic transport of mass, energy, and momentum associated with the solidification phenomena for a two-phase system. The model considers that liquid flow is driven by thermal and solutal buoyancy, and by solidification shrinkage. The solidification event was divided into two stages. In the first one, the liquid containing freely moving equiaxed grains was described through the relative viscosity concept. In the second stage, when a fixed dendritic network was formed after dendritic coherency, the mushy zone was treated as a porous medium. The macrosegregation maps and the cooling curves obtained during the experiments were used for validation of the solidification and segregation model. The model can explain the solidification and macrosegregation patterns and the differences between low- and high-gravity results.

  12. TRIMS: Validating T2 Molecular Effects for Neutrino Mass Experiments

    Science.gov (United States)

    Lin, Ying-Ting; Trims Collaboration

    2017-09-01

    The Tritium Recoil-Ion Mass Spectrometer (TRIMS) experiment examines the branching ratio of the molecular tritium (T2) beta decay to the bound state (3HeT+). Measuring this branching ratio helps to validate the current molecular final-state theory applied in neutrino mass experiments such as KATRIN and Project 8. TRIMS consists of a magnet-guided time-of-flight mass spectrometer with a detector located on each end. By measuring the kinetic energy and time-of-flight difference of the ions and beta particles reaching the detectors, we will be able to distinguish molecular ions from atomic ones and hence derive the ratio in question. We will give an update on the apparatus, simulation software, and analysis tools, including efforts to improve the resolution of our detectors and to characterize the stability and uniformity of our field sources. We will also share our commissioning results and prospects for physics data. The TRIMS experiment is supported by U.S. Department of Energy Office of Science, Office of Nuclear Physics, Award Number DE-FG02-97ER41020.
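
    The mass separation TRIMS relies on follows from elementary time-of-flight kinematics: ions accelerated through the same potential drift with flight times scaling as the square root of mass-to-charge. A back-of-the-envelope sketch with an assumed drift length and potential, not the TRIMS geometry:

```python
# Time-of-flight separation of T+ from 3HeT+ ions accelerated through a
# potential V and drifting over a length L:  t = L * sqrt(m / (2*q*V)).
# The geometry and voltage are assumed, not the TRIMS values.
import math

e, amu = 1.602e-19, 1.661e-27          # elementary charge [C], atomic mass unit [kg]
L, V = 0.5, 5.0e3                      # drift length [m], accelerating potential [V]

def tof_us(mass_amu, charge=1):
    """Flight time in microseconds for a singly/multiply charged ion."""
    return L * math.sqrt(mass_amu * amu / (2.0 * charge * e * V)) * 1e6

for name, m in (("T+", 3.016), ("3HeT+", 6.032)):
    print(f"{name:6s}: {tof_us(m):.3f} us")   # heavier bound-state ion arrives later
```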

  13. System Identification of a Heaving Point Absorber: Design of Experiment and Device Modeling

    Directory of Open Access Journals (Sweden)

    Giorgio Bacelli

    2017-04-01

    Full Text Available Empirically based modeling is an essential aspect of design for a wave energy converter. Empirically based models are used in structural, mechanical and control design processes, as well as for performance prediction. Both the design of experiments and methods used in system identification have a strong impact on the quality of the resulting model. This study considers the system identification and model validation process based on data collected from a wave tank test of a model-scale wave energy converter. Experimental design and data processing techniques based on general system identification procedures are discussed and compared with the practices often followed for wave tank testing. The general system identification processes are shown to have a number of advantages, including an increased signal-to-noise ratio, reduced experimental time and higher frequency resolution. The experimental wave tank data is used to produce multiple models using different formulations to represent the dynamics of the wave energy converter. These models are validated and their performance is compared against one another. While most models of wave energy converters use a formulation with surface elevation as an input, this study shows that a model using a hull pressure measurement to incorporate the wave excitation phenomenon has better accuracy.

  14. A low earth orbit dynamic model for the proton anisotropy validation

    Science.gov (United States)

    Badavi, Francis F.

    2011-11-01

    Ionizing radiation measurements at low earth orbit (LEO) form the ideal tool for the experimental validation of radiation environment models, nuclear transport code algorithms and nuclear reaction cross sections. Indeed, prior measurements on the space transportation system (STS; shuttle) have provided vital information impacting both environmental model and nuclear transport code development by requiring dynamic models of the LEO environment. Previous studies using computer aided design (CAD) models of the international space station (ISS) have demonstrated that the dosimetric prediction for a spacecraft at LEO requires the description of an environmental model with accurate anisotropic as well as dynamic behavior. This paper describes such a model for trapped protons. The described model is a component of a suite of codes collectively named GEORAD (GEOmagnetic RADiation) which computes cutoff rigidity and trapped proton and trapped electron environments. The web version of GEORAD is named OLTARIS (On-line Tool for the Assessment of Radiation in Space). The GEORAD suite is applicable to radiation environment prediction at LEO, medium earth orbit (MEO) and geosynchronous earth orbit (GEO) in quiet solar periods. The interest of GEORAD is in the study of the long-term effect of the trapped environment; it therefore does not account for any short-term external field contribution due to solar activity. Concentrating on LEO protons only, the paper presents the validation of the trapped proton model within GEORAD against reported measurements from the compact environment anomaly sensor (CEASE) science instrument package, flown onboard the tri-service experiment-5 (TSX-5) satellite during the period of June 2000 to July 2006. The spin-stabilized satellite was flown in a 410 × 1710 km, 69° inclination elliptical orbit, allowing it to be exposed to a broad range of the LEO regime. The paper places particular emphasis on the validation of the

  15. Validation study of the reactor physics lattice transport code WIMSD-5B by TRX and BAPL critical experiments of light water reactors

    International Nuclear Information System (INIS)

    Khan, M.J.H.; Alam, A.B.M.K.; Ahsan, M.H.; Mamun, K.A.A.; Islam, S.M.A.

    2015-01-01

    Highlights: • To validate the reactor physics lattice code WIMSD-5B by this analysis. • To model TRX and BAPL critical experiments using WIMSD-5B. • To compare the calculated results with experiment and MCNP results. • To rely on WIMSD-5B code for TRIGA calculations. - Abstract: The aim of this analysis is to validate the reactor physics lattice transport code WIMSD-5B by TRX (thermal reactor-one region lattice) and BAPL (Bettis Atomic Power Laboratory-one region lattice) critical experiments of light water reactors for neutronics analysis of 3 MW TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh. This analysis is achieved through the analysis of integral parameters of five light water reactor critical experiments TRX-1, TRX-2, BAPL-UO 2 -1, BAPL-UO 2 -2 and BAPL-UO 2 -3 based on evaluated nuclear data libraries JEFF-3.1 and ENDF/B-VII.1. In integral measurements, these experiments are considered as standard benchmark lattices for validating the reactor physics lattice transport code WIMSD-5B as well as evaluated nuclear data libraries. The integral parameters of the said critical experiments are calculated using the reactor physics lattice transport code WIMSD-5B. The calculated integral parameters are compared to the measured values as well as the earlier published MCNP results based on the Chinese evaluated nuclear data library CENDL-3.0 for assessment of deterministic calculation. It was found that the calculated integral parameters give mostly reasonable and globally consistent results with the experiment and the MCNP results. Besides, the group constants in WIMS format for the isotopes U-235 and U-238 between two data files have been compared using WIMS library utility code WILLIE and it was found that the group constants are well consistent with each other. Therefore, this analysis reveals the validation study of the reactor physics lattice transport code WIMSD-5B based on JEFF-3.1 and ENDF/B-VII.1 libraries and can also be essential to

  16. Accounting for treatment use when validating a prognostic model: a simulation study

    Directory of Open Access Journals (Sweden)

    Romin Pajouheshnia

    2017-07-01

    Full Text Available Abstract Background Prognostic models often show poor performance when applied to independent validation data sets. We illustrate how treatment use in a validation set can affect measures of model performance and present the uses and limitations of available analytical methods to account for this using simulated data. Methods We outline how the use of risk-lowering treatments in a validation set can lead to an apparent overestimation of risk by a prognostic model that was developed in a treatment-naïve cohort to make predictions of risk without treatment. Potential methods to correct for the effects of treatment use when testing or validating a prognostic model are discussed from a theoretical perspective. Subsequently, we assess, in simulated data sets, the impact of excluding treated individuals and the use of inverse probability weighting (IPW) on the estimated model discrimination (c-index) and calibration (observed:expected ratio and calibration plots) in scenarios with different patterns and effects of treatment use. Results Ignoring the use of effective treatments in a validation data set leads to poorer model discrimination and calibration than would be observed in the untreated target population for the model. Excluding treated individuals provided correct estimates of model performance only when treatment was randomly allocated, although this reduced the precision of the estimates. IPW followed by exclusion of the treated individuals provided correct estimates of model performance in data sets where treatment use was either random or moderately associated with an individual's risk when the assumptions of IPW were met, but yielded incorrect estimates in the presence of non-positivity or an unobserved confounder. Conclusions When validating a prognostic model developed to make predictions of risk without treatment, treatment use in the validation set can bias estimates of the performance of the model in future targeted individuals, and
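
    The IPW-followed-by-exclusion procedure assessed in the paper can be sketched in a few lines: estimate each individual's probability of remaining untreated, restrict to the untreated, and weight them by the inverse of that probability before computing calibration and discrimination. The sketch below uses synthetic data and a simple weighted c-index; it is not the authors' simulation code.

```python
# Sketch of IPW-based validation under treatment use: weight untreated
# individuals by 1/P(untreated | x) and compute a weighted observed:expected
# ratio and a weighted c-index. Entirely synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n = 20000
x = rng.normal(size=n)                            # single risk factor
p_risk = 1.0 / (1.0 + np.exp(-(x - 1.0)))         # model's untreated-risk prediction
p_treat = 1.0 / (1.0 + np.exp(-(x - 0.5)))        # treatment more likely at high risk
treated = rng.uniform(size=n) < p_treat
p_event = np.where(treated, 0.5 * p_risk, p_risk)  # treatment halves the risk
y = (rng.uniform(size=n) < p_event).astype(float)

w = 1.0 / (1.0 - p_treat)      # inverse probability of remaining untreated
m = ~treated                    # IPW followed by exclusion of the treated
oe = np.sum(w[m] * y[m]) / np.sum(w[m] * p_risk[m])   # weighted observed:expected

def c_index(p, y, w):
    """Weighted concordance: P(event outranks non-event); ties ignored."""
    order = np.argsort(p)
    p, y, w = p[order], y[order], w[order]
    below = np.cumsum(w * (1.0 - y))    # non-event weight ranked at or below each position
    return np.sum(w * y * below) / (np.sum(w * y) * np.sum(w * (1.0 - y)))

print(f"weighted O:E = {oe:.3f}, weighted c-index = {c_index(p_risk[m], y[m], w[m]):.3f}")
```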

  17. Development and validation of mechanical model for saturated/unsaturated bentonite buffer

    International Nuclear Information System (INIS)

    Yamamoto, S.; Komine, H.; Kato, S.

    2010-01-01

    Document available in extended abstract form only. Development and validation of mechanical models for bentonite buffer and backfill materials is one of the important subjects in appropriately evaluating the long-term behaviour or condition of the EBS in radioactive waste disposal. The Barcelona Basic Model (BBM), an extension of the modified Cam-Clay model to unsaturated and expansive soils, has been developed and widely applied to several problems using the coupled THM code CODE_BRIGHT. An advantage of the model is that the mechanical characteristics of buffer and backfill materials under not only saturated but also unsaturated conditions are taken into account, as well as the swelling characteristics due to wetting. In this study the BBM is compared with existing experimental data and a previously developed model in terms of the swelling characteristics of the Japanese bentonite Kunigel-V1, and is validated in terms of consolidation characteristics based on newly performed controlled-suction oedometer tests on the Kunigel-V1 bentonite. Komine et al. (2003) have proposed a model (a set of equations) for predicting swelling characteristics based on the diffuse double layer concept, the van der Waals force concept, etc. They performed a large number of swelling deformation tests on bentonite and sand-bentonite mixtures to confirm the applicability of the model. The BBM agrees well with the model proposed by Komine et al. and with the experimental data in terms of swelling characteristics. A compression index and a swelling index depending on suction are introduced in the BBM. Controlled-suction consolidation tests (oedometer tests) were performed to confirm the applicability of the suction-dependent indexes to unsaturated bentonite. Compacted bentonite with an initial dry density of 1.0 Mg/m3 was tested. A constant suction of 80 kPa, 280 kPa or 480 kPa was applied and maintained during the consolidation tests. Applicability of the BBM to consolidation and swelling behaviour of saturated and
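
    Two ingredients of the BBM being validated here are the suction-dependent compression index and the loading-collapse (LC) yield curve. A sketch of both relations in their standard published form (Alonso, Gens and Josa), evaluated at the suctions used in the oedometer tests; the material constants are illustrative, not fitted Kunigel-V1 values:

```python
# Two standard BBM relations (Alonso, Gens & Josa form):
#   lam(s) = lam0 * ((1 - r) * exp(-beta * s) + r)        suction-dependent slope
#   p0(s)  = pc * (p0_star / pc) ** ((lam0 - kappa) / (lam(s) - kappa))   LC curve
# All constants below are illustrative, not fitted Kunigel-V1 values.
import numpy as np

lam0, kappa = 0.15, 0.02      # saturated compression and (elastic) swelling indices
r, beta = 0.6, 1.0e-5         # limiting-stiffness ratio and rate parameter [1/Pa]
pc, p0_star = 1.0e4, 2.0e5    # reference stress and saturated preconsolidation stress [Pa]

def lam(s):
    return lam0 * ((1.0 - r) * np.exp(-beta * s) + r)

def p0(s):
    return pc * (p0_star / pc) ** ((lam0 - kappa) / (lam(s) - kappa))

for s in (0.0, 80e3, 280e3, 480e3):     # the suctions applied in the oedometer tests
    print(f"s = {s/1e3:5.0f} kPa: lambda = {lam(s):.4f}, p0 = {p0(s)/1e3:7.1f} kPa")
```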

  18. Validation of the TRACR3D code for soil water flow under saturated/unsaturated conditions in three experiments

    International Nuclear Information System (INIS)

    Perkins, B.; Travis, B.; DePoorter, G.

    1985-01-01

    Validation of the TRACR3D code in a one-dimensional form was obtained for flow of soil water in three experiments. In the first experiment, a pulse of water entered a crushed-tuff soil and initially moved under conditions of saturated flow, quickly followed by unsaturated flow. In the second experiment, steady-state unsaturated flow took place. In the final experiment, two slugs of water entered crushed tuff under field conditions. In all three experiments, experimentally measured data for volumetric water content agreed, within experimental errors, with the volumetric water content predicted by the code simulations. The experiments and simulations indicated the need for accurate knowledge of boundary and initial conditions, amount and duration of moisture input, and relevant material properties as input into the computer code. During the validation experiments, limitations on monitoring of water movement in waste burial sites were also noted. 5 references, 34 figures, 9 tables

  19. Paleoclimate validation of a numerical climate model

    International Nuclear Information System (INIS)

    Schelling, F.J.; Church, H.W.; Zak, B.D.; Thompson, S.L.

    1994-01-01

    An analysis planned to validate regional climate model results for a past climate state at Yucca Mountain, Nevada, against paleoclimate evidence for the period is described. This analysis, which will use the GENESIS model of global climate nested with the RegCM2 regional climate model, is part of a larger study for DOE's Yucca Mountain Site Characterization Project that is evaluating the impacts of long term future climate change on performance of the potential high level nuclear waste repository at Yucca Mountain. The planned analysis and anticipated results are presented

  20. Validation of a phytoremediation computer model

    International Nuclear Information System (INIS)

    Corapcioglu, M.Y.; Sung, K.; Rhykerd, R.L.; Munster, C.; Drew, M.

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg-1 TNT, PBB and chrysene. Vegetated and unvegetated treatments were conducted in triplicate to obtain data regarding contaminant concentrations in the soil, plant roots, root distribution, microbial activity, plant water use and soil moisture. When given the parameters of time and depth, the model successfully predicted contaminant concentrations under actual field conditions. Other model parameters are currently being evaluated. 15 refs., 2 figs

  1. Development and Validation of a Scale Assessing Mental Health Clinicians' Experiences of Associative Stigma.

    Science.gov (United States)

    Yanos, Philip T; Vayshenker, Beth; DeLuca, Joseph S; O'Connor, Lauren K

    2017-10-01

    Mental health professionals who work with people with serious mental illnesses are believed to experience associative stigma. Evidence suggests that associative stigma could play an important role in the erosion of empathy among professionals; however, no validated measure of the construct currently exists. This study examined the convergent and discriminant validity and factor structure of a new scale assessing the associative stigma experiences of clinicians working with people with serious mental illnesses. A total of 473 clinicians were recruited from professional associations in the United States and participated in an online study. Participants completed the Clinician Associative Stigma Scale (CASS) and measures of burnout, quality of care, expectations about recovery, and self-efficacy. Associative stigma experiences were commonly endorsed; eight items on the 18-item scale were endorsed as being experienced "sometimes" or "often" by over 50% of the sample. The new measure demonstrated a logical four-factor structure: "negative stereotypes about professional effectiveness," "discomfort with disclosure," "negative stereotypes about people with mental illness," and "stereotypes about professionals' mental health." The measure had good internal consistency. It was significantly related to measures of burnout and quality of care, but it was not related to measures of self-efficacy or expectations about recovery. Findings suggest that the CASS is internally consistent and shows evidence of convergent validity and that associative stigma is commonly experienced by mental health professionals who work with people with serious mental illnesses.

  2. Design and Validation of 3D Printed Complex Bone Models with Internal Anatomic Fidelity for Surgical Training and Rehearsal.

    Science.gov (United States)

    Unger, Bertram J; Kraut, Jay; Rhodes, Charlotte; Hochman, Jordan

    2014-01-01

    Physical models of complex bony structures can be used for surgical skills training. Current models focus on surface rendering but suffer from a lack of internal accuracy due to limitations in the manufacturing process. We describe a technique for generating internally accurate rapid-prototyped anatomical models with solid and hollow structures from clinical and microCT data using a 3D printer. In a face validation experiment, otolaryngology residents drilled a cadaveric bone and its corresponding printed model. The printed bone models were deemed highly realistic representations across all measured parameters and the educational value of the models was strongly appreciated.

  3. Making Validated Educational Models Central in Preschool Standards.

    Science.gov (United States)

    Schweinhart, Lawrence J.

    This paper presents some ideas to preschool educators and policy makers about how to make validated educational models central in standards for preschool education and care programs that are available to all 3- and 4-year-olds. Defining an educational model as a coherent body of program practices, curriculum content, program and child, and teacher…

  4. Fire Intensity Data for Validation of the Radiative Transfer Equation

    Energy Technology Data Exchange (ETDEWEB)

    Blanchat, Thomas K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jernigan, Dann A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    A set of experiments and test data that provide radiation intensity data for the validation of models of the radiative transfer equation is outlined in this report. The experiments were performed with lightly-sooting liquid hydrocarbon fuels that yielded fully turbulent fires (2 m diameter). In addition, supplemental measurements of air flow and temperature, fuel temperature and burn rate, flame surface emissive power, wall heat, and flame height and width provide a complete set of boundary condition data needed for the validation of models used in fire simulations.

  5. Mechanical Interaction in Pressurized Pipe Systems: Experiments and Numerical Models

    Directory of Open Access Journals (Sweden)

    Mariana Simão

    2015-11-01

Full Text Available The dynamic interaction between the unsteady flow occurrence and the resulting vibration of the pipe are analyzed based on experiments and numerical models. Waterhammer, structural dynamics and fluid–structure interaction (FSI) are the main subjects dealt with in this study. Firstly, a 1D model is developed based on the method of characteristics (MOC) using specific damping coefficients for initial components associated with rheological pipe material behavior, structural and fluid deformation, and type of anchored structural supports. Secondly, a 3D coupled complex model based on Computational Fluid Dynamics (CFD), using a Finite Element Method (FEM), is also applied to predict and distinguish the FSI events. Herein, a specific hydrodynamic model of viscosity to replicate the operation of a valve was also developed to minimize the number of mesh elements and the complexity of the system. The importance of integrated analysis of fluid–structure interaction, especially in non-rigidly anchored pipe systems, is equally emphasized. The developed models are validated through experimental tests.
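
    The 1D waterhammer model above is built on the method of characteristics. The sketch below is a textbook single-pipe MOC solver (upstream reservoir, instantaneous downstream valve closure); all pipe parameters are illustrative assumptions, and the paper's damping coefficients and FSI coupling are omitted.

    ```python
    import numpy as np

    g = 9.81                                 # gravity (m/s^2)
    L, a, D, f = 600.0, 1200.0, 0.5, 0.018   # length, wave speed, diameter, friction
    H0, Q0 = 50.0, 0.477                     # reservoir head (m), initial flow (m^3/s)
    N = 20                                   # number of reaches
    A = np.pi * D**2 / 4                     # flow area (m^2)
    dx = L / N
    dt = dx / a                              # MOC time step tied to the wave speed
    B = a / (g * A)                          # characteristic impedance
    R = f * dx / (2 * g * D * A**2)          # friction term per reach

    H = H0 - np.arange(N + 1) * R * Q0 * abs(Q0)   # steady-state head profile
    Q = np.full(N + 1, Q0)

    for _ in range(int(round(4 * L / a / dt))):    # march one wave period (4L/a)
        cp = H[:-2] + B * Q[:-2] - R * Q[:-2] * np.abs(Q[:-2])  # C+ from node i-1
        cm = H[2:]  - B * Q[2:]  + R * Q[2:]  * np.abs(Q[2:])   # C- from node i+1
        Hn, Qn = H.copy(), Q.copy()
        Hn[1:-1], Qn[1:-1] = (cp + cm) / 2, (cp - cm) / (2 * B)
        Hn[0], Qn[0] = H0, (H0 - (H[1] - B * Q[1] + R * Q[1] * np.abs(Q[1]))) / B
        Qn[-1] = 0.0                                            # valve stays shut
        Hn[-1] = H[-2] + B * Q[-2] - R * Q[-2] * np.abs(Q[-2])
        H, Q = Hn, Qn

    print(f"head at the closed valve after one 4L/a period: {H[-1]:.1f} m")
    ```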

  6. Validation and Analysis of Forward Osmosis CFD Model in Complex 3D Geometries

    Science.gov (United States)

    Gruber, Mathias F.; Johnson, Carl J.; Tang, Chuyang; Jensen, Mogens H.; Yde, Lars; Hélix-Nielsen, Claus

    2012-01-01

In forward osmosis (FO), an osmotic pressure gradient generated across a semi-permeable membrane is used to generate water transport from a dilute feed solution into a concentrated draw solution. This principle has shown great promise in the areas of water purification, wastewater treatment, seawater desalination and power generation. To ease optimization and increase understanding of membrane systems, it is desirable to have a comprehensive model that allows for easy investigation of all the major parameters in the separation process. Here we present experimental validation of a computational fluid dynamics (CFD) model developed to simulate FO experiments with asymmetric membranes. Simulations are compared with experimental results obtained from using two distinctly different complex three-dimensional membrane chambers. It is found that the CFD model accurately describes the solute separation process and water permeation through membranes under various flow conditions. It is furthermore demonstrated how the CFD model can be used to optimize membrane geometry in such a way as to promote mass transfer. PMID:24958428

  7. Validation and Analysis of Forward Osmosis CFD Model in Complex 3D Geometries

    Directory of Open Access Journals (Sweden)

    Lars Yde

    2012-11-01

Full Text Available In forward osmosis (FO), an osmotic pressure gradient generated across a semi-permeable membrane is used to generate water transport from a dilute feed solution into a concentrated draw solution. This principle has shown great promise in the areas of water purification, wastewater treatment, seawater desalination and power generation. To ease optimization and increase understanding of membrane systems, it is desirable to have a comprehensive model that allows for easy investigation of all the major parameters in the separation process. Here we present experimental validation of a computational fluid dynamics (CFD) model developed to simulate FO experiments with asymmetric membranes. Simulations are compared with experimental results obtained from using two distinctly different complex three-dimensional membrane chambers. It is found that the CFD model accurately describes the solute separation process and water permeation through membranes under various flow conditions. It is furthermore demonstrated how the CFD model can be used to optimize membrane geometry in such a way as to promote mass transfer.
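
    The driving force in the two FO records above is the osmotic pressure difference across the membrane. As a minimal sketch, the snippet below combines the van't Hoff relation with an idealized flux law J_w = A·Δπ; the water permeability and concentrations are assumed values, and the concentration polarization that the CFD model resolves is deliberately ignored.

    ```python
    R, T = 8.314, 298.15                      # gas constant (J/mol-K), temperature (K)

    def vant_hoff_pressure(molarity, ions=2):
        """Osmotic pressure (Pa) of a dilute salt solution, van't Hoff relation."""
        return ions * molarity * 1e3 * R * T  # mol/L -> mol/m^3

    def fo_water_flux(c_draw, c_feed, A=1e-12):
        """Idealized FO water flux J_w = A * (pi_draw - pi_feed) in m/s, with an
        assumed permeability A (m/s/Pa) and no concentration polarization."""
        return A * (vant_hoff_pressure(c_draw) - vant_hoff_pressure(c_feed))

    J = fo_water_flux(c_draw=1.0, c_feed=0.01)   # 1 M draw vs 0.01 M feed
    print(f"J_w = {J * 1e3 * 3600:.1f} L/m^2/h") # convert m/s to LMH
    ```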

8. Validation of computational model ALDERSON/EGSnrc for chest radiography

    International Nuclear Information System (INIS)

    Muniz, Bianca C.; Santos, André L. dos; Menezes, Claudio J.M.

    2017-01-01

To perform dose studies in situations of exposure to radiation, without exposing individuals, numerical dosimetry uses Computational Exposure Models (ECMs). Composed essentially of a radioactive source simulator algorithm, a voxel phantom representing the human anatomy and a Monte Carlo code, ECMs must be validated to determine the reliability of the representation of the physical arrangement. The objective of this work is to validate the ALDERSON/EGSnrc ECM through comparisons between the experimental measurements obtained with an ionization chamber and virtual simulations using the Monte Carlo method to determine the ratio of the input and output radiation dose. Preliminary results of these comparisons showed that the ECM reproduced the results of the experimental measurements performed with the physical phantom with a relative error of less than 10%, validating the use of this model for simulations of chest radiographs and estimates of radiation doses in tissues in the irradiated structures.

  9. Experiments Using a Ground-Based Electrostatic Levitator and Numerical Modeling of Melt Convection for the Iron-Cobalt System in Support of Space Experiments

    Science.gov (United States)

    Lee, Jonghyun; SanSoucie, Michael P.

    2017-08-01

Materials research is being conducted using an electromagnetic levitator installed in the International Space Station. Various metallic alloys were tested to elucidate unknown links among structures, processes, and properties. To accomplish the mission of these space experiments, several ground-based activities have been carried out. This article presents some of our ground-based supporting experiments and numerical modeling efforts. Mass evaporation of Fe50Co50, one of the flight compositions, was predicted numerically and validated by tests using an electrostatic levitator (ESL). The density of various compositions within the Fe-Co system was measured with the ESL. These results serve as reference data for the space experiments. The convection inside an electromagnetically levitated droplet was also modeled to predict the flow status, shear rate, and convection velocity under various process parameters, which is essential information for designing and analyzing the space experiments of some flight compositions influenced by convection.
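
    Evaporative mass loss from a levitated melt is commonly estimated with a Hertz-Knudsen-Langmuir flux. The sketch below shows that relation with an assumed vapour pressure, temperature and molar mass; it is not the study's actual evaporation model or data.

    ```python
    import numpy as np

    R = 8.314  # gas constant (J/mol-K)

    def langmuir_flux(p_sat, T, molar_mass, alpha=1.0):
        """Hertz-Knudsen-Langmuir evaporative mass flux (kg/m^2/s) into vacuum."""
        return alpha * p_sat * np.sqrt(molar_mass / (2 * np.pi * R * T))

    # Illustrative numbers only: an Fe-Co melt near 1850 K
    p_sat = 1.2    # Pa, assumed equilibrium vapour pressure
    M = 0.0574     # kg/mol, roughly the Fe50Co50 average molar mass
    print(f"evaporative flux ~ {langmuir_flux(p_sat, 1850.0, M):.2e} kg/m^2/s")
    ```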

  10. Arterial waveguide model for shear wave elastography: implementation and in vitro validation

    Science.gov (United States)

    Vaziri Astaneh, Ali; Urban, Matthew W.; Aquino, Wilkins; Greenleaf, James F.; Guddati, Murthy N.

    2017-07-01

Arterial stiffness is found to be an early indicator of many cardiovascular diseases. Among various techniques, shear wave elastography has emerged as a promising tool for estimating local arterial stiffness through the observed dispersion of guided waves. In this paper, we develop efficient models for the computational simulation of guided wave dispersion in arterial walls. The models are capable of considering fluid-loaded tubes, immersed in fluid or embedded in a solid, which are encountered in in vitro/ex vivo and in vivo experiments. The proposed methods are based on judiciously combining Fourier transformation and finite element discretization, leading to a significant reduction in computational cost while fully capturing complex 3D wave propagation. The developed methods are implemented in open-source code, and verified by comparing them with significantly more expensive, fully 3D finite element models. We also validate the models using the shear wave elastography of tissue-mimicking phantoms. The computational efficiency of the developed methods indicates the possibility of being able to estimate arterial stiffness in real time, which would be beneficial in clinical settings.

  11. Predictive validation of an influenza spread model.

    Directory of Open Access Journals (Sweden)

    Ayaz Hyder

Full Text Available BACKGROUND: Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. METHODS AND FINDINGS: We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998-1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability and depended on the method of forecasting (static or dynamic). CONCLUSIONS: Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve

  12. Predictive Validation of an Influenza Spread Model

    Science.gov (United States)

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability and depended on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive
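
    The deviation between an expected epidemic curve and an observed one can be summarized with a few simple metrics, such as the peak-week and peak-intensity errors discussed in the two records above. A sketch with made-up weekly case counts:

    ```python
    import numpy as np

    def forecast_errors(predicted: np.ndarray, observed: np.ndarray) -> dict:
        """Compare a simulated epidemic curve against weekly surveillance counts."""
        peak_week_err = int(np.argmax(predicted)) - int(np.argmax(observed))
        intensity_err = (predicted.max() - observed.max()) / observed.max()
        rmse = np.sqrt(np.mean((predicted - observed) ** 2))
        return {"peak_week_error": peak_week_err,
                "relative_peak_intensity_error": intensity_err,
                "rmse": rmse}

    # Hypothetical weekly lab-confirmed case counts for one season
    observed  = np.array([2, 5, 11, 30, 64, 90, 71, 40, 18, 7, 3])
    predicted = np.array([3, 7, 15, 38, 70, 82, 60, 35, 15, 6, 2])
    print(forecast_errors(predicted, observed))
    ```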

  13. Validation of the actuator line/Navier Stokes technique using mexico measurements

    DEFF Research Database (Denmark)

    Shen, Wen Zhong; Zhu, Wei Jun; Sørensen, Jens Nørkær

    2010-01-01

This paper concerns the contribution of DTU MEK to the international research collaboration project (MexNext), within the framework of IEA Annex 29, to validate aerodynamic models or CFD codes using the existing measurements made in the previous EU-funded project MEXICO (Model Experiments in Controlled Conditions). The Actuator Line/Navier-Stokes (AL/NS) technique developed at DTU is validated against the detailed MEXICO measurements. The AL/NS computations were carried out without modelling the DNW wind tunnel, at wind speeds of 10 m/s, 15 m/s and 24 m/s. Comparisons of blade loading between computations and measurements show…

  14. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    Michael Horsfall

    In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining from the ... This model generally states the numerical value of knowledge .... procedures found in the field of software engineering should be ...

  15. How to enhance the future use of energy policy simulation models through ex post validation

    International Nuclear Information System (INIS)

    Qudrat-Ullah, Hassan

    2017-01-01

Although simulation and modeling in general, and system dynamics models in particular, have long served the energy policy domain, ex post validation of these energy policy models is rarely addressed. In fact, ex post validation is a valuable area of research because it offers modelers a chance to enhance the future use of their simulation models by validating them against the field data. This paper contributes by presenting (i) a system dynamics simulation model, which was developed and used to do a three-dimensional, socio-economic and environmental long-term assessment of Pakistan's energy policy in 1999, and (ii) a systematic analysis, through ex post validation, of the 15-year-old predictive scenarios produced by that model. How did the model predictions compare with the actual data? We report that the ongoing crisis of the electricity sector of Pakistan is unfolding as the model-based scenarios had projected. - Highlights: • Argues that increased use of energy policy models depends on validation of their credibility. • An ex post validation process is presented as a solution to build confidence in models. • A unique system dynamics model, MDESRAP, is presented. • The root mean square percentage error and Theil's inequality statistic are applied. • The dynamic model, MDESRAP, is presented as an ex ante and ex post validated model.
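
    The two statistics named in the highlights are easy to state concretely. The sketch below computes the root mean square percentage error and Theil's inequality coefficient for a hypothetical model-versus-data series; the numbers are illustrative, not MDESRAP output.

    ```python
    import numpy as np

    def rmspe(actual, predicted):
        """Root mean square percentage error (%)."""
        actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
        return 100.0 * np.sqrt(np.mean(((predicted - actual) / actual) ** 2))

    def theil_u(actual, predicted):
        """Theil's inequality coefficient: 0 is a perfect fit, 1 the worst."""
        actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
        num = np.sqrt(np.mean((predicted - actual) ** 2))
        den = np.sqrt(np.mean(predicted ** 2)) + np.sqrt(np.mean(actual ** 2))
        return num / den

    # Hypothetical annual electricity generation (TWh): scenario vs. field data
    model = [60, 66, 73, 81, 90]
    data  = [58, 67, 75, 78, 93]
    print(f"RMSPE = {rmspe(data, model):.1f}%, Theil's U = {theil_u(data, model):.3f}")
    ```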

  16. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    Science.gov (United States)

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Given the high stakes in the clinical setting, it is critical to calculate the effect of these assumptions in the CFD simulation results. However, existing CFD validation approaches do not quantify error in the simulation results due to the CFD solver's modeling assumptions. Instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software STAR-CCM+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the STAR-CCM+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset, and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
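
    The central idea, parsing non-model error sources out of the total simulation-experiment discrepancy, can be illustrated as follows. The quadrature subtraction and all values are assumptions for the sketch, not the authors' exact decomposition.

    ```python
    import numpy as np

    def model_error(sim_vel, piv_vel, piv_unc, num_unc):
        """Estimate solver model error (%) along a validation line by removing
        non-model error sources (combined in quadrature) from the discrepancy."""
        total = np.abs(sim_vel - piv_vel)
        other = np.sqrt(piv_unc**2 + num_unc**2)     # measurement + numerical error
        return np.maximum(total - other, 0.0) / np.abs(piv_vel) * 100.0

    # Hypothetical velocity magnitudes (m/s) at points on the intersecting line
    sim = np.array([0.42, 0.55, 0.61, 0.48])
    piv = np.array([0.40, 0.52, 0.65, 0.47])
    err = model_error(sim, piv, piv_unc=0.01, num_unc=0.005)
    print(f"average model error = {err.mean():.2f}%")
    ```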

  17. Groundwater flow through a natural fracture. Flow experiments and numerical modelling

    Energy Technology Data Exchange (ETDEWEB)

    Larsson, Erik [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept of Geology

    1997-09-01

Groundwater flow and transport play an important role not only for groundwater exploration but also in environmental engineering problems. This report considers how the hydraulic properties of fractures in crystalline rock depend on the fracture aperture geometry. Different numerical models are discussed and a FDM computer code for two- and three-dimensional flow modelling has been developed. Different relations between the cells in the model are tested and compared with results in the literature. Laboratory experimental work has been done to carry out flow experiments and aperture measurements on the same specimen of a natural fracture. The drilled core sample had fractures parallel to the core axis and was placed inside a biaxial cell during the experiments. The water pressure gradient and the compression stress were varied during the experiments, and a tracer test was also done. After the flow experiments, the aperture distribution for a certain compression was measured by injecting an epoxy resin into the fracture. The thickness of the resin layer was then studied in saw-cut sections of the sample. The results from the experiments were used to validate numerical and analytical models, based on the aperture distribution, for flow and transport simulations. In the disturbed zone around a drift both water and air are present in the fractures. The gas will go to the widest part of the fracture, because the capillarity and the conductivity decrease there. The dependence of the effective conductivity on the variance of the conductivity, and the effect of extinction of highly conductive cells, have also been studied. How gas in fractures around a drift can cause a skin effect is discussed and modelled, and an example is given of the effect of a saturation that depends on the magnitude of the flow. 25 refs, 17 tabs, 43 figs.

  18. Groundwater flow through a natural fracture. Flow experiments and numerical modelling

    International Nuclear Information System (INIS)

    Larsson, Erik

    1997-09-01

Groundwater flow and transport play an important role not only for groundwater exploration but also in environmental engineering problems. This report considers how the hydraulic properties of fractures in crystalline rock depend on the fracture aperture geometry. Different numerical models are discussed and a FDM computer code for two- and three-dimensional flow modelling has been developed. Different relations between the cells in the model are tested and compared with results in the literature. Laboratory experimental work has been done to carry out flow experiments and aperture measurements on the same specimen of a natural fracture. The drilled core sample had fractures parallel to the core axis and was placed inside a biaxial cell during the experiments. The water pressure gradient and the compression stress were varied during the experiments, and a tracer test was also done. After the flow experiments, the aperture distribution for a certain compression was measured by injecting an epoxy resin into the fracture. The thickness of the resin layer was then studied in saw-cut sections of the sample. The results from the experiments were used to validate numerical and analytical models, based on the aperture distribution, for flow and transport simulations. In the disturbed zone around a drift both water and air are present in the fractures. The gas will go to the widest part of the fracture, because the capillarity and the conductivity decrease there. The dependence of the effective conductivity on the variance of the conductivity, and the effect of extinction of highly conductive cells, have also been studied. How gas in fractures around a drift can cause a skin effect is discussed and modelled, and an example is given of the effect of a saturation that depends on the magnitude of the flow.
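
    A common starting point for relating fracture aperture to hydraulic properties, as studied in the two records above, is the parallel-plate cubic law. A minimal sketch:

    ```python
    def cubic_law_transmissivity(aperture, rho=998.0, g=9.81, mu=1.0e-3):
        """Parallel-plate (cubic law) fracture transmissivity, T = rho*g*b^3/(12*mu),
        for water at roughly 20 C; aperture b in metres, T in m^2/s."""
        return rho * g * aperture**3 / (12.0 * mu)

    for b_um in (50.0, 100.0, 200.0):          # apertures in micrometres
        T = cubic_law_transmissivity(b_um * 1e-6)
        print(f"b = {b_um:5.0f} um -> T = {T:.3e} m^2/s")
    ```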

  19. Benchmarks for GADRAS performance validation

    International Nuclear Information System (INIS)

    Mattingly, John K.; Mitchell, Dean James; Rhykerd, Charles L. Jr.

    2009-01-01

    The performance of the Gamma Detector Response and Analysis Software (GADRAS) was validated by comparing GADRAS model results to experimental measurements for a series of benchmark sources. Sources for the benchmark include a plutonium metal sphere, bare and shielded in polyethylene, plutonium oxide in cans, a highly enriched uranium sphere, bare and shielded in polyethylene, a depleted uranium shell and spheres, and a natural uranium sphere. The benchmark experimental data were previously acquired and consist of careful collection of background and calibration source spectra along with the source spectra. The calibration data were fit with GADRAS to determine response functions for the detector in each experiment. A one-dimensional model (pie chart) was constructed for each source based on the dimensions of the benchmark source. The GADRAS code made a forward calculation from each model to predict the radiation spectrum for the detector used in the benchmark experiment. The comparisons between the GADRAS calculation and the experimental measurements are excellent, validating that GADRAS can correctly predict the radiation spectra for these well-defined benchmark sources.

  20. Cognitive endophenotypes, gene-environment interactions and experience-dependent plasticity in animal models of schizophrenia.

    Science.gov (United States)

    Burrows, Emma L; Hannan, Anthony J

    2016-04-01

    Schizophrenia is a devastating brain disorder caused by a complex and heterogeneous combination of genetic and environmental factors. In order to develop effective new strategies to prevent and treat schizophrenia, valid animal models are required which accurately model the disorder, and ideally provide construct, face and predictive validity. The cognitive deficits in schizophrenia represent some of the most debilitating symptoms and are also currently the most poorly treated. Therefore it is crucial that animal models are able to capture the cognitive dysfunction that characterizes schizophrenia, as well as the negative and psychotic symptoms. The genomes of mice have, prior to the recent gene-editing revolution, proven the most easily manipulable of mammalian laboratory species, and hence most genetic targeting has been performed using mouse models. Importantly, when key environmental factors of relevance to schizophrenia are experimentally manipulated, dramatic changes in the phenotypes of these animal models are often observed. We will review recent studies in rodent models which provide insight into gene-environment interactions in schizophrenia. We will focus specifically on environmental factors which modulate levels of experience-dependent plasticity, including environmental enrichment, cognitive stimulation, physical activity and stress. The insights provided by this research will not only help refine the establishment of optimally valid animal models which facilitate development of novel therapeutics, but will also provide insight into the pathogenesis of schizophrenia, thus identifying molecular and cellular targets for future preclinical and clinical investigations. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. A rod-airfoil experiment as a benchmark for broadband noise modeling

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, M.C. [Ecole Centrale de Lyon, Laboratoire de Mecanique des Fluides et d' Acoustique, Ecully Cedex (France); Universite Claude Bernard/Lyon I, Villeurbanne Cedex (France); Boudet, J.; Michard, M. [Ecole Centrale de Lyon, Laboratoire de Mecanique des Fluides et d' Acoustique, Ecully Cedex (France); Casalino, D. [Ecole Centrale de Lyon, Laboratoire de Mecanique des Fluides et d' Acoustique, Ecully Cedex (France); Fluorem SAS, Ecully Cedex (France)

    2005-07-01

A low Mach number rod-airfoil experiment is shown to be a good benchmark for numerical and theoretical broadband noise modeling. The benchmarking approach is applied to a sound computation from a 2D unsteady-Reynolds-averaged Navier-Stokes (U-RANS) flow field, where 3D effects are partially compensated for by a spanwise statistical model and by a 3D large eddy simulation. The experiment was conducted in the large anechoic wind tunnel of the Ecole Centrale de Lyon. Measurements taken included particle image velocimetry (PIV) around the airfoil, single hot wire, wall pressure coherence, and far field pressure. These measurements highlight the strong 3D effects responsible for spectral broadening around the rod vortex shedding frequency in the subcritical regime, and the dominance of the noise generated around the airfoil leading edge. The benchmarking approach is illustrated by two examples: the validation of a stochastical noise generation model applied to a 2D U-RANS computation; the assessment of a 3D LES computation using a new subgrid scale (SGS) model coupled to an advanced-time Ffowcs-Williams and Hawkings sound computation. (orig.)

  2. Bayesian Calibration, Validation and Uncertainty Quantification for Predictive Modelling of Tumour Growth: A Tutorial.

    Science.gov (United States)

    Collis, Joe; Connor, Anthony J; Paczkowski, Marcin; Kannan, Pavitra; Pitt-Francis, Joe; Byrne, Helen M; Hubbard, Matthew E

    2017-04-01

    In this work, we present a pedagogical tumour growth example, in which we apply calibration and validation techniques to an uncertain, Gompertzian model of tumour spheroid growth. The key contribution of this article is the discussion and application of these methods (that are not commonly employed in the field of cancer modelling) in the context of a simple model, whose deterministic analogue is widely known within the community. In the course of the example, we calibrate the model against experimental data that are subject to measurement errors, and then validate the resulting uncertain model predictions. We then analyse the sensitivity of the model predictions to the underlying measurement model. Finally, we propose an elementary learning approach for tuning a threshold parameter in the validation procedure in order to maximize predictive accuracy of our validated model.
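
    The calibration step of the tutorial, fitting an uncertain Gompertzian growth law to noisy measurements, can be sketched as a least-squares fit. The parameterization and data below are assumptions, and the paper's Bayesian treatment goes well beyond this point estimate.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gompertz(t, v0, a, b):
        """Gompertz growth: V(t) = v0 * exp((a/b) * (1 - exp(-b t)))."""
        return v0 * np.exp(a / b * (1.0 - np.exp(-b * t)))

    # Hypothetical spheroid volumes (mm^3) with measurement noise
    t_obs = np.array([0, 2, 4, 6, 8, 10, 12], dtype=float)
    v_obs = np.array([0.05, 0.11, 0.20, 0.33, 0.46, 0.57, 0.64])

    popt, pcov = curve_fit(gompertz, t_obs, v_obs, p0=[0.05, 0.5, 0.2])
    perr = np.sqrt(np.diag(pcov))   # 1-sigma parameter uncertainties
    print("fitted [v0, a, b]:", popt, "+/-", perr)
    ```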

  3. Validation of nuclear models used in space radiation shielding applications

    International Nuclear Information System (INIS)

    Norman, Ryan B.; Blattnig, Steve R.

    2013-01-01

    A program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as these models are developed over time. In this work, simple validation metrics applicable to testing both model accuracy and consistency with experimental data are developed. The developed metrics treat experimental measurement uncertainty as an interval and are therefore applicable to cases in which epistemic uncertainty dominates the experimental data. To demonstrate the applicability of the metrics, nuclear physics models used by NASA for space radiation shielding applications are compared to an experimental database consisting of over 3600 experimental cross sections. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by examining subsets of the model parameter space.
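
    A validation metric that treats experimental uncertainty as an interval can be as simple as the distance from the model value to that interval, zero whenever the prediction falls inside it. A sketch with invented cross sections, echoing the cumulative and median metrics mentioned above:

    ```python
    import numpy as np

    def interval_deviation(model, exp_lo, exp_hi):
        """Deviation of model values from experimental uncertainty intervals;
        zero when the prediction falls inside the interval."""
        below = np.maximum(exp_lo - model, 0.0)
        above = np.maximum(model - exp_hi, 0.0)
        return below + above

    # Hypothetical cross sections (mb): model vs. measurement +/- uncertainty
    model = np.array([112.0, 95.0, 140.0])
    meas  = np.array([105.0, 96.0, 150.0])
    unc   = np.array([8.0, 4.0, 5.0])
    dev = interval_deviation(model, meas - unc, meas + unc)
    print("per-point:", dev, "| cumulative:", dev.sum(), "| median:", np.median(dev))
    ```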

  4. THX Experiment Overview

    Science.gov (United States)

    Wernet, Mark; Wroblewski, Adam; Locke, Randy; Georgiadis, Nick

    2016-01-01

    This presentation provides an overview of experiments conducted at NASA GRC to provide turbulent flow measurements needed for new turbulence model development and validation. The experiments include particle image velocimetry (PIV) and hot-wire measurements of mean flow velocity and temperature fields, as well as fluctuating components.

  5. Validity of two-phase polymer electrolyte membrane fuel cell models with respect to the gas diffusion layer

    Science.gov (United States)

    Ziegler, C.; Gerteisen, D.

    A dynamic two-phase model of a proton exchange membrane fuel cell with respect to the gas diffusion layer (GDL) is presented and compared with chronoamperometric experiments. Very good agreement between experiment and simulation is achieved for potential step voltammetry (PSV) and sine wave testing (SWT). Homogenized two-phase models can be categorized in unsaturated flow theory (UFT) and multiphase mixture (M 2) models. Both model approaches use the continuum hypothesis as fundamental assumption. Cyclic voltammetry experiments show that there is a deterministic and a stochastic liquid transport mode depending on the fraction of hydrophilic pores of the GDL. ESEM imaging is used to investigate the morphology of the liquid water accumulation in the pores of two different media (unteflonated Toray-TGP-H-090 and hydrophobic Freudenberg H2315 I3). The morphology of the liquid water accumulation are related with the cell behavior. The results show that UFT and M 2 two-phase models are a valid approach for diffusion media with large fraction of hydrophilic pores such as unteflonated Toray-TGP-H paper. However, the use of the homgenized UFT and M 2 models appears to be invalid for GDLs with large fraction of hydrophobic pores that corresponds to a high average contact angle of the GDL.

  6. Validation and Scaling of Soil Moisture in a Semi-Arid Environment: SMAP Validation Experiment 2015 (SMAPVEX15)

    Science.gov (United States)

    Colliander, Andreas; Cosh, Michael H.; Misra, Sidharth; Jackson, Thomas J.; Crow, Wade T.; Chan, Steven; Bindlish, Rajat; Chae, Chun; Holifield Collins, Chandra; Yueh, Simon H.

    2017-01-01

The NASA SMAP (Soil Moisture Active Passive) mission conducted the SMAP Validation Experiment 2015 (SMAPVEX15) in order to support the calibration and validation activities of SMAP soil moisture data products. The main goals of the experiment were to address issues regarding the spatial disaggregation methodologies for improvement of soil moisture products and validation of the in situ measurement upscaling techniques. To support these objectives, high-resolution soil moisture maps were acquired with the airborne PALS (Passive Active L-band Sensor) instrument over an area in southeast Arizona that includes the Walnut Gulch Experimental Watershed (WGEW), and intensive ground sampling was carried out to augment the permanent in situ instrumentation. The objective of the paper was to establish the correspondence and relationship between the highly heterogeneous spatial distribution of soil moisture on the ground and the coarse-resolution radiometer-based soil moisture retrievals of SMAP. The high-resolution mapping conducted with PALS provided the required connection between the in situ measurements and SMAP retrievals. The in situ measurements were used to validate the PALS soil moisture acquired at 1-km resolution. Based on the information from a dense network of rain gauges in the study area, the in situ soil moisture measurements did not capture all the precipitation events accurately. That is, the PALS and SMAP soil moisture estimates responded to precipitation events detected by rain gauges, which were in some cases not detected by the in situ soil moisture sensors. It was also concluded that the spatial distribution of soil moisture resulting from the relatively small spatial extents of the typical convective storms in this region was not completely captured by the in situ stations. After removing those cases (approximately 10% of the observations) the following metrics were obtained: RMSD (root mean square difference) of 0.016 m³/m³ and correlation of 0.83. The
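
    The quoted RMSD and correlation are computed after screening out samples where the in situ network missed precipitation events. A sketch of that computation with hypothetical soil moisture values:

    ```python
    import numpy as np

    def validation_metrics(in_situ, retrieved, mask=None):
        """RMSD and Pearson correlation between in situ and retrieved soil moisture,
        optionally excluding flagged samples (e.g., rain events the sensors missed)."""
        x, y = np.asarray(in_situ, float), np.asarray(retrieved, float)
        if mask is not None:
            x, y = x[mask], y[mask]
        rmsd = np.sqrt(np.mean((y - x) ** 2))
        r = np.corrcoef(x, y)[0, 1]
        return rmsd, r

    # Hypothetical volumetric soil moisture; last sample flagged as a missed storm
    in_situ   = np.array([0.08, 0.12, 0.10, 0.15, 0.09, 0.05])
    retrieved = np.array([0.09, 0.13, 0.11, 0.13, 0.10, 0.21])
    keep = np.array([True, True, True, True, True, False])
    print("RMSD = %.3f m3/m3, r = %.2f" % validation_metrics(in_situ, retrieved, keep))
    ```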

  7. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  8. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    Van Emmerik, T.H.M.; Mulder, G.; Eilander, D.; Piet, M.; Savenije, H.H.G.

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  9. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  10. Validation of Inhibition Effect in the Cellulose Hydrolysis: a Dynamic Modelling Approach

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Tsai, Chien-Tai; Meyer, Anne S.

    2011-01-01

    Enzymatic hydrolysis is one of the main steps in the processing of bioethanol from lignocellulosic raw materials. However, complete understanding of the underlying phenomena is still under development. Hence, this study has focused on validation of the inhibition effects in the cellulosic biomass...... for parameter estimation (calibration) and validation purposes. The model predictions using calibrated parameters have shown good agreement with the validation data sets, which provides credibility to the model structure and the parameter values....

  11. Design and implementation of new methods for constructing designs of experiments for learning non-linear models; Conception et mise en oeuvre de nouvelles methodes d'elaboration de plans d'experiences pour l'apprentissage de modeles non lineaires

    Energy Technology Data Exchange (ETDEWEB)

    Gazut, St

    2007-03-15

This thesis addresses the problem of the construction of surrogate models in numerical simulation. Whenever numerical experiments are costly, the simulation model is complex and difficult to use. It is important then to select the numerical experiments as efficiently as possible in order to minimize their number. In statistics, the selection of experiments is known as optimal experimental design. In the context of numerical simulation, where no measurement uncertainty is present, we describe an alternative approach based on statistical learning theory and resampling techniques. The surrogate models are constructed using neural networks and the generalization error is estimated by leave-one-out, cross-validation and bootstrap. It is shown that the bootstrap can control over-fitting and extend the concept of leverage to surrogate models that are non-linear in their parameters. The thesis describes an iterative method called LDR (Learner Disagreement from experiment Re-sampling), based on active learning using several surrogate models constructed on bootstrap samples. The method consists of adding new experiments where the predictors constructed from bootstrap samples disagree most. We compare the LDR method with other methods of experimental design such as D-optimal selection. (author)
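
    The LDR idea, training surrogates on bootstrap resamples and adding the experiment where they disagree most, amounts to a query-by-committee loop. The sketch below is an illustrative reconstruction, not the thesis' exact algorithm; the network size and toy simulator are assumptions.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def ldr_select(X_train, y_train, X_candidates, n_models=10, seed=0):
        """Pick the candidate where bootstrap-trained surrogates disagree most."""
        rng = np.random.default_rng(seed)
        preds = []
        n = len(X_train)
        for k in range(n_models):
            idx = rng.integers(0, n, size=n)            # bootstrap resample
            model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000,
                                 random_state=k).fit(X_train[idx], y_train[idx])
            preds.append(model.predict(X_candidates))
        disagreement = np.var(np.stack(preds), axis=0)  # committee variance
        return int(np.argmax(disagreement))

    # Toy simulator: learn y = sin(3x) from a handful of runs
    X = np.linspace(0, 2, 6).reshape(-1, 1)
    y = np.sin(3 * X).ravel()
    cands = np.linspace(0, 2, 101).reshape(-1, 1)
    print("next experiment at x =", cands[ldr_select(X, y, cands)][0])
    ```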

  12. Validation of a New Elastoplastic Constitutive Model Dedicated to the Cyclic Behaviour of Brittle Rock Materials

    Science.gov (United States)

    Cerfontaine, B.; Charlier, R.; Collin, F.; Taiebat, M.

    2017-10-01

Old mines or caverns may be used as reservoirs for fuel/gas storage or in the context of large-scale energy storage. In the first case, oil or gas is stored on an annual basis. In the second case, pressure due to water or compressed air varies on a daily basis or even faster. In both cases a cyclic loading on the cavern's/mine's walls must be considered for the design. The complexity of rockwork geometries or coupling with water flow requires finite element modelling and hence a suitable constitutive law for modelling the rock behaviour. This paper presents and validates the formulation of a new constitutive law able to represent the inherently cyclic behaviour of rocks at low confinement. The main features of the behaviour evidenced by experiments in the literature are a progressive degradation and straining of the material with the number of cycles. A constitutive law based on a bounding surface concept is developed. It represents the brittle failure of the material as well as its progressive degradation. Kinematic hardening of the yield surface allows the modelling of cycles. Isotropic softening on the cohesion variable leads to the progressive degradation of the rock strength. A limit surface is introduced that has a lower opening than the bounding surface. This surface describes the peak strength of the material and allows the modelling of a brittle behaviour. In addition, a fatigue limit is introduced such that no cohesion degradation occurs if the stress state lies inside this surface. The model is validated against three different rock materials and types of experiments. Parameters of the constitutive law are calibrated against uniaxial tests on Lorano marble, a triaxial test on a sandstone, and a damage-controlled test on Lac du Bonnet granite. The model is shown to reproduce experimental results correctly, especially the evolution of strain with the number of cycles.

  13. An Evaluation of the FLAG Friction Model frictmultiscale2 using the Experiments of Juanicotena and Szarynski

    Energy Technology Data Exchange (ETDEWEB)

    Zocher, Marvin Anthony [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hammerberg, James Edward [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

The experiments of Juanicotena and Szarynski, namely T101, T102, and T105, are modeled for the purpose of gaining a better understanding of the FLAG friction model frictmultiscale2. This exercise has been conducted as a first step toward model validation. It is shown that with the inclusion of the friction model in the numerical analysis, the results of Juanicotena and Szarynski are predicted reasonably well. Without the friction model, the simulation results do not match the experimental data nearly as well. Suggestions for follow-on work are included.

  14. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  15. Thermal hydraulic model validation for HOR mixed core fuel management

    International Nuclear Information System (INIS)

    Gibcus, H.P.M.; Vries, J.W. de; Leege, P.F.A. de

    1997-01-01

A thermal-hydraulic core management model has been developed for the Hoger Onderwijsreactor (HOR), a 2 MW pool-type university research reactor. The model was adopted for safety analysis purposes in the framework of HEU/LEU core conversion studies. It is applied in the thermal-hydraulic computer code SHORT (Steady-state HOR Thermal-hydraulics), which is presently in use for designing core configurations and for in-core fuel management. An elaborate measurement program was performed to establish the core hydraulic characteristics for a variety of conditions. The hydraulic data were obtained with a dummy fuel element with special equipment allowing, among other things, direct measurement of the true core flow rate. Using these data the thermal-hydraulic model was validated experimentally. The model, experimental tests, and model validation are discussed. (author)

  16. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    Science.gov (United States)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They also can be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical value of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and ultimately reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and therefore is valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as the radar track data, into the analysis.

  17. Validated TRNSYS Model for Solar Assisted Space Heating System

    International Nuclear Information System (INIS)

    Abdalla, Nedal

    2014-01-01

The present study involves a validated TRNSYS model for a solar assisted space heating system as applied to a residential building in Jordan, using the new detailed radiation models of TRNSYS 17.1 and the geometric building model Trnsys3d for the Google SketchUp 3D drawing program. The annual heating load for a building (Solar House) which is located at the Royal Scientific Society (RSS) in Jordan is estimated under the climatological conditions of Amman. The aim of this paper is to compare the measured thermal performance of the Solar House with that modeled using TRNSYS. The results showed that the annual measured space heating load for the building was 6,188 kWh while the heating load for the modeled building was 6,391 kWh. Moreover, the measured solar fraction for the solar system was 50% while the modeled solar fraction was 55%. A comparison of modeled and measured data resulted in percentage mean absolute errors for solar energy for space heating, auxiliary heating and solar fraction of 13%, 7% and 10%, respectively. The validated model will be useful for long-term performance simulation under different weather and operating conditions. (author)

  18. In search of truth: the regulatory necessity of validation

    International Nuclear Information System (INIS)

    Niederer, U.

    1991-01-01

    A look at modern ideas of how scientific truth is achieved shows that theories are not really proved but accepted by a consensus of the experts, borne out by often repeated experience showing a theory to work well. In the same sense acceptability of models in waste disposal is mostly based on consensus. To obtain consensus of the relevant experts, including regulators, all models which considerably influence the results of a safety assessment have to be validated. This is particularly important for the models of geospheric migration because scientific experience with the deep underground is scarce. Validation plays a special role in public acceptance where regulators and other groups, which act as intermediaries between the public and the project manager, have to be convinced that all the relevant models are correct

  19. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: • The paper discusses the validation of creep rupture models derived from statistical analysis. • It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. • The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. • The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  20. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    Science.gov (United States)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and finally the probability of failure. The analysis accounted for unit-to-unit variations in temperature, flutter limit state, flutter stress magnitude, and fatigue life properties. The investigation focused on the calculation of relative risk rather than absolute risk between the usage scenarios. Verification predictions were first performed for three units with known usage and cracking histories to establish credibility in the model predictions. Then, numerous predictions were performed for an assortment of operating units that had flown recently or that were projected for future flights. Calculations were performed using two NASA-developed software tools: NESSUS® for the probabilistic analysis, and NASGRO® for the fracture

  1. Basic Modelling principles and Validation of Software for Prediction of Collision Damage

    DEFF Research Database (Denmark)

    Simonsen, Bo Cerup

    2000-01-01

This report describes basic modelling principles, the theoretical background and validation examples for the collision damage prediction module in the ISESO stand-alone software.

  2. Gap Conductance model Validation in the TASS/SMR-S code using MARS code

    International Nuclear Information System (INIS)

    Ahn, Sang Jun; Yang, Soo Hyung; Chung, Young Jong; Lee, Won Jae

    2010-01-01

Korea Atomic Energy Research Institute (KAERI) has been developing the TASS/SMR-S (Transient and Setpoint Simulation/Small and Medium Reactor) code, a thermal-hydraulic code for the safety analysis of the advanced integral reactor. Appropriate work is required to validate the applicability of the thermal-hydraulic models within the code. Among these models, the gap conductance model, which describes the thermal conductance of the gap between fuel and cladding, was validated through comparison with the MARS code. The validation of the gap conductance model was performed by evaluating the variation of the gap temperature and gap width as they changed with the power fraction. In this paper, a brief description of the gap conductance model in the TASS/SMR-S code is presented. In addition, calculated results to validate the gap conductance model are demonstrated by comparing them with the results of the MARS code for the test case.
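
    For orientation, the conduction part of fuel-cladding gap conductance is often approximated by a Ross-Stoute-type expression h = k_gas/(d + g). The sketch below uses an assumed helium conductivity and jump distance; it is not the actual TASS/SMR-S correlation.

    ```python
    def gap_conductance(k_gas, gap_width, jump_distance=1.0e-6):
        """Conduction component of gap conductance, h = k / (d + g), W/m^2-K."""
        return k_gas / (gap_width + jump_distance)

    k_he = 0.30   # W/m-K, order-of-magnitude helium conductivity at temperature
    for d_um in (80.0, 40.0, 10.0):   # conductance rises sharply as the gap closes
        h = gap_conductance(k_he, d_um * 1e-6)
        print(f"gap = {d_um:5.1f} um -> h_gap = {h / 1e3:6.1f} kW/m^2-K")
    ```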

  3. Use of Sensitivity and Uncertainty Analysis to Select Benchmark Experiments for the Validation of Computer Codes and Data

    International Nuclear Information System (INIS)

    Elam, K.R.; Rearden, B.T.

    2003-01-01

Sensitivity and uncertainty analysis methodologies under development at Oak Ridge National Laboratory were applied to determine whether existing benchmark experiments adequately cover the area of applicability for the criticality code and data validation of PuO2 and mixed-oxide (MOX) powder systems. The study examined three PuO2 powder systems and four MOX powder systems that would be useful for establishing mass limits for a MOX fuel fabrication facility. Using traditional methods to choose experiments for criticality analysis validation, 46 benchmark critical experiments were identified as applicable to the PuO2 powder systems. However, only 14 experiments were thought to be within the area of applicability for dry MOX powder systems. The applicability of 318 benchmark critical experiments, including the 60 experiments initially identified, was assessed. Each benchmark and powder system was analyzed using the Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) one-dimensional (TSUNAMI-1D) or TSUNAMI three-dimensional (TSUNAMI-3D) sensitivity analysis sequences, which will be included in the next release of the SCALE code system. This sensitivity data and cross-section uncertainty data were then processed with TSUNAMI-IP to determine the correlation of each application to each experiment in the benchmarking set. Correlation coefficients are used to assess the similarity between systems and determine the applicability of one system for the code and data validation of another. The applicability of most of the experiments identified using traditional methods was confirmed by the TSUNAMI analysis. In addition, some PuO2 and MOX powder systems were determined to be within the area of applicability of several other benchmarks that would not have been considered using traditional methods. Therefore, the number of benchmark experiments useful for the validation of these systems exceeds the number previously expected. The TSUNAMI analysis
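
    The correlation coefficient used to judge similarity combines the sensitivity vectors of two systems with a nuclear-data covariance matrix. A sketch in the spirit of the TSUNAMI-IP c_k coefficient, with invented three-group data:

    ```python
    import numpy as np

    def ck_similarity(s_app, s_exp, cov):
        """Uncertainty-weighted correlation between an application and a benchmark:
        shared variance from sensitivity vectors S and covariance matrix C."""
        num = s_app @ cov @ s_exp
        den = np.sqrt((s_app @ cov @ s_app) * (s_exp @ cov @ s_exp))
        return num / den

    # Hypothetical 3-group sensitivities and cross-section covariance
    cov = np.array([[4e-4, 1e-4, 0.0],
                    [1e-4, 9e-4, 2e-4],
                    [0.0,  2e-4, 1e-3]])
    s_application = np.array([0.30, 0.15, 0.05])
    s_benchmark   = np.array([0.28, 0.10, 0.08])
    print(f"c_k = {ck_similarity(s_application, s_benchmark, cov):.3f}")
    ```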

  4. The reactor kinetics code tank: a validation against selected SPERT-1b experiments

    International Nuclear Information System (INIS)

    Ellis, R.J.

    1990-01-01

The two-dimensional space-time analysis code TANK is being developed for the simulation of transient behaviour in the MAPLE class of research reactors. MAPLE research reactor cores are compact, light-water-cooled and -moderated, with a high degree of forced subcooling. The SPERT-1B(24/32) reactor core had many similarities to MAPLE-X10, and the results of the SPERT transient experiments are well documented. As a validation of TANK, a series of simulations of certain SPERT reactor transients was undertaken. Special features were added to the TANK code to model reactors with plate-type fuel and to allow for the simulation of rapid void production. The results of a series of super-prompt-critical reactivity step-insertion transient simulations are presented. The selected SPERT transients were all initiated from low power, at ambient temperatures, and with negligible coolant flow. The results of the TANK simulations are in good agreement with the trends in the experimental SPERT data

  5. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

The prime goal of model validation is to build confidence in the model concept and that the model is fit for its intended purpose. In other words: does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport including radioactive chain decay are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity

  6. A new approach for the validation of skeletal muscle modelling using MRI data

    Science.gov (United States)

    Böl, Markus; Sturmat, Maike; Weichert, Christine; Kober, Cornelia

    2011-05-01

    Active and passive experiments on skeletal muscles are in general arranged on isolated muscles or on whole muscle packages, such as the arm or the leg. Both methods exhibit advantages and disadvantages. Experiments on isolated muscles consider no information about the surrounding tissues, which leads to insufficient specification of the isolated muscle. In particular, the shape and the fibre directions of an embedded muscle are completely different from those of the same muscle in isolation. An explicit advantage, in contrast, is the possibility of studying the mechanical characteristics in a unique, isolated way. For experiments on muscle packages, the aforementioned pros and cons reverse: the whole surrounding tissue is reflected in the mechanical characteristics of the muscle, which are then much more difficult to identify, but an embedded muscle represents a much more realistic situation than an isolated one. Thus, in the proposed work we suggest, to our knowledge for the first time, a technique that allows the characteristics of single skeletal muscles to be studied inside a muscle package without any computation of the tissue around the muscle of interest. In doing so, we use magnetic resonance imaging data of an upper arm during contraction. By applying a three-dimensional continuum constitutive muscle model we are able to study the biceps brachii inside the upper arm and validate the modelling approach by optical experiments.

  7. Validation in the Absence of Observed Events.

    Science.gov (United States)

    Lathrop, John; Ezell, Barry

    2016-04-01

    This article addresses the problem of validating models in the absence of observed events, in the area of weapons of mass destruction (WMD) terrorism risk assessment. We address that problem with a broadened definition of "validation," based on stepping "up" a level to consider the reason why decisionmakers seek validation, and from that basis we redefine validation as testing how well the model can advise decisionmakers in terrorism risk management decisions. We develop that into two conditions: validation must be based on cues available in the observable world; and it must focus on what can be done to affect that observable world, i.e., risk management. That leads to two foci: (1) the real-world risk generating process, and (2) best use of available data. Based on our experience with nine WMD terrorism risk assessment models, we then describe three "best use of available data" pitfalls: SME confidence bias, lack of SME cross-referencing, and problematic initiation rates. Those two foci and three pitfalls provide a basis from which we define validation in this context in terms of four tests--Does the model: … capture initiation? … capture the sequence of events by which attack scenarios unfold? … consider unanticipated scenarios? … consider alternative causal chains? Finally, we corroborate our approach against three validation tests from the DOD literature: Is the model a correct representation of the process to be simulated? To what degree are the model results comparable to the real world? Over what range of inputs are the model results useful? © 2015 Society for Risk Analysis.

  8. Validation Assessment of a Glass-to-Metal Seal Finite-Element Model

    Energy Technology Data Exchange (ETDEWEB)

    Jamison, Ryan Dale [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Buchheit, Thomas E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Emery, John M [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stavig, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Newton, Clay S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brown, Arthur [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Sealing glasses are ubiquitous in high pressure and temperature engineering applications, such as hermetic feed-through electrical connectors. A common connector technology is the glass-to-metal seal, in which a metal shell compresses a sealing glass to create a hermetic seal. Though finite-element analysis has been used to understand and design glass-to-metal seals for many years, there has been little validation of these models. An indentation technique was employed to measure the residual stress on the surface of a simple glass-to-metal seal. Recently developed rate-dependent material models of both Schott 8061 glass and 304L VAR stainless steel have been applied to a finite-element model of the simple glass-to-metal seal. Model predictions of residual stress based on the evolution of the material models are shown and compared to measured data, and the validity of the finite-element predictions is discussed. It is shown that the finite-element model of the glass-to-metal seal accurately predicts the mean residual stress in the glass near the glass-to-metal interface and is valid for this quantity of interest.

  9. Validation of an employee satisfaction model: A structural equation model approach

    OpenAIRE

    Ophillia Ledimo; Nico Martins

    2015-01-01

    The purpose of this study was to validate an employee satisfaction model and to determine the relationships between its different dimensions, using the structural equation modelling (SEM) approach. A cross-sectional quantitative survey design was used to collect data from a random sample (n = 759) of permanent employees of a parastatal organisation. Data were collected using the Employee Satisfaction Survey (ESS) to measure employee satisfaction dimensions. Following the steps of ...

  10. Evaluation factors for verification and validation of low-level waste disposal site models

    International Nuclear Information System (INIS)

    Moran, M.S.; Mezga, L.J.

    1982-01-01

    The purpose of this paper is to identify general evaluation factors to be used to verify and validate LLW disposal site performance models in order to assess their site-specific applicability and to determine their accuracy and sensitivity. It is intended that the information contained in this paper be employed by model users involved with LLW site performance model verification and validation. It should not be construed as providing protocols, but rather as providing a framework for the preparation of specific protocols or procedures. A brief description of each evaluation factor is provided. The factors have been categorized according to recommended use during either the model verification or the model validation process. The general responsibilities of the developer and user are provided. In many cases it is difficult to separate the responsibilities of the developer and user, but the user is ultimately accountable for both verification and validation processes. 4 refs

  11. Closing the patient experience chasm: A two-level validation of the Consumer Quality Index Inpatient Hospital Care

    NARCIS (Netherlands)

    Smirnova, A.; Lombarts, K.; Arah, O.A.; Vleuten, C.P.M. van der

    2017-01-01

    BACKGROUND: Evaluation of patients' health care experiences is central to measuring patient-centred care. However, different instruments tend to be used at the hospital or departmental level but rarely both, leading to a lack of standardization of patient experience measures. OBJECTIVE: To validate

  12. Validation of VHTRC calculation benchmark of critical experiment using the MCB code

    Directory of Open Access Journals (Sweden)

    Stanisz Przemysław

    2016-01-01

    Full Text Available The calculation benchmark problem of the Very High Temperature Reactor Critical (VHTRC) assembly, a pin-in-block type critical core, has been investigated with the Monte Carlo Burnup (MCB) code in order to validate the latest versions of nuclear data libraries based on the ENDF format. The benchmark was executed on the basis of the VHTRC benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying the discrepancies in keff values between various libraries and experimental values, which makes it possible to improve the accuracy of neutron transport calculations and may help in designing high-performance commercial VHTRs. Almost all safety parameters depend on the accuracy of the neutron transport calculation results, which, in turn, depend on the accuracy of the nuclear data libraries. Thus, evaluation of the libraries' applicability to VHTR modelling is an important subject. We compared the numerical results with experimental measurements using two versions of available nuclear data (ENDF/B-VII.1 and JEFF-3.2) prepared for the required temperatures. Calculations were performed with the MCB code, which provides a very precise representation of the complex VHTR geometry, including the double heterogeneity of the fuel element. In this paper, together with the impact of nuclear data, we also discuss the impact of different lattice modelling inside the fuel pins. The calculated keff values show good agreement with each other and with the experimental data within the 1σ range of the experimental uncertainty. Because some propagated discrepancies were observed, we proposed appropriate corrections to the experimental constants which can improve the reactivity coefficient dependency. The obtained results confirm the accuracy of the new nuclear data libraries.
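
    The acceptance criterion used here, calculated keff agreeing with experiment within the 1σ experimental uncertainty, amounts to a one-line check; the values below are placeholders, not the VHTRC results:

```python
def within_one_sigma(k_calc, k_exp, sigma_exp):
    """True if |C - E| lies inside the 1-sigma experimental band."""
    return abs(k_calc - k_exp) <= sigma_exp

# Placeholder values for two libraries against one configuration.
k_exp, sigma = 1.0000, 0.0015
for lib, k_calc in {"ENDF/B-VII.1": 1.0009, "JEFF-3.2": 0.9991}.items():
    print(lib, "C/E =", round(k_calc / k_exp, 5),
          "within 1 sigma:", within_one_sigma(k_calc, k_exp, sigma))
```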

  13. Model validation of GAMMA code with heat transfer experiment for KO TBM in ITER

    International Nuclear Information System (INIS)

    Yum, Soo Been; Lee, Eo Hwak; Lee, Dong Won; Park, Goon Cherl

    2013-01-01

    Highlights: ► In this study, a helium supplying system was constructed. ► Preparation for the heat transfer experiment in KO TBM conditions using the helium supplying system progressed. ► To obtain more applicable results, a test matrix was made to cover the KO TBM conditions. ► Using the CFD code CFX 11, validation and modification of the system code GAMMA were performed. -- Abstract: By considering the requirements for a DEMO-relevant blanket concept, Korea (KO) has proposed a He cooled molten lithium (HCML) test blanket module (TBM) for testing in ITER. A performance analysis for the thermal–hydraulics and a safety analysis for the KO TBM have been carried out using a commercial CFD code, ANSYS-CFX, and a system code, GAMMA (GAs multicomponent mixture analysis), which was developed for gas-cooled reactor analysis in Korea. To verify the codes, a preliminary study was performed by Lee using a single TBM first wall (FW) mock-up made from the same material as the KO TBM, ferritic martensitic steel, in a 6 MPa nitrogen gas loop. The test was performed at pressures of 1.1, 1.9 and 2.9 MPa, and over a range of flow rates from 0.0105 to 0.0407 kg/s with a constant wall temperature condition. In the present study, a thermal–hydraulic test was performed with the newly constructed helium supplying system, in which the design pressure and temperature were 9 MPa and 500 °C, respectively. In the experiment, the same mock-up was used, and the test was performed at 3 MPa pressure, 30 °C inlet temperature and 70 m/s helium velocity, which are almost the same conditions as those of the KO TBM FW. One side of the mock-up was heated with a constant heat flux of 0.3–0.5 MW/m2 using a graphite heating system, KoHLT-2 (Korea heat load test facility-2). Because the comparison between CFX 11 and GAMMA showed differing tendencies, the heat transfer correlation included in GAMMA was modified, and the modified GAMMA showed strong agreement with CFX
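
    The abstract does not spell out which heat transfer correlation in GAMMA was modified; as a generic illustration, system codes of this kind typically evaluate a forced-convection Nusselt correlation of the Dittus-Boelter form. A sketch for roughly FW-like helium conditions (the property values are rough estimates and the hydraulic diameter is assumed):

```python
# Dittus-Boelter style estimate of a helium heat transfer coefficient.
# Property values are rough estimates for He at ~3 MPa and ~30-100 C.
rho = 4.3            # density, kg/m^3
mu = 2.0e-5          # dynamic viscosity, Pa.s
k = 0.16             # thermal conductivity, W/(m.K)
cp = 5193.0          # specific heat, J/(kg.K)
u, d_h = 70.0, 0.01  # velocity (m/s) and hydraulic diameter (m, assumed)

Re = rho * u * d_h / mu
Pr = cp * mu / k
Nu = 0.023 * Re**0.8 * Pr**0.4   # Dittus-Boelter form, heating case
h = Nu * k / d_h
print(f"Re = {Re:.3g}, Pr = {Pr:.2f}, h = {h:.0f} W/(m^2.K)")
```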

  14. Service validity and service reliability of search, experience and credence services. A scenario study

    NARCIS (Netherlands)

    Galetzka, Mirjam; Verhoeven, J.W.M.; Pruyn, Adriaan T.H.

    2006-01-01

    The purpose of this research is to add to our understanding of the antecedents of customer satisfaction by examining the effects of service reliability (Is the service “correctly” produced?) and service validity (Is the “correct” service produced?) of search, experience and credence services.

  15. Validation of elk resource selection models with spatially independent data

    Science.gov (United States)

    Priscilla K. Coe; Bruce K. Johnson; Michael J. Wisdom; John G. Cook; Marty Vavra; Ryan M. Nielson

    2011-01-01

    Knowledge of how landscape features affect wildlife resource use is essential for informed management. Resource selection functions often are used to make and validate predictions about landscape use; however, resource selection functions are rarely validated with data from landscapes independent of those from which the models were built. This problem has severely...

  16. Validation of neutronic methods applied to the analysis of fast subcritical systems. The MUSE-2 experiments

    International Nuclear Information System (INIS)

    Soule, R.; Salvatores, M.; Jacqmin, R.; Martini, M.; Lebrat, J.F.; Bertrand, P.; Broccoli, U.; Peluso, V.

    1997-01-01

    In the framework of the French SPIN program devoted to the separation and the transmutation of radioactive wastes, the CEA has launched the ISAAC program to investigate the potential of accelerator-driven systems and to provide an experimental validation of the physics characteristics of these systems. The neutronics of the subcritical core needs experimental validation. This can be done by decoupling the problem of the neutron source from the problem of the subcritical medium. Experiments with a well known external source placed in a subcritical medium have been performed in the MASURCA facility. The results confirm the high accuracy achievable with such experiments and the good quality of the ERANOS code system predictions. (author)

  17. Validation of neutronic methods applied to the analysis of fast subcritical systems. The MUSE-2 experiments

    Energy Technology Data Exchange (ETDEWEB)

    Soule, R; Salvatores, M; Jacqmin, R; Martini, M; Lebrat, J F; Bertrand, P [CEA Centre d'Etudes de Cadarache, Service de Physique des Reacteurs et du Cycle, 13 - Saint-Paul-lez-Durance (France); Broccoli, U; Peluso, V

    1998-12-31

    In the framework of the French SPIN program devoted to the separation and the transmutation of radioactive wastes, the CEA has launched the ISAAC program to investigate the potential of accelerator-driven systems and to provide an experimental validation of the physics characteristics of these systems. The neutronics of the subcritical core needs experimental validation. This can be done by decoupling the problem of the neutron source from the problem of the subcritical medium. Experiments with a well known external source placed in a subcritical medium have been performed in the MASURCA facility. The results confirm the high accuracy achievable with such experiments and the good quality of the ERANOS code system predictions. (author)

  18. Catchment-scale Validation of a Physically-based, Post-fire Runoff and Erosion Model

    Science.gov (United States)

    Quinn, D.; Brooks, E. S.; Robichaud, P. R.; Dobre, M.; Brown, R. E.; Wagenbrenner, J.

    2017-12-01

    The cascading consequences of fire-induced ecological changes have profound impacts on both natural and managed forest ecosystems. Forest managers tasked with implementing post-fire mitigation strategies need robust tools to evaluate the effectiveness of their decisions, particularly those affecting hydrological recovery. Various hillslope-scale interfaces of the physically-based Water Erosion Prediction Project (WEPP) model have been successfully validated for this purpose using fire-affected plot experiments; however, these interfaces are explicitly designed to simulate single hillslopes. Spatially-distributed, catchment-scale WEPP interfaces have been developed over the past decade, but none has been validated for post-fire simulations, posing a barrier to adoption by forest managers. In this validation study, we compare WEPP simulations with pre- and post-fire hydrological records for three forested catchments (W. Willow, N. Thomas, and S. Thomas) that burned in the 2011 Wallow Fire in northeastern Arizona, USA. Simulations were conducted using two approaches: the first using automatically created inputs from an online, spatial, post-fire WEPP interface, and the second using manually created inputs incorporating the spatial variability of fire effects observed in the field. Both approaches were compared to five years of observed post-fire sediment and flow data to assess goodness of fit.
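
    Goodness of fit for flow records of this kind is often summarized with the Nash-Sutcliffe efficiency; a minimal implementation follows (the abstract does not name the metric used, and the arrays below are placeholders, not the Wallow Fire data):

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

obs = [0.12, 0.45, 1.80, 0.95, 0.30]   # placeholder daily flows, mm
sim = [0.10, 0.52, 1.55, 1.05, 0.28]
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")
```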

  19. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    Science.gov (United States)

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  20. Validity of information security policy models

    Directory of Open Access Journals (Sweden)

    Joshua Onome Imoniana

    Full Text Available Validity is concerned with establishing evidence for the use of a method with a particular population. Thus, when we address the issue of applying security policy models, we are concerned with the implementation of a certain policy, taking into consideration the standards required, through the attribution of scores to every item in the research instrument. In today's globalized economic scenarios, the implementation of an information security policy in an information technology environment is a condition sine qua non for the strategic management process of any organization. On this topic, various studies present evidence that the responsibility for maintaining a policy rests primarily with the Chief Security Officer, who strives to enhance the updating of technologies in order to meet all-inclusive business continuity planning policies. For such a policy to be effective, it has to be fully embraced by the Chief Executive Officer. This study was developed with the purpose of validating specific theoretical models, whose designs were based on a literature review, by sampling 10 of the automobile industries located in the ABC region of Metropolitan São Paulo City. This sampling was based on the representativeness of these industries, particularly with regard to each one's implementation of information technology in the region. The study concludes by presenting evidence of the discriminant validity of four key dimensions of the security policy: Physical Security, Logical Access Security, Administrative Security, and Legal & Environmental Security. The analysis of the Cronbach's alpha structure of these security items attests not only that the capacity of those industries to implement security policies is indisputable, but also that the items involved correlate homogeneously with each other.
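
    Cronbach's alpha, the internal-consistency statistic applied to the security items above, is straightforward to compute from a respondents-by-items score matrix; the scores below are randomly generated stand-ins, not the study's survey data:

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: respondents x items matrix; alpha = k/(k-1) * (1 - sum(var_i)/var_total)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Ten respondents x four items (e.g., physical, logical, admin, legal scores).
rng = np.random.default_rng(0)
base = rng.integers(2, 6, size=(10, 1))                  # shared attitude level
scores = np.clip(base + rng.integers(-1, 2, size=(10, 4)), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```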

  1. Validation and selection of ODE based systems biology models: how to arrive at more reliable decisions.

    Science.gov (United States)

    Hasdemir, Dicle; Hoefsloot, Huub C J; Smilde, Age K

    2015-07-08

    Most ordinary differential equation (ODE) based modeling studies in systems biology involve a hold-out validation step for model validation. In this framework a pre-determined part of the data is used as validation data and is therefore not used for estimating the parameters of the model. The model is assumed to be validated if the model predictions on the validation dataset show good agreement with the data. Model selection between alternative model structures can also be performed in the same setting, based on the predictive power of the model structures on the validation dataset. However, the drawbacks associated with this approach are usually underestimated. We have carried out simulations using a recently published High Osmolarity Glycerol (HOG) pathway model from S. cerevisiae to demonstrate these drawbacks. We have shown that it is very important how the data are partitioned and which part of the data is used for validation purposes. The hold-out validation strategy leads to biased conclusions, since it can lead to different validation and selection decisions when different partitioning schemes are used. Furthermore, finding sensible partitioning schemes that would lead to reliable decisions is heavily dependent on the biology and the unknown model parameters, which turns the problem into a paradox. This brings the need for alternative validation approaches that offer flexible partitioning of the data. For this purpose, we have introduced a stratified random cross-validation (SRCV) approach that successfully overcomes these limitations. SRCV leads to more stable decisions for both validation and selection, which are not biased by underlying biological phenomena. Furthermore, it is less dependent on the specific noise realization in the data. Therefore, it proves to be a promising alternative to the standard hold-out validation strategy.
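
    The stratification step of an SRCV-style scheme can be sketched without any ODE machinery: partition samples into folds so that each fold preserves the distribution of some stratification variable. The binning rule and data below are illustrative, not the authors' algorithm:

```python
import numpy as np

def stratified_folds(values, n_folds=5, n_bins=4, seed=0):
    """Assign each sample to a fold so every fold spans the value distribution."""
    rng = np.random.default_rng(seed)
    # Quantile cut points define the strata (bins of comparable size).
    bins = np.quantile(values, np.linspace(0, 1, n_bins + 1)[1:-1])
    strata = np.digitize(values, bins)
    folds = np.empty(len(values), dtype=int)
    for s in np.unique(strata):
        idx = rng.permutation(np.where(strata == s)[0])
        folds[idx] = np.arange(len(idx)) % n_folds     # deal stratum round-robin
    return folds

y = np.random.default_rng(1).gamma(2.0, 1.5, size=40)   # e.g., measured outputs
folds = stratified_folds(y)
for f in range(5):
    print(f"fold {f}: n = {np.sum(folds == f)}, mean = {y[folds == f].mean():.2f}")
```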

  2. Sediment fingerprinting experiments to test the sensitivity of multivariate mixing models

    Science.gov (United States)

    Gaspar, Leticia; Blake, Will; Smith, Hugh; Navas, Ana

    2014-05-01

    Sediment fingerprinting techniques provide insight into the dynamics of sediment transfer processes and support catchment management decisions. As the questions asked of fingerprinting datasets become increasingly complex, validation of model output and sensitivity tests become increasingly important. This study adopts an experimental approach to explore the validity and sensitivity of mixing model outputs for materials with contrasting geochemical and particle size composition. The experiments reported here focused on (i) the sensitivity of model output to different fingerprint selection procedures and (ii) the influence of source material particle size distributions on model output. Five soils with significantly different geochemistry, soil organic matter and particle size distributions were selected as experimental source materials. A total of twelve sediment mixtures were prepared in the laboratory by combining different quantified proportions of the source soils, and fingerprint properties were selected using different statistical procedures (the Kruskal-Wallis test, Discriminant Function Analysis (DFA), Principal Component Analysis (PCA), or a correlation matrix). Summary results for the use of the mixing model with the different sets of fingerprint properties for the twelve mixed soils were reasonably consistent with the initially known mixing percentages. Given the experimental nature of the work and the dry mixing of materials, conservative geochemical behavior was assumed for all elements, even for those that might be disregarded in aquatic systems (e.g. P). In general, the best fits between actual and modeled proportions were found using a set of nine tracer properties (Sr, Rb, Fe, Ti, Ca, Al, P, Si, K) derived using DFA coupled with a multivariate stepwise algorithm, with errors between real and estimated values that did not exceed 6.7% and values of GOF above 94.5%. The second set of experiments aimed to explore the sensitivity of model output to variability in the particle size of source materials assuming that a degree of
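
    The core computation in such a mixing model is recovering the source proportions that best reproduce a mixture's fingerprint, with proportions constrained to be non-negative and to sum to one. A minimal sketch with invented tracer values (not the study's geochemistry):

```python
import numpy as np
from scipy.optimize import minimize

# Rows: tracer properties; columns: source soils (invented, normalized values).
A = np.array([[0.9, 0.2, 0.5],
              [0.1, 0.8, 0.4],
              [0.3, 0.3, 0.9]])
true_p = np.array([0.5, 0.3, 0.2])
mixture = A @ true_p                      # lab mixture with known proportions

def misfit(p):
    return np.sum((A @ p - mixture) ** 2)

res = minimize(misfit, x0=np.full(3, 1 / 3), bounds=[(0, 1)] * 3,
               constraints={"type": "eq", "fun": lambda p: p.sum() - 1.0})
print("estimated proportions:", np.round(res.x, 3))   # should recover 0.5/0.3/0.2
```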

  3. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  4. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  5. Validation of CALMET/CALPUFF models simulations around a large power plant stack

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez-Garces, A.; Souto, J. A.; Rodriguez, A.; Saavedra, S.; Casares, J. J.

    2015-07-01

    The CALMET/CALPUFF modeling system is frequently used in the study of atmospheric processes and pollution, and several validation tests have been performed to date; nevertheless, most of them were based on experiments with a large compilation of surface and aloft meteorological measurements, which are rarely available. At the same time, the use of a large operational smokestack as a tracer/pollutant source is not usual. In this work, the CALMET meteorological diagnostic model is first nested to WRF meteorological prognostic model simulations (3x3 km2 horizontal resolution) over a complex terrain and coastal domain in NW Spain, covering 100x100 km2, with a coal-fired power plant emitting SO2. Simulations were performed during three different periods when SO2 hourly ground-level concentration (glc) peaks were observed. NCEP reanalyses were applied as initial and boundary conditions. The Yonsei University-Pleim-Chang (YSU) PBL scheme was selected in the WRF model to provide the best input to three different CALMET horizontal resolutions, 1x1 km2, 0.5x0.5 km2, and 0.2x0.2 km2. The best results, very similar to each other, were achieved using the last two resolutions; therefore, the 0.5x0.5 km2 resolution was selected to test different CALMET meteorological inputs, using several combinations of WRF outputs and/or surface and upper-air measurements available in the simulation domain. With respect to the models' aloft output, CALMET PBL depth estimations are very similar to PBL depth estimations using upper-air measurements (rawinsondes), and significantly better than WRF PBL depth results. Regarding the models' surface output, the available meteorological sites were divided into two groups, one to provide meteorological input to CALMET (when applied), and another for model validation. Comparing WRF and CALMET outputs against surface measurements (from the model validation sites), the lowest RMSE was achieved using as CALMET input dataset WRF output combined with
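
    The station comparison reported here rests on RMSE; for reference, a two-line definition alongside mean bias (the series below are placeholder wind speeds, not the NW Spain measurements):

```python
import numpy as np

def rmse(obs, mod):
    return float(np.sqrt(np.mean((np.asarray(obs) - np.asarray(mod)) ** 2)))

def mean_bias(obs, mod):
    return float(np.mean(np.asarray(mod) - np.asarray(obs)))

obs = [2.1, 3.4, 5.0, 4.2, 2.8]   # placeholder 10 m wind speeds, m/s
mod = [2.5, 3.1, 4.4, 4.9, 3.0]
print(f"RMSE = {rmse(obs, mod):.2f} m/s, bias = {mean_bias(obs, mod):+.2f} m/s")
```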

  6. Validation of CALMET/CALPUFF models simulations around a large power plant stack

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez-Garces, A.; Souto Rodriguez, J.A.; Saavedra, S.; Casares, J.J.

    2015-07-01

    The CALMET/CALPUFF modeling system is frequently used in the study of atmospheric processes and pollution, and several validation tests have been performed to date; nevertheless, most of them were based on experiments with a large compilation of surface and aloft meteorological measurements, which are rarely available. At the same time, the use of a large operational smokestack as a tracer/pollutant source is not usual. In this work, the CALMET meteorological diagnostic model is first nested to WRF meteorological prognostic model simulations (3x3 km2 horizontal resolution) over a complex terrain and coastal domain in NW Spain, covering 100x100 km2, with a coal-fired power plant emitting SO2. Simulations were performed during three different periods when SO2 hourly ground-level concentration (glc) peaks were observed. NCEP reanalyses were applied as initial and boundary conditions. The Yonsei University-Pleim-Chang (YSU) PBL scheme was selected in the WRF model to provide the best input to three different CALMET horizontal resolutions, 1x1 km2, 0.5x0.5 km2, and 0.2x0.2 km2. The best results, very similar to each other, were achieved using the last two resolutions; therefore, the 0.5x0.5 km2 resolution was selected to test different CALMET meteorological inputs, using several combinations of WRF outputs and/or surface and upper-air measurements available in the simulation domain. With respect to the models' aloft output, CALMET PBL depth estimations are very similar to PBL depth estimations using upper-air measurements (rawinsondes), and significantly better than WRF PBL depth results. Regarding the models' surface output, the available meteorological sites were divided into two groups, one to provide meteorological input to CALMET (when applied), and another for model validation. Comparing WRF and CALMET outputs against surface measurements (from the model validation sites), the lowest RMSE was achieved using as CALMET input dataset WRF output combined with surface measurements (from sites for CALMET model

  7. Validation of CALMET/CALPUFF models simulations around a large power plant stack

    International Nuclear Information System (INIS)

    Hernandez-Garces, A.; Souto, J. A.; Rodriguez, A.; Saavedra, S.; Casares, J. J.

    2015-01-01

    The CALMET/CALPUFF modeling system is frequently used in the study of atmospheric processes and pollution, and several validation tests have been performed to date; nevertheless, most of them were based on experiments with a large compilation of surface and aloft meteorological measurements, which are rarely available. At the same time, the use of a large operational smokestack as a tracer/pollutant source is not usual. In this work, the CALMET meteorological diagnostic model is first nested to WRF meteorological prognostic model simulations (3x3 km2 horizontal resolution) over a complex terrain and coastal domain in NW Spain, covering 100x100 km2, with a coal-fired power plant emitting SO2. Simulations were performed during three different periods when SO2 hourly ground-level concentration (glc) peaks were observed. NCEP reanalyses were applied as initial and boundary conditions. The Yonsei University-Pleim-Chang (YSU) PBL scheme was selected in the WRF model to provide the best input to three different CALMET horizontal resolutions, 1x1 km2, 0.5x0.5 km2, and 0.2x0.2 km2. The best results, very similar to each other, were achieved using the last two resolutions; therefore, the 0.5x0.5 km2 resolution was selected to test different CALMET meteorological inputs, using several combinations of WRF outputs and/or surface and upper-air measurements available in the simulation domain. With respect to the models' aloft output, CALMET PBL depth estimations are very similar to PBL depth estimations using upper-air measurements (rawinsondes), and significantly better than WRF PBL depth results. Regarding the models' surface output, the available meteorological sites were divided into two groups, one to provide meteorological input to CALMET (when applied), and another for model validation. Comparing WRF and CALMET outputs against surface measurements (from the model validation sites), the lowest RMSE was achieved using as CALMET input dataset WRF output combined with surface measurements (from sites for

  8. Validation of Effective Models for Simulation of Thermal Stratification and Mixing Induced by Steam Injection into a Large Pool of Water

    Directory of Open Access Journals (Sweden)

    Hua Li

    2014-01-01

    Full Text Available The Effective Heat Source (EHS) and Effective Momentum Source (EMS) models have been proposed to predict the development of thermal stratification and mixing during a steam injection into a large pool of water. These effective models are implemented in the GOTHIC software and validated against the POOLEX STB-20 and STB-21 tests and the PPOOLEX MIX-01 test. First, the EHS model is validated against the STB-20 test, which shows the development of thermal stratification. Different numerical schemes and grid resolutions have been tested. A 48×114 grid with a second order scheme is sufficient to capture the vertical temperature distribution in the pool. Next, the EHS and EMS models are validated against the STB-21 test. Effective momentum is estimated based on the water level oscillations in the blowdown pipe. An effective momentum selected within the experimental measurement uncertainty can reproduce the mixing details. Finally, the EHS-EMS models are validated against the MIX-01 test, which has improved space and time resolution of temperature measurements inside the blowdown pipe. Excellent agreement between experiment and simulation has been achieved in averaged pool temperature and water level in the pool. The development of thermal stratification in the pool is also well captured in the simulation, as is the thermal behavior of the pool during the mixing phase.

  9. Validation of effective momentum and heat flux models for stratification and mixing in a water pool

    Energy Technology Data Exchange (ETDEWEB)

    Hua Li; Villanueva, W.; Kudinov, P. [Royal Institute of Technology (KTH), Div. of Nuclear Power Safety, Stockholm (Sweden)

    2013-06-15

    modeling, and (c) validate our proposed models. The data from the PPOOLEX STR-06, STR-09 and STR-10 tests are used for validation of the EHS and EMS models in this work. We found that estimations of the amplitude and frequency based on available experimental data from the PPOOLEX experiments STR-06, STR-09, and STR-10 have too large uncertainties, due to the poor space and time resolution of the temperature measurements in the blowdown pipe. Nevertheless, the results demonstrated that simulations with a variable effective momentum selected within the experimental uncertainty provided reasonable agreement with test data on the transient temperature distribution in the pool. In order to reduce uncertainty in both the experimental data and the EHS/EMS modeling, additional tests and modifications to the experimental procedures and measurement system in the PPOOLEX facility were proposed. Pre-test simulations were performed to aid in determining experimental conditions and procedures. Then, a new series of PPOOLEX experimental tests was carried out. A validation of the EHS/EMS models against the MIX-01 test is presented in this report. The results show that the clearing phase predicted with the 3D drywell model matches the experiment very well. The thermal stratification and mixing in MIX-01 are also well predicted in the simulation. (Author)

  10. Experimental Study of the Twin Turbulent Water Jets Using Laser Doppler Anemometry for Validating Numerical Models

    International Nuclear Information System (INIS)

    Wang Huhu; Lee Saya; Hassan, Yassin A.; Ruggles, Arthur E.

    2014-01-01

    The design of next generation (Gen. IV) high-temperature nuclear reactors, including gas-cooled and sodium-cooled concepts, involves extensive numerical work, especially Computational Fluid Dynamics (CFD) simulations. The high cost of large-scale experiments and the inherent uncertainties in the turbulence models and wall functions of any CFD code solving the Reynolds-averaged Navier-Stokes (RANS) equations necessitate high-spatial-resolution experimental data sets for benchmarking the simulation results. In Gen. IV conceptual reactors, the high-temperature flows mix in the upper plenum before entering the secondary cooling system. The mixing condition should be accurately estimated and fully understood, as it is related to the thermal stresses induced in the upper plenum and to the magnitude of output power oscillations due to any changes of primary coolant temperature. The purpose of this study is to use the Laser Doppler Anemometry (LDA) technique to measure the flow field of two submerged parallel jets issuing from two rectangular channels. The LDA data sets can be used to validate the corresponding simulation results. The jets studied in this work were at room temperature. The turbulent characteristics, including the distributions of mean velocities, turbulence intensities and Reynolds stresses, were studied. Uncertainty analysis was also performed to study the errors involved in this experiment. The experimental results in this work are valid for benchmarking any steady-state numerical simulations using turbulence models to solve the RANS equations. (author)

  11. Closing the patient experience chasm: A two-level validation of the Consumer Quality Index Inpatient Hospital Care

    NARCIS (Netherlands)

    Smirnova, Alina; Lombarts, Kiki M. J. M. H.; Arah, Onyebuchi A.; van der Vleuten, Cees P. M.

    2017-01-01

    Background: Evaluation of patients' health care experiences is central to measuring patient-centred care. However, different instruments tend to be used at the hospital or departmental level but rarely both, leading to a lack of standardization of patient experience measures. Objective: To validate the

  12. Numerical simulation and experimental validation of aircraft ground deicing model

    Directory of Open Access Journals (Sweden)

    Bin Chen

    2016-05-01

    Full Text Available Aircraft ground deicing plays an important role in guaranteeing aircraft safety. In practice, most airports generally use as much deicing fluid as possible to remove the ice, which wastes deicing fluid and pollutes the environment. Therefore, a model of aircraft ground deicing should be built to establish the foundation for subsequent research, such as the optimization of deicing fluid consumption. In this article, the heat balance of the deicing process is depicted, and a dynamic model of the deicing process is provided based on an analysis of the deicing mechanism. In the dynamic model, the surface temperature of the deicing fluid and the ice thickness are regarded as the state parameters, while the fluid flow rate, the initial temperature, and the injection time of the deicing fluid are treated as control parameters. Ignoring the heat exchange between the deicing fluid and the environment, a simplified model is obtained. The rationality of the simplified model is verified by numerical simulation, and the impacts of the flow rate, the initial temperature and the injection time on the deicing process are investigated. To verify the model, a semi-physical experimental system was established, consisting of a constant low-temperature test chamber, an ice simulation system, a deicing fluid heating and spraying system, a simulated wing, test sensors, and a computer measurement and control system. The actual test data verify the validity of the dynamic model and the accuracy of the simulation analysis.
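
    A heavily simplified version of the heat-balance idea, fluid enthalpy above 0 °C melting the ice layer, shows how ice thickness responds to the control parameters. This is an illustrative sketch, not the authors' dynamic model; every coefficient below is assumed:

```python
# Simplified deicing energy balance: fluid heat melts the ice (illustrative only).
rho_ice, L_f = 917.0, 334e3      # ice density (kg/m^3), latent heat of fusion (J/kg)
cp_fluid = 3500.0                # J/(kg.K), rough value for a glycol-water mix
eta = 0.4                        # fraction of fluid heat reaching the ice (assumed)

def ice_thickness(h0, mdot, T_in, t_spray, dt=1.0):
    """Euler integration of ice thickness per unit wing area (m)."""
    h, t = h0, 0.0
    while t < t_spray and h > 0.0:
        q = eta * mdot * cp_fluid * T_in       # heat flux to the ice, W/m^2
        h -= q / (rho_ice * L_f) * dt          # melt rate times the time step
        t += dt
    return max(h, 0.0)

# 5 mm of ice, 0.05 kg/(s.m^2) of fluid at 60 C, sprayed for 120 s.
print(f"remaining ice: {ice_thickness(5e-3, 0.05, 60.0, 120.0) * 1e3:.2f} mm")
```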

  13. Validation of the Continuum of Care Conceptual Model for Athletic Therapy

    Directory of Open Access Journals (Sweden)

    Mark R. Lafave

    2015-01-01

    Full Text Available Utilization of conceptual models in field-based emergency care currently borrows from existing standards of the medical and paramedical professions. The purpose of this study was to develop and validate a comprehensive conceptual model that could account for injuries ranging from nonurgent to catastrophic events, including events that do not follow traditional medical or prehospital care protocols. The conceptual model should represent the continuum of care from the time of initial injury to an athlete's return to participation in their sport. Finally, the conceptual model should accommodate both novices and experts in the AT profession. This paper chronicles the content validation steps of the Continuum of Care Conceptual Model for Athletic Therapy (CCCM-AT). The stages of model development were domain and item generation, content expert validation using a three-stage modified Ebel procedure, and pilot testing. Only the final stage of the modified Ebel procedure reached the a priori 80% consensus on three domains of interest: (1) heading descriptors; (2) the order of the model; (3) the conceptual model as a whole. Future research is required to test the use of the CCCM-AT in order to understand its efficacy in teaching and practice within the AT discipline.

  14. Cross validation for the classical model of structured expert judgment

    International Nuclear Information System (INIS)

    Colson, Abigail R.; Cooke, Roger M.

    2017-01-01

    We update the 2008 TU Delft structured expert judgment database with data from 33 professionally contracted Classical Model studies conducted between 2006 and March 2015 to evaluate its performance relative to other expert aggregation models. We briefly review alternative mathematical aggregation schemes, including harmonic weighting, before focusing on linear pooling of expert judgments with equal weights and performance-based weights. Performance weighting outperforms equal weighting in all but 1 of the 33 studies in-sample. True out-of-sample validation is rarely possible for Classical Model studies, and cross validation techniques that split calibration questions into a training and test set are used instead. Performance weighting incurs an “out-of-sample penalty” and its statistical accuracy out-of-sample is lower than that of equal weighting. However, as a function of training set size, the statistical accuracy of performance-based combinations reaches 75% of the equal weight value when the training set includes 80% of calibration variables. At this point the training set is sufficiently powerful to resolve differences in individual expert performance. The information of performance-based combinations is double that of equal weighting when the training set is at least 50% of the set of calibration variables. Previous out-of-sample validation work used a Total Out-of-Sample Validity Index based on all splits of the calibration questions into training and test subsets, which is expensive to compute and includes small training sets of dubious value. As an alternative, we propose an Out-of-Sample Validity Index based on averaging the product of statistical accuracy and information over all training sets sized at 80% of the calibration set. Performance weighting outperforms equal weighting on this Out-of-Sample Validity Index in 26 of the 33 post-2006 studies; the probability of 26 or more successes on 33 trials if there were no difference between performance
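
    Both aggregation schemes compared in the study are linear opinion pools; only the weights differ. A sketch of pooling expert densities under equal versus performance-based weights (the densities and weights are invented, and the Classical Model's calibration scoring is not reproduced):

```python
import numpy as np

x = np.linspace(0.0, 10.0, 501)          # support of the uncertain quantity
dx = x[1] - x[0]

def gaussian(mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# Three experts' densities for the same variable (invented).
experts = [gaussian(3.0, 1.0), gaussian(5.0, 0.6), gaussian(6.5, 1.5)]

equal_w = np.full(3, 1.0 / 3.0)
perf_w = np.array([0.10, 0.75, 0.15])    # stand-in performance-based weights

for name, w in [("equal", equal_w), ("performance", perf_w)]:
    pool = sum(wi * f for wi, f in zip(w, experts))   # linear opinion pool
    mean = np.sum(x * pool) * dx                      # pooled mean, numerically
    print(f"{name:>11} weights: pooled mean = {mean:.2f}")
```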

  15. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  16. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a

  17. Reactivity loss validation of high burn-up PWR fuels with pile-oscillation experiments in MINERVE

    Energy Technology Data Exchange (ETDEWEB)

    Leconte, P.; Vaglio-Gaudard, C.; Eschbach, R.; Di-Salvo, J.; Antony, M.; Pepino, A. [CEA, DEN, DER, Cadarache, F-13108 Saint-Paul-Lez-Durance (France)

    2012-07-01

    The ALIX experimental program relies on the experimental validation of the spent fuel inventory, by chemical analysis of samples irradiated in a PWR for between 5 and 7 cycles, and also on the experimental validation of the spent fuel reactivity loss with burn-up, obtained by pile-oscillation measurements in the MINERVE reactor. These latter experiments provide an overall validation of both the fuel inventory and the nuclear data responsible for the reactivity loss. This program also offers unique experimental data for fuels with a burn-up reaching 85 GWd/t, as spent fuel in French PWRs has never exceeded 70 GWd/t up to now. The analysis of these experiments is done in two steps with the APOLLO2/SHEM-MOC/CEA2005v4 package. In the first step, the fuel inventory of each sample is obtained by assembly calculations. The calculation route consists of the self-shielding of cross sections on the 281-energy-group SHEM mesh, followed by the flux calculation with the Method Of Characteristics in a 2D-exact heterogeneous geometry of the assembly, and finally a depletion calculation by an iterative resolution of the Bateman equations. In the second step, the fuel inventory is used in the analysis of the pile-oscillation experiments, in which the reactivity of the ALIX spent fuel samples is compared to the reactivity of fresh fuel samples. The comparison between experiment and calculation shows satisfactory results with the JEFF-3.1.1 library, which predicts the reactivity loss within 2% for a burn-up of ~75 GWd/t and within 4% for a burn-up of ~85 GWd/t. (authors)

  18. MSX-3D: a tool to validate 3D protein models using mass spectrometry.

    Science.gov (United States)

    Heymann, Michaël; Paramelle, David; Subra, Gilles; Forest, Eric; Martinez, Jean; Geourjon, Christophe; Deléage, Gilbert

    2008-12-01

    The technique of chemical cross-linking followed by mass spectrometry has proven to bring valuable information about protein structure and about interactions between protein subunits. It is an effective and efficient way to experimentally investigate some aspects of a protein structure when NMR and X-ray crystallography data are lacking. We introduce MSX-3D, a tool specifically geared to validate protein models using mass spectrometry. In addition to classical peptide identifications, it allows an interactive 3D visualization of the distance constraints derived from a cross-linking experiment. Freely available at http://proteomics-pbil.ibcp.fr
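
    The distance-constraint check at the heart of such validation is simple: the modeled distance between two cross-linked residues must not exceed the maximum span of the cross-linker. A sketch with made-up coordinates (MSX-3D's file handling and chemistry are not reproduced; the ~24 Å span is a rough figure for a BS3-type linker):

```python
import numpy as np

# Made-up C-alpha coordinates (angstroms) for residues in a candidate model.
ca_coords = {("A", 12): np.array([4.1, 10.2, 7.7]),
             ("A", 57): np.array([18.9, 14.0, 3.2]),
             ("B", 23): np.array([40.5, 2.1, 11.8])}

# Observed cross-links (chain, residue pairs) and the linker's maximum span
# (spacer arm plus side chains; approximate, assumed value).
crosslinks = [(("A", 12), ("A", 57)), (("A", 12), ("B", 23))]
MAX_SPAN = 24.0

for r1, r2 in crosslinks:
    d = np.linalg.norm(ca_coords[r1] - ca_coords[r2])
    status = "OK" if d <= MAX_SPAN else "violates model"
    print(f"{r1}-{r2}: {d:.1f} A -> {status}")
```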

  19. Animal models of binge drinking, current challenges to improve face validity.

    Science.gov (United States)

    Jeanblanc, Jérôme; Rolland, Benjamin; Gierski, Fabien; Martinetti, Margaret P; Naassila, Mickael

    2018-05-05

    Binge drinking (BD), i.e., consuming a large amount of alcohol in a short period of time, is an increasing public health issue. Though no clear definition has been adopted worldwide, the speed of drinking seems to be a keystone of this behavior. Developing relevant animal models of BD is a priority for gaining a better characterization of the neurobiological and psychobiological mechanisms underlying this dangerous and harmful behavior. Until recently, preclinical research on BD was conducted mostly using forced administration of alcohol, but more recent studies have used scheduled access to alcohol to model more voluntary excessive intake and to achieve signs of intoxication that mimic the human behavior. The main challenges for future research are discussed with regard to the need for good face validity, construct validity and predictive validity of animal models of BD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Field validation of the contaminant transport model, FEMA

    International Nuclear Information System (INIS)

    Wong, K.-F.V.

    1986-01-01

    This work describes the validation, with field data, of FEMA, a finite element model of material transport through aquifers. Field data from the Idaho Chemical Processing Plant, Idaho, USA and from the 58th Street landfill in Miami, Florida, USA are used. In both cases the model was first calibrated and then integrated over a span of eight years to check the predictive capability of the model. Both predictive runs gave results that matched well with the available data. (author)