WorldWideScience

Sample records for model validation experiments

  1. Experiments for foam model development and validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bourdon, Christopher Jay; Cote, Raymond O.; Moffat, Harry K.; Grillet, Anne Mary; Mahoney, James F. (Honeywell Federal Manufacturing and Technologies, Kansas City Plant, Kansas City, MO); Russick, Edward Mark; Adolf, Douglas Brian; Rao, Rekha Ranjana; Thompson, Kyle Richard; Kraynik, Andrew Michael; Castaneda, Jaime N.; Brotherton, Christopher M.; Mondy, Lisa Ann; Gorby, Allen D.

    2008-09-01

    A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

  2. Discrete fracture modelling for the Stripa tracer validation experiment predictions

    International Nuclear Information System (INIS)

    Dershowitz, W.; Wallmann, P.

    1992-02-01

    Groundwater flow and transport through three-dimensional networks of discrete fractures was modeled to predict the recovery of tracer from tracer injection experiments conducted during phase 3 of the Stripa site characterization and validation project. Predictions were made on the basis of an updated version of the site-scale discrete fracture conceptual model used for flow predictions and preliminary transport modelling. In this model, individual fractures were treated as stochastic features described by probability distributions of geometric and hydrologic properties. Fractures were divided into three populations: fractures in fracture zones near the drift, non-fracture-zone fractures within 31 m of the drift, and fractures in fracture zones more than 31 m from the drift axis. Fractures outside fracture zones were not modelled beyond 31 m from the drift axis. Transport predictions were produced using the FracMan discrete fracture modelling package for each of five tracer experiments. Output was produced in the seven formats specified by the Stripa task force on fracture flow modelling. (au)
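
    As an illustration of the stochastic approach this record describes, the sketch below generates a toy discrete fracture network by sampling geometric and hydrologic fracture properties from probability distributions. All distributions and parameter values are invented for illustration; they are not the calibrated Stripa model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def generate_fractures(n, region_size=50.0):
        """Sample fracture centers, sizes, orientations, and transmissivities
        from simple probability distributions, as in stochastic DFN modelling."""
        centers = rng.uniform(0.0, region_size, size=(n, 3))           # m
        radii = rng.lognormal(mean=0.5, sigma=0.8, size=n)             # m
        dip = np.degrees(np.arccos(rng.uniform(-1.0, 1.0, size=n)))    # isotropic normals
        strike = rng.uniform(0.0, 360.0, size=n)                       # degrees
        transmissivity = rng.lognormal(mean=-18.0, sigma=2.0, size=n)  # m^2/s
        return dict(centers=centers, radii=radii, dip=dip,
                    strike=strike, transmissivity=transmissivity)

    # Three fracture populations, mirroring the conceptual model above:
    near_drift_zones = generate_fractures(500)
    background_within_31m = generate_fractures(2000)
    far_zones = generate_fractures(800)
    print(len(near_drift_zones["radii"]), "near-drift fracture-zone fractures sampled")
    ```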

  3. Blast Load Simulator Experiments for Computational Model Validation Report 3

    Science.gov (United States)

    2017-07-01

    establish confidence in the simulation results specific to their intended use. One method for providing experimental data for computational model ... walls, to higher blast pressures required to evaluate the performance of protective construction methods. Figure 1. ERDC Blast Load Simulator (BLS) ... Instrumentation included 3 pressure gauges mounted on the steel calibration plate, 2 pressure gauges mounted in the wall of the BLS, and 25 pressure gauges

  4. Validation of dispersion model of RTARC-DSS based on "KIT" field experiments

    International Nuclear Information System (INIS)

    Duran, J.

    2000-01-01

    The aim of this study is to present the performance of the Gaussian dispersion model RTARC-DSS (Real Time Accident Release Consequences - Decision Support System) on the "KIT" field experiments. The Model Validation Kit is a collection of three experimental data sets from the Kincaid, Copenhagen and Lillestrom campaigns, together with the supplementary Indianapolis experimental campaign, accompanied by software for model evaluation. The validation of the model was performed on the basis of the maximum arc-wise concentrations, using the Bootstrap resampling procedure to estimate the variation of the model residuals. Validation was performed for short-range distances (about 1 - 10 km; up to 50 km from the source for the Kincaid data set). The model evaluation procedure and the amount of relative over- or under-prediction are discussed. (author)
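
    The bootstrap step mentioned above can be sketched as follows: resample the model-to-observation residuals of the arc-wise maximum concentrations to put a confidence interval on the mean bias. The data arrays are placeholders, not the Kincaid, Copenhagen, Lillestrom or Indianapolis observations.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    observed = np.array([12.0, 8.5, 15.2, 6.1, 9.8, 11.3])   # arc-max concentrations
    predicted = np.array([10.1, 9.0, 12.8, 7.4, 8.2, 13.0])  # model values (same units)

    # Log residuals make over- and under-prediction symmetric.
    residuals = np.log(predicted / observed)

    def bootstrap_ci(x, n_boot=10000, alpha=0.05):
        """Percentile bootstrap confidence interval for the mean residual."""
        means = np.array([rng.choice(x, size=x.size, replace=True).mean()
                          for _ in range(n_boot)])
        return np.quantile(means, [alpha / 2, 1 - alpha / 2])

    lo, hi = bootstrap_ci(residuals)
    print(f"geometric-mean bias (pred/obs): {np.exp(residuals.mean()):.2f}")
    print(f"95% bootstrap CI on mean log residual: [{lo:.3f}, {hi:.3f}]")
    ```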

  5. Validation of the containment code Sirius: interpretation of an explosion experiment on a scale model

    International Nuclear Information System (INIS)

    Blanchet, Y.; Obry, P.; Louvet, J.; Deshayes, M.; Phalip, C.

    1979-01-01

    The explicit 2-D axisymmetric Lagrangian code SIRIUS, developed at the CEA/DRNR, Cadarache, deals with transient compressive flows in deformable primary tanks with more or less complex internal component geometries. This code has been subjected to a two-year intensive validation program on scale-model experiments, and a number of improvements have been incorporated. This paper presents a recent calculation of one of these experiments using the SIRIUS code; the comparison with experimental results shows the encouraging possibilities of this Lagrangian code.

  6. Design of experiments in medical physics: Application to the AAA beam model validation.

    Science.gov (United States)

    Dufreneix, S; Legrand, C; Di Bartolo, C; Bremaud, M; Mesgouez, J; Tiplica, T; Autret, D

    2017-09-01

    The purpose of this study is to evaluate the usefulness of the design of experiments in the analysis of multiparametric problems related to quality assurance in radiotherapy. The main motivation is to use this statistical method to optimize the quality assurance processes in the validation of beam models. Considering the Varian Eclipse system, eight parameters with several levels were selected: energy, MLC, depth, X, Y1 and Y2 jaw dimensions, wedge and wedge jaw. A Taguchi table was used to define 72 validation tests. Measurements were conducted in water using a CC04 ionization chamber on a TrueBeam STx, a TrueBeam Tx, a Trilogy and a 2300IX accelerator, all matched by the vendor. Dose was computed using the AAA algorithm. The same raw data were used for all accelerators during the beam modelling. The mean difference between computed and measured doses was 0.1±0.5% for all beams and all accelerators, with a maximum difference of 2.4% (under the 3% tolerance level). For all beams, the measured doses agreed within 0.6% across accelerators. Energy was found to be an influential parameter, but the deviations observed were smaller than 1% and not considered clinically significant. Designs of experiments can help define the optimal measurement set to validate a beam model. The proposed method can be used to identify the prognostic factors of dose accuracy. The beam models were validated for the 4 accelerators, which were found dosimetrically equivalent even though the accelerator characteristics differ.
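
    A Taguchi-style design reduces the number of validation measurements by probing each factor's main effect with a fraction of the full factorial. The sketch below builds a small fractional design and extracts main effects from synthetic dose differences; the factor levels, the design, and the responses are invented stand-ins for the 72-run array used in the study.

    ```python
    import itertools
    import numpy as np

    rng = np.random.default_rng(1)

    factors = {
        "energy": [6, 10],        # MV
        "depth": [5, 10, 20],     # cm
        "x_jaw": [5, 10, 20],     # cm
        "wedge": [0, 30, 60],     # degrees
    }

    # Half-fraction of the full factorial as a simple stand-in for an orthogonal array.
    full = list(itertools.product(*factors.values()))
    design = full[::2]

    # Synthetic response: percent difference between computed and measured dose.
    diff = rng.normal(0.1, 0.5, size=len(design))

    # Main effect of a factor = mean response at each of its levels.
    for j, (name, levels) in enumerate(factors.items()):
        col = np.array([run[j] for run in design])
        effects = {lv: round(float(diff[col == lv].mean()), 2) for lv in levels}
        print(name, effects)

    print("max |difference|:", round(float(np.abs(diff).max()), 2), "% (tolerance 3%)")
    ```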

  7. Design of an intermediate-scale experiment to validate unsaturated-zone transport models

    International Nuclear Information System (INIS)

    Siegel, M.D.; Hopkins, P.L.; Glass, R.J.; Ward, D.B.

    1991-01-01

    An intermediate-scale experiment is being carried out to evaluate instrumentation and models that might be used for transport-model validation for the Yucca Mountain Site Characterization Project. The experimental test bed is a 6-m-high × 3-m-diameter caisson filled with quartz sand, with a sorbing layer at an intermediate depth. The experiment involves the detection and prediction of the migration of fluid and tracers through an unsaturated porous medium. Pre-test design requires estimation of physical properties of the porous medium, such as the relative permeability, saturation/pressure relations, porosity, and saturated hydraulic conductivity, as well as geochemical properties such as surface complexation constants and empirical Kd's. The pre-test characterization data will be used as input to several computer codes to predict the fluid flow and tracer migration. These include a coupled chemical-reaction/transport model, a stochastic model, and a deterministic model using retardation factors. The calculations will be completed prior to elution of the tracers, providing a basis for validation by comparing the predictions to observed moisture and tracer behavior.

  8. Preliminary characterization of materials for a reactive transport model validation experiment

    International Nuclear Information System (INIS)

    Siegel, M.D.; Ward, D.B.; Cheng, W.C.; Bryant, C.; Chocas, C.S.; Reynolds, C.G.

    1993-01-01

    The geochemical properties of a porous sand and several tracers (Ni, Br, and Li) have been characterized for use in a caisson experiment designed to validate sorption models used in models of reactive transport. The surfaces of the sand grains have been examined by a combination of techniques including potentiometric titration, acid leaching, optical microscopy, and scanning electron microscopy with energy-dispersive spectroscopy. The surface studies indicate the presence of small amounts of carbonate, kaolinite and iron-oxyhydroxides. Adsorption of nickel, lithium and bromide by the sand was measured using batch techniques. Bromide was not sorbed by the sand. A linear (Kd) or an isotherm sorption model may adequately describe transport of Li; however, a model describing the changes of pH and the concentrations of other solution species as a function of time and position within the caisson, and the concomitant effects on Ni sorption, may be required for accurate predictions of nickel transport.
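
    For context, the sketch below shows how a measured linear sorption coefficient Kd feeds a transport prediction through the retardation factor R = 1 + (ρ_b/θ)·Kd. The bulk density, water content, and Kd values are illustrative for a quartz sand, not the measured caisson values.

    ```python
    rho_b = 1.6    # bulk density, g/cm^3 (assumed)
    theta = 0.30   # volumetric water content (assumed)

    def retardation(kd):
        """Retardation factor for a linear (Kd) sorption model; kd in mL/g."""
        return 1.0 + (rho_b / theta) * kd

    for tracer, kd in [("Br (conservative)", 0.0), ("Li", 0.3), ("Ni", 5.0)]:
        print(f"{tracer}: R = {retardation(kd):.1f}")
    ```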

  9. Experiments to populate and validate a processing model for polyurethane foam. BKC 44306 PMDI-10

    Energy Technology Data Exchange (ETDEWEB)

    Mondy, Lisa Ann [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rao, Rekha Ranjana [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelden, Bion [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Soehnel, Melissa Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); O'Hern, Timothy J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grillet, Anne [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Celina, Mathias C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wyatt, Nicholas B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Russick, Edward Mark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauer, Stephen J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hileman, Michael Bryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Urquhart, Alexander [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Thompson, Kyle Richard [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, David Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-03-01

    We are developing computational models to elucidate the expansion and dynamic filling process of a polyurethane foam, PMDI. The polyurethane of interest is chemically blown, where carbon dioxide is produced via the reaction of water, the blowing agent, and isocyanate. The isocyanate also reacts with polyol in a competing reaction, which produces the polymer. Here we detail the experiments needed to populate a processing model and provide parameters for the model based on these experiments. The model entails solving the conservation equations, including the equations of motion, an energy balance, and two rate equations for the polymerization and foaming reactions, following a simplified mathematical formalism that decouples these two reactions. Parameters for the polymerization kinetics model are reported based on infrared spectrophotometry. Parameters describing the gas generating reaction are reported based on measurements of volume, temperature and pressure evolution with time. A foam rheology model is proposed and parameters determined through steady-shear and oscillatory tests. Heat of reaction and heat capacity are determined through differential scanning calorimetry. Thermal conductivity of the foam as a function of density is measured using a transient method based on the theory of the transient plane source technique. Finally, density variations of the resulting solid foam in several simple geometries are directly measured by sectioning and sampling mass, as well as through x-ray computed tomography. These density measurements will be useful for model validation once the complete model is implemented in an engineering code.
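
    The decoupled two-reaction formalism described above can be sketched as a small ODE system: one rate equation each for the gelling (polymerization) and blowing reactions, both Arrhenius in temperature, closed by a lumped adiabatic energy balance. All parameter values are illustrative assumptions, not the fitted PMDI parameters.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    R = 8.314  # J/(mol K)

    # (pre-exponential 1/s, activation energy J/mol, adiabatic temperature rise K)
    gel = dict(A=1.0e5, Ea=5.0e4, dT=80.0)
    blow = dict(A=5.0e5, Ea=4.5e4, dT=40.0)

    def rhs(t, y):
        xg, xb, T = y  # extents of gelling and blowing reactions, temperature
        rg = gel["A"] * np.exp(-gel["Ea"] / (R * T)) * (1.0 - xg)
        rb = blow["A"] * np.exp(-blow["Ea"] / (R * T)) * (1.0 - xb)
        dT = gel["dT"] * rg + blow["dT"] * rb  # lumped adiabatic energy balance
        return [rg, rb, dT]

    sol = solve_ivp(rhs, (0.0, 300.0), [0.0, 0.0, 300.0], max_step=0.5)
    print(f"final extents: gel = {sol.y[0, -1]:.2f}, blow = {sol.y[1, -1]:.2f}, "
          f"T = {sol.y[2, -1]:.0f} K")
    ```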

  10. An enhanced fire hazard assessment model and validation experiments for vertical cable trays

    Energy Technology Data Exchange (ETDEWEB)

    Li, Lu [State Key Laboratory of Fire Science, University of Science and Technology of China, Hefei 230027 (China); Huang, Xianjia, E-mail: huangxianjia@gziit.ac.cn [Joint Laboratory of Fire Safety in Nuclear Power Plants, Institute of Industry Technology Guangzhou & Chinese Academy of Sciences, Guangzhou 511458 (China); Bi, Kun; Liu, Xiaoshuang [China Nuclear Power Design Co., Ltd., Shenzhen 518045 (China)]

    2016-05-15

    Highlights: • An enhanced model was developed for vertical cable fire hazard assessment in NPPs. • Validation experiments on vertical cable tray fires were conducted. • The capability of the model for cable trays with different cable spacing was tested. - Abstract: The model, referred to as FLASH-CAT (Flame Spread over Horizontal Cable Trays), was developed to estimate the heat release rate of vertical cable tray fires. The focus of this work is to investigate the application of an enhanced version of this model to single vertical cable tray fires with different cable spacing. Experiments on vertical cable tray fires with three typical cable spacings were conducted, and the histories of mass loss rate and flame length were recorded during each fire. The experimental results show that the space between cable lines intensifies the cable combustion and accelerates the flame spread. The predictions of the enhanced model agree well with the experimental data. At the same time, the enhanced model is capable of predicting the different behaviors of cable fires with different cable spacing by adjusting the flame spread speed only.

  11. Optimal Design and Model Validation for Combustion Experiments in a Shock Tube

    KAUST Repository

    Long, Quan

    2014-01-06

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments being carried out at the KAUST Clean Combustion Center. The unknown parameters are the pre-exponential parameters and the activation energies in the reaction rate functions. The control parameters are the initial hydrogen concentration and the temperature. First, we build a polynomial-based surrogate model for the observable related to the reactions in the shock tube. Second, we use a novel MAP-based approach to estimate the expected information gain in the proposed experiments and select the best experimental set-ups corresponding to the optimal expected information gains. Third, we use synthetic data to carry out a virtual validation of our methodology.
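
    The MAP/Laplace idea can be sketched for a toy Arrhenius observable: linearize the observable around the parameters, form the Gaussian posterior covariance, and score each candidate design by the Kullback-Leibler gain from prior to posterior. The prior, noise level, and model below are invented; they are not the KAUST shock-tube setup.

    ```python
    import numpy as np

    R_GAS = 8.314   # J/(mol K)
    sigma = 0.05    # assumed std of the observed log reaction rate

    # Gaussian prior on theta = (log A, Ea); values are illustrative
    prior_mean = np.array([10.0, 5.0e4])
    prior_cov = np.diag([1.0, 1.0e7])

    def jacobian(T):
        """d(log k)/d(theta) for log k = log A - Ea/(R T); linear in theta,
        so the Jacobian depends only on the design temperature T."""
        return np.array([[1.0, -1.0 / (R_GAS * T)]])

    def laplace_eig(T):
        """Expected information gain under the Laplace approximation; for this
        linearized observable one evaluation per design suffices."""
        J = jacobian(T)
        post_prec = J.T @ J / sigma**2 + np.linalg.inv(prior_cov)
        post_cov = np.linalg.inv(post_prec)
        k = len(prior_mean)
        # KL(posterior || prior), ignoring the mean-shift term
        return 0.5 * (np.log(np.linalg.det(prior_cov) / np.linalg.det(post_cov))
                      - k + np.trace(np.linalg.solve(prior_cov, post_cov)))

    # Rank candidate shock temperatures (one design variable) by information gain.
    for T in (900.0, 1100.0, 1300.0):
        print(f"T = {T:.0f} K: EIG ~ {laplace_eig(T):.2f} nats")
    ```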

  12. Validation of ISAAC Thermal Hydraulic Model against the RD-14M Experiment B9401

    International Nuclear Information System (INIS)

    Kim, Dong Ha; Kim, Hyung Tae; Park, Soo Yong; Kim, Sang Baik

    2009-09-01

    The thermal hydraulic behavior prior to core damage was compared with experimental data to validate the ISAAC thermal hydraulic models. The B9401 test at the RD-14M facility was selected as a benchmark data set, which was open for the international standard problem (ISP). As ISAAC has hard-wired systems and models inside the code, new input parameters were prepared with minimal model changes. Among the 37 parameters measured in the experiment, pressures, flow rates, void fractions, and fuel temperatures were compared. The simulation showed that the thermal hydraulic behavior estimated by ISAAC was similar to the experimental data and to the results of a detailed code, even though ISAAC was developed mainly for severe accident analysis. Based on these comparisons, it can be concluded that the thermal hydraulic conditions estimated by ISAAC can be used as boundary conditions for further analysis of sequences leading to severe core damage. In order to extend the database for the thermal hydraulic behavior of ISAAC, more RD-14M tests as well as DBA sequences need to be simulated.

  13. Model support for an out-reactor instrumented defected fuel experiment to validate the RMC fuel oxidation model

    Energy Technology Data Exchange (ETDEWEB)

    Quastel, A.D.; Corcoran, E.C.; Lewis, B.J. [Royal Military College of Canada, Chemistry and Chemical Engineering Dept., Kingston, Ontario (Canada); Thiriet, C. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Hadaller, G. [Stern Laboratories Inc., Hamilton, Ontario (Canada)

    2011-07-01

    An out-reactor fuel oxidation experiment with controlled parameters is being planned to provide data for validation of the Royal Military College (RMC) mechanistic fuel oxidation model. In support of this work, 2D r-θ and 3D fuel oxidation models are presented. The 2D r-θ model with radial cracks provides the radial temperature distribution in the test fuel element and also provides heating power information. The 3D model, with radial cracks and a pellet-pellet gap under a defected sheath, indicates that an oxygen stoichiometry deviation of 0.057 could result within one week of heating a defected UO₂ fuel element with a 5-mm² sheath defect. (author)

  14. Validation of Global Ozone Monitoring Experiment ozone profiles and evaluation of stratospheric transport in a global chemistry transport model

    NARCIS (Netherlands)

    de Laat, A.T.J.; Landgraf, J.; Aben, I.; Hasekamp, O.; Bregman, B.

    2007-01-01

    This paper presents a validation of Global Ozone Monitoring Experiment (GOME) ozone (O3) profiles which are used to evaluate stratospheric transport in the chemistry transport model (CTM) Tracer Model version 5 (TM5) using a linearized stratospheric O3 chemistry scheme.

  15. Validation of Global Ozone Monitoring Experiment ozone profiles and evaluation of stratospheric transport in a global chemistry transport model

    NARCIS (Netherlands)

    Laat, A.T.J.de; Landgraf, J.; Aben, I.; Hasekamp, O.; Bregman, B.

    2007-01-01

    This paper presents a validation of Global Ozone Monitoring Experiment (GOME) ozone (O3) profiles which are used to evaluate stratospheric transport in the chemistry transport model (CTM) Tracer Model version 5 (TM5) using a linearized stratospheric O3 chemistry scheme.

  16. Validation of spectral gas radiation models under oxyfuel conditions. Part A: Gas cell experiments

    DEFF Research Database (Denmark)

    Becher, Valentin; Clausen, Sønnik; Fateev, Alexander

    2011-01-01

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard CFD spectral gas radiation models for air combustion are therefore outside their validity range. This series of three articles provides a common spectral basis...

  17. Validation of vibro-impact force models by numerical simulation, perturbation methods and experiments

    Science.gov (United States)

    Rebouças, Geraldo F. de S.; Santos, Ilmar F.; Thomsen, Jon J.

    2018-01-01

    The frequency response of a single degree of freedom vibro-impact oscillator is analyzed using Harmonic Linearization, Averaging and Numeric Simulation, considering three different impact force models: one given by a piecewise-linear function (Kelvin-Voigt model), another by a high-order power function, and a third one combining the advantages of the other two. Experimental validation is carried out using control-based continuation to obtain the experimental frequency response, including its unstable branch.
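
    The three force-model families can be sketched as simple functions of penetration: a piecewise-linear spring-damper (Kelvin-Voigt), a power law of penetration, and a combination of power-law elasticity with contact damping. The stiffness, damping, and gap values below are invented stand-ins, not the paper's fitted models.

    ```python
    import numpy as np

    k, c, g = 1.0e5, 50.0, 1.0e-3  # contact stiffness N/m, damping N s/m, gap m (assumed)

    def kelvin_voigt(x, v):
        """Piecewise-linear spring-damper, active only during penetration."""
        pen = x - g
        return np.where(pen > 0.0, k * pen + c * v, 0.0)

    def power_law(x, k2=5.0e8, p=1.5):
        """Power function of penetration (Hertz-type for p = 3/2)."""
        pen = np.maximum(x - g, 0.0)
        return k2 * pen**p

    def combined(x, v, k2=5.0e8, p=1.5):
        """Power-law elasticity plus damping that acts only in contact."""
        pen = np.maximum(x - g, 0.0)
        return k2 * pen**p + c * v * (pen > 0.0)

    x = np.linspace(-2e-3, 2e-3, 5)  # displacement samples around the stop
    print(kelvin_voigt(x, v=0.1))
    print(power_law(x))
    print(combined(x, v=0.1))
    ```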

  18. Mold-filling experiments for validation of modeling encapsulation. Part 1, "wine glass" mold.

    Energy Technology Data Exchange (ETDEWEB)

    Castaneda, Jaime N.; Grillet, Anne Mary; Altobelli, Stephen A. (New Mexico Resonance, Albuquerque, NM); Cote, Raymond O.; Mondy, Lisa Ann

    2005-06-01

    The C6 project 'Encapsulation Processes' has been designed to obtain experimental measurements for discovery of phenomena critical to improving these processes, as well as data required in the verification and validation plan (Rao et al. 2001) for model validation of flow in progressively complex geometries. We have observed and recorded the flow of clear, Newtonian liquids and opaque, rheologically complex suspensions in two mold geometries. The first geometry is a simple wineglass geometry in a cylinder and is reported here in Part 1. The results in a more realistic encapsulation geometry are reported in Part 2.

  19. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret; Yu, Yi-Hsiang

    2016-08-01

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation are necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.

  20. Computational Fluid Dynamics Modeling and Validating Experiments of Airflow in a Data Center

    Directory of Open Access Journals (Sweden)

    Emelie Wibron

    2018-03-01

    The worldwide demand for data storage continues to increase, and both the number and the size of data centers are expanding rapidly. Energy efficiency is an important factor to consider in data centers since the total energy consumption is huge. The servers must be cooled, and the performance of the cooling system depends on the flow field of the air. Computational Fluid Dynamics (CFD) can provide detailed information about the airflow in both existing data centers and proposed data center configurations before they are built. However, the simulations must be carried out with quality and trust. The k–ε model is the most common choice to model the turbulent airflow in data centers. The aim of this study is to examine the performance of more advanced turbulence models, not previously investigated for CFD modeling of data centers. The considered turbulence models are the k–ε model, the Reynolds Stress Model (RSM) and Detached Eddy Simulations (DES). The commercial code ANSYS CFX 16.0 is used to perform the simulations, and experimental values are used for validation. It is clarified that the flow fields for the different turbulence models deviate at locations that are not in close proximity to the main components in the data center. The k–ε model fails to predict low-velocity regions. RSM and DES produce very similar results and, based on the solution times, it is recommended to use RSM to model the turbulent airflow in data centers.
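
    One common way to condense such a comparison into a number is a point-wise "hit rate": the fraction of probe locations where the simulated value falls within an allowed deviation of the measured one. The velocities below are placeholders, not the data-center measurements.

    ```python
    import numpy as np

    measured = np.array([0.45, 1.20, 0.10, 0.85, 0.30])  # m/s at probe points
    rsm = np.array([0.50, 1.10, 0.12, 0.80, 0.33])       # hypothetical RSM results
    k_eps = np.array([0.45, 1.15, 0.30, 0.95, 0.55])     # hypothetical k-eps results

    def hit_rate(sim, exp, rel=0.25, abs_tol=0.05):
        """Fraction of points within a relative or absolute tolerance."""
        err = np.abs(sim - exp)
        return float(((err <= rel * np.abs(exp)) | (err <= abs_tol)).mean())

    print("RSM hit rate:  ", hit_rate(rsm, measured))
    print("k-eps hit rate:", hit_rate(k_eps, measured))
    ```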

  1. Characterization of materials for a reactive transport model validation experiment: Interim report on the caisson experiment. Yucca Mountain Site Characterization Project

    Energy Technology Data Exchange (ETDEWEB)

    Siegel, M.D.; Cheng, W.C. [Sandia National Labs., Albuquerque, NM (United States); Ward, D.B.; Bryan, C.R. [Univ. of New Mexico, Albuquerque, NM (United States). Dept. of Earth and Planetary Sciences

    1995-08-01

    Models used in performance assessment and site characterization activities related to nuclear waste disposal rely on simplified representations of solute/rock interactions, the hydrologic flow field, and the material properties of the rock layers surrounding the repository. A crucial element in the design of these models is the validity of these simplifying assumptions. An intermediate-scale experiment is being carried out at the Experimental Engineered Test Facility at Los Alamos National Laboratory by the Los Alamos and Sandia National Laboratories to develop a strategy to validate key geochemical and hydrological assumptions in performance assessment models used by the Yucca Mountain Site Characterization Project.

  2. Characterization of materials for a reactive transport model validation experiment: Interim report on the caisson experiment. Yucca Mountain Site Characterization Project

    International Nuclear Information System (INIS)

    Siegel, M.D.; Cheng, W.C.; Ward, D.B.; Bryan, C.R.

    1995-08-01

    Models used in performance assessment and site characterization activities related to nuclear waste disposal rely on simplified representations of solute/rock interactions, the hydrologic flow field, and the material properties of the rock layers surrounding the repository. A crucial element in the design of these models is the validity of these simplifying assumptions. An intermediate-scale experiment is being carried out at the Experimental Engineered Test Facility at Los Alamos National Laboratory by the Los Alamos and Sandia National Laboratories to develop a strategy to validate key geochemical and hydrological assumptions in performance assessment models used by the Yucca Mountain Site Characterization Project.

  3. Experience with field testing for model validation of GE wind plants

    Energy Technology Data Exchange (ETDEWEB)

    Miller, N.; Hannett, L.; Clark, K.; MacDowell, J.; Barton, W. [GE Energy, Mississauga, ON (Canada)

    2007-07-01

    GE Energy recently conducted field tests on wind turbines using a suite of controls and electronics. Zero voltage ride through (ZVRT) and Volt/Var tests were performed on operating wind turbine generators (WTG) to determine fault tolerance. The Western Electricity Coordinating Council's (WECC) model validation results were used to examine voltage regulation and VAR management issues. GE's WindCONTROL supervisor controller system regulates voltage and power in real time. It supplies reactive power to the grid to regulate system voltage and stabilize grids. It was emphasized that model validation is becoming increasingly important as wind penetration increases. Development of stability models is ongoing and grid codes are driving increased functionality in wind plants. This presentation included graphs indicating WTG reactive power response; WTG voltage response; plant level Volt/Var tests; and, Volt/Var control. Field test simulation results were also presented. It was revealed that ZVRT test results met grid requirements. Volt/Var response of WTGs was extremely fast and stable. It was determined that the response to significant grid disturbances will produce maximum (or minimum) reactive power output within 200 ms. The stability models were shown to closely replicate plant performance. figs.

  4. Validation of GNSS Multipath Model for Space Proximity Operations Using the Hubble Servicing Mission 4 Experiment

    Science.gov (United States)

    Ashman, Ben; Veldman, Jeanette; Axelrad, Penina; Garrison, James; Winternitz, Luke

    2016-01-01

    In the rendezvous and docking of spacecraft, GNSS signals can reflect off the target vehicle and cause prohibitively large errors in the chaser vehicle receiver at ranges below 200 meters. It has been proposed that the additional ray paths, or multipath, be used as a source of information about the state of the target relative to the receiver. With Hubble Servicing Mission 4 as a case study, electromagnetic ray tracing has been used to construct a model of reflected signals from known geometry. Oscillations in the prompt correlator power due to multipath, known as multipath fading, are studied as a means of model validation. Agreement between the measured and simulated multipath fading serves to confirm the presence of signals reflected off the target spacecraft that might be used for relative navigation.
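
    The fading mechanism can be illustrated with a two-ray model: a direct signal plus one reflection with relative amplitude a and excess path d produce a correlator power proportional to |1 + a·exp(-j2πd/λ)|², which oscillates as relative motion changes d. The geometry and reflection coefficient below are invented for illustration.

    ```python
    import numpy as np

    lam = 0.1903  # GPS L1 wavelength, m
    a = 0.3       # reflected-signal amplitude relative to direct (assumed)

    excess_path = np.linspace(0.0, 2.0, 9)  # m, excess path of the reflected ray
    power = np.abs(1.0 + a * np.exp(-2j * np.pi * excess_path / lam)) ** 2
    print(np.round(power, 2))  # oscillates between (1-a)^2 and (1+a)^2
    ```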

  5. Experiments to Populate and Validate a Processing Model for Polyurethane Foam: Additional Data for Structural Foams

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Rekha R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Celina, Mathias C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Giron, Nicholas Henry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Long, Kevin Nicholas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Russick, Edward M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    We are developing computational models to help understand manufacturing processes, final properties and aging of structural foam, polyurethane PMDI. The resulting model predictions of density and cure gradients from the manufacturing process will be used as input to foam heat transfer and mechanical models. BKC 44306 PMDI-10 and BKC 44307 PMDI-18 are the most prevalent foams used in structural parts. Experiments needed to parameterize models of the reaction kinetics and the equations of motion during the foam blowing stages were described for BKC 44306 PMDI-10 in the first report of this series (Mondy et al. 2014). BKC 44307 PMDI-18 is a new foam that will be used to make relatively dense structural supports via overpacking. It uses a different catalyst than those in the BKC 44306 family of foams; hence, we expect that the reaction kinetics models must be modified. Here we detail the experiments needed to characterize the reaction kinetics of BKC 44307 PMDI-18 and suggest parameters for the model based on these experiments. In addition, the second part of this report describes data taken to provide input to the preliminary nonlinear viscoelastic structural response model developed for BKC 44306 PMDI-10 foam. We show that the standard cure schedule used by KCP does not fully cure the material, and, upon temperature elevation above 150°C, oxidation or decomposition reactions occur that alter the composition of the foam. These findings suggest that achieving a fully cured foam part with this formulation may not be possible through thermal curing. As such, viscoelastic characterization procedures developed for curing thermosets can provide only approximate material properties, since the state of the material continuously evolves during tests.

  6. Direct-contact condensers for open-cycle OTEC applications: Model validation with fresh water experiments for structured packings

    Energy Technology Data Exchange (ETDEWEB)

    Bharathan, D.; Parsons, B.K.; Althof, J.A.

    1988-10-01

    The objective of the reported work was to develop analytical methods for evaluating the design and performance of advanced high-performance heat exchangers for use in open-cycle ocean thermal energy conversion (OC-OTEC) systems. This report describes the progress made on validating a one-dimensional, steady-state analytical computer model with fresh-water experiments. The condenser model represents the state of the art in direct-contact heat exchange for condensation for OC-OTEC applications. This is expected to provide a basis for optimizing OC-OTEC plant configurations. Using the model, we examined two condenser geometries, a cocurrent and a countercurrent configuration. This report provides detailed validation results for important condenser parameters for cocurrent and countercurrent flows. Based on the comparisons and the uncertainty overlap between the experimental data and predictions, the model is shown to predict critical condenser performance parameters with an uncertainty acceptable for general engineering design and performance evaluations. 33 refs., 69 figs., 38 tabs.

  7. TOPFLOW-experiments, development and validation of CFD models for steam-water flows with phase transfer. Final report

    International Nuclear Information System (INIS)

    Lucas, D.; Beyer, M.; Krepper, E.

    2011-11-01

    The aim of the project was the qualification of CFD codes for steam-water flows with phase transfer. While CFD methods for single-phase flows are already widely used for industrial applications, their use for two-phase flows is only beginning, owing to the complex structure of the interface and the related interactions between the phases. For the further development and validation of appropriate closure models, experimental data with high spatial and temporal resolution are required. Such data were obtained at the TOPFLOW test facility of HZDR by combining experiments at parameters realistic for nuclear reactor safety (large scales, high pressures and temperatures) with innovative measuring techniques. The wire-mesh sensor technology, which provides detailed information on the structure of the interface, was applied in adiabatic air-water experiments as well as in condensation and pressure relief experiments in a large DN200 pipe. As a result of the project, extensive databases of high quality are available. The technology for fast X-ray tomography, which allows measurements without influencing the flow, was further developed and successfully applied in a first test series. High-resolution data were also obtained from experiments in a model of the hot leg of a pressurized water reactor for different flow situations, including counter-current flow limitation. For the corresponding steam-water experiments conducted at pressures of up to 5 MPa, the newly developed pressure tank technology was successfully used for the first time. For the qualification of CFD codes for two-phase flows, the Inhomogeneous MUSIG model was extended in cooperation with ANSYS to consider phase transfer and validated on the basis of the above-mentioned TOPFLOW experiments. In addition, improvements were achieved, e.g. for turbulence modelling in bubbly flows, and simulations were done to validate models for bubble forces and bubble coalescence and breakup.

  8. TOUGH-RBSN simulator for hydraulic fracture propagation within fractured media: Model validations against laboratory experiments

    Science.gov (United States)

    Kim, Kunhwi; Rutqvist, Jonny; Nakagawa, Seiji; Birkholzer, Jens

    2017-11-01

    This paper presents coupled hydro-mechanical modeling of hydraulic fracturing processes in complex fractured media using a discrete fracture network (DFN) approach. The individual physical processes in the fracture propagation are represented by separate program modules: the TOUGH2 code for multiphase flow and mass transport based on the finite volume approach; and the rigid-body-spring network (RBSN) model for mechanical and fracture-damage behavior, which are coupled with each other. Fractures are modeled as discrete features, of which the hydrological properties are evaluated from the fracture deformation and aperture change. The verification of the TOUGH-RBSN code is performed against a 2D analytical model for single hydraulic fracture propagation. Subsequently, modeling capabilities for hydraulic fracturing are demonstrated through simulations of laboratory experiments conducted on rock-analogue (soda-lime glass) samples containing a designed network of pre-existing fractures. Sensitivity analyses are also conducted by changing the modeling parameters, such as viscosity of injected fluid, strength of pre-existing fractures, and confining stress conditions. The hydraulic fracturing characteristics attributed to the modeling parameters are investigated through comparisons of the simulation results.

  9. Intercomparison and validation of operational coastal-scale models, the experience of the project MOMAR.

    Science.gov (United States)

    Brandini, C.; Coudray, S.; Taddei, S.; Fattorini, M.; Costanza, L.; Lapucci, C.; Poulain, P.; Gerin, R.; Ortolani, A.; Gozzini, B.

    2012-04-01

    The need for regional governments to implement operational systems for the sustainable management of coastal waters, in order to meet the requirements imposed by legislation (e.g. EU directives such as WFD, MSFD, BD and relevant national legislation), often leads to the implementation of coastal measurement networks and to the construction of computational models that surround and describe parts of regional seas without falling within the classic definition of regional/coastal models. Although these operational models may be structured to cover parts of different oceanographic basins, they can have considerable advantages and highlight relevant issues, such as the role of narrow channels, straits and islands in coastal circulation, in both physical and biogeochemical processes such as the exchange of water masses among basins. Two models of this type were made in the context of the cross-border European project MOMAR: an operational model of the Tuscan Archipelago sea and one of the Corsica coastal waters, both located between the Tyrrhenian and the Algerian-Ligurian-Provençal basins. Although these two models were based on different computer codes (MARS3D and ROMS), they have several elements in common, such as a 400 m resolution, boundary conditions from the same "father" model, and an important area of overlap, the Corsica channel, which has a key role in the exchange of water masses between the two oceanographic basins. In this work we present the results of the comparison of these two ocean forecasting systems in response to different weather and oceanographic forcing. In particular, we discuss aspects related to the validation of the two systems and a systematic comparison of the forecast/hindcast based on these hydrodynamic models, with respect both to operational models available at larger scale and to in-situ measurements made by fixed or mobile platforms. In this context we also present the results of two oceanographic cruises.

  10. A Large-Scale Multibody Manipulator Soft Sensor Model and Experiment Validation

    Directory of Open Access Journals (Sweden)

    Wu Ren

    2014-01-01

    Stress signals are difficult to obtain in the health monitoring of a multibody manipulator. In order to solve this problem, a soft sensor method is presented. In the method, the stress signal is considered the dominant variable and the angle signal is regarded as the auxiliary variable. By establishing the mathematical relationship between them, a soft sensor model is proposed. In the model, the stress information can be deduced from angle information, which can be easily measured in experiments for such structures. Finally, tests under ground and wall working conditions were done on a multibody manipulator test rig. The results show that the stress calculated by the proposed method is close to the measured one. Thus, the stress signal is easier to obtain than with the traditional method. All of this proves that the model is correct and the method is feasible.

  11. Modelling and upscaling of transport in carbonates during dissolution: Validation and calibration with NMR experiments.

    Science.gov (United States)

    Muljadi, Bagus P; Bijeljic, Branko; Blunt, Martin J; Colbourne, Adam; Sederman, Andy J; Mantle, Mick D; Gladden, Lynn F

    2017-09-01

    We present an experimental and numerical study of transport in carbonates during dissolution and its upscaling from the pore (∼μm) to core (∼cm) scale. For the experimental part, we use nuclear magnetic resonance (NMR) to probe molecular displacements (propagators) of an aqueous hydrochloric acid (HCl) solution through a Ketton limestone core. A series of propagator profiles are obtained at a large number of spatial points along the core at multiple time-steps during dissolution. For the numerical part, first, the transport model - a particle-tracking method based on Continuous Time Random Walks (CTRW) by Rhodes et al. (2008) - is validated at the pore scale by matching to the NMR-measured propagators in a beadpack, Bentheimer sandstone, and Portland carbonate (Scheven et al., 2005). It was found that the emerging distribution of particle transit times in these samples can be approximated satisfactorily by the power-law function ψ(t) ∼ t^(−1−β), with 0 < β < 2, whose fitted parameters define the CTRW model; the shape of the propagators is then predicted at later observation times. Finally, a numerical upscaling technique is employed to obtain CTRW parameters for the core. From the NMR-measured propagators, an increasing frequency of displacements in stagnant regions was apparent as the reaction progressed. The present model predicts that the non-Fickian behaviour exhibited at the pore scale persists at the centimetre scale.
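
    The truncated power law is easy to sample by inverse-CDF, which is the core of such CTRW particle tracking. The cutoffs and exponent below are illustrative, not the values fitted to the NMR propagators.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def sample_transit_times(n, beta=0.7, t1=1e-3, t2=1e2):
        """Inverse-CDF sampling of psi(t) ~ t^(-1-beta) truncated to [t1, t2]."""
        u = rng.uniform(size=n)
        a, b = t1**(-beta), t2**(-beta)
        return (a - u * (a - b)) ** (-1.0 / beta)

    t = sample_transit_times(100_000)
    print(f"mean transit time: {t.mean():.3f}, fraction > 10: {(t > 10).mean():.4f}")
    ```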

  12. Experiment for validation of fluid-structure interaction models and algorithms.

    Science.gov (United States)

    Hessenthaler, A; Gaddum, N R; Holub, O; Sinkus, R; Röhrle, O; Nordsletten, D

    2017-09-01

    In this paper a fluid-structure interaction (FSI) experiment is presented. The aim of this experiment is to provide a challenging yet easy-to-set-up FSI test case that addresses the need for rigorous testing of FSI algorithms and modeling frameworks. Steady-state and periodic steady-state test cases with constant and periodic inflow were established. The focus of the experiment is on biomedical engineering applications, with flow in the laminar regime at Reynolds numbers 1283 and 651. Flow and solid domains were defined using computer-aided design (CAD) tools. The experimental design aimed at providing a straightforward boundary condition definition. Material parameters and the mechanical response of a moderately viscous Newtonian fluid and a nonlinear incompressible solid were experimentally determined. A comprehensive data set was acquired by using magnetic resonance imaging to record the interaction between the fluid and the solid, quantifying flow and solid motion.

  13. ISOTHERMAL AIR INGRESS VALIDATION EXPERIMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Chang H Oh; Eung S Kim

    2011-09-01

    Idaho National Laboratory carried out air ingress experiments as part of validating computational fluid dynamics (CFD) calculations. An isothermal test loop was designed and set up to study the stratified-flow phenomenon, which governs the initial air flow into the lower plenum of the very high temperature gas-cooled reactor (VHTR) when a large-break loss-of-coolant accident occurs. The experiments focused on the unique flow characteristics of the VHTR air-ingress accident, in particular the flow visualization of the stratified flow in the inlet pipe to the vessel lower plenum of General Atomics' Gas Turbine-Modular Helium Reactor (GT-MHR). Brine and sucrose solutions were used as heavy fluids, and water was used to represent the light fluid, mimicking the countercurrent flow driven by the density difference between the simulant fluids. The density ratios were varied between 0.87 and 0.98. The experiments clearly showed that a stratified flow between the simulant fluids was established even for very small density differences. The CFD calculations were compared with the experimental data. A grid sensitivity study of the CFD models was also performed using Richardson extrapolation and the grid convergence index method to assess the numerical accuracy of the CFD calculations. As a result, the calculated current speed showed very good agreement with the experimental data, indicating that current CFD methods are suitable for predicting density-gradient stratified-flow phenomena in the air-ingress accident.
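
    The Richardson-extrapolation and grid-convergence-index (GCI) step can be sketched for three systematically refined grids; the solution values below are placeholders, not the INL results.

    ```python
    import numpy as np

    f1, f2, f3 = 0.215, 0.207, 0.175  # fine, medium, coarse solutions (illustrative)
    r = 2.0                           # grid refinement ratio

    # Observed order of accuracy and Richardson-extrapolated solution
    p = np.log((f3 - f2) / (f2 - f1)) / np.log(r)
    f_exact = f1 + (f1 - f2) / (r**p - 1.0)

    # Fine-grid GCI with the customary safety factor Fs = 1.25
    Fs = 1.25
    gci_fine = Fs * abs((f1 - f2) / f1) / (r**p - 1.0)

    print(f"observed order p = {p:.2f}")
    print(f"extrapolated value = {f_exact:.4f}")
    print(f"fine-grid GCI = {100 * gci_fine:.2f}%")
    ```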

  14. A model experiment to study sonic boom propagation through turbulence. Part III: validation of sonic boom propagation models.

    Science.gov (United States)

    Lipkens, Bart

    2002-01-01

    In previous papers, we have shown that model experiments are successful in simulating the propagation of sonic booms through the atmospheric turbulent boundary layer. The results from the model experiment, pressure wave forms of spark-produced N waves and turbulence characteristics of the plane jet, are used to test various sonic boom models for propagation through turbulence. Both wave form distortion models and rise time prediction models are tested. Pierce's model [A. D. Pierce, "Statistical theory of atmospheric turbulence effects on sonic boom rise times," J. Acoust. Soc. Am. 49, 906-924 (1971)] based on the wave front folding mechanism at a caustic yields an accurate prediction for the rise time of the mean wave form after propagation through the turbulence.

  15. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  16. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  17. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, interest in computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety...

  18. Validation of Friction Models in MARS-MultiD Module with Two-Phase Cross Flow Experiment

    International Nuclear Information System (INIS)

    Choi, Chi-Jin; Yang, Jin-Hwa; Cho, Hyoung-Kyu; Park, Goon-Cher; Euh, Dong-Jin

    2015-01-01

    In the downcomer of the Advanced Power Reactor 1400 (APR1400), which has direct vessel injection (DVI) lines as an emergency core cooling system, multidimensional two-phase flow may occur during a loss-of-coolant accident (LOCA). Accurate prediction of this flow is highly relevant to evaluating the integrity of the reactor core. For this reason, Yang performed an experiment to investigate the two-dimensional film flow that simulates the two-phase cross flow in the upper downcomer, and obtained local liquid film velocity and thickness data. From these data, it is possible to validate the multidimensional modules of system analysis codes. In this study, MARS-MultiD was used to simulate Yang's experiment and obtain the local variables. The friction models used in MARS-MultiD were then validated by comparing the calculated local variables with the two-phase flow experimental results. Compared with the experimental results, the code properly reproduced mass conservation, which can be inferred from the relation between the liquid film velocity and thickness at the same flow rate. The magnitude and direction of the liquid film, however, did not agree well with the experimental results. According to the results of Case-2, the wall friction should be increased and the interfacial friction decreased in MARS-MultiD. These results show that the friction models in MARS-MultiD need to be modified to simulate the two-phase cross flow.

  19. Three-dimensional fuel pin model validation by prediction of hydrogen distribution in cladding and comparison with experiment

    Energy Technology Data Exchange (ETDEWEB)

    Aly, A. [North Carolina State Univ., Raleigh, NC (United States); Avramova, Maria [North Carolina State Univ., Raleigh, NC (United States); Ivanov, Kostadin [Pennsylvania State Univ., University Park, PA (United States); Motta, Arthur [Pennsylvania State Univ., University Park, PA (United States); Lacroix, E. [Pennsylvania State Univ., University Park, PA (United States); Manera, Annalisa [Univ. of Michigan, Ann Arbor, MI (United States); Walter, D. [Univ. of Michigan, Ann Arbor, MI (United States); Williamson, R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gamble, K. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-10-29

    To correctly describe and predict the hydrogen distribution in cladding, there is a need for multi-physics coupling to provide accurate three-dimensional azimuthal, radial, and axial temperature distributions in the cladding. Coupled high-fidelity reactor-physics codes with a sub-channel code, as well as with a computational fluid dynamics (CFD) tool, have been used to calculate detailed temperature distributions. These high-fidelity coupled neutronics/thermal-hydraulics code systems are coupled further with the fuel-performance code BISON through a kernel (module) for hydrogen. Both hydrogen migration and precipitation/dissolution are included in the model. Results from this multi-physics analysis are validated using calculations of hydrogen distribution with models informed by data from hydrogen experiments and PIE data.

  20. Validation of Vibro-Impact Force Models by Numerical Simulation, Perturbation Methods and Experiments

    DEFF Research Database (Denmark)

    de Souza Reboucas, Geraldo Francisco; Santos, Ilmar; Thomsen, Jon Juel

    2017-01-01

    The frequency response of a single degree of freedom vibro-impact oscillator is analyzed using Harmonic Linearization, Averaging and Numeric Simulation, considering three different impact force models: one given by a piecewise-linear function (Kelvin-Voigt model), another by a high-order power function, and a third one combining the advantages of the other two.

  1. RELAP5 Model Description and Validation for the BR2 Loss-of-Flow Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Licht, J. R. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States); Van den Branden, G. [Argonne National Lab. (ANL), Argonne, IL (United States); Sikik, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Koonen, E. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-07-01

    This paper presents a description of the RELAP5 model, the calibration method used to obtain the minor loss coefficients from the available hydraulic data and the LOFA simulation results compared to the 1963 experimental tests for HEU fuel.

  2. Model and experiences of initiating collaboration with traditional healers in validation of ethnomedicines for HIV/AIDS in Namibia

    Directory of Open Access Journals (Sweden)

    Chinsembu Kazhila C

    2009-10-01

    Many people with Human Immunodeficiency Virus/Acquired Immunodeficiency Syndrome (HIV/AIDS) in Namibia have access to antiretroviral drugs but some still use traditional medicines to treat opportunistic infections and offset side-effects from antiretroviral medication. Namibia has a rich biodiversity of indigenous plants that could contain novel anti-HIV agents. However, such medicinal plants have not been identified and properly documented. Various ethnomedicines used to treat HIV/AIDS opportunistic infections have not been scientifically validated for safety and efficacy. These limitations are mostly attributable to the lack of collaboration between biomedical scientists and traditional healers. This paper presents a five-step contextual model for initiating collaboration with Namibian traditional healers so that candidate plants that may contain novel anti-HIV agents are identified, and traditional medicines used to treat HIV/AIDS opportunistic infections are subjected to scientific validation. The model includes key structures and processes used to initiate collaboration with traditional healers in Namibia; namely, the National Biosciences Forum, a steering committee with the University of Namibia (UNAM) as the focal point, a study tour to Zambia and South Africa where other collaborative frameworks were examined, commemorations of the African Traditional Medicine Day (ATMD), and consultations with stakeholders in north-eastern Namibia. Experiences from these structures and processes are discussed. All traditional healers in north-eastern Namibia were willing to collaborate with UNAM so that their traditional medicines could be subjected to scientific validation. The current study provides a framework for future collaboration with traditional healers and the selection of candidate anti-HIV medicinal plants and ethnomedicines for scientific testing in Namibia.

  3. Towards a CFD model for boiling flows: validation of QMOM predictions with TOPFLOW experiments

    OpenAIRE

    Buffo, Antonio; Vanni, Marco; Marchisio, Daniele L.; Montoya, Gustavo; Baglietto, Emilio

    2017-01-01

    Boiling flows are very complex systems, usually confined in vertical pipes, where liquid water moves upwards and steam bubbles are generated at the walls. The fluid dynamics of such systems is determined by the interplay of many different phenomena, including bubble nucleation, growth, condensation, coalescence, and breakage. For this reason, the development of a fully predictive computational fluid dynamics (CFD) model is very challenging; therefore we focus here only on some of the...

  4. A multimodal detection model of dolphins to estimate abundance validated by field experiments.

    Science.gov (United States)

    Akamatsu, Tomonari; Ura, Tamaki; Sugimatsu, Harumi; Bahl, Rajendar; Behera, Sandeep; Panda, Sudarsan; Khan, Muntaz; Kar, S K; Kar, C S; Kimura, Satoko; Sasaki-Yamamoto, Yukiko

    2013-09-01

    Abundance estimation of marine mammals requires matching detections of an animal or a group of animals by two independent means. A multimodal detection model using visual and acoustic cues (surfacing and phonation) that enables abundance estimation of dolphins is proposed. The method does not require a specific time window to match the cues of both means when applying the mark-recapture method. The proposed model was evaluated using data obtained in field observations of Ganges River dolphins and Irrawaddy dolphins, as examples of dispersed and condensed distributions of animals, respectively. The acoustic detection probability was approximately 80%, 20% higher than that of visual detection, for both species and regardless of the distribution of the animals in the present study sites. The abundance estimates of Ganges River dolphins and Irrawaddy dolphins agreed fairly well with the numbers reported in previous monitoring studies. The detection probability for single animals was smaller than that for larger cluster sizes, as predicted by the model and confirmed by field data. However, dense groups of Irrawaddy dolphins showed differences in the cluster sizes observed by the visual and acoustic methods. The lower detection probability for single clusters of this species seemed to be caused by its clumped distribution.
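
    As a hedged illustration of how two independent cues raise the overall detection probability, consider the generic double-observer sketch below; it is not the authors' estimator, and the probabilities are rounded from the abstract:

    ```python
    # Sketch: combining independent visual and acoustic detection probabilities,
    # in the spirit of double-observer / mark-recapture abundance estimation.

    def combined_detection(p_visual: float, p_acoustic: float) -> float:
        """Probability that at least one of two independent cues detects a cluster."""
        return 1.0 - (1.0 - p_visual) * (1.0 - p_acoustic)

    def abundance_estimate(n_detected: int, p_detect: float) -> float:
        """Horvitz-Thompson-style correction of a raw count by detection probability."""
        return n_detected / p_detect

    p = combined_detection(0.60, 0.80)  # ~0.92 with these illustrative values
    print(abundance_estimate(46, p))    # a raw count of 46 clusters -> ~50
    ```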

  5. HEDR model validation plan

    International Nuclear Information System (INIS)

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1993-06-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational ''tools'' for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to ''validate'' these tools. In the sense of the HEDR Project, ''validation'' is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model.

  6. Subsurface Bio-Immobilization of Plutonium: Experiment and Model Validation Study

    Energy Technology Data Exchange (ETDEWEB)

    Reed, Donald; Rittmann, Bruce

    2006-06-01

    The goal of this project is to conduct a concurrent experimental and modeling study centered on the interactions of Shewanella algae BrY with plutonium and uranium species and phases. The most important objective of this research is to investigate the long-term stability of bioprecipitated, immobilized actinide phases under changing redox conditions in biologically active systems. The long-term stability of bio-immobilized actinides (e.g. by bio-reduction) is a key criterion that defines the utility and effectiveness of a remediation/containment strategy for subsurface actinide contaminants. Plutonium, which is the focus of this project, is the key contaminant of concern at several DOE sites.

  7. Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    model structure suggested by University of Lund, the WP4 leader. This particular model structure has the advantage that it fits better into the control design framework used by WP3-4 compared to the model structures previously developed in WP2. The different model structures are first summarised... Then issues dealing with optimal experimental design are considered. Finally the parameters are estimated in the chosen static and dynamic models and a validation is performed. Two of the static models, one of them the additive model, explain the data well. In case of dynamic models the suggested additive...

  8. Validation of a primary production model of the desert shrub Larrea tridentata using soil-moisture augmentation experiments.

    Science.gov (United States)

    Reynolds, James F; Cunningham, Gary L

    1981-01-01

    In previous papers we have described and verified a primary production model of the desert shrub Larrea tridentata. Here we address the validation phase of the evaluation of this model. Two versions of the model which differ in the priority scheme used for allocating carbon to reproductive or vegetative organs were compared on the basis of their usefulness and reliability over a range of soil-moisture conditions. Over an entire growing season when soil-moisture conditions were near "normal" both versions of the model were adequate predictors of total above-ground vegetative growth and one was an adequate predictor of reproductive growth as well. A more detailed analysis revealed that the versions varied in the range of soil-moisture conditions over which they were adequate and that neither was adequate when soil-moisture had remained high for extended periods. The validation process has revealed some likely areas for model improvement to increase adequacy.

  9. Validation of CTF Droplet Entrainment and Annular/Mist Closure Models using Riso Steam/Water Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Wysocki, Aaron J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Salko, Robert K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-01

    This report summarizes the work done to validate the droplet entrainment and de-entrainment models as well as two-phase closure models in the CTF code by comparison with experimental data obtained at Riso National Laboratory. The Riso data included a series of over 250 steam/water experiments that were performed in both tube and annulus geometries over a range of various pressures and outlet qualities. Experimental conditions were set so that the majority of cases were in the annular/mist flow regime. Measurements included liquid film flow rate, droplet flow rate, film thickness, and two-phase pressure drop. CTF was used to model 180 of the tubular geometry cases, matching experimental geometry, outlet pressure, and outlet flow quality to experimental values. CTF results were compared to the experimental data at the outlet of the test section in terms of vapor and entrained liquid flow fractions, pressure drop per unit length, and liquid film thickness. The entire process of generating CTF input decks, running cases, extracting data, and generating comparison plots was scripted using Python and Matplotlib for a completely automated validation process. All test cases and scripting tools have been committed to the COBRA-TF master repository and selected cases have been added to the continuous testing system to serve as regression tests. The differences between the CTF- and experimentally-calculated flow fraction values were consistent with previous calculations by Wurtz, who applied the same entrainment correlation to the same data. It has been found that CTF's entrainment/de-entrainment predictive capability in the annular/mist flow regime for this particular facility is comparable to the licensed industry code, COBRAG. While film and droplet predictions are generally good, it has been found that accuracy is diminished at lower flow qualities. This finding is consistent with the noted deficiencies in the Wurtz entrainment model employed by CTF. The CTF predicted two-phase pressure drop in
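
    The fully scripted workflow described above can be pictured with a short driver of the following shape. This is a sketch only: the wrapper script name, deck format, and output parsing are assumptions, not the actual CTF tooling committed to the COBRA-TF repository.

    ```python
    # Hypothetical validation driver: write a deck, run the case through a local
    # wrapper script, and accumulate a measured-vs-predicted parity plot.
    import subprocess
    from pathlib import Path

    import matplotlib.pyplot as plt

    def run_case(case_id: str, pressure_mpa: float, quality: float) -> float:
        deck = Path(f"{case_id}.inp")
        deck.write_text(f"pressure = {pressure_mpa}\nquality = {quality}\n")  # placeholder deck
        subprocess.run(["./run_ctf_case.sh", deck.name], check=True)  # hypothetical wrapper
        return float(Path(f"{case_id}.out").read_text())  # e.g. entrained liquid fraction

    cases = {"riso_001": (7.0, 0.5), "riso_002": (7.0, 0.7)}  # illustrative test matrix
    measured = {"riso_001": 0.42, "riso_002": 0.61}           # illustrative data

    for case_id, (p, x) in cases.items():
        plt.scatter(measured[case_id], run_case(case_id, p, x))
    plt.plot([0, 1], [0, 1], "k--")  # perfect-agreement line
    plt.xlabel("Measured entrained liquid fraction")
    plt.ylabel("Predicted entrained liquid fraction")
    plt.savefig("parity.png")
    ```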

  10. Validating the energy transport modeling of the DIII-D and EAST ramp up experiments using TSC

    Science.gov (United States)

    Liu, Li; Guo, Yong; Chan, Vincent; Mao, Shifeng; Wang, Yifeng; Pan, Chengkang; Luo, Zhengping; Zhao, Hailin; Ye, Minyou

    2017-06-01

    The confidence in ramp up scenario design of the China Fusion Engineering Test Reactor (CFETR) can be significantly enhanced using validated transport models to predict the current profile and temperature profile. In the tokamak simulation code (TSC), two semi-empirical energy transport models (the Coppi-Tang (CT) and BGB models) and three theory-based models (the GLF23, MMM95 and CDBM models) are investigated on CFETR-relevant ramp up discharges, including three DIII-D ITER-like ramp up discharges and one EAST ohmic discharge. For the DIII-D discharges, all the transport models yield the dynamic internal inductance ℓ_i within ±0.15 deviations except for some time points where the experimental fluctuation is very strong. All the models agree with the experimental poloidal beta β_p except that the CT model strongly overestimates β_p in the first half of the ramp up phase. When the CT, CDBM and GLF23 models are applied to estimate the internal flux, they show maximum deviations of more than 10% because of inaccuracies in the temperature profile predictions, while the BGB model performs best on the internal flux. Although all the models fall short in reproducing the dynamic ℓ_i evolution for the EAST tokamak, the result of the BGB model is the closest to the experimental ℓ_i. Based on these comparisons, we conclude that the BGB model is the most consistent among these models for simulating CFETR ohmic ramp-up. The CT model, with improvement for better simulation of the temperature profiles in the first half of the ramp up phase, will also be attractive. For the MMM95, GLF23 and CDBM models, better prediction of the edge temperature will improve the confidence for CFETR L-mode simulation. Conclusive validation of any transport model will require extensive future investigation covering a larger variety of discharges.

  11. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated respectively. Also appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing bench-mark problems, and by analysing experiments. In particular the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  12. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always related to an experimental case. Empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. This methodology will guide us to detect the failures of the simulation model. Furthermore, it can be used as a guide in the design of subsequent experiments. Three steps can be well differentiated: Sensitivity analysis, which can be made with DSA (differential sensitivity analysis) and with MCSA (Monte-Carlo sensitivity analysis). Finding the optimal domains of the input parameters; a procedure based on Monte-Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. Residual analysis, made in the time domain and in the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings in Spain is presented, studying the behavior of building components in a Test Cell of LECE of CIEMAT. (Author) 17 refs

  13. The role of CFD combustion modeling in hydrogen safety management – V: Validation for slow deflagrations in homogeneous hydrogen-air experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Holler, Tadej, E-mail: tadej.holler@ijs.si [Jozef Stefan Institute (JSI), Jamova cesta 39, 1000 Ljubljana (Slovenia); Kljenak, Ivo [Jozef Stefan Institute (JSI), Jamova cesta 39, 1000 Ljubljana (Slovenia); Komen, Ed [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands)

    2016-12-15

    Highlights: • Validation of the modeling approach for hydrogen deflagration is presented. • Modeling approach is based on two combustion models implemented in ANSYS Fluent. • Experiments with various initial hydrogen concentrations were used for validation. • The effects of heat transfer mechanisms selection were also investigated. • The grid sensitivity analysis was performed as well. - Abstract: The control of hydrogen in the containment is an important safety issue following rapid oxidation of the uncovered reactor core during a severe accident in a Nuclear Power Plant (NPP), because dynamic pressure loads from possible hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. In our previous papers, a CFD-based method to assess the consequences of fast combustion of uniform hydrogen-air mixtures was presented, followed by its validation for hydrogen-air mixtures with diluents and for non-uniform hydrogen-air mixtures. In the present paper, the extension of this model to the slow deflagration regime is presented and validated using the hydrogen deflagration experiments performed in the medium-scale experimental facility THAI. The proposed method is implemented in the CFD software ANSYS Fluent using user defined functions. The paper describes the combustion model and the main results of code validation. It addresses questions regarding turbulence model selection, the effect of heat transfer mechanisms, and grid sensitivity, and provides insights into the importance of combustion model choice for the slow deflagration regime of hydrogen combustion in medium-scale and large-scale experimental vessels mimicking the NPP containment.

  14. Elements of a pragmatic approach for dealing with bias and uncertainty in experiments through predictions: experiment design and data conditioning; "real space" model validation and conditioning; hierarchical modeling and extrapolative prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose

    2011-11-01

    This report explores some important considerations in devising a practical and consistent framework and methodology for utilizing experiments and experimental data to support modeling and prediction. A pragmatic and versatile 'Real Space' approach is outlined for confronting experimental and modeling bias and uncertainty to mitigate risk in modeling and prediction. The elements of experiment design and data analysis, data conditioning, model conditioning, model validation, hierarchical modeling, and extrapolative prediction under uncertainty are examined. An appreciation can be gained for the constraints and difficulties at play in devising a viable end-to-end methodology. Rationale is given for the various choices underlying the Real Space end-to-end approach. The approach adopts and refines some elements and constructs from the literature and adds pivotal new elements and constructs. Crucially, the approach reflects a pragmatism and versatility derived from working many industrial-scale problems involving complex physics and constitutive models, steady-state and time-varying nonlinear behavior and boundary conditions, and various types of uncertainty in experiments and models. The framework benefits from a broad exposure to integrated experimental and modeling activities in the areas of heat transfer, solid and structural mechanics, irradiated electronics, and combustion in fluids and solids.

  15. Validation Study for an Atmospheric Dispersion Model, Using Effective Source Heights Determined from Wind Tunnel Experiments in Nuclear Safety Analysis

    Directory of Open Access Journals (Sweden)

    Masamichi Oura

    2018-03-01

    For more than fifty years, atmospheric dispersion predictions based on the joint use of a Gaussian plume model and wind tunnel experiments have been applied in both Japan and the U.K. for the evaluation of public radiation exposure in nuclear safety analysis. The effective source height used in the Gaussian model is determined from ground-level concentration data obtained by a wind tunnel experiment using a scaled terrain and site model. In the present paper, the concentrations calculated by this method are compared with data observed over complex terrain in the field, under a number of meteorological conditions. Good agreement was confirmed under near-neutral and unstable stability conditions. However, it was found to be necessary to reduce the effective source height by 50% in order to achieve a conservative estimation of the field observations in a stable atmosphere.
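
    For orientation, the method's core relation is the standard ground-level, centerline Gaussian plume concentration with an effective source height He, and the stable-case correction reported above amounts to halving He. A minimal sketch follows; the sigma parameterizations are illustrative power laws, not the wind-tunnel-derived values used in the paper.

    ```python
    import numpy as np

    def glc(x, q, u, he, sigma_y, sigma_z):
        """Ground-level centerline Gaussian plume concentration (y=0, z=0, full
        ground reflection): C = Q / (pi*u*sy*sz) * exp(-He^2 / (2*sz^2))."""
        return q / (np.pi * u * sigma_y * sigma_z) * np.exp(-he**2 / (2.0 * sigma_z**2))

    x = np.linspace(200.0, 5000.0, 100)      # downwind distance [m]
    sy, sz = 0.08 * x**0.9, 0.06 * x**0.8    # assumed dispersion curves
    he_neutral = 100.0                       # effective source height [m]
    c_neutral = glc(x, q=1.0, u=3.0, he=he_neutral, sigma_y=sy, sigma_z=sz)
    c_stable = glc(x, 1.0, 3.0, 0.5 * he_neutral, sy, sz)  # 50% height reduction
    ```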

  16. ATHLET validation using accident management experiments

    Energy Technology Data Exchange (ETDEWEB)

    Teschendorff, V.; Glaeser, H.; Steinhoff, F. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Garching (Germany)

    1995-09-01

    The computer code ATHLET is being developed as an advanced best-estimate code for the simulation of leaks and transients in PWRs and BWRs including beyond design basis accidents. The code has features that are of special interest for applications to small leaks and transients with accident management, e.g. initialisation by a steady-state calculation, full-range drift-flux model, and dynamic mixture level tracking. The General Control Simulation Module of ATHLET is a flexible tool for the simulation of the balance-of-plant and control systems including the various operator actions in the course of accident sequences with AM measures. The systematic validation of ATHLET is based on a well balanced set of integral and separate effect tests derived from the CSNI proposal, emphasising, however, the German combined ECC injection system which was investigated in the UPTF, PKL and LOBI test facilities. PKL-III test B 2.1 simulates a cool-down procedure during an emergency power case with three steam generators isolated. Natural circulation under these conditions was investigated in detail in a pressure range of 4 to 2 MPa. The transient was calculated over 22000 s with complicated boundary conditions including manual control actions. The calculations demonstrate the capability to model the following processes successfully: (1) variation of the natural circulation caused by steam generator isolation, (2) vapour formation in the U-tubes of the isolated steam generators, (3) break-down of circulation in the loop containing the isolated steam generator following controlled cool-down of the secondary side, (4) accumulation of vapour in the pressure vessel dome. One conclusion with respect to the suitability of experiments simulating AM procedures for code validation purposes is that complete documentation of control actions during the experiment must be available. Special attention should be given to the documentation of operator actions in the course of the experiment.

  17. Lidar Atmospheric Sensing Experiment (LASE) Validation

    Data.gov (United States)

    National Aeronautics and Space Administration — An extensive validation experiment was conducted in September 1995 from Wallops Island, Virginia, to evaluate the performance of the LASE (Lidar Atmospheric Sensing...

  18. Validation of the CATHENA channel model for the post blowdown analysis for the CS28-1 experiment, II - transient

    International Nuclear Information System (INIS)

    Rhee, B.W.; Park, J.H.

    2006-01-01

    To form a licensing basis for the new methodology of the fuel channel safety analysis code system for CANDU-6, a CATHENA model for the post-blowdown fuel channel analysis has been developed and tested against the high-temperature thermal-chemical experiment CS28-1. Pursuant to its objective, the current study has focused on understanding the phenomena involved, their interrelations, and how to maintain good accuracy in the temperature and H2 generation rate predictions without losing the important physics of the phenomena involved. The transient simulation results for the FESs of the three fuel rings and the pressure tube were quite good, as shown in Figs. 3-6. However, this raises the question of how the transient FES and pressure tube temperatures can be predicted so well in spite of the insufficient justification for using the 'non-participating medium' assumption for the CO2 gas gap. Through this study, it was found that the radiation heat transfer model of CATHENA among the FES of the three rings and the pressure tube, as well as the exothermic metal-water reaction model based on the Urbanic-Heidrick correlation, are quite accurate and sound. It was also found that an accurate prediction of the initial condition of the experiment is very important for an accurate prediction of the whole transient, as it serves as the starting point of the transient. (author)

  19. Validity - a matter of resonant experience

    DEFF Research Database (Denmark)

    Revsbæk, Line

    This paper is about doing interview analysis drawing on the researcher's own lived experience concerning the question of inquiry. The paper exemplifies analyzing case study participants' experience from the resonant experience of the researcher's own life, evoked while listening to recorded interview material... entry processes. The validity of doing interview analysis drawing on the resonant experience of the researcher is argued from a pragmatist perspective.

  20. Influence of deformation mechanisms on the mechanical behavior of metals and alloys: Experiments, constitutive modeling, and validation

    International Nuclear Information System (INIS)

    Gray, G.T. III; Cerreta, E.; Chen, S.R.; Maudlin, P.J.

    2004-01-01

    Jim Williams has made seminal contributions to the field of structure/property relations and its controlling effects on the mechanical behavior of metals and alloys. This talk will discuss experimental results illustrating the role of interstitial content, grain size, texture, temperature, and strain rate on the operative deformation mechanisms, mechanical behavior, and substructure evolution in titanium, zirconium, hafnium, and rhenium. Increasing grain size is shown to significantly decrease the dynamic flow strength of Ti and Zr while increasing work-hardening rates due to an increased incidence of deformation twinning. Increasing oxygen interstitial content is shown to significantly alter both the constitutive response and α-ω shock-induced phase transition in Zr. The influence of crystallographic texture on the mechanical behavior in Ti, Zr, and Hf is discussed in terms of slip system and deformation twinning activity. An example of the utility of incorporation of operative deformation mechanisms into a polycrystalline plasticity constitutive model and validation using Taylor cylinder impact testing is presented

  1. An attempt to calibrate and validate a simple ductile failure model against axial-torsion experiments on Al 6061-T651

    Energy Technology Data Exchange (ETDEWEB)

    Reedlunn, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lu, Wei -Yang [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-01-01

    This report details a work in progress. We have attempted to calibrate and validate a von Mises plasticity model with the Johnson-Cook failure criterion (Johnson & Cook, 1985) against a set of experiments on various specimens of Al 6061-T651. As will be shown, the effort was not successful, despite considerable attention to detail. When the model was compared against axial-torsion experiments on tubes, it over-predicted failure by 3x in tension, and never predicted failure in torsion, even when the tube was twisted 4x further than in the experiment. While this result is unfortunate, it is not surprising. Ductile failure is not well understood. In future work, we will explore whether more sophisticated material models of plasticity and failure will improve the predictions. Selecting the appropriate advanced material model and interpreting the results of said model are not trivial exercises, so it is worthwhile to fully investigate the behavior of a simple plasticity model before moving on to an anisotropic yield surface or a similarly complicated model.
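
    For reference, the Johnson-Cook (1985) criterion accumulates damage against a triaxiality-, rate-, and temperature-dependent failure strain. A minimal sketch with placeholder constants (not a calibrated Al 6061-T651 parameter set) is:

    ```python
    import math

    def jc_failure_strain(triax, eps_rate_star=1.0, t_star=0.0,
                          d1=0.07, d2=0.45, d3=-1.6, d4=0.01, d5=0.0):
        """Johnson-Cook failure strain:
        eps_f = [D1 + D2*exp(D3*sigma*)] * [1 + D4*ln(rate*)] * [1 + D5*T*],
        where sigma* is the stress triaxiality sigma_m/sigma_eq.
        The D1..D5 values here are illustrative placeholders only."""
        return ((d1 + d2 * math.exp(d3 * triax))
                * (1.0 + d4 * math.log(eps_rate_star))
                * (1.0 + d5 * t_star))

    def damage_increment(d_eps_plastic, triax):
        """Accumulate damage D += d(eps_p)/eps_f; failure is predicted at D = 1."""
        return d_eps_plastic / jc_failure_strain(triax)

    # Uniaxial tension (triaxiality ~ 1/3) fails at much lower strain than pure
    # torsion (~ 0), the qualitative contrast the tube experiments probe.
    print(jc_failure_strain(1.0 / 3.0), jc_failure_strain(0.0))
    ```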

  2. The role of CFD combustion modelling in hydrogen safety management – VI: Validation for slow deflagration in homogeneous hydrogen-air-steam experiments

    Energy Technology Data Exchange (ETDEWEB)

    Cutrono Rakhimov, A., E-mail: cutrono@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Visser, D.C., E-mail: visser@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Holler, T., E-mail: tadej.holler@ijs.si [Jožef Stefan Institute (JSI), Jamova cesta 39, 1000 Ljubljana (Slovenia); Komen, E.M.J., E-mail: komen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands)

    2017-01-15

    Highlights: • Deflagration of hydrogen-air-steam homogeneous mixtures is modeled in a medium-scale containment. • Adaptive mesh refinement is applied on flame front positions. • Steam effect influence on combustion modeling capabilities is investigated. • Mean pressure rise is predicted with 18% under-prediction when steam is involved. • Peak pressure is evaluated with 5% accuracy when steam is involved. - Abstract: Large quantities of hydrogen can be generated during a severe accident in a water-cooled nuclear reactor. When released in the containment, the hydrogen can create a potential deflagration risk. The dynamic pressure loads resulting from hydrogen combustion can be detrimental to the structural integrity of the reactor. Therefore, accurate prediction of these pressure loads is an important safety issue. In previous papers, we validated a Computational Fluid Dynamics (CFD) based method to determine the pressure loads from a fast deflagration. The combustion model applied in the CFD method is based on the Turbulent Flame Speed Closure (TFC). In our last paper, we presented the extension of this combustion model, the Extended Turbulent Flame Speed Closure (ETFC), and its validation against hydrogen deflagration experiments in the slow deflagration regime. During a severe accident, cooling water will enter the containment as steam. Therefore, the effect of steam on hydrogen deflagration is important to capture in a CFD model. The primary objectives of the present paper are to further validate the TFC and ETFC combustion models, and to investigate their capability to predict the effect of steam. The peak pressures, the trends of the flame velocity, and the pressure rise with an increase in the initial steam dilution are captured reasonably well by both combustion models. In addition, the ETFC model appeared to be more robust to mesh resolution changes. The mean pressure rise is evaluated with 18% under-prediction and the peak pressure is evaluated with 5% accuracy.

  3. Characterization of Aluminum Honeycomb and Experimentation for Model Development and Validation, Volume I: Discovery and Characterization Experiments for High-Density Aluminum Honeycomb

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Wei-Yang [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Mechanics of Materials; Korellis, John S. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Mechanics of Materials; Lee, Kenneth L. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Mechanics of Materials; Scheffel, Simon [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Mechanics of Materials; Hinnerichs, Terry Dean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Solid Mechanics; Neilsen, Michael K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Applied Mechanics Development; Scherzinger, William Mark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Solid Mechanics

    2006-08-01

    Honeycomb is a structure that consists of two-dimensional regular arrays of open cells. High-density aluminum honeycomb has been used in weapon assemblies to mitigate shock and protect payloads because of its excellent crush properties. In order to use honeycomb efficiently and to certify that the payload is protected by the honeycomb under various loading conditions, a validated honeycomb crush model is required and the mechanical properties of the honeycombs need to be fully characterized. Volume I of this report documents an experimental study of the crush behavior of high-density honeycombs. Two sets of honeycombs were included in this investigation: commercial grade for initial exploratory experiments, and weapon grade, which satisfied B61 specifications. This investigation also includes developing proper experimental methods for crush characterization, conducting discovery experiments to explore crush behaviors for model improvement, and identifying experimental and material uncertainties.

  4. Validating Animal Models

    Directory of Open Access Journals (Sweden)

    Nina Atanasova

    2015-06-01

    In this paper, I respond to the challenge raised against contemporary experimental neurobiology according to which the field is in a state of crisis because the multiple experimental protocols employed in different laboratories presumably preclude the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified, and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates thriving rather than a crisis of experimental neurobiology.

  5. The role of CFD combustion modeling in hydrogen safety management – III: Validation based on homogeneous hydrogen–air–diluent experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap, E-mail: pratap.sathiah78@gmail.com [Shell Global Solutions Ltd., Brabazon House, Concord Business Park, Threapwood Road, Manchester M220RR (United Kingdom); Komen, Ed [Nuclear Research and Consultancy Group – NRG, P.O. Box 25, 1755 ZG Petten (Netherlands); Roekaerts, Dirk [Delft University of Technology, P.O. Box 5, 2600 AA Delft (Netherlands)

    2015-08-15

    Highlights: • A CFD based method proposed in the previous article is used for the simulation of the effect of CO{sub 2}–He dilution on hydrogen deflagration. • A theoretical study is presented to verify whether CO{sub 2}–He diluent can be used as a replacement for H{sub 2}O as diluent. • The CFD model used for the validation work is described. • TFC combustion model results are in good agreement with large-scale homogeneous hydrogen–air–CO{sub 2}–He experiments. - Abstract: Large quantities of hydrogen can be generated and released into the containment during a severe accident in a PWR. The generated hydrogen, when mixed with air, can lead to hydrogen combustion. The dynamic pressure loads resulting from hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. Therefore, accurate prediction of these pressure loads is an important safety issue. In our previous article, a CFD based method to determine these pressure loads was presented. This CFD method is based on the application of a turbulent flame speed closure combustion model. The method was validated against three uniform hydrogen–air deflagration experiments with different blockage ratios performed in the ENACCEF facility. It was concluded that the maximum pressures were predicted within 13% accuracy, while the rate of pressure rise dp/dt was predicted within about 30%. The eigenfrequencies of the residual pressure wave phenomena were predicted within a few %. In the present article, we perform additional validation of the CFD based method against three uniform hydrogen–air–CO{sub 2}–He deflagration experiments with three different concentrations of the CO{sub 2}–He diluent. The trends of decrease in the flame velocity, the intermediate peak pressure, the rate of pressure rise dp/dt, and the maximum value of the mean pressure with an increase in the CO{sub 2}–He dilution are captured well in the simulations. From the

  6. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  7. Stratification issues in the primary system. Review of available validation experiments and State-of-the-Art in modelling capabilities (StratRev)

    Energy Technology Data Exchange (ETDEWEB)

    Westin, J.; Henriksson, M. (Vattenfall Research and Development AB (Sweden)); Paettikangas, T. (VTT (Finland)); Toppila, T.; Raemae, T. (Fortum Nuclear Services Ltd (Finland)); Kudinov, P. (KTH Nuclear Power Safety (Sweden)); Anglart, H. (KTH Nuclear Reactor Technology (Sweden))

    2009-08-15

    The objective of the present report is to review available validation experiments and the state of the art in modelling of stratification and mixing in the primary system of Light Water Reactors. A topical workshop was arranged in Älvkarleby in June 2008 within the framework of BWR-OG, and the presentations from various utilities showed that stratification issues are not unusual and can cause costly stops in production. It is desirable to take actions in order to reduce the probability for stratification to occur, and to develop well-validated and accepted tools and procedures for analyzing upcoming stratification events. A research plan covering the main questions is outlined, and a few suggestions regarding more limited research activities are given. Since many of the stratification events result in thermal loads that are localized in time and space, CFD is a suitable tool. However, the often very large and complex geometry poses a great challenge to CFD, and it is important to perform a step-by-step increase in complexity with intermediate validation against relevant experimental data. The ultimate goal is to establish Best Practice Guidelines that can be followed both by utilities and authorities in case of an event including stratification and thermal loads. An extension of the existing Best Practice Guidelines for CFD in nuclear safety applications developed by OECD/NEA is thus suggested as a relevant target for a continuation project. (au)

  8. Validating Dart Model

    Directory of Open Access Journals (Sweden)

    Mazur Jolanta

    2014-12-01

    The primary objective of the study was to quantitatively test the DART model, which despite being one of the most popular representations of the co-creation concept was so far studied almost solely with qualitative methods. To this end, the researchers developed a multiple measurement scale and employed it in interviewing managers. The statistical evidence for adequacy of the model was obtained through CFA with AMOS software. The findings suggest that the DART model may not be an accurate representation of co-creation practices in companies. From the data analysis it was evident that the building blocks of DART had too much conceptual overlap to be an effective framework for quantitative analysis. It was also implied that the phenomenon of co-creation is so rich and multifaceted that it may be more adequately captured by a measurement model where co-creation is conceived as a third-level factor with two layers of intermediate latent variables.

  9. BIOMOVS: an international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1988-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (author)

  10. BIOMOVS: An international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1987-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (orig.)

  11. Validation of MCCI models implemented in ASTEC MEDICIS on OECD CCI-2 and CCI-3 experiments and further consideration on reactor cases

    International Nuclear Information System (INIS)

    Agethen, K.; Koch, M.K.

    2014-01-01

    During a severe accident in a light water reactor, a loss of coolant can result in core melting and vessel failure. Afterwards, molten core material may discharge into the containment cavity and interact with the concrete basemat. Due to concrete erosion, gases are released, which lead to exothermic oxidation reactions with the metals in the corium and to the formation of combustible mixtures. In this work the MEDICIS module of the Accident Source Term Evaluation Code (ASTEC) is validated against experiments of the OECD CCI programme. The primary focus is on the CCI-2 experiment with limestone common sand (LCS) concrete, in which nearly homogeneous erosion appeared, and the CCI-3 experiment with siliceous concrete, in which increased lateral erosion occurred. These experiments enable the analysis of heat transfer, depending on the axial and radial orientation, from the interior of the melt to the surrounding surfaces, and of the impact of top flooding. For the simulation of both tests, two existing models in MEDICIS are used and analysed. The simulation results show good agreement of the ablation behaviour, layer temperature and energy balance with the experimental results. Furthermore, a quasi-steady state appeared in the long-term energy balance. Finally, the basic data are scaled up to a generic reactor scenario, which shows that this quasi-steady state similarly occurs. (author)

  12. Suicidal Ideation, Parent-Child Relationships, and Adverse Childhood Experiences: A Cross-Validation Study Using a Graphical Markov Model

    Science.gov (United States)

    Hardt, Jochen; Herke, Max; Schier, Katarzyna

    2011-01-01

    Suicide is one of the leading causes of death in many Western countries. An exploration of factors associated with suicidality may help to understand the mechanisms that lead to suicide. Two samples in Germany (n = 500 and n = 477) were examined via the Internet regarding suicidality, depression, alcohol abuse, adverse childhood experiences, and…

  13. CFD code development for incompressible two-phase flow using two-fluid model: preliminary calculation and plume validation experiment

    International Nuclear Information System (INIS)

    Heo, B. G.; Jung, C. H.; Yoon, H. Y.; Yeo, D. J.; Song, C. H.

    2002-01-01

    A multidimensional numerical code for solving incompressible two-fluid flow is presented, based on the Finite Volume Method (FVM) and the Simplified Marker And Cell (SMAC) method. Details of the present method and comparisons between calculation and experiment are described for two-dimensional bubbly flow patterns, which show good agreement. Further implementation of interfacial correlations is required for the application of the present code to various two-phase problems.

  14. Agglomeration of Non-metallic Inclusions at Steel/Ar Interface: In- Situ Observation Experiments and Model Validation

    Science.gov (United States)

    Mu, Wangzhong; Dogan, Neslihan; Coley, Kenneth S.

    2017-10-01

    Better understanding of the agglomeration behavior of nonmetallic inclusions in the steelmaking process is important to control the cleanliness of the steel. In this work, a revision of the Paunov simplified model has been made based on the original Kralchevsky-Paunov model. This model has been applied to quantitatively calculate the attractive capillary force on inclusions agglomerating at the liquid steel/gas interface. Moreover, the agglomeration behavior of Al2O3 inclusions at a low carbon steel/Ar interface has been observed in situ by high-temperature confocal laser scanning microscopy (CLSM). The velocity and acceleration of inclusions and the attractive forces between Al2O3 inclusions of various sizes were calculated based on the CLSM video. The results calculated using the revised model offered a reasonable fit with the present experimental data for different inclusion sizes. Moreover, a quantitative comparison was made between calculations using the equivalent radius of a circle and those using the effective radius. It was found that the capillary force calculated using the equivalent radius offered a better fit with the present experimental data because of the inclusion characteristics. Comparing these results with other studies in the literature allowed the authors to conclude that, when applied in capillary force calculations, the equivalent radius is more suitable for inclusions with large size and irregular shape, and the effective radius is more appropriate for inclusions with small size or a large shape factor. Using this model, the effect of inclusion size on the attractive capillary force has been investigated, demonstrating that larger inclusions are more strongly attracted.
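
    The radius choice compared above can be made concrete with a small sketch. The area-based equivalent radius is standard; the perimeter-based "effective radius" below is one plausible reading, since the abstract does not give the paper's exact definition, and the numbers are illustrative:

    ```python
    import math

    def equivalent_radius(area: float) -> float:
        """Radius of the circle with the same projected area as the inclusion."""
        return math.sqrt(area / math.pi)

    def effective_radius(perimeter: float) -> float:
        """One common perimeter-based definition (an assumption here)."""
        return perimeter / (2.0 * math.pi)

    # For an irregular inclusion the measured perimeter exceeds that of the
    # equal-area circle, so the two radii bracket the capillary-force estimate.
    area, perimeter = 120.0, 48.0  # illustrative values, e.g. um^2 and um
    print(equivalent_radius(area), effective_radius(perimeter))
    ```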

  15. Study of atmospheric stratification influence on pollutants dispersion using a numerical fluid mechanics model. Code-Saturne validation with the Prairie Grass experiment

    International Nuclear Information System (INIS)

    Coulon, Fanny

    2010-09-01

    A validation of Code-Saturne, a computational fluid dynamics model developed by EDF, is proposed for stable conditions. The goal is to guarantee the performance of the model in order to use it for impact studies. A comparison with the Prairie Grass field experiment data and with two Gaussian plume models is presented [fr

  16. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods are weak in completeness: they are carried out at a single scale and depend on human experience. An SDG (Signed Directed Graph) and qualitative trend based multiple scale validation is therefore proposed. First the SDG model is built and qualitative trends are added to the model. Then complete testing scenarios are produced by positive inference. The multiple scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness is demonstrated by carrying out validation for a reactor model.
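
    A minimal sketch of the positive-inference step on a signed directed graph follows; the graph, edge signs, and fault are illustrative, not the paper's reactor model:

    ```python
    # Qualitative propagation on a signed directed graph (SDG): a node deviation
    # (+1 high, -1 low) propagates along each edge multiplied by the edge sign,
    # generating the testing scenarios that positive inference enumerates.
    from collections import deque

    edges = {  # node -> list of (successor, sign); illustrative process graph
        "feed_rate": [("level", +1)],
        "level": [("pressure", +1)],
        "cooling": [("pressure", -1)],
    }

    def propagate(fault_node: str, deviation: int) -> dict:
        state = {fault_node: deviation}
        queue = deque([fault_node])
        while queue:
            node = queue.popleft()
            for succ, sign in edges.get(node, []):
                value = state[node] * sign
                if state.get(succ) != value:  # only follow new, consistent deviations
                    state[succ] = value
                    queue.append(succ)
        return state

    print(propagate("feed_rate", +1))  # {'feed_rate': 1, 'level': 1, 'pressure': 1}
    ```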

  17. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  18. Forward modeling of fluctuating dietary 13C signals to validate 13C turnover models of milk and milk components from a diet-switch experiment.

    Directory of Open Access Journals (Sweden)

    Alexander Braun

    Isotopic variation of food stuffs propagates through trophic systems, but this variation is dampened in each trophic step due to the buffering effects of metabolic and storage pools. Thus, understanding isotopic variation in trophic systems requires knowledge of isotopic turnover. In animals, turnover is usually quantified in diet-switch experiments under controlled conditions. Such experiments usually involve changes in diet chemical composition, which may affect turnover. Furthermore, it is uncertain if diet-switch based turnover models are applicable under conditions with randomly fluctuating dietary input signals. Here, we investigate if turnover information derived from diet-switch experiments with dairy cows can predict the isotopic composition of metabolic products (milk, milk components and feces) under natural fluctuations of dietary isotope and chemical composition. First, a diet-switch from a C3-grass/maize diet to a pure C3-grass diet was used to quantify carbon turnover in whole milk, lactose, casein, milk fat and feces. Data were analyzed with a compartmental mixed effects model, which allowed for multiple pools and intra-population variability, and included a delay between feed ingestion and first tracer appearance in outputs. The delay for milk components and whole milk was ~12 h, and that of feces ~20 h. The half-life (t½) for carbon in the feces was 9 h, while lactose, casein and milk fat had a t½ of 10, 18 and 19 h. The 13C kinetics of whole milk revealed two pools: a fast pool with a t½ of 10 h (likely representing lactose), and a slower pool with a t½ of 21 h (likely including casein and milk fat). The diet-switch based turnover information provided a precise prediction (RMSE ~0.2‰) of the natural 13C fluctuations in outputs during a 30-day period when cows ingested a pure C3 grass with naturally fluctuating isotope composition.
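
    The reported kinetics can be reproduced in miniature with a two-pool first-order model plus an ingestion delay. The half-lives and delay below follow the abstract, while the 50/50 pool split and the diet end-member values are assumed for illustration:

    ```python
    import numpy as np

    def two_pool_delta(t_hours, delta_old, delta_new, frac_fast=0.5,
                       t_half_fast=10.0, t_half_slow=21.0, delay=12.0):
        """Milk 13C signal after a diet switch: two first-order pools with a
        fixed delay between feed ingestion and first tracer appearance."""
        t = np.clip(np.asarray(t_hours, dtype=float) - delay, 0.0, None)
        k_fast = np.log(2.0) / t_half_fast
        k_slow = np.log(2.0) / t_half_slow
        remaining = (frac_fast * np.exp(-k_fast * t)
                     + (1.0 - frac_fast) * np.exp(-k_slow * t))
        return delta_new + (delta_old - delta_new) * remaining

    # e.g. switch from a C3/maize diet (~ -20 permil) to pure C3 grass (~ -28):
    print(two_pool_delta([0, 24, 72], delta_old=-20.0, delta_new=-28.0))
    ```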

  19. Statistical validation of stochastic models

    Energy Technology Data Exchange (ETDEWEB)

    Hunter, N.F. [Los Alamos National Lab., NM (United States). Engineering Science and Analysis Div.; Barney, P.; Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States). Experimental Structural Dynamics Dept.; Ferregut, C.; Perez, L. [Univ. of Texas, El Paso, TX (United States). Dept. of Civil Engineering

    1996-12-31

    It is common practice in structural dynamics to develop mathematical models for system behavior, and the authors are now capable of developing stochastic models, i.e., models whose parameters are random variables. Such models have random characteristics that are meant to simulate the randomness in characteristics of experimentally observed systems. This paper suggests a formal statistical procedure for the validation of mathematical models of stochastic systems when data taken during operation of the stochastic system are available. The statistical characteristics of the experimental system are obtained using the bootstrap, a technique for the statistical analysis of non-Gaussian data. The authors propose a procedure to determine whether or not a mathematical model is an acceptable model of a stochastic system with regard to user-specified measures of system behavior. A numerical example is presented to demonstrate the application of the technique.
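
    A minimal sketch of the proposed bootstrap step, using the usual percentile-interval construction (the data and the acceptance rule here are illustrative, not the paper's user-specified measures of system behavior):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def bootstrap_ci(samples, statistic=np.mean, n_boot=5000, alpha=0.05):
        """Percentile bootstrap confidence interval for a statistic of the
        observed (possibly non-Gaussian) response data."""
        samples = np.asarray(samples)
        stats = np.array([statistic(rng.choice(samples, size=samples.size,
                                               replace=True))
                          for _ in range(n_boot)])
        return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

    # Accept the stochastic model for this measure if its prediction falls
    # inside the bootstrap interval of the observations (illustrative data).
    observed = rng.normal(1.02, 0.05, size=30)  # e.g. measured peak responses
    low, high = bootstrap_ci(observed)
    model_prediction = 1.00
    print(low <= model_prediction <= high)
    ```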

  20. Reconceptualising the external validity of discrete choice experiments.

    Science.gov (United States)

    Lancsar, Emily; Swait, Joffre

    2014-10-01

    External validity is a crucial but under-researched topic when considering using discrete choice experiment (DCE) results to inform decision making in clinical, commercial or policy contexts. We present the theory and tests traditionally used to explore external validity that focus on a comparison of final outcomes and review how this traditional definition has been empirically tested in health economics and other sectors (such as transport, environment and marketing) in which DCE methods are applied. While an important component, we argue that the investigation of external validity should be much broader than a comparison of final outcomes. In doing so, we introduce a new and more comprehensive conceptualisation of external validity, closely linked to process validity, that moves us from the simple characterisation of a model as being or not being externally valid on the basis of predictive performance, to the concept that external validity should be an objective pursued from the initial conceptualisation and design of any DCE. We discuss how such a broader definition of external validity can be fruitfully used and suggest innovative ways in which it can be explored in practice.

  1. A discussion on validation of hydrogeological models

    International Nuclear Information System (INIS)

    Carrera, J.; Mousavi, S.F.; Usunoff, E.J.; Sanchez-Vila, X.; Galarza, G.

    1993-01-01

    Groundwater flow and solute transport are often driven by heterogeneities that elude easy identification. It is also difficult to select and describe the physico-chemical processes controlling solute behaviour. As a result, definition of a conceptual model involves numerous assumptions both on the selection of processes and on the representation of their spatial variability. Validating a numerical model by comparing its predictions with actual measurements may not be sufficient for evaluating whether or not it provides a good representation of 'reality'. Predictions will be close to measurements, regardless of model validity, if these are taken from experiments that stress well-calibrated model modes. On the other hand, predictions will be far from measurements when model parameters are very uncertain, even if the model is indeed a very good representation of the real system. Hence, we contend that 'classical' validation of hydrogeological models is not possible. Rather, models should be viewed as theories about the real system. We propose to follow a rigorous modeling approach in which different sources of uncertainty are explicitly recognized. The application of one such approach is illustrated by modeling a laboratory uranium tracer test performed on fresh granite, which was used as Test Case 1b in INTRAVAL. (author)

  2. Validation of KENO V.a: Comparison with critical experiments

    International Nuclear Information System (INIS)

    Jordan, W.C.; Landers, N.F.; Petrie, L.M.

    1986-12-01

    Section 1 of this report documents the validation of KENO V.a against 258 critical experiments. Experiments considered were primarily high or low enriched uranium systems. The results indicate that the KENO V.a Monte Carlo Criticality Program accurately calculates a broad range of critical experiments. A substantial number of the calculations showed a positive or negative bias in excess of 1 1/2% in k-effective (k/sub eff/). Classes of criticals which show a bias include 3% enriched green blocks, highly enriched uranyl fluoride slab arrays, and highly enriched uranyl nitrate arrays. If these biases are properly taken into account, the KENO V.a code can be used with confidence for the design and criticality safety analysis of uranium-containing systems. Section 2 of this report documents the results of an investigation into the cause of the bias observed in Sect. 1. The results of this study indicate that the bias seen in Sect. 1 is caused by code bias, cross-section bias, reporting bias, and modeling bias. There is evidence that many of the experiments used in this validation and in previous validations are not adequately documented. The uncertainty in the experimental parameters overshadows bias caused by the code and cross sections and prohibits code validation to better than about 1% in k/sub eff/. 48 refs., 19 figs., 19 tabs
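
    The bias bookkeeping described here can be sketched as follows, assuming each benchmark configuration is exactly critical (expected k/sub eff/ = 1) unless the experiment states otherwise; the sample values are illustrative, not the report's data:

    ```python
    import numpy as np

    def keff_bias(k_calc, k_exp=None):
        """Mean bias and scatter of calculated k-effective against critical
        benchmarks (experimental k defaults to 1.0 for each configuration)."""
        k_calc = np.asarray(k_calc, dtype=float)
        k_exp = np.ones_like(k_calc) if k_exp is None else np.asarray(k_exp)
        residual = k_calc - k_exp
        return residual.mean(), residual.std(ddof=1)

    bias, sigma = keff_bias([0.9986, 1.0043, 0.9871, 1.0152, 0.9990])
    print(f"bias = {bias:+.4f} +/- {sigma:.4f}")  # |bias| > ~0.015 would flag a class
    ```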

  3. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering], E-mail: jvargas@demec.ufpr.br; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems], Emails: ordonez@caps.fsu.edu, martins@caps.fsu.edu

    2009-07-01

    In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels, and for temperature gradients with respect to space in the flow direction, that are investigated by direct infrared imaging, showing that even at low current operation such gradients are present in fuel cell operation, and therefore should be considered by a PEMFC model, since large coolant flow rates are limited due to induced high pressure drops in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones with good qualitative and quantitative agreement. The combination of accuracy and low computational time allow for the future utilization of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)
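
    As a stand-in for the paper's physics-based model (whose equations the abstract does not reproduce), a common empirical polarization form shows the shape of the curves being compared; all parameter values are illustrative:

    ```python
    import numpy as np

    def polarization(i_ma, e0=0.98, b=0.06, r=2.0e-4, m=3.0e-5, n=8.0e-3):
        """Empirical PEMFC polarization curve (activation + ohmic + concentration
        losses): V = E0 - b*log10(i) - R*i - m*exp(n*i), with i in mA/cm^2."""
        i_ma = np.asarray(i_ma, dtype=float)
        return e0 - b * np.log10(i_ma) - r * i_ma - m * np.exp(n * i_ma)

    i = np.linspace(10.0, 1200.0, 60)  # current density [mA/cm^2]
    v = polarization(i)                # cell voltage [V]
    power = v * i / 1000.0             # power density [W/cm^2], the compared curve
    ```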

  4. Using plot experiments to test the validity of mass balance models employed to estimate soil redistribution rates from 137Cs and 210Pbex measurements

    International Nuclear Information System (INIS)

    Porto, Paolo; Walling, Des E.

    2012-01-01

    Information on rates of soil loss from agricultural land is a key requirement for assessing both on-site soil degradation and potential off-site sediment problems. Many models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution as a function of the local topography, hydrometeorology, soil type and land management, but empirical data remain essential for validating and calibrating such models and prediction procedures. Direct measurements using erosion plots are, however, costly and the results obtained relate to a small enclosed area, which may not be representative of the wider landscape. In recent years, the use of fallout radionuclides and more particularly caesium-137 (137Cs) and excess lead-210 (210Pbex) has been shown to provide a very effective means of documenting rates of soil loss and soil and sediment redistribution in the landscape. Several of the assumptions associated with the theoretical conversion models used with such measurements remain essentially unvalidated. This contribution describes the results of a measurement programme involving five experimental plots located in southern Italy, aimed at validating several of the basic assumptions commonly associated with the use of mass balance models for estimating rates of soil redistribution on cultivated land from 137Cs and 210Pbex measurements. Overall, the results confirm the general validity of these assumptions and the importance of taking account of the fate of fresh fallout. However, further work is required to validate the conversion models employed in using fallout radionuclide measurements to document soil redistribution in the landscape and this could usefully direct attention to different environments and to the validation of the final estimates of soil redistribution rate as well as the assumptions of the models employed. - Highlights: ► Soil erosion is an important threat to the long-term sustainability of agriculture.

  5. Using plot experiments to test the validity of mass balance models employed to estimate soil redistribution rates from 137Cs and 210Pb(ex) measurements.

    Science.gov (United States)

    Porto, Paolo; Walling, Des E

    2012-10-01

    Information on rates of soil loss from agricultural land is a key requirement for assessing both on-site soil degradation and potential off-site sediment problems. Many models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution as a function of the local topography, hydrometeorology, soil type and land management, but empirical data remain essential for validating and calibrating such models and prediction procedures. Direct measurements using erosion plots are, however, costly and the results obtained relate to a small enclosed area, which may not be representative of the wider landscape. In recent years, the use of fallout radionuclides, and more particularly caesium-137 (137Cs) and excess lead-210 (210Pbex), has been shown to provide a very effective means of documenting rates of soil loss and soil and sediment redistribution in the landscape. Several of the assumptions associated with the theoretical conversion models used with such measurements remain essentially unvalidated. This contribution describes the results of a measurement programme involving five experimental plots located in southern Italy, aimed at validating several of the basic assumptions commonly associated with the use of mass balance models for estimating rates of soil redistribution on cultivated land from 137Cs and 210Pbex measurements. Overall, the results confirm the general validity of these assumptions and the importance of taking account of the fate of fresh fallout. However, further work is required to validate the conversion models employed in using fallout radionuclide measurements to document soil redistribution in the landscape, and this could usefully direct attention to different environments and to the validation of the final estimates of soil redistribution rate as well as the assumptions of the models employed. Copyright © 2012 Elsevier Ltd. All rights reserved.
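
    For readers unfamiliar with such conversion models, the sketch below implements the simplest of them, the proportional model, which the mass balance models tested here refine. The parameter values (bulk density, plough depth, time base) are illustrative assumptions only, not values from the study.

```python
def proportional_model(inventory, reference_inventory, bulk_density=1300.0,
                       plough_depth=0.2, years=50.0, particle_correction=1.0):
    """Proportional model: mean annual soil loss Y (t ha^-1 yr^-1) from the
    percentage reduction X of the 137Cs inventory relative to a reference
    (undisturbed) site. bulk_density in kg m^-3, plough_depth in m,
    years = time since the onset of 137Cs fallout accumulation."""
    x = 100.0 * (reference_inventory - inventory) / reference_inventory
    # B*d is a mass depth in kg m^-2; the factor 10 converts it to t ha^-1,
    # and the loss is spread evenly over `years`.
    return 10.0 * bulk_density * plough_depth * x / (100.0 * years * particle_correction)

# Example: measured inventory 1600 Bq m^-2 against a reference of 2000 Bq m^-2.
print(f"{proportional_model(1600.0, 2000.0):.1f} t/ha/yr")
```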

  6. Validation of Neutron Calculation Codes and Models by means of benchmark cases in the frame of the Binational Commission of Nuclear Energy. Criticality Experiments

    International Nuclear Information System (INIS)

    Dos Santos, Adimir; Siqueira, Paulo de Tarso D.; Andrade e Silva, Graciete Simões; Grant, Carlos; Tarazaga, Ariel E.; Barberis, Claudia

    2013-01-01

    In 2008, the National Atomic Energy Commission (CNEA) of Argentina and the Brazilian Nuclear and Energy Research Institute (IPEN), under the frame of the Argentine-Brazilian Nuclear Energy Agreement (COBEN), included, among many others, the project “Validation and Verification of Calculation Methods Used for Research and Experimental Reactors”. It was established that the validation would be performed with models implemented in the deterministic codes HUEMUL and PUMA (cell and reactor codes) developed by CNEA, and with those implemented in MCNP by CNEA and IPEN. The data needed for these validations would come from theoretical-experimental reference cases in the IPEN/MB-01 research reactor located in São Paulo, Brazil. On the Argentine side, the staff of the Reactor and Nuclear Power Studies group (SERC) of CNEA performed calculations with deterministic models (HUEMUL-PUMA) and probabilistic methods (MCNP), modeling a great number of physical situations of the reactor that had previously been studied and modeled by members of the Center of Nuclear Engineering of IPEN, whose results were provided in full to CNEA. In this paper, results for critical configurations are shown. (author)

  7. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca

    2012-01-01

    understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present...... of biovariability; 2) testing and developing robust techniques and tools as a prerequisite to conducting physiological investigations; 3) defining and adopting standards to facilitate the interoperability of experiments, models, and simulations; 4) and understanding physiological validation as an iterative process...... that contributes to defining the specific aspects of cardiac electrophysiology the MSE system targets, rather than being only an external test, and that this is driven by advances in experimental and computational methods and the combination of both....

  8. Natural analogues and radionuclide transport model validation

    International Nuclear Information System (INIS)

    Lever, D.A.

    1987-08-01

    In this paper, some possible roles for natural analogues are discussed from the point of view of those involved with the development of mathematical models for radionuclide transport and with the use of these models in repository safety assessments. The characteristic features of a safety assessment are outlined in order to address the questions of where natural analogues can be used to improve our understanding of the processes involved and where they can assist in validating the models that are used. Natural analogues have the potential to provide useful information about some critical processes, especially long-term chemical processes and migration rates. There is likely to be considerable uncertainty and ambiguity associated with the interpretation of natural analogues, and thus it is their general features which should be emphasized, and models with appropriate levels of sophistication should be used. Experience gained in modelling the Koongarra uranium deposit in northern Australia is drawn upon. (author)

  9. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty, and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance on assessing the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
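
    A minimal sketch of the sampling approach mentioned for variability and randomness: draw inputs from assumed distributions, push them through a forward model, and summarize the spread of the output. The forward model and the distributions below are invented placeholders for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def model(e_modulus, load):
    """Placeholder forward model: a deflection-like response, load/stiffness.
    A real application would call the simulation code of interest here."""
    return load / e_modulus

# Aleatoric inputs described by (assumed) probability distributions.
e_samples = rng.normal(loc=200e9, scale=10e9, size=100_000)               # Pa
load_samples = rng.lognormal(mean=np.log(1e4), sigma=0.1, size=100_000)  # N

# Monte Carlo propagation: evaluate the model on every sampled input pair.
response = model(e_samples, load_samples)
print(f"mean = {response.mean():.3e}, 95% interval = "
      f"[{np.percentile(response, 2.5):.3e}, {np.percentile(response, 97.5):.3e}]")
```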

  10. PSI-Center Simulations of Validation Platform Experiments

    Science.gov (United States)

    Nelson, B. A.; Akcay, C.; Glasser, A. H.; Hansen, C. J.; Jarboe, T. R.; Marklin, G. J.; Milroy, R. D.; Morgan, K. D.; Norgaard, P. C.; Shumlak, U.; Victor, B. S.; Sovinec, C. R.; O'Bryan, J. B.; Held, E. D.; Ji, J.-Y.; Lukin, V. S.

    2013-10-01

    The Plasma Science and Innovation Center (PSI-Center - http://www.psicenter.org) supports collaborating validation platform experiments with extended MHD simulations. Collaborators include the Bellan Plasma Group (Caltech), CTH (Auburn U), FRX-L (Los Alamos National Laboratory), HIT-SI (U Wash - UW), LTX (PPPL), MAST (Culham), Pegasus (U Wisc-Madison), PHD/ELF (UW/MSNW), SSX (Swarthmore College), TCSU (UW), and ZaP/ZaP-HD (UW). Modifications have been made to the NIMROD, HiFi, and PSI-Tet codes to specifically model these experiments, including mesh generation/refinement, non-local closures, appropriate boundary conditions (external fields, insulating BCs, etc.), and kinetic and neutral particle interactions. The PSI-Center is exploring application of validation metrics between experimental data and simulations results. Biorthogonal decomposition is proving to be a powerful method to compare global temporal and spatial structures for validation. Results from these simulation and validation studies, as well as an overview of the PSI-Center status will be presented.
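
    Biorthogonal decomposition of a space-time data matrix is, in practice, a singular value decomposition into paired spatial ("topos") and temporal ("chronos") modes; a minimal sketch, with synthetic data standing in for probe arrays or simulation output, might look like this:

```python
import numpy as np

# Space-time data matrix: rows = spatial probe locations, columns = time samples.
rng = np.random.default_rng(1)
data = rng.standard_normal((64, 2000))   # synthetic stand-in

# Biorthogonal decomposition via SVD:
# data ~ sum_k s_k * topos_k (spatial mode) outer chronos_k (temporal mode).
topos, svals, chronos = np.linalg.svd(data, full_matrices=False)

energy = svals**2 / np.sum(svals**2)
print("energy fraction of the first 3 modes:", energy[:3])

# One simple validation metric: the absolute inner product between the leading
# experimental and simulated topos vectors (1 = identical spatial structure).
```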

  11. Geochemistry Model Validation Report: External Accumulation Model

    Energy Technology Data Exchange (ETDEWEB)

    K. Zarrabi

    2001-09-27

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high-pH, actinide-laden source term with resident water, which lowers the pH sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation

  12. Geochemistry Model Validation Report: External Accumulation Model

    International Nuclear Information System (INIS)

    Zarrabi, K.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high-pH, actinide-laden source term with resident water, which lowers the pH sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation

  13. Site characterization and validation - Tracer migration experiment in the validation drift, report 2, part 1: performed experiments, results and evaluation

    International Nuclear Information System (INIS)

    Birgersson, L.; Widen, H.; Aagren, T.; Neretnieks, I.; Moreno, L.

    1992-01-01

    This report is the second of two reports describing the tracer migration experiment, in which water and tracer flow were monitored in a drift at the 385 m level in the Stripa experimental mine. The tracer migration experiment is one of a large number of experiments performed within the Site Characterization and Validation (SCV) project. The upper part of the 50 m long validation drift was covered with approximately 150 plastic sheets, in which the emerging water was collected. The water emerging into the lower part of the drift was collected in short boreholes (sumpholes). Six different tracer mixtures were injected at distances between 10 and 25 m from the drift. The flowrate and tracer monitoring continued for ten months. Tracer breakthrough curves and flowrate distributions were used to study flow paths, velocities, hydraulic conductivities, dispersivities, interaction with the rock matrix and channelling effects within the rock. The present report describes the structure of the observations, the flowrate measurements and the estimated hydraulic conductivities. The main part of this report addresses the interpretation of the tracer movement in fractured rock. The tracer movement, as measured by the more than 150 individual tracer curves, has been analysed with the traditional advection-dispersion model, and a subset of the curves with the advection-dispersion-diffusion model. The tracer experiments have permitted the flow porosity, dispersion and interaction with the rock matrix to be studied. (57 refs.)
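
    The advection-dispersion analysis of a single breakthrough curve can be illustrated with the classical Ogata-Banks solution of the 1-D advection-dispersion equation. The sketch below is generic; the velocity and dispersion values are placeholders, not Stripa results.

```python
import numpy as np
from scipy.special import erfc

def breakthrough(t, x, v, d):
    """Ogata-Banks solution: relative concentration C/C0 at distance x (m)
    and time t (s) for advection velocity v (m/s) and dispersion
    coefficient d (m^2/s), for a continuous injection at x = 0."""
    t = np.asarray(t, dtype=float)
    a = erfc((x - v * t) / (2.0 * np.sqrt(d * t)))
    b = np.exp(v * x / d) * erfc((x + v * t) / (2.0 * np.sqrt(d * t)))
    return 0.5 * (a + b)

# Illustrative values only: 10 m travel distance, order-of-magnitude v and d.
t = np.linspace(1.0, 3.0e6, 500)          # seconds (~1 month)
c = breakthrough(t, x=10.0, v=1e-5, d=1e-4)
```

    Fitting v and d of this curve to a measured one (e.g., by least squares) is the basic step behind the velocity and dispersivity estimates mentioned above.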

  14. Intercenter validation of a knowledge based model for automated planning of volumetric modulated arc therapy for prostate cancer. The experience of the German RapidPlan Consortium.

    Directory of Open Access Journals (Sweden)

    Carolin Schubert

    Full Text Available To evaluate the performance of a model-based optimisation process for volumetric modulated arc therapy applied to prostate cancer in a multicentric cooperative group. The RapidPlan (RP) knowledge-based engine was tested for the planning of volumetric modulated arc therapy with RapidArc on prostate cancer patients. The study was conducted in the frame of the German RapidPlan Consortium (GRC). 43 patients from one institute of the GRC were used to build and train an RP model. This was then shared with all members of the GRC plus an external site from a different country to increase the heterogeneity of the patient sampling. An in silico multicentric validation of the model was performed at the planning level by comparing RP against reference plans optimized according to institutional procedures. A total of 60 patients from 7 institutes were used. On average, the automated RP-based plans were fully consistent with the manually optimised set, with a modest tendency towards improvement in the medium-to-high dose region. A per-site stratification identified different patterns of performance of the model, with some organs at risk better spared by the manual approach and others by the automated approach, but in all cases the RP data fulfilled the clinical acceptability requirements. Discrepancies in performance were due to different contouring protocols or to different emphasis placed on the optimization of the manual cases. The multicentric validation demonstrated that patients from all participating centres could be satisfactorily optimized with the knowledge-based model. In the presence of possibly significant differences in the contouring protocols, the automated plans, though acceptable and fulfilling the benchmark goals, might benefit from further fine tuning of the constraints. The study demonstrates that, at least for prostate cancer patients, it is possible to share models among different clinical institutes in a cooperative group.

  15. Patient Experiences with the Preoperative Assessment Clinic (PEPAC): validation of an instrument to measure patient experiences

    NARCIS (Netherlands)

    Edward, G. M.; Lemaire, L. C.; Preckel, B.; Oort, F. J.; Bucx, M. J. L.; Hollmann, M. W.; de Haes, J. C. J. M.

    2007-01-01

    Background. Presently, no comprehensive and validated questionnaire to measure patient experiences of the preoperative assessment clinic (PAC) is available. We developed and validated the Patient Experiences with the Preoperative Assessment Clinic (PEPAC) questionnaire, which can be used for

  16. Tracing Crop Nitrogen Dynamics on the Field-Scale by Combining Multisensoral EO Data with an Integrated Process Model- A Validation Experiment for Cereals in Southern Germany

    Science.gov (United States)

    Hank, Tobias B.; Bach, Heike; Danner, Martin; Hodrius, Martina; Mauser, Wolfram

    2016-08-01

    Nitrogen, being the basic element for the construction of plant proteins and pigments, is one of the most important production factors in agricultural cultivation. High-resolution, near real-time information on the nitrogen status of the soil is therefore of great interest for economically and ecologically optimized fertilizer planning and application. Unfortunately, nitrogen storage in the soil column cannot be observed directly with Earth Observation (EO) instruments. Advanced EO-supported process modelling approaches must therefore be applied that allow tracing the spatiotemporal dynamics of nitrogen transformation, translocation and transport in the soil and in the canopy. Before these models can be applied as decision support tools for smart farming, they must be carefully parameterized and validated. This study applies an advanced land surface process model (PROMET) to selected winter cereal fields in Southern Germany and correlates the model outputs with destructively sampled nitrogen data from the growing season of 2015 (17 sampling dates, 8 sample locations). The spatial parametrization of the process model is supported by assimilating eight satellite images (5 Landsat 8 OLI and 3 RapidEye scenes). The model was found to be capable of realistically tracing the temporal and spatial dynamics of aboveground nitrogen uptake and allocation (R2 = 0.84, RMSE = 31.3 kg ha-1).
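
    The two reported goodness-of-fit measures are straightforward to reproduce. A minimal sketch, using invented sample values rather than the study's data:

```python
import numpy as np

def r2_rmse(observed, predicted):
    """Coefficient of determination and root-mean-square error between
    field-sampled and modelled values."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot, np.sqrt(np.mean((observed - predicted) ** 2))

# Hypothetical aboveground N uptake samples (kg ha^-1) vs model output.
obs = np.array([40.0, 75.0, 110.0, 150.0, 190.0])
mod = np.array([48.0, 70.0, 120.0, 145.0, 200.0])
r2, rmse = r2_rmse(obs, mod)
print(f"R2 = {r2:.2f}, RMSE = {rmse:.1f} kg/ha")
```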

  17. Steam gasification of wood biomass in a fluidized biocatalytic system bed gasifier: A model development and validation using experiment and Boubaker Polynomials Expansion Scheme BPES

    Directory of Open Access Journals (Sweden)

    Luigi Vecchione

    2015-07-01

    Full Text Available One of the most important issues in biomass biocatalytic gasification is the correct prediction of gasification products, with particular attention to the Topping Atmosphere Residues (TARs). In this work, performed within the European 7FP UNIfHY project, we develop and experimentally validate a model that is able to predict the outputs, including TARs, of a steam-fluidized bed biomass gasifier. Pine wood was chosen as the biomass feedstock; the products obtained in pyrolysis tests are the relevant model input. Hydrodynamic and chemical properties of the reacting system are considered: the hydrodynamic approach is based on the two-phase theory of fluidization, while the chemical model is based on the kinetic equations for the heterogeneous and homogeneous reactions. The derived differential equations for the gasifier at steady state were implemented in MATLAB. The solution was then carried out using the Boubaker Polynomials Expansion Scheme, varying the steam/biomass ratio (0.5-1) and the operating temperature (750-850°C). The comparison between model and experimental results showed that the model is able to predict gas mole fractions and production rates, including most of the representative TAR compounds.

  18. The Grimsel radionuclide migration experiment - a contribution to raising confidence in the validity of solute transport models used in performance assessment

    International Nuclear Information System (INIS)

    Frick, U.

    1995-01-01

    The safety assessment of radioactive waste repositories must provide confidence that the predictive models utilized are applicable to the specific repository systems. Nagra has carried out radionuclide migration experiments at the Grimsel underground test site (Switzerland) to test currently used methodologies, databases, conceptual approaches and codes for modeling radionuclide transport through fractured host rocks. Specific objectives included: identifying the relevant transport processes, testing the extrapolation of laboratory sorption data to field conditions, and demonstrating the applicability of currently used methodology for conceptualizing and building realistic transport models. Field tests and transport modeling work are complemented by an extensive laboratory program. The field experimental activities focused predominantly on establishing appropriate conditions for identifying relevant transport mechanisms on the scale of a few meters, aiming at full recovery of injected tracers, simple geometry and long-term stability of the induced dipole flow fields. A relatively simple homogeneous, dual-porosity advection/diffusion model was built with input from a state-of-the-art petrographic characterisation of the water-conducting feature. It was possible to calibrate the model from conservative tracer breakthrough curves. (J.S.). 21 refs., 14 figs., 4 tabs

  19. Validation of wind loading codes by experiments

    NARCIS (Netherlands)

    Geurts, C.P.W.

    1998-01-01

    Between 1994 and 1997, full scale measurements of the wind and wind induced pressures were carried out on the main building of Eindhoven University of Technology. Simultaneously, a comparative wind tunnel experiment was performed in an atmospheric boundary layer wind tunnel. In this paper, the

  20. Changes and Issues in the Validation of Experience

    Science.gov (United States)

    Triby, Emmanuel

    2005-01-01

    This article analyses the main changes in the rules for validating experience in France and of what they mean for society. It goes on to consider university validation practices. The way in which this system is evolving offers a chance to identify the issues involved for the economy and for society, with particular attention to the expected…

  1. Gamma streaming experiments for validation of Monte Carlo code

    International Nuclear Information System (INIS)

    Thilagam, L.; Mohapatra, D.K.; Subbaiah, K.V.; Iliyas Lone, M.; Balasubramaniyan, V.

    2012-01-01

    Inhomogeneities in shield structures lead to a considerable amount of leakage radiation (streaming), increasing the radiation levels in accessible areas. Development work on experimental as well as computational methods for quantifying this streaming radiation is still continuing. The Monte Carlo based radiation transport code MCNP is the usual tool for modeling and analyzing such problems involving complex geometries. In order to validate this computational method for streaming analysis, it is necessary to carry out experimental measurements that simulate inhomogeneities, such as the ducts and voids present in bulk shields, for typical cases. The data thus generated are analysed by simulating the experimental set-up with the MCNP code, and optimized input parameters for applying the code to similar radiation streaming problems are formulated. Comparison of experimental data obtained from radiation streaming experiments through ducts yields a set of thumb rules and analytical fits for the total radiation dose rates within and outside the duct. The present study highlights the validation of the MCNP code through gamma streaming experiments carried out with ducts of various shapes and dimensions. Overall, the present study demonstrates the suitability of the MCNP code for the analysis of gamma radiation streaming problems for all duct configurations considered. In the present study, only dose rate comparisons have been made; studies on the spectral comparison of streaming radiation are in progress, and it is planned to repeat the experiments with various shield materials. Since penetrations and ducts through bulk shields are unavoidable in an operating nuclear facility, the results of this kind of radiation streaming simulation and experiment will be very useful for shield structure optimization without compromising radiation safety.

  2. Validation of systems biology models

    NARCIS (Netherlands)

    Hasdemir, D.

    2015-01-01

    The paradigm shift from qualitative to quantitative analysis of biological systems brought a substantial number of modeling approaches to the stage of molecular biology research. These include but certainly are not limited to nonlinear kinetic models, static network models and models obtained by the

  3. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  4. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling gives a syntactic structure to source and target models; however, semantic requirements also have to be imposed on them. A given transformation is sound when the source and target models fulfill both the syntactic and the semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of the models (pre- and post-conditions) to properties of the transformation itself (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  5. Debris flows: Experiments and modelling

    Science.gov (United States)

    Turnbull, Barbara; Bowman, Elisabeth T.; McElwaine, Jim N.

    2015-01-01

    Debris flows and debris avalanches are complex, gravity-driven currents of rock, water and sediments that can be highly mobile. This combination of component materials leads to a rich morphology and unusual dynamics, exhibiting features of both granular materials and viscous gravity currents. Although extreme events such as those at Kolka Karmadon in North Ossetia (2002) [1] and Huascarán (1970) [2] strongly motivate us to understand how such high levels of mobility can occur, smaller events are ubiquitous and capable of endangering infrastructure and life, requiring mitigation. Recent progress in modelling debris flows has seen the development of multiphase models that can start to provide clues of the origins of the unique phenomenology of debris flows. However, the spatial and temporal variations that debris flows exhibit make this task challenging and laboratory experiments, where boundary and initial conditions can be controlled and reproduced, are crucial both to validate models and to inspire new modelling approaches. This paper discusses recent laboratory experiments on debris flows and the state of the art in numerical models.

  6. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    Department of Mechanical Engineering, Imperial College of Science, ... If we are unable to obtain a satisfactory degree of correlation between the initial theoretical model and the test data, then it is extremely unlikely that any form of model updating (correcting the model to match the test data) will succeed. Thus, a successful ...

  7. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models.
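
    A model validation test differs from an ordinary unit test in that it asserts agreement with experimental data rather than mere error-free execution. A minimal sketch in Python's unittest style, with a stand-in simulation function and hypothetical experimental bounds:

```python
import unittest

def simulate_membrane_potential():
    """Stand-in for a neuron model run; returns resting potential in mV.
    In a real project this would call the simulation code under test."""
    return -64.8

class TestRestingPotential(unittest.TestCase):
    """Model validation test: the simulated value must fall within an
    experimentally reported range for the cell type being modeled."""

    def test_resting_potential_within_experimental_range(self):
        v_rest = simulate_membrane_potential()
        # Hypothetical experimental bounds (mV) for illustration only.
        self.assertGreaterEqual(v_rest, -75.0)
        self.assertLessEqual(v_rest, -55.0)

if __name__ == "__main__":
    unittest.main()
```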

  8. The concept of validation of numerical models for consequence analysis

    International Nuclear Information System (INIS)

    Borg, Audun; Paulsen Husted, Bjarne; Njå, Ove

    2014-01-01

    Numerical models such as computational fluid dynamics (CFD) models are increasingly used in life safety studies and other types of analyses to calculate the effects of fire and explosions. The validity of these models is usually established by benchmark testing. This is done to quantitatively measure the agreement between the predictions provided by the model and the real world represented by observations in experiments. This approach assumes that all variables in the real world relevant for the specific study are adequately measured in the experiments and in the predictions made by the model. In this paper the various definitions of validation for CFD models used for hazard prediction are investigated to assess their implication for consequence analysis in a design phase. In other words, how is uncertainty in the prediction of future events reflected in the validation process? The sources of uncertainty are viewed from the perspective of the safety engineer. An example of the use of a CFD model is included to illustrate the assumptions the analyst must make and how these affect the prediction made by the model. The assessments presented in this paper are based on a review of standards and best practice guides for CFD modeling and the documentation from two existing CFD programs. Our main thrust has been to assess how validation work is performed and communicated in practice. We conclude that the concept of validation adopted for numerical models is adequate in terms of model performance. However, it does not address the main sources of uncertainty from the perspective of the safety engineer. Uncertainty in the input quantities describing future events, which are determined by the model user, outweighs the inaccuracies in the model as reported in validation studies. - Highlights: • Examine the basic concept of validation applied to models for consequence analysis. • Review standards and guides for validation of numerical models. • Comparison of the validation

  9. Simulation - modeling - experiment

    International Nuclear Information System (INIS)

    2004-01-01

    After two workshops held in 2001 on the same topics, and in order to take stock of the advances in the domain of simulation and measurement, the main goals proposed for this workshop are: the presentation of the state of the art of tools, methods and experiments in the domains of interest of the Gedepeon research group, and the exchange of information about the possible uses of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experimental needs. This document gathers 18 of the 19 presentations (slides) given at this workshop, dealing with: deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); status of the MCNP/TRIO-U neutronics/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); perturbation methods and sensitivity analysis of nuclear data in reactor physics, with application to the VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, compared with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high-energy transport codes: status and perspectives (Leray S.); other avenues of investigation for spallation (Audoin L.); neutron and light particle production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  10. Simulations of Validation Platform Experiments by the PSI-Center

    Science.gov (United States)

    Nelson, B. A.; Akcay, C.; Glasser, A. H.; Hansen, C. J.; Jarboe, T. R.; Kim, C. C.; Marklin, G. J.; Milroy, R. D.; Shumlak, U.; Sovinec, C. R.; O'Bryan, J. B.; Held, E.; Ji, J.-Y.; Lukin, V. S.

    2012-10-01

    The Plasma Science and Innovation Center (PSI-Center - http://www.psicenter.org) assists collaborating validation platform experiments with extended MHD simulations. Collaborators include the Bellan Plasma Group (Caltech), CTH (Auburn U), FRX-L (Los Alamos National Laboratory), HIT-SI (U Wash - UW), LDX (M.I.T.), MST & Pegasus (U Wisc-Madison), PHD (UW), PFRC (PPPL), SSX (Swarthmore College), TCS (UW), and ZaP (UW). Modifications have been made to the NIMROD, HiFi, and PSI-Tet codes to specifically model these experiments, including mesh generation/refinement, appropriate boundary conditions (external fields, insulating BCs, etc.), and kinetic and neutral particle interactions. The PSI-Center is planning to add neutrals to NIMROD. When implemented in NIMROD, these results will be compared to the neutral particle physics in the 2D version of HiFi. Coaxial helicity injection BCs will be specified in HiFi to simulate the Caltech co-planar experiment, for verification with previous and ongoing NIMROD simulations. Results from these simulations, as well as an overview of the PSI-Center status will be presented.

  11. Base Flow Model Validation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  12. Validation of the Danish language Injustice Experience Questionnaire

    DEFF Research Database (Denmark)

    la Cour, Peter; Schultz, Rikke; Smith, Anne Agerskov

    2017-01-01

    The Injustice Experience Questionnaire has shown promising ability to predict problematic rehabilitation in pain conditions, especially concerning work status. A Danish language version of the Injustice Experience Questionnaire was developed and completed by 358 patients with long-lasting pain/somatoform symptoms. These patients also completed questionnaires concerning sociodemographics, anxiety and depression, subjective well-being, and overall physical and mental functioning. Our results showed satisfactory interpretability and face validity, and high internal consistency (Cronbach's alpha = .90). The original one-factor structure was confirmed, but subscales should be interpreted cautiously. The Danish version of the Injustice Experience Questionnaire is found to be valid and reliable.

  13. Validation of the Danish language Injustice Experience Questionnaire.

    Science.gov (United States)

    la Cour, Peter; Smith, Anne Agerskov; Schultz, Rikke

    2017-06-01

    The Injustice Experience Questionnaire has shown promising ability to predict problematic rehabilitation in pain conditions, especially concerning work status. A Danish language version of the Injustice Experience Questionnaire was developed and completed by 358 patients with long-lasting pain/somatoform symptoms. These patients also completed questionnaires concerning sociodemographics, anxiety and depression, subjective well-being, and overall physical and mental functioning. Our results showed satisfactory interpretability and face validity, and high internal consistency (Cronbach's alpha = .90). The original one-factor structure was confirmed, but subscales should be interpreted cautiously. The Danish version of the Injustice Experience Questionnaire is found to be valid and reliable.
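
    Cronbach's alpha, the internal-consistency statistic reported here, is easy to compute from an item-score matrix. A minimal sketch with toy data (not the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Toy data: 6 respondents x 4 items on a 1-5 scale.
scores = np.array([[4, 4, 5, 4],
                   [2, 3, 2, 2],
                   [5, 5, 4, 5],
                   [3, 3, 3, 4],
                   [1, 2, 1, 2],
                   [4, 5, 4, 4]])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```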

  14. Validating EHR clinical models using ontology patterns.

    Science.gov (United States)

    Martínez-Costa, Catalina; Schulz, Stefan

    2017-12-01

    Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation. Copyright © 2017 Elsevier Inc. All rights reserved.
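
    A minimal sketch of the workflow described, assuming the rdflib and pySHACL Python libraries; the shape and data below are invented examples, not CIMI models:

```python
from rdflib import Graph
from pyshacl import validate

# A minimal shape: every instance of ex:BloodPressureObservation must carry
# exactly one ex:systolic value typed xsd:decimal -- a closed-world check
# that OWL reasoning alone would not flag.
shapes_ttl = """
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix ex:  <http://example.org/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
ex:BPShape a sh:NodeShape ;
    sh:targetClass ex:BloodPressureObservation ;
    sh:property [ sh:path ex:systolic ;
                  sh:datatype xsd:decimal ;
                  sh:minCount 1 ; sh:maxCount 1 ] .
"""

data_ttl = """
@prefix ex: <http://example.org/> .
ex:obs1 a ex:BloodPressureObservation .   # missing ex:systolic -> violation
"""

shapes = Graph().parse(data=shapes_ttl, format="turtle")
data = Graph().parse(data=data_ttl, format="turtle")
conforms, _report_graph, report_text = validate(data, shacl_graph=shapes)
print(conforms)       # False: the data graph violates the shape
print(report_text)    # human-readable validation report
```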

  15. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal models. A survey of current practices and techniques was undertaken and evaluated against these criteria, and the items most relevant to waste disposal models were identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  16. Evidential Model Validation under Epistemic Uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Deng

    2018-01-01

    Full Text Available This paper proposes evidence-theory-based methods both to quantify epistemic uncertainty and to validate computational models. Three types of epistemic uncertainty concerning input model data are considered: sparse points, intervals, and probability distributions with uncertain parameters. Through the proposed methods, the given data are described as corresponding probability distributions for uncertainty propagation in the computational model, and thus for model validation. The proposed evidential model validation method is inspired by the idea of Bayesian hypothesis testing and the Bayes factor, which compares the model predictions with the observed experimental data so as to assess the predictive capability of the model and support the decision of model acceptance. Building on the idea of the Bayes factor, the frame of discernment of Dempster-Shafer evidence theory is constituted and the basic probability assignment (BPA) is determined. Because the proposed validation method is evidence based, the robustness of the result can be guaranteed, and the hypothesis best supported by the evidence will be favored by the BPA. The validity of the proposed methods is illustrated through a numerical example.
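
    A minimal sketch of the Bayes-factor step, with invented data and one simple (assumed) choice for turning the factor into a basic probability assignment over the frame {valid, invalid}; this is an illustration of the general idea, not the paper's exact construction:

```python
import numpy as np
from scipy import stats

# Observed experimental data and the model prediction at the same settings.
observed = np.array([9.8, 10.4, 10.1, 9.6, 10.2])
model_mean, meas_sigma = 10.0, 0.4          # prediction and assumed noise

# Bayes-factor-style comparison: likelihood of the data under the model's
# prediction vs under a rival "biased model" hypothesis.
like_model = np.prod(stats.norm.pdf(observed, loc=model_mean, scale=meas_sigma))
like_rival = np.prod(stats.norm.pdf(observed, loc=10.8, scale=meas_sigma))
bayes_factor = like_model / like_rival

# Map the evidence onto {valid, invalid} as a BPA (one simple normalization).
m_valid = bayes_factor / (1.0 + bayes_factor)
m_invalid = 1.0 - m_valid
print(f"B = {bayes_factor:.2f}, m(valid) = {m_valid:.2f}, m(invalid) = {m_invalid:.2f}")
```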

  17. Validation of the Paediatric Hearing Impairment Caregiver Experience (PHICE) Questionnaire.

    Science.gov (United States)

    Lim, Lynne H Y; Xiang, Ling; Wong, Naomi L Y; Yuen, Kevin C P; Li, Ruijie

    2014-07-01

    The paediatric hearing impairment caregiver experience (PHICE) questionnaire is a 68-item instrument that assesses the stress experienced by caregivers of children with hearing impairment (HI). While the questionnaire has been validated in the United States, it may need to be modified for use in the Singapore context due to the differing healthcare system, costs and culture related to caregiving for children with HI. This study aims to modify and validate the PHICE questionnaire to increase its relevance and ease of use in Singapore. The original PHICE questionnaire was filled out by 127 caregivers of HI children managed at the otolaryngology clinic of the National University Hospital (NUH). An expert panel was convened to assess the questionnaire for its suitability for use in Singapore. Exploratory factor analysis was conducted to evaluate the underlying factor structure of the original PHICE questionnaire. Items with high cross-loadings were removed and a new factor structure was adopted, which was further analysed using confirmatory factor analysis (CFA). Cronbach's alpha (α) was computed to determine the internal consistency of the new subscales. Items that are less relevant in Singapore and those with high cross-loadings were removed. A 5-factor structure with only 42 items remaining, corresponding to the factors "Policy", "Healthcare", "Education", "Support" and "Adaptation", was adopted. CFA suggests a good model fit for the modified questionnaire, improved from the 8-factor structure of the original PHICE. Cronbach's α was high (>0.7) for each new subscale. The original PHICE questionnaire has been shortened and reorganised in terms of subscale composition. The resulting instrument is structurally valid and internally consistent. It is a simple and useful tool for identifying factors related to caregiving that can negatively impact rehabilitation outcomes for children with HI in Singapore. Removal of some sign language items makes this

  18. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    of refining the theoretical model which will be used for the design optimisation process. There are many different names given to the tasks involved in this refinement. .... slightly from the ideal line but in a systematic rather than a random fashion as this situation suggests that there is a specific characteristic responsible for the ...

  19. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen...... matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade...

  20. Operational experience and validation of the Triathler TM Model 425-034(1) single vial liquid scintillation counter for meeting Department of Energy release criteria

    International Nuclear Information System (INIS)

    Kanady, R. W.

    2008-01-01

    Triathler TM Model 425-034 single vial liquid scintillation counters (LSC) have been in use at the Safety and Tritium Applied Research Facility (STAR) for approximately three years. During facility setup and the determination of instrumentation needs to support STAR facility operations, the Triathler was chosen to assess smearable tritium contamination levels under operational conditions. The Triathler was selected for its rapid turnaround time in obtaining tritium contamination levels compared with the other automated batch LSC counters currently in use at the Idaho National Laboratory (INL) and other Dept. of Energy (DOE) installations. Operational experience with the Triathler thus far has shown high reliability for verifying removable contamination levels at a level of 2 when compared to the Packard TM Tri-Carb 1905 AB/LA Liquid Scintillation Analyzer used by the Reactor Technologies Complex (RTC) Radiochemistry Measurements Laboratory (RML). However, variances in the reported activity results in DPM/vial from the Triathler versus the Packard Tri-Carb have been noted when operating in the range of 5,000 to 20,000 DPM. These variances make the reliability and use of the Triathler suspect for verifying that smearable contamination levels meet the release criteria identified in DOE Order 5400.5, Radiation Protection of the Public and Environment. Ensuring that removable tritium contamination levels on materials and equipment intended for free release to the public are 2 is a requirement of the Idaho National Laboratory (INL) contract. Comprehensive cross-comparisons have been ongoing to ensure that the DPM values reported by the Triathler LSC provide sufficient detection of smearable tritium contamination when cross-compared to other automated liquid scintillation counters available at the INL. (authors)

  1. On validation of multibody musculoskeletal models

    DEFF Research Database (Denmark)

    Lund, Morten Enemark; de Zee, Mark; Andersen, Michael Skipper

    2012-01-01

    This paper reviews the opportunities to validate multibody musculoskeletal models in view of the current transition of musculoskeletal modelling from a research topic to a practical simulation tool in product design, healthcare and other important applications. This transition creates a new need...

  2. Validation of models with proportional bias

    Directory of Open Access Journals (Sweden)

    Salvador Medina-Peralta

    2017-01-01

    Full Text Available Objective. This paper presents extensions to Freese's statistical method for model validation when proportional bias (PB) is present in the predictions. The method is illustrated with data from a model that simulates grassland growth. Materials and methods. The extensions to validate models with PB were: the maximum anticipated error for the original proposal, hypothesis testing and the maximum anticipated error for the alternative proposal, and the confidence interval for a quantile of the error distribution. Results. The tested model had PB; once it was removed, and with a confidence level of 95%, the magnitude of the error does not surpass 1225.564 kg ha-1. The validated model can therefore be used to predict grassland growth, although its structure would require adjustment to account for the presence of PB. Conclusions. The extensions presented to validate models with PB are applied without modification of the model structure. Once PB is corrected, the confidence interval for the quantile 1-α of the error distribution provides an upper bound for the magnitude of the prediction error and can be used to evaluate the evolution of the model for a system prediction.
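
    A minimal sketch of the general idea, not Freese's exact statistic: estimate the proportional bias with a no-intercept regression, remove it, and bound the magnitude of the remaining error at the 95% level. All numbers are invented.

```python
import numpy as np

observed = np.array([210.0, 480.0, 950.0, 1400.0, 2100.0])    # e.g. kg ha^-1
predicted = np.array([250.0, 560.0, 1120.0, 1610.0, 2480.0])

# Proportional bias: fit predicted = b * observed through the origin.
b = np.sum(predicted * observed) / np.sum(observed ** 2)
corrected = predicted / b                 # remove the proportional bias

# Normal-theory bound on |error| after PB removal (assumes zero-mean,
# approximately normal residuals), at confidence 1 - alpha = 0.95.
errors = observed - corrected
bound = 1.96 * np.sqrt(np.mean(errors ** 2))
print(f"b = {b:.3f}, 95% error-magnitude bound = {bound:.1f} kg/ha")
```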

  3. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    Full Text Available In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques for model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model, using an example taken from a management study.
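
    As a generic illustration of validating a logistic regression model on held-out data (not the paper's specific methodology), using scikit-learn with synthetic data standing in for a management data set:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss
from sklearn.model_selection import train_test_split

# Synthetic stand-in data; a real study would load its own observations.
X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
p = model.predict_proba(X_te)[:, 1]

# Two complementary performance measures: discrimination (AUC) and
# calibration (Brier score), both evaluated on held-out data.
print(f"AUC = {roc_auc_score(y_te, p):.3f}")
print(f"Brier score = {brier_score_loss(y_te, p):.3f}")
```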

  4. Validation of the newborn larynx modeling with aerodynamical experimental data.

    Science.gov (United States)

    Nicollas, R; Giordano, J; Garrel, R; Medale, M; Caminat, P; Giovanni, A; Ouaknine, M; Triglia, J M

    2009-06-01

    Many authors have studied models of the adult larynx, but the mechanisms of newborn voice production have very rarely been investigated. After validating a numerical model with acoustic data, studies were performed on larynges of human fetuses in order to validate this model with aerodynamical experiments. Anatomical measurements were performed and a simplified numerical model was built using Fluent(R) with the vocal folds in phonatory position. The results obtained are in good agreement with those obtained by laser Doppler velocimetry (LDV) and high-frame-rate particle image velocimetry (HFR-PIV) on an experimental bench with excised human fetus larynges. It appears that computing with first-cry physiological parameters leads to a model that is close to the results obtained in experiments with real organs.

  5. Validation of the IMS CORE Diabetes Model.

    Science.gov (United States)

    McEwan, Phil; Foos, Volker; Palmer, James L; Lamotte, Mark; Lloyd, Adam; Grant, David

    2014-09-01

    The IMS CORE Diabetes Model (CDM) is a widely published and validated simulation model applied in both type 1 diabetes mellitus (T1DM) and type 2 diabetes mellitus (T2DM) analyses. Validation to external studies is an important part of demonstrating model credibility. Because the CDM is widely used to estimate long-term clinical outcomes in diabetes patients, the objective of this analysis was to validate the CDM to contemporary outcomes studies, including those with long-term follow-up periods. A total of 112 validation simulations were performed, stratified by study follow-up duration. For long-term results (≥15-year follow-up), simulation cohorts representing baseline Diabetes Control and Complications Trial (DCCT) and United Kingdom Prospective Diabetes Study (UKPDS) cohorts were generated and intensive and conventional treatment arms were defined in the CDM. Predicted versus observed macrovascular and microvascular complications and all-cause mortality were assessed using the coefficient of determination (R2) goodness-of-fit measure. Across all validation studies, the CDM simulations produced an R2 statistic of 0.90. For validation studies with a follow-up duration of less than 15 years, R2 values of 0.90 and 0.88 were achieved for T1DM and T2DM respectively. In T1DM, validating against 30-year outcomes data (DCCT) resulted in an R2 of 0.72. In T2DM, validating against 20-year outcomes data (UKPDS) resulted in an R2 of 0.92. This analysis supports the CDM as a credible tool for predicting the absolute number of clinical events in DCCT- and UKPDS-like populations. With increasing incidence of diabetes worldwide, the CDM is particularly important for health care decision makers, for whom the robust evaluation of health care policies is essential. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  6. The Mistra experiment for field containment code validation first results

    International Nuclear Information System (INIS)

    Caron-Charles, M.; Blumenfeld, L.

    2001-01-01

    The MISTRA facility is a large-scale experiment designed for the validation of multi-dimensional thermal-hydraulics codes. A short description of the facility, the set-up of the instrumentation and the test program are presented. Then the first experimental results, concerning helium injection into the containment, and the corresponding calculations are detailed. (author)

  7. Pathways to Engineering: The Validation Experiences of Transfer Students

    Science.gov (United States)

    Zhang, Yi; Ozuna, Taryn

    2015-01-01

    Community college engineering transfer students are a critical student population of engineering degree recipients and technical workforce in the United States. Focusing on this group of students, we adopted Rendón's (1994) validation theory to explore the students' experiences in community colleges prior to transferring to a four-year…

  8. Global precipitation measurements for validating climate models

    Science.gov (United States)

    Tapiador, F. J.; Navarro, A.; Levizzani, V.; García-Ortega, E.; Huffman, G. J.; Kidd, C.; Kucera, P. A.; Kummerow, C. D.; Masunaga, H.; Petersen, W. A.; Roca, R.; Sánchez, J.-L.; Tao, W.-K.; Turk, F. J.

    2017-11-01

    The advent of global precipitation data sets with increasing temporal span has made it possible to use them for validating climate models. In order to fulfill the requirement of global coverage, existing products integrate satellite-derived retrievals from many sensors with direct ground observations (gauges, disdrometers, radars), which are used as reference for the satellites. While the resulting product can be deemed as the best-available source of quality validation data, awareness of the limitations of such data sets is important to avoid extracting wrong or unsubstantiated conclusions when assessing climate model abilities. This paper provides guidance on the use of precipitation data sets for climate research, including model validation and verification for improving physical parameterizations. The strengths and limitations of the data sets for climate modeling applications are presented, and a protocol for quality assurance of both observational databases and models is discussed. The paper helps elaborating the recent IPCC AR5 acknowledgment of large observational uncertainties in precipitation observations for climate model validation.

  9. Validation of heat transfer models for gap cooling

    International Nuclear Information System (INIS)

    Okano, Yukimitsu; Nagae, Takashi; Murase, Michio

    2004-01-01

    For severe accident assessment of a light water reactor, models of heat transfer in a narrow annular gap between overheated core debris and a reactor pressure vessel are important for evaluating vessel integrity and accident management. The authors developed and improved the models of heat transfer. However, validation was not sufficient for applicability of the gap heat flux correlation to the debris cooling in the vessel lower head and applicability of the local boiling heat flux correlations to the high-pressure conditions. Therefore, in this paper, we evaluated the validity of the heat transfer models and correlations by analyses of the ALPHA and LAVA experiments, where molten aluminum oxide (Al2O3) at about 2700 K was poured into a high-pressure water pool in a small-scale simulated vessel lower head. In the heating process of the vessel wall, the calculated heating rate and peak temperature agreed well with the measured values, and the validity of the heat transfer models and gap heat flux correlation was confirmed. In the cooling process of the vessel wall, the calculated cooling rate was compared with the measured value, and the validity of the nucleate boiling heat flux correlation was confirmed. The peak temperatures of the vessel wall in the ALPHA and LAVA experiments were lower than the temperature at the minimum heat flux point between film boiling and transition boiling, so the minimum heat flux correlation could not be validated. (author)

  10. Argonne Bubble Experiment Thermal Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Buechler, Cynthia Eileen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-03

    This report will describe the Computational Fluid Dynamics (CFD) model that was developed to calculate the temperatures and gas volume fractions in the solution vessel during the irradiation. It is based on the model used to calculate temperatures and volume fractions in an annular vessel containing an aqueous solution of uranium. The experiment was repeated at several electron beam power levels, but the CFD analysis was performed only for the 12 kW irradiation, because this experiment came the closest to reaching a steady-state condition. The aim of the study is to compare results of the calculation with experimental measurements to determine the validity of the CFD model.

  11. Modeling a High Explosive Cylinder Experiment

    Science.gov (United States)

    Zocher, Marvin A.

    2017-06-01

    Cylindrical assemblies constructed from high explosives encased in an inert confining material are often used in experiments aimed at calibrating and validating continuum level models for the so-called equation of state (constitutive model for the spherical part of the Cauchy tensor). Such is the case in the work to be discussed here. In particular, work will be described involving the modeling of a series of experiments involving PBX-9501 encased in a copper cylinder. The objective of the work is to test and perhaps refine a set of phenomenological parameters for the Wescott-Stewart-Davis reactive burn model. The focus of this talk will be on modeling the experiments, which turned out to be non-trivial. The modeling is conducted using ALE methodology.

  12. Validation of Hadronic Models in GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Koi, Tatsumi; Wright, Dennis H.; /SLAC; Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; /CERN; Heikkinen, Aatos; /Helsinki Inst. of Phys.; Truscott; Lei, Fan; /QinetiQ; Wellisch, Hans-Peter

    2007-09-26

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra-relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin-target measurements. In most cases, good agreement is observed between Monte Carlo predictions and experimental data; however, several problems have been detected which require some improvement in the models.

  13. Landslide Tsunami Generation Models: Validation and Case Studies

    Science.gov (United States)

    Watts, P.; Grilli, S. T.; Kirby, J. T.; Fryer, G. J.; Tappin, D. R.

    2002-12-01

    There has been a proliferation of landslide tsunami generation and propagation models in recent times, spurred largely by the 1998 Papua New Guinea event. However, few of these models or techniques have been carefully validated. Moreover, few of these models have proven capable of integrating the best available geological data and interpretations into convincing case studies. The Tsunami Open and Progressive Initial Conditions System (TOPICS) rapidly provides approximate landslide tsunami sources for tsunami propagation models. We present 3D laboratory experiments and 3D Boundary Element Method simulations that validate the tsunami sources given by TOPICS. Geowave is a combination of TOPICS with the fully nonlinear and dispersive Boussinesq model FUNWAVE, which has been the subject of extensive testing and validation over the course of the last decade. Geowave is currently a tsunami community model made available to all tsunami researchers on the web site www.tsunamicommunity.org. We validate Geowave with case studies of the 1946 Unimak, Alaska, the 1994 Skagway, Alaska, and the 1998 Papua New Guinea events. The benefits of Boussinesq wave propagation over traditional shallow water wave models are very apparent for these relatively steep and nonlinear waves. For the first time, a tsunami community model appears sufficiently powerful to reproduce all observations and records with the first numerical simulation. This can only be accomplished by first assembling geological data and interpretations into a reasonable tsunami source.

  14. Calibration and validation of rockfall models

    Science.gov (United States)

    Frattini, Paolo; Valagussa, Andrea; Zenoni, Stefania; Crosta, Giovanni B.

    2013-04-01

    Calibrating and validating landslide models is extremely difficult due to the particular characteristics of landslides: limited recurrence in time, relatively low frequency of the events, short durability of post-event traces, and poor availability of continuous monitoring data, especially for small landslides and rockfalls. For this reason, most of the rockfall models presented in the literature completely lack calibration and validation of the results. In this contribution, we explore different strategies for rockfall model calibration and validation starting from both a historical event and a full-scale field test. The event occurred in 2012 in Courmayeur (Western Alps, Italy), and caused serious damage to quarrying facilities. This event was studied soon after its occurrence through a field campaign aimed at mapping the blocks arrested along the slope, the shape and location of the detachment area, and the traces of scars associated with impacts of blocks on the slope. The full-scale field test was performed by Geovert Ltd in the Christchurch area (New Zealand) after the 2011 earthquake. During the test, a number of large blocks were mobilized from the upper part of the slope and filmed with high-speed cameras from different viewpoints. The movies of each released block were analysed to identify the block shape, the propagation path, the location of impacts, the height of the trajectory and the velocity of the block along the path. Both calibration and validation of rockfall models should be based on the optimization of the agreement between the actual trajectories or locations of arrested blocks and the simulated ones. A measure that describes this agreement is therefore needed. For calibration purposes, this measure should be simple enough to allow trial-and-error repetitions of the model for parameter optimization. In this contribution we explore different calibration/validation measures: (1) the percentage of simulated blocks arresting within a buffer of the
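
    The record is cut off mid-list, but the first measure reads as a buffer-based agreement score: the share of simulated arrest points falling within a given distance of field-mapped blocks. A small illustrative sketch of such a measure (all coordinates invented):

        import numpy as np

        def buffer_agreement(simulated, observed, buffer_m=10.0):
            """Fraction of simulated block arrest points lying within a given
            buffer distance of any field-mapped arrested block."""
            simulated = np.asarray(simulated, dtype=float)
            observed = np.asarray(observed, dtype=float)
            # Pairwise distances between simulated (n,2) and observed (m,2) points.
            d = np.linalg.norm(simulated[:, None, :] - observed[None, :, :], axis=2)
            return float(np.mean(d.min(axis=1) <= buffer_m))

        # Invented arrest coordinates (m) on a slope map; illustrative only.
        sim = [[12, 340], [55, 310], [60, 295], [110, 260]]
        obs = [[15, 338], [58, 300], [105, 255]]
        print(f"agreement = {buffer_agreement(sim, obs, buffer_m=10):.0%}")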

  15. Nondestructive measurements of nuclear wastes: validation and industrial operating experience

    International Nuclear Information System (INIS)

    Montigon, J.F.; Guerin, V.; Lalande, R.; Saas, A.

    1990-01-01

    After a short survey of the means employed for the nondestructive measurement of specific activities (γ and X-ray) in waste packages and raw waste, the performance of the devices and the ANDRA requirements are presented. The validation of the γ and X-ray measurements on packages is obtained by determining, by destructive means, the same activity on coring samples. The same procedure is used for validating the homogeneity measurements on packages (either homogeneous or heterogeneous). Different operating experiences are then presented for several kinds of packages and waste. Up to now, about twenty different types of packages have been examined and more than 200 packages have allowed the calibration, validation and control

  16. Monitoring Building Deformation with InSAR: Experiments and Validation

    OpenAIRE

    Kui Yang; Li Yan; Guoman Huang; Chu Chen; Zhengpeng Wu

    2016-01-01

    Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR on building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments using two landmark buildings: the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real example...

  17. A methodology for PSA model validation

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel, and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative real-time process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 11 refs., 9 tabs.

  18. MARKETING MODELS APPLICATION EXPERIENCE

    Directory of Open Access Journals (Sweden)

    A. Yu. Rymanov

    2011-01-01

    Full Text Available Marketing models are used for the assessment of such marketing elements as sales volume, market share, market attractiveness, advertising costs, product promotion and selling, profit, and profitability. A classification of buying-process decision-making models is presented. SWOT- and GAP-based models are best suited for sales assessments. Lately, there has been a tendency to move from assessment on the basis of financial indices to assessment on the basis of non-financial ones. From the marketing viewpoint, the most important are long-term company activity and consumer-attraction models, as well as operative models of market attractiveness.

  19. Paleoclimate validation of a numerical climate model

    International Nuclear Information System (INIS)

    Schelling, F.J.; Church, H.W.; Zak, B.D.; Thompson, S.L.

    1994-01-01

    An analysis planned to validate regional climate model results for a past climate state at Yucca Mountain, Nevada, against paleoclimate evidence for the period is described. This analysis, which will use the GENESIS model of global climate nested with the RegCM2 regional climate model, is part of a larger study for DOE's Yucca Mountain Site Characterization Project that is evaluating the impacts of long-term future climate change on the performance of the potential high-level nuclear waste repository at Yucca Mountain. The planned analysis and anticipated results are presented

  20. Validation of the STAFF-5 computer model

    International Nuclear Information System (INIS)

    Fletcher, J.F.; Fields, S.R.

    1981-04-01

    STAFF-5 is a dynamic heat-transfer-fluid-flow stress model designed for computerized prediction of the temperature-stress performance of spent LWR fuel assemblies under storage/disposal conditions. Validation of the temperature-calculating abilities of this model was performed by comparing temperature calculations under specified conditions to experimental data from the Engine Maintenance and Disassembly (EMAD) Fuel Temperature Test Facility and to calculations performed by Battelle Pacific Northwest Laboratory (PNL) using the HYDRA-1 model. The comparisons confirmed the ability of STAFF-5 to calculate representative fuel temperatures over a considerable range of conditions, as a first step in the evaluation and prediction of fuel temperature-stress performance

  1. Modelling Urban Experiences

    DEFF Research Database (Denmark)

    Jantzen, Christian; Vetner, Mikael

    2008-01-01

    How can urban designers develop an emotionally satisfying environment not only for today's users but also for coming generations? Which devices can they use to elicit interesting and relevant urban experiences? This paper attempts to answer these questions by analyzing the design of Zuidas, a new...

  2. A practical guide for operational validation of discrete simulation models

    Directory of Open Access Journals (Sweden)

    Fabiano Leal

    2011-04-01

    Full Text Available As the number of simulation experiments increases, the necessity for validation and verification of these models demands special attention on the part of simulation practitioners. An analysis of the current scientific literature shows that the description of operational validation presented in many papers does not agree on the importance assigned to this process or on the techniques applied, whether subjective or objective. With the aim of orienting professionals, researchers and students in simulation, this article elaborates a practical guide through the compilation of statistical techniques for the operational validation of discrete simulation models. Finally, the guide's applicability was evaluated using two study objects, which represent two manufacturing cells, one from the automobile industry and the other from a Brazilian tech company. For each application, the guide identified distinct steps, due to the different aspects that characterize the analyzed distributions
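
    One objective technique such a guide typically compiles is a two-sample t-test comparing real-system output with simulation output. A hedged sketch of that comparison (the cycle-time data are invented, not the paper's case studies):

        import numpy as np
        from scipy import stats

        # Invented cycle times (minutes) for a manufacturing cell; illustrative only.
        real_system = [12.1, 11.8, 12.6, 12.0, 11.5, 12.3, 12.2, 11.9]
        simulation  = [11.9, 12.4, 12.1, 11.7, 12.0, 12.5, 11.8, 12.2]

        t_stat, p_value = stats.ttest_ind(real_system, simulation, equal_var=False)
        print(f"t = {t_stat:.2f}, p = {p_value:.2f}")
        # A large p-value fails to reject equality of means, i.e. the model is
        # not invalidated by this comparison (it does not by itself prove validity).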

  3. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that had been placed under nitrogen for an extended period of time. This report describes the HCM model, its functional requirements, the model structure and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
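
    At its core, a hydrostatic column model balances the pressure at a downhole interface against the weight of the fluid columns above it to predict wellhead pressure. A simplified constant-density sketch of that balance (fluid properties and depths invented, not SPR values):

        G = 9.81  # gravitational acceleration, m/s^2

        def wellhead_pressure(p_interface_pa, columns):
            """Wellhead pressure from a pressure balance over stacked fluid columns.

            columns: list of (density_kg_m3, height_m) tuples, top to bottom.
            """
            hydrostatic = sum(rho * G * h for rho, h in columns)
            return p_interface_pa - hydrostatic

        # Illustrative only: nitrogen treated as constant density (a crude
        # simplification) over crude oil, above an oil/brine interface.
        columns = [(180.0, 300.0),   # compressed nitrogen
                   (850.0, 400.0)]   # crude oil
        p_interface = 12.0e6         # Pa, assumed pressure at the interface
        print(f"wellhead pressure ~ {wellhead_pressure(p_interface, columns)/1e6:.2f} MPa")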

  4. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to take stock of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state of the art of tools, methods and experiments in the domains of interest of the Gedepeon research group, and the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 of the 19 presentations (slides) given at this workshop, dealing with: deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); progress status of the MCNP/TRIO-U neutronics/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); perturbation methods and sensitivity analysis of nuclear data in reactor physics, with application to the VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high-energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutron and light-particle production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  5. Disruption Tolerant Networking Flight Validation Experiment on NASA's EPOXI Mission

    Science.gov (United States)

    Wyatt, Jay; Burleigh, Scott; Jones, Ross; Torgerson, Leigh; Wissler, Steve

    2009-01-01

    In October and November of 2008, the Jet Propulsion Laboratory installed and tested essential elements of Delay/Disruption Tolerant Networking (DTN) technology on the Deep Impact spacecraft. This experiment, called Deep Impact Network Experiment (DINET), was performed in close cooperation with the EPOXI project which has responsibility for the spacecraft. During DINET some 300 images were transmitted from the JPL nodes to the spacecraft. Then they were automatically forwarded from the spacecraft back to the JPL nodes, exercising DTN's bundle origination, transmission, acquisition, dynamic route computation, congestion control, prioritization, custody transfer, and automatic retransmission procedures, both on the spacecraft and on the ground, over a period of 27 days. All transmitted bundles were successfully received, without corruption. The DINET experiment demonstrated DTN readiness for operational use in space missions. This activity was part of a larger NASA space DTN development program to mature DTN to flight readiness for a wide variety of mission types by the end of 2011. This paper describes the DTN protocols, the flight demo implementation, validation metrics which were created for the experiment, and validation results.

  6. Models for Validation of Prior Learning (VPL)

    DEFF Research Database (Denmark)

    Ehlers, Søren

    The national policies for the education/training of adults are in the 21st century highly influenced by proposals which are formulated and promoted by the European Union (EU) as well as other transnational players, and this shift in policy making has consequences. One is that ideas which in the past...... would have been categorized as utopian can become realpolitik. Validation of Prior Learning (VPL) was in Europe mainly regarded as utopian, while universities in the United States of America (USA) were developing ways to grant credits to students coming with experience from working life....

  7. Predictive validation of an influenza spread model.

    Directory of Open Access Journals (Sweden)

    Ayaz Hyder

    Full Text Available BACKGROUND: Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. METHODS AND FINDINGS: We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998-1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability, and depended on the method of forecasting (static or dynamic). CONCLUSIONS: Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve
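
    The deviation/error estimation described here amounts to comparing a simulated epidemic curve with observed weekly counts, for instance via root-mean-square error and the offset in peak week. A minimal sketch with invented curves (not the study's surveillance data):

        import numpy as np

        def epidemic_errors(observed, simulated):
            """RMSE and peak-week offset between two weekly epidemic curves."""
            observed = np.asarray(observed, dtype=float)
            simulated = np.asarray(simulated, dtype=float)
            rmse = np.sqrt(np.mean((observed - simulated) ** 2))
            peak_shift = int(np.argmax(simulated)) - int(np.argmax(observed))
            return rmse, peak_shift

        # Hypothetical weekly laboratory-confirmed case counts; illustrative only.
        observed  = [2, 5, 11, 24, 40, 55, 43, 28, 14, 6]
        simulated = [3, 6, 14, 28, 46, 51, 38, 22, 10, 4]
        rmse, shift = epidemic_errors(observed, simulated)
        print(f"RMSE = {rmse:.1f} cases/week, peak shifted by {shift} week(s)")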

  8. Predictive Validation of an Influenza Spread Model

    Science.gov (United States)

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability and depended on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive

  9. An Examination and Validation of an Adapted Youth Experience Scale for University Sport

    Science.gov (United States)

    Rathwell, Scott; Young, Bradley W.

    2016-01-01

    Limited tools assess positive development through university sport. Such a tool was validated in this investigation using two independent samples of Canadian university athletes. In Study 1, 605 athletes completed 99 survey items drawn from the Youth Experience Scale (YES 2.0), and separate a priori measurement models were evaluated (i.e., 99…

  10. Seclazone Reactor Modeling And Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Osinga, T. [ETH-Zuerich (Switzerland); Olalde, G. [CNRS Odeillo (France); Steinfeld, A. [PSI and ETHZ (Switzerland)

    2005-03-01

    A numerical model is formulated for the SOLZINC solar chemical reactor for the production of Zn by carbothermal reduction of ZnO. The model involves solving, by the finite-volume technique, a 1D unsteady state energy equation that couples heat transfer to the chemical kinetics for a shrinking packed bed exposed to thermal radiation. Validation is accomplished by comparison with experimentally measured temperature profiles and Zn production rates as a function of time, obtained for a 5-kW solar reactor tested at PSI's solar furnace. (author)
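
    As a rough illustration of the numerical core described here, an explicit finite-volume update for a 1D unsteady energy equation with a volumetric source term can be sketched as follows. All properties and the source are invented; the real SOLZINC model additionally couples chemical kinetics, thermal radiation and a shrinking packed bed:

        import numpy as np

        # Illustrative material properties and grid; not SOLZINC values.
        L, N = 0.1, 50                  # bed depth (m), number of cells
        rho, cp, k = 2000.0, 900.0, 1.5 # density, heat capacity, conductivity (SI)
        dx = L / N
        dt = 0.4 * rho * cp * dx**2 / k # below the explicit stability limit

        T = np.full(N, 300.0)           # initial temperature field (K)
        q_source = np.zeros(N)
        q_source[0] = 5e5               # W/m^3, crude stand-in for absorbed radiation

        for _ in range(2000):
            flux = -k * np.diff(T) / dx          # conductive flux at interior faces
            dTdt = np.zeros(N)
            dTdt[:-1] -= flux / (rho * cp * dx)  # flux leaving each left cell
            dTdt[1:]  += flux / (rho * cp * dx)  # flux entering each right cell
            dTdt += q_source / (rho * cp)        # volumetric source term
            T += dt * dTdt                       # adiabatic ends implied

        print(f"front/back temperatures: {T[0]:.0f} K / {T[-1]:.0f} K")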

  11. Turbulence Modeling Validation, Testing, and Development

    Science.gov (United States)

    Bardina, J. E.; Huang, P. G.; Coakley, T. J.

    1997-01-01

    The primary objective of this work is to provide accurate numerical solutions for selected flow fields and to compare and evaluate the performance of selected turbulence models with experimental results. Four popular turbulence models have been tested and validated against experimental data of ten turbulent flows. The models are: (1) the two-equation k-omega model of Wilcox, (2) the two-equation k-epsilon model of Launder and Sharma, (3) the two-equation k-omega/k-epsilon SST model of Menter, and (4) the one-equation model of Spalart and Allmaras. The flows investigated are five free shear flows consisting of a mixing layer, a round jet, a plane jet, a plane wake, and a compressible mixing layer; and five boundary layer flows consisting of an incompressible flat plate, a Mach 5 adiabatic flat plate, a separated boundary layer, an axisymmetric shock-wave/boundary layer interaction, and an RAE 2822 transonic airfoil. The experimental data for these flows are well established and have been extensively used in model developments. The results are shown in the following four sections: Part A describes the equations of motion and boundary conditions; Part B describes the model equations, constants, parameters, boundary conditions, and numerical implementation; and Parts C and D describe the experimental data and the performance of the models in the free-shear flows and the boundary layer flows, respectively.

  12. Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes

    Science.gov (United States)

    Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico

    2017-12-01

    Chemical compositional simulation of enhanced oil recovery and surfactant enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of the newly developed mathematical formulation which satisfies the conservation laws of mass and energy and allows applying a sequential solution approach to solve the governing equations separately and implicitly. Through its application to a numerical experiment using a wettability alteration model, and comparisons with an existing chemical compositional model's numerical results, the new model has proven to be practical, reliable and stable.

  13. Free Radicals and Reactive Intermediates for the SAGE III Ozone Loss and Validation Experiment (SOLVE) Mission

    Science.gov (United States)

    Anderson, James G.

    2001-01-01

    This grant provided partial support for participation in the SAGE III Ozone Loss and Validation Experiment. The NASA-sponsored SOLVE mission was conducted jointly with the European Commission-sponsored Third European Stratospheric Experiment on Ozone (THESEO 2000). Researchers examined processes that control ozone amounts at mid to high latitudes during the arctic winter and acquired correlative data needed to validate the Stratospheric Aerosol and Gas Experiment (SAGE) III satellite measurements that are used to quantitatively assess high-latitude ozone loss. The campaign began in September 1999 with intercomparison flights out of NASA Dryden Flight Research Center in Edwards, CA, and continued through March 2000, with midwinter deployments out of Kiruna, Sweden. SOLVE was co-sponsored by the Upper Atmosphere Research Program (UARP), Atmospheric Effects of Aviation Project (AEAP), Atmospheric Chemistry Modeling and Analysis Program (ACMAP), and Earth Observing System (EOS) of NASA's Earth Science Enterprise (ESE) as part of the validation program for the SAGE III instrument.

  14. Validation of a phytoremediation computer model

    International Nuclear Information System (INIS)

    Corapcioglu, M.Y.; Sung, K.; Rhykerd, R.L.; Munster, C.; Drew, M.

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹ TNT, PBB and chrysene. Vegetated and unvegetated treatments were conducted in triplicate to obtain data regarding contaminant concentrations in the soil, plant roots, root distribution, microbial activity, plant water use and soil moisture. When given the parameters of time and depth, the model successfully predicted contaminant concentrations under actual field conditions. Other model parameters are currently being evaluated. 15 refs., 2 figs

  15. Numerical modelling of the bonding process for wind turbine blades: model validation

    DEFF Research Database (Denmark)

    Uzal, Anil; Spangenberg, Jon; W. Nielsen, Michael

    numerical model is developed in order to analyse adhesive propagation in squeeze flow problems with 3-D flow effects. The model is validated by comparison with an experiment where a rectangular-prism-shaped adhesive sample is squeezed between two parallel plates. In the numerical model the rheological...

  16. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite block) is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data in the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study were performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using the HTTR safety test data, such as control rod withdrawal tests and loss-of-forced-convection tests.

  17. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    Science.gov (United States)

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  18. Validation method training: nurses' experiences and ratings of work climate.

    Science.gov (United States)

    Söderlund, Mona; Norberg, Astrid; Hansebo, Görel

    2014-03-01

    Training nursing staff in communication skills can impact on the quality of care for residents with dementia and contributes to nurses' job satisfaction. Changing attitudes and practices takes time and energy and can affect the entire nursing staff, not just the nurses directly involved in a training programme. Therefore, it seems important to study nurses' experiences of a training programme and any influence of the programme on work climate among the entire nursing staff. To explore nurses' experiences of a 1-year validation method training programme conducted in a nursing home for residents with dementia and to describe ratings of work climate before and after the programme. A mixed-methods approach. Twelve nurses participated in the training and were interviewed afterwards. These individual interviews were tape-recorded and transcribed, then analysed using qualitative content analysis. The Creative Climate Questionnaire was administered before (n = 53) and after (n = 56) the programme to the entire nursing staff in the participating nursing home wards and analysed with descriptive statistics. Analysis of the interviews resulted in four categories: being under extra strain, sharing experiences, improving confidence in care situations and feeling uncertain about continuing the validation method. The results of the questionnaire on work climate showed higher mean values in the assessment after the programme had ended. The training strengthened the participating nurses in caring for residents with dementia, but posed an extra strain on them. These nurses also described an extra strain on the entire nursing staff that was not reflected in the results from the questionnaire. The work climate at the nursing home wards might have made it easier to conduct this extensive training programme. Training in the validation method could develop nurses' communication skills and improve their handling of complex care situations. © 2013 Blackwell Publishing Ltd.

  19. Predicting third molar surgery operative time: a validated model.

    Science.gov (United States)

    Susarla, Srinivas M; Dodson, Thomas B

    2013-01-01

    The purpose of the present study was to develop and validate a statistical model to predict third molar (M3) operative time. This was a prospective cohort study consisting of a sample of subjects presenting for M3 removal. The demographic, anatomic, and operative variables were recorded for each subject. Using an index sample of randomly selected subjects, a multiple linear regression model was generated to predict the operating time. A nonoverlapping group of randomly selected subjects (validation sample) was used to assess model accuracy. P≤.05 was considered significant. The sample was composed of 150 subjects (n) who had 450 (k) M3s removed. The index sample (n=100 subjects, k=313 M3s extracted) had a mean age of 25.4±10.0 years. The mean extraction time was 6.4±7.0 minutes. The multiple linear regression model included M3 location, Winter's classification, tooth morphology, number of teeth extracted, procedure type, and surgical experience (R2=0.58). No statistically significant differences were seen between the index sample and the validation sample (n=50, k=137) for any of the study variables. Compared with the index model, the β-coefficients of the validation model were similar in direction and magnitude for most variables. Compared with the observed extraction time for all teeth in the sample, the predicted extraction time was not significantly different (P=.16). Fair agreement was seen between the β-coefficients for our multiple models in the index and validation populations, with no significant difference in the predicted and observed operating times. Copyright © 2013 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
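
    The index/validation design used here follows a standard pattern: fit a multiple linear regression on one random subsample and test its predictions on a non-overlapping one. A sketch with synthetic stand-ins for the study variables (the actual model used M3 location, Winter's classification, tooth morphology, and so on):

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-ins for predictors and operative time (minutes);
        # illustrative only, not the study's data.
        n = 450
        X = np.column_stack([np.ones(n),                 # intercept
                             rng.integers(0, 3, n),      # categorical score
                             rng.normal(5, 2, n)])       # continuous covariate
        beta_true = np.array([2.0, 1.5, 0.6])
        y = X @ beta_true + rng.normal(0, 1.5, n)

        # Index sample for fitting, non-overlapping validation sample for testing.
        idx = rng.permutation(n)
        fit_i, val_i = idx[:300], idx[300:]
        beta, *_ = np.linalg.lstsq(X[fit_i], y[fit_i], rcond=None)

        pred = X[val_i] @ beta
        ss_res = np.sum((y[val_i] - pred) ** 2)
        ss_tot = np.sum((y[val_i] - y[val_i].mean()) ** 2)
        print(f"validation R^2 = {1 - ss_res/ss_tot:.2f}")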

  20. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program is currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  1. Validation of Modeling Flow Approaching Navigation Locks

    Science.gov (United States)

    2013-08-01

    USACE, Pittsburgh District (LRP) requested that the US Army Engineer Research and Development Center, Coastal and Hydraulics Laboratory ... approaching the lock and dam. The second set of experiments considered a design, referred to as Plan B lock approach, which contained the weir field in ... conditions and model parameters. A discharge of 1.35 cfs was set as the inflow boundary condition at the upstream end of the model. The outflow boundary was

  2. Modelling and Validating a Deoiling Hydrocyclone for Fault Diagnosis using Multilevel Flow Modeling

    DEFF Research Database (Denmark)

    Nielsen, Emil Krabbe; Bram, Mads Valentin; Frutiger, Jerome

    Decision support systems are a key focus in research on developing control rooms to aid operators in making reliable decisions, and reducing incidents caused by human errors. For this purpose, models of complex systems can be developed to diagnose causes or consequences for specific alarms. Models a...... experiments are used for validation of two simple Multilevel Flow Modeling models of a deoiling hydrocyclone, used for water and oil separation.

  3. Measures of agreement between computation and experiment: validation metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin; Oberkampf, William Louis

    2005-08-01

    With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
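
    A simplified version of the confidence-interval idea is to report the estimated model error at one setting of the control variable together with a t-based interval reflecting the experimental uncertainty. A sketch under that simplification (invented replicate data; not the authors' exact metric):

        import numpy as np
        from scipy import stats

        def error_with_ci(model_value, experimental_replicates, confidence=0.90):
            """Estimated model error and t-based confidence interval half-width."""
            y = np.asarray(experimental_replicates, dtype=float)
            n = y.size
            error = model_value - y.mean()
            half_width = stats.t.ppf(0.5 + confidence / 2, n - 1) \
                         * y.std(ddof=1) / np.sqrt(n)
            return error, half_width

        # Hypothetical: four repeated measurements at one control-variable setting.
        err, hw = error_with_ci(model_value=101.3,
                                experimental_replicates=[98.7, 100.2, 99.5, 100.9])
        print(f"estimated error = {err:+.2f} ± {hw:.2f} (90% CI)")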

  4. On the external validity of construction bidding experiment

    Directory of Open Access Journals (Sweden)

    Bee Lan Oo

    2016-03-01

    Full Text Available The external validity of experimental studies, and in particular subject-pool effects, has been much debated among researchers. The common objection is that the use of students as experimental subjects is invalid, as they are likely to be unrepresentative. This paper addresses this methodological aspect in building economics research. It compares the bidding behavioural patterns of experienced construction executives (professionals) and student subjects through replication of a bidding experiment that aimed at testing theories. The results show that the student subjects' bidding behavioural patterns, in terms of the decision to bid and the mark-up decision, are sufficiently similar to those of the professionals. This suggests that the subject pool per se is not a threat to the external validity of the bidding experiment. In addition, the demonstrated practicality of an experimental approach in testing theories should lead to more use of experimental studies with student subjects in building economics research. It is suggested that experimental and field findings should be seen as complementary in building economics research, as advocated in the social sciences.

  5. FDA 2011 process validation guidance: lifecycle compliance model.

    Science.gov (United States)

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  6. CFD and FEM modeling of PPOOLEX experiments

    Energy Technology Data Exchange (ETDEWEB)

    Paettikangas, T.; Niemi, J.; Timperi, A. (VTT Technical Research Centre of Finland (Finland))

    2011-01-15

    Large-break LOCA experiment performed with the PPOOLEX experimental facility is analysed with CFD calculations. Simulation of the first 100 seconds of the experiment is performed by using the Euler-Euler two-phase model of FLUENT 6.3. In wall condensation, the condensing water forms a film layer on the wall surface, which is modelled by mass transfer from the gas phase to the liquid water phase in the near-wall grid cell. The direct-contact condensation in the wetwell is modelled with simple correlations. The wall condensation and direct-contact condensation models are implemented with user-defined functions in FLUENT. Fluid-Structure Interaction (FSI) calculations of the PPOOLEX experiments and of a realistic BWR containment are also presented. Two-way coupled FSI calculations of the experiments have been numerically unstable with explicit coupling. A linear perturbation method is therefore used for preventing the numerical instability. The method is first validated against numerical data and against the PPOOLEX experiments. Preliminary FSI calculations are then performed for a realistic BWR containment by modeling a sector of the containment and one blowdown pipe. For the BWR containment, one- and two-way coupled calculations as well as calculations with LPM are carried out. (Author)

  7. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any), which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundancy relations (ARRs).
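
    Analytical redundancy relations (ARRs) are constraints among sensor readings that hold when all sensors are healthy; violated relations implicate the sensors they involve, while satisfied ones exonerate theirs under a single-fault assumption. A toy sketch of that inference (the relations and tolerances are invented, not the paper's algorithm):

        def residuals(r):
            """Toy ARRs for a flow split: total = a + b, equal split a = b,
            hence also total = 2*b. Relations and tolerances are invented."""
            return [(r["total"] - (r["branch_a"] + r["branch_b"]),
                     {"total", "branch_a", "branch_b"}),
                    (r["branch_a"] - r["branch_b"], {"branch_a", "branch_b"}),
                    (r["total"] - 2.0 * r["branch_b"], {"total", "branch_b"})]

        def suspect_sensors(readings, tol=0.5):
            """Single-fault isolation: a suspect appears in every violated ARR
            and in no satisfied ARR (satisfied relations exonerate their sensors)."""
            violated, satisfied = [], []
            for value, support in residuals(readings):
                (violated if abs(value) > tol else satisfied).append(support)
            if not violated:
                return set()
            suspects = set.intersection(*violated)
            for support in satisfied:
                suspects -= support
            return suspects

        print(suspect_sensors({"total": 10.0, "branch_a": 5.1, "branch_b": 4.9}))  # set()
        print(suspect_sensors({"total": 10.0, "branch_a": 7.8, "branch_b": 4.9}))  # {'branch_a'}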

  8. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and to show that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity

  9. Assessment model validity document FARF31

    Energy Technology Data Exchange (ETDEWEB)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria [Kemakta Konsult AB, Stockholm (Sweden)

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and to show that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity
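
    In its simplest single-porosity limit, without matrix diffusion or chain decay, the averaged stream-tube transport described above reduces to the 1D advection-dispersion equation, whose continuous-injection solution has a closed form. A sketch of that textbook (Ogata-Banks) solution, offered as a simplification of, not a substitute for, FARF31:

        import numpy as np
        from scipy.special import erfc

        def ogata_banks(x, t, v, D):
            """Ogata-Banks solution: relative concentration C/C0 for continuous
            injection into a semi-infinite 1D domain with advection v and
            dispersion D. Note: the exp*erfc term can overflow at very high
            Peclet numbers v*x/D."""
            a = (x - v * t) / (2.0 * np.sqrt(D * t))
            b = (x + v * t) / (2.0 * np.sqrt(D * t))
            return 0.5 * (erfc(a) + np.exp(v * x / D) * erfc(b))

        # Illustrative stream-tube parameters (not site data): 100 m path,
        # 10 m/yr advection velocity, 5 m^2/yr dispersion coefficient.
        for years in (5, 10, 20):
            print(years, "yr:", round(float(ogata_banks(100.0, years, 10.0, 5.0)), 4))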

  10. Validation of the filament winding process model

    Science.gov (United States)

    Calius, Emilo P.; Springer, George S.; Wilson, Brian A.; Hanson, R. Scott

    1987-01-01

    Tests were performed toward validating the WIND model developed previously for simulating the filament winding of composite cylinders. In these tests two 24 in. long, 8 in. diam and 0.285 in. thick cylinders, made of IM-6G fibers and HBRF-55 resin, were wound at + or - 45 deg angle on steel mandrels. The temperatures on the inner and outer surfaces and inside the composite cylinders were recorded during oven cure. The temperatures inside the cylinders were also calculated by the WIND model. The measured and calculated temperatures were then compared. In addition, the degree of cure and resin viscosity distributions inside the cylinders were calculated for the conditions which existed in the tests.

  11. Experiences from Designing and Validating a Software Modernization Transformation

    DEFF Research Database (Denmark)

    Iosif-Lazăr, Alexandru Florin; Al-Sibahi, Ahmad Salim; Dimovski, Aleksandar

    2015-01-01

    Software modernization often involves complex code transformations that convert legacy code to new architectures or platforms, while preserving the semantics of the original programs. We present the lessons learnt from an industrial software modernization project of considerable size. This includes collecting requirements for a code-to-model transformation, designing and implementing the transformation algorithm, and then validating correctness of this transformation for the code-base at hand. Our transformation is implemented in the TXL rewriting language and assumes specifically structured C++ code as input, which it translates to a declarative configuration model. The correctness criterion for the transformation is that the produced model admits the same configurations as the input code. The transformation converts C++ functions specifying around a thousand configuration parameters. We verify...

  12. Atmospheric corrosion: statistical validation of models

    International Nuclear Information System (INIS)

    Diaz, V.; Martinez-Luaces, V.; Guineo-Cobs, G.

    2003-01-01

    In this paper we discuss two different methods for the validation of regression models, applied to corrosion data. One is based on the correlation coefficient and the other is the statistical test of lack of fit. Both methods are used here to analyse the fit of the bilogarithmic model used to predict corrosion of very low carbon steel substrates in rural and urban-industrial atmospheres in Uruguay. Results for parameters A and n of the bilogarithmic model are reported here. For this purpose, all repeated values were used instead of average values as is usual. Modelling is carried out using experimental data corresponding to steel substrates under the same initial meteorological conditions (in fact, they were placed in the rack at the same time). Results for the correlation coefficient are compared with the lack-of-fit test at two different significance levels (α=0.01 and α=0.05). Unexpected differences between them are explained and finally it is possible to conclude, at least in the studied atmospheres, that the bilogarithmic model does not fit the experimental data properly. (Author) 18 refs
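
    The bilogarithmic model in question is C = A·t^n, which becomes a straight line in log-log space, and the lack-of-fit test requires replicated observations at each exposure time. The following Python sketch illustrates both validation methods on invented data; the times, penetrations, and group structure are assumptions for illustration, not the Uruguayan measurements.

```python
import numpy as np
from scipy import stats

# Hypothetical replicated corrosion data: exposure time t (years) and
# corrosion penetration C, with two repeated measurements per time point.
t = np.array([0.5, 0.5, 1.0, 1.0, 2.0, 2.0, 4.0, 4.0])
C = np.array([8.1, 8.9, 12.3, 11.7, 17.9, 19.2, 26.5, 28.1])

# Fit the bilogarithmic model log C = log A + n log t by ordinary least squares.
x, y = np.log10(t), np.log10(C)
slope, intercept, r, _, _ = stats.linregress(x, y)
A, n = 10**intercept, slope

# Lack-of-fit F-test: compare scatter of replicates about their group means
# (pure error) with scatter of group means about the fitted line (lack of fit).
groups = [y[t == ti] for ti in np.unique(t)]
sse_pure = sum(((g - g.mean())**2).sum() for g in groups)
resid = y - (intercept + slope * x)
sse_lof = (resid**2).sum() - sse_pure

m, N, p = len(groups), len(y), 2            # distinct levels, points, parameters
F = (sse_lof / (m - p)) / (sse_pure / (N - m))
p_value = 1 - stats.f.cdf(F, m - p, N - m)
print(f"A={A:.2f}, n={n:.3f}, r={r:.3f}, lack-of-fit p={p_value:.3f}")
```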

  13. Ion channel model development and validation

    Science.gov (United States)

    Nelson, Peter Hugo

    2010-03-01

    The structure of the KcsA ion channel selectivity filter is used to develop three simple models of ion channel permeation. The quantitative predictions of the knock-on model are tested by comparison with experimental data from single-channel recordings of the KcsA channel. By comparison with experiment, students discover that the knock-on model can't explain saturation of ion channel current as the concentrations of the bathing solutions are increased. By inverting the energy diagram, students derive the association-dissociation model of ion channel permeation. This model predicts non-linear Michaelis-Menten saturating behavior that requires students to perform non-linear least-squares fits to the experimental data. This is done using Excel's solver feature. Students discover that this simple model does an excellent job of explaining the qualitative features of ion channel permeation but cannot account for changes in voltage sensitivity. The model is then extended to include an electrical dissociation distance. This rapid translocation model is then compared with experimental data from a wide variety of ion channels and students discover that this model also has its limitations. Support from NSF DUE 0836833 is gratefully acknowledged.
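
    As a hedged illustration of the non-linear least-squares step described above (performed with Excel's solver in the original exercise), this sketch fits the saturating Michaelis-Menten form of the association-dissociation model to mock single-channel data; the concentration and current values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical single-channel current vs. bathing-solution concentration data.
conc = np.array([5, 10, 20, 50, 100, 200, 400], dtype=float)       # mM
current = np.array([1.9, 3.3, 5.2, 7.6, 9.0, 9.9, 10.4])           # pA

def michaelis_menten(c, i_max, k_m):
    """Saturating current predicted by the association-dissociation model."""
    return i_max * c / (k_m + c)

popt, pcov = curve_fit(michaelis_menten, conc, current, p0=(10.0, 20.0))
i_max, k_m = popt
perr = np.sqrt(np.diag(pcov))   # 1-sigma parameter uncertainties
print(f"i_max = {i_max:.2f} ± {perr[0]:.2f} pA, K_m = {k_m:.1f} ± {perr[1]:.1f} mM")
```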

  14. Validation and verification of agent models for trust: Independent compared to relative trust

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jaffry, S.W.; Maanen, P.P. van

    2011-01-01

    In this paper, the results of a validation experiment for two existing computational trust models describing human trust are reported. One model uses experiences of performance in order to estimate the trust in different trustees. The second model in addition carries the notion of relative trust.

  15. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in the electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code.
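
    A hedged sketch of the automation idea — key experimental parameters carried as metadata in a file header, from which a script generates a hydro code input deck. The header layout, metadata keys, and template placeholders are invented for illustration and do not reflect the actual HED file format.

```python
import json
import pathlib

def make_input_deck(data_file: str, template: str, out_file: str) -> None:
    """Build a simulation input deck from the metadata header of a data file.

    Assumed convention: leading lines starting with '#' hold JSON metadata,
    and the template contains named placeholders such as {explosive}.
    """
    lines = pathlib.Path(data_file).read_text().splitlines()
    header = "\n".join(l.lstrip("# ") for l in lines if l.startswith("#"))
    meta = json.loads(header)                 # e.g. {"explosive": ..., ...}
    deck = pathlib.Path(template).read_text().format(
        explosive=meta["explosive"],          # hypothetical metadata keys
        density=meta["initial_density"],
        pressure=meta["impact_pressure"],
    )
    pathlib.Path(out_file).write_text(deck)
```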

  16. Evaluation model and experimental validation of tritium in agricultural plant

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hee Suk; Keum, Dong Kwon; Lee, Han Soo; Jun, In; Choi, Yong Ho; Lee, Chang Woo [KAERI, Daejon (Korea, Republic of)

    2005-12-15

    This paper describes a compartment dynamic model for evaluating the contamination level of tritium in agricultural plants exposed to accidentally released tritium. The present model uses a time-dependent growth equation for the plant so that it can predict the effect of the plant's growth stage during the exposure time. The model, including atmosphere, soil and plant compartments, is described by a set of nonlinear ordinary differential equations, and is able to predict time-dependent concentrations of tritium in the compartments. To validate the model, a series of exposure experiments with HTO vapor on Chinese cabbage and radish was carried out at different growth stages of each plant. At the end of exposure, the tissue free water tritium (TFWT) and the organically bound tritium (OBT) were measured. The measured concentrations agreed well with model predictions.
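
    As a hedged illustration of the model structure described above (compartments coupled by nonlinear ODEs, with a time-dependent plant growth equation), the following sketch integrates a toy three-compartment system; the rate constants and the logistic growth law are illustrative assumptions, not the calibrated KAERI parameters.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed first-order transfer rates (1/h) between air, soil, and plant.
k_as, k_sa, k_sp, k_pa = 0.5, 0.05, 0.02, 0.1

def growth(t, w_max=1.0, r=0.05, t_mid=50.0):
    """Logistic plant biomass used to scale the soil-to-plant uptake term."""
    return w_max / (1.0 + np.exp(-r * (t - t_mid)))

def rhs(t, y):
    """Nonlinear ODE system for tritium in air, soil, and plant."""
    c_air, c_soil, c_plant = y
    return [
        -k_as * c_air + k_sa * c_soil,
        k_as * c_air - (k_sa + k_sp * growth(t)) * c_soil,
        k_sp * growth(t) * c_soil - k_pa * c_plant,
    ]

# Start with all activity in the air compartment (normalized to 1).
sol = solve_ivp(rhs, (0.0, 200.0), [1.0, 0.0, 0.0], dense_output=True)
print(sol.y[:, -1])   # compartment concentrations at the end of exposure
```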

  17. Explicating Experience: Development of a Valid Scale of Past Hazard Experience for Tornadoes.

    Science.gov (United States)

    Demuth, Julie L

    2018-03-23

    People's past experiences with a hazard theoretically influence how they approach future risks. Yet, past hazard experience has been conceptualized and measured in wide-ranging, often simplistic, ways, resulting in mixed findings about its relationship with risk perception. This study develops a scale of past hazard experiences, in the context of tornadoes, that is content and construct valid. A conceptual definition was developed, a set of items were created to measure one's most memorable and multiple tornado experiences, and the measures were evaluated through two surveys of the public who reside in tornado-prone areas. Four dimensions emerged of people's most memorable experience, reflecting their awareness of the tornado risk that day, their personalization of the risk, the intrusive impacts on them personally, and impacts experienced vicariously through others. Two dimensions emerged of people's multiple experiences, reflecting common types of communication received and negative emotional responses. These six dimensions are novel in that they capture people's experience across the timeline of a hazard as well as intangible experiences that are both direct and indirect. The six tornado experience dimensions were correlated with tornado risk perceptions measured as cognitive-affective and as perceived probability of consequences. The varied experience-risk perception results suggest that it is important to understand the nuances of these concepts and their relationships. This study provides a foundation for future work to continue explicating past hazard experience, across different risk contexts, and for understanding its effect on risk assessment and responses. © 2018 Society for Risk Analysis.

  18. Benchmark validation by means of pulsed sphere experiment at OKTAVIAN

    Energy Technology Data Exchange (ETDEWEB)

    Ichihara, Chihiro [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Hayashi, Shu A.; Kimura, Itsuro; Yamamoto, Junji; Takahashi, Akito

    1998-03-01

    In order to make a benchmark validation of the existing evaluated nuclear data for fusion related materials, neutron leakage spectra from spherical piles were measured with a time-of-flight technique using the intense 14 MeV neutron source OKTAVIAN, in the energy range from 0.1 to 15 MeV. The neutron energy spectra were obtained as absolute values normalized per source neutron. The measured spectra were compared with those from theoretical calculations using a Monte Carlo neutron transport code, MCNP, with several libraries processed from the evaluated nuclear data files. Comparison has been made with the spectrum shape, the C/E values of neutron numbers integrated in 4 energy regions, and the calculated spectra unfolded by the number of collisions, especially those after a single collision. The new libraries predicted the experiment fairly well for Li, Cr, Mn, Cu and Mo. For Al, Si, Zr, Nb and W, the new data files could give a fair prediction. However, C/E differed by more than 20% in several regions. For LiF, CF2, Ti and Co, no calculation could predict the experiment. A detailed discussion is given for the Cr, Mn and Cu samples. The EFF-2 calculation overestimated the Cr experiment by 24% in the 1 to 5 MeV neutron energy region, presumably because of overestimation of the inelastic cross section and the 52Cr(n,2n) cross section and problems in the energy and angular distributions of secondary neutrons in EFF-2. For Cu, ENDF/B-VI and EFF-2 overestimated the experiment by about 20 to 30% in the energy range between 5 and 12 MeV, presumably due to problems in the inelastic scattering cross section. (author)

  19. Validation of A Global Hydrological Model

    Science.gov (United States)

    Doell, P.; Lehner, B.; Kaspar, F.; Vassolo, S.

    due to the precipitation measurement errors. Even though the explicit modeling of wetlands and lakes leads to a much improved modeling of both the vertical water balance and the lateral transport of water, not enough information is included in WGHM to accurately capture the hydrology of these water bodies. Certainly, the reliability of model results is highest at the locations at which WGHM was calibrated. The validation indicates that reliability for cells inside calibrated basins is satisfactory if the basin is relatively homogeneous. Analyses of the few available stations outside of calibrated basins indicate a reasonably high model reliability, particularly in humid regions.

  20. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions of strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
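
    A minimal sketch of the idea, with a low-order 2-D Fourier basis standing in for the Zernike/Fourier decompositions used in the paper: both strain maps are reduced to a short descriptor vector, and the comparison happens in descriptor space rather than pixel space. The fields and the discrepancy measure below are invented for illustration.

```python
import numpy as np

def descriptors(field: np.ndarray, k: int = 5) -> np.ndarray:
    """Magnitudes of the k x k lowest spatial-frequency Fourier coefficients."""
    spec = np.fft.fft2(field - field.mean())
    return np.abs(spec[:k, :k]).ravel()

rng = np.random.default_rng(0)
measured = rng.normal(size=(256, 256))            # placeholder strain map (DIC)
simulated = measured + 0.05 * rng.normal(size=(256, 256))  # placeholder FE result

# ~65k pixels per field collapse to 25 descriptors each.
d_exp, d_mod = descriptors(measured), descriptors(simulated)

# A simple quantitative agreement measure between the two descriptor sets.
rel_discrepancy = np.linalg.norm(d_exp - d_mod) / np.linalg.norm(d_exp)
print(f"relative descriptor discrepancy: {rel_discrepancy:.3f}")
```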

  1. Validation of HEDR models. Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provide a level of confidence that the HEDR models are valid.

  2. Synthesis of clad motion experiments interpretation: codes and validation

    International Nuclear Information System (INIS)

    Papin, J.; Fortunato, M.; Seiler, J.M.

    1983-04-01

    This communication deals with clad melting and relocation phenomena related to LMFBR safety analysis of loss of flow accidents. We present: - the physical models developed at DSN/CEN Cadarache in single channel and bundle geometry. The interpretation with these models of experiments performed by the STT (CEN Grenoble). It comes out that we have now obtained a good understanding of the involved phenomena in single channel geometry. On the other hand, further studies are necessary for a better knowledge of clad motion phenomena in bundle cases with conditions close to reactor ones

  3. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  4. Geochemistry Model Validation Report: Material Degradation and Release Model

    International Nuclear Information System (INIS)

    Stockman, H.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17)

  5. Trailing Edge Noise Model Validation and Application to Airfoil Optimization

    DEFF Research Database (Denmark)

    Bertagnolio, Franck; Aagaard Madsen, Helge; Bak, Christian

    2010-01-01

    The aim of this article is twofold. First, an existing trailing edge noise model is validated by comparing with airfoil surface pressure fluctuations and far field sound pressure levels measured in three different experiments. The agreement is satisfactory in one case but poor in two other cases...... noise emission, trying at the same time to preserve some of its aerodynamic and geometric characteristics. The new designs are characterized by less cambered airfoils and flatter suction sides. The resulting noise reductions seem to be mainly achieved by a reduction in the turbulent kinetic energy...

  6. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social environment where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  7. Lessons learned from recent geomagnetic disturbance model validation activities

    Science.gov (United States)

    Pulkkinen, A. A.; Welling, D. T.

    2017-12-01

    Due to concerns pertaining to geomagnetically induced current impact on ground-based infrastructure, there has been significantly elevated interest in applying models for local geomagnetic disturbance or "delta-B" predictions. Correspondingly there has been elevated need for testing the quality of the delta-B predictions generated by the modern empirical and physics-based models. To address this need, community-wide activities were launched under the GEM Challenge framework and one culmination of the activities was the validation and selection of models that were transitioned into operations at NOAA SWPC. The community-wide delta-B action is continued under the CCMC-facilitated International Forum for Space Weather Capabilities Assessment and its "Ground Magnetic Perturbations: dBdt, delta-B, GICs, FACs" working group. The new delta-B working group builds on the past experiences and expands the collaborations to cover the entire international space weather community. In this paper, we discuss the key lessons learned from the past delta-B validation exercises and lay out the path forward for building on those experience under the new delta-B working group.

  8. Validation of the community radiative transfer model

    International Nuclear Information System (INIS)

    Ding Shouguo; Yang Ping; Weng Fuzhong; Liu Quanhua; Han Yong; Delst, Paul van; Li Jun; Baum, Bryan

    2011-01-01

    To validate the Community Radiative Transfer Model (CRTM) developed by the U.S. Joint Center for Satellite Data Assimilation (JCSDA), the discrete ordinate radiative transfer (DISORT) model and the line-by-line radiative transfer model (LBLRTM) are combined in order to provide a reference benchmark. Compared with the benchmark, the CRTM appears quite accurate for both clear sky and ice cloud radiance simulations with RMS errors below 0.2 K, except for clouds with small ice particles. In a computer CPU run time comparison, the CRTM is faster than DISORT by approximately two orders of magnitude. Using the operational MODIS cloud products and the European Center for Medium-range Weather Forecasting (ECMWF) atmospheric profiles as an input, the CRTM is employed to simulate the Atmospheric Infrared Sounder (AIRS) radiances. The CRTM simulations are shown to be in reasonably close agreement with the AIRS measurements (the discrepancies are within 2 K in terms of brightness temperature difference). Furthermore, the impact of uncertainties in the input cloud properties and atmospheric profiles on the CRTM simulations has been assessed. The CRTM-based brightness temperatures (BTs) at the top of the atmosphere (TOA), for both thin and thick (τ > 30) clouds, are highly sensitive to uncertainties in atmospheric temperature and cloud top pressure. However, for an optically thick cloud, the CRTM-based BTs are not sensitive to the uncertainties of cloud optical thickness, effective particle size, and atmospheric humidity profiles. On the contrary, the uncertainties of the CRTM-based TOA BTs resulting from effective particle size and optical thickness are not negligible in an optically thin cloud.

  9. Validating and Verifying Biomathematical Models of Human Fatigue

    Science.gov (United States)

    Martinez, Siera Brooke; Quintero, Luis Ortiz; Flynn-Evans, Erin

    2015-01-01

    Airline pilots experience acute and chronic sleep deprivation, sleep inertia, and circadian desynchrony due to the need to schedule flight operations around the clock. This sleep loss and circadian desynchrony give rise to cognitive impairments, reduced vigilance and inconsistent performance. Several biomathematical models, based principally on patterns observed in circadian rhythms and homeostatic drive, have been developed to predict a pilot's level of fatigue or alertness. These models allow the Federal Aviation Administration (FAA) and commercial airlines to make decisions about pilot capabilities and flight schedules. Although these models have been validated in a laboratory setting, they have not been thoroughly tested in operational environments where uncontrolled factors, such as environmental sleep disrupters, caffeine use and napping, may impact actual pilot alertness and performance. We will compare the predictions of three prominent biomathematical fatigue models (the McCauley Model, the Harvard Model, and the privately sold SAFTE-FAST Model) to actual measures of alertness and performance. We collected sleep logs, movement and light recordings, psychomotor vigilance task (PVT) data, and urinary melatonin (a marker of circadian phase) from 44 pilots in a short-haul commercial airline over one month. We will statistically compare the model predictions with lapses on the PVT and circadian phase. We will calculate the sensitivity and specificity of each model prediction under different scheduling conditions. Our findings will aid operational decision-makers in determining the reliability of each model under real-world scheduling situations.
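
    The models compared above are built principally on homeostatic and circadian components. As a hedged illustration of that structure only, the following sketch evaluates a classic two-process alertness estimate; the time constants, amplitude, and phase are textbook-style placeholder values, not parameters of the McCauley, Harvard, or SAFTE-FAST models.

```python
import numpy as np

def alertness(hours_awake: np.ndarray, clock_time: np.ndarray) -> np.ndarray:
    """Two-process alertness estimate: homeostatic decay plus circadian drive."""
    S = np.exp(-hours_awake / 18.2)                 # homeostatic pressure while awake
    C = 0.3 * np.cos(2 * np.pi * (clock_time - 16.8) / 24.0)  # circadian oscillation
    return S + C

# A day awake starting at 08:00, sampled every half hour.
t = np.arange(0, 24, 0.5)
print(alertness(t, (8.0 + t) % 24).round(2))
```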

  10. Validation of the WATEQ4 geochemical model for uranium

    International Nuclear Information System (INIS)

    Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

    1983-09-01

    As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite [UO2(OH)2·H2O], UO2(OH)2, and rutherfordine (UO2CO3) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions

  11. [Caregiver's health: adaptation and validation in a Spanish population of the Experience of Caregiving Inventory (ECI)].

    Science.gov (United States)

    Crespo-Maraver, Mariacruz; Doval, Eduardo; Fernández-Castro, Jordi; Giménez-Salinas, Jordi; Prat, Gemma; Bonet, Pere

    2018-04-04

    To adapt and validate the Experience of Caregiving Inventory (ECI) in a Spanish population, providing empirical evidence of its internal consistency, internal structure and validity. Psychometric validation of the adapted version of the ECI. One hundred and seventy-two caregivers (69.2% women), mean age 57.51 years (range: 21-89), participated. Demographic and clinical data and standardized measures (ECI, suffering scale of the SCL-90-R, Zarit burden scale) were used. The two ECI scales of negative evaluation most related to serious mental disorders (disruptive behaviours [DB] and negative symptoms [NS]) and the two scales of positive appreciation (positive personal experiences [PPE] and good aspects of the relationship [GAR]) were analyzed. Exploratory structural equation modelling was used to analyze the internal structure. The relationship between the ECI scales and the SCL-90-R and Zarit scores was also studied. The four-factor model presented a good fit. Cronbach's alpha (DB: 0.873; NS: 0.825; PPE: 0.720; GAR: 0.578) showed higher homogeneity in the negative scales. The SCL-90-R scores correlated with the negative ECI scales, and none of the ECI scales correlated with the Zarit scale. The Spanish version of the ECI can be considered a valid, reliable, understandable and feasible self-report measure for administration in health and community contexts. Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  12. Contributions to the validation of the CJS model for granular materials

    Science.gov (United States)

    Elamrani, Khadija

    1992-07-01

    Behavior model validation in the field of geotechnics is addressed, with the objective of showing the advantages and limits of the CJS (Cambou Jafari Sidoroff) behavior model for granular materials. Several levels are addressed: theoretical analysis of the CJS model to establish its consistency and basic capabilities; development of a finite element computation code (FINITEL) integrating this model, followed by validation through comparison with other programs, to prepare it for complex applications; and validation of the resulting code/model combination by comparing its results to experiments on nonhomogeneous problems (shallow foundations).

  13. Model Validation of Radiocaesium Transfer from Soil to Leafy Vegetables

    Directory of Open Access Journals (Sweden)

    P. Sukmabuana

    2012-04-01

    The accumulation of radionuclides in plant tissues can be estimated using a mathematical model; however, the applicability of the model to field experiments still needs to be evaluated. A model validation has been conducted for radiocaesium transfer from soil to two leafy vegetables commonly consumed by Indonesian people, i.e. spinach and morning glory, in order to validate the transfer model against field experimental data. The vegetable plants were grown for about 70 days on soil contaminated with 19 MBq of 134CsNO3. As controls, vegetable plants were also grown on soil without 134CsNO3 contamination. Every 5 days, three samples each of the contaminated and uncontaminated plants were taken. The soil medium was also tested. The samples were dried under an infrared lamp and the radioactivity was then counted using a gamma spectrometer. Data on 134Cs radioactivity in soil and plants were substituted into the mathematical equations to obtain the transfer rate coefficient (k12). The values of k12 were then used to calculate the 134Cs radioactivity in the vegetable plants. The 134Cs radioactivity in plants obtained from the mathematical model analysis was compared with the radioactivity data obtained from the experiment. The correlation of 134Cs radioactivity in the vegetable plants obtained from the experiment with that obtained from the model analysis was expressed as a correlation coefficient, found to be 0.90 for spinach and 0.71 for morning glory. The values of 134Cs in plants obtained from the model analysis can be corrected using standard deviation values, namely 48.65 and 20 for spinach. Given this agreement between model analysis and experimental data, the model of 134Cs transfer from soil to plant can be used for analysing 134Cs radioactivity in plants.
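
    A minimal sketch of how a transfer rate coefficient like k12 can be estimated: assuming a roughly constant soil activity and a first-order loss from the plant, the plant activity follows A_p(t) = (k12/k_eff)·A_s·(1 − exp(−k_eff·t)), which can be fitted to time-series data. The data values and the effective loss rate below are invented, not the experimental measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def plant_activity(t, k12, k_eff, a_soil=19.0):
    """Plant activity under constant soil activity a_soil (MBq, assumed)."""
    return (k12 / k_eff) * a_soil * (1.0 - np.exp(-k_eff * t))

# Mock 5-day sampling data over the ~70-day growing period.
days = np.array([5, 10, 20, 30, 40, 50, 60, 70], dtype=float)
a_plant = np.array([0.6, 1.1, 1.9, 2.5, 2.9, 3.2, 3.4, 3.5])   # MBq/kg (mock)

(k12, k_eff), _ = curve_fit(plant_activity, days, a_plant, p0=(0.01, 0.05))
pred = plant_activity(days, k12, k_eff)
r = np.corrcoef(a_plant, pred)[0, 1]       # agreement measure, as in the paper
print(f"k12 = {k12:.4f} /day, correlation = {r:.2f}")
```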

  14. Benchmark validation by means of pulsed sphere experiment at OKTAVIAN

    Energy Technology Data Exchange (ETDEWEB)

    Ichihara, Chihiro [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Hayashi, S.A.; Kimura, Itsuro; Yamamoto, Junji; Takahashi, Akito

    1997-03-01

    The new version of the Japanese nuclear data library, JENDL-3.2, has recently been released. The JENDL Fusion File, which adopted DDX representations for secondary neutrons, was also improved with the new evaluation method. In addition, the FENDL nuclear data project to compile a nuclear data library for fusion related research has been conducted partly under the auspices of the International Atomic Energy Agency (IAEA). The first version, FENDL-1, consists of JENDL-3.1, ENDF/B-VI, BROND-2 and EFF-1 and was released in 1995. The work for the second version, FENDL-2, is now ongoing. Benchmark validation of the nuclear data libraries has been performed to help select the candidates for FENDL-2. The benchmark experiments were conducted at OKTAVIAN of Osaka University. The sample spheres were constructed by filling spherical shells with sample material. The leakage neutron spectra from the sphere piles were measured with a time-of-flight method. The measured spectra were compared with theoretical calculations using MCNP 4A and the processed libraries from JENDL-3.1, JENDL-3.2, the JENDL Fusion File, and FENDL-1. The JENDL Fusion File and JENDL-3.2 gave almost the same predictions for the experiment. Both predictions are almost satisfactory for Li, Cr, Mn, Cu, Zr, Nb and Mo, whereas for Al, LiF, CF2, Si, Ti, Co and W there is some discrepancy. However, they gave better predictions than the calculations using the library from FENDL-1, except for W. (author)

  15. Lidar Atmopheric Sensing Experiment (LASE) Data Obtained During the SAGE III Ozone Loss and Validation Experiment (SOLVE)

    Data.gov (United States)

    National Aeronautics and Space Administration — LASE_SOLVE data are Lidar Atmospheric Sensing Experiment water vapor and aerosol data measurements taken during SAGE III Ozone Loss and Validation Experiment...

  16. ISOTHERMAL AIR INGRESS VALIDATION EXPERIMENTS AT IDAHO NATIONAL LABORATORY: DESCRIPTION AND SUMMARY OF DATA

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim

    2010-09-01

    Idaho National Laboratory performed air ingress experiments as part of validating a computational fluid dynamics (CFD) code. An isothermal stratified flow experiment was designed and set up to understand stratified flow phenomena in the very high temperature gas cooled reactor (VHTR) and to provide experimental data for validating computer codes. The isothermal experiment focused on three flow characteristics unique to the VHTR air-ingress accident: stratified flow in the horizontal pipe, stratified flow expansion at the pipe and vessel junction, and stratified flow around supporting structures. Brine and sucrose solutions were used as the heavy fluids and water was used as the light fluid. The density ratios were varied between 0.87 and 0.98. This experiment clearly showed that a stratified flow between heavy and light fluids is generated even for very small density differences. The code was validated by conducting blind CFD simulations and comparing the results to the experimental data. A grid sensitivity study was also performed based on the Richardson extrapolation and the grid convergence index method for modeling confidence. As a result, the calculated current speed showed very good agreement with the experimental data, indicating that the current CFD methods are suitable for predicting density gradient stratified flow phenomena in the air-ingress accident.

  17. ISOTHERMAL AIR INGRESS VALIDATION EXPERIMENTS AT IDAHO NATIONAL LABORATORY: DESCRIPTION AND SUMMARY OF DATA

    International Nuclear Information System (INIS)

    Oh, Chang H.; Kim, Eung S.

    2010-01-01

    Idaho National Laboratory performed air ingress experiments as part of validating a computational fluid dynamics (CFD) code. An isothermal stratified flow experiment was designed and set up to understand stratified flow phenomena in the very high temperature gas cooled reactor (VHTR) and to provide experimental data for validating computer codes. The isothermal experiment focused on three flow characteristics unique to the VHTR air-ingress accident: stratified flow in the horizontal pipe, stratified flow expansion at the pipe and vessel junction, and stratified flow around supporting structures. Brine and sucrose solutions were used as the heavy fluids and water was used as the light fluid. The density ratios were varied between 0.87 and 0.98. This experiment clearly showed that a stratified flow between heavy and light fluids is generated even for very small density differences. The code was validated by conducting blind CFD simulations and comparing the results to the experimental data. A grid sensitivity study was also performed based on the Richardson extrapolation and the grid convergence index method for modeling confidence. As a result, the calculated current speed showed very good agreement with the experimental data, indicating that the current CFD methods are suitable for predicting density gradient stratified flow phenomena in the air-ingress accident.
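
    The leading-order physics here is a lock-exchange gravity current, whose front speed is often estimated as u ≈ 0.5·sqrt(g′H), with g′ the reduced gravity (Benjamin's classical result). The sketch below applies that estimate across the quoted density-ratio range; the channel height and the use of this particular correlation are assumptions for illustration, not the INL analysis.

```python
import numpy as np

g, H = 9.81, 0.2              # gravity (m/s^2) and an assumed channel height (m)
rho_light = 998.0             # water as the light fluid, kg/m^3

for ratio in (0.87, 0.90, 0.95, 0.98):   # rho_light / rho_heavy, as in the tests
    rho_heavy = rho_light / ratio
    g_prime = g * (rho_heavy - rho_light) / rho_heavy   # reduced gravity
    u_front = 0.5 * np.sqrt(g_prime * H)                # lock-exchange estimate
    print(f"density ratio {ratio:.2f}: front speed ~ {u_front:.3f} m/s")
```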

  18. EPIC Calibration/Validation Experiment Field Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Steven E [National Severe Storm Laboratory/NOAA; Chilson, Phillip [University of Oklahoma; Argrow, Brian [University of Colorado

    2017-03-15

    A field exercise involving several different kinds of Unmanned Aerial Systems (UAS) and supporting instrumentation systems provided by DOE/ARM and NOAA/NSSL was conducted at the ARM SGP site in Lamont, Oklahoma on 29-30 October 2016. This campaign was part of a larger National Oceanic and Atmospheric Administration (NOAA) UAS Program Office program awarded to the National Severe Storms Laboratory (NSSL), named Environmental Profiling and Initiation of Convection (EPIC). The EPIC Field Campaign (Test and Calibration/Validation) proposed to ARM was a test or “dry run” for a follow-up campaign to be requested for spring/summer 2017. The EPIC project addresses NOAA’s objective to “evaluate options for UAS profiling of the lower atmosphere with applications for severe weather.” The project goal is to demonstrate that fixed-wing and rotary-wing small UAS have the combined potential to provide a unique observing system capable of providing detailed profiles of temperature, moisture, and winds within the atmospheric boundary layer (ABL) to help determine the potential for severe weather development. Specific project objectives are: 1) to develop small UAS capable of acquiring needed wind and thermodynamic profiles and transects of the ABL using one fixed-wing UAS operating in tandem with two different rotary-wing UAS pairs; 2) to adapt and test miniaturized, high-precision, and fast-response atmospheric sensors with high accuracy in strong winds characteristic of the pre-convective ABL in Oklahoma; 3) to conduct targeted short-duration experiments at the ARM Southern Great Plains site in northern Oklahoma concurrently with a second site to be chosen in “real time” from the Oklahoma Mesonet in coordination with the National Weather Service (NWS) Norman Forecast Office; and 4) to gain valuable experience in pursuit of NOAA’s goals for determining the value of airborne, mobile observing systems for monitoring rapidly evolving high-impact severe weather.

  19. Validation of the measure automobile emissions model : a statistical analysis

    Science.gov (United States)

    2000-09-01

    The Mobile Emissions Assessment System for Urban and Regional Evaluation (MEASURE) model provides an external validation capability for the hot stabilized option; the model is one of several new modal emissions models designed to predict hot stabilized emissions...

  20. Dental models made with an intraoral scanner: A validation study.

    NARCIS (Netherlands)

    Cuperus, A.M.; Harms, M.C.; Rangel, F.A.; Bronkhorst, E.M.; Schols, J.G.J.H.; Breuning, K.H.

    2012-01-01

    INTRODUCTION: Our objectives were to determine the validity and reproducibility of measurements on stereolithographic models and 3-dimensional digital dental models made with an intraoral scanner. METHODS: Ten dry human skulls were scanned; from the scans, stereolithographic models and digital

  1. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
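
    The headline comparison metric above, annualized prediction error, is simply the relative difference between modeled and measured annual energy totals. A minimal sketch on mock monthly production data follows; the numbers are invented, not SAM output or the field measurements.

```python
import numpy as np

# Hypothetical monthly energy production (kWh) for one year.
modeled = np.array([410, 455, 530, 560, 600, 615, 620, 590, 520, 470, 415, 395.0])
measured = np.array([400, 460, 545, 550, 610, 630, 610, 585, 525, 460, 420, 405.0])

# Annualized prediction error: signed relative difference of annual totals.
annualized_error = (modeled.sum() - measured.sum()) / measured.sum()
print(f"annualized prediction error: {100 * annualized_error:+.1f}%")
```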

  2. The structural validity of the Experience of Work and Life Circumstances Questionnaire (WLQ)

    Directory of Open Access Journals (Sweden)

    Pieter Schaap

    2016-09-01

    Orientation: Best practice frameworks suggest that an assessment practitioner’s choice of an assessment tool should be based on scientific evidence that underpins the appropriate and just use of the instrument. This is a context-specific validity study involving a classified psychological instrument against the background of South African regulatory frameworks and contemporary validity theory principles. Research purpose: The aim of the study was to explore the structural validity of the Experience of Work and Life Circumstances Questionnaire (WLQ) administered to employees in the automotive assembly plant of a South African automotive manufacturing company. Motivation for the study: Although the WLQ has been used by registered health practitioners and numerous researchers, evidence to support the structural validity is lacking. This study, therefore, addressed the need for context-specific empirical support for the validity of score inferences in respect of employees in a South African automotive manufacturing plant. Research design, approach and method: The research was conducted using a convenience sample (N = 217) taken from the automotive manufacturing company where the instrument was used. Reliability and factor analyses were carried out to explore the structural validity of the WLQ. Main findings: The reliability of the WLQ appeared to be acceptable, and the assumptions made about unidimensionality were mostly confirmed. One of the proposed higher-order structural models of the said questionnaire administered to the sample group was confirmed, whereas the other one was partially confirmed. Practical/managerial implications: The conclusion reached was that preliminary empirical grounds existed for considering the continued use of the WLQ (with some suggested refinements) by the relevant company, provided the process of accumulating a body of validity evidence continued. Contribution/value-add: This study identified some of the difficulties

  3. Data Set for Emperical Validation of Double Skin Facade Model

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    the model is empirically validated and its limitations for DSF modeling are identified. Correspondingly, the existence and availability of experimental data are essential. Three sets of accurate empirical data for validation of DSF modeling with building simulation software were produced within...

  4. Proceedings of the workshop on integral experiment covariance data for critical safety validation

    Energy Technology Data Exchange (ETDEWEB)

    Stuke, Maik (ed.)

    2016-04-15

    For some time, attempts to quantify the statistical dependencies of critical experiments and to account for them properly in validation procedures have been discussed in the literature by various groups. Besides the development of suitable methods, the quality and modeling issues of the freely available experimental data are especially in the focus of current discussions, carried out for example in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD-NEA Nuclear Science Committee. The same committee also compiles and publishes the freely available experimental data in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. Most of these experiments were performed as series and might share parts of their experimental setups, leading to correlated results. The quality of the determination of these correlations and of the underlying covariance data depends strongly on the quality of the documentation of the experiments.

  5. Proceedings of the workshop on integral experiment covariance data for critical safety validation

    International Nuclear Information System (INIS)

    Stuke, Maik

    2016-04-01

    For some time, attempts to quantify the statistical dependencies of critical experiments and to account for them properly in validation procedures have been discussed in the literature by various groups. Besides the development of suitable methods, the quality and modeling issues of the freely available experimental data are especially in the focus of current discussions, carried out for example in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD-NEA Nuclear Science Committee. The same committee also compiles and publishes the freely available experimental data in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. Most of these experiments were performed as series and might share parts of their experimental setups, leading to correlated results. The quality of the determination of these correlations and of the underlying covariance data depends strongly on the quality of the documentation of the experiments.
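
    The mechanism described above — experiments in a series sharing parts of their setup — can be made concrete with a minimal sketch: a systematic uncertainty component common to all experiments in a series, plus an independent component per experiment, yields a full covariance (and correlation) matrix for the benchmark results. The uncertainty magnitudes below are invented for illustration.

```python
import numpy as np

# Assumed 1-sigma uncertainties on a series of k_eff benchmarks:
sigma_shared = 0.0015   # systematic component common to the whole series
sigma_indep = np.array([0.0010, 0.0012, 0.0009, 0.0011])   # per experiment

n = len(sigma_indep)
# Shared component fills every entry; independent components add on the diagonal.
cov = np.full((n, n), sigma_shared**2) + np.diag(sigma_indep**2)

# Correlation matrix: off-diagonal terms are the experiment-to-experiment correlations.
corr = cov / np.sqrt(np.outer(np.diag(cov), np.diag(cov)))
print(np.round(corr, 2))
```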

  6. Validation of thermal models for a prototypical MEMS thermal actuator.

    Energy Technology Data Exchange (ETDEWEB)

    Gallis, Michail A.; Torczynski, John Robert; Piekos, Edward Stanley; Serrano, Justin Raymond; Gorby, Allen D.; Phinney, Leslie Mary

    2008-09-01

    This report documents technical work performed to complete the ASC Level 2 Milestone 2841: validation of thermal models for a prototypical MEMS thermal actuator. This effort requires completion of the following task: the comparison between calculated and measured temperature profiles of a heated stationary microbeam in air. Such heated microbeams are prototypical structures in virtually all electrically driven microscale thermal actuators. This task is divided into four major subtasks. (1) Perform validation experiments on prototypical heated stationary microbeams in which material properties such as thermal conductivity and electrical resistivity are measured if not known and temperature profiles along the beams are measured as a function of electrical power and gas pressure. (2) Develop a noncontinuum gas-phase heat-transfer model for typical MEMS situations including effects such as temperature discontinuities at gas-solid interfaces across which heat is flowing, and incorporate this model into the ASC FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (3) Develop a noncontinuum solid-phase heat transfer model for typical MEMS situations including an effective thermal conductivity that depends on device geometry and grain size, and incorporate this model into the FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (4) Perform combined gas-solid heat-transfer simulations using Calore with these models for the experimentally investigated devices, and compare simulation and experimental temperature profiles to assess model accuracy. These subtasks have been completed successfully, thereby completing the milestone task. Model and experimental temperature profiles are found to be in reasonable agreement for all cases examined. Modest systematic differences appear to be related to uncertainties in the geometric dimensions of the test structures and in the thermal conductivity of the

  7. Contaminant transport model validation: The Oak Ridge Reservation

    International Nuclear Information System (INIS)

    Lee, R.R.; Ketelle, R.H.

    1988-09-01

    In the complex geologic setting of the Oak Ridge Reservation, hydraulic conductivity is anisotropic and flow is strongly influenced by an extensive and largely discontinuous fracture network. Difficulties in describing and modeling the aquifer system prompted a study to obtain aquifer property data to be used in a groundwater flow model validation experiment. Characterization studies included the performance of an extensive suite of aquifer tests within a 600-square-meter area to obtain aquifer property values describing the flow field in detail. Following the aquifer tests, a groundwater tracer test was performed under ambient conditions to verify the aquifer analysis. Tracer migration data in the near-field were used in model calibration to predict tracer arrival time and concentration in the far-field. Despite the extensive aquifer testing, initial modeling inaccurately predicted the tracer migration direction. Initial tracer migration rates were consistent with those predicted by the model; however, changing environmental conditions resulted in an unanticipated decay in tracer movement. Evaluation of the predictive accuracy of groundwater flow and contaminant transport models on the Oak Ridge Reservation depends on defining the resolution required, followed by field testing and model grid definition at compatible scales. The use of tracer tests, both as a characterization method and to verify model results, provides the highest level of resolution of groundwater flow characteristics. 3 refs., 4 figs

  8. Correction for misclassification of caries experience in the absence of internal validation data.

    Science.gov (United States)

    Mutsvari, T; Declerck, D; Lesaffre, E

    2013-11-01

    To quantify the effects of risk factors and/or determinants on disease occurrence, it is important that the risk factors as well as the variable that measures the disease outcome are recorded with as little error as possible. When investigating the factors that influence a binary outcome, a logistic regression model is often fitted under the assumption that the data are collected without error. However, most categorical outcomes (e.g., caries experience) are accompanied by misclassification, and this needs to be accounted for. The aim of this research was to adjust for binary outcome misclassification using an external validation study when investigating factors influencing caries experience in schoolchildren. Data from the Signal Tandmobiel(®) study were used. A total of 500 children from the main study and 148 from the validation study were included in the analysis. Regression models (with several covariates) for sensitivity and specificity were used to adjust for misclassification in the main data. The use of sensitivity and specificity modeled as functions of several covariates resulted in a better correction compared to using point estimates of sensitivity and specificity. Age, geographical location of the school to which the child belongs, dentition type, tooth type, and surface type were significantly associated with the prevalence of caries experience. Sensitivity and specificity calculated on the basis of an external validation study may resemble those obtained from an internal study if conditioned on a rich set of covariates. Main data can be corrected for misclassification using information obtained from an external validation study when a rich set of covariates is recorded during calibration.
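
    A minimal sketch of the underlying correction: with known sensitivity and specificity, an observed prevalence can be adjusted with the standard Rogan-Gladen estimator, and the covariate-dependent approach advocated above amounts to applying such a correction within covariate strata. The Se/Sp and prevalence values below are invented, not the Signal Tandmobiel(®) estimates.

```python
import numpy as np

def corrected_prevalence(p_obs: float, se: float, sp: float) -> float:
    """Rogan-Gladen correction of an observed binary prevalence."""
    p = (p_obs + sp - 1.0) / (se + sp - 1.0)
    return float(np.clip(p, 0.0, 1.0))   # keep the estimate in [0, 1]

# e.g., 40% observed caries experience scored with Se = 0.75, Sp = 0.95
print(corrected_prevalence(0.40, se=0.75, sp=0.95))   # -> 0.50
```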

  9. Validation of NEPTUNE-CFD on ULPU-V experiments

    Energy Technology Data Exchange (ETDEWEB)

    Jamet, Mathieu, E-mail: mathieu.jamet@edf.fr; Lavieville, Jerome; Atkhen, Kresna; Mechitoua, Namane

    2015-11-15

    In-vessel retention (IVR) of molten corium through external cooling of the reactor pressure vessel is one possible means of severe accident mitigation for a class of nuclear power plants. The aim is to successfully terminate the progression of a core melt within the reactor vessel. The probability of success depends on the efficacy of the cooling strategy; hence one of the key aspects of an IVR demonstration relates to the heat removal capability through the vessel wall by convection and boiling in the external water flow. This is only possible if the in-vessel thermal loading is lower than the local critical heat flux expected along the outer wall of the vessel, which is in turn highly dependent on the flow characteristics between the vessel and the insulator. The NEPTUNE-CFD multiphase flow solver is used to obtain a better understanding at local scale of the thermal hydraulics involved in this situation. The validation of the NEPTUNE-CFD code on the ULPU-V facility experiments carried out at the University of California Santa Barbara is presented as a first attempt of using CFD codes at EDF to address such an issue. Two types of computation are performed. On the one hand, a steady state algorithm is used to compute natural circulation flow rates and differential pressures and, on the other, a transient algorithm computation reveals the oscillatory nature of the pressure data recorded in the ULPU facility. Several dominant frequencies are highlighted. In both cases, the CFD simulations reproduce reasonably well the experimental data for these quantities.

  10. New validation metrics for models with multiple correlated responses

    International Nuclear Information System (INIS)

    Li, Wei; Chen, Wei; Jiang, Zhen; Lu, Zhenzhou; Liu, Yu

    2014-01-01

    Validating models with correlated multivariate outputs involves the comparison of multiple stochastic quantities. Considering both uncertainty and correlations among multiple responses from model and physical observations imposes challenges. Existing marginal comparison methods and the hypothesis testing-based methods either ignore correlations among responses or only reach Boolean conclusions (yes or no) without accounting for the amount of discrepancy between a model and the underlying reality. A new validation metric is needed to quantitatively characterize the overall agreement of multiple responses considering correlations among responses and uncertainty in both model predictions and physical observations. In this paper, by extending the concept of “area metric” and the “u-pooling method” developed for validating a single response, we propose new model validation metrics for validating correlated multiple responses using the multivariate probability integral transformation (PIT). One new metric is the PIT area metric for validating multi-responses at a single validation site. The other is the t-pooling metric that allows for pooling observations of multiple responses observed at multiple validation sites to assess the global predictive capability. The proposed metrics have many favorable properties that are well suited for validation assessment of models with correlated responses. The two metrics are examined and compared with the direct area metric and the marginal u-pooling method respectively through numerical case studies and an engineering example to illustrate their validity and potential benefits
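    The univariate u-pooling idea that the paper extends can be sketched as follows: transform observations through the model's CDF and measure the area between the empirical CDF of the pooled PIT values and the uniform CDF. The multivariate PIT generalization for correlated responses is not shown; the model and data below are assumed for illustration:

    ```python
    # Area metric on pooled PIT values u = F_model(y_obs); 0 means perfect agreement.
    import numpy as np
    from scipy import stats

    def area_metric_pit(model_cdf, observations):
        u = np.sort(model_cdf(np.asarray(observations)))
        n = len(u)

        def seg(a, b, c):
            # integral of |c - x| over [a, b] (ECDF value c is constant there)
            if c <= a:
                return 0.5 * ((b - c) ** 2 - (a - c) ** 2)
            if c >= b:
                return 0.5 * ((c - a) ** 2 - (c - b) ** 2)
            return 0.5 * ((c - a) ** 2 + (b - c) ** 2)

        pts = np.concatenate([[0.0], u, [1.0]])
        return sum(seg(pts[i], pts[i + 1], i / n) for i in range(n + 1))

    # Model claims N(0,1); reality is N(0.5,1), so the area metric is clearly nonzero.
    obs = stats.norm(0.5, 1).rvs(size=500, random_state=0)
    print(f"area metric = {area_metric_pit(stats.norm(0, 1).cdf, obs):.3f}")
    ```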

  11. Validating modeled turbulent heat fluxes across large freshwater surfaces

    Science.gov (United States)

    Lofgren, B. M.; Fujisaki-Manome, A.; Gronewold, A.; Anderson, E. J.; Fitzpatrick, L.; Blanken, P.; Spence, C.; Lenters, J. D.; Xiao, C.; Charusambot, U.

    2017-12-01

    Turbulent fluxes of latent and sensible heat are important physical processes that influence the energy and water budgets of the Great Lakes. Validation and improvement of bulk flux algorithms to simulate these turbulent heat fluxes are critical for accurate prediction of hydrodynamics, water levels, weather, and climate over the region. Here we consider five heat flux algorithms from several model systems: the Finite-Volume Community Ocean Model, the Weather Research and Forecasting model, and the Large Lake Thermodynamics Model, which are used in research and operational environments and concentrate on different aspects of the Great Lakes' physical system, but interface at the lake surface. The heat flux algorithms were isolated from each model and driven by meteorological data from over-lake stations in the Great Lakes Evaporation Network. The simulation results were compared with eddy covariance flux measurements at the same stations. All models show the capacity to capture the seasonal cycle of the turbulent heat fluxes. Overall, the Coupled Ocean Atmosphere Response Experiment algorithm in FVCOM has the best agreement with eddy covariance measurements. Simulations with the other four algorithms are overall improved by updating the parameterization of the roughness length scales of temperature and humidity. Agreement between modelled and observed fluxes varied notably with the geographical locations of the stations. For example, at the Long Point station in Lake Erie, observed fluxes are likely influenced by the upwind land surface while the simulations do not account for the land surface influence, and therefore the agreement there is generally worse.
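    For orientation, a textbook bulk aerodynamic formulation is sketched below; the constant transfer coefficients are placeholder assumptions, whereas algorithms such as COARE make them functions of stability and of the roughness lengths discussed above:

    ```python
    # Minimal bulk-flux sketch (not one of the five algorithms compared in the study).
    RHO = 1.2      # air density, kg m-3
    CP = 1004.0    # specific heat of air, J kg-1 K-1
    LV = 2.5e6     # latent heat of vaporization, J kg-1

    def bulk_fluxes(u, t_sfc, t_air, q_sfc, q_air, ch=1.3e-3, ce=1.3e-3):
        """Sensible (H) and latent (LE) heat fluxes in W m-2 from mean quantities.
        ch, ce are bulk transfer coefficients (here constants for illustration)."""
        h = RHO * CP * ch * u * (t_sfc - t_air)    # sensible heat flux
        le = RHO * LV * ce * u * (q_sfc - q_air)   # latent heat flux
        return h, le

    # Wind 8 m/s, lake 4 K warmer than air, surface specific humidity above air value:
    print(bulk_fluxes(u=8.0, t_sfc=12.0, t_air=8.0, q_sfc=0.0088, q_air=0.0052))
    ```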

  12. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  13. Development and Validation of an Instrument for Assessing Patient Experience of Chronic Illness Care

    Directory of Open Access Journals (Sweden)

    José Joaquín Mira

    2016-08-01

    Introduction: The experience of chronic patients with the care they receive, fuelled by the focus on patient-centeredness and the increasing evidence on its positive relation with other dimensions of quality, is being acknowledged as a key element in improving the quality of care. There is a dearth of accepted tools and metrics to assess patient experience from the patient's perspective that have been adapted to the new chronic care context: continued, systemic, with multidisciplinary teams and new technologies. Methods: Development and validation of a scale through a literature review, expert panel, and pilot and field studies with 356 chronic primary care patients, to assess content and face validity and reliability. Results: IEXPAC is an 11+1 item scale with adequate metric properties measured by Cronbach's alpha and a goodness-of-fit index, and satisfactory convergent validity around three factors named: productive interactions, new relational model and person's self-management. Conclusions: IEXPAC allows measurement of the patient experience of chronic illness care. Together with other indicators, IEXPAC can determine the quality of care provided according to the Triple Aim framework, facilitating health systems reorientation towards integrated patient-centred care.
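    For illustration, the internal-consistency statistic named above (Cronbach's alpha) can be computed as follows; the item responses here are synthetic stand-ins, not IEXPAC data:

    ```python
    # Cronbach's alpha for a multi-item scale.
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, n_items) array of scored responses."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()     # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)      # variance of total score
        return k / (k - 1) * (1 - item_var / total_var)

    rng = np.random.default_rng(1)
    latent = rng.normal(size=(356, 1))                 # one underlying trait
    items = latent + 0.8 * rng.normal(size=(356, 11))  # 11 correlated items
    print(f"alpha = {cronbach_alpha(items):.2f}")      # > 0.7 is a common threshold
    ```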

  14. Validation of 2D flood models with insurance claims

    Science.gov (United States)

    Zischg, Andreas Paul; Mosimann, Markus; Bernet, Daniel Benjamin; Röthlisberger, Veronika

    2018-02-01

    Flood impact modelling requires reliable models for the simulation of flood processes. In recent years, flood inundation models have been remarkably improved and widely used for flood hazard simulation and for flood exposure and loss analyses. In this study, we validate a 2D inundation model for the purpose of flood exposure analysis at the river reach scale. We validate the BASEMENT simulation model with insurance claims using conventional validation metrics. The flood model is established on the basis of available topographic data at high spatial resolution for four test cases. The validation metrics were calculated with two different datasets: a dataset of event documentation reporting flooded areas and a dataset of insurance claims. The model fit relating to insurance claims is, in three out of four test cases, slightly lower than the model fit computed on the basis of the observed inundation areas. This comparison between two independent validation datasets suggests that validation metrics using insurance claims are comparable to conventional validation data, such as the flooded area. However, a validation on the basis of insurance claims might be more conservative in cases where model errors are more pronounced in areas with a high density of values at risk.
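    One conventional validation metric for binary inundation maps is the fit score F = |modelled ∩ observed| / |modelled ∪ observed|; a sketch on synthetic rasters follows (the same idea applies when the observations are rasterized claim locations):

    ```python
    # Inundation "model fit" on boolean rasters of flooded cells.
    import numpy as np

    def flood_fit(modelled, observed):
        inter = np.logical_and(modelled, observed).sum()
        union = np.logical_or(modelled, observed).sum()
        return inter / union if union else np.nan

    rng = np.random.default_rng(2)
    observed = rng.random((100, 100)) < 0.2                 # observed flooded cells
    modelled = observed ^ (rng.random((100, 100)) < 0.05)   # imperfect model map
    print(f"F = {flood_fit(modelled, observed):.2f}")       # 1.0 would be a perfect match
    ```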

  15. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van 't; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
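    A simplified sketch of this recipe (single rather than the paper's double cross-validation, with synthetic stand-ins for the dosimetric predictors and the xerostomia outcome), assuming scikit-learn is available:

    ```python
    # L1-penalized (LASSO-type) logistic model assessed by cross-validated AUC,
    # with a permutation test for statistical significance.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    X = rng.normal(size=(200, 20))                     # candidate dosimetric features
    y = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))    # complication indicator

    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()

    perm_aucs = []
    for _ in range(200):                               # permutation null distribution
        y_perm = rng.permutation(y)
        perm_aucs.append(cross_val_score(model, X, y_perm, cv=5,
                                         scoring="roc_auc").mean())
    p_value = np.mean(np.asarray(perm_aucs) >= auc)
    print(f"AUC = {auc:.2f}, permutation p = {p_value:.3f}")
    ```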

  16. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining from the Knowledge Workers Productivity Approach. It is hoped that this paper will help managers to implement different corresponding measures. A case study is presented in which this model is measured and validated at the ...

  17. Validation of mentorship model for newly qualified professional ...

    African Journals Online (AJOL)

    Newly qualified professional nurses (NQPNs) allocated to community health care services require the use of a validated model to practice independently. Validation was done to adapt the model and assess whether it is understood and could be implemented by NQPNs and mentors employed in community health care services.

  18. Validation and Adaptation of Router and Switch Models

    NARCIS (Netherlands)

    Boltjes, B.; Fernandez Diaz, I.; Kock, B.A.; Langeveld, R.J.G.M.; Schoenmaker, G.

    2003-01-01

    This paper describes validating OPNET models of key devices for the next generation IP-based tactical network of the Royal Netherlands Army (RNLA). The task of TNO-FEL is to provide insight into the scalability and performance of future deployed networks. Because validated models of key Cisco equipment

  19. Model validation through long-term promising sustainable maize/pigeon pea residue management in Malawi

    NARCIS (Netherlands)

    Mwale, C.D.; Kabambe, V.H.; Sakale, W.D.; Giller, K.E.; Kauwa, A.A.; Ligowe, I.; Kamalongo, D.

    2013-01-01

    In the 2005/2006 season, the Model Validation Through Long-Term Promising Sustainable Maize/Pigeon Pea Residue Management experiment was in the 11th year at Chitedze and Chitala, and in the 8th year at Makoka and Zombwe. The experiment was a split-plot design with cropping system as the main plot

  20. Validation of Hydrodynamic Numerical Model of a Pitching Wave Energy Converter

    DEFF Research Database (Denmark)

    López, Maria del Pilar Heras; Thomas, Sarah; Kramer, Morten Mejlhede

    2017-01-01

    Validation of numerical models is essential in the development of new technologies. Commercial software and codes available for simulating wave energy converters (WECs) have not yet been proven to work for all available and upcoming technologies. The present paper presents the first stages of the validation process of a hydrodynamic numerical model for a pitching wave energy converter. The development of dry tests, wave flume and wave basin experiments is explained, lessons learned are shared and results are presented.

  1. Generation of integral experiment covariance data and their impact on criticality safety validation

    Energy Technology Data Exchange (ETDEWEB)

    Stuke, Maik; Peters, Elisabeth; Sommer, Fabian

    2016-11-15

    The quantification of statistical dependencies in data of critical experiments and how to account for them properly in validation procedures has been discussed in the literature by various groups. However, these subjects are still an active topic in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD/NEA Nuclear Science Committee. The latter compiles and publishes the freely available experimental data collection, the International Handbook of Evaluated Criticality Safety Benchmark Experiments, ICSBEP. Most of the experiments were performed as series and share parts of experimental setups, consequently leading to correlation effects in the results. The correct consideration of correlated data seems to be inevitable if the experimental data in a validation procedure is limited or one cannot rely on a sufficient number of uncorrelated data sets, e.g. from different laboratories using different setups. The general determination of correlations and the underlying covariance data as well as the consideration of them in a validation procedure is the focus of the following work. We discuss and demonstrate possible effects on calculated k_eff's, their uncertainties, and the corresponding covariance matrices due to interpretation of evaluated experimental data and its translation into calculation models. The work shows effects of various modeling approaches, varying distribution functions of parameters and compares and discusses results from the applied Monte-Carlo sampling method with available data on correlations. Our findings indicate that for the reliable determination of integral experimental covariance matrices or the correlation coefficients a detailed study of the underlying experimental data, the modeling approach and assumptions made, and the resulting sensitivity analysis seems to be inevitable. Further, a Bayesian method is discussed to include integral experimental covariance data when estimating an application
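    A toy Monte-Carlo illustration of the mechanism at work here: a parameter shared across a series of experiments induces off-diagonal terms in the k_eff covariance matrix. All sensitivities below are invented for illustration:

    ```python
    # Sampled k_eff values for three experiments sharing one systematic error source.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 10_000
    shared = rng.normal(0.0, 1.0, n)            # e.g. a common enrichment error
    indiv = rng.normal(0.0, 1.0, (n, 3))        # experiment-specific errors

    # Invented sensitivities of each experiment to the two error types:
    s_shared = np.array([0.003, 0.003, 0.002])
    s_indiv = np.array([0.001, 0.002, 0.001])
    keff = 1.0 + shared[:, None] * s_shared + indiv * s_indiv

    # Large off-diagonal correlation coefficients emerge from the shared parameter.
    print(np.corrcoef(keff, rowvar=False).round(2))
    ```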

  2. Generation of integral experiment covariance data and their impact on criticality safety validation

    International Nuclear Information System (INIS)

    Stuke, Maik; Peters, Elisabeth; Sommer, Fabian

    2016-11-01

    The quantification of statistical dependencies in data of critical experiments and how to account for them properly in validation procedures has been discussed in the literature by various groups. However, these subjects are still an active topic in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD/NEA Nuclear Science Committee. The latter compiles and publishes the freely available experimental data collection, the International Handbook of Evaluated Criticality Safety Benchmark Experiments, ICSBEP. Most of the experiments were performed as series and share parts of experimental setups, consequently leading to correlation effects in the results. The correct consideration of correlated data seems to be inevitable if the experimental data in a validation procedure is limited or one cannot rely on a sufficient number of uncorrelated data sets, e.g. from different laboratories using different setups. The general determination of correlations and the underlying covariance data as well as the consideration of them in a validation procedure is the focus of the following work. We discuss and demonstrate possible effects on calculated k_eff's, their uncertainties, and the corresponding covariance matrices due to interpretation of evaluated experimental data and its translation into calculation models. The work shows effects of various modeling approaches, varying distribution functions of parameters and compares and discusses results from the applied Monte-Carlo sampling method with available data on correlations. Our findings indicate that for the reliable determination of integral experimental covariance matrices or the correlation coefficients a detailed study of the underlying experimental data, the modeling approach and assumptions made, and the resulting sensitivity analysis seems to be inevitable. Further, a Bayesian method is discussed to include integral experimental covariance data when estimating an application

  3. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for the study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific hydrodynamic interaction with structures. The examples also show that numerical model development benefits from international co-operation and sharing of high quality results.

  4. Empirical validation of an agent-based model of wood markets in Switzerland

    Science.gov (United States)

    Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver

    2018-01-01

    We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Own surveys closed gaps where data was not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300

  5. Non-Linear Slosh Damping Model Development and Validation

    Science.gov (United States)

    Yang, H. Q.; West, Jeff

    2015-01-01

    Propellant tank slosh dynamics are typically represented by a mechanical spring-mass-damper model. This mechanical model is then included in the equation of motion of the entire vehicle for Guidance, Navigation and Control (GN&C) analysis. For a partially-filled smooth wall propellant tank, the critical damping based on classical empirical correlation is as low as 0.05%. Due to this low value of damping, propellant slosh is a potential source of disturbance critical to the stability of launch and space vehicles. It is postulated that the commonly quoted slosh damping is valid only in the linear regime where the slosh amplitude is small. With the increase of slosh amplitude, the critical damping value should also increase. If this nonlinearity can be verified and validated, the slosh stability margin can be significantly improved, and the level of conservatism maintained in the GN&C analysis can be lessened. The purpose of this study is to explore and to quantify the dependence of slosh damping on slosh amplitude. Accurately predicting the extremely low damping value of a smooth wall tank is very challenging for any Computational Fluid Dynamics (CFD) tool. One must resolve thin boundary layers near the wall and limit numerical damping to a minimum. This computational study demonstrates that with proper grid resolution, CFD can indeed accurately predict the low damping physics from smooth walls in the linear regime. Comparisons of extracted damping values with experimental data for different tank sizes show very good agreement. Numerical simulations confirm that slosh damping is indeed a function of slosh amplitude. When slosh amplitude is low, the damping ratio is essentially constant, which is consistent with the empirical correlation. Once the amplitude reaches a critical value, the damping ratio becomes a linearly increasing function of the slosh amplitude. A follow-on experiment validated the developed nonlinear damping relationship. This discovery can
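    The amplitude dependence described above can be written as a piecewise relation; the functional form follows the abstract, while all constants below are placeholders, not the study's calibrated values:

    ```python
    # Amplitude-dependent slosh damping: constant in the linear regime,
    # linearly increasing above a critical amplitude.
    def slosh_damping(amplitude, zeta0=0.0005, a_crit=0.1, slope=0.004):
        """Damping ratio vs. non-dimensional slosh amplitude (placeholder constants)."""
        if amplitude <= a_crit:
            return zeta0
        return zeta0 + slope * (amplitude - a_crit)

    for a in (0.05, 0.1, 0.2, 0.4):
        print(f"amplitude {a:.2f} -> damping ratio {slosh_damping(a):.5f}")
    ```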

  6. A Report on the Validation of Beryllium Strength Models

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Derek Elswick [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-05

    This report discusses work on validating beryllium strength models with flyer plate and Taylor rod experimental data. Strength models are calibrated with Hopkinson bar and quasi-static data. The Hopkinson bar data for beryllium provide strain rates up to about 4000 per second. A limitation of the Hopkinson bar data for beryllium is that they only provide information on strain up to about 0.15. The lack of high strain data at high strain rates makes it difficult to distinguish between various strength model settings. The PTW model has been calibrated many different times over the last 12 years. The lack of high strain data for high strain rates has resulted in these calibrated PTW models for beryllium exhibiting significantly different behavior when extrapolated to high strain. For beryllium, the α parameter of PTW has recently been calibrated to high precision shear modulus data. In the past the α value for beryllium was set based on expert judgment. The new α value for beryllium was used in a calibration of the beryllium PTW model by Sky Sjue. The calibration by Sjue used EOS table information to model the temperature dependence of the heat capacity. Also, the calibration by Sjue used EOS table information to model the density changes of the beryllium sample during the Hopkinson bar and quasi-static experiments. In this paper, the calibrated PTW model by Sjue is compared against experimental data and other strength models. The other strength models being considered are a PTW model calibrated by Shuh-Rong Chen and a Steinberg-Guinan type model by John Pedicini. The three strength models are used in a comparison against flyer plate and Taylor rod data. The results show that the Chen PTW model provides better agreement with these data. The Chen PTW model settings have been previously adjusted to provide a better fit to flyer plate data, whereas the Sjue PTW model has not been changed based on flyer plate data. However, the Sjue model provides a reasonable fit to

  7. Measuring the experience of hospitality : Scale development and validation

    NARCIS (Netherlands)

    Pijls-Hoekstra, Ruth; Groen, Brenda H.; Galetzka, Mirjam; Pruyn, Adriaan T.H.

    2017-01-01

    This paper identifies what customers experience as hospitality and subsequently presents a novel and compact assessment scale for measuring customers’ experience of hospitality at any kind of service organization. The Experience of Hospitality Scale (EH-Scale) takes a broader perspective compared to

  8. Hydraulic Hybrid Excavator—Mathematical Model Validation and Energy Analysis

    Directory of Open Access Journals (Sweden)

    Paolo Casoli

    2016-11-01

    Recent demands to reduce pollutant emissions and improve energy efficiency have driven the implementation of hybrid solutions in mobile machinery. This paper presents the results of a numerical and experimental analysis conducted on a hydraulic hybrid excavator (HHE). The machinery under study is a mid-size excavator, whose standard version was modified with the introduction of an energy recovery system (ERS). The proposed ERS layout was designed to recover the potential energy of the boom, using a hydraulic accumulator as a storage device. The recovered energy is utilized through the pilot pump of the machinery, which operates as a motor, thus reducing the torque required from the internal combustion engine (ICE). The analysis reported in this paper validates the HHE model by comparing numerical and experimental data in terms of hydraulic and mechanical variables and fuel consumption. The mathematical model shows its capability to reproduce the realistic operating conditions of the realized prototype, tested in the field. A detailed energy analysis comparing the standard and the hybrid excavator models was carried out to evaluate the energy flows along the system, showing advantages, weaknesses and possibilities to further improve the machinery efficiency. Finally, the fuel consumption estimated by the model and that measured during the experiments are presented to highlight the fuel saving percentages. The HHE model is an important starting point for the development of other energy saving solutions.

  9. First experience from in-core sensor validation based on correlation and neuro-fuzzy techniques

    International Nuclear Information System (INIS)

    Figedy, S.

    2011-01-01

    In this work new types of nuclear reactor in-core sensor validation methods are outlined. The first one is based on a combination of correlation coefficients and mutual information indices, which reflect the correlation of signals in linear and nonlinear regions. The method may be supplemented by wavelet transform based signal feature extraction and pattern recognition by artificial neural networks, as well as fuzzy logic based decision making. The second one is based on neuro-fuzzy modeling of residuals between experimental values and their theoretical counterparts obtained from reactor core simulator calculations. The first experience with this approach is described and further improvements to enhance the outcome reliability are proposed (Author)
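    The two signal-consistency indices named for the first method can be sketched as follows, pairing a linear correlation coefficient with a naive histogram-based mutual information estimate; the sensor signals below are synthetic:

    ```python
    # Correlation coefficient and plug-in mutual information for a sensor pair.
    import numpy as np

    def mutual_information(x, y, bins=32):
        """Naive plug-in MI estimate in nats from a 2-D histogram."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    rng = np.random.default_rng(5)
    x = rng.normal(size=5000)
    y = np.tanh(2 * x) + 0.3 * rng.normal(size=5000)   # nonlinear dependence on x
    print("corr:", np.corrcoef(x, y)[0, 1])            # captures the linear part
    print("MI  :", mutual_information(x, y))           # captures nonlinear dependence too
    ```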

  10. Premixing of corium into water during a Fuel-Coolant Interaction. The models used in the 3 field version of the MC3D code and two examples of validation on Billeau and FARO experiments

    Energy Technology Data Exchange (ETDEWEB)

    Berthoud, G.; Crecy, F. de; Duplat, F.; Meignen, R.; Valette, M. [CEA/Grenoble, DRN/DTP, 17 Avenue des Martyrs, 38054 Grenoble Cedex 9 (France)

    1998-01-01

    This paper presents the premixing application of the multiphasic 3D computer code MC3D. This application is devoted to the premixing phase of a Fuel Coolant Interaction (FCI), when large amounts of molten corium flow into water and interact with it. A description of the new features of the model is given (a more complete description of the full model is given in the annex). Calculations of Billeau experiments (cold or hot spheres dropped into water) and of a FARO test (corium dropped into 5 MPa saturated water) are presented. (author)

  11. Model Validation for Simulations of Vehicle Systems

    Science.gov (United States)

    2012-08-01

    Excerpts reference bootstrap, jackknife, and cross-validation methods (B. Efron, Annals of Statistics, 7:1-26, 1979; B. Efron and G. Gong, "A leisurely look at the bootstrap, the jackknife, and cross-validation") and describe validation of a battery model developed in the Automotive Research Center, a US Army Center of Excellence for modeling and simulation of ground vehicle systems, in collaboration with Sandia National Laboratories.

  12. Using virtual reality to validate system models

    Energy Technology Data Exchange (ETDEWEB)

    Winter, V.L.; Caudell, T.P.

    1999-12-09

    To date most validation techniques are highly biased towards calculations involving symbolic representations of problems. These calculations are either formal (in the case of consistency and completeness checks), or informal in the case of code inspections. The authors believe that an essential type of evidence of the correctness of the formalization process must be provided by (i.e., must originate from) human-based calculation. They further believe that human calculation can be significantly amplified by shifting from symbolic representations to graphical representations. This paper describes their preliminary efforts in realizing such a representational shift.

  13. Validating neural-network refinements of nuclear mass models

    Science.gov (United States)

    Utama, R.; Piekarewicz, J.

    2018-01-01

    Background: Nuclear astrophysics centers on the role of nuclear physics in the cosmos. In particular, nuclear masses at the limits of stability are critical in the development of stellar structure and the origin of the elements. Purpose: We aim to test and validate the predictions of recently refined nuclear mass models against the newly published AME2016 compilation. Methods: The basic paradigm underlining the recently refined nuclear mass models is based on existing state-of-the-art models that are subsequently refined through the training of an artificial neural network. Bayesian inference is used to determine the parameters of the neural network so that statistical uncertainties are provided for all model predictions. Results: We observe a significant improvement in the Bayesian neural network (BNN) predictions relative to the corresponding "bare" models when compared to the nearly 50 new masses reported in the AME2016 compilation. Further, AME2016 estimates for the handful of impactful isotopes in the determination of r -process abundances are found to be in fairly good agreement with our theoretical predictions. Indeed, the BNN-improved Duflo-Zuker model predicts a root-mean-square deviation relative to experiment of σrms≃400 keV. Conclusions: Given the excellent performance of the BNN refinement in confronting the recently published AME2016 compilation, we are confident of its critical role in our quest for mass models of the highest quality. Moreover, as uncertainty quantification is at the core of the BNN approach, the improved mass models are in a unique position to identify those nuclei that will have the strongest impact in resolving some of the outstanding questions in nuclear astrophysics.
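    Schematically, the refinement amounts to learning the residuals of a base mass model as a function of (Z, N). The sketch below uses a plain scikit-learn MLP as a stand-in for the Bayesian neural network of the paper (which additionally yields statistical uncertainties), with entirely synthetic residuals:

    ```python
    # Residual-learning refinement of a base mass model (illustrative stand-in).
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(6)
    Z = rng.integers(20, 100, 400)
    N = rng.integers(20, 150, 400)
    # Synthetic residual = M_exp - M_base (MeV): a smooth function plus noise.
    residual = 0.5*np.sin(Z/8.0) + 0.3*np.cos(N/10.0) + 0.1*rng.normal(size=400)

    net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
    net.fit(np.column_stack([Z, N]), residual)

    # Refined prediction = base-model mass + learned residual correction.
    pred = net.predict(np.column_stack([Z, N]))
    print("training RMS of the correction:", np.sqrt(np.mean((pred - residual) ** 2)))
    ```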

  14. Deep ocean model penetrator experiments

    International Nuclear Information System (INIS)

    Freeman, T.J.; Burdett, J.R.F.

    1986-01-01

    Preliminary trials of experimental model penetrators in the deep ocean have been conducted as an international collaborative exercise by participating members (national bodies and the CEC) of the Engineering Studies Task Group of the Nuclear Energy Agency's Seabed Working Group. This report describes and gives the results of these experiments, which were conducted at two deep ocean study areas in the Atlantic: Great Meteor East and the Nares Abyssal Plain. Velocity profiles of penetrators of differing dimensions and weights have been determined as they free-fell through the water column and impacted the sediment. These velocity profiles are used to determine the final embedment depth of the penetrators and the resistance to penetration offered by the sediment. The results are compared with predictions of embedment depth derived from elementary models of a penetrator impacting with a sediment. It is tentatively concluded that once the resistance to penetration offered by a sediment at a particular site has been determined, this quantity can be used to successfully predict the embedment that penetrators of differing sizes and weights would achieve at the same site

  15. Measuring experience of hospitality : scale development and validation

    NARCIS (Netherlands)

    Pijls-Hoekstra, Ruth; Groen, Brenda H.; Galetzka, Mirjam; Pruyn, Adriaan T.H.

    This paper describes the development of the Experience of Hospitality Scale (EH-Scale) for assessing hospitality in service environments from a guest point of view. In contrast to other scales, which focus specifically on staff behaviour, the present scale focuses on the experience of hospitality

  16. Validating Computational Cognitive Process Models across Multiple Timescales

    Science.gov (United States)

    Myers, Christopher; Gluck, Kevin; Gunzelmann, Glenn; Krusmark, Michael

    2010-12-01

    Model comparison is vital to evaluating progress in the fields of artificial general intelligence (AGI) and cognitive architecture. As they mature, AGI and cognitive architectures will become increasingly capable of providing a single model that completes a multitude of tasks, some of which the model was not specifically engineered to perform. These models will be expected to operate for extended periods of time and serve functional roles in real-world contexts. Questions arise regarding how to evaluate such models appropriately, including issues pertaining to model comparison and validation. In this paper, we specifically address model validation across multiple levels of abstraction, using an existing computational process model of unmanned aerial vehicle basic maneuvering to illustrate the relationship between validity and timescales of analysis.

  17. Validation of elk resource selection models with spatially independent data

    Science.gov (United States)

    Priscilla K. Coe; Bruce K. Johnson; Michael J. Wisdom; John G. Cook; Marty Vavra; Ryan M. Nielson

    2011-01-01

    Knowledge of how landscape features affect wildlife resource use is essential for informed management. Resource selection functions often are used to make and validate predictions about landscape use; however, resource selection functions are rarely validated with data from landscapes independent of those from which the models were built. This problem has severely...

  18. A Practical Approach to Validating a PD Model

    NARCIS (Netherlands)

    Medema, L.; Koning, de R.; Lensink, B.W.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  19. Validating a perceptual distraction model in a personal two-zone sound system

    DEFF Research Database (Denmark)

    Rämö, Jussi; Christensen, Lasse; Bech, Søren

    2017-01-01

    This paper focuses on validating a perceptual distraction model, which aims to predict a user's perceived distraction caused by audio-on-audio interference. Originally, the distraction model was trained with music-on-music stimuli using a simple loudspeaker setup, consisting of only two loudspeakers, one for the target sound source and the other for the interfering sound source. Recently, the model was successfully validated in a complex personal sound-zone system with speech-on-music stimuli. A second round of validations was conducted by physically altering the sound-zone system and running a set of new listening experiments utilizing two sound zones within the sound-zone system, thus validating the model using a different sound-zone system with both speech-on-music and music-on-speech stimuli sets. Preliminary results show ... the performance of personal sound-zone systems.

  20. Amendment to Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    The purpose of WP2 is to establish flow models relating the wind speed at turbines in a farm. Until now, active control of power reference has not been included in these models as only data with standard operation has been available. In this report the first data series with power reference excitation is analysed for a turbine in undisturbed flow. For this data set both the multiplicative model and in particular the simple first order transfer function model can predict the down wind wind speed from upwind wind speed and loading.

  1. An Approach to Comprehensive and Sustainable Solar Wind Model Validation

    Science.gov (United States)

    Rastaetter, L.; MacNeice, P. J.; Mays, M. L.; Boblitt, J. M.; Wiegand, C.

    2017-12-01

    The number of models of the corona and inner heliosphere and of their updates and upgrades grows steadily, as does the number and character of the model inputs. Maintaining up-to-date validation of these models, in the face of this constant model evolution, is a necessary but very labor intensive activity. In the last year alone, both NASA's LWS program and the CCMC's ongoing support of model forecasting activities at NOAA SWPC have sought model validation reports on the quality of all aspects of the community's coronal and heliospheric models, including both ambient and CME related wind solutions at L1. In this presentation I will give a brief review of the community's previous model validation results of L1 wind representation. I will discuss the semi-automated web based system we are constructing at the CCMC to present comparative visualizations of all interesting aspects of the solutions from competing models. This system is designed to be easily queried to provide the essential comprehensive inputs to repeat and update previous validation studies and support extensions to them. I will illustrate this by demonstrating how the system is being used to support the CCMC/LWS Model Assessment Forum teams focused on the ambient and time dependent corona and solar wind, including CME arrival time and IMF Bz. I will also discuss plans to extend the system to include results from the Forum teams addressing SEP model validation.

  2. The AACES field experiments: SMOS calibration and validation across the Murrumbidgee River catchment

    Science.gov (United States)

    Peischl, S.; Walker, J. P.; Rüdiger, C.; Ye, N.; Kerr, Y. H.; Kim, E.; Bandara, R.; Allahmoradi, M.

    2012-06-01

    Following the launch of the European Space Agency's Soil Moisture and Ocean Salinity (SMOS) mission on 2 November 2009, SMOS soil moisture products need to be rigorously validated at the satellite's approximately 45 km scale and disaggregation techniques for producing maps with finer resolutions tested. The Australian Airborne Cal/val Experiments for SMOS (AACES) provide the basis for one of the most comprehensive assessments of SMOS data world-wide by covering a range of topographic, climatic and land surface variability within an approximately 500 × 100 km2 study area, located in South-East Australia. The AACES calibration and validation activities consisted of two extensive field experiments which were undertaken across the Murrumbidgee River catchment during the Australian summer and winter season of 2010, respectively. The datasets include airborne L-band brightness temperature, thermal infrared and multi-spectral observations at 1 km resolution, as well as extensive ground measurements of near-surface soil moisture and ancillary data, such as soil temperature, soil texture, surface roughness, vegetation water content, dew amount, leaf area index and spectral characteristics of the vegetation. This paper explains the design and data collection strategy of the airborne and ground component of the two AACES campaigns and presents a preliminary analysis of the field measurements including the application and performance of the SMOS core retrieval model on the diverse land surface conditions captured by the experiments. The data described in this paper are publicly available from the website: http://www.moisturemap.monash.edu.au/aaces.

  3. A validated physical model of greenhouse climate.

    NARCIS (Netherlands)

    Bot, G.P.A.

    1989-01-01

    In the greenhouse model the momentaneous environmental crop growth factors are calculated as output, together with the physical behaviour of the crop. The boundary conditions for this model are the outside weather conditions; other inputs are the physical characteristics of the crop, of the

  4. Extending Model Checking To Object Process Validation

    NARCIS (Netherlands)

    van Rein, H.

    2002-01-01

    Object-oriented techniques allow the gathering and modelling of system requirements in terms of an application area. The expression of data and process models at that level is a great asset in communication with non-technical people in that area, but it does not necessarily lead to consistent

  5. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect incomplete knowledge of the characteristics inherent to each model.

  6. The Childbirth Experience Questionnaire (CEQ) - validation of its use in a Danish population

    DEFF Research Database (Denmark)

    Boie, Sidsel; Glavind, Julie; Uldbjerg, Niels

    Title: The Childbirth Experience Questionnaire (CEQ) - validation of its use in a Danish population. Introduction: Childbirth experience is arguably as important as measuring birth outcomes such as mode of delivery or perinatal morbidity. A robust, validated, Danish tool for evaluating childbirth ... Methods: ... and again 2 weeks later. Demographic and delivery characteristics were used to establish construct validity of the CEQ applying the method of known-groups validation. The results of the scored CEQ sent out twice were used to measure test-retest reliability of the CEQ by calculating the quadratic weighted kappa. Results: Higher scores were found for subgroups of women known to report better birth outcomes. Test-retest reliability: we found a weighted kappa of 0.76, demonstrating good test-retest reliability. Conclusions: The Childbirth Experience Questionnaire is a valid and reliable instrument to measure childbirth experience in the Danish population.

  7. Statistical Validation of Engineering and Scientific Models: Background

    International Nuclear Information System (INIS)

    Hills, Richard G.; Trucano, Timothy G.

    1999-01-01

    A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation of uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented: a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burgers' equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation of uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented including error band based, multivariate, sum of squares of residuals, and optimization methods. After completion of the tutorial, a survey of statistical model validation literature is presented and recommendations for future work are made.
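    A minimal Monte Carlo propagation-of-uncertainty sketch in the spirit of the tutorial, pushing uncertain inputs through a damped spring-mass response; the response formula is the standard resonant-peak amplification, and all parameter values are invented:

    ```python
    # Monte Carlo propagation of input uncertainty to an output distribution.
    import numpy as np

    def peak_displacement(m, c, k, f0=1.0):
        """Resonant-peak displacement of a damped spring-mass system under
        harmonic forcing of amplitude f0 (valid for damping ratio < 1/sqrt(2))."""
        zeta = c / (2.0 * np.sqrt(k * m))
        return (f0 / k) / (2.0 * zeta * np.sqrt(1.0 - zeta**2))

    rng = np.random.default_rng(7)
    n = 100_000
    m = rng.normal(1.0, 0.05, n)    # mass, kg
    c = rng.normal(0.2, 0.02, n)    # damping coefficient, N s/m
    k = rng.normal(40.0, 2.0, n)    # stiffness, N/m

    x = peak_displacement(m, c, k)
    print(f"mean = {x.mean():.4f} m, std = {x.std():.4f} m, "
          f"95% interval = {np.percentile(x, [2.5, 97.5]).round(4)}")
    ```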

  8. Modelling and validation of Proton exchange membrane fuel cell (PEMFC)

    Science.gov (United States)

    Mohiuddin, A. K. M.; Basran, N.; Khan, A. A.

    2018-01-01

    This paper is the outcome of a small scale fuel cell project. A fuel cell is an electrochemical device that converts energy from a chemical reaction into electrical work. The Proton Exchange Membrane Fuel Cell (PEMFC) is one type of fuel cell; it is comparatively efficient, operates at low temperature, offers fast start-up, and achieves high energy density. In this study, a mathematical model of a 1.2 W PEMFC is developed and simulated using MATLAB software. This model describes the PEMFC behaviour under steady-state conditions. The model determines the polarization curve, the power generated, and the efficiency of the fuel cell. Simulation results were validated against experimental results obtained from testing a single PEMFC with a 3 V motor. The performance of the experimental PEMFC is slightly lower than that of the simulated PEMFC; nevertheless, the results are in good agreement. Experiments on hydrogen flow rate were also conducted to determine the amount of hydrogen consumed to produce electrical work in the PEMFC.
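    A hedged sketch of a steady-state polarization-curve model of this kind, using the standard activation, ohmic, and concentration loss terms; all coefficients below are invented placeholders, not the paper's calibrated values:

    ```python
    # Polarization curve: V(i) = E0 - activation - ohmic - concentration losses.
    import numpy as np

    def cell_voltage(i, e0=1.229, b=0.05, r=0.2, m=3e-5, n=8.0, i0=1e-3):
        """Cell voltage (V) at current density i (A/cm^2); coefficients invented."""
        v_act = b * np.log(i / i0)      # Tafel activation loss
        v_ohm = r * i                   # ohmic loss (area-specific resistance)
        v_conc = m * np.exp(n * i)      # empirical concentration (mass-transport) loss
        return e0 - v_act - v_ohm - v_conc

    for i in (0.01, 0.1, 0.5, 1.0):
        v = cell_voltage(i)
        print(f"i = {i:.2f} A/cm^2  V = {v:.3f} V  P = {i * v:.3f} W/cm^2")
    ```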

  9. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Incomplete knowledge of the characteristics inherent to each model may underlie these discrepancies. During water immersion, the hydrostatic pressure lowers the peripheral vascular capacity and causes increased thoracic blood volume and high vascular perfusion. In turn, these changes lead to high urinary flow, low vasomotor tone, and a high ... a negative pressure around the body. The differences in renal function between space and experimental models appear to be explained by the physical forces affecting tissues and hemodynamics as well as by the changes secondary to these forces. These differences may help in selecting experimental models...

  10. Validation of ecological state space models using the Laplace approximation

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Albertsen, Christoffer Moesgaard; Berg, Casper Willestofte

    2017-01-01

    The Laplace approximation, as implemented in the R package Template Model Builder (TMB), allows for estimation in general mixed effects models. Implementing one-step predictions in TMB, we demonstrate that it is possible to perform model validation with little effort, even if the ecological model is multivariate and has non-linear dynamics, and whether observations are continuous or discrete. With both simulated data, and a real data set related to geolocation of seals, we demonstrate both the potential and the limitations of the techniques. Our results fill a need for convenient methods for validating a state space model, or alternatively, rejecting it while indicating...
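    The validation idea, shown here independently of the R/TMB tooling named above, can be illustrated with a linear Gaussian state space model: if the model is adequate, the standardized one-step prediction residuals from a Kalman filter should be i.i.d. standard normal. A minimal sketch on simulated data:

    ```python
    # One-step prediction residuals of an AR(1)-plus-noise state space model.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    T, a, q, r = 300, 0.9, 0.5, 1.0                  # dynamics and noise variances
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = a * x[t-1] + rng.normal(0, np.sqrt(q))
    y = x + rng.normal(0, np.sqrt(r), T)             # noisy observations

    m, p = 0.0, 1.0                                  # filter mean and variance
    resid = []
    for t in range(T):
        m_pred, p_pred = a * m, a * a * p + q        # predict
        s = p_pred + r                               # innovation variance
        resid.append((y[t] - m_pred) / np.sqrt(s))   # standardized residual
        kgain = p_pred / s                           # update
        m = m_pred + kgain * (y[t] - m_pred)
        p = (1 - kgain) * p_pred

    print(stats.kstest(resid, "norm"))               # large p-value: no misfit detected
    ```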

  11. Transfer Entropy as a Tool for Hydrodynamic Model Validation

    Directory of Open Access Journals (Sweden)

    Alicia Sendrowski

    2018-01-01

    The validation of numerical models is an important component of modeling to ensure reliability of model outputs under prescribed conditions. In river deltas, robust validation of models is paramount given that models are used to forecast land change and to track water, solid, and solute transport through the deltaic network. We propose using transfer entropy (TE) to validate model results. TE quantifies the information transferred between variables in terms of strength, timescale, and direction. Using water level data collected in the distributary channels and inter-channel islands of Wax Lake Delta, Louisiana, USA, along with modeled water level data generated for the same locations using Delft3D, we assess how well couplings between external drivers (river discharge, tides, wind) and modeled water levels reproduce the observed data couplings. We perform this operation through time using ten-day windows. Modeled and observed couplings compare well; their differences reflect the spatial parameterization of wind and roughness in the model, which prevents the model from capturing high frequency fluctuations of water level. The model captures couplings better in channels than on islands, suggesting that mechanisms of channel-island connectivity are not fully represented in the model. Overall, TE serves as an additional validation tool to quantify the couplings of the system of interest at multiple spatial and temporal scales.
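    A plug-in estimate of transfer entropy, TE(X→Y) = H(Y_t | Y_{t-lag}) − H(Y_t | Y_{t-lag}, X_{t-lag}), can be sketched as follows on synthetic series; the study's implementation details may differ:

    ```python
    # Histogram-based transfer entropy between two discretized time series.
    import numpy as np

    def transfer_entropy(x, y, lag=1, bins=8):
        """TE from x to y in nats, via joint-entropy decomposition."""
        xp, yp, yf = x[:-lag], y[:-lag], y[lag:]       # past x, past y, future y

        def ent(*cols):
            h, _ = np.histogramdd(np.column_stack(cols), bins=bins)
            p = h.ravel() / h.sum()
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        # TE = H(yf, yp) - H(yp) - H(yf, yp, xp) + H(yp, xp)
        return ent(yf, yp) - ent(yp) - ent(yf, yp, xp) + ent(yp, xp)

    rng = np.random.default_rng(9)
    x = rng.normal(size=20_000)
    y = np.roll(x, 1) + 0.5 * rng.normal(size=20_000)  # y is driven by lagged x
    print(f"TE x->y = {transfer_entropy(x, y):.3f} nats")   # clearly positive
    print(f"TE y->x = {transfer_entropy(y, x):.3f} nats")   # near zero (plus bias)
    ```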

  12. VERIFICATION AND VALIDATION OF THE SPARC MODEL

    Science.gov (United States)

    Mathematical models for predicting the transport and fate of pollutants in the environment require reactivity parameter values--that is, the physical and chemical constants that govern reactivity. Although empirical structure-activity relationships that allow estimation of some ...

  13. Base Flow Model Validation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  14. Experiments beyond the standard model

    International Nuclear Information System (INIS)

    Perl, M.L.

    1984-09-01

    This paper is based upon lectures in which I have described and explored the ways in which experimenters can try to find answers, or at least clues toward answers, to some of the fundamental questions of elementary particle physics. All of these experimental techniques and directions have been discussed fully in other papers, for example: searches for heavy charged leptons, tests of quantum chromodynamics, searches for Higgs particles, searches for particles predicted by supersymmetric theories, searches for particles predicted by technicolor theories, searches for proton decay, searches for neutrino oscillations, monopole searches, studies of low transfer momentum hadron physics at very high energies, and elementary particle studies using cosmic rays. Each of these subjects requires several lectures by itself to do justice to the large amount of experimental work and theoretical thought which has been devoted to these subjects. My approach in these tutorial lectures is to describe general ways to experiment beyond the standard model. I will use some of the topics listed to illustrate these general ways. Also, in these lectures I present some dreams and challenges about new techniques in experimental particle physics and accelerator technology, I call these Experimental Needs. 92 references

  15. Validating a perceptual distraction model in a personal two-zone sound system

    DEFF Research Database (Denmark)

    Rämö, Jussi; Christensen, Lasse; Bech, Søren

    2017-01-01

    This paper focuses on validating a perceptual distraction model, which aims to predict a user's perceived distraction caused by audio-on-audio interference, e.g., two competing audio sources within the same listening space. Originally, the distraction model was trained with music-on-music stimuli using a simple loudspeaker setup, consisting of only two loudspeakers, one for the target sound source and the other for the interfering sound source. Recently, the model was successfully validated in a complex personal sound-zone system with speech-on-music stimuli. A second round of validations was conducted by physically altering the sound-zone system and running a set of new listening experiments utilizing two sound zones within the sound-zone system, thus validating the model using a different sound-zone system with both speech-on-music and music-on-speech stimuli sets. Preliminary results show...

  16. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
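    The sensitivity and specificity used above are straightforward to compute from presence/absence predictions and survey observations; a sketch on synthetic stand-ins for the t1-calibrated, t2-evaluated predictions:

    ```python
    # Sensitivity and specificity of binary presence/absence predictions.
    import numpy as np

    def sens_spec(predicted, observed):
        """predicted, observed: boolean arrays over survey sites."""
        tp = np.sum(predicted & observed)
        tn = np.sum(~predicted & ~observed)
        fn = np.sum(~predicted & observed)
        fp = np.sum(predicted & ~observed)
        return tp / (tp + fn), tn / (tn + fp)

    rng = np.random.default_rng(10)
    observed = rng.random(1000) < 0.3                  # later survey presences
    predicted = observed ^ (rng.random(1000) < 0.15)   # imperfect forecast
    sens, spec = sens_spec(predicted, observed)
    print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
    ```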

  17. Validating predictions from climate envelope models.

    Directory of Open Access Journals (Sweden)

    James I Watling

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species' distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967-1971 (t1) and evaluated using occurrence data from 1998-2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on

  18. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    Development of a Conservative Model Validation Approach for Reliable Analysis (DETC2015-46982 [DRAFT], ASME IDETC/CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA). A PDF and a probability of failure are selected from the predicted output PDFs at a user-specified conservativeness level for validation; to preserve this conservativeness level, the conservative probability of failure obtained from Section 4 must be maintained.

  19. First approximations in avalanche model validations using seismic information

    Science.gov (United States)

    Roig Lafon, Pere; Suriñach, Emma; Bartelt, Perry; Pérez-Guillén, Cristina; Tapia, Mar; Sovilla, Betty

    2017-04-01

    Avalanche dynamics modelling is an essential tool for snow hazard management. Scenario-based numerical modelling provides quantitative arguments for decision-making. The software tool RAMMS (WSL Institute for Snow and Avalanche Research SLF) is one such tool, often used by government authorities and geotechnical offices. As avalanche models improve, the quality of the numerical results will depend increasingly on user experience in the specification of input (e.g. release and entrainment volumes, secondary releases, snow temperature and quality). New model developments must continue to be validated using data from real phenomena to improve performance and reliability. The avalanche group at the University of Barcelona (RISKNAT - UB) has studied the seismic signals generated by avalanches since 1994. Presently, the group manages the seismic installation at SLF's Vallée de la Sionne experimental site (VDLS). At VDLS the recorded seismic signals can be correlated with other avalanche measurement techniques, including both advanced remote sensing methods (radars, videogrammetry) and obstacle-based sensors (pressure, capacitance, optical sender-reflector barriers). This comparison between different measurement techniques allows the group to address the question of whether seismic analysis can be used alone, on additional avalanche tracks, to gain insight into and validate numerical avalanche dynamics models in different terrain conditions. In this study, we aim to add the seismic data as an external record of the phenomena, able to validate RAMMS models. The seismic sensors are considerably easier and cheaper to install than other physical measuring tools, and are able to record data from the phenomena in all atmospheric conditions (e.g. bad weather, low light or freezing, which make photography and other kinds of sensors unusable). With seismic signals, we record the temporal evolution of the inner and denser parts of the avalanche. We are able to recognize the approximate position

  20. Traffic modelling validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Tongeren, R. van; Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2007-01-01

    This paper presents a microscopic traffic model for the validation of advanced driver assistance systems. This model describes single-lane traffic and is calibrated with data from a field operational test. To illustrate the use of the model, a Monte Carlo simulation of single-lane traffic scenarios

  1. Context discovery using attenuated Bloom codes: model description and validation

    NARCIS (Netherlands)

    Liu, F.; Heijenk, Geert

    A novel approach to performing context discovery in ad-hoc networks based on the use of attenuated Bloom filters is proposed in this report. In order to investigate the performance of this approach, a model has been developed. This document describes the model and its validation. The model has been
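
    The underlying data structure is an attenuated Bloom filter: a stack of Bloom filters in which layer d aggregates information advertised by nodes d hops away, so information is attenuated with distance. A minimal illustrative sketch (the parameters and service-name example are invented, not taken from the report):

    ```python
    import hashlib

    class AttenuatedBloomFilter:
        """Array of Bloom filters; layer d summarizes context sources d hops away."""
        def __init__(self, depth=3, m=128, k=3):
            self.depth, self.m, self.k = depth, m, k
            self.layers = [[0] * m for _ in range(depth)]

        def _positions(self, item):
            # k hash positions derived from salted SHA-256 digests
            for i in range(self.k):
                h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
                yield int(h, 16) % self.m

        def add(self, item, hops):
            for pos in self._positions(item):
                self.layers[hops][pos] = 1

        def query(self, item):
            """Smallest hop count at which the item may be reachable, else None."""
            for d, layer in enumerate(self.layers):
                if all(layer[pos] for pos in self._positions(item)):
                    return d
            return None

    abf = AttenuatedBloomFilter()
    abf.add("printer/color", hops=1)
    print(abf.query("printer/color"))  # -> 1 (Bloom-filter false positives apply)
    ```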

  2. Validation of Model Forecasts of the Ambient Solar Wind

    Science.gov (United States)

    Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

    2009-01-01

    Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

  3. Electrically Driven Thermal Management: Flight Validation, Experiment Development, Future Technologies

    Science.gov (United States)

    Didion, Jeffrey R.

    2018-01-01

    Electrically Driven Thermal Management is an active research and technology development initiative incorporating ISS technology flight demonstrations (STP-H5), development of a Microgravity Science Glovebox (MSG) flight experiment, and laboratory-based investigations of electrically based thermal management techniques. The program targets integrated thermal management for future generations of RF electronics and power electronic devices. This presentation reviews four program elements: i) results from the Electrohydrodynamic (EHD) Long Term Flight Demonstration launched in February 2017; ii) development of the Electrically Driven Liquid Film Boiling Experiment; iii) two university-based research efforts; and iv) development of Oscillating Heat Pipe evaluation at Goddard Space Flight Center.

  4. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made...

  5. Experimental validation of Swy-2 clay standard's PHREEQC model

    Science.gov (United States)

    Szabó, Zsuzsanna; Hegyfalvi, Csaba; Freiler, Ágnes; Udvardi, Beatrix; Kónya, Péter; Székely, Edit; Falus, György

    2017-04-01

    One of the challenges of the present century is to limit greenhouse gas emissions for the mitigation of climate change, which is possible, for example, through a transitional technology, CCS (Carbon Capture and Storage), and, among others, through an increase of the nuclear proportion in the energy mix. Clay minerals are considered to be responsible for the low permeability and sealing capacity of caprocks sealing off stored CO2, and they are also the main constituents of bentonite in high-level radioactive waste disposal facilities. The understanding of clay behaviour in these deep geological environments is possible through laboratory batch experiments on well-known standards and coupled geochemical models. Such experimentally validated models are scarce, even though they allow deriving more precise long-term predictions of mineral reactions and of rock and bentonite degradation underground, thereby ensuring the safety of the above technologies and increasing their public acceptance. This ongoing work aims to create a kinetic geochemical model of the Na-montmorillonite standard Swy-2 in the widely used PHREEQC code, supported by solution and mineral composition results from batch experiments. Several four-day experiments have been carried out at a 1:35 rock:water ratio at atmospheric conditions, and with an inert and a supercritical CO2 phase at 100 bar and 80 °C, conditions relevant for the potential Hungarian CO2 reservoir complex. Solution samples have been taken during and after the experiments and their compositions were measured by ICP-OES. The treated solid phase has been analysed by XRD and ATR-FTIR and compared to references measured in parallel (dried Swy-2). Kinetic geochemical modelling of the experimental conditions has been performed with PHREEQC version 3 using equations and kinetic rate parameters from the USGS report of Palandri and Kharaka (2004). The visualization of the experimental and numerous modelling results has been automated in R. Experiments and models show very fast
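
    The Palandri and Kharaka (2004) compilation expresses mineral dissolution and precipitation with a transition-state-theory rate law of the general form r = k25 · exp(−Ea/R · (1/T − 1/298.15)) · (1 − Ω^p)^q per unit surface area. A sketch of the temperature and saturation dependence (the parameter values below are placeholders for illustration, not the Swy-2 values used in the study):

    ```python
    import math

    R = 8.314  # gas constant, J/(mol K)

    def dissolution_rate(k25, Ea, T, omega, p=1.0, q=1.0):
        """TST-style rate law of the form used by Palandri & Kharaka (2004).
        k25  : rate constant at 25 degC (mol m-2 s-1)
        Ea   : activation energy (J/mol)
        T    : temperature (K)
        omega: saturation ratio Q/K; omega < 1 means dissolution
        """
        k = k25 * math.exp(-Ea / R * (1.0 / T - 1.0 / 298.15))
        return k * (1.0 - omega ** p) ** q

    # Hypothetical neutral-mechanism parameters for a smectite-like phase at 80 degC
    rate = dissolution_rate(k25=1.0e-13, Ea=35e3, T=353.15, omega=0.2)
    print(f"{rate:.3e} mol m-2 s-1")  # faster than at 25 degC, far from equilibrium
    ```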

  6. Dental models made with an intraoral scanner: a validation study.

    Science.gov (United States)

    Cuperus, Anne Margreet R; Harms, Marit C; Rangel, Frits A; Bronkhorst, Ewald M; Schols, Jan G J H; Breuning, K Hero

    2012-09-01

    Our objectives were to determine the validity and reproducibility of measurements on stereolithographic models and 3-dimensional digital dental models made with an intraoral scanner. Ten dry human skulls were scanned; from the scans, stereolithographic models and digital models were made. Two observers measured transversal distances, mesiodistal tooth widths, and arch segments on the skulls and the stereolithographic and digital models. All measurements were repeated 4 times. Arch length discrepancy and tooth size discrepancy were calculated. Statistical analysis was performed by using paired t tests. For the measurements on the stereolithographic and digital models, statistically significant differences were found. However, these differences were considered to be clinically insignificant. Digital models had fewer statistically significant differences and generally the smallest duplicate measurement errors compared with the stereolithographic models. Stereolithographic and digital models made with an intraoral scanner are a valid and reproducible method for measuring distances in a dentition. Copyright © 2012 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  7. Fast Running Urban Dispersion Model for Radiological Dispersal Device (RDD) Releases: Model Description and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Gowardhan, Akshay [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Neuscamman, Stephanie [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Donetti, John [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Belles, Rich [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Eme, Bill [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Homann, Steven [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Simpson, Matthew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC); Nasstrom, John [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC)

    2017-05-24

    Aeolus is an efficient three-dimensional computational fluid dynamics code, based on the finite volume method, developed for predicting transport and dispersion of contaminants in a complex urban area. It solves the time-dependent incompressible Navier-Stokes equation on a regular Cartesian staggered grid using a fractional step method. It also solves a scalar transport equation for temperature, with buoyancy treated using the Boussinesq approximation. The model also includes a Lagrangian dispersion model for predicting the transport and dispersion of atmospheric contaminants. The model can be run in an efficient Reynolds Average Navier-Stokes (RANS) mode with a run time of several minutes, or in a more detailed Large Eddy Simulation (LES) mode with a run time of hours for a typical simulation. This report describes the model components, including details on the physics models used in the code, as well as several model validation efforts. Aeolus wind and dispersion predictions are compared to field data from the Joint Urban Field Trials 2003 conducted in Oklahoma City (Allwine et al 2004), including both continuous and instantaneous releases. Newly implemented Aeolus capabilities include a decay chain model and an explosive Radiological Dispersal Device (RDD) source term; these capabilities are described. Aeolus predictions using the buoyant explosive RDD source are validated against two experimental data sets: the Green Field explosive cloud rise experiments conducted in Israel (Sharon et al 2012) and the Full-Scale RDD Field Trials conducted in Canada (Green et al 2016).

  8. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano & Primiceri (American Economic Review, forthcoming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation in these models cannot be ensured.

  9. Validated Analytical Model of a Pressure Compensation Drip Irrigation Emitter

    Science.gov (United States)

    Shamshery, Pulkit; Wang, Ruo-Qian; Taylor, Katherine; Tran, Davis; Winter, Amos

    2015-11-01

    This work is focused on analytically characterizing the behavior of pressure-compensating drip emitters in order to design low-cost, low-power irrigation solutions appropriate for off-grid communities in developing countries. There are 2.5 billion small-acreage farmers worldwide who rely solely on their land for sustenance. Compared to flood irrigation, drip irrigation leads to up to a 70% reduction in water consumption while increasing yields by 90%, which matters in countries like India that are quickly running out of water. To design a low-power drip system, there is a need to decrease the pumping pressure requirement at the emitters, as pumping power is the product of pressure and flow rate. To design such an emitter efficiently, the fluid-structure interactions that occur within it need to be understood. In this study, a 2D analytical model that captures the behavior of a common drip emitter was developed and validated through experiments. The effects of independently changing the channel depth, channel width, channel length and land height on the performance were studied. The model and the key parametric insights presented have the potential to be optimized in order to guide the design of low-pressure, clog-resistant, pressure-compensating emitters.
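
    Since hydraulic pumping power is simply P = Δp · Q, halving the emitter activation pressure roughly halves the power the pump must supply for a given flow. A back-of-the-envelope sketch (emitter count, flow rate and pump efficiency are illustrative assumptions, not values from the study):

    ```python
    def pumping_power(delta_p_pa, flow_lph, n_emitters, pump_efficiency=0.5):
        """Pumping power for a drip line: P = delta_p * Q_total / eta."""
        q_total = n_emitters * flow_lph / 1000.0 / 3600.0  # L/h -> m^3/s
        return delta_p_pa * q_total / pump_efficiency

    # Hypothetical field: 500 emitters at 1 L/h each
    print(pumping_power(1.00e5, 1.0, 500))  # ~27.8 W at 1 bar activation pressure
    print(pumping_power(0.15e5, 1.0, 500))  # ~4.2 W at 0.15 bar
    ```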

  10. Experiments with Geometric Non-Linear Coupling for Analytical Validation

    Science.gov (United States)

    2010-03-01

    Qualification tests are applied to the joined-wing models. [Figure 3.28: material dogbone.] For all the FE models, several versions of Nastran were used interchangeably: MD Nastran V2007.0, MD Nastran V2008.0, and NX Nastran V5.0. MD Nastran V2007.0 was the primary solver for most analyses. The non-linear solution would only converge to a 15 lb load; all solution attempts above this load failed. Nastran had trouble solving this FE model because of

  11. Validation experiments to determine radiation partitioning of heat flux to an object in a fully turbulent fire.

    Energy Technology Data Exchange (ETDEWEB)

    Ricks, Allen; Blanchat, Thomas K.; Jernigan, Dann A.

    2006-06-01

    It is necessary to improve understanding of, and to develop validation data for, the heat flux incident to an object located within the fire plume for the validation of SIERRA/FUEGO/SYRINX fire and SIERRA/CALORE. One key aspect of the validation data sets is the determination of the relative contribution of the radiative and convective heat fluxes. To meet this objective, a cylindrical calorimeter with sufficient instrumentation to measure total and radiative heat flux has been designed and fabricated. This calorimeter will be tested both in the controlled radiative environment of the Penlight facility and in a fire environment in the FLAME/Radiant Heat (FRH) facility. Validation experiments are specifically designed for direct comparison with the computational predictions. Making meaningful comparisons between the computational and experimental results requires careful characterization and control of the experimental features or parameters used as inputs into the computational model. Validation experiments must be designed to capture the essential physical phenomena, including all relevant initial and boundary conditions. A significant question of interest when modeling the heat flux incident to an object in or near a fire is the contribution of the radiation and convection modes of heat transfer. The series of experiments documented in this test plan is designed to provide data on the radiation partitioning, defined as the fraction of the total heat flux that is due to radiation.

  12. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by application of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on direct observation of occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so that multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications supporting inspector qualifications are included.
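
    The 90/95 criterion can be checked with a one-sided Clopper-Pearson lower confidence bound on the binomial detection probability; the classic result is that 29 hits in 29 trials just demonstrates 90% POD at 95% confidence. A sketch of that standard binomial bound (not DOEPOD's full diagnostic machinery):

    ```python
    from scipy import stats

    def pod_lower_bound(hits, trials, confidence=0.95):
        """One-sided Clopper-Pearson lower bound on probability of detection."""
        if hits == 0:
            return 0.0
        return stats.beta.ppf(1.0 - confidence, hits, trials - hits + 1)

    print(pod_lower_bound(29, 29))  # ~0.902: 29/29 hits meets the 90/95 criterion
    print(pod_lower_bound(28, 29))  # a single miss drops the bound below 0.9
    ```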

  13. Perceived Importance of Marijuana to the College Experience Scale (PIMCES): Initial Development and Validation.

    Science.gov (United States)

    Pearson, Matthew R; Kholodkov, Tatyana; Gray, Matt J

    2017-03-01

    Internalization of college substance use culture refers to the degree to which an individual perceives the use of that substance to be an integral part of the college experience. Although there is a growing literature characterizing this construct for alcohol, the present study describes the development and validation of a new measure to assess the internalization of the college marijuana use culture, the Perceived Importance of Marijuana to the College Experience Scale (PIMCES). We recruited a large, diverse sample (N = 8,141) of college students from 11 participating universities. We examined the psychometric properties of the PIMCES and evaluated its concurrent validity by examining its associations with marijuana-related outcomes. A single-factor, eight-item PIMCES demonstrated good model fit and high internal consistency (Cronbach's α = .89) and was correlated with marijuana user status, frequency of marijuana use, marijuana consequences, and injunctive norms. Overall, the PIMCES exhibits sound psychometric properties. The PIMCES can serve as a possible mediator of the effects of personality and other factors on marijuana-related outcomes and may be a promising target for marijuana interventions.
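
    The internal-consistency figure quoted above (Cronbach's α = .89) comes from the standard formula α = k/(k−1) · (1 − Σσ²ᵢ/σ²_total). A sketch on synthetic item responses (the data are simulated, not the PIMCES sample):

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, n_items) array of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)     # variance of the scale score
        return k / (k - 1) * (1.0 - item_vars / total_var)

    rng = np.random.default_rng(1)
    latent = rng.normal(size=(200, 1))                      # shared factor
    scores = latent + rng.normal(scale=0.8, size=(200, 8))  # 8 correlated items
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```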

  14. SCALE Validation Experience Using an Expanded Isotopic Assay Database for Spent Nuclear Fuel

    International Nuclear Information System (INIS)

    Gauld, Ian C.; Radulescu, Georgeta; Ilas, Germina

    2009-01-01

    The availability of measured isotopic assay data to validate computer code predictions of spent fuel compositions applied in burnup-credit criticality calculations is an essential component for bias and uncertainty determination in safety and licensing analyses. In recent years, as many countries move closer to implementing or expanding the use of burnup credit in criticality safety for licensing, there has been growing interest in acquiring additional high-quality assay data. The well-known open sources of assay data are viewed as potentially limiting for validating depletion calculations for burnup credit due to the relatively small number of isotopes measured (primarily actinides with relatively few fission products), sometimes large measurement uncertainties, incomplete documentation, and the limited burnup and enrichment range of the fuel samples. Oak Ridge National Laboratory (ORNL) recently initiated an extensive isotopic validation study that includes most of the public data archived in the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) electronic database, SFCOMPO, and new datasets obtained through participation in commercial experimental programs. To date, ORNL has analyzed approximately 120 different spent fuel samples from pressurized-water reactors that span a wide enrichment and burnup range and represent a broad class of assembly designs. The validation studies, completed using SCALE 5.1, are being used to support a technical basis for expanded implementation of burnup credit for spent fuel storage facilities, and other spent fuel analyses including radiation source term, dose assessment, decay heat, and waste repository safety analyses. This paper summarizes the isotopic assay data selected for this study, presents validation results obtained with SCALE 5.1, and discusses some of the challenges and experience associated with evaluating the results. Preliminary results obtained using SCALE 6 and ENDF

  15. Scanning L Band Active Passive Validation Experiment 2013

    Science.gov (United States)

    Joseph, A. T.; Kim, E. J.; Faulkner, T.; Patel, H.; Cosh, M. H.

    2014-12-01

    SLAP (Scanning L-band Active Passive) comprises a fully polarimetric L-band radiometer and a fully polarimetric L-band radar with a shared antenna. SLAP is designed to be compatible with several aircraft, specifically the C-23, Twin Otter, P-3, and C-130. SLAP is designed for simplicity, accuracy, and reliability. It leverages, as much as possible, existing instruments, hardware, and software in order to minimize cost, time, and risk. The SLAP airborne/ground campaign is designed to conduct flight testing and collect ground truth for the airborne instrument. The campaign took place the third week of December 2013 on the Eastern Shore, MD. SLAP contributes to NASA's core mission through its ability to serve as an airborne simulator for the SMAP (Soil Moisture Active Passive) satellite mission, one of NASA's flagship missions scheduled to launch in January 2015. A 3-day aircraft validation campaign was conducted in which the new SLAP instrument flew on three separate days over the proposed sampling region. The study area is a mixed agriculture and forest site located about 1 hour east of Washington, DC on the Eastern Shore (of the Chesapeake Bay), on the Delmarva Peninsula. The advantages of the selected site are: (1) the site was used in a previous field campaign (SMAPVEX08); (2) ARS HRSL has established sampling sites within the region; (3) dynamic variation in land cover; (4) a variety of plant structures and densities. The goal of this campaign was to fly the instrument over the proposed site before a rain event, then conduct 2 more flights after the rain event to capture a dry-down. In conjunction with the aircraft flights, in-situ ground sampling was conducted. Ground observations were collected concurrent with aircraft flights. These included soil moisture, soil temperature, surface temperature, surface roughness and vegetation parameters. Forest sites were monitored with small temporary networks of in-situ sensors installed prior to the first flight. Soil moisture was

  16. Modeling the Experience of Emotion

    OpenAIRE

    Broekens, Joost

    2009-01-01

    Affective computing has proven to be a viable field of research comprised of a large number of multidisciplinary researchers resulting in work that is widely published. The majority of this work consists of computational models of emotion recognition, computational modeling of causal factors of emotion and emotion expression through rendered and robotic faces. A smaller part is concerned with modeling the effects of emotion, formal modeling of cognitive appraisal theory and models of emergent...

  17. An Overlooked Population in Community College: International Students' (In)Validation Experiences With Academic Advising

    Science.gov (United States)

    Zhang, Yi

    2016-01-01

    Objective: Guided by validation theory, this study aims to better understand the role that academic advising plays in international community college students' adjustment. More specifically, this study investigated how academic advising validates or invalidates their academic and social experiences in a community college context. Method: This…

  18. Finite element model validation of bridge based on structural health monitoring—Part II: Uncertainty propagation and model validation

    Directory of Open Access Journals (Sweden)

    Xiaosong Lin

    2015-08-01

    Because of uncertainties involved in modeling, construction, and measurement systems, the assessment of FE model validity must be conducted based on stochastic measurements to provide designers with confidence for further applications. In this study, based on a model updated using response surface methodology, a practical model validation methodology via uncertainty propagation is presented. Several criteria of testing/analysis correlation are introduced, and the sources of model and testing uncertainties are also discussed. After that, the Monte Carlo stochastic finite element (FE) method is employed to perform the uncertainty quantification and propagation. The proposed methodology is illustrated with the examination of the validity of a large-span prestressed concrete continuous rigid frame bridge monitored under operational conditions. It can be concluded that the calculated frequencies and vibration modes of the updated FE model of Xiabaishi Bridge are consistent with the measured ones. The relative errors of each frequency are all less than 3.7%; the overlap ratio indexes of each frequency are all more than 75%; and the MAC values of each calculated vibration mode are all more than 90%. The model of Xiabaishi Bridge is valid in the whole operation space, including the experimental design space, with a confidence level above 95%. The validated FE model of Xiabaishi Bridge reflects the current condition of the bridge, and can be used as a basis for bridge health monitoring, damage identification and safety assessment.
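
    The MAC (Modal Assurance Criterion) quoted above measures the squared cosine between measured and computed mode-shape vectors; values near 1 indicate consistent shapes. A minimal sketch with invented mode-shape ordinates (not the bridge's data):

    ```python
    import numpy as np

    def mac(phi_test, phi_model):
        """Modal Assurance Criterion between two mode-shape vectors."""
        num = np.abs(phi_test @ phi_model) ** 2
        return num / ((phi_test @ phi_test) * (phi_model @ phi_model))

    # Hypothetical first bending mode, measured vs. FE-predicted
    measured  = np.array([0.00, 0.31, 0.59, 0.81, 0.95, 1.00])
    predicted = np.array([0.00, 0.29, 0.57, 0.80, 0.96, 1.00])
    print(f"MAC = {mac(measured, predicted):.3f}")
    ```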

  19. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia

    2017-01-01

    The validation and verification process is a fundamental step for any software like Geant4 and GeantV, which aim to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we have to face the problem of comparing the results obtained using simulations with what the experiments actually observed. One way to approach the problem is to perform a consistency test. Within the Geant group, we developed a compact C++ library which will be added to the automated validation process on the Geant Validation Portal.

  20. Validation of the galactic cosmic ray and geomagnetic transmission models

    International Nuclear Information System (INIS)

    Badhwar, G.D.; Truong, A.G.; O'Neill, P.M.; Choutko, Vitaly

    2001-01-01

    A very high-momentum resolution particle spectrometer called the Alpha Magnetic Spectrometer (AMS) was flown in the payload bay of the Space Shuttle in a 51.65 deg. x 380-km orbit during the last solar minimum. This spectrometer has provided the first high-statistics data set for galactic cosmic radiation protons and helium, as well as limited spectral data on carbon and oxygen nuclei, in the International Space Station orbit. First measurements of the albedo protons at this inclination were also made. Because of the high momentum resolution and high statistics, the data can be separated as a function of magnetic latitude. A related investigation, the balloon-borne experiment with a superconducting solenoid spectrometer (BESS), has been flown from Lynn Lake, Canada and has also provided excellent high-resolution data on protons and helium. These two data sets have been used here to study the validity of two galactic cosmic ray models and the geomagnetic transmission function developed from the 1990 geomagnetic reference field model. The predictions of both the CREME96 and NASA/JSC models are in good agreement with the AMS data. The shape of the AMS-measured albedo proton spectrum, up to 2 GeV, is in excellent agreement with previous balloon and satellite observations. A new LIS spectrum was developed that is consistent with both previous and new BESS 3He observations. Because astronaut radiation exposures onboard ISS will be highest around the time of the solar minimum, these AMS measurements and these models provide important benchmarks for future radiation studies. AMS-02, slated for launch in September 2003, will provide even better momentum resolution and higher statistics data

  1. [Validation of abdominal wound dehiscence's risk model].

    Science.gov (United States)

    Gómez Díaz, Carlos Javier; Rebasa Cladera, Pere; Navarro Soto, Salvador; Hidalgo Rosas, José Manuel; Luna Aufroy, Alexis; Montmany Vioque, Sandra; Corredera Cantarín, Constanza

    2014-02-01

    The aim of this study is to determine the usefulness of the risk model developed by van Ramshorst et al., and of a modification of the same, to predict the risk of abdominal wound dehiscence in patients who underwent midline laparotomy incisions. Observational longitudinal retrospective study. Patients who underwent midline laparotomy incisions in the General and Digestive Surgery Department of Sabadell's Hospital - Parc Taulí Health and University Corporation, Barcelona, between January 1, 2010 and June 30, 2010. Dependent variable: abdominal wound dehiscence. Predictor variables: global risk score, preoperative risk score (postoperative variables excluded), and global and preoperative probabilities of developing abdominal wound dehiscence. 176 patients. Patients with abdominal wound dehiscence: 15 (8.5%). The global risk score of the abdominal wound dehiscence group (mean: 4.97; 95% CI: 4.15-5.79) was higher than the global risk score of the no-dehiscence group (mean: 3.41; 95% CI: 3.20-3.62), a statistically significant difference (P<.001). The preoperative risk score of the abdominal wound dehiscence group (mean: 3.27; 95% CI: 2.69-3.84) was higher than the preoperative risk score of the no-dehiscence group (mean: 2.77; 95% CI: 2.64-2.89), also a statistically significant difference (P<.05). The global risk score (area under the ROC curve: 0.79) has better accuracy than the preoperative risk score (area under the ROC curve: 0.64). The risk model developed by van Ramshorst et al. to predict the risk of abdominal wound dehiscence in the preoperative phase has limited usefulness. Additional refinements in the preoperative risk score are needed to improve its accuracy. Copyright © 2012 AEC. Published by Elsevier España. All rights reserved.
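
    The area under the ROC curve reported above can be computed directly from the rank-sum (Mann-Whitney U) identity. A sketch with invented scores and outcomes (not the study's patient data):

    ```python
    import numpy as np

    def auc_roc(labels, scores):
        """Area under the ROC curve via the rank-sum identity (no ties assumed)."""
        labels = np.asarray(labels, dtype=bool)
        order = np.argsort(scores)
        ranks = np.empty(len(scores))
        ranks[order] = np.arange(1, len(scores) + 1)
        n_pos, n_neg = labels.sum(), (~labels).sum()
        return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

    # Hypothetical risk scores: 1 = dehiscence, 0 = no dehiscence
    y = [1, 1, 1, 0, 0, 0, 0, 0]
    score = [5.1, 4.6, 3.9, 4.2, 3.5, 3.3, 2.9, 3.6]
    print(f"AUC = {auc_roc(y, score):.2f}")
    ```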

  2. Hierarchical Multi-Scale Approach To Validation and Uncertainty Quantification of Hyper-Spectral Image Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Engel, David W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David; Thompson, Sandra E.

    2016-09-17

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  3. UTLS temperature validation of MPI-ESM decadal hindcast experiments with GPS radio occultations

    Directory of Open Access Journals (Sweden)

    Torsten Schmidt

    2016-12-01

    Global Positioning System (GPS) radio occultation (RO) temperature data are used to validate MPI-ESM (Max Planck Institute – Earth System Model) decadal hindcast experiments in the upper troposphere and lower stratosphere (UTLS) region, between 300 hPa and 10 hPa (8 km and 32 km), for the period between 2002 and 2011. The GPS-RO dataset is unique since it is very precise, calibration independent, and covers the globe better than the usual radiosonde dataset. In addition, it is vertically finer resolved than any of the existing satellite temperature measurements available for the UTLS and now provides a unique decade-long temperature validation dataset. The initialization of the MPI-ESM decadal hindcast runs mostly increases the skill of the atmospheric temperatures when compared to uninitialized climate projections, with very high skill scores for lead-year one that gradually decrease for the later lead-years. A comparison between two different initialization sets (b0, b1) of the low-resolution (LR) MPI-ESM shows increased skill in b1-LR in most parts of the UTLS, in particular in the tropics. The medium-resolution (MR) MPI-ESM initializations are characterized by reduced temperature biases in the uninitialized runs as compared to observations and better capture of the high-latitude northern hemisphere interannual polar vortex variability than the LR model version. Negative skill is found for the b1-MR hindcasts, however, in the regions around the mid-latitude tropospheric jets on both hemispheres and in the vicinity of the tropical tropopause, in comparison to the b1-LR variant. It is interesting to highlight that none of the model experiments can reproduce the observed positive temperature trend in the tropical tropopause region since 2001 as seen by GPS-RO data.

  4. Resampling procedures to validate dendro-auxometric regression models

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Regression analysis is widely used in several sectors of forest research. The validation of a dendro-auxometric model is a basic step in the building of the model itself. The more a model resists attempts to demonstrate its groundlessness, the more its reliability increases. In recent decades many new techniques that exploit the computational speed of modern computers have been formulated. Here we show the results obtained by the application of a bootstrap resampling procedure as a validation tool.
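
    A bootstrap validation resamples the fitting data with replacement, refits the regression on each resample, and scores it on the observations left out ("out-of-bag"). A sketch on synthetic diameter-volume data (the allometric form and noise level are invented for illustration, not the study's data):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Hypothetical dendro-auxometric data: tree diameter (cm) vs. stem volume (m^3)
    d = rng.uniform(10, 60, size=120)
    v = 0.0004 * d ** 2.2 * rng.lognormal(sigma=0.15, size=d.size)

    X = np.column_stack([np.ones_like(d), np.log(d)])  # log-log regression design
    y = np.log(v)

    n_boot, errors = 1000, []
    for _ in range(n_boot):
        idx = rng.integers(0, d.size, d.size)           # resample with replacement
        beta = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
        oob = np.setdiff1d(np.arange(d.size), idx)      # out-of-bag validation set
        errors.append(np.mean((y[oob] - X[oob] @ beta) ** 2))

    print(f"bootstrap out-of-bag MSE: {np.mean(errors):.4f}")
    ```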

  5. Predicting the ungauged basin: Model validation and realism assessment

    Directory of Open Access Journals (Sweden)

    Tim van Emmerik

    2015-10-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to the limited number of published studies on genuinely ungauged basins, model validation and realism assessment of model outcomes have not been discussed to a great extent. With this paper we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. This paper does not present a generic approach that can be transferred to other ungauged catchments, but it aims to show how clever model design and alternative data acquisition can result in a valuable hydrological model for an ungauged catchment.

  6. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  7. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  8. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a

  9. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  10. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01

    This paper provides a methodology for Validation and Verification (V&V) of a Bayesian Network (BN) model for aircraft vulnerability against Infrared (IR) missile threats. The model considers that the aircraft vulnerability depends both on a missile...

  11. Factor structure and construct validity of the temporal experience of pleasure scales.

    Science.gov (United States)

    Ho, Paul M; Cooper, Andrew J; Hall, Phillip J; Smillie, Luke D

    2015-01-01

    Feelings of pleasure felt in the moment of goal attainment (consummatory pleasure) are thought to be dissociable from feelings of desire connected with the motivated approach of goals (anticipatory pleasure). The Temporal Experience of Pleasure Scales (TEPS; Gard, Gard, Kring, & John, 2006) was developed to assess individual differences in these distinct processes. Recently, an independent evaluation of the psychometric characteristics of a Chinese-translated TEPS suggested a more complex factor structure (Chan et al., 2012). This study aimed to reexamine the factor structure and convergent and divergent validity of the TEPS in two previously unexamined multiethnic samples. University students in the United Kingdom (N = 294) completed the TEPS and university students in Australia (N = 295) completed the TEPS as well as a battery of conceptually related questionnaires. A confirmatory factor analysis of Gard et al.'s (2006) 2-factor model produced inadequate fit, which model-modification indexes suggested might be due to item cross-loadings. This issue was examined further using an exploratory factor analysis, which revealed a clear 2-factor solution despite cross-loadings among some items. Finally, mixed evidence for convergent-divergent validity was obtained, in terms of relationships between the TEPS and measures of anhedonia, approach-motivation, and positive emotion.

  12. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    Science.gov (United States)

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criteria, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criteria developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes in to account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be

  13. On the development and validation of QSAR models.

    Science.gov (United States)

    Gramatica, Paola

    2013-01-01

    The fundamental and most critical steps that are necessary for the development and validation of QSAR models are presented in this chapter as best practices in the field. These procedures are discussed in the context of predictive QSAR modelling that is focused on achieving models of the highest statistical quality and with external predictive power. The most important and most used statistical parameters needed to verify the real performance of QSAR models (of both linear regression and classification) are presented. Special emphasis is placed on the validation of models, both internally and externally, as well as on the need to define model applicability domains, which should be done when models are employed for the prediction of new external compounds.

  14. Experience with Aero- and Fluid-Dynamic Testing for Engineering and CFD Validation

    Science.gov (United States)

    Ross, James C.

    2016-01-01

    Ever since computations have been used to simulate aerodynamics, the need to ensure that the computations adequately represent real life has followed. Many experiments have been performed specifically for validation, and as computational methods have improved, so have the validation experiments. Validation is also a moving target because computational methods improve, requiring validation for the new aspects of flow physics that the computations aim to capture. Concurrently, new measurement techniques are being developed that can help capture more detailed flow features; pressure-sensitive paint (PSP) and particle image velocimetry (PIV) come to mind. This paper will present various wind-tunnel tests the author has been involved with and how they were used for validation of various kinds of CFD. A particular focus is the application of advanced measurement techniques to flow fields (and geometries) that had proven difficult to predict computationally. Many of these difficult flow problems arose from engineering and development problems that needed to be solved for a particular vehicle or research program. In some cases the experiments required to solve the engineering problems were refined to provide valuable CFD validation data in addition to the primary engineering data. All of these experiments have provided physical insight and validation data for a wide range of aerodynamic and acoustic phenomena for vehicles ranging from tractor-trailers to crewed spacecraft.

  15. Verification and Validation of Requirements on the CEV Parachute Assembly System Using Design of Experiments

    Science.gov (United States)

    Schulte, Peter Z.; Moore, James W.

    2011-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) project conducts computer simulations to verify that flight performance requirements on parachute loads and terminal rate of descent are met. Design of Experiments (DoE) provides a systematic method for variation of simulation input parameters. When implemented and interpreted correctly, a DoE study of parachute simulation tools indicates values and combinations of parameters that may cause requirement limits to be violated. This paper describes one implementation of DoE that is currently being developed by CPAS, explains how DoE results can be interpreted, and presents the results of several preliminary studies. The potential uses of DoE to validate parachute simulation models and verify requirements are also explored.
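
    A common DoE starting point is a full-factorial sweep: every combination of factor levels becomes one simulation run. A sketch using invented placeholder factors and levels (not CPAS values):

    ```python
    from itertools import product

    # Hypothetical factors and levels for a parachute-load simulation study
    factors = {
        "drag_coefficient":   [0.75, 0.85, 0.95],
        "deploy_altitude_ft": [8000, 10000],
        "payload_mass_lb":    [18000, 21000],
    }

    # Full-factorial design over all factor-level combinations
    runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
    print(f"{len(runs)} runs")  # 3 * 2 * 2 = 12
    for run in runs[:3]:
        print(run)
    ```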

  16. Development and validation of models for bubble coalescence and breakup

    International Nuclear Information System (INIS)

    Liao, Yiaxiang

    2013-01-01

    A generalized model for bubble coalescence and breakup has been developed, which is based on a comprehensive survey of existing theories and models. One important feature of the model is that all important mechanisms leading to bubble coalescence and breakup in a turbulent gas-liquid flow are considered. The new model is tested extensively in a 1D Test Solver and a 3D CFD code ANSYS CFX for the case of vertical gas-liquid pipe flow under adiabatic conditions, respectively. Two kinds of extensions of the standard multi-fluid model, i.e. the discrete population model and the inhomogeneous MUSIG (multiple-size group) model, are available in the two solvers, respectively. These extensions with suitable closure models such as those for coalescence and breakup are able to predict the evolution of bubble size distribution in dispersed flows and to overcome the mono-dispersed flow limitation of the standard multi-fluid model. For the validation of the model the high quality database of the TOPFLOW L12 experiments for air-water flow in a vertical pipe was employed. A wide range of test points, which cover the bubbly flow, turbulent-churn flow as well as the transition regime, is involved in the simulations. The comparison between the simulated results such as bubble size distribution, gas velocity and volume fraction and the measured ones indicates a generally good agreement for all selected test points. As the superficial gas velocity increases, bubble size distribution evolves via coalescence dominant regimes first, then breakup-dominant regimes and finally turns into a bimodal distribution. The tendency of the evolution is well reproduced by the model. However, the tendency is almost always overestimated, i.e. too much coalescence in the coalescence dominant case while too much breakup in breakup dominant ones. The reason of this problem is discussed by studying the contribution of each coalescence and breakup mechanism at different test points. The redistribution of the

  17. Dynamically Scaled Model Experiment of a Mooring Cable

    Directory of Open Access Journals (Sweden)

    Lars Bergdahl

    2016-01-01

    The dynamic response of mooring cables for marine structures is scale-dependent, and perfect dynamic similitude between full-scale prototypes and small-scale physical model tests is difficult to achieve. The best possible scaling is here sought by means of a specific set of dimensionless parameters, and the model accuracy is also evaluated by two alternative sets of dimensionless parameters. A special feature of the presented experiment is that a chain was scaled to have correct propagation celerity for longitudinal elastic waves, thus providing perfect geometrical and dynamic scaling in vacuum, which is unique. The scaling error due to incorrect Reynolds number seemed to be of minor importance. The 33 m experimental chain could then be considered a scaled 76 mm stud chain with the length 1240 m, i.e., at the length scale of 1:37.6. Due to the correct elastic scale, the physical model was able to reproduce the effect of snatch loads giving rise to tensional shock waves propagating along the cable. The results from the experiment were used to validate the newly developed cable-dynamics code, MooDy, which utilises a discontinuous Galerkin FEM formulation. The validation of MooDy proved to be successful for the presented experiments. The experimental data is made available here for validation of other numerical codes by publishing digitised time series of two of the experiments.
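
    The 1:37.6 scale quoted above is consistent with Froude scaling, under which times and velocities (hence wave celerities) scale with the square root of the length scale and forces with its cube, assuming the same fluid density. A sketch of the resulting scale factors; only the 1240 m / 33 m lengths come from the abstract, the rest is the standard similitude law:

    ```python
    # Froude scaling of prototype quantities to model scale
    LAMBDA = 1 / 37.6  # model:prototype length scale

    scales = {
        "length":       LAMBDA,
        "time":         LAMBDA ** 0.5,
        "velocity":     LAMBDA ** 0.5,  # also longitudinal wave celerity
        "force":        LAMBDA ** 3,    # assuming equal fluid density
        "mass":         LAMBDA ** 3,
        "stiffness EA": LAMBDA ** 3,    # needed for correct elastic wave speed
    }

    prototype_length_m = 1240.0
    print(f"model cable length: {prototype_length_m * scales['length']:.1f} m")  # ~33 m
    for quantity, factor in scales.items():
        print(f"{quantity:>12s} scale: {factor:.3e}")
    ```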

  18. Validation of RNAi Silencing Efficiency Using Gene Array Data shows 18.5% Failure Rate across 429 Independent Experiments

    Directory of Open Access Journals (Sweden)

    Gyöngyi Munkácsy

    2016-01-01

    No independent cross-validation of the success rate for studies utilizing small interfering RNA (siRNA) for gene silencing has been completed before. To assess the influence of experimental parameters like cell line, transfection technique, validation method, and type of control, we validated these in a large set of studies. We utilized gene chip data published for siRNA experiments to assess success rate and to compare methods used in these experiments. We searched NCBI GEO for samples with whole-transcriptome analysis before and after gene silencing and evaluated the efficiency for the target and off-target genes using the array-based expression data. The Wilcoxon signed-rank test was used to assess silencing efficacy, and Kruskal-Wallis tests and Spearman rank correlation were used to evaluate study parameters. Altogether 1,643 samples representing 429 experiments published in 207 studies were evaluated. The fold change (FC) of down-regulation of the target gene was above 0.7 in 18.5% and above 0.5 in 38.7% of experiments. Silencing efficiency was lowest in MCF7 and highest in SW480 cells (FC = 0.59 and FC = 0.30, respectively; P = 9.3E−06). Studies utilizing Western blot for validation performed better than those with quantitative polymerase chain reaction (qPCR) or microarray (FC = 0.43, FC = 0.47, and FC = 0.55, respectively; P = 2.8E−04). There was no correlation between type of control, transfection method, publication year, and silencing efficiency. Although gene silencing is a robust feature successfully cross-validated in the majority of experiments, efficiency remained insufficient in a significant proportion of studies. Selection of the cell line model and validation method had the highest influence on silencing proficiency.

  19. Validation of ozone measurements from the Atmospheric Chemistry Experiment (ACE

    Directory of Open Access Journals (Sweden)

    E. Dupuy

    2009-01-01

    This paper presents extensive bias determination analyses of ozone observations from the Atmospheric Chemistry Experiment (ACE) satellite instruments: the ACE Fourier Transform Spectrometer (ACE-FTS) and the Measurement of Aerosol Extinction in the Stratosphere and Troposphere Retrieved by Occultation (ACE-MAESTRO) instrument. Here we compare the latest ozone data products from ACE-FTS and ACE-MAESTRO with coincident observations from nearly 20 satellite-borne, airborne, balloon-borne and ground-based instruments, by analysing volume mixing ratio profiles and partial column densities. The ACE-FTS version 2.2 Ozone Update product reports more ozone than most correlative measurements from the upper troposphere to the lower mesosphere. At altitude levels from 16 to 44 km, the average values of the mean relative differences are nearly all within +1 to +8%. At higher altitudes (45–60 km), the ACE-FTS ozone amounts are significantly larger than those of the comparison instruments, with mean relative differences of up to +40% (about +20% on average). For the ACE-MAESTRO version 1.2 ozone data product, mean relative differences are within ±10% (average values within ±6%) between 18 and 40 km for both the sunrise and sunset measurements. At higher altitudes (~35–55 km), systematic biases of opposite sign are found between the ACE-MAESTRO sunrise and sunset observations. While ozone amounts derived from the ACE-MAESTRO sunrise occultation data are often smaller than the coincident observations (with mean relative differences down to −10%), the sunset occultation profiles for ACE-MAESTRO show results that are qualitatively similar to ACE-FTS, indicating a large positive bias (mean relative differences within +10 to +30%) in the 45–55 km altitude range. In contrast, there is no significant systematic difference in bias found for the ACE-FTS sunrise and sunset measurements.

  20. The turbulent viscosity models and their experimental validation; Les modeles de viscosite turbulente et leur validation experimentale

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This workshop on turbulent viscosity models and their experimental validation was organized by the 'convection' section of the French society of thermal engineers. Of the 9 papers presented during this workshop, 8 deal with the modeling of turbulent flows inside combustion chambers, turbo-machineries or other energy-related applications, and have been selected for ETDE. (J.S.)

  1. ALTWAVE: Toolbox for use of satellite L2P altimeter data for wave model validation

    Science.gov (United States)

    Appendini, Christian M.; Camacho-Magaña, Víctor; Breña-Naranjo, José Agustín

    2016-03-01

    Characterizing the world's ocean physical processes, such as wave height, wind speed and sea surface elevation, is a major need for coastal and marine infrastructure planning and design, tourism activities, and wave power and storm surge risk assessment, among others. Over the last decades, satellite remote sensing tools have provided quasi-global measurements of ocean altimetry by merging data from different satellite missions. While altimeter data are widely used for model validation, practical tools for such validation remain scarce. Our purpose is to fill this gap by introducing ALTWAVE, a user-oriented MATLAB toolbox for oceanographers and coastal engineers, developed to validate wave model results against satellite-derived altimetry based on visual features and statistical estimates. Our toolbox uses altimetry information from the GlobWave initiative, and provides a sample application validating a one-year wave hindcast for the Gulf of Mexico. ALTWAVE offers an effective toolbox for validating wave model results using altimeter data, as well as guidance for non-experienced satellite-data users. This article is intended for wave modelers with no experience using altimeter data to validate their results.
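
    Typical statistical estimates for a wave-model-versus-altimeter comparison are bias, RMSE, scatter index and correlation. A sketch of these metrics, written in Python for illustration (ALTWAVE itself is MATLAB) with invented collocated values:

    ```python
    import numpy as np

    def validation_stats(model_hs, altimeter_hs):
        """Common wave-model validation metrics against altimeter Hs."""
        m, o = np.asarray(model_hs), np.asarray(altimeter_hs)
        bias = np.mean(m - o)
        rmse = np.sqrt(np.mean((m - o) ** 2))
        si   = rmse / np.mean(o)           # scatter index
        r    = np.corrcoef(m, o)[0, 1]
        return bias, rmse, si, r

    # Hypothetical collocated significant wave heights (m)
    model = [1.2, 1.8, 2.4, 3.1, 2.0, 1.5]
    alti  = [1.1, 1.9, 2.6, 2.9, 2.2, 1.4]
    bias, rmse, si, r = validation_stats(model, alti)
    print(f"bias={bias:+.2f} m, RMSE={rmse:.2f} m, SI={si:.2f}, r={r:.2f}")
    ```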

  2. Validity of empirical models of exposure in asphalt paving

    Science.gov (United States)

    Burstyn, I; Boffetta, P; Burr, G; Cenni, A; Knecht, U; Sciarra, G; Kromhout, H

    2002-01-01

    Aims: To investigate the validity of empirical models of exposure to bitumen fume and benzo(a)pyrene, developed for a historical cohort study of asphalt paving in Western Europe. Methods: Validity was evaluated using data from the USA, Italy, and Germany not used to develop the original models. Correlation between observed and predicted exposures was examined. Bias and precision were estimated. Results: Models were imprecise. Furthermore, predicted bitumen fume exposures tended to be lower (-70%) than concentrations found during paving in the USA. This apparent bias might be attributed to differences between Western European and USA paving practices. Evaluation of the validity of the benzo(a)pyrene exposure model revealed a similar to expected effect of re-paving and a larger than expected effect of tar use. Overall, benzo(a)pyrene models underestimated exposures by 51%. Conclusions: Possible bias as a result of underestimation of the impact of coal tar on benzo(a)pyrene exposure levels must be explored in sensitivity analysis of the exposure–response relation. Validation of the models, albeit limited, increased our confidence in their applicability to exposure assessment in the historical cohort study of cancer risk among asphalt workers. PMID:12205236

  3. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; it estimates the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern, showing how it can be used: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank candidate approximations by quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer-scale models. Copyright © 2010 Elsevier Inc. All rights reserved.
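
    The idea can be sketched as follows, with hypothetical helper names: the statistic below measures the approximation's error in units of the original model's run-to-run variability, which captures the spirit, though not necessarily the exact form, of the proposed test:

    ```python
    import numpy as np

    def approximation_error(run_stochastic, approx_trajectory, n_runs=100):
        """Error of a deterministic approximation expressed in units of the
        original stochastic model's own run-to-run variability.

        run_stochastic: callable returning one simulated trajectory (1-D array)
        approx_trajectory: the trajectory predicted by the approximation
        """
        ensemble = np.array([run_stochastic() for _ in range(n_runs)])
        mean = ensemble.mean(axis=0)
        sd = ensemble.std(axis=0, ddof=1)
        # values well above ~1 flag an approximation error exceeding the
        # model's intrinsic stochastic variability
        return np.abs(np.asarray(approx_trajectory) - mean) / np.where(sd > 0, sd, np.inf)
    ```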

  4. Well-founded cost estimation validated by experience

    International Nuclear Information System (INIS)

    LaGuardia, T.S.

    2005-01-01

    to build consistency into its cost estimates. A standardized list of decommissioning activities needs to be adopted internationally so estimates can be prepared on a consistent basis, and to facilitate tracking of actual costs against the estimate. The OECD/NEA Standardized List incorporates the consensus of international experts as to the elements of cost and activities that should be included in the estimate. A significant effort was made several years ago to promote universal adoption of this standard. Using the standardized list of activities as a template, a questionnaire was distributed to gather actual decommissioning costs (and other parameters) from international projects. The results of cost estimate contributions from many countries were analyzed and evaluated as to reactor types, decommissioning strategies, cost drivers, and waste disposal quantities. The results were reported in the literature. A standardized list of activities will only be valuable if the underlying cost elements and methodology are clearly identified in the estimate. While no one would expect perfect correlation of every element of cost in a large project estimate-versus-actual cost comparison, the variances should be visible so the basis for the difference can be examined and evaluated. For the nuclear power industry to grow to meet the increasing demand for electricity, the investors, regulators and the public must understand the total cost of the nuclear fuel cycle. The costs for decommissioning and the funding requirements to provide for safe closure and dismantling of these units are well recognized to represent a significant liability to the owner utilities and governmental agencies. Owners and government regulatory agencies need benchmarked decommissioning costs to test the validity of each proposed cost and funding request. The benchmarking process requires the oversight of decommissioning experts to evaluate contributed cost data in a meaningful manner. An international

  5. Proton resonance frequency chemical shift thermometry: experimental design and validation toward high-resolution noninvasive temperature monitoring and in vivo experience in a nonhuman primate model of acute ischemic stroke.

    Science.gov (United States)

    Dehkharghani, S; Mao, H; Howell, L; Zhang, X; Pate, K S; Magrath, P R; Tong, F; Wei, L; Qiu, D; Fleischer, C; Oshinski, J N

    2015-06-01

    Applications for noninvasive biologic temperature monitoring are widespread in biomedicine and of particular interest in the context of brain temperature regulation, where traditionally costly and invasive monitoring schemes limit their applicability in many settings. Brain thermal regulation, therefore, remains controversial, motivating the development of noninvasive approaches such as temperature-sensitive nuclear MR phenomena. The purpose of this work was to compare the utility of competing approaches to MR thermometry by using proton resonance frequency chemical shift. We tested 3 methodologies, hypothesizing the feasibility of a fast and accurate approach to chemical shift thermometry, in a phantom study at 3T. A conventional, paired approach (difference [DIFF]-1), an accelerated single-scan approach (DIFF-2), and a new, further accelerated strategy (DIFF-3) were tested. Phantom temperatures were modulated during real-time fiber optic temperature monitoring, with MR thermometry derived simultaneously from temperature-sensitive changes in the water proton chemical shift (∼0.01 ppm/°C). MR thermometry was subsequently performed in a series of in vivo nonhuman primate experiments under physiologic and ischemic conditions, testing its reproducibility and overall performance. Chemical shift thermometry demonstrated excellent agreement with phantom temperatures for all 3 approaches (DIFF-1: linear regression R(2) = 0.994; DIFF-2: R(2) = 0.996; DIFF-3: R(2) = 0.998; P < .001 for all). These findings confirm the comparable performance of the 3 competing approaches to MR thermometry and present in vivo applications under physiologic and ischemic conditions in a primate stroke model. © 2015 by American Journal of Neuroradiology.

  6. Calibration and validation of a model describing complete autotrophic nitrogen removal in a granular SBR system

    DEFF Research Database (Denmark)

    Vangsgaard, Anna Katrine; Mutlu, Ayten Gizem; Gernaey, Krist

    2013-01-01

    BACKGROUND: A validated model describing the nitritation-anammox process in a granular sequencing batch reactor (SBR) system is an important tool for: a) design of future experiments and b) prediction of process performance during optimization, while applying process control, or during system scale-up. RESULTS: A model was calibrated using a step-wise procedure customized for the specific needs of the system. The important steps in the procedure were initialization, steady-state and dynamic calibration, and validation. A fast and effective initialization approach was developed to approximate pseudo-steady state conditions … screening of the parameter space proposed by Sin et al. (2008) to find the best fit of the model to dynamic data. Finally, the calibrated model was validated with an independent data set. CONCLUSION: The presented calibration procedure is the first customized procedure for this type of system …

  7. Docking Validation Resources: Protein Family and Ligand Flexibility Experiments

    Science.gov (United States)

    Mukherjee, Sudipto; Balius, Trent E.; Rizzo, Robert C.

    2010-01-01

    A database consisting of 780 ligand-receptor complexes, termed SB2010, has been derived from the Protein Databank to evaluate the accuracy of docking protocols for regenerating bound ligand conformations. The goal is to provide easily accessible community resources for development of improved procedures to aid virtual screening for ligands with a wide range of flexibilities. Three core experiments using the program DOCK, which employ rigid (RGD), fixed anchor (FAD), and flexible (FLX) protocols, were used to gauge performance by several different metrics: (1) global results, (2) ligand flexibility, (3) protein family, and (4) crossdocking. Global spectrum plots of successes and failures vs RMSD reveal well-defined inflection regions, which suggest the commonly used 2 Å criterion is a reasonable choice for defining success. Across all 780 systems, success tracks with the relative difficulty of the calculations: RGD (82.3%) > FAD (78.1%) > FLX (63.8%). In general, failures due to scoring strongly outweigh those due to sampling. Subsets of SB2010 grouped by ligand flexibility (7-or-less, 8-to-15, and 15-plus rotatable bonds) reveal success degrades linearly for FAD and FLX protocols, in contrast to RGD which remains constant. Despite the challenges associated with FLX anchor orientation and on-the-fly flexible growth, success rates for the 7-or-less (74.5%), and in particular the 8-to-15 (55.2%) subset, are encouraging. Poorer results for the very flexible 15-plus set (39.3%) indicate substantial room for improvement. Family-based success appears largely independent of ligand flexibility, suggesting a strong dependence on the binding site environment. For example, zinc-containing proteins are generally problematic despite moderately flexible ligands. Finally, representative crossdocking examples, for carbonic anhydrase, thermolysin, and neuraminidase families, show the utility of family-based analysis for rapid identification of particularly good or bad docking trends.
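
    A minimal sketch of the success-rate bookkeeping described here (best-pose RMSD against the 2 Å criterion, binned by rotatable-bond count); the function and subset names are illustrative, not part of DOCK:

    ```python
    import numpy as np

    def success_rates_by_flexibility(rmsd, rotatable_bonds, cutoff=2.0):
        """Docking success rates (pose RMSD <= cutoff, in angstroms) for the
        7-or-less, 8-to-15 and 15-plus rotatable-bond subsets."""
        rmsd = np.asarray(rmsd, dtype=float)
        rb = np.asarray(rotatable_bonds)
        subsets = {
            "7-or-less": rb <= 7,
            "8-to-15": (rb >= 8) & (rb <= 15),
            "15-plus": rb > 15,
        }
        return {name: float(np.mean(rmsd[mask] <= cutoff))
                for name, mask in subsets.items() if mask.any()}
    ```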

  8. Empirical validation data sets for double skin facade models

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During recent years, application of double skin facades (DSF) has greatly increased. However, successful application depends heavily on reliable and validated models for simulation of the DSF performance, and this in turn requires access to high quality experimental data. Three sets of accurate empirical data for validation of DSF modeling with building simulation software were produced within the International Energy Agency (IEA) SHC Task 34 / ECBCS Annex 43. This paper describes the full-scale outdoor experimental test facility, the experimental set-up and the measurement procedure …

  9. Numerical experiments modelling turbulent flows

    Science.gov (United States)

    Trefilík, Jiří; Kozel, Karel; Příhoda, Jaromír

    2014-03-01

    The work investigates the possibilities of modelling transonic flows, mainly in external aerodynamics. New results are presented and compared with reference data and previously achieved results. For the turbulent flow simulations, two modifications of the basic k - ω model are employed: SST and TNT. The numerical solution was achieved by using the MacCormack scheme on structured non-orthogonal grids. Artificial dissipation was added to improve the numerical stability.

  10. Numerical experiments modelling turbulent flows

    Directory of Open Access Journals (Sweden)

    Trefilík Jiří

    2014-03-01

    The work investigates the possibilities of modelling transonic flows, mainly in external aerodynamics. New results are presented and compared with reference data and previously achieved results. For the turbulent flow simulations, two modifications of the basic k – ω model are employed: SST and TNT. The numerical solution was achieved by using the MacCormack scheme on structured non-orthogonal grids. Artificial dissipation was added to improve the numerical stability.
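
    For readers unfamiliar with the scheme, the sketch below shows one MacCormack predictor-corrector step, with added artificial dissipation, for a 1-D periodic model problem; the papers apply the scheme to multi-dimensional transonic flow on structured non-orthogonal grids, so this is an illustration of the method only:

    ```python
    import numpy as np

    def maccormack_step(u, nu, eps=0.01):
        """One MacCormack predictor-corrector step for the linear advection
        equation u_t + a*u_x = 0 on a periodic 1-D grid, nu = a*dt/dx <= 1,
        with a simple second-difference artificial dissipation term."""
        u_pred = u - nu * (np.roll(u, -1) - u)                            # predictor: forward difference
        u_new = 0.5 * (u + u_pred - nu * (u_pred - np.roll(u_pred, 1)))   # corrector: backward difference
        u_new += eps * (np.roll(u, -1) - 2.0 * u + np.roll(u, 1))         # artificial dissipation
        return u_new
    ```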

  11. Validation of spectral gas radiation models under oxyfuel conditions

    Energy Technology Data Exchange (ETDEWEB)

    Becher, Johann Valentin

    2013-05-15

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard computational-fluid-dynamics (CFD) spectral gas radiation models for air combustion are therefore out of their validity range in oxyfuel combustion. This thesis provides a common spectral basis for the validation of new spectral models. A literature review about fundamental gas radiation theory, spectral modeling and experimental methods provides the reader with a basic understanding of the topic. In the first results section, this thesis validates detailed spectral models with high resolution spectral measurements in a gas cell, with the aim of recommending one model as the best benchmark model. In the second results section, spectral measurements from a turbulent natural gas flame - as an example for a technical combustion process - are compared to simulated spectra based on measured gas atmospheres. The third results section compares simplified spectral models to the benchmark model recommended in the first results section and gives a ranking of the proposed models based on their accuracy. A concluding section gives recommendations for the selection and further development of simplified spectral radiation models. Gas cell transmissivity spectra in the spectral range of 2.4–5.4 μm of water vapor and carbon dioxide, in the temperature range from 727 °C to 1500 °C and at different concentrations, were compared in the first results section, at a nominal resolution of 32 cm⁻¹, to line-by-line models from different databases, two statistical-narrow-band models and the exponential-wide-band model. The two statistical-narrow-band models EM2C and RADCAL showed good agreement, with a maximal band transmissivity deviation of 3%. The exponential-wide-band model showed a deviation of 6%. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was therefore recommended as the reference model for the validation of the simplified spectral models.
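
    As a pointer to the quantities being compared, the sketch below evaluates gas-cell transmissivity from the Beer-Lambert law and a simple band-averaged deviation metric; this is one plausible reading of the "band transmissivity deviation" quoted above, not the thesis' exact definition:

    ```python
    import numpy as np

    def cell_transmissivity(kappa, path_length):
        """Spectral transmissivity of a homogeneous gas cell, tau = exp(-kappa*L)
        (Beer-Lambert law), with kappa in 1/m and path_length in m."""
        return np.exp(-np.asarray(kappa, dtype=float) * path_length)

    def band_transmissivity_deviation(tau_model, tau_measured):
        """Mean absolute model-measurement deviation over a spectral band."""
        return float(np.mean(np.abs(np.asarray(tau_model) - np.asarray(tau_measured))))
    ```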

  12. Improving the quality of discrete-choice experiments in health: how can we assess validity and reliability?

    Science.gov (United States)

    Janssen, Ellen M; Marshall, Deborah A; Hauber, A Brett; Bridges, John F P

    2017-12-01

    The recent endorsement of discrete-choice experiments (DCEs) and other stated-preference methods by regulatory and health technology assessment (HTA) agencies has placed a greater focus on demonstrating the validity and reliability of preference results. Areas covered: We present a practical overview of tests of validity and reliability that have been applied in the health DCE literature and explore other study qualities of DCEs. From the published literature, we identify a variety of methods to assess the validity and reliability of DCEs. We conceptualize these methods to create a conceptual model with four domains: measurement validity, measurement reliability, choice validity, and choice reliability. Each domain consists of three categories that can be assessed using one to four procedures (for a total of 24 tests). We present how these tests have been applied in the literature and direct readers to applications of these tests in the health DCE literature. Based on a stakeholder engagement exercise, we consider the importance of study characteristics beyond traditional concepts of validity and reliability. Expert commentary: We discuss study design considerations to assess the validity and reliability of a DCE, consider limitations to the current application of tests, and discuss future work to consider the quality of DCEs in healthcare.

  13. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    An anomaly in legislation is absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research, as presented in this paper, is modelling legislation by capturing domain knowledge of legislation and specifying it in a generic way by using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation make it possible to understand the system better, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus pragmatic features and attributes can be determined that could be relevant for evaluation purposes of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of some feature for a specific model.

  14. Validation Experiments for Spent-Fuel Dry-Cask In-Basket Convection

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Barton L. [Utah State Univ., Logan, UT (United States). Dept. of Mechanical and Aerospace Engineering

    2016-08-16

    This work consisted of the following major efforts: 1. Literature survey on validation of external natural convection; 2. Design the experiment; 3. Build the experiment; 4. Run the experiment; 5. Collect results; 6. Disseminate results; and 7. Perform a CFD validation study using the results. We note that while all tasks are complete, some deviations from the original plan were made. Specifically, geometrical changes in the parameter space were skipped in favor of flow condition changes, which were found to be much more practical to implement. Changing the geometry required new as-built measurements, which proved extremely costly and impractical given the time and funds available.

  15. Validation Experiments for Spent-Fuel Dry-Cask In-Basket Convection

    International Nuclear Information System (INIS)

    Smith, Barton L.

    2016-01-01

    This work consisted of the following major efforts: 1. Literature survey on validation of external natural convection; 2. Design the experiment; 3. Build the experiment; 4. Run the experiment; 5. Collect results; 6. Disseminate results; and 7. Perform a CFD validation study using the results. We note that while all tasks are complete, some deviations from the original plan were made. Specifically, geometrical changes in the parameter space were skipped in favor of flow condition changes, which were found to be much more practical to implement. Changing the geometry required new as-built measurements, which proved extremely costly and impractical given the time and funds available.

  16. Experiment designs offered for discussion preliminary to an LLNL field scale validation experiment in the Yucca Mountain Exploratory Shaft Facility

    International Nuclear Information System (INIS)

    Lowry, B.; Keller, C.

    1988-01-01

    It has been proposed ("Progress Report on Experiment Rationale for Validation of LLNL Models of Ground Water Behavior Near Nuclear Waste Canisters," Keller and Lowry, Dec. 7, 1988) that a heat-generating spent fuel canister emplaced in unsaturated tuff, in a ventilated hole, will cause a net flux of water into the borehole during the heating cycle of the spent fuel. Accompanying this mass flux will be the formation of mineral deposits near the borehole wall as the water evaporates and leaves behind its dissolved solids. The net effect of this process upon the containment of radioactive wastes is a function of (1) where and how much solid material is deposited in the tuff matrix and cracks, and (2) the resultant effect on the medium flow characteristics. Experimental concepts described in this report are designed to quantify the magnitude and relative location of solid mineral deposit formation due to a heated and vented borehole environment. The most simple tests address matrix effects only; after the process is understood in the homogeneous matrix, fracture effects would be investigated. Three experiment concepts have been proposed. Each has unique advantages and allows investigation of specific aspects of the precipitate formation process. All could be done in reasonable time (less than a year) and none of them are extremely expensive (the most expensive is probably the structurally loaded block test). The calculational ability exists to analyze the "real" situation and each of the experiment designs, and produce a credible series of tests. None of the designs requires the acquisition of material property data beyond current capabilities. The tests could be extended, if our understanding is consistent with the data produced, to analyze fracture effects. 7 figs.

  17. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real world phenomena that have specific properties, especially those that are large in scale and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest among users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and applied to a wider range of applications than traditional simulation. But a key challenge for ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  18. Validation of ASTECV2.1 based on the QUENCH-08 experiment

    Energy Technology Data Exchange (ETDEWEB)

    Gómez-García-Toraño, Ignacio, E-mail: ignacio.torano@kit.edu [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology (INR), Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Sánchez-Espinoza, Víctor-Hugo; Stieglitz, Robert [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology (INR), Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Stuckert, Juri [Karlsruhe Institute of Technology, Institute for Applied Materials-Applied Materials Physics (IAM-AWP), Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Laborde, Laurent; Belon, Sébastien [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), Nuclear Safety Division/Safety Research/Severe Accident Department, Saint Paul Lez Durance 13115 (France)

    2017-04-01

    Highlights: • ASTECV2.1 can reproduce QUENCH-08 experimental trends e.g. hydrogen generation. • Radial temperature gradient and heat transfer through argon gap are underestimated. • Mesh sizes lower than 55 mm needed to capture the strong axial temperature gradient. • Minor variations of external electrical resistance strongly affect bundle heat-up. • Modelling of a bypass and inclusion of currents partially overcome discrepancies. - Abstract: The Fukushima accidents have shown that further improvements of Severe Accident Management Guidelines (SAMGs) are still necessary. Hence, the enhancement of severe accident codes and their validation based on integral experiments is pursued worldwide. In particular, the capabilities of the European integral severe accident code ASTECV2.1 are being extended within the CESAM project through the improvement of physical models, code numerics and an extensive code validation. Among the different strategies encompassed in the plant SAMGs, one of the most important ones to prevent core damage is the injection of water into the overheated core (reflooding). However, under certain conditions, reflooding may trigger a sharp hydrogen generation that may jeopardize the containment. Within this work, ASTECV2.1 models describing the early in-vessel phase of the severe accident and its termination by core reflooding are validated against data from the QUENCH test facility. The QUENCH-08 test, involving the injection of 15 g/s (about 0.6 g/s/rod) of saturated steam at a bundle temperature of 2073 K, has been selected for this comparison. Results show that ASTECV2.1 is able to reproduce the experimental temperatures and oxide thicknesses at representative bundle locations. The predicted total hydrogen generation (76 g) is similar to the experimental one (84 g). In addition, the choices of an axial mesh size lower than 55 mm and of an external electrical resistance of 7 mΩ/rod have been justified with parametric analyses. Finally, new

  19. Thermal hydraulic model validation for HOR mixed core fuel management

    International Nuclear Information System (INIS)

    Gibcus, H.P.M.; Vries, J.W. de; Leege, P.F.A. de

    1997-01-01

    A thermal-hydraulic core management model has been developed for the Hoger Onderwijsreactor (HOR), a 2 MW pool-type university research reactor. The model was adopted for safety analysis purposes in the framework of HEU/LEU core conversion studies. It is applied in the thermal-hydraulic computer code SHORT (Steady-state HOR Thermal-hydraulics), which is presently in use for designing core configurations and for in-core fuel management. An elaborate measurement program was performed to establish the core hydraulic characteristics for a variety of conditions. The hydraulic data were obtained with a dummy fuel element with special equipment allowing, among other things, direct measurement of the true core flow rate. Using these data the thermal-hydraulic model was validated experimentally. The model, experimental tests, and model validation are discussed. (author)

  20. Validation of Computer Models for Homeland Security Purposes

    International Nuclear Information System (INIS)

    Schweppe, John E.; Ely, James; Kouzes, Richard T.; McConn, Ronald J.; Pagh, Richard T.; Robinson, Sean M.; Siciliano, Edward R.; Borgardt, James D.; Bender, Sarah E.; Earnhart, Alison H.

    2005-01-01

    At Pacific Northwest National Laboratory, we are developing computer models of radiation portal monitors for screening vehicles and cargo. Detailed models of the radiation detection equipment, vehicles, cargo containers, cargos, and radioactive sources have been created. These are used to determine the optimal configuration of detectors and the best alarm algorithms for the detection of items of interest while minimizing nuisance alarms due to the presence of legitimate radioactive material in the commerce stream. Most of the modeling is done with the Monte Carlo code MCNP to describe the transport of gammas and neutrons from extended sources through large, irregularly shaped absorbers to large detectors. A fundamental prerequisite is the validation of the computational models against field measurements. We describe the first step of this validation process, the comparison of the models to measurements with bare static sources.

  1. Experimental Validation of a Mathematical Model for Seabed Liquefaction Under Waves

    DEFF Research Database (Denmark)

    Sumer, B. Mutlu; Kirca, Özgür; Fredsøe, Jørgen

    2012-01-01

    This paper summarizes the results of an experimental study directed towards the validation of a mathematical model for the buildup of pore water pressure and resulting liquefaction of marine soils under progressive waves. Experiments were conducted under controlled conditions with silt (d50 = 0.070 mm) in a wave flume with a soil pit. Waves with wave heights in the range of 7.7-18 cm, 55-cm water depth and 1.6-s wave period enabled us to study pore water pressure buildup in both the liquefaction and no-liquefaction regimes. The experimental data were used to validate the model. A numerical example …

  2. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...

  3. Experimental Validation of Flow Force Models for Fast Switching Valves

    DEFF Research Database (Denmark)

    Bender, Niels Christian; Pedersen, Henrik Clemmensen; Nørgård, Christian

    2017-01-01

    This paper comprises a detailed study of the forces acting on a Fast Switching Valve (FSV) plunger. The objective is to investigate to what extent different models are valid to be used for design purposes. These models depend on the geometry of the moving plunger and the properties of the surrounding fluid … velocity is non-zero. This is the case in FSVs, where it results in an additional dampening effect, which is of relevance when analyzing contact-impact. Experimental data from different test cases of an FSV have been gathered, with the plunger moving through a medium of either oil or air. These data are used to compare and validate different models, where an effort is directed towards capturing the fluid squeeze effect just before material-on-material contact. The test data are compared with simulation data relying solely on analytic formulations. The general dynamics of the plunger is validated …
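
    As background for the squeeze effect mentioned above, the sketch below evaluates the classical lubrication-theory squeeze-film force for a circular face approaching a flat seat; this textbook relation is a stand-in assumption for illustration, not the authors' own formulation:

    ```python
    import numpy as np

    def squeeze_film_force(mu, radius, h, h_dot):
        """Classical squeeze-film force F = 3*pi*mu*R^4*(-dh/dt) / (2*h^3)
        resisting a circular plunger face approaching a flat seat.

        mu: dynamic viscosity (Pa*s), radius: face radius (m),
        h: film thickness (m), h_dot: dh/dt (m/s, negative when closing).
        """
        return 3.0 * np.pi * mu * radius**4 * (-h_dot) / (2.0 * h**3)
    ```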

  4. Modeling and validation of microwave ablations with internal vaporization.

    Science.gov (United States)

    Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L

    2015-02-01

    Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this paper, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10, and 20 mm away from the heating zone of the microwave antenna in a homogenized ex vivo bovine liver setup. Cross-sectional area of water vapor transport was validated through intraprocedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with a Jaccard index of 0.27, 0.49, 0.61, 0.67, and 0.69 at 1, 2, 3, 4, and 5 min, respectively. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally.
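
    The Jaccard index used to compare the contours is straightforward to compute from binary masks; a minimal sketch, with hypothetical argument names:

    ```python
    import numpy as np

    def jaccard_index(mask_a, mask_b):
        """Jaccard index |A & B| / |A | B| between two binary masks, e.g. a
        CT iso-density contour and a simulated vapor-concentration contour."""
        a = np.asarray(mask_a, dtype=bool)
        b = np.asarray(mask_b, dtype=bool)
        union = np.logical_or(a, b).sum()
        return float(np.logical_and(a, b).sum() / union) if union else 1.0
    ```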

  5. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: • The paper discusses the validation of creep rupture models derived from statistical analysis. • It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. • The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. • The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  6. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    CERN Document Server

    Apostolakis, J; Bagulya, A; Brown, J M C; Burkhardt, H; Chikuma, N; Cortes-Giraldo, M A; Elles, S; Grichine, V; Guatelli, S; Incerti, S; Ivanchenko, V N; Jacquemier, J; Kadri, O; Maire, M; Pandola, L; Sawkey, D; Toshito, T; Urban, L; Yamashita, T

    2015-01-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  7. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  8. Proton Resonance Frequency Chemical Shift Thermometry: Experimental Design and Validation Towards High-Resolution Non-Invasive Temperature Monitoring, and in vivo Experience in a Non-human Primate Model of Acute Ischemic Stroke

    Science.gov (United States)

    Dehkharghani, Seena; Mao, Hui; Howell, Leonard; Zhang, Xiaodong; Pate, K S; Magrath, P R; Tong, Frank; Wei, L; Qiu, D; Fleischer, C; Oshinski, J N

    2016-01-01

    BACKGROUND AND PURPOSE Applications for non-invasive biological temperature monitoring are widespread in biomedicine, and of particular interest in the context of brain temperature regulation, where traditionally costly and invasive monitoring schemes limit their applicability in many settings. Brain thermal regulation therefore remains controversial, motivating the development of non-invasive approaches such as temperature-sensitive NMR phenomena. The purpose of this work was to compare the utility of competing approaches to MR thermometry (MRT) employing proton resonance frequency chemical shift. Three methodologies were tested, hypothesizing the feasibility of a fast and accurate approach to chemical shift thermometry, in a phantom study at 3.0 Tesla. MATERIALS AND METHODS A conventional, paired approach (DIFF-1), an accelerated single-scan approach (DIFF-2), and a new, further accelerated strategy (DIFF-3) were tested. Phantom temperatures were modulated during real-time fiber optic temperature monitoring, with MRT derived simultaneously from temperature-sensitive changes in the water proton chemical shift (~0.01 ppm/°C). MRT was subsequently performed in a series of in vivo non-human primate experiments under physiologic and ischemic conditions, testing its reproducibility and overall performance. RESULTS Chemical shift thermometry demonstrated excellent agreement with phantom temperatures for all three approaches (DIFF-1 linear regression R2=0.994, p<0.001, acquisition time 4 min 40 s; DIFF-2 R2=0.996, p<0.001, acquisition time 4 min; DIFF-3 R2=0.998, p<0.001, acquisition time 40 s). CONCLUSION These findings confirm the comparability in performance of three competing approaches to MRT, and present in vivo applications under physiologic and ischemic conditions in a primate stroke model. PMID:25655874
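
    For orientation, the sketch below converts a gradient-echo phase difference into a temperature change using the standard PRF relation and the ~0.01 ppm/°C coefficient cited above; the constants and sign convention are assumptions (they vary by scanner and protocol), and this is a generic illustration, not the authors' DIFF-1/2/3 pipeline:

    ```python
    import numpy as np

    GAMMA = 2 * np.pi * 42.58e6   # proton gyromagnetic ratio (rad/s/T)
    ALPHA = -0.01e-6              # PRF thermal coefficient, about -0.01 ppm/degC
    B0 = 3.0                      # field strength (T), as in the study

    def prf_temperature_change(phase_ref, phase_heated, te):
        """Temperature-change map (degC) from two gradient-echo phase images
        (rad) acquired at echo time te (s): dT = dphi / (GAMMA*ALPHA*B0*te)."""
        dphi = np.angle(np.exp(1j * (phase_heated - phase_ref)))  # wrap to (-pi, pi]
        return dphi / (GAMMA * ALPHA * B0 * te)
    ```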

  9. Multiphysics modelling and experimental validation of high concentration photovoltaic modules

    International Nuclear Information System (INIS)

    Theristis, Marios; Fernández, Eduardo F.; Sumner, Mike; O'Donovan, Tadhg S.

    2017-01-01

    Highlights: • A multiphysics modelling approach for concentrating photovoltaics was developed. • An experimental campaign was conducted to validate the models. • The experimental results were in good agreement with the models. • The multiphysics modelling allows the concentrator’s optimisation. - Abstract: High concentration photovoltaics, equipped with high efficiency multijunction solar cells, have great potential in achieving cost-effective and clean electricity generation at utility scale. Such systems are more complex compared to conventional photovoltaics because of the multiphysics effect that is present. Modelling the power output of such systems is therefore crucial for their further market penetration. Following this line, a multiphysics modelling procedure for high concentration photovoltaics is presented in this work. It combines an open source spectral model, a single diode electrical model and a three-dimensional finite element thermal model. In order to validate the models and the multiphysics modelling procedure against actual data, an outdoor experimental campaign was conducted in Albuquerque, New Mexico using a high concentration photovoltaic monomodule that is thoroughly described in terms of its geometry and materials. The experimental results were in good agreement (within 2.7%) with the predicted maximum power point. This multiphysics approach is relatively more complex when compared to empirical models, but besides the overall performance prediction it can also provide better understanding of the physics involved in the conversion of solar irradiance into electricity. It can therefore be used for the design and optimisation of high concentration photovoltaic modules.
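
    The single diode electrical model mentioned above has a standard implicit form, sketched below with a numerical solve for the cell current; this is a generic formulation with illustrative parameter names, not the paper's calibrated monomodule model:

    ```python
    import numpy as np
    from scipy.optimize import fsolve

    K_B, Q_E = 1.380649e-23, 1.602176634e-19  # Boltzmann constant (J/K), elementary charge (C)

    def single_diode_current(v, i_ph, i_0, r_s, r_sh, n, t_cell=298.15):
        """Solve the implicit single-diode equation for the cell current I:
        I = I_ph - I_0*(exp((V + I*R_s)/(n*Vt)) - 1) - (V + I*R_s)/R_sh."""
        vt = K_B * t_cell / Q_E  # thermal voltage, ~25.7 mV at 25 degC
        def residual(i):
            return (i_ph - i_0 * np.expm1((v + i * r_s) / (n * vt))
                    - (v + i * r_s) / r_sh - i)
        return float(fsolve(residual, x0=i_ph)[0])
    ```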

  10. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Calderer, Antoni [Univ. of Minnesota, Minneapolis, MN (United States); Yang, Xiaolei [Stony Brook Univ., NY (United States); Angelidis, Dionysios [Univ. of Minnesota, Minneapolis, MN (United States); Feist, Chris [Univ. of Minnesota, Minneapolis, MN (United States); Guala, Michele [Univ. of Minnesota, Minneapolis, MN (United States); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guo, Xin [Univ. of Minnesota, Minneapolis, MN (United States); Boomsma, Aaron [Univ. of Minnesota, Minneapolis, MN (United States); Shen, Lian [Univ. of Minnesota, Minneapolis, MN (United States); Sotiropoulos, Fotis [Stony Brook Univ., NY (United States)

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  11. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  12. Model validation studies of solar systems, Phase III. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lantz, L.J.; Winn, C.B.

    1978-12-01

    Results obtained from a validation study of the TRNSYS, SIMSHAC, and SOLCOST solar system simulation and design programs are presented. Also included are comparisons between the FCHART and SOLCOST solar system design programs and some changes that were made to the SOLCOST program. Finally, results obtained from the analysis of several solar radiation models are presented. Separate abstracts were prepared for ten papers.

  13. A Comparison and Validation of Two Surface Ship Readiness Models

    Science.gov (United States)

    1994-09-01

    they cannot be considered validated. Any application of these programs without additional verification is at the risk of the user. The document also includes the SAS code that was used to perform the full model run for the SIM.

  14. Validating Work Discrimination and Coping Strategy Models for Sexual Minorities

    Science.gov (United States)

    Chung, Y. Barry; Williams, Wendi; Dispenza, Franco

    2009-01-01

    The purpose of this study was to validate and expand on Y. B. Chung's (2001) models of work discrimination and coping strategies among lesbian, gay, and bisexual persons. In semistructured individual interviews, 17 lesbians and gay men reported 35 discrimination incidents and their related coping strategies. Responses were coded based on Chung's…

  15. Validity of the Bersohn–Zewail model beyond justification

    DEFF Research Database (Denmark)

    Petersen, Jakob; Henriksen, Niels Engholm; Møller, Klaus Braagaard

    2012-01-01

    excellent agreement between the classical trajectory and the average position of the excited state wave packet. By investigating the approximations connecting the nuclear dynamics described by quantum mechanics and the BZ model, we conclude that this agreement goes far beyond the validity of the individual...

  16. Improving Perovskite Solar Cells: Insights From a Validated Device Model

    NARCIS (Netherlands)

    Sherkar, Tejas S.; Momblona, Cristina; Gil-Escrig, Lidon; Bolink, Henk J.; Koster, L. Jan Anton

    2017-01-01

    To improve the efficiency of existing perovskite solar cells (PSCs), a detailed understanding of the underlying device physics during their operation is essential. Here, a device model has been developed and validated that describes the operation of PSCs and quantitatively explains the role of

  17. Validating soil phosphorus routines in the SWAT model

    Science.gov (United States)

    Phosphorus transfer from agricultural soils to surface waters is an important environmental issue. Commonly used models like SWAT have not always been updated to reflect improved understanding of soil P transformations and transfer to runoff. Our objective was to validate the ability of the P routines …

  18. Experience economy meets business model design

    DEFF Research Database (Denmark)

    Gudiksen, Sune Klok; Smed, Søren Graakjær; Poulsen, Søren Bolvig

    2012-01-01

    Through the last decade the experience economy has found solid ground and manifested itself as a parameter on which businesses and organizations can differentiate themselves from competitors. The fundamental premise is the one found in Pine & Gilmore's model from 1999 of 'the progression of economic value', where … companies automatically get a higher price when offering an experience setting to the customer, illustrated by the coffee example. Organizations that offer experiences still have an advantage, but when an increasing number of organizations enter the experience economy the competition naturally gets tougher … produced, designed or staged experience that gains the most profit or creates return on investment. It becomes more obvious that other parameters can in the future be a vital part of the experience economy, and one of these is business model innovation. Business model innovation is about continuous …

  19. Data Set for Empirical Validation of Double Skin Facade Model

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During the recent years the attention to the double skin facade (DSF) concept has greatly increased. Nevertheless, the application of the concept depends on whether a reliable model for simulation of the DSF performance will be developed or pointed out. This is, however, not possible to do until the model is empirically validated and its limitations for DSF modeling are identified. Correspondingly, the existence and availability of experimental data is essential. Three sets of accurate empirical data for validation of DSF modeling with building simulation software were produced within the International Energy Agency (IEA) SHC Task 34 / ECBCS Annex 43, covering three operating modes of a double skin facade: 1. External air curtain mode, i.e. the naturally ventilated DSF cavity with the top and bottom openings open to the outdoor; 2. Thermal insulation mode, with all of the DSF openings closed; 3. Preheating mode, with the bottom DSF openings open to the outdoor and the top openings open to the indoor.

  20. The database for reaching experiments and models.

    Directory of Open Access Journals (Sweden)

    Ben Walker

    Reaching is one of the central experimental paradigms in the field of motor control, and many computational models of reaching have been published. While most of these models try to explain subject data (such as movement kinematics, reaching performance, forces, etc.) from only a single experiment, distinct experiments often share experimental conditions and record similar kinematics. This suggests that reaching models could be applied to (and falsified by) multiple experiments. However, using multiple datasets is difficult because experimental data formats vary widely. Standardizing data formats promises to enable scientists to test model predictions against many experiments and to compare experimental results across labs. Here we report on the development of a new resource available to scientists: a database of reaching called the Database for Reaching Experiments And Models (DREAM). DREAM collects both experimental datasets and models and facilitates their comparison by standardizing formats. The DREAM project promises to be useful for experimentalists who want to understand how their data relate to models, for modelers who want to test their theories, and for educators who want to help students better understand reaching experiments, models, and data analysis.

  1. Validation Study of Unnotched Charpy and Taylor-Anvil Impact Experiments using Kayenta

    Energy Technology Data Exchange (ETDEWEB)

    Kamojjala, Krishna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Lacy, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chu, Henry S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Brannon, Rebecca [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

    Validation of a single computational model with multiple available strain-to-failure fracture theories is presented through experimental tests and numerical simulations of the standardized unnotched Charpy and Taylor-anvil impact tests, both run using the same material model (Kayenta). Unnotched Charpy tests are performed on rolled homogeneous armor steel. The fracture patterns using Kayenta’s various failure options that include aleatory uncertainty and scale effects are compared against the experiments. Other quantities of interest include the average value of the absorbed energy and bend angle of the specimen. Taylor-anvil impact tests are performed on Ti6Al4V titanium alloy. The impact speeds of the specimen are 321 m/s and 393 m/s. The goal of the numerical work is to reproduce the damage patterns observed in the laboratory. For the numerical study, the Johnson-Cook failure model is used as the ductile fracture criterion, and aleatory uncertainty is applied to rate-dependence parameters to explore its effect on the fracture patterns.
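
    The Johnson-Cook failure model used here as the ductile fracture criterion has a standard closed form, sketched below; the D1-D5 constants are material-specific (values for Ti6Al4V are not reproduced here), and the argument names are illustrative:

    ```python
    import math

    def jc_failure_strain(d1, d2, d3, d4, d5,
                          triaxiality, strain_rate_ratio, homologous_temp):
        """Johnson-Cook equivalent plastic strain at fracture:
        eps_f = (D1 + D2*exp(D3*sigma*)) * (1 + D4*ln(edot*)) * (1 + D5*T*),
        where sigma* is the stress triaxiality, edot* the plastic strain rate
        normalized by a reference rate (assumed >= 1 here), and T* the
        homologous temperature. Damage accumulates as
        D = sum(delta_eps_p / eps_f), with failure predicted at D = 1."""
        return ((d1 + d2 * math.exp(d3 * triaxiality))
                * (1.0 + d4 * math.log(strain_rate_ratio))
                * (1.0 + d5 * homologous_temp))
    ```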

  2. Validation of Fatigue Modeling Predictions in Aviation Operations

    Science.gov (United States)

    Gregory, Kevin; Martinez, Siera; Flynn-Evans, Erin

    2017-01-01

    Bio-mathematical fatigue models that predict levels of alertness and performance are one potential tool for use within integrated fatigue risk management approaches. A number of models have been developed that provide predictions based on acute and chronic sleep loss, circadian desynchronization, and sleep inertia. Some are publicly available and gaining traction in settings such as commercial aviation as a means of evaluating flight crew schedules for potential fatigue-related risks. Yet, most models have not been rigorously evaluated and independently validated for the operations to which they are being applied and many users are not fully aware of the limitations in which model results should be interpreted and applied.

  3. A model to predict element redistribution in unsaturated soil: Its simplification and validation

    International Nuclear Information System (INIS)

    Sheppard, M.I.; Stephens, M.E.; Davis, P.A.; Wojciechowski, L.

    1991-01-01

    A research model has been developed to predict the long-term fate of contaminants entering unsaturated soil at the surface through irrigation or atmospheric deposition, and/or at the water table through groundwater. The model, called SCEMR1 (Soil Chemical Exchange and Migration of Radionuclides, Version 1), uses Darcy's law to model water movement, and the soil solid/liquid partition coefficient, K d , to model chemical exchange. SCEMR1 has been validated extensively on controlled field experiments with several soils, aeration statuses and the effects of plants. These validation results show that the model is robust and performs well. Sensitivity analyses identified soil K d , annual effective precipitation, soil type and soil depth to be the four most important model parameters. SCEMR1 consumes too much computer time for incorporation into a probabilistic assessment code. Therefore, we have used SCEMR1 output to derive a simple assessment model. The assessment model reflects the complexity of its parent code, and provides a more realistic description of contaminant transport in soils than would a compartment model. Comparison of the performance of the SCEMR1 research model, the simple SCEMR1 assessment model and the TERRA compartment model on a four-year soil-core experiment shows that the SCEMR1 assessment model generally provides conservative soil concentrations. (15 refs., 3 figs.)
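
    To illustrate how a soil solid/liquid partition coefficient enters such a calculation, the sketch below combines a Darcy flux with the standard linear-sorption retardation factor; this is a textbook advection-retardation relation, not SCEMR1's internal scheme:

    ```python
    def retarded_velocity(darcy_flux, theta, rho_b, kd):
        """Contaminant migration velocity implied by Darcy flow plus linear
        Kd sorption: v_c = q / (theta * R), with R = 1 + rho_b*Kd/theta.

        darcy_flux: q (m/yr), theta: volumetric water content (-),
        rho_b: soil bulk density (kg/L), kd: partition coefficient (L/kg).
        """
        retardation = 1.0 + rho_b * kd / theta  # R grows with stronger sorption
        return darcy_flux / (theta * retardation)
    ```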

  4. HELOKA-HP thermal-hydraulic model validation and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Xue Zhou; Ghidersa, Bradut-Eugen; Badea, Aurelian Florin

    2016-11-01

    Highlights: • The electrical heater in HELOKA-HP has been modeled with RELAP5-3D using experimental data as input. • The model has been validated using novel techniques for assimilating experimental data and the representative model parameters with BEST-EST. • The methodology is successfully used for reducing the model uncertainties and provides a quantitative measure of the consistency between the experimental data and the model. - Abstract: The Helium Loop Karlsruhe High Pressure (HELOKA-HP) is an experimental facility for the testing of various helium-cooled components at high temperature (500 °C) and high pressure (8 MPa) for nuclear fusion applications. For modeling the loop thermal dynamics, a thermal-hydraulic model has been created using the system code RELAP5-3D. Recently, new experimental data covering the behavior of the loop components under relevant operational conditions have been made available, making it possible to validate and calibrate the existing models in order to reduce the uncertainties of the simulated responses. This paper presents an example where such a process has been applied to the HELOKA electrical heater model. Using novel techniques for assimilating experimental data, implemented in the computational module BEST-EST, the representative parameters of the model have been calibrated.

  5. Turbulence Models: Data from Other Experiments: FAITH Hill 3-D Separated Flow

    Data.gov (United States)

    National Aeronautics and Space Administration — Exp: FAITH Hill 3-D Separated Flow. This web page provides data from experiments that may be useful for the validation of turbulence models. This resource is...

  6. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  7. Neural network models of learning and categorization in multigame experiments

    Directory of Open Access Journals (Sweden)

    Davide eMarchiori

    2011-12-01

    Previous research has shown that regret-driven neural networks predict behavior in repeated completely mixed games remarkably well, substantially equaling the performance of the most accurate established models of learning. This result prompts the question of what is the added value of modeling learning through neural networks. We submit that this modeling approach allows for models that are able to distinguish among, and respond differently to, different payoff structures. Moreover, the process of categorization of a game is implicitly carried out by these models, without the need of any external explicit theory of similarity between games. To validate our claims, we designed and ran two multigame experiments in which subjects faced, in random sequence, different instances of two completely mixed 2×2 games. We then tested two regret-driven neural network models on our experimental data, and compared their performance with that of other established models of learning and with Nash equilibrium.
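
    For readers unfamiliar with regret-based learning, the sketch below shows plain regret matching (Hart & Mas-Colell), in which each action is played with probability proportional to its positive cumulative regret; the paper's models embed regret signals in a neural network rather than using this bare rule, so this is background intuition only:

    ```python
    import numpy as np

    def regret_matching_mix(cum_regret):
        """Mixed strategy from cumulative regrets: each action is played with
        probability proportional to its positive regret; uniform if none."""
        pos = np.maximum(np.asarray(cum_regret, dtype=float), 0.0)
        total = pos.sum()
        return pos / total if total > 0 else np.full(len(pos), 1.0 / len(pos))
    ```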

  8. Validation of a parametric finite element human femur model.

    Science.gov (United States)

    Klein, Katelyn F; Hu, Jingwen; Reed, Matthew P; Schneider, Lawrence W; Rupp, Jonathan D

    2017-05-19

    Finite element (FE) models with geometry and material properties that are parametric with subject descriptors, such as age and body shape/size, are being developed to incorporate population variability into crash simulations. However, the validation methods currently being used with these parametric models do not assess whether model predictions are reasonable in the space over which the model is intended to be used. This study presents a parametric model of the femur and applies a unique validation paradigm to this parametric femur model that characterizes whether model predictions reproduce experimentally observed trends. FE models of male and female femurs with geometries that are parametric with age, femur length, and body mass index (BMI) were developed based on existing statistical models that predict femur geometry. These parametric FE femur models were validated by comparing responses from combined loading tests of femoral shafts to simulation results from FE models of the corresponding femoral shafts whose geometry was predicted using the associated age, femur length, and BMI. The effects of subject variables on model responses were also compared with trends in the experimental data set by fitting similarly parameterized statistical models to both the results of the experimental data and the corresponding FE model results and then comparing fitted model coefficients for the experimental and predicted data sets. The average error in impact force at experimental failure for the parametric models was 5%. The coefficients of a statistical model fit to simulation data were within one standard error of the coefficients of a similarly parameterized model of the experimental data except for the age parameter, likely because material properties used in simulations were not varied with specimen age. In simulations to explore the effects of femur length, BMI, and age on impact response, only BMI significantly affected response for both men and women, with increasing

  9. Predicting the ungauged basin: model validation and realism assessment

    Science.gov (United States)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2016-04-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) [1] led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this study [2] we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. We do not present a generic approach that can be transferred to other ungauged catchments, but we aim to show how clever model design and alternative data acquisition can result in a valuable hydrological model for ungauged catchments. [1] Sivapalan, M., Takeuchi, K., Franks, S., Gupta, V., Karambiri, H., Lakshmi, V., et al. (2003). IAHS decade on predictions in ungauged basins (PUB), 2003-2012: shaping an exciting future for the hydrological sciences. Hydrol. Sci. J. 48, 857-880. doi: 10.1623/hysj.48.6.857.51421 [2] van Emmerik, T., Mulder, G., Eilander, D., Piet, M. and Savenije, H. (2015). Predicting the ungauged basin: model validation and realism assessment

  10. Validation of mathematical models to describe fluid dynamics of a cold riser by gamma ray attenuation

    International Nuclear Information System (INIS)

    Melo, Ana Cristina Bezerra Azedo de

    2004-12-01

    The fluid dynamic behavior of a riser in a cold-type FCC model was investigated by means of the catalyst concentration distribution, measured by gamma-ray attenuation and simulated with a mathematical model. In the riser of the cold model (MEF), 0.032 m in diameter and 2.30 m in length, a fluidized bed of air and FCC catalyst circulates. The MEF is operated under automatic control and is instrumented for measuring fluid dynamic variables. The axial catalyst concentration distribution was measured using an Am-241 gamma source and a NaI detector coupled to a multichannel analyzer with software for data acquisition and evaluation. The MEF was adapted for validating a fluid dynamic model which describes the flow in the riser, for example by introducing an injector for controlling the circulating solid flow. Mathematical models were selected from the literature, analyzed and tested to simulate the fluid dynamics of the riser. A methodology for validating fluid dynamic models was studied and implemented. The stages of the work followed this validation methodology: planning of experiments, study of the equations which describe the fluid dynamics, application of computational solvers, and comparison with experimental data. Operational sequences were carried out keeping the MEF conditions constant while measuring the catalyst concentration and, simultaneously, the fluid dynamic variables, component velocities and pressure drop in the riser. Simulated and experimental values were then compared and treated statistically, aiming at the precision required to validate the fluid dynamic model. The comparison tests between experimental and simulated data were carried out under validation criteria. The fluid dynamic behavior of the riser was analyzed, and the results and their agreement with the literature were discussed. The adopted model was validated under the MEF operational conditions, for a 3 to 6 m/s gas velocity in the riser and a slip
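
    The record does not reproduce the attenuation relation used, but gamma-ray densitometry of a riser conventionally rests on the Beer-Lambert law; the sketch below shows that inversion with made-up numbers. The attenuation coefficient and count rates are illustrative, not the MEF calibration.

```python
import numpy as np

# Beer-Lambert: I = I0 * exp(-mu * rho * d), so the path-averaged catalyst
# density along the gamma beam is rho = ln(I0 / I) / (mu * d).
# All values below are illustrative, not the MEF experiment's calibration.
mu = 0.0771   # mass attenuation coefficient, cm^2/g (illustrative)
d = 3.2       # beam path length through the riser, cm (0.032 m diameter)
I0 = 12000.0  # detector counts with an empty riser
I = 11350.0   # detector counts with circulating catalyst

rho = np.log(I0 / I) / (mu * d)
print(f"average solids concentration: {rho:.3f} g/cm^3")
```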

  11. Modeling Choice and Valuation in Decision Experiments

    Science.gov (United States)

    Loomes, Graham

    2010-01-01

    This article develops a parsimonious descriptive model of individual choice and valuation in the kinds of experiments that constitute a substantial part of the literature relating to decision making under risk and uncertainty. It suggests that many of the best known "regularities" observed in those experiments may arise from a tendency for…

  12. Improvement and Validation of Weld Residual Stress Modelling Procedure

    International Nuclear Information System (INIS)

    Zang, Weilin; Gunnars, Jens; Dong, Pingsha; Hong, Jeong K.

    2009-06-01

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available then a mixed hardening model should be used.

  13. Validated TRNSYS Model for Solar Assisted Space Heating System

    International Nuclear Information System (INIS)

    Abdalla, Nedal

    2014-01-01

    The present study involves a validated TRNSYS model for a solar assisted space heating system as applied to a residential building in Jordan, using the new detailed radiation models of TRNSYS 17.1 and the geometric building model Trnsys3d for the Google SketchUp 3D drawing program. The annual heating load for a building (Solar House) located at the Royal Scientific Society (RSS) in Jordan is estimated under the climatological conditions of Amman. The aim of this paper is to compare the measured thermal performance of the Solar House with that modeled using TRNSYS. The results showed that the annual measured space heating load for the building was 6,188 kWh while the heating load for the modeled building was 6,391 kWh. Moreover, the measured solar fraction for the solar system was 50% while the modeled solar fraction was 55%. A comparison of modeled and measured data resulted in percentage mean absolute errors for solar energy for space heating, auxiliary heating and solar fraction of 13%, 7% and 10%, respectively. The validated model will be useful for long-term performance simulation under different weather and operating conditions. (author)

  14. Integral Reactor Containment Condensation Model and Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Qiao [Oregon State Univ., Corvallis, OR (United States); Corradini, Michael [Univ. of Wisconsin, Madison, WI (United States)

    2016-05-02

    This NEUP-funded project, NEUP 12-3630, comprises experimental, numerical and analytical studies on high-pressure steam condensation phenomena in a steel containment vessel connected to a water cooling tank, carried out at Oregon State University (OrSU) and the University of Wisconsin at Madison (UW-Madison). Over the three-year investigation, following the original proposal, the planned tasks have been completed: (1) Performed a scaling study for the full pressure test facility applicable to the reference design for the condensation heat transfer process during design basis accidents (DBAs), modified the existing test facility to route the steady-state secondary steam flow into the high pressure containment for controllable condensation tests, and extended the operations at negative gage pressure conditions (OrSU). (2) Conducted a series of DBA and quasi-steady experiments using the full pressure test facility to provide a reliable high pressure condensation database (OrSU). (3) Analyzed experimental data and evaluated the condensation model for the experimental conditions, and predicted the prototypic containment performance under accident conditions (UW-Madison). A film flow model was developed for the scaling analysis, and the results suggest that the 1/3-scaled test facility covers a large portion of the laminar film flow regime, leading to a lower average heat transfer coefficient compared to the prototypic value. Although this is conservative in reactor safety analysis, the significant reduction of the heat transfer coefficient (50%) could underestimate the prototypic condensation heat transfer rate, resulting in inaccurate prediction of the decay heat removal capability. Further investigation is thus needed to quantify the scaling distortion for safety analysis code validation. Experimental investigations were performed in the existing MASLWR test facility at OrSU with minor modifications. A total of 13 containment condensation tests were conducted for pressure

  15. AEROTAXI ground static test and finite element model validation

    Directory of Open Access Journals (Sweden)

    Radu BISCA

    2011-06-01

    In this presentation, we concentrate on typical Ground Static Test (GST) and Finite Element (FE) software comparisons. It is necessary to note that standard GSTs are obligatory for any new aircraft configuration. We can mention here the investigations of the AeroTAXI™, a small aircraft configuration, using PRODERA® equipment. A Finite Element Model (FEM) of the AeroTAXI™ has been developed in PATRAN/NASTRAN®, partly from a previous ANSYS® model. The FEM can be used to investigate potential structural modifications or changes with realistic component corrections. Model validation should be part of every modern engineering analysis and quality assurance procedure.

  16. Circumplex Model VII: validation studies and FACES III.

    Science.gov (United States)

    Olson, D H

    1986-09-01

    This paper reviews some of the recent empirical studies validating the Circumplex Model and describes the newly developed self-report measure, FACES III. Studies testing hypotheses derived from the Circumplex Model regarding the three dimensions of cohesion, change, and communication are reviewed. Case illustrations using FACES III and the Clinical Rating Scale are presented. These two assessment tools can be used for making a diagnosis of family functioning and for assessing changes over the course of treatment. This paper reflects the continuing attempt to develop further the Circumplex Model and to bridge more adequately research, theory, and practice.

  17. Firn Model Intercomparison Experiment (FirnMICE)

    DEFF Research Database (Denmark)

    Lundin, Jessica M.D.; Stevens, C. Max; Arthern, Robert

    2017-01-01

    Evolution of cold dry snow and firn plays important roles in glaciology; however, the physical formulation of a densification law is still an active research topic. We forced eight firn-densification models and one seasonal-snow model in six different experiments by imposing step changes in temperature …

  18. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens

    2016-01-01

    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main result presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types of experiments are performed: a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology. The modelling methodology shows promising results for usage with systems composed of reconfigurable underwater modules. The purpose of the model is to enable design of control strategies for cooperative reconfigurable underwater systems.

  19. Computer models experiences in radiological safety

    International Nuclear Information System (INIS)

    Ferreri, J.C.; Grandi, G.M.; Ventura, M.A.; Doval, A.S.

    1989-01-01

    A review of the formulation and use of numerical methods in fluid dynamics and heat and mass transfer in nuclear safety is presented. A wide range of applications is covered, namely: nuclear reactor thermohydraulics, natural circulation in closed loops, experiments for the validation of numerical methods, thermohydraulics of fractured-porous media, and radionuclide migration. The accumulated experience has led to a research line dealing at present with moving grids in computational fluid dynamics and the use of artificial intelligence techniques. As a consequence, some recent experience in the development of expert systems, and the considerations that should be taken into account for their use in radiological safety, is also reviewed. (author)

  20. Development and validation of a building design waste reduction model.

    Science.gov (United States)

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process for achieving waste reduction at source, as it enables an informed prediction of their wastage reduction levels. However, the lack of quantitative methods linking design strategies to waste reduction hinders designing-out-waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprised three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied, leading to several alternative low-waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing-out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings.

  1. Model-based testing of a vehicle instrument cluster for design validation using machine vision

    International Nuclear Information System (INIS)

    Huang, Yingping; McMurran, Ross; Dhadyalla, Gunwant; Jones, R Peter; Mouzakitis, Alexandros

    2009-01-01

    This paper presents an advanced testing system, combining model-based testing and machine vision technologies, for automated design validation of a vehicle instrument cluster. In the system, a hardware-in-the-loop (HIL) tester, supported by model-based approaches, simulates vehicle operations in real time and dynamically provides all essential signals to the instrument cluster under test. A machine vision system with advanced image processing algorithms is designed to inspect the visual displays. Experiments demonstrate that the system developed is accurate for measuring the pointer position, bar graph position, pointer angular velocity and indicator flash rate, and is highly robust for validating various functionalities including warning light status and symbol and text displays. Moreover, the system developed greatly eases the task of tedious validation testing and makes onerous repeated tests possible.

  2. An Empirical Validation of Building Simulation Software for Modelling of Double-Skin Facade (DSF)

    DEFF Research Database (Denmark)

    Larsen, Olena Kalyanova; Heiselberg, Per; Felsmann, Clemens

    2009-01-01

    The empirical validation of building simulation software (…, TRNSYS-TUD and BSim) was carried out in the framework of IEA SHC Task 34/ECBCS Annex 43 "Testing and Validation of Building Energy Simulation Tools". The experimental data for the validation were gathered in a full-scale outdoor test facility. The empirical data sets comprise the key functioning modes of DSF: 1. thermal buffer mode (closed DSF cavity) and 2. external air curtain mode (naturally ventilated DSF cavity with the top and bottom openings open to outdoors). By carrying out the empirical tests, it was concluded that all models experience difficulties in predictions during the peak solar loads…

  3. First experiments results about the engineering model of Rapsodie

    International Nuclear Information System (INIS)

    Chalot, A.; Ginier, R.; Sauvage, M.

    1964-01-01

    This report deals with the first series of experiments carried out on the engineering model of Rapsodie and on an associated sodium facility set up in a laboratory hall at Cadarache. More precisely, it covers: 1/ - The difficulties encountered during the erection and assembly of the engineering model, and a compilation of the results of the first series of experiments and tests carried out on this installation (loading of the subassemblies, preheating, thermal shocks...). 2/ - The experiments and tests carried out on the two prototype control rod drive mechanisms, which led to the choice of the design for the definitive drive mechanism. As a whole, the results proved the validity of the general design principles adopted for Rapsodie. (authors) [fr

  4. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models.

    Directory of Open Access Journals (Sweden)

    Anna Gomes

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: (1) creating an optimal model for endocarditis patients; and (2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70 kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to
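
    MDPE and MDAPE are not defined in the record; assuming the conventional definitions (median of the signed and of the absolute relative prediction errors, in percent), they can be computed as in the sketch below. The serum levels are invented for illustration, not the study's data.

```python
import numpy as np

def mdpe_mdape(predicted, observed):
    """Median (signed) and median absolute prediction error, in percent.
    Assumes the conventional definition PE = (predicted - observed) / observed."""
    pred, obs = np.asarray(predicted, float), np.asarray(observed, float)
    pe = 100.0 * (pred - obs) / obs
    return np.median(pe), np.median(np.abs(pe))

# Illustrative gentamicin serum levels (mg/L), not the study's data
observed  = [8.2, 6.5, 9.1, 7.4, 5.9]
predicted = [7.9, 6.8, 8.7, 7.6, 5.6]
mdpe, mdape = mdpe_mdape(predicted, observed)
print(f"MDPE {mdpe:+.2f}%  MDAPE {mdape:.2f}%")
```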

  5. Status Update on the GPM Ground Validation Iowa Flood Studies (IFloodS) Field Experiment

    Science.gov (United States)

    Petersen, Walt; Krajewski, Witold

    2013-04-01

    The overarching objective of integrated hydrologic ground validation activities supporting the Global Precipitation Measurement Mission (GPM) is to provide better understanding of the strengths and limitations of the satellite products in the context of hydrologic applications. To this end, the GPM Ground Validation (GV) program is conducting the first of several hydrology-oriented field efforts: the Iowa Flood Studies (IFloodS) experiment. IFloodS will be conducted in the central to northeastern part of Iowa in the Midwestern United States during April-June 2013. Specific science objectives and related goals for the IFloodS experiment can be summarized as follows: 1. Quantify the physical characteristics and space/time variability of rain (rates, DSD, process/"regime") and map to satellite rainfall retrieval uncertainty. 2. Assess satellite rainfall retrieval uncertainties at instantaneous to daily time scales and evaluate propagation/impact of uncertainty in flood prediction. 3. Assess hydrologic predictive skill as a function of space/time scales, basin morphology, and land use/cover. 4. Discern the relative roles of rainfall quantities such as rate and accumulation, as compared to other factors (e.g. transport of water in the drainage network), in flood genesis. 5. Refine approaches to the "integrated hydrologic GV" concept based on IFloodS experiences and apply them to future GPM integrated GV field efforts. These objectives will be achieved via the deployment of the NASA NPOL S-band and D3R Ka/Ku-band dual-polarimetric radars, University of Iowa X-band dual-polarimetric radars, a large network of paired rain gauge platforms with attendant soil moisture and temperature probes, a large network of both 2D Video and Parsivel disdrometers, and USDA-ARS gauge and soil-moisture measurements (in collaboration with the NASA SMAP mission). The aforementioned measurements will be used to complement existing operational WSR-88D S-band polarimetric radar measurements

  6. Modeling of laser-driven hydrodynamics experiments

    Science.gov (United States)

    di Stefano, Carlos; Doss, Forrest; Rasmus, Alex; Flippo, Kirk; Desjardins, Tiffany; Merritt, Elizabeth; Kline, John; Hager, Jon; Bradley, Paul

    2017-10-01

    Correct interpretation of hydrodynamics experiments driven by a laser-produced shock depends strongly on an understanding of the time-dependent effect of the irradiation conditions on the flow. In this talk, we discuss the modeling of such experiments using the RAGE radiation-hydrodynamics code. The focus is an instability experiment consisting of a period of relatively steady shock conditions, in which the Richtmyer-Meshkov process dominates, followed by a period of decaying flow conditions, in which the dominant growth process changes to Rayleigh-Taylor instability. The use of a laser model is essential for capturing the transition.

  7. Cross validation for the classical model of structured expert judgment

    International Nuclear Information System (INIS)

    Colson, Abigail R.; Cooke, Roger M.

    2017-01-01

    We update the 2008 TU Delft structured expert judgment database with data from 33 professionally contracted Classical Model studies conducted between 2006 and March 2015 to evaluate its performance relative to other expert aggregation models. We briefly review alternative mathematical aggregation schemes, including harmonic weighting, before focusing on linear pooling of expert judgments with equal weights and performance-based weights. Performance weighting outperforms equal weighting in all but 1 of the 33 studies in-sample. True out-of-sample validation is rarely possible for Classical Model studies, and cross validation techniques that split calibration questions into a training and test set are used instead. Performance weighting incurs an “out-of-sample penalty” and its statistical accuracy out-of-sample is lower than that of equal weighting. However, as a function of training set size, the statistical accuracy of performance-based combinations reaches 75% of the equal weight value when the training set includes 80% of calibration variables. At this point the training set is sufficiently powerful to resolve differences in individual expert performance. The information of performance-based combinations is double that of equal weighting when the training set is at least 50% of the set of calibration variables. Previous out-of-sample validation work used a Total Out-of-Sample Validity Index based on all splits of the calibration questions into training and test subsets, which is expensive to compute and includes small training sets of dubious value. As an alternative, we propose an Out-of-Sample Validity Index based on averaging the product of statistical accuracy and information over all training sets sized at 80% of the calibration set. Performance weighting outperforms equal weighting on this Out-of-Sample Validity Index in 26 of the 33 post-2006 studies; the probability of 26 or more successes on 33 trials if there were no difference between performance
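
    As a schematic of the proposed index — the average of statistical accuracy times information over all training sets containing 80% of the calibration questions — consider the sketch below. The two scoring functions are placeholders: the Classical Model's actual calibration and information scores are not reproduced here.

```python
from itertools import combinations
import numpy as np

def out_of_sample_validity(questions, accuracy, information, frac=0.8):
    """Average accuracy * information over all training sets holding ~80% of
    the calibration questions; `accuracy` and `information` stand in for the
    Classical Model's scores of the performance-weighted combination."""
    n_train = round(frac * len(questions))
    scores = []
    for train in combinations(questions, n_train):
        test = [q for q in questions if q not in train]
        scores.append(accuracy(train, test) * information(train, test))
    return float(np.mean(scores))

# Toy stand-in scores; a real study would rescore the experts on each split.
rng = np.random.default_rng(0)
acc = lambda train, test: rng.uniform(0.3, 0.9)
info = lambda train, test: rng.uniform(0.5, 2.0)
print(out_of_sample_validity(list(range(10)), acc, info))
```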

  8. Seine estuary modelling and AirSWOT measurements validation

    Science.gov (United States)

    Chevalier, Laetitia; Lyard, Florent; Laignel, Benoit

    2013-04-01

    In the context of global climate change, knowing water fluxes and storage, from the global scale to the local scale, is a crucial issue. The future satellite SWOT (Surface Water and Ocean Topography) mission, dedicated to surface water observation, is proposed to meet this challenge. SWOT's main payload will be a Ka-band Radar Interferometer (KaRIn). To validate this new kind of measurement, preparatory airborne campaigns (called AirSWOT) are currently being designed. AirSWOT will carry an interferometer similar to KaRIn: Kaspar, the Ka-band SWOT Phenomenology Airborne Radar. Some campaigns are planned in France in 2014. During these campaigns, the plane will fly over the Seine River basin, especially to observe its estuary, the upstream river main channel (to quantify river-aquifer exchange) and some wetlands. The objective of the present work is to validate the ability of AirSWOT and SWOT, using a Seine estuary hydrodynamic model. In this context, field measurements will be collected by different teams such as GIP (Public Interest Group) Seine Aval, the GPMR (Rouen Seaport), SHOM (Hydrographic and Oceanographic Service of the Navy), IFREMER (French Research Institute for Sea Exploitation), Mercator-Ocean, LEGOS (Laboratory of Space Study in Geophysics and Oceanography), ADES (Data Access Groundwater), etc. These datasets will be used first to validate AirSWOT measurements locally, and then to improve the hydrodynamic simulations (using tidal boundary conditions, river and groundwater inflows, etc.) for 2D validation of the AirSWOT data. This modelling will also be used to estimate the benefit of the future SWOT mission for mid-latitude river hydrology. For this modelling, the TUGOm barotropic model (Toulouse Unstructured Grid Ocean model 2D) is used. Preliminary simulations have been performed by first modelling and then combining two different regions: the Seine River and its estuarine area, and the English Channel. These two simulations are currently being

  9. Isotopes as validation tools for global climate models

    International Nuclear Information System (INIS)

    Henderson-Sellers, A.

    2001-01-01

    Global Climate Models (GCMs) are the predominant tool with which we predict the future climate. In order that people can have confidence in such predictions, GCMs require validation. As almost every available item of meteorological data has been exploited in the construction and tuning of GCMs to date, independent validation is very difficult. This paper explores the use of isotopes as a novel and fully independent means of evaluating GCMs. The focus is the Amazon Basin which has a long history of isotope collection and analysis and also of climate modelling: both having been reported for over thirty years. Careful consideration of the results of GCM simulations of Amazonian deforestation and climate change suggests that the recent stable isotope record is more consistent with the predicted effects of greenhouse warming, possibly combined with forest removal, than with GCM predictions of the effects of deforestation alone

  10. Validation of a heat conduction model for finite domain, non-uniformly heated, laminate bodies

    Science.gov (United States)

    Desgrosseilliers, Louis; Kabbara, Moe; Groulx, Dominic; White, Mary Anne

    2016-07-01

    Infrared thermographic validation is shown for a closed-form analytical heat conduction model for non-uniformly heated, laminate bodies with an insulated domain boundary. Experiments were conducted by applying power to rectangular electric heaters cooled by natural convection in air; the model applies equally to constant-temperature heat sources and forced convection. The model accurately represents two-dimensional laminate heat conduction behaviour giving rise to heat spreading, using one-dimensional equations for the temperature distributions and heat transfer rates under steady-state and pseudo-steady-state conditions. Validation of the model with an insulated boundary (complementing previous studies with an infinite boundary) provides useful predictions of heat spreading performance and simplified temperature uniformity calculations (useful in log-mean temperature difference style heat exchanger calculations) for real laminate systems such as electronics heat sinks, multi-ply stovetop cookware and interface materials for supercooled salt hydrates. Computational determination of the implicit insulated boundary condition locations in the measured data, required to assess the model equation validation, was also demonstrated. Excellent goodness of fit was observed (both root-mean-square error and R² values) in all cases except when the uncertainty of low temperatures measured via infrared thermography hindered the statistical significance of the model fit. The experimental validation in all other cases supports use of the model equations in design calculations and heat exchange simulations.
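
    For reference, the log-mean temperature difference mentioned above is the standard expression sketched below; this is a generic helper for orientation, not the authors' laminate model.

```python
import math

def lmtd(dt1, dt2):
    """Log-mean temperature difference between the two ends of an exchanger:
    (dt1 - dt2) / ln(dt1 / dt2), with the usual limit when dt1 == dt2."""
    if math.isclose(dt1, dt2):
        return dt1
    return (dt1 - dt2) / math.log(dt1 / dt2)

print(lmtd(40.0, 20.0))  # ~28.85 K for end differences of 40 K and 20 K
```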

  11. Validation of a fluid-structure interaction numerical model for predicting flow transients in arteries.

    Science.gov (United States)

    Kanyanta, V; Ivankovic, A; Karac, A

    2009-08-07

    Fluid-structure interaction (FSI) numerical models are now widely used in predicting blood flow transients, because of the importance of the interaction between the flowing blood and the deforming arterial wall to blood flow behaviour. Unfortunately, most of these FSI models lack rigorous validation and, thus, cannot guarantee the accuracy of their predictions. This paper presents the comprehensive validation of a two-way coupled FSI numerical model, developed to predict flow transients in compliant conduits such as arteries. The model is validated using analytical solutions and experiments conducted on a polyurethane mock artery. Flow parameters such as pressure and axial stress (and precursor) wave speeds, wall deformations and oscillating frequency, fluid velocity and Poisson coupling effects were used as the basis of this validation. Results show very good comparison between numerical predictions, analytical solutions and experimental data; the agreement between the three approaches is generally over 95%. The model also shows accurate prediction of Poisson coupling effects in unsteady flows through flexible pipes, which up to this stage have only been predicted analytically. Therefore, this numerical model can accurately predict flow transients in compliant vessels such as arteries.
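
    The analytical estimates of pressure wave speed in a compliant tube referred to here are classically of Moens-Korteweg form; the sketch below evaluates it with illustrative polyurethane-like values, not the paper's measured properties.

```python
import math

def moens_korteweg(E, h, rho, D):
    """Pressure wave speed c = sqrt(E*h / (rho*D)) in a thin-walled elastic
    tube: E wall modulus (Pa), h wall thickness (m), rho fluid density
    (kg/m^3), D inner diameter (m)."""
    return math.sqrt(E * h / (rho * D))

# Illustrative values only (soft polymer tube filled with water)
print(moens_korteweg(E=5e6, h=1e-3, rho=1000.0, D=0.01))  # ~22 m/s
```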

  12. Validation of VHTRC calculation benchmark of critical experiment using the MCB code

    Directory of Open Access Journals (Sweden)

    Stanisz Przemysław

    2016-01-01

    The calculation benchmark problem Very High Temperature Reactor Critical (VHTRC), a pin-in-block type core critical assembly, has been investigated with the Monte Carlo Burnup (MCB) code in order to validate the latest version of the Nuclear Data Library based on the ENDF format. The benchmark was executed on the basis of the VHTRC benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying the discrepancies in keff values between various libraries and experimental values, which allows improving the accuracy of the neutron transport calculations and may help in designing high-performance commercial VHTRs. Almost all safety parameters depend on the accuracy of the neutron transport calculation results which, in turn, depend on the accuracy of the nuclear data libraries. Thus, evaluation of the libraries' applicability to VHTR modelling is one of the important subjects. We compared the numerical experiment results with experimental measurements using two versions of the available nuclear data (ENDF-B-VII.1 and JEFF-3.2) prepared for the required temperatures. Calculations have been performed with the MCB code, which allows obtaining a very precise representation of the complex VHTR geometry, including the double heterogeneity of a fuel element. In this paper, together with the impact of nuclear data, we also discuss the impact of different lattice modelling inside the fuel pins. The keff discrepancies were observed and show good agreement with each other and with the experimental data within the 1σ range of the experimental uncertainty. Because some propagated discrepancies were observed, we proposed appropriate corrections in the experimental constants which can improve the reactivity coefficient dependency. The obtained results confirm the accuracy of the new Nuclear Data Libraries.

  13. Directed Design of Experiments for Validating Probability of Detection Capability of a Testing System

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2012-01-01

    A method of validating a probability of detection (POD) testing system using directed design of experiments (DOE) includes recording an input data set of observed hit and miss or analog data for sample components as a function of size of a flaw in the components. The method also includes processing the input data set to generate an output data set having an optimal class width, assigning a case number to the output data set, and generating validation instructions based on the assigned case number. An apparatus includes a host machine for receiving the input data set from the testing system and an algorithm for executing DOE to validate the test system. The algorithm applies DOE to the input data set to determine a data set having an optimal class width, assigns a case number to that data set, and generates validation instructions based on the case number.
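
    The patented class-width procedure itself is not spelled out in the record. As background, hit/miss POD data of this kind are commonly summarized by fitting a logistic curve of detection probability versus flaw size; the numpy-only sketch below does that with plain gradient ascent on the Bernoulli log-likelihood, using invented data. It is not the patent's method.

```python
import numpy as np

# Hit/miss inspection data: flaw sizes (mm), 1 = detected, 0 = missed
# (invented values for illustration only)
size = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5, 3.0, 3.5])
hit  = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1  ])

# Logistic POD(a) = 1 / (1 + exp(-(b0 + b1*a))), fitted by gradient ascent
X = np.column_stack([np.ones_like(size), size])
b = np.zeros(2)
for _ in range(20000):
    p = 1.0 / (1.0 + np.exp(-X @ b))
    b += 0.01 * X.T @ (hit - p)   # gradient of the Bernoulli log-likelihood

a50 = -b[0] / b[1]                # flaw size detected 50% of the time
print(f"fitted b0={b[0]:.2f}, b1={b[1]:.2f}, a50={a50:.2f} mm")
```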

  14. In-Drift Microbial Communities Model Validation Calculations

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-09-24

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  15. In-Drift Microbial Communities Model Validation Calculation

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-10-31

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  16. In-Drift Microbial Communities Model Validation Calculations

    International Nuclear Information System (INIS)

    Jolley, D.M.

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  17. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    D.M. Jolley

    2001-12-18

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  18. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    International Nuclear Information System (INIS)

    D.M. Jolley

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

  19. Validating unit commitment models: A case for benchmark test systems

    OpenAIRE

    Melhorn, Alexander C.; Li, Mingsong; Carroll, Paula; Flynn, Damian

    2016-01-01

    Due to the increasing penetration of non-traditional power system resources (e.g. renewable generation, electric vehicles, demand response) and increasing computational power, there has been renewed research interest in unit commitment. It may therefore be important to take another look at how unit commitment models and algorithms are validated, especially as improvements in solution quality and algorithmic performance are desired to combat the added complexity of additional constraints. This paper expl...

  20. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with a non-computer-science education are not able to discover all the inter-dependencies between the elicited requirements. Even … requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach.

  1. Experimental Validation of a Permeability Model for Enrichment Membranes

    International Nuclear Information System (INIS)

    Orellano, Pablo; Brasnarof, Daniel; Florido Pablo

    2003-01-01

    An experimental loop with a real-scale diffuser, in a single enrichment-stage configuration, was operated with air at different process conditions in order to characterize the membrane permeability. Using these experimental data, an analytical model based on geometry and morphology was validated. It is concluded that a new set of independent measurements, i.e. enrichment, is necessary in order to fully characterize diffusers, because their internal parameters are not univocally determined from permeability data alone.

  2. Monte Carlo Modelling of Mammograms : Development and Validation

    International Nuclear Information System (INIS)

    Spyrou, G.; Panayiotakis, G.; Bakas, A.; Tzanakos, G.

    1998-01-01

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered along with the software phantom of compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors)
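
    The record gives no implementation detail; for a flavor of the Monte Carlo machinery involved, the toy sketch below samples exponential photon interaction depths in a homogeneous slab and recovers the Beer-Lambert transmitted fraction. It is not the authors' mammography package, and a real simulation would also track scattering and detector response.

```python
import numpy as np

# Toy Monte Carlo: photons enter a homogeneous tissue slab head-on and are
# removed at exponentially distributed depths (no scattering modeled).
rng = np.random.default_rng(42)
mu = 0.8          # linear attenuation coefficient, 1/cm (illustrative)
thickness = 4.5   # compressed breast thickness, cm (illustrative)
n = 1_000_000

depth = rng.exponential(scale=1.0 / mu, size=n)   # depth of first interaction
transmitted = np.count_nonzero(depth > thickness) / n

print(f"Monte Carlo transmitted fraction: {transmitted:.5f}")
print(f"Beer-Lambert exp(-mu*t):          {np.exp(-mu * thickness):.5f}")
```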

  3. (In)validation in the Minority: The Experiences of Latino Students Enrolled in an HBCU

    Science.gov (United States)

    Allen, Taryn Ozuna

    2016-01-01

    This qualitative, phenomenological study examined the academic and interpersonal validation experiences of four female and four male Latino students who were enrolled in their second to fifth year at an HBCU in Texas. Using interviews, campus observations, a questionnaire, and analytic memos, this study sought to understand the role of in- and…

  4. Service validity and service reliability of search, experience and credence services. A scenario study

    NARCIS (Netherlands)

    Galetzka, Mirjam; Verhoeven, J.W.M.; Pruyn, Adriaan T.H.

    2006-01-01

    The purpose of this research is to add to our understanding of the antecedents of customer satisfaction by examining the effects of service reliability (Is the service “correctly” produced?) and service validity (Is the “correct” service produced?) of search, experience and credence services.

  5. Parameterization and validation of an ungulate-pasture model.

    Science.gov (United States)

    Pekkarinen, Antti-Juhani; Kumpula, Jouko; Tahvonen, Olli

    2017-10-01

    Ungulate grazing and trampling strongly affect pastures and ecosystems throughout the world. Ecological population models are used for studying these systems and determining the guidelines for sustainable and economically viable management. However, the effect of trampling and other resource wastage is either not taken into account or not quantified with data in earlier models. Also, the ability of models to describe the herbivore impact on pastures is usually not validated. We used a detailed model and data to study the level of winter- and summertime lichen wastage by reindeer and the effects of wastage on population sizes and management. We also validated the model with respect to its ability to predict changes in lichen biomass, and compared the actual management in herding districts with model results. The modeling efficiency value (0.75) and visual comparison between the model predictions and data showed that the model was able to describe the changes in lichen pastures caused by reindeer grazing and trampling. At the current lichen biomass levels in the northernmost Finland, the lichen wastage varied from 0 to 1 times the lichen intake during winter and from 6 to 10 times the intake during summer. With a higher value for wastage, reindeer numbers and net revenues were lower in the economically optimal solutions. Higher wastage also favored the use of supplementary feeding in the optimal steady state. Actual reindeer numbers in the districts were higher than in the optimal steady-state solutions for the model in 18 herding districts out of 20. Synthesis and applications. We show that a complex model can be used for analyzing ungulate-pasture dynamics and sustainable management if the model is parameterized and validated for the system. Wastage levels caused by trampling and other causes should be quantified with data as they strongly affect the results and management recommendations. Summertime lichen wastage caused by reindeer is higher than expected, which
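
    The quoted modelling efficiency of 0.75 is presumably the standard Nash-Sutcliffe-type statistic; under that assumption it is computed as in the sketch below, with invented biomass numbers rather than the study's data.

```python
import numpy as np

def modelling_efficiency(observed, predicted):
    """Nash-Sutcliffe-style modelling efficiency:
    EF = 1 - sum((obs - pred)^2) / sum((obs - mean(obs))^2).
    EF = 1 is a perfect fit; EF <= 0 is no better than the observed mean."""
    obs, pred = np.asarray(observed, float), np.asarray(predicted, float)
    return 1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Illustrative lichen biomass observations vs. model output (kg/ha)
obs  = [620, 540, 480, 455, 430]
pred = [600, 560, 500, 440, 420]
print(f"EF = {modelling_efficiency(obs, pred):.2f}")
```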

  6. Filament winding cylinders. II - Validation of the process model

    Science.gov (United States)

    Calius, Emilio P.; Lee, Soo-Yong; Springer, George S.

    1990-01-01

    Analytical and experimental studies were performed to validate the model developed by Lee and Springer for simulating the manufacturing process of filament wound composite cylinders. First, results calculated by the Lee-Springer model were compared to results of the Calius-Springer thin cylinder model. Second, temperatures and strains calculated by the Lee-Springer model were compared to data. The data used in these comparisons were generated during the course of this investigation with cylinders made of Hercules IM-6G/HBRF-55 and Fiberite T-300/976 graphite-epoxy tows. Good agreement was found between the calculated and measured stresses and strains, indicating that the model is a useful representation of the winding and curing processes.

  7. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence their quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closeness of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
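
    As an illustration of the polygon-level checks described (closed bounding linear ring, planarity within a tolerance), the sketch below tests both for a single polygon. The tolerance values are placeholders, not CityDoctor's.

```python
import numpy as np

def ring_is_closed(ring, tol=1e-6):
    """A valid bounding linear ring repeats its first vertex at the end."""
    return np.linalg.norm(np.asarray(ring[0], float) - np.asarray(ring[-1], float)) < tol

def is_planar(ring, tol=1e-3):
    """Fit a plane through the vertices (via SVD) and check that the largest
    out-of-plane distance stays below the tolerance (in model units)."""
    pts = np.asarray(ring, dtype=float)
    centred = pts - pts.mean(axis=0)
    normal = np.linalg.svd(centred)[2][-1]   # direction of least variance
    return float(np.max(np.abs(centred @ normal))) < tol

# A slightly warped quad: closed, but planarity depends on the tolerance
ring = [(0, 0, 0), (1, 0, 0), (1, 1, 0.002), (0, 1, 0), (0, 0, 0)]
print(ring_is_closed(ring), is_planar(ring, tol=1e-3))
```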

  8. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  9. Modelling the Grimsel migration field experiments at PSI

    International Nuclear Information System (INIS)

    Heer, W.

    1997-01-01

    For several years tracer migration experiments have been performed at Nagra's Grimsel Test Site as a joint undertaking of Nagra, PNC and PSI. The aims of modelling the migration experiments are (1) to better understand the nuclide transport through crystalline rock; (2) to gain information on validity of methods and correlating parameters; (3) to improve models for safety assessments. The PSI modelling results, presented here, show a consistent picture for the investigated tracers (the non-sorbing uranine, the weakly sorbing sodium, the moderately sorbing strontium and the more strongly sorbing cesium). They represent an important step in building up confidence in safety assessments for radioactive waste repositories. (author) 5 figs., 1 tab., 12 refs

  10. Validation of a Business Model for Cultural Heritage Institutions

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2015-01-01

    The paper proposes a business model for the efficiency optimization of the interaction between all actors involved in the cultural heritage sector, such as galleries, libraries, archives and museums (GLAM). The validation of the business model is the subject of analyses and implementations in a real environment carried out by different cultural institutions. The implementation of virtual exhibitions on mobile devices is described and analyzed as a key factor for increasing the visibility of cultural heritage. New perspectives on the development of virtual exhibitions for mobile devices are considered. A study on the number of visitors of cultural institutions is carried out, and ways to increase the number of visitors are described.

  11. Psychometric validation of the experience with allergic rhinitis nasal spray questionnaire

    Directory of Open Access Journals (Sweden)

    Crawford B

    2011-06-01

    Bruce Crawford (1), Richard H Stanford (2), Audrey Y Wong (3), Anand A Dalal (2), Martha S Bayliss (1). (1) Mapi Values, Boston, MA, USA; (2) GlaxoSmithKline, Research Triangle Park, NC, USA; (3) BioMedical Insights, San Francisco, CA, USA. Background: Patient experience and preference are critical factors influencing compliance in patients with allergic rhinitis (AR) receiving intranasal corticosteroids. The Experience with Allergic Rhinitis Nasal Spray Questionnaire (EARNS-Q) was developed to measure subject experiences with and preferences for nasal sprays. Objective: To describe the psychometric validation of the EARNS-Q modules. Methods: An observational study was conducted with subjects aged 18-65 years with physician-diagnosed vasomotor, seasonal, and/or perennial allergic rhinitis who were using a prescription nasal spray. Subjects completed the experience module of the EARNS-Q and the Treatment Satisfaction Questionnaire with Medication (TSQM) at baseline and after 2 weeks. Further validation analyses were conducted in a 3-week, randomized, single-blind, crossover, multicenter clinical study in which subjects ≥18 years of age with documented seasonal AR received flunisolide and beclomethasone and completed the EARNS-Q experience module on days 1 and 8, the EARNS-Q preference module on day 22, and the TSQM on days 8 and 22. Results: The observational and clinical studies were completed by 121 and 89 subjects, respectively. Both modules demonstrated acceptable reliability (α = 0.72 for the experience module; α = 0.93 for the preference module global score) and validity (intraclass correlation coefficients of 0.64 to 0.82 for test-retest validity). Correlations among the experience and preference modules were moderate (r = 0.39 to 0.79) and within internal consistency reliability estimates, indicating measurement of distinct constructs. Conclusion: The EARNS-Q is a patient-reported outcomes measure that enables reliable and valid measurement of subject experience with, and preference

  12. Assessment model validity document - HYDRASTAR. A stochastic continuum program for groundwater flow

    Energy Technology Data Exchange (ETDEWEB)

    Gylling, B. [Kemakta Konsult AB, Stockholm (Sweden); Eriksson, Lars [Equa Simulation AB, Sundbyberg (Sweden)

    2001-12-01

    The present document addresses validation of the stochastic continuum model HYDRASTAR, designed for Monte Carlo simulations of groundwater flow in fractured rocks. Here, validation is defined as a process to demonstrate that a model concept is fit for its purpose. Preferably, the validation is carried out by comparison of model predictions with independent field observations and experimental measurements. In addition, other sources can also be used to confirm that the model concept gives acceptable results. One method is to compare results with the ones achieved using other model concepts for the same set of input data. Another method is to compare model results with analytical solutions. The model concept HYDRASTAR has been used in several studies, including performance assessments of hypothetical repositories for spent nuclear fuel. In the performance assessments, the main tasks for HYDRASTAR have been to calculate groundwater travel time distributions, repository flux distributions, path lines and their exit locations. The results have then been used by other model concepts to calculate the near field release and far field transport. The aim and framework for the validation process include describing the applicability of the model concept for its purpose in order to build confidence in the concept. Preferably, this is done by comparisons of simulation results with the corresponding field experiments or field measurements. Here, two comparisons with experimental results are reported. In both cases the agreement was reasonably fair. In the broader and more general context of the validation process, HYDRASTAR results have been compared with other models and analytical solutions. Commonly, the analytical approximations agree well with the medians of the model ensemble results. Additional indications that HYDRASTAR is suitable for its purpose were obtained from the comparisons with results from other model concepts. Several verification studies have been made for

  13. [Perception scales of validated food insecurity: the experience of the countries in Latin America and the Caribbean].

    Science.gov (United States)

    Sperandio, Naiara; Morais, Dayane de Castro; Priore, Silvia Eloiza

    2018-02-01

    The scope of this systematic review was to compare the food insecurity scales validated and used in the countries of Latin America and the Caribbean, and to analyze the methods used in validation studies. A search was conducted in the Lilacs, SciELO and Medline electronic databases. The publications were pre-selected by titles and abstracts, and subsequently by a full reading. Of the 16,325 studies reviewed, 14 were selected. Twelve validated scales were identified for the following countries: Venezuela, Brazil, Colombia, Bolivia, Ecuador, Costa Rica, Mexico, Haiti, the Dominican Republic, Argentina and Guatemala. Besides these, there is the Latin American and Caribbean scale, whose scope is regional. The scales differed in the standard reference used, the number of questions, and the diagnosis of insecurity. The methods used by the studies for internal validation were calculation of Cronbach's alpha and the Rasch model; for external validation the authors calculated association and/or correlation with socioeconomic and food consumption variables. The successful experience of Latin America and the Caribbean in the development of national and regional scales can be an example for other countries that do not have this important indicator capable of measuring the phenomenon of food insecurity.
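
Of the internal-validation methods named here, Cronbach's alpha is the simplest to reproduce. A minimal sketch with hypothetical item responses, not data from any of the reviewed studies:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical yes/no answers to a 4-item food-insecurity scale.
responses = np.array([
    [0, 0, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [1, 1, 1, 1],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")  # 0.80 for this toy data
```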

  14. Validation Analysis of the Shoal Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    A. Hassan; J. Chapman

    2008-11-01

    Environmental restoration at the Shoal underground nuclear test is following a process prescribed by a Federal Facility Agreement and Consent Order (FFACO) between the U.S. Department of Energy, the U.S. Department of Defense, and the State of Nevada. Characterization of the site included two stages of well drilling and testing in 1996 and 1999, and development and revision of numerical models of groundwater flow and radionuclide transport. Agreement on a contaminant boundary for the site and a corrective action plan was reached in 2006. Later that same year, three wells were installed for the purposes of model validation and site monitoring. The FFACO prescribes a five-year proof-of-concept period for demonstrating that the site groundwater model is capable of producing meaningful results with an acceptable level of uncertainty. The corrective action plan specifies a rigorous seven-step validation process. The accepted groundwater model is evaluated using that process in light of the newly acquired data. The conceptual model of groundwater flow for the Project Shoal Area considers groundwater flow through the fractured granite aquifer comprising the Sand Springs Range. Water enters the system by the infiltration of precipitation directly on the surface of the mountain range. Groundwater leaves the granite aquifer by flowing into alluvial deposits in the adjacent basins of Fourmile Flat and Fairview Valley. A groundwater divide is interpreted as coinciding with the western portion of the Sand Springs Range, west of the underground nuclear test, preventing flow from the test into Fourmile Flat. A very low conductivity shear zone east of the nuclear test roughly parallels the divide. The presence of these lateral boundaries, coupled with a regional discharge area to the northeast, is interpreted in the model as causing groundwater from the site to flow in a northeastward direction into Fairview Valley. Steady-state flow conditions are assumed given the absence of

  15. Validation of infrared thermography in serotonin-induced itch model in rats

    DEFF Research Database (Denmark)

    Dagnæs-Hansen, Frederik; Jasemian, Yousef; Gazerani, Parisa

    The number of scratching bouts is generally used as a standard method in animal models of itch. The aim of the present study was to validate the application of infrared thermography (IR-Th) in a serotonin-induced itch model in rats. Adult Sprague-Dawley male rats (n = 24) were used in 3 consecutive experiments. The first experiment evaluated the vasomotor response (IR-Th) and scratching behavior (number of bouts) induced by intradermal serotonin (10 μl, 2%). Isotonic saline (control: 10 μl, 0.9%) and methysergide (antagonist: 10 μl, 0.047 mg/ml) were used. The second experiment evaluated the dose-response effect of intradermal serotonin (1%, 2% and 4%) on local temperature. The third experiment evaluated anesthetized rats to test the local vasomotor responses in the absence of scratching. Serotonin elicited significant scratching and lowered the local temperature at the site of injection. A dose

  16. Thermo-mechanical analyses and model validation in the HAW test field. Final report

    International Nuclear Information System (INIS)

    Heijdra, J.J.; Broerse, J.; Prij, J.

    1995-01-01

    An overview is given of the thermo-mechanical analysis work done for the design of the High Active Waste experiment and for the purpose of validating the models used through comparison with experiments. A brief treatise is given on the problems of validating models used for the prediction of physical behaviour that cannot be determined with experiments. The analysis work encompasses investigations into the initial state of stress in the field, the constitutive relations, the temperature rise, and the pressure on the liner tubes inserted in the field to guarantee the retrievability of the radioactive sources used for the experiment. The measurements of temperatures, deformations, and stresses are described, and an evaluation is given of the comparison of measured and calculated data. An attempt has been made to qualify or even quantify the discrepancies, if any, between measurements and calculations. It was found that the model for the temperature calculations performed adequately. For the stresses, the general tendency was good; however, large discrepancies exist, mainly due to inaccuracies in the measurements. For the deformations, again, the general tendency of the model predictions was in accordance with the measurements. However, from the evaluation it appears that, in spite of the efforts to estimate the correct initial rock pressure at the location of the experiment, this pressure has been underestimated. The evaluation has contributed to a considerable increase in confidence in the models and gives no reason to question the constitutive model for rock salt. However, due to the quality of the stress measurements and the relatively short duration of the experiments, no quantitatively firm support for the constitutive model was acquired. Collections of graphs giving the measured and calculated data are attached as appendices. (orig.)

  17. Validity and Reliability of Willingness-to-Pay Estimates: Evidence from Two Overlapping Discrete-Choice Experiments.

    Science.gov (United States)

    Telser, Harry; Becker, Karolin; Zweifel, Peter

    2008-12-01

    Discrete-choice experiments (DCEs), while becoming increasingly popular, have rarely been tested for validity and reliability. To address the issues of validity and reliability of willingness-to-accept (WTA) values obtained from DCEs; in particular, to examine whether differences in the attribute set describing a hypothetical product have an influence on the preferences and willingness-to-pay (WTP) values of respondents. Two DCEs were designed, featuring hypothetical insurance contracts for Swiss healthcare. The contract attributes were pre-selected in expert sessions with representatives of the Swiss healthcare system, and their relevance was checked in a pre-test. Experiment A contained rather radical health system reform options, while experiment B concentrated on more familiar elements such as co-payment and the benefit catalogue. Three attributes were present in both experiments: delayed access to innovation ('innovation'), restricted drug benefit ('generics'), and the change in the monthly premium ('premium'). The issue to be addressed was whether WTA values for the overlapping attributes were similar, even though they were embedded in widely differing choice sets. Two representative telephone surveys with 1000 people aged >25 years were conducted independently in the German and French parts of Switzerland during September 2003. Socioeconomic variables collected included age, sex, education, total household income, place of residence, occupation, and household size. Three models were estimated (a simple linear model, a model allowing interaction of the price attribute with socioeconomic characteristics, and a model with a full set of interaction terms). The socioeconomic characteristics of the two samples were very similar. Theoretical validity tends to receive empirical support in both experiments in all cases where economic theory makes predictions concerning differences between socioeconomic groups. However, a systematic inappropriate influence on measured WTA
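
In DCEs of this kind, marginal WTP/WTA values are typically derived as the ratio of an attribute coefficient to the price (premium) coefficient of a conditional logit model. A sketch with hypothetical coefficients, not the study's estimates:

```python
# Hypothetical conditional-logit coefficients; utility is linear in attributes:
#   U = b_innovation*x1 + b_generics*x2 + b_premium*premium + error
coefs = {
    "innovation (delayed access)": -0.45,   # disutility of the restriction
    "generics (restricted drugs)": -0.30,
    "premium (CHF per month)":     -0.012,  # marginal utility of money
}

beta_premium = coefs["premium (CHF per month)"]
for attribute, beta in coefs.items():
    if attribute.startswith("premium"):
        continue
    # Setting dU = beta + beta_premium * d_premium = 0 gives a premium
    # *reduction* of beta/beta_premium; its magnitude is the WTA compensation.
    wta = beta / beta_premium
    print(f"{attribute}: WTA of about CHF {wta:.1f} per month")
```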

  18. Development, Validation and Parametric study of a 3-Year-Old Child Head Finite Element Model

    Science.gov (United States)

    Cui, Shihai; Chen, Yue; Li, Haiyan; Ruan, ShiJie

    2015-12-01

    Traumatic brain injury caused by falls and traffic accidents is an important cause of death and disability in children. Recently, computational finite element (FE) head models have been developed to investigate brain injury mechanisms and biomechanical responses. Based on CT data of a healthy 3-year-old child head, an FE head model with detailed anatomical structure was developed. Deep brain structures such as the white matter, gray matter, cerebral ventricles, and hippocampus were created in this FE model for the first time. The FE model was validated by comparing simulation results with those of reconstructed child and adult cadaver experiments. In addition, the effects of skull stiffness on the dynamic responses of the child head were further investigated. All the simulation results confirmed the good biofidelity of the FE model.

  19. Validation and Scaling of Soil Moisture in a Semi-Arid Environment: SMAP Validation Experiment 2015 (SMAPVEX15)

    Science.gov (United States)

    Colliander, Andreas; Cosh, Michael H.; Misra, Sidharth; Jackson, Thomas J.; Crow, Wade T.; Chan, Steven; Bindlish, Rajat; Chae, Chun; Holifield Collins, Chandra; Yueh, Simon H.

    2017-01-01

    The NASA SMAP (Soil Moisture Active Passive) mission conducted the SMAP Validation Experiment 2015 (SMAPVEX15) in order to support the calibration and validation activities of SMAP soil moisture data products. The main goals of the experiment were to address issues regarding the spatial disaggregation methodologies for improvement of soil moisture products and validation of the in situ measurement upscaling techniques. To support these objectives, high-resolution soil moisture maps were acquired with the airborne PALS (Passive Active L-band Sensor) instrument over an area in southeast Arizona that includes the Walnut Gulch Experimental Watershed (WGEW), and intensive ground sampling was carried out to augment the permanent in situ instrumentation. The objective of the paper was to establish the correspondence and relationship between the highly heterogeneous spatial distribution of soil moisture on the ground and the coarse-resolution radiometer-based soil moisture retrievals of SMAP. The high-resolution mapping conducted with PALS provided the required connection between the in situ measurements and SMAP retrievals. The in situ measurements were used to validate the PALS soil moisture acquired at 1-km resolution. Based on the information from a dense network of rain gauges in the study area, the in situ soil moisture measurements did not capture all the precipitation events accurately. That is, the PALS and SMAP soil moisture estimates responded to precipitation events detected by rain gauges, which were in some cases not detected by the in situ soil moisture sensors. It was also concluded that the spatial distribution of soil moisture resulting from the relatively small spatial extents of the typical convective storms in this region was not completely captured with the in situ stations. After removing those cases (approximately 10% of the observations) the following metrics were obtained: RMSD (root mean square difference) of 0.016 m³/m³ and correlation of 0.83. The

  20. Are screening instruments valid for psychotic-like experiences? A validation study of screening questions for psychotic-like experiences using in-depth clinical interview.

    LENUS (Irish Health Repository)

    Kelleher, Ian

    2011-03-01

    Individuals who report psychotic-like experiences are at increased risk of future clinical psychotic disorder. They constitute a unique "high-risk" group for studying the developmental trajectory to schizophrenia and related illnesses. Previous research has used screening instruments to identify this high-risk group, but the validity of these instruments has not yet been established. We administered a screening questionnaire with 7 items designed to assess psychotic-like experiences to 334 adolescents aged 11-13 years. Detailed clinical interviews were subsequently carried out with a sample of these adolescents. We calculated sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) for each screening question, both for the specific symptom it enquired about and in relation to any psychotic-like experience. The predictive power varied substantially between items, with the question on auditory hallucinations ("Have you ever heard voices or sounds that no one else can hear?") providing the best predictive power. For interview-verified auditory hallucinations specifically, this question had a PPV of 71.4% and an NPV of 90.4%. When assessed for its predictive power for any psychotic-like experience (including, but not limited to, auditory hallucinations), it provided a PPV of 100% and an NPV of 88.4%. Two further questions, relating to visual hallucinations and paranoid thoughts, also demonstrated good predictive power for psychotic-like experiences. Our results suggest that it may be possible to screen the general adolescent population for psychotic-like experiences with a high degree of accuracy using a short self-report questionnaire.
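
The reported predictive values follow directly from the 2x2 table of screening answers against interview-verified symptoms. The sketch below uses hypothetical cell counts chosen only to be consistent with the quoted PPV of 71.4% and NPV of 90.4%; the paper's actual counts are not reproduced here.

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table of
    screening answers versus interview-verified diagnoses."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),   # P(symptom | positive screen)
        "NPV": tn / (tn + fn),   # P(no symptom | negative screen)
    }

# Hypothetical counts for the auditory-hallucination item.
for name, value in screening_metrics(tp=10, fp=4, fn=5, tn=47).items():
    print(f"{name}: {value:.1%}")   # PPV -> 71.4%, NPV -> 90.4%
```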

  2. Condensation of steam in horizontal pipes: model development and validation

    International Nuclear Information System (INIS)

    Szijarto, R.

    2015-01-01

    This thesis, submitted to the Swiss Federal Institute of Technology ETH in Zurich, presents the development and validation of a model for the condensation of steam in horizontal pipes. Condensation models were introduced and developed particularly for application in the emergency cooling system of a Gen-III+ boiling water reactor. Such an emergency cooling system consists of slightly inclined horizontal pipes, which are immersed in a cold water tank. The pipes are connected to the reactor pressure vessel. They are responsible for a fast depressurization of the reactor core in the case of an accident. Condensation in horizontal pipes was investigated with both a one-dimensional system code (RELAP5) and three-dimensional computational fluid dynamics software (ANSYS FLUENT). The performance of the RELAP5 code was not sufficient for transient condensation processes. Therefore, a mechanistic model was developed and implemented. Four models were tested on the LAOKOON facility, which analysed direct contact condensation in a horizontal duct.

  3. Validating firn compaction model with remote sensing data

    DEFF Research Database (Denmark)

    Simonsen, S. B.; Stenseng, Lars; Sørensen, Louise Sandberg

    A comprehensive understanding of firn processes is of utmost importance when estimating present and future changes of the Greenland Ice Sheet. Especially when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction ... of firn compaction to correct ICESat measurements and assessing the present mass loss of the Greenland ice sheet. Validation of the model against the radar data gives good results and confidence in using the model to answer important questions. Questions such as: how large is the firn compaction correction relative to the changes in the elevation of the surface observed with remote sensing altimetry? What model time resolution is necessary to resolve the observed layering? What model refinements are necessary to give better estimates of the surface mass balance of the Greenland ice sheet from ...

  4. The Development and Validation of the Game User Experience Satisfaction Scale (GUESS).

    Science.gov (United States)

    Phan, Mikki H; Keebler, Joseph R; Chaparro, Barbara S

    2016-12-01

    The aim of this study was to develop and psychometrically validate a new instrument that comprehensively measures video game satisfaction based on key factors. Playtesting is often conducted in the video game industry to help game developers build better games by providing insight into the players' attitudes and preferences. However, quality feedback is difficult to obtain from playtesting sessions without a quality gaming assessment tool. There is a need for a psychometrically validated and comprehensive gaming scale that is appropriate for playtesting and game evaluation purposes. The process of developing and validating this new scale followed current best practices of scale development and validation. As a result, a mixed-method design that consisted of item pool generation, expert review, questionnaire pilot study, exploratory factor analysis (N = 629), and confirmatory factor analysis (N = 729) was implemented. A new instrument measuring video game satisfaction, called the Game User Experience Satisfaction Scale (GUESS), with nine subscales emerged. The GUESS was demonstrated to have content validity, internal consistency, and convergent and discriminant validity. The GUESS was developed and validated based on the assessments of over 450 unique video game titles across many popular genres. Thus, it can be applied across many types of video games in the industry both as a way to assess what aspects of a game contribute to user satisfaction and as a tool to aid in debriefing users on their gaming experience. The GUESS can be administered to evaluate user satisfaction of different types of video games by a variety of users. © 2016, Human Factors and Ergonomics Society.

  5. Development and Validation of a 3-Dimensional CFB Furnace Model

    Science.gov (United States)

    Vepsäläinen, Ari; Myöhänen, Kari; Hyppänen, Timo; Leino, Timo; Tourunen, Antti

    At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development for the CFB furnace process regarding solid mixing, combustion, emission formation and heat transfer. Results of laboratory and pilot scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out simultaneously with the development of 3-dimensional process modeling, which provides a chain of knowledge that is utilized as feedback for phenomenon research. Knowledge gathered in model validation studies and up-to-date parameter databases is utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to the modeling of combustion and the formation of char and volatiles for various fuel types in CFB conditions. A new model for predicting the formation of nitrogen oxides is also presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat and bio-fuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window for characterizing fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operating conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace and are used, together with lateral temperature profiles at the bed and in the upper parts of the furnace, to determine solid mixing and combustion model parameters. Modeling of char- and volatile-based formation of NO profiles is followed by analysis of the oxidizing and reducing regions formed by the lower furnace design and the mixing characteristics of fuel and combustion airs, which affect the NO furnace profile through reduction and volatile-nitrogen reactions. This paper presents

  6. Proceedings of the first SRL model validation workshop

    International Nuclear Information System (INIS)

    Buckner, M.R.

    1981-10-01

    The Clean Air Act and its amendments have added importance to knowing the accuracy of mathematical models used to assess transport and diffusion of environmental pollutants. These models are the link between air quality standards and emissions. To test the accuracy of a number of these models, a Model Validation Workshop was held. The meteorological, source-term, and Kr-85 concentration data bases for emissions from the separations areas of the Savannah River Plant during 1975 through 1977 were used to compare calculations from various atmospheric dispersion models. The results of statistical evaluation of the models show a degradation in the ability to predict pollutant concentrations as the time span over which the calculations are made is reduced. Forecasts for annual time periods were reasonably accurate. Weighted-average squared correlation coefficients (R²) were 0.74 for annual, 0.28 for monthly, 0.21 for weekly, and 0.18 for twice-daily predictions. Model performance varied within each of these four categories; however, the results indicate that the more complex, three-dimensional models provide only marginal increases in accuracy. The increased cost of running these codes is not warranted for long-term releases or for conditions of relatively simple terrain and meteorology. The overriding factor in the calculational accuracy is the accurate description of the wind field. Further improvement of the numerical accuracy of the complex models is not nearly as important as accurate calculation of the meteorological transport conditions.
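
The statistical evaluation quoted here reduces to computing a squared correlation coefficient per averaging period and forming a weighted average. A minimal sketch with hypothetical concentration pairs and weights; the workshop data are not reproduced:

```python
import numpy as np

def r_squared(observed, predicted):
    """Squared Pearson correlation between observed and predicted values."""
    return np.corrcoef(observed, predicted)[0, 1] ** 2

# Hypothetical observed/modeled Kr-85 concentrations per averaging period.
periods = {
    "annual":  ([1.0, 1.4, 0.8, 1.2], [1.1, 1.3, 0.9, 1.2]),
    "monthly": ([0.9, 2.1, 0.5, 1.7], [1.4, 1.2, 0.9, 1.1]),
}
weights = {"annual": 4, "monthly": 4}  # e.g. number of station-period pairs

r2 = {p: r_squared(np.array(o), np.array(m)) for p, (o, m) in periods.items()}
weighted_avg = sum(weights[p] * r2[p] for p in r2) / sum(weights.values())
print(r2, round(weighted_avg, 2))
```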

  7. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    Science.gov (United States)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-12-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied to the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency of 4.6%. The system had a vacuum glass enclosure, a flat panel (absorber), a thermoelectric generator and water circulation for the cold side. The theoretical and numerical approach of the current study reproduced the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software, using Fluent and the Thermal-Electric Systems module. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. This analytical model applies Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, with two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.

  8. Applicability of U.S. Army tracer test data to model validation needs of ERDA

    International Nuclear Information System (INIS)

    Shearer, D.L.; Minott, D.H.

    1976-06-01

    This report covers the first phase of an atmospheric dispersion model validation project sponsored by the Energy Research and Development Administration (ERDA). The project will employ dispersion data generated during an extensive series of field tracer experiments that were part of a meteorological research program conducted by the U.S. Army Dugway Proving Ground, Utah, from the late 1950s to the early 1970s. The tests were conducted at several locations in the U.S., South America, Germany, and Norway, chosen to typify the effects of certain environmental factors on atmospheric dispersion. The purpose of the Phase I work of this project was to identify applicable portions of the Army data, obtain and review that data, and make recommendations for its use in atmospheric dispersion model validation. This report presents key information in three formats. The first is a tabular listing of the Army dispersion test reports summarizing the test data contained in each report. This listing is presented in six separate tables, with each tabular list representing a different topical area based on model validation requirements and the nature of the Army data base. The second format for presenting key information is a series of discussions of the Army test information assigned to each of the six topical areas. These discussions relate the extent and quality of the available data, as well as its prospective use for model validation. The third format is a series of synopses for each Army test report.

  9. A Component-Based Modeling and Validation Method for PLC Systems

    Directory of Open Access Journals (Sweden)

    Rui Wang

    2014-05-01

    Programmable logic controllers (PLCs) are complex embedded systems that are widely used in industry. This paper presents a component-based modeling and validation method for PLC systems using the behavior-interaction-priority (BIP) framework. We designed a general system architecture and a component library for a type of device control system. The control software and the hardware of the environment were all modeled as BIP components. System requirements were formalized as monitors. Simulation was carried out to validate the system model. A realistic industrial example, a gates control system, was employed to illustrate our strategy. We found a couple of design errors during the simulation, which helped us to improve the dependability of the original systems. The experimental results demonstrated the effectiveness of our approach.
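
BIP is a dedicated formal framework; purely to illustrate the component-plus-monitor pattern the paper describes, a plain Python analogue might look like this. The gate-control scenario and the safety requirement below are simplified stand-ins, not the paper's models.

```python
import random

class Gate:
    """Controller component: closes the gate whenever a train is detected."""
    def __init__(self):
        self.open = True
    def step(self, train_present):
        self.open = not train_present

class Monitor:
    """Formalized requirement: the gate is never open while a train is present."""
    def check(self, gate, train_present):
        assert not (gate.open and train_present), "safety requirement violated"

gate, monitor = Gate(), Monitor()
for _ in range(1000):                    # simulation run
    train_present = random.random() < 0.3
    gate.step(train_present)
    monitor.check(gate, train_present)   # validate the requirement at each step
print("requirement held over 1000 steps")
```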

  10. Validity and reliability of the Malay version of WHO Women's Health and Life Experiences Questionnaire.

    Science.gov (United States)

    Saddki, Norkhafizah; Sulaiman, Zaharah; Ali, Siti Hawa; Tengku Hassan, Tengku Nur Fadzilah; Abdullah, Sarimah; Ab Rahman, Azriani; Tengku Ismail, Tengku Alina; Abdul Jalil, Rohana; Baharudin, Zabedah

    2013-08-01

    The Women's Health and Life Experiences questionnaire measures the prevalence, health implications, and risk factors of domestic violence. This cross-sectional study was conducted to determine the validity and reliability of the Malay version of the World Health Organization (WHO) Women's Health and Life Experiences Questionnaire. Construct validity and reliability assessment of the Malay version of the questionnaire was done on 20 specific items that measure four types of intimate partner violence (IPV): controlling behaviors (CB), emotional violence (EV), physical violence (PV), and sexual violence (SV), which were considered the domains of interest. A face-to-face interviewing method was used for data collection. A total of 922 women completed the interviews. Exploratory factor analysis extracted four factors with eigenvalues above 1, which together accounted for 63.83% of the variance. All items loaded above 0.40, and the majority of items loaded on factors that were generally consistent with the proposed construct. The internal consistency reliability was good: Cronbach's α values ranged from 0.767 to 0.858 across domains. The Malay version of the WHO Women's Health and Life Experiences Questionnaire is a valid and reliable measure of women's health and experiences of IPV in Malaysia.

  11. Non-destructive measurements of nuclear wastes. Validation and industrial operating experience

    International Nuclear Information System (INIS)

    Saas, A.; Tchemitciieff, E.

    1993-01-01

    After a short survey of the means employed for the non-destructive measurement of specific activities (γ and X-ray) in waste packages and raw waste, the performance of the device and the ANDRA requirements are presented. The validation of the γ and X-ray measurements on packages is obtained by determining, by destructive means, the same activity on core samples. The same procedure is used for validating the homogeneity measurements on packages (either homogeneous or heterogeneous). Operating experience is then presented for several kinds of packages and waste. Up to now, about twenty different types of packages have been examined, and more than 200 packages have allowed the calibration, validation, and control

  12. Experimental Validation of a Dynamic Model for Lightweight Robots

    Directory of Open Access Journals (Sweden)

    Alessandro Gasparetto

    2013-03-01

    Nowadays, one of the main topics in robotics research is the improvement of dynamic performance by lightening the overall system structure. The effective motion and control of these lightweight robotic systems rely on suitable motion planning and control processes. To this end, model-based approaches can be adopted by exploiting accurate dynamic models that take into account the inertial and elastic terms that are usually neglected in a heavy rigid-link configuration. In this paper, an effective method for modelling spatial lightweight industrial robots, based on an Equivalent Rigid Link System approach, is considered from an experimental validation perspective. A dynamic simulator implementing the formulation is used, and an experimental test bench is set up. Experimental tests are carried out with a benchmark L-shape mechanism.

  13. Experimental validation of the multiphase extended Leblond's model

    Science.gov (United States)

    Weisz-Patrault, Daniel

    2017-10-01

    Transformation-induced plasticity is a crucial contribution to the simulation of several forming processes involving phase transitions under mechanical loads, resulting in large irreversible strains even though the applied stress is below the yield stress. One of the most elegant and widely used models is based on analytic homogenization procedures and has been proposed by Leblond et al. [1-4]. Very recently, a simple extension of Leblond's model was developed by Weisz-Patrault [8]. Several product phases are taken into account, and several assumptions are relaxed in order to extend the applicability of the model. The present contribution compares experimental tests with numerical computations in order to discuss the validity of the developed theory. To this end, experimental results extracted from the existing literature are analyzed. Results show good agreement between measurements and theoretical computations.

  14. Numerical model for radio-frequency ablation of the endocardium and its experimental validation.

    Science.gov (United States)

    Labonté, S

    1994-02-01

    A theoretical model for the study of the radio-frequency (RF) ablation technique is presented. The model relies on a finite-element time-domain calculation of the temperature distribution in a block of tissue resulting from the flow of RF current; it accounts for the cooling effect of the blood flow and allows a transient analysis. Furthermore, the nonlinearity caused by the temperature dependence of the tissue properties is also considered. The complexity of the model being appreciable, an experiment demonstrating its validity is also described. While remaining workable, the experiment is sophisticated enough to lead to convincing conclusions. It consists of measuring the temperature distribution and the time-dependent electrode resistance during "ablation" of a tissue-equivalent material. Various electrode configurations and electrical excitations are investigated. In all cases, the experimental results agree reasonably well with the numerical calculations. This confirms that the model is accurate for the investigation of RF ablation.
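
The heat balance underlying such a model can be illustrated with a much simpler scheme than the paper's finite-element formulation: a 1D explicit finite-difference solution of heat conduction with a Joule-heating source and constant properties. All values are placeholders; the actual model additionally handles blood-flow cooling, realistic electrode geometry and temperature-dependent properties.

```python
import numpy as np

# rho*c * dT/dt = k * d2T/dx2 + q_rf(x), solved explicitly on a 10 mm domain.
rho_c = 3.6e6                    # volumetric heat capacity, J/(m^3 K), tissue-like
k = 0.5                          # thermal conductivity, W/(m K)
nx, dx, dt = 101, 1e-4, 0.005    # grid and time step (dt below stability limit)
x = np.arange(nx) * dx
q_rf = 2e7 * np.exp(-x / 1e-3)   # hypothetical RF power density near the electrode

T = np.full(nx, 37.0)            # start at body temperature, deg C
for _ in range(int(10 / dt)):    # 10 s of "ablation"
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * (k * lap + q_rf[1:-1]) / rho_c
    T[0] = T[1]                  # insulated boundary at the electrode (assumption)
    T[-1] = 37.0                 # far boundary held at body temperature
print(f"peak temperature after 10 s: {T.max():.1f} deg C")
```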

  15. A validation study of a stochastic model of human interaction

    Science.gov (United States)

    Burchfield, Mitchel Talmadge

    The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, $N\int_{-\infty}^{\infty}\varphi(\chi,\tau)\,\Psi(\tau)\,d\tau$, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time, and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research, based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree

  16. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Science.gov (United States)

    Krishnamoorthi, Shankarjee; Perotti, Luigi E; Borgstrom, Nils P; Ajijola, Olujimi A; Frid, Anna; Ponnaluri, Aditya V; Weiss, James N; Qu, Zhilin; Klug, William S; Ennis, Daniel B; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  18. High Turbidity Solis Clear Sky Model: Development and Validation

    Directory of Open Access Journals (Sweden)

    Pierre Ineichen

    2018-03-01

    The Solis clear sky model is a spectral scheme based on radiative transfer calculations and the Lambert-Beer relation. Its broadband version is a simplified, fast analytical scheme; it is limited to broadband aerosol optical depths lower than 0.45, which is a weakness when it is applied in countries with very high turbidity such as China or India. In order to extend the use of the original simplified version of the model to high turbidity values, we developed a new version of the broadband Solis model based on radiative transfer calculations, valid for turbidity values up to 7, for the three components (global, beam, and diffuse) and for the four aerosol types defined by Shettle and Fenn. A validation against low-turbidity data acquired in Geneva shows slightly better results than the previous version. On data acquired at sites with higher turbidity, the bias stays within ±4% for the beam and global irradiances, and the standard deviation is around 5% for clean and stable conditions and around 12% for questionable data and variable sky conditions.
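
At the heart of the scheme is a Lambert-Beer-type extinction of the extraterrestrial irradiance along the slant path; the fitted Solis model generalizes the exponent, but the basic relation can be sketched as follows (placeholder values, not the fitted Solis coefficients):

```python
import numpy as np

I0 = 1367.0                       # extraterrestrial irradiance, W/m^2
h = np.radians(60.0)              # solar elevation angle
for tau in (0.2, 1.0, 3.0, 7.0):  # broadband optical depth, low to very high turbidity
    Ib = I0 * np.exp(-tau / np.sin(h))   # Lambert-Beer beam irradiance
    print(f"tau = {tau:>3}: beam irradiance ~ {Ib:8.1f} W/m^2")
```

At an optical depth of 7 the direct beam is essentially extinguished at this elevation, which illustrates why the high-turbidity extension matters mainly for the global and diffuse components.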

  19. Numerical modeling of shock-sensitivity experiments

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, A.L.; Forest, C.A.; Kershner, J.D.; Mader, C.L.; Pimbley, G.H.

    1981-01-01

    The Forest Fire rate model of shock initiation of heterogeneous explosives has been used to study several experiments commonly performed to measure the sensitivity of explosives to shock and to study initiation by explosive-formed jets. The minimum priming charge test, the gap test, the shotgun test, sympathetic detonation, and jet initiation have been modeled numerically using the Forest Fire rate in the reactive hydrodynamic codes SIN and 2DE.

  20. Bicycle Rider Control: Observations, Modeling & Experiments

    OpenAIRE

    Kooijman, J.D.G.

    2012-01-01

    Bicycle designers traditionally develop bicycles based on experience and trial and error. Adopting modern engineering tools to model bicycle and rider dynamics and control is another method for developing bicycles. This method has the potential to evaluate the complete design space, and thereby develop well handling bicycles for specific user groups in a much shorter time span. The recent benchmarking of the Whipple bicycle model for the balance and steer of a bicycle is an opening enabling t...

  1. Modelling of isotope exchange experiments in JET

    International Nuclear Information System (INIS)

    Ehrenberg, J.

    1987-01-01

    Isotope exchange experiments from hydrogen to deuterium in JET are theoretically described by employing a simple global isotope exchange model. Experimental results for discharges with limiter temperatures around 250 °C can be approximated by this model if an additional slow diffusion process of hydrogen in the limiter bulk is assumed. In discharges where thermal desorption occurs due to higher limiter temperatures (≳1000 °C) (post-carbonisation discharges), the changeover process seems to be predominantly governed by thermal processes. (orig.)

  2. Water balance at an arid site: a model validation study of bare soil evaporation

    Energy Technology Data Exchange (ETDEWEB)

    Jones, T.L.; Campbell, G.S.; Gee, G.W.

    1984-03-01

    This report contains results of model validation studies conducted by Pacific Northwest Laboratory (PNL) for the Department of Energy's (DOE) National Low Level Waste Management Program (NLLWMP). The model validation tests consisted of using unsaturated water flow models to simulate water balance experiments conducted at the Buried Waste Test Facility (BWTF) located at the Department of Energy's Hanford site, near Richland, Washington. The BWTF is a lysimeter facility designed to collect field data on long-term water balance and radionuclide tracer movement. It has been operated by PNL for the NLLWMP since 1978. An experimental test case, developed from data collected at the BWTF, was used to evaluate predictions from different water flow models. The major focus of the validation study was to evaluate how the use of different evaporation models affected the accuracy of predictions of evaporation, storage, and drainage made by the whole model. Four evaporation models were tested including two empirical models and two mechanistic models. The empirical models estimate actual evaporation from potential evaporation; the mechanistic models describe water vapor diffusion within the soil profile and between the soil and the atmosphere in terms of fundamental soil properties, and transport processes. The water flow models that included the diffusion-type evaporation submodels performed best overall. The empirical models performed poorly in their description of evaporation and profile water storage during summer months. The predictions of drainage were supported quite well by the experimental data. This indicates that the method used to estimate hydraulic conductivity needed for the Darcian submodel was adequate. This important result supports recommendations for these procedures that were made previously based on laboratory results.
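
The empirical evaporation submodels mentioned here estimate actual evaporation from potential evaporation. One classic scheme of that kind is a two-stage (Ritchie-type) model: energy-limited until a cumulative threshold is reached, then soil-limited with a square-root-of-time decline. The parameters below are placeholders, not those of the report's submodels.

```python
import math

def two_stage_evaporation(e_potential, u=9.0, alpha=4.0):
    """Daily actual evaporation (mm): stage 1 (E = Ep) until cumulative
    evaporation reaches U; stage 2: E = alpha*(sqrt(t) - sqrt(t-1))."""
    cumulative, stage2_day, daily = 0.0, 0, []
    for ep in e_potential:
        if cumulative < u:
            e = ep                    # stage 1: atmosphere controls the rate
        else:
            stage2_day += 1           # stage 2: soil controls the rate
            e = min(ep, alpha * (math.sqrt(stage2_day) - math.sqrt(stage2_day - 1)))
        cumulative += e
        daily.append(round(e, 2))
    return daily

print(two_stage_evaporation([6.0] * 10))  # 10 days at 6 mm/day potential
```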

  3. Validation of an extracted tooth model of endodontic irrigation.

    Science.gov (United States)

    Hope, C K; Burnside, G; Chan, S N; Giles, L H; Jarad, F D

    2011-01-01

    An extracted tooth model of endodontic irrigation, incorporating reproducible inoculation and irrigation procedures, was tested against Enterococcus faecalis using a variety of different irrigants in a Latin square methodology. ANOVA revealed no significant variations between the twelve teeth or experiments undertaken on different occasions; however, variation between irrigants was significant. Copyright © 2010 Elsevier B.V. All rights reserved.

  4. Experiment of Laser Pointing Stability on Different Surfaces to validate Micrometric Positioning Sensor

    CERN Document Server

    Mainaud Durand, Helene; Piedigrossi, Didier; Sandomierski, Jacek; Sosin, Mateusz; Geiger, Alain; Guillaume, Sebastien

    2014-01-01

    CLIC requires 10 μm precision and accuracy over 200 m for the pre-alignment of beam-related components. A solution based on a laser beam as a straight-line reference is being studied at CERN. It involves camera/shutter assemblies as micrometric positioning sensors. To validate the sensors, it is necessary to determine an appropriate material for the shutter in terms of laser pointing stability. Experiments are carried out with paper, metal and ceramic surfaces. This paper presents the standard deviations of the laser spot coordinates obtained on the different surfaces, as well as the measurement error. Our experiments validate the choice of paper and ceramic for the shutter of the micrometric positioning sensor. They also provide an estimate of the achievable precision and accuracy of the determination of the laser spot centre with respect to the shutter coordinate system defined by reference targets.

  5. Validation of a numerical FSI simulation of an aortic BMHV by in vitro PIV experiments.

    Science.gov (United States)

    Annerel, S; Claessens, T; Degroote, J; Segers, P; Vierendeels, J

    2014-08-01

    In this paper, a validation of a recently developed fluid-structure interaction (FSI) coupling algorithm to simulate numerically the dynamics of an aortic bileaflet mechanical heart valve (BMHV) is performed. This validation is done by comparing the numerical simulation results with in vitro experiments. For the in vitro experiments, the leaflet kinematics and flow fields are obtained via the particle image velocimetry (PIV) technique. Subsequently, the same case is numerically simulated by the coupling algorithm and the resulting leaflet kinematics and flow fields are obtained. Finally, the results are compared, revealing great similarity in leaflet motion and flow fields between the numerical simulation and the experimental test. Therefore, it is concluded that the developed algorithm is able to capture very accurately all the major leaflet kinematics and dynamics and can be used to study and optimize the design of BMHVs. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.

  6. Modeling Root Growth, Crop Growth and N Uptake of Winter Wheat Based on SWMS_2D: Model and Validation

    Directory of Open Access Journals (Sweden)

    Dejun Yang

    Simulations of root growth, crop growth, and N uptake in agro-hydrological models are of significant concern to researchers. SWMS_2D is one of the most widely used physically based hydrological models. This model solves the equations that govern soil-water movement by the finite element method and has a publicly accessible source code. Incorporating key agricultural components into the SWMS_2D model is of practical importance, especially for modeling critical cereal crops such as winter wheat. We added root growth, crop growth, and N uptake modules to SWMS_2D. The root growth model had two sub-models, one for root penetration and the other for root length distribution. The crop growth model used was adapted from EU-ROTATE_N and linked to the N uptake model. Soil-water limitation, nitrogen limitation, and temperature effects were all considered in the dry-weight modeling. Field experiments on winter wheat in Bouwing, the Netherlands, in 1983-1984 were selected for validation. Good agreement was achieved between simulations and measurements, including soil water content at different depths, normalized root length distribution, dry weight, and nitrogen uptake. This indicates that the proposed new modules used in the SWMS_2D model are robust and reliable. In the future, more rigorous validation should be carried out, ideally under 2D situations, and attention should be paid to improving some modules, including the module simulating soil N mineralization.

  7. Bicycle Rider Control: Observations, Modeling & Experiments

    NARCIS (Netherlands)

    Kooijman, J.D.G.

    2012-01-01

    Bicycle designers traditionally develop bicycles based on experience and trial and error. Adopting modern engineering tools to model bicycle and rider dynamics and control is another method for developing bicycles. This method has the potential to evaluate the complete design space, and thereby

  8. Criteria of validity for animal models of psychiatric disorders: focus on anxiety disorders and depression

    Science.gov (United States)

    2011-01-01

    Animal models of psychiatric disorders are usually discussed with regard to three criteria first elaborated by Willner: face, predictive and construct validity. Here, we draw the history of these concepts and then try to redraw and refine these criteria, using the framework of the diathesis model of depression that has been proposed by several authors. We thus propose a set of five major criteria (with sub-categories for some of them): homological validity (including species validity and strain validity), pathogenic validity (including ontopathogenic validity and triggering validity), mechanistic validity, face validity (including ethological and biomarker validity) and predictive validity (including induction and remission validity). Homological validity requires that an adequate species and strain be chosen: considering species validity, primates will be considered to have a higher score than drosophila, and considering strains, a high stress reactivity in a strain scores higher than a low stress reactivity in another strain. Pathogenic validity corresponds to the fact that, in order to shape pathological characteristics, the organism has been manipulated both during the developmental period (for example, maternal separation: ontopathogenic validity) and during adulthood (for example, stress: triggering validity). Mechanistic validity corresponds to the fact that the cognitive (for example, cognitive bias) or biological mechanisms (such as dysfunction of the hormonal stress axis regulation) underlying the disorder are identical in both humans and animals. Face validity corresponds to the observable behavioral (ethological validity) or biological (biomarker validity) outcomes: for example anhedonic behavior (ethological validity) or elevated corticosterone (biomarker validity). Finally, predictive validity corresponds to the identity of the relationship between the triggering factor and the outcome (induction validity) and between the effects of the treatments

  9. Methodology and issues of integral experiments selection for nuclear data validation

    Science.gov (United States)

    Tatiana, Ivanova; Ivanov, Evgeny; Hill, Ian

    2017-09-01

    Nuclear data validation involves a large suite of Integral Experiments (IEs) for criticality, reactor physics and dosimetry applications [1]. Often benchmarks are taken from international handbooks [2, 3]. Depending on the application, IEs have different degrees of usefulness in validation, and usually the use of a single benchmark is not advised; indeed, it may lead to erroneous interpretation and results [1]. This work aims at quantifying the importance of benchmarks used in application-dependent cross section validation. The approach is based on the well-known General Linear Least Squares Method (GLLSM), extended to establish biases and uncertainties for given cross sections (within a given energy interval). The statistical treatment results in a vector of weighting factors for the integral benchmarks. These factors characterize the value added by a benchmark to nuclear data validation for the given application. The methodology is illustrated by one example, selecting benchmarks for 239Pu cross section validation. The studies were performed in the framework of Subgroup 39 (Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files) established at the Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD).
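
    The GLLSM update referred to above can be sketched schematically as follows; every input (sensitivities, covariances, discrepancies) is an invented placeholder, and the columns of the gain matrix G play the role of the per-benchmark weighting factors the abstract describes.

        import numpy as np

        rng = np.random.default_rng(0)
        n_bench, n_sigma = 5, 3                   # benchmarks, cross-section groups
        S = rng.normal(size=(n_bench, n_sigma))   # sensitivity matrix (placeholder)
        M = 0.04 * np.eye(n_sigma)                # prior cross-section covariance
        V = 0.01 * np.eye(n_bench)                # benchmark experimental covariance
        d = rng.normal(scale=0.05, size=n_bench)  # E - C discrepancies (placeholder)

        G = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)  # GLLS gain matrix
        delta_sigma = G @ d                           # cross-section adjustment
        M_post = M - G @ S @ M                        # posterior covariance
        print("adjustment:", delta_sigma)
        print("posterior std:", np.sqrt(np.diag(M_post)))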

  10. Validation of symptom validity tests using a "child-model" of adult cognitive impairments

    NARCIS (Netherlands)

    Rienstra, A.; Spaan, P.E.J.; Schmand, B.

    2010-01-01

    Validation studies of symptom validity tests (SVTs) in children are uncommon. However, since children’s cognitive abilities are not yet fully developed, their performance may provide additional support for the validity of these measures in adult populations. Four SVTs, the Test of Memory Malingering

  12. The Role of Laboratory Experiments in the Validation of Field Data

    DEFF Research Database (Denmark)

    Mouneyrac, Catherine; Lagarde, Fabienne; Chatel, Amelie

    2017-01-01

    wide range of materials with different sizes, shapes, chemical natures and physicochemical properties, and their quantities/concentrations are highly variable depending on location and sampling and quantification protocols. To provide comprehensive data, interactions of MPs with the environment (water...

  13. Ovarian volume throughout life: a validated normative model.

    Science.gov (United States)

    Kelsey, Thomas W; Dodwell, Sarah K; Wilkinson, A Graham; Greve, Tine; Andersen, Claus Y; Anderson, Richard A; Wallace, W Hamish B

    2013-01-01

    The measurement of ovarian volume has been shown to be a useful indirect indicator of the ovarian reserve in women of reproductive age, in the diagnosis and management of a number of disorders of puberty and adult reproductive function, and is under investigation as a screening tool for ovarian cancer. To date there is no normative model of ovarian volume throughout life. By searching the published literature for ovarian volume in healthy females, and using our own data from multiple sources (combined n=59,994) we have generated and robustly validated the first model of ovarian volume from conception to 82 years of age. This model shows that 69% of the variation in ovarian volume is due to age alone. We have shown that in the average case ovarian volume rises from 0.7 mL (95% CI 0.4-1.1 mL) at 2 years of age to a peak of 7.7 mL (95% CI 6.5-9.2 mL) at 20 years of age with a subsequent decline to about 2.8 mL (95% CI 2.7-2.9 mL) at the menopause and smaller volumes thereafter. Our model allows us to generate normal values and ranges for ovarian volume throughout life. This is the first validated normative model of ovarian volume from conception to old age; it will be of use in the diagnosis and management of a number of diverse gynaecological and reproductive conditions in females from birth to menopause and beyond.

  14. Challenges in validating model results for first year ice

    Science.gov (United States)

    Melsom, Arne; Eastwood, Steinar; Xie, Jiping; Aaboe, Signe; Bertino, Laurent

    2017-04-01

    In order to assess the quality of model results for the distribution of first year ice, a comparison with a product based on observations from satellite-borne instruments has been performed. Such a comparison is not straightforward due to the contrasting algorithms that are used in the model product and the remote sensing product. The implementation of the validation is discussed in light of the differences between this set of products, and validation results are presented. The model product is the daily updated 10-day forecast from the Arctic Monitoring and Forecasting Centre in CMEMS. The forecasts are produced with the assimilative ocean prediction system TOPAZ. Presently, observations of sea ice concentration and sea ice drift are introduced in the assimilation step, but data for sea ice thickness and ice age (or roughness) are not included. The model computes the age of the ice by recording and updating the time passed after ice formation as sea ice grows and deteriorates as it is advected inside the model domain. Ice that is younger than 365 days is classified as first year ice. The fraction of first-year ice is recorded as a tracer in each grid cell. The Ocean and Sea Ice Thematic Assembly Centre in CMEMS redistributes a daily product from the EUMETSAT OSI SAF of gridded sea ice conditions which include "ice type", a representation of the separation of regions between those infested by first year ice, and those infested by multi-year ice. The ice type is parameterized based on data for the gradient ratio GR(19,37) from SSMIS observations, and from the ASCAT backscatter parameter. This product also includes information on ambiguity in the processing of the remote sensing data, and the product's confidence level, which have a strong seasonal dependency.
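
    A toy sketch of the ice-age bookkeeping described above (time since formation is tracked per grid cell and ice younger than 365 days is classed as first-year); the per-cell update below omits advection and uses invented array names.

        import numpy as np

        def update_ice_age(age_days, ice_present, dt_days=1.0):
            # Age existing ice; ice formed this step starts at 0;
            # open-water cells are NaN. Advection is omitted in this toy.
            new_age = age_days + dt_days
            new_age[np.isnan(age_days) & ice_present] = 0.0
            new_age[~ice_present] = np.nan
            return new_age

        age = np.full((4, 4), np.nan)      # start from open water
        ice = np.ones((4, 4), dtype=bool)  # hypothetical constant ice mask
        for _ in range(400):               # 400 daily steps
            age = update_ice_age(age, ice)
        first_year = ice & (age < 365.0)
        print("first-year ice fraction:", first_year.mean())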

  15. Validation experiments of nuclear characteristics of the fast-thermal system HERBE

    International Nuclear Information System (INIS)

    Pesic, M.; Zavaljevski, N.; Marinkovic, P.; Stefanovis, D.; Nikolic, D.; Avdic, S.

    1992-01-01

    In 1988-90, a coupled fast-thermal system, HERBE, based on similar facilities, was designed and realized at the RB reactor. The fast core of HERBE is built of natural U fuel in the RB reactor center, surrounded by a neutron filter and a neutron converter located in an independent Al tank. The fast zone is surrounded by a thermal-neutron core driver. The designed nuclear characteristics of the HERBE core are validated in the experiments described in the paper. HERBE cell parameters were calculated with the developed computer codes VESNA and DENEB. HERBE system criticality calculations were performed with the four-group (4G) 2D RZ computer codes GALER and TWENTY GRAND, the 1D multi-group AVERY code and the 3D XYZ few-group TRITON computer code. Experiments for the determination of the critical level, dρ/dH, and the reactivity of the safety rods were accomplished in order to validate the calculation results. A specific safety experiment was performed to determine the reactivity of the flooded fast zone in a possible accident. Very good agreement with the calculation results was obtained, and the validation procedures are presented. It is expected that HERBE will offer qualitatively new opportunities for work with fast neutrons at the RB reactor, including nuclear data determination. (author)

  16. Panamanian women's experience of vaginal examination in labour: A questionnaire validation.

    Science.gov (United States)

    Bonilla-Escobar, Francisco J; Ortega-Lenis, Delia; Rojas-Mirquez, Johanna C; Ortega-Loubon, Christian

    2016-05-01

    To validate a tool that allows healthcare providers to obtain accurate information regarding Panamanian women's thoughts and feelings about vaginal examination during labour that can be used in other Latin-American countries. Validation study based on a database from a cross-sectional study carried out in two tertiary care hospitals in Panama City, Panama. Women in the immediate postpartum period who had spontaneous labour onset and uncomplicated deliveries were included in the study from April to August 2008. Researchers used a survey designed by Lewin et al. that included 20 questions related to a patient's experience during a vaginal examination. Five constructs (factors) related to a patient's experience of vaginal examination during labour were identified: Approval (Cronbach's alpha 0.72), Perception (0.67), Rejection (0.40), Consent (0.51), and Stress (0.20). The validity of the scale and its constructs, used to obtain information related to vaginal examination during labour, including patients' experiences with examination and healthcare staff performance, was demonstrated. Utilisation of the scale will allow institutions to identify items that need improvement and address these areas in order to promote the best care for patients in labour. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Modeling Clinically Validated Physical Activity Assessments Using Commodity Hardware.

    Science.gov (United States)

    Winfree, Kyle N; Dominick, Gregory

    2018-03-01

    Consumer-grade wearable activity devices such as Fitbits are increasingly being used in research settings to promote physical activity (PA) due to their low-cost and widespread popularity. However, Fitbit-derived measures of activity intensity are consistently reported to be less accurate than intensity estimates obtained from research-grade accelerometers (i.e., ActiGraph). As such, the potential for using a Fitbit to measure PA intensity within research contexts remains limited. This study aims to model ActiGraph-based intensity estimates from the validated Freedson vector magnitude (VM3) algorithm using measures of steps, metabolic equivalents, and intensity levels obtained from Fitbit. Minute-level data collected from 19 subjects, who concurrently wore the ActiGraph GT3X and Fitbit Flex devices for an average of 1.8 weeks, were used to generate the model. After testing several modeling methods, a naïve Bayes classifier was chosen based on the lowest achieved error rate. Overall, the model reduced Fitbit to ActiGraph errors from 19.97% to 16.32%. Moreover, the model reduced misclassification of Fitbit-based estimates of moderate-to-vigorous physical activity (MVPA) by 40%, eliminating a statistically significant difference between MVPA estimates derived from ActiGraph and Fitbit. Study findings support the general utility of the model for measuring MVPA with the Fitbit Flex in place of the more costly ActiGraph GT3X accelerometer for young healthy adults.
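
    A compact sketch of the kind of classifier the study describes, mapping minute-level Fitbit features to ActiGraph-style intensity classes with a naïve Bayes model; the feature set and synthetic data are placeholders, not the authors' dataset.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.naive_bayes import GaussianNB

        rng = np.random.default_rng(1)
        n = 5000
        # Placeholder minute-level Fitbit features: steps, METs, intensity level.
        X = np.column_stack([rng.poisson(60, n),
                             rng.gamma(2.0, 1.5, n),
                             rng.integers(0, 4, n)])
        y = rng.integers(0, 4, n)  # stand-in for Freedson VM3 intensity classes

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = GaussianNB().fit(X_tr, y_tr)
        print("hold-out error rate:", 1.0 - clf.score(X_te, y_te))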

  18. Validation of a non-linear model of health.

    Science.gov (United States)

    Topolski, Stefan; Sturmberg, Joachim

    2014-12-01

    The purpose of this study was to evaluate the veracity of a theoretically derived model of health that describes a non-linear trajectory of health from birth to death with available population data sets. The distribution of mortality by age is directly related to health at that age, thus health approximates 1/mortality. The inverse of available all-cause mortality data from various time periods and populations was used as proxy data to compare with the theoretically derived non-linear health model predictions, using both qualitative approaches and quantitative one-sample Kolmogorov-Smirnov analysis with Monte Carlo simulation. The mortality data's inverse resembles a log-normal distribution as predicted by the proposed health model. The curves have identical slopes from birth and follow a logarithmic decline from peak health in young adulthood. A majority of the sampled populations had a good to excellent quantitative fit to a log-normal distribution, supporting the underlying model assumptions. Post hoc manipulation showed the model predictions to be stable. This is a first theory of health to be validated by proxy data, namely the inverse of all-cause mortality. This non-linear model, derived from the notion of the interaction of physical, environmental, mental, emotional, social and sense-making domains of health, gives physicians a more rigorous basis to direct health care services and resources away from disease-focused elder care towards broad-based biopsychosocial interventions earlier in life. © 2014 John Wiley & Sons, Ltd.
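
    A schematic scipy version of the paper's central check, using synthetic mortality data: invert all-cause mortality to obtain proxy health values, fit a log-normal, and apply a Kolmogorov-Smirnov test (the paper adds a Monte Carlo correction because the parameters are fitted).

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        mortality = stats.lognorm(s=0.6, scale=1e-3).rvs(500, random_state=rng)
        health = 1.0 / mortality  # proxy: health approximates 1/mortality

        shape, loc, scale = stats.lognorm.fit(health, floc=0.0)
        D, p = stats.kstest(health, "lognorm", args=(shape, loc, scale))
        # NB: with fitted parameters the nominal p-value is optimistic, which
        # is why a Monte Carlo recalibration is used in the paper.
        print("KS statistic:", D, "nominal p:", p)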

  19. Nonlinear ultrasound modelling and validation of fatigue damage

    Science.gov (United States)

    Fierro, G. P. Malfense; Ciampa, F.; Ginzburg, D.; Onder, E.; Meo, M.

    2015-05-01

    Nonlinear ultrasound techniques have shown greater sensitivity to microcracks and they can be used to detect structural damages at their early stages. However, there is still a lack of numerical models available in commercial finite element analysis (FEA) tools that are able to simulate the interaction of elastic waves with the material's nonlinear behaviour. In this study, a nonlinear constitutive material model was developed to predict the structural response under continuous harmonic excitation of a fatigued isotropic sample that showed anharmonic effects. Particularly, by means of Landau's theory and Kelvin tensorial representation, this model provided an understanding of the elastic nonlinear phenomena such as the second harmonic generation in three-dimensional solid media. The numerical scheme was implemented and evaluated using the commercially available FEA software LS-DYNA, and it showed a good numerical characterisation of the second harmonic amplitude generated by the damaged region known as the nonlinear response area (NRA). Since this process requires only the experimental second-order nonlinear parameter and a rough damage size estimation as input, it does not need any baseline testing with the undamaged structure or any dynamic modelling of the fatigue crack growth. To validate this numerical model, the second-order nonlinear parameter was experimentally evaluated at various points over the fatigue life of an aluminium (AA6082-T6) coupon and the crack propagation was measured using an optical microscope. A good correlation was achieved between the experimental set-up and the nonlinear constitutive model.
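
    For reference, the second-order nonlinear parameter the authors evaluate is conventionally obtained from the fundamental and second-harmonic amplitudes; one standard plane-wave form (not quoted in the abstract) is

        \beta = \frac{8 A_2}{k^2 x A_1^2}

    where A_1 and A_2 are the fundamental and second-harmonic displacement amplitudes, k is the wavenumber and x the propagation distance.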

  20. MT3DMS: Model use, calibration, and validation

    Science.gov (United States)

    Zheng, C.; Hill, Mary C.; Cao, G.; Ma, R.

    2012-01-01

    MT3DMS is a three-dimensional multi-species solute transport model for solving advection, dispersion, and chemical reactions of contaminants in saturated groundwater flow systems. MT3DMS interfaces directly with the U.S. Geological Survey finite-difference groundwater flow model MODFLOW for the flow solution and supports the hydrologic and discretization features of MODFLOW. MT3DMS contains multiple transport solution techniques in one code, which can often be important, including in model calibration. Since its first release in 1990 as MT3D for single-species mass transport modeling, MT3DMS has been widely used in research projects and practical field applications. This article provides a brief introduction to MT3DMS and presents recommendations about calibration and validation procedures for field applications of MT3DMS. The examples presented suggest the need to consider alternative processes as models are calibrated and suggest opportunities and difficulties associated with using groundwater age in transport model calibration.
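
    For orientation, the advection-dispersion-reaction equation solved by codes of this class is conventionally written as

        \frac{\partial (\theta C)}{\partial t} = \frac{\partial}{\partial x_i}\left( \theta D_{ij} \frac{\partial C}{\partial x_j} \right) - \frac{\partial}{\partial x_i}\left( q_i C \right) + q_s C_s + \sum R_n

    where C is the solute concentration, θ the porosity, D_ij the dispersion tensor, q_i the Darcy flux, q_s C_s a source/sink term and ΣR_n the chemical reaction terms; this is the standard textbook form, given here for context rather than quoted from the article.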

  1. Validation of kinetic modeling of progesterone release from polymeric membranes

    Directory of Open Access Journals (Sweden)

    Analia Irma Romero

    2018-01-01

    Full Text Available Mathematical modeling in drug release systems is fundamental in development and optimization of these systems, since it allows to predict drug release rates and to elucidate the physical transport mechanisms involved. In this paper we validate a novel mathematical model that describes progesterone (Prg controlled release from poly-3-hydroxybutyric acid (PHB membranes. A statistical analysis was conducted to compare the fitting of our model with six different models and the Akaike information criterion (AIC was used to find the equation with best-fit. A simple relation between mass and drug released rate was found, which allows predicting the effect of Prg loads on the release behavior. Our proposed model was the one with minimum AIC value, and therefore it was the one that statistically fitted better the experimental data obtained for all the Prg loads tested. Furthermore, the initial release rate was calculated and therefore, the interface mass transfer coefficient estimated and the equilibrium distribution constant of Prg between the PHB and the release medium was also determined. The results lead us to conclude that our proposed model is the one which best fits the experimental data and can be successfully used to describe Prg drug release in PHB membranes.

  2. Benchmarking Multilayer-HySEA model for landslide generated tsunami. NTHMP validation process.

    Science.gov (United States)

    Macias, J.; Escalante, C.; Castro, M. J.

    2017-12-01

    Landslide tsunami hazard may be dominant along significant parts of the coastline around the world, in particular in the USA, as compared to hazards from other tsunamigenic sources. This fact motivated NTHMP to benchmark models for landslide generated tsunamis, following the same methodology already used for standard tsunami models when the source is seismic. To perform the above-mentioned validation process, a set of candidate benchmarks was proposed. These benchmarks are based on a subset of available laboratory data sets for solid slide experiments and deformable slide experiments, and include both submarine and subaerial slides. A benchmark based on a historic field event (Valdez, AK, 1964) closes the list of proposed benchmarks, for a total of seven. The Multilayer-HySEA model including non-hydrostatic effects has been used to perform all the benchmarking problems dealing with laboratory experiments proposed in the workshop that was organized at Texas A&M University - Galveston, on January 9-11, 2017 by NTHMP. The aim of this presentation is to show some of the latest numerical results obtained with the Multilayer-HySEA (non-hydrostatic) model in the framework of this validation effort. Acknowledgements: This research has been partially supported by the Spanish Government Research project SIMURISK (MTM2015-70490-C02-01-R) and University of Malaga, Campus de Excelencia Internacional Andalucía Tech. The GPU computations were performed at the Unit of Numerical Methods (University of Malaga).

  3. Study of archaeological analogs for the validation of nuclear glass long-term behavior models

    International Nuclear Information System (INIS)

    Verney-Carron, A.

    2008-10-01

    Fractured archaeological glass blocks collected from a shipwreck discovered in the Mediterranean Sea near Embiez Island (Var) were investigated because of their morphological analogy with vitrified nuclear waste and of a known and stable environment. These glasses are fractured due to fast cooling after they were melted (like nuclear glass) and have been altered for 1800 years in seawater. This work results in the development and the validation of a geochemical model able to simulate the alteration of a fractured archaeological glass block over 1800 years. The kinetics associated with the different mechanisms (interdiffusion and dissolution) and the thermodynamic parameters of the model were determined by leaching experiments. The model implemented in the HYTEC software was used to simulate crack alteration over 1800 years. The consistency between simulated alteration thicknesses and measured data on glass blocks validates the capacity of the model to predict long-term alteration. The model is able to account for the results from the characterization of the crack network and its state of alteration. The cracks in the border zone are the most altered due to a fast renewal of the leaching solution, whereas internal cracks are thin because of complex interactions between glass alteration and transport of elements in solution (influence of the initial crack aperture and of crack sealing). The lowest alteration thicknesses, as well as their variability, can be explained. The analog behavior of archaeological and nuclear glasses in leaching experiments makes possible the transposition of the model to nuclear glass in a geological repository. (author)

  4. Validation of a numerical model of acoustic ceiling combined with TABS

    DEFF Research Database (Denmark)

    Rage, Nils; Kazanci, Ongun Berk; Olesen, Bjarne W.

    2016-01-01

    ...... Elements) developed to simulate partially covered suspended ceilings such as hanging sound absorbers. The tool is validated by numerically modelling a set of similar experiments carried out in full-scale by a previous study. For this, a total of 12 scenarios from two case studies have been modelled...... in the heat flow from TABS; the difference between the numerical results and measurements is in the range of -6.9% to +5.2%. The second evaluates the impact on TABS cooling capacity coefficient and room temperatures. The simulated cases led to absolute differences +4.3% higher in average for the cooling......

  5. Experimental investigations and validation of two dimensional model for multistream plate fin heat exchangers

    Science.gov (United States)

    Goyal, Mukesh; Chakravarty, Anindya; Atrey, M. D.

    2017-03-01

    Experimental investigations are carried out using a specially developed three-layer plate fin heat exchanger (PFHE), with helium as the working fluid cooled to cryogenic temperatures using liquid nitrogen (LN2) as a coolant. These results are used for validation of an already proposed and reported numerical model based on finite volume analysis for multistream (MS) plate fin heat exchangers (PFHE) for cryogenic applications (Goyal et al., 2014). The results from the experiments are presented and a reasonable agreement is observed with the already reported numerical model.

  6. Development and validation of a new two-dimensional wake model for wind turbine wakes

    DEFF Research Database (Denmark)

    Tian, Linlin; Zhu, Wei Jun; Shen, Wen Zhong

    2015-01-01

    A new two-dimensional (2D) wake model is developed and validated in this article to predict the velocity and turbulence distribution in the wake of a wind turbine. Based on the classical Jensen wake model, this model further employs a cosine shape function to redistribute the spread of the wake deficit in the crosswind direction. Moreover, a variable wake decay rate is proposed to take into account both the ambient turbulence and the rotor generated turbulence, different from the constant wake decay rate used in the Jensen model. The obtained results are compared to field measurements, wind tunnel experiments, and results of an advanced k-ω turbulence model as well as large eddy simulations. From the comparisons, it is found that the proposed new wake model gives a good prediction in terms of both shape and velocity amplitude of the wake deficit, especially in the far wake...
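
    One plausible reading of the model structure described above, as a sketch; the parameter values and the exact crosswind normalization are assumptions, not the authors' implementation.

        import numpy as np

        def wake_deficit(x, y, D=80.0, ct=0.8, k=0.075):
            # Jensen-type axial deficit with a linearly expanding wake radius,
            # redistributed across the wake by a cosine shape function.
            rw = D / 2.0 + k * x
            axial = (1.0 - np.sqrt(1.0 - ct)) * (D / (D + 2.0 * k * x)) ** 2
            shape = np.where(np.abs(y) <= rw,
                             0.5 * (1.0 + np.cos(np.pi * y / rw)), 0.0)
            return axial * shape

        x = 400.0                          # 5 rotor diameters downstream
        y = np.linspace(-120.0, 120.0, 7)  # crosswind positions (m)
        print(wake_deficit(x, y))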

  7. Off-take Model of the SPACE Code and Its Validation

    International Nuclear Information System (INIS)

    Oh, Myung Taek; Park, Chan Eok; Sohn, Jong Joo

    2011-01-01

    Liquid entrainment and vapor pull-through models of horizontal pipe have been implemented in the SPACE code. The model of SPACE accounts for the phase separation phenomena and computes the flux of mass and energy through an off-take attached to a horizontal pipe when stratified conditions occur in the horizontal pipe. This model is referred to as the off-take model. The importance of predicting the fluid conditions through an off-take in a small-break LOCA has been well known. In this case, the occurrence of the stratification can affect the break node void fraction and thus the break flow discharged from the primary system. In order to validate the off-take model newly developed for the SPACE code, a simulation of the HDU experiments has been performed. The main feature of the off-take model and its application results will be presented in this paper

  8. Validation results of satellite mock-up capturing experiment using nets

    Science.gov (United States)

    Medina, Alberto; Cercós, Lorenzo; Stefanescu, Raluca M.; Benvenuto, Riccardo; Pesce, Vincenzo; Marcon, Marco; Lavagna, Michèle; González, Iván; Rodríguez López, Nuria; Wormnes, Kjetil

    2017-05-01

    The PATENDER activity (Net parametric characterization and parabolic flight), funded by the European Space Agency (ESA) via its Clean Space initiative, aimed to validate a simulation tool for designing nets for capturing space debris. This validation has been performed through a set of different experiments under microgravity conditions where a net was launched capturing and wrapping a satellite mock-up. This paper presents the architecture of the thrown-net dynamics simulator together with the set-up of the deployment experiment and its trajectory reconstruction results on a parabolic flight (Novespace A-310, June 2015). The simulator has been implemented within the Blender framework in order to provide a highly configurable tool, able to reproduce different scenarios for Active Debris Removal missions. The experiment has been performed over thirty parabolas offering around 22 s of zero-g conditions. A flexible meshed fabric structure (the net), ejected from a container and propelled by corner masses (the bullets) arranged around its circumference, has been launched at different initial velocities and launching angles using a pneumatic-based dedicated mechanism (representing the chaser satellite) against a target mock-up (the target satellite). High-speed motion cameras recorded the experiment, allowing 3D reconstruction of the net motion. The net knots have been coloured to allow post-processing of the images using colour segmentation, stereo matching and iterative closest point (ICP) for knot tracking. The final objective of the activity was the validation of the net deployment and wrapping simulator using images recorded during the parabolic flight. The high-resolution images acquired have been post-processed to determine accurately the initial conditions and generate the reference data (position and velocity of all knots of the net along its deployment and wrapping of the target mock-up) for the simulator validation. The simulator has been properly

  9. Methane emissions from rice paddies. Experiments and modelling

    International Nuclear Information System (INIS)

    Van Bodegom, P.M.

    2000-01-01

    This thesis describes model development and experimentation on the comprehension and prediction of methane (CH4) emissions from rice paddies. The large spatial and temporal variability in CH4 emissions and the dynamic non-linear relationships between the processes underlying CH4 emissions impair the applicability of empirical relations. Mechanistic concepts are therefore the starting point of analysis throughout the thesis. The process of CH4 production was investigated by soil slurry incubation experiments at different temperatures and with additions of different electron donors and acceptors. Temperature influenced conversion rates and the competitiveness of microorganisms. The experiments were used to calibrate and validate a mechanistic model of CH4 production that describes competition for acetate and H2/CO2, inhibition effects and chemolithotrophic reactions. The redox sequence leading eventually to CH4 production was well predicted by the model, calibrating only the maximum conversion rates. Gas transport through paddy soil and rice plants was quantified by experiments in which the transport of SF6 was monitored continuously by photoacoustics. A mechanistic model of gas transport in a flooded rice system based on diffusion equations was validated by these experiments and could explain why most gases are released via plant-mediated transport. Variability in root distribution led to highly variable gas transport. Experiments showed that CH4 oxidation in the rice rhizosphere was oxygen (O2) limited. Rice rhizospheric O2 consumption was dominated by chemical iron oxidation, and heterotrophic and methanotrophic respiration. The most abundant methanotrophs and heterotrophs were isolated and kinetically characterised. Based upon these experiments it was hypothesised that CH4 oxidation mainly occurred under microaerophilic, low-acetate conditions not very close to the root surface. A mechanistic rhizosphere model that combined production and consumption of O2, carbon and iron

  10. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    Science.gov (United States)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent oriented software development. The AOM methodology was proposed to aid developers with the introduction of techniques, terminology, notation and guidelines during agent systems development. Although the AOM methodology is claimed to be capable of developing complex real world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are not many case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for epidemiology and ecological studies. Hence, it further validates AOM in a qualitative manner.

  11. Experimental validation of models for Plasma Focus devices

    International Nuclear Information System (INIS)

    Rodriguez Palomino, Luis; Gonzalez, Jose; Clausse, Alejandro

    2003-01-01

    Plasma Focus (PF) devices are thermonuclear pulsators that produce short pulsed radiation (X-rays, charged particles and neutrons). Since Filippov and Mather, investigations have used these devices to study plasma properties. Nowadays, interest in PF devices is focused on technological applications related to their use as pulsed neutron sources. For the numerical calculus, the inter-institutional PLADEMA (PLAsmas DEnsos MAgnetizados) network is developing three models, each one useful at a different engineering stage of Plasma Focus design. One of the main objectives of this work is a comparative study of the influence of the different parameters involved in each model. To validate these results, several experimental measurements under different geometries and initial conditions were performed. (author)

  12. Development and validation of a liquid composite molding model

    Science.gov (United States)

    Bayldon, John Michael

    2007-12-01

    In composite manufacturing, Vacuum Assisted Resin Transfer Molding (VARTM) is becoming increasingly important as a cost effective manufacturing method of structural composites. In this process the dry preform (reinforcement) is placed on a rigid tool and covered by a flexible film to form an airtight vacuum bag. Liquid resin is drawn under vacuum through the preform inside the vacuum bag. Modeling of this process relies on a good understanding of closely coupled phenomena. The resin flow depends on the preform permeability, which in turn depends on the local fluid pressure and the preform compaction behavior. VARTM models for predicting the flow rate in this process do exist, however, they are not able to properly predict the flow for all classes of reinforcement material. In this thesis, the continuity equation used in VARTM models is reexamined and a modified form proposed. In addition, the compaction behavior of the preform in both saturated and dry states is studied in detail and new models are proposed for the compaction behavior. To assess the validity of the proposed models, the shadow moire method was adapted and used to perform full field measurement of the preform thickness during infusion, in addition to the usual measurements of flow front position. A new method was developed and evaluated for the analysis of the moire data related to the VARTM process, however, the method has wider applicability to other full field thickness measurements. The use of this measurement method demonstrated that although the new compaction models work well in the characterization tests, they do not properly describe all the preform features required for modeling the process. In particular the effect of varying saturation on the preform's behavior requires additional study. The flow models developed did, however, improve the prediction of the flow rate for the more compliant preform material tested, and the experimental techniques have shown where additional test methods
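
    For context, VARTM flow models of this kind couple Darcy's law with a continuity equation in which the porosity is not constant; a commonly used form (the thesis's specific modification is not given in the abstract) is

        \mathbf{u} = -\frac{\mathbf{K}}{\mu}\nabla p , \qquad \nabla\cdot\mathbf{u} + \frac{\partial \phi}{\partial t} = 0

    where u is the volume-averaged resin velocity, K the preform permeability, μ the resin viscosity, p the fluid pressure, and φ the porosity, which varies with the local pressure through the preform compaction behavior.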

  13. Cultural consensus modeling to measure transactional sex in Swaziland: Scale building and validation.

    Science.gov (United States)

    Fielding-Miller, Rebecca; Dunkle, Kristin L; Cooper, Hannah L F; Windle, Michael; Hadley, Craig

    2016-01-01

    Transactional sex is associated with increased risk of HIV and gender based violence in southern Africa and around the world. However the typical quantitative operationalization, "the exchange of gifts or money for sex," can be at odds with a wide array of relationship types and motivations described in qualitative explorations. To build on the strengths of both qualitative and quantitative research streams, we used cultural consensus models to identify distinct models of transactional sex in Swaziland. The process allowed us to build and validate emic scales of transactional sex, while identifying key informants for qualitative interviews within each model to contextualize women's experiences and risk perceptions. We used logistic and multinomial logistic regression models to measure associations with condom use and social status outcomes. Fieldwork was conducted between November 2013 and December 2014 in the Hhohho and Manzini regions. We identified three distinct models of transactional sex in Swaziland based on 124 Swazi women's emic valuation of what they hoped to receive in exchange for sex with their partners. In a clinic-based survey (n = 406), consensus model scales were more sensitive to condom use than the etic definition. Model consonance had distinct effects on social status for the three different models. Transactional sex is better measured as an emic spectrum of expectations within a relationship, rather than an etic binary relationship type. Cultural consensus models allowed us to blend qualitative and quantitative approaches to create an emicly valid quantitative scale grounded in qualitative context. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Modeling Users' Experiences with Interactive Systems

    CERN Document Server

    Karapanos, Evangelos

    2013-01-01

    Over the past decade the field of Human-Computer Interaction has evolved from the study of the usability of interactive products towards a more holistic understanding of how they may mediate desired human experiences. This book identifies the notion of diversity in users' experiences with interactive products and proposes methods and tools for modeling this along two levels: (a) interpersonal diversity in users' responses to early conceptual designs, and (b) the dynamics of users' experiences over time. The Repertory Grid Technique is proposed as an alternative to standardized psychometric scales for modeling interpersonal diversity in users' responses to early concepts in the design process, and new Multi-Dimensional Scaling procedures are introduced for modeling such complex quantitative data. iScale, a tool for the retrospective assessment of users' experiences over time, is proposed as an alternative to longitudinal field studies, and a semi-automated technique for the analysis of the elicited exper...

  15. Experimental validation of solid rocket motor damping models

    Science.gov (United States)

    Riso, Cristina; Fransen, Sebastiaan; Mastroddi, Franco; Coppotelli, Giuliano; Trequattrini, Francesco; De Vivo, Alessio

    2017-12-01

    In design and certification of spacecraft, payload/launcher coupled load analyses are performed to simulate the satellite dynamic environment. To obtain accurate predictions, the system damping properties must be properly taken into account in the finite element model used for coupled load analysis. This is typically done using a structural damping characterization in the frequency domain, which is not applicable in the time domain. Therefore, the structural damping matrix of the system must be converted into an equivalent viscous damping matrix when a transient coupled load analysis is performed. This paper focuses on the validation of equivalent viscous damping methods for dynamically condensed finite element models via correlation with experimental data for a realistic structure representative of a slender launch vehicle with solid rocket motors. A second scope of the paper is to investigate how to conveniently choose a single combination of Young's modulus and structural damping coefficient—complex Young's modulus—to approximate the viscoelastic behavior of a solid propellant material in the frequency band of interest for coupled load analysis. A scaled-down test article inspired to the Z9-ignition Vega launcher configuration is designed, manufactured, and experimentally tested to obtain data for validation of the equivalent viscous damping methods. The Z9-like component of the test article is filled with a viscoelastic material representative of the Z9 solid propellant that is also preliminarily tested to investigate the dependency of the complex Young's modulus on the excitation frequency and provide data for the test article finite element model. Experimental results from seismic and shock tests performed on the test configuration are correlated with numerical results from frequency and time domain analyses carried out on its dynamically condensed finite element model to assess the applicability of different equivalent viscous damping methods to describe
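
    The structural-to-viscous conversion at the core of the paper can be illustrated by the usual single-mode equivalence, in which a hysteretic loss factor η is matched at a reference frequency ω; the paper's methods generalize this to full condensed finite element models:

        c_{eq} = \frac{\eta k}{\omega} , \qquad \zeta_{eq} = \frac{\eta}{2}

    where k is the modal stiffness and ζ_eq the equivalent viscous damping ratio at resonance.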

  16. Design-validation of a hand exoskeleton using musculoskeletal modeling.

    Science.gov (United States)

    Hansen, Clint; Gosselin, Florian; Ben Mansour, Khalil; Devos, Pierre; Marin, Frederic

    2018-04-01

    Exoskeletons are progressively reaching homes and workplaces, allowing interaction with virtual environments, remote control of robots, or assisting human operators in carrying heavy loads. Their design is however still a challenge as these robots, being mechanically linked to the operators who wear them, have to meet ergonomic constraints besides usual robotic requirements in terms of workspace, speed, or efforts. They have in particular to fit the anthropometry and mobility of their users. This traditionally results in numerous prototypes which are progressively fitted to each individual person. In this paper, we propose instead to validate the design of a hand exoskeleton in a fully digital environment, without the need for a physical prototype. The purpose of this study is thus to examine whether finger kinematics are altered when using a given hand exoskeleton. Therefore, user-specific musculoskeletal models were created and driven by a motion capture system to evaluate the fingers' joint kinematics when performing two industrial related tasks. The kinematic chain of the exoskeleton was added to the musculoskeletal models and its compliance with the hand movements was evaluated. Our results show that the proposed exoskeleton design does not influence fingers' joint angles, the coefficient of determination between the model with and without exoskeleton being consistently high (mean R² = 0.93) and the nRMSE consistently low (mean nRMSE = 5.42°). These results are promising and this approach combining musculoskeletal and robotic modeling driven by motion capture data could be a key factor in the ergonomic validation of the design of orthotic devices and exoskeletons prior to manufacturing. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Dynamic modeling and experimental validation for direct contact membrane distillation (DCMD) process

    KAUST Repository

    Eleiwi, Fadi

    2016-02-01

    This work proposes a mathematical dynamic model for the direct contact membrane distillation (DCMD) process. The model is based on a 2D Advection–Diffusion Equation (ADE), which describes the heat and mass transfer mechanisms that take place inside the DCMD module. The model studies the behavior of the process in the time-varying and steady-state phases, contributing to understanding the process performance, especially when it is driven by an intermittent energy supply, such as solar energy. The model is experimentally validated in the steady-state phase, where the permeate flux is measured for different feed inlet temperatures and the maximum absolute error recorded is 2.78 °C. Moreover, experimental validation includes the time-variation phase, where the feed inlet temperature ranges from 30 °C to 75 °C with a 0.1 °C increment every 2 min. The validation shows a relative error of less than 5%, which indicates a strong correlation between the model predictions and the experiments.
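
    A generic 2D advection-diffusion equation of the type referenced reads

        \frac{\partial T}{\partial t} + u \frac{\partial T}{\partial x} = \alpha \left( \frac{\partial^2 T}{\partial x^2} + \frac{\partial^2 T}{\partial y^2} \right)

    with T the temperature, u the channel velocity and α the thermal diffusivity; the paper's actual coefficients, coupled mass-transfer terms and boundary conditions are given in the full text.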

  18. Blast Load Simulator Experiments for Computational Model Validation: Report 1

    Science.gov (United States)

    2016-08-01

    to 2 psi) related to failures of conventional annealed glass and hollow concrete masonry unit walls. It can also simulate higher blast pressures for... Army, Air Force, Navy, and Defense Special Weapons Agency 1998, Hyde 2003) calculations were conducted to produce a waveform that matched both peak... the structures located downstream of the cascade section of the BLS.

  19. Refining Grasp Affordance Models by Experience

    DEFF Research Database (Denmark)

    Detry, Renaud; Kraft, Dirk; Buch, Anders Glent

    2010-01-01

    grasps. These affordances are represented probabilistically with grasp densities, which correspond to continuous density functions defined on the space of 6D gripper poses. A grasp density characterizes an object’s grasp affordance; densities are linked to visual stimuli through registration...... with a visual model of the object they characterize. We explore a batch-oriented, experience-based learning paradigm where grasps sampled randomly from a density are performed, and an importance-sampling algorithm learns a refined density from the outcomes of these experiences. The first such learning cycle...

  20. Development and validation of a habitat suitability model for ...

    Science.gov (United States)

    We developed a spatially-explicit, flexible 3-parameter habitat suitability model that can be used to identify and predict areas at higher risk for non-native dwarf eelgrass (Zostera japonica) invasion. The model uses simple environmental parameters (depth, nearshore slope, and salinity) to quantitatively describe habitat suitable for Z. japonica invasion based on ecology and physiology from the primary literature. Habitat suitability is defined with values ranging from zero to one, where one denotes areas most conducive to Z. japonica and zero denotes areas not likely to support Z. japonica growth. The model was applied to Yaquina Bay, Oregon, USA, an area that has well documented Z. japonica expansion over the last two decades. The highest suitability values for Z. japonica occurred in the mid to upper portions of the intertidal zone, with larger expanses occurring in the lower estuary. While the upper estuary did contain suitable habitat, most areas were not as large as in the lower estuary, due to inappropriate depth, a steeply sloping intertidal zone, and lower salinity. The lowest suitability values occurred below the lower intertidal zone, within the Yaquina River channel. The model was validated by comparison to a multi-year time series of Z. japonica maps, revealing a strong predictive capacity. Sensitivity analysis performed to evaluate the contribution of each parameter to the model prediction revealed that depth was the most important factor. Sh

  1. Validation of Storm Water Management Model Storm Control Measures Modules

    Science.gov (United States)

    Simon, M. A.; Platz, M. C.

    2017-12-01

    EPA's Storm Water Management Model (SWMM) is a computational code heavily relied upon by industry for the simulation of wastewater and stormwater infrastructure performance. Many municipalities are relying on SWMM results to design multi-billion-dollar, multi-decade infrastructure upgrades. Since the 1970's, EPA and others have developed five major releases, the most recent ones containing storm control measures modules for green infrastructure. The main objective of this study was to quantify the accuracy with which SWMM v5.1.10 simulates the hydrologic activity of previously monitored low impact developments. Model performance was evaluated with a mathematical comparison of outflow hydrographs and total outflow volumes, using empirical data and a multi-event, multi-objective calibration method. The calibration methodology utilized PEST++ Version 3, a parameter estimation tool, which aided in the selection of unmeasured hydrologic parameters. From the validation study and sensitivity analysis, several model improvements were identified to advance SWMM LID Module performance for permeable pavements, infiltration units and green roofs, and these were performed and reported herein. Overall, it was determined that SWMM can successfully simulate low impact development controls given accurate model confirmation, parameter measurement, and model calibration.
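
    A minimal sketch of one standard way to score simulated against observed outflow hydrographs, the Nash-Sutcliffe efficiency; the abstract does not name the exact comparison metric, so NSE and the data below are assumptions for illustration.

        import numpy as np

        def nse(observed, simulated):
            # Nash-Sutcliffe efficiency: 1 is a perfect match,
            # values below 0 are worse than predicting the mean.
            observed = np.asarray(observed, dtype=float)
            simulated = np.asarray(simulated, dtype=float)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
                (observed - observed.mean()) ** 2)

        obs = np.array([0.0, 1.2, 3.4, 2.1, 0.8, 0.2])  # hypothetical outflow
        sim = np.array([0.1, 1.0, 3.0, 2.4, 0.9, 0.3])
        print("NSE:", round(nse(obs, sim), 3))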

  2. Developing and investigating validity of a knowledge management game simulation model

    NARCIS (Netherlands)

    Tsjernikova, Irina

    2009-01-01

    The goals of this research project were to develop a game simulation model which supports learning knowledge management in a game environment and to investigate the validity of that model. The validity of the model is approached from two perspectives: educational validity and representational

  3. Soil process modelling in CZO research: gains in data harmonisation and model validation

    Science.gov (United States)

    van Gaans, Pauline; Andrianaki, Maria; Kobierska, Florian; Kram, Pavel; Lamacova, Anna; Lair, Georg; Nikolaidis, Nikos; Duffy, Chris; Regelink, Inge; van Leeuwen, Jeroen P.; de Ruiter, Peter

    2014-05-01

    Various soil process models were applied to four European Critical Zone observatories (CZOs), the core research sites of the FP7 project SoilTrEC: the Damma glacier forefield (CH), a set of three forested catchments on geochemically contrasting bedrocks in the Slavkov Forest (CZ), a chronosequence of soils in the former floodplain of the Danube of Fuchsenbigl/Marchfeld (AT), and the Koiliaris catchments in the north-western part of Crete (GR). The aim of the modelling exercises was to apply and test soil process models with data from the CZOs for calibration/validation, identify potential limits to the application scope of the models, interpret soil state and soil functions at key stages of the soil life cycle, represented by the four SoilTrEC CZOs, and contribute towards harmonisation of data and data acquisition. The models identified as specifically relevant were: the Penn State Integrated Hydrologic Model (PIHM), a fully coupled, multiprocess, multi-scale hydrologic model, to get a better understanding of water flow and pathways; the Soil and Water Assessment Tool (SWAT), a deterministic, continuous time (daily time step) basin scale model, to evaluate the impact of soil management practices; the Rothamsted Carbon model (Roth-C) to simulate organic carbon turnover and the Carbon, Aggregation, and Structure Turnover (CAST) model to include the role of soil aggregates in carbon dynamics; the Ligand Charge Distribution (LCD) model, to understand the interaction between organic matter and oxide surfaces in soil aggregate formation; and the Terrestrial Ecology Model (TEM) to obtain insight into the link between foodweb structure and carbon and nutrient turnover. With some exceptions all models were applied to all four CZOs. The need for specific model input contributed largely to data harmonisation. The comparisons between the CZOs turned out to be of great value for understanding the strengths and limitations of the models, as well as the differences in soil conditions

  4. Shock, release and reshock of PBX 9502: experiments and modeling

    Science.gov (United States)

    Aslam, Tariq; Gustavsen, Richard; Whitworth, Nicholas; Menikoff, Ralph; Tarver, Craig; Handley, Caroline; Bartram, Brian

    2017-06-01

    We examine shock, release and reshock into the tri-amino-tri-nitro-benzene (TATB) based explosive PBX 9502 (95% TATB, 5% Kel-F 800) from both an experimental and modeling point of view. The experiments are performed on the 2-stage light gas gun at Los Alamos National Laboratory and are composed of a multi-layered impactor impinging on PBX 9502 backed by a polymethylmethacrylate window. The objective is to initially shock the PBX 9502 in the 7 GPa range (too weak to start significant reaction), then allow a rarefaction fan to release the material to a lower pressure/temperature state. Following this release, a strong second shock will recompress the PBX. If the rarefaction fan releases the PBX to a very low pressure, the ensuing second shock can increase the entropy and temperature substantially more than in previous double-shock experiments without an intermediate release. Predictions from a variety of reactive burn models (AWSD, CREST, Ignition and Growth, SURF) demonstrate significantly different behaviors and thus the experiments are an excellent validation test of the models, and may suggest improvements for subsequent modeling efforts.

  5. Estimating the predictive validity of diabetic animal models in rosiglitazone studies.

    Science.gov (United States)

    Varga, O E; Zsíros, N; Olsson, I A S

    2015-06-01

    For therapeutic studies, predictive validity of animal models - arguably the most important feature of animal models in terms of human relevance - can be calculated retrospectively by obtaining data on treatment efficacy from human and animal trials. Using rosiglitazone as a case study, we aim to determine the predictive validity of animal models of diabetes, by analysing which models perform most similarly to humans during rosiglitazone treatment in terms of changes in standard diabetes diagnosis parameters (glycosylated haemoglobin [HbA1c] and fasting glucose levels). A further objective of this paper was to explore the impact of four covariates on the predictive capacity: (i) diabetes induction method; (ii) drug administration route; (iii) sex of animals and (iv) diet during the experiments. Despite the variable consistency of animal species-based models with the human reference for glucose and HbA1c treatment effects, our results show that glucose and HbA1c treatment effects in rats agreed better with the expected values based on human data than in other species. Induction method was also found to be a substantial factor affecting animal model performance. The study concluded that regular reassessment of animal models can help to identify human relevance of each model and adapt research design for actual research goals. © 2015 World Obesity.

  6. Validation of a "Kane's Dynamics" Model for the Active Rack Isolation System

    Science.gov (United States)

    Beech, Geoffrey S.; Hampton, R. David

    2000-01-01

    Many microgravity space-science experiments require vibratory acceleration levels unachievable without active isolation. The Boeing Corporation's Active Rack Isolation System (ARIS) employs a novel combination of magnetic actuation and mechanical linkages to address these isolation requirements on the International Space Station (ISS). ARIS provides isolation at the rack (International Standard Payload Rack, or ISPR) level. Effective model-based vibration isolation requires (1) an isolation device, (2) an adequate dynamic (i.e., mathematical) model of that isolator, and (3) a suitable, corresponding controller. ARIS provides the ISS response to the first requirement. In November 1999, the authors presented a response to the second ("A 'Kane's Dynamics' Model for the Active Rack Isolation System", Hampton and Beech) intended to facilitate an optimal-controls approach to the third. This paper documents the validation of that high-fidelity dynamic model of ARIS. As before, this model contains the full actuator dynamics; however, the umbilical models are not included in this presentation. The validation of this dynamics model was achieved by utilizing two Commercial Off the Shelf (COTS) software tools: Deneb's ENVISION and Online Dynamics' AUTOLEV. ENVISION is a robotics software package developed for the automotive industry that employs 3-dimensional (3-D) Computer Aided Design (CAD) models to facilitate both forward and inverse kinematics analyses. AUTOLEV is a DOS-based interpreter that is designed in general to solve vector-based mathematical problems and specifically to solve dynamics problems using Kane's method.

  7. Modeling variability in porescale multiphase flow experiments

    Energy Technology Data Exchange (ETDEWEB)

    Ling, Bowen; Bao, Jie; Oostrom, Mart; Battiato, Ilenia; Tartakovsky, Alexandre M.

    2017-07-01

    Microfluidic devices and porescale numerical models are commonly used to study multiphase flow in biological, geological, and engineered porous materials. In this work, we perform a set of drainage and imbibition experiments in six identical microfluidic cells to study the reproducibility of multiphase flow experiments. We observe significant variations in the experimental results, which are smaller during the drainage stage and larger during the imbibition stage. We demonstrate that these variations are due to sub-porescale geometry differences in microcells (because of manufacturing defects) and variations in the boundary condition (i.e., fluctuations in the injection rate inherent to syringe pumps). Computational simulations are conducted using the commercial software STAR-CCM+, both with constant and with randomly varying injection rates. Stochastic simulations are able to capture the variability in the experiments associated with the varying pump injection rate.

  9. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with physics-based tests such as photometry and integration-time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4-m unobscured telescope with an idealized coronagraph and an exoplanet population drawn from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters such as working angle, photon counts, and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, it revealed problems in the L2 halo orbit and in the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
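
    EXOSIMS's actual test suite is not reproduced here, but the unit-test pattern described above can be illustrated with Python's built-in unittest framework. The integration-time function below is a hypothetical shot-noise-limited stand-in, not an EXOSIMS API.

    ```python
    import unittest

    def integration_time(c_planet, c_background, snr):
        """Hypothetical shot-noise-limited integration time (s):
        t = SNR^2 * (C_p + C_b) / C_p^2, count rates in photons/s."""
        return snr**2 * (c_planet + c_background) / c_planet**2

    class TestIntegrationTime(unittest.TestCase):
        def test_known_value(self):
            # 5^2 * (10 + 5) / 10^2 = 3.75 s
            self.assertAlmostEqual(integration_time(10.0, 5.0, 5.0), 3.75)

        def test_scales_as_snr_squared(self):
            t1 = integration_time(10.0, 5.0, 5.0)
            t2 = integration_time(10.0, 5.0, 10.0)
            self.assertAlmostEqual(t2 / t1, 4.0)

    if __name__ == "__main__":
        unittest.main()
    ```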

  10. Modelling, simulation and validation of the industrial robot

    Directory of Open Access Journals (Sweden)

    Aleksandrov Slobodan Č.

    2014-01-01

    In this paper, a DH (Denavit-Hartenberg) model of an industrial robot with an anthropomorphic configuration and five degrees of freedom, the Mitsubishi RV2AJ, is developed, and the model is verified on the robot itself. The paper presents the complete mathematical model of the robot and the programming parameters in detail. On the basis of this model, robot motion is simulated both point to point and as continuous movement along a pre-defined path. The industrial robot is then programmed identically to the simulation programs, and a comparative analysis of the real and simulated experiments is presented. The final section gives a detailed analysis of the robot motion.
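
    For readers unfamiliar with DH modeling, the forward kinematics follow from chaining one homogeneous transform per joint. The sketch below uses the standard DH convention; the five parameter rows are illustrative placeholders, not the actual RV2AJ values from the paper.

    ```python
    import numpy as np

    def dh_transform(theta, d, a, alpha):
        """Homogeneous transform for one link, standard DH convention."""
        ct, st = np.cos(theta), np.sin(theta)
        ca, sa = np.cos(alpha), np.sin(alpha)
        return np.array([
            [ct, -st * ca,  st * sa, a * ct],
            [st,  ct * ca, -ct * sa, a * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0],
        ])

    def forward_kinematics(dh_rows, joint_angles):
        """Chain the per-joint transforms into the base-to-tool pose."""
        T = np.eye(4)
        for (d, a, alpha), theta in zip(dh_rows, joint_angles):
            T = T @ dh_transform(theta, d, a, alpha)
        return T

    # Placeholder (d, a, alpha) rows for a 5-DOF anthropomorphic arm
    dh_rows = [(0.300, 0.000, -np.pi / 2),
               (0.000, 0.250,  0.0),
               (0.000, 0.160, -np.pi / 2),
               (0.072, 0.000,  np.pi / 2),
               (0.000, 0.000,  0.0)]

    pose = forward_kinematics(dh_rows, [0.0, np.pi / 4, -np.pi / 4, 0.0, 0.0])
    print("tool position [m]:", pose[:3, 3])
    ```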

  11. System-Level Validation: High-Level Modeling and Directed Test Generation Techniques

    CERN Document Server

    Chen, Mingsong; Koo, Heon-Mo; Mishra, Prabhat

    2013-01-01

    This book covers state-of-the-art techniques for high-level modeling and validation of complex hardware/software systems, including those with multicore architectures. Readers will learn to avoid time-consuming and error-prone validation through comprehensive coverage of system-level validation, including high-level modeling of designs and faults, automated generation of directed tests, and an efficient validation methodology using directed tests and assertions. The methodologies described in this book will help designers improve the quality of their validation, performing as much validation as possible in the early stages of the design while reducing the overall validation effort and cost.

  12. Prediction model and experimental validation for the thermal deformation of motorized spindle

    Science.gov (United States)

    Zhang, Lixiu; Li, Jinpeng; Wu, Yuhou; Zhang, Ke; Wang, Yawen

    2018-02-01

    The thermal deformation of a motorized spindle has a great influence on the precision of numerical control (NC) machine tools. It is therefore crucial to predict the thermal deformation in the design and operation-control phases by numerical simulation and thereby improve the precision of NC machine tools. This requires an accurate thermal deformation prediction model for the motorized spindle. In this paper, a model for predicting the thermal error of a motorized spindle based on the finite element method and parameter optimization is proposed. The Levenberg-Marquardt (LM) method is applied to optimize the heat transfer coefficient of the motorized spindle using measured surface temperature data. The optimized heat transfer coefficient is then taken as one of the boundary conditions of the finite element model; the thermal boundary conditions of the model are obtained from an energy loss experiment. The proposed model is validated against experimental results, and the predictions show good correlation with the measurements.
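
    The parameter-optimization step can be pictured as a least-squares fit of the heat transfer coefficient to the measured surface temperatures. The sketch below uses SciPy's Levenberg-Marquardt implementation with a simple analytical surrogate in place of the finite element solver; all names and numbers are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def predicted_temperature(h, q_loss, t_ambient, area):
        """Surrogate for the FE solver: steady surface temperature for a
        candidate convective heat transfer coefficient h [W/(m^2 K)]."""
        return t_ambient + q_loss / (h * area)

    def residuals(params, t_measured, q_loss, t_ambient, area):
        (h,) = params
        return predicted_temperature(h, q_loss, t_ambient, area) - t_measured

    t_measured = np.array([55.2, 54.8, 55.5])     # surface temperatures, degC
    q_loss, t_ambient, area = 120.0, 25.0, 0.04   # W, degC, m^2 (assumed)

    fit = least_squares(residuals, x0=[50.0], method="lm",
                        args=(t_measured, q_loss, t_ambient, area))
    print("optimized h [W/(m^2 K)]:", fit.x[0])
    ```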

  13. The erythrocyte sedimentation rates: some model experiments.

    Science.gov (United States)

    Cerny, L C; Cerny, E L; Granley, C R; Compolo, F; Vogels, M

    1988-01-01

    In order to obtain a better understanding of the erythrocyte sedimentation rate (ESR), several models are presented. The first directs attention to the importance of geometrical models in representing the structure of mixtures; here our intention is to understand the effect of structure on the packing of red blood cells. In this part of the study, "Cheerios" (trademark General Mills) are used as a macroscopic model. It is interesting that a random sampling of "Cheerios" has the same volume distribution curve found for erythrocytes with a Coulter sizing apparatus. To examine the effect of rouleaux formation, the "Cheerios" are stacked one on top of another and glued; rouleaux of 2, 3, 4, 5, 7, and 10 discs were used. To examine a more realistic biological model, the experiments of Dintenfass were used. These investigations were performed in a split-capillary photo viscometer using whole blood from patients with a variety of diseases. The novel part of this research is that the work was performed at 1 g and at near-zero gravity in the space shuttle "Discovery." The size of the aggregates and/or rouleaux clearly showed a dependence on the gravity of the experiment. The purpose of this model was to examine the condition of self-similarity and fractal behavior. Calculations are reported which indicate general agreement in the magnitude of the fractal dimension among the "Cheerios" model, the "Discovery" experiment, and values determined with the automatic sedimentimeter. The final aspect of this work examines the surface texture of the sedimentation tube. A series of tubes was designed with "roughened" interiors; a comparison of the sedimentation rates clearly indicates more rapid settling in "roughened" tubes than in ones with a smooth interior surface.

  14. Validity of the Neuromuscular Recovery Scale: a measurement model approach.

    Science.gov (United States)

    Velozo, Craig; Moorhouse, Michael; Ardolino, Elizabeth; Lorenz, Doug; Suter, Sarah; Basso, D Michele; Behrman, Andrea L

    2015-08-01

    To determine how well the Neuromuscular Recovery Scale (NRS) items fit the Rasch, 1-parameter, partial-credit measurement model. Confirmatory factor analysis (CFA) and principal components analysis (PCA) of residuals were used to determine dimensionality. The Rasch, 1-parameter, partial-credit rating scale model was used to determine rating scale structure, person/item fit, point-measure item correlations, item discrimination, and measurement precision. Setting: seven NeuroRecovery Network clinical sites. Participants: outpatients (N=188) with spinal cord injury. Interventions: not applicable. Main outcome measure: NRS. While the NRS met 1 of 3 CFA criteria, the PCA revealed that the Rasch measurement dimension explained 76.9% of the variance. Ten of 11 items and 91% of the patients fit the Rasch model, with 9 of 11 items showing high discrimination. Sixty-nine percent of the ratings met criteria. The items showed a logical item-difficulty order, with Stand retraining as the easiest item and Walking as the most challenging item. The NRS showed no ceiling or floor effects and separated the sample into almost 5 statistically distinct strata; individuals with an American Spinal Injury Association Impairment Scale (AIS) D classification showed the most ability, and those with an AIS A classification showed the least ability. Items not meeting the rating scale criteria appear to be related to low frequency counts. The NRS met many of the Rasch model criteria for construct validity. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
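
    For reference, the partial credit model used in the analysis has the standard form (Masters, 1982): the probability that person n with ability θn receives rating x on item i with step difficulties δik is

    ```latex
    % Rasch partial credit model; by convention the k=0 term of each sum is 0.
    P(X_{ni}=x) =
      \frac{\exp \sum_{k=0}^{x} (\theta_n - \delta_{ik})}
           {\sum_{h=0}^{m_i} \exp \sum_{k=0}^{h} (\theta_n - \delta_{ik})},
      \qquad x = 0, 1, \dots, m_i .
    ```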

  15. Validation of a Simplified Model to Generate Multispectral Synthetic Images

    Directory of Open Access Journals (Sweden)

    Ion Sola

    2015-03-01

    A new procedure to assess the quality of topographic correction (TOC) algorithms applied to remote sensing imagery was previously proposed by the authors. This procedure was based on a model that simulated synthetic scenes, representing the radiance an optical sensor would receive from an area under specific conditions. TOC algorithms were then applied to the synthetic scenes, and the resulting corrected scenes were compared with a horizontal synthetic scene free of topographic effect. This comparison enabled an objective and quantitative evaluation of TOC algorithms. The approach showed promising results but had some shortcomings that are addressed herein. First, the model, originally built to simulate only broadband panchromatic scenes, is extended to multispectral scenes in the visible, near-infrared (NIR), and short-wave infrared (SWIR) bands. Next, the model is validated by comparing synthetic scenes with four Satellite pour l'Observation de la Terre 5 (SPOT5) real scenes acquired on different dates over different test areas along the Pyrenees mountain range (Spain). The results show a successful simulation of all the spectral bands. Therefore, the model is deemed accurate enough for its purpose of evaluating TOC algorithms.

  16. Validation of Vehicle Model Response with an Instrumented Experimental Vehicle

    Directory of Open Access Journals (Sweden)

    Harun Mohamad Hafiz

    2017-01-01

    A steering aid system called active steering is evaluated by simulating different kinds of driving events. The main purpose of the steering system is to allow the driver to control the vehicle independently. A full-car vehicle model with 14 degrees of freedom, comprising both the ride model and the handling model, is simulated in Matlab/Simulink. The steering angle is the input to the vehicle model, and the angle of the steering system between the tires when turning the vehicle is taken into consideration. Simulations are performed for different road conditions and for side-wind disturbances, and different parameter values are applied in the simulations to reduce the effect of these driving events; the simulation results therefore point to improvements of the steering system. The aim of this work is to validate the vehicle model response against an instrumented experimental vehicle. The specific driving events in these simulations are changes in road adhesion and lateral side-wind disturbances.
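
    The full 14-DOF model is not reproduced here; as a much simpler illustration of how a steering-angle input drives lateral vehicle response, the sketch below integrates a linear single-track ("bicycle") handling model. All parameter values are assumed, not those of the instrumented vehicle.

    ```python
    import numpy as np

    m, Iz = 1500.0, 2500.0      # mass [kg] and yaw inertia [kg m^2] (assumed)
    lf, lr = 1.2, 1.4           # CG-to-front/rear-axle distances [m]
    Cf, Cr = 8.0e4, 9.0e4       # cornering stiffnesses [N/rad]
    vx = 20.0                   # constant forward speed [m/s]

    def derivs(state, delta):
        """Lateral-velocity / yaw-rate derivatives for steering angle delta."""
        vy, r = state
        alpha_f = delta - (vy + lf * r) / vx   # front tire slip angle
        alpha_r = -(vy - lr * r) / vx          # rear tire slip angle
        Fyf, Fyr = Cf * alpha_f, Cr * alpha_r  # lateral tire forces
        vy_dot = (Fyf + Fyr) / m - vx * r
        r_dot = (lf * Fyf - lr * Fyr) / Iz
        return np.array([vy_dot, r_dot])

    # Forward-Euler response to a 2-degree step steer
    state, dt = np.array([0.0, 0.0]), 1.0e-3
    for _ in range(3000):
        state = state + dt * derivs(state, np.radians(2.0))
    print("steady-state yaw rate [rad/s]:", state[1])
    ```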

  17. ADOPT: A Historically Validated Light Duty Vehicle Consumer Choice Model

    Energy Technology Data Exchange (ETDEWEB)

    Brooker, A.; Gonder, J.; Lopp, S.; Ward, J.

    2015-05-04

    The Automotive Deployment Option Projection Tool (ADOPT) is a light-duty vehicle consumer choice and stock model supported by the U.S. Department of Energy's Vehicle Technologies Office. It estimates the impacts of technology improvements on U.S. light-duty vehicle sales, petroleum use, and greenhouse gas emissions. ADOPT uses techniques from the multinomial logit method and the mixed logit method to estimate sales. Specifically, it estimates sales based on the weighted value of key attributes including vehicle price, fuel cost, acceleration, range, and usable volume. The average importance of several attributes changes nonlinearly across their ranges and changes with income. For several attributes, a distribution of importance around the average value is used to represent consumer heterogeneity. The majority of existing vehicle makes, models, and trims are included to fully represent the market, and the Corporate Average Fuel Economy regulations are enforced. The sales feed into the ADOPT stock model, which captures the key aspects needed to sum petroleum use and greenhouse gas emissions: the change in vehicle miles traveled by vehicle age, the creation of new model options based on the success of existing vehicles, limits on the rate of new vehicle option introduction, and survival rates by vehicle age. ADOPT has been extensively validated with historical sales data; it matches in key dimensions including sales by fuel economy, acceleration, price, vehicle size class, and powertrain across multiple years. A graphical user interface manages the inputs, simulation, and results, providing easy and efficient use.
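
    At the core of the multinomial logit method, predicted market shares are a softmax over the weighted attribute utilities. The sketch below shows only this mechanic; the utilities and implied weights are made-up numbers, not ADOPT's calibrated values.

    ```python
    import numpy as np

    def logit_shares(utilities):
        """Multinomial logit choice probabilities (softmax of utilities)."""
        u = np.asarray(utilities, dtype=float)
        e = np.exp(u - u.max())          # subtract max for numerical stability
        return e / e.sum()

    # Illustrative utilities: weighted sums over price, fuel cost,
    # acceleration, range, and usable volume for three hypothetical vehicles.
    utilities = np.array([-1.2, -0.8, -1.5])
    print(logit_shares(utilities))       # predicted market shares
    ```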

  18. Using of Structural Equation Modeling Techniques in Cognitive Levels Validation

    Directory of Open Access Journals (Sweden)

    Natalija Curkovic

    2012-10-01

    When constructing knowledge tests, cognitive level is usually one of the dimensions comprising the test specifications, with each item assigned to measure a particular level. Recently used taxonomies of cognitive levels most often represent some modification of the original Bloom's taxonomy, and there are many concerns in the current literature about the existence of predefined cognitive levels. The aim of this article is to investigate whether structural equation modeling techniques can confirm the existence of different cognitive levels. For the purpose of the research, a Croatian final high-school Mathematics exam was used (N = 9626). Confirmatory factor analysis and structural regression modeling were used to test three different models. Structural equation modeling techniques did not support the existence of different cognitive levels in this case. There is more than one possible explanation for that finding. Other techniques that take into account the nonlinear behaviour of the items, as well as qualitative techniques, might be more useful for the purpose of cognitive levels validation. Furthermore, it seems that cognitive levels were not efficient descriptors of the items, so improvements are needed in describing the cognitive skills measured by items.

  19. Dynamic crack initiation toughness : experiments and peridynamic modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Foster, John T.

    2009-10-01

    This dissertation presents research on the dynamic crack initiation toughness of a 4340 steel. Researchers have been conducting experimental testing of dynamic crack initiation toughness, K_Ic, for many years, using many experimental techniques, with vastly different trends in the results when reporting K_Ic as a function of loading rate. The dissertation describes a novel experimental technique for measuring K_Ic in metals using the Kolsky bar. The method borrows from improvements made in recent years in traditional Kolsky bar testing by using pulse shaping techniques to ensure a constant loading rate applied to the sample before crack initiation. Dynamic crack initiation measurements were reported on a 4340 steel at two different loading rates. The steel was shown to exhibit a rate dependence, with the recorded values of K_Ic being much higher at the higher loading rate. Using this rate dependence as motivation in attempting to model the fracture events, a viscoplastic constitutive model was implemented into a peridynamic computational mechanics code. Peridynamics is a newly developed theory in solid mechanics that replaces the classical partial differential equations of motion with integro-differential equations, which do not require the existence of spatial derivatives in the displacement field. This allows for the straightforward modeling of unguided crack initiation and growth. To date, peridynamic implementations have used severely restricted constitutive models; this research represents the first implementation of a complex material model and its validation. After presenting results that compare computed deformations with experimental Taylor anvil impact data for the viscoplastic material model, a novel failure criterion is introduced to model the dynamic crack initiation toughness experiments. The failure model is based on an energy criterion and uses the K_Ic values recorded experimentally as an input.

  20. Development and validation of a cystic fibrosis patient and family member experience of care survey.

    Science.gov (United States)

    Homa, Karen; Sabadosa, Kathryn A; Nelson, Eugene C; Rogers, William H; Marshall, Bruce C

    2013-01-01

    The purpose of this study was to develop a cystic fibrosis (CF)-specific patient and family experience of care survey that CF care centers could use to inform quality improvement efforts. A literature search and a query of CF care centers were conducted to identify existing surveys, and individuals with CF, their families, and health care professionals were asked what to include. Following this process, a draft survey was developed and reviewed by focus groups. Finally, a version was piloted at 25 CF care centers to validate and further refine the instrument. No CF-specific surveys were found in the literature. Focus group participants stated that they understood the survey questions and that the questions covered important aspects of care, particularly infection control. The pilot test of the instrument with 485 participants supported its validity by demonstrating significant differences across centers; most of the 3 care dimensions had acceptable internal consistency (Cronbach α: adults, 0.71-0.85; children, 0.68-0.79). A CF-specific patient and family experience of care survey was thus developed with input from individuals with CF, their families, and health care professionals. The instrument was validated and has been deployed to CF care centers.

  1. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. We therefore present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in the early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters across multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the life cycle concept. These applications are shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipating out-of-specification (OOS) events, identifying critical process parameters, and taking risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
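
    The MC idea can be sketched compactly: sample the parameter distributions of stacked unit operations, propagate them through (here purely illustrative) transfer functions, and count how often the final CQA falls outside specification. All distributions and limits below are assumptions, not values from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Hypothetical two-step chain: upstream titer and capture-step yield both
    # vary; the final purity-like CQA combines them with measurement noise.
    titer = rng.normal(1.00, 0.05, n)          # normalized fermentation titer
    capture_yield = rng.normal(0.90, 0.03, n)  # chromatography step yield
    cqa = 80.0 * titer * capture_yield + rng.normal(0.0, 1.0, n)

    spec_lower = 60.0                          # assumed specification limit
    oos_probability = np.mean(cqa < spec_lower)
    print(f"estimated OOS probability: {oos_probability:.4f}")
    ```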

  2. PIV validation of blood-heart valve leaflet interaction modelling.

    Science.gov (United States)

    Kaminsky, R; Dumont, K; Weber, H; Schroll, M; Verdonck, P

    2007-07-01

    The aim of this study was to validate the 2D computational fluid dynamics (CFD) results of a moving heart valve, based on a fluid-structure interaction (FSI) algorithm, against experimental measurements. First, pulsatile laminar flow through a monoleaflet valve model with a stiff leaflet was visualized by means of particle image velocimetry (PIV). The inflow data sets were applied to a CFD simulation including blood-leaflet interaction. The measurement section with a fixed leaflet was enclosed in a standard mock loop in series with a Harvard Apparatus pulsatile blood pump, a compliance chamber, and a reservoir. Standard 2D PIV measurements were made at a frequency of 60 bpm, and average velocity magnitude results of 36 phase-locked measurements were evaluated at every 10 degrees of the pump cycle. For the CFD flow simulation, a commercially available package from Fluent Inc. was used in combination with in-house-developed FSI code based on the Arbitrary Lagrangian-Eulerian (ALE) method. The CFD code was then applied to the leaflet to quantify the shear stress on it. Generally, the CFD results agree with the PIV-evaluated data in the major flow regions, thereby validating the FSI simulation of a monoleaflet valve with a flexible leaflet. The applicability of the new CFD code for quantifying the shear stress on a flexible leaflet is thus demonstrated.
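
    Near a wall, the leaflet shear stress reported by such codes reduces to the fluid viscosity times the wall-normal velocity gradient. The sketch below estimates it from a few near-wall velocity samples, as one might extract from either PIV or CFD; all numbers are illustrative.

    ```python
    import numpy as np

    mu = 3.5e-3                                   # blood-analog viscosity, Pa.s (assumed)
    y = np.array([0.0, 0.2e-3, 0.4e-3, 0.6e-3])   # wall-normal distance, m
    u = np.array([0.0, 0.08, 0.15, 0.21])         # tangential velocity, m/s

    dudy_wall = np.polyfit(y, u, 1)[0]            # slope of near-wall profile
    print("wall shear stress [Pa]:", mu * dudy_wall)
    ```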

  3. Model of an Evaporating Drop Experiment

    Science.gov (United States)

    Rodriguez, Nicolas

    2017-11-01

    A computational model of an experimental procedure for measuring the vapor distributions surrounding sessile drops is developed to evaluate the uncertainty in the experimental results. Methanol, which is expected to have predominantly diffusive vapor transport, is chosen as a validation test for our model. The experimental process first uses a Fourier transform infrared spectrometer to measure the absorbance along lines passing through the vapor cloud. Since the measurements contain some error, our model allows random noise to be added to the computed integrated absorbance to mimic this. The resulting data are then interpolated before passing through a computed tomography routine that generates the vapor distribution. Next, the gradients of the vapor distribution are computed along a given control volume surrounding the drop, so that the diffusive flux can be evaluated as the net rate of diffusion out of the control volume. Our model of methanol evaporation shows that the accumulated errors of the whole experimental procedure affect the diffusive fluxes at different control volumes and are sensitive to how the noisy integrated-absorbance data are interpolated. This indicates the importance of investigating a variety of data-fitting methods to choose the one that best represents the data. Trinity University Mach Fellowship.
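
    The control-volume step can be written down directly: with Fick's law, the net rate of diffusion out of the volume is the surface integral of -D∇c·n. The sketch below evaluates this on a synthetic concentration field; the field, the diffusivity, and the face indices are all illustrative assumptions.

    ```python
    import numpy as np

    # Synthetic vapor concentration field on a uniform 2-D grid (illustrative
    # stand-in for the tomographically reconstructed distribution).
    D = 1.5e-5                                   # assumed diffusivity, m^2/s
    x = np.linspace(-0.01, 0.01, 201)            # m
    y = np.linspace(0.0, 0.02, 201)              # m
    X, Y = np.meshgrid(x, y, indexing='ij')
    c = 1e-3 * np.exp(-(X**2 + Y**2) / 2.0e-5)   # mock distribution, kg/m^3

    dcdx, dcdy = np.gradient(c, x, y)            # concentration gradients
    dx, dy = x[1] - x[0], y[1] - y[0]

    # Net diffusive rate out of a rectangular control volume (per unit depth):
    # sum J.n = -D * grad(c).n over each face, with outward normals. The
    # bottom face (substrate and drop) is excluded.
    i0, i1, j1 = 50, 150, 150
    rate = (D * dcdx[i0, :j1].sum() * dy         # left face,  n = (-1, 0)
            - D * dcdx[i1, :j1].sum() * dy       # right face, n = (+1, 0)
            - D * dcdy[i0:i1, j1].sum() * dx)    # top face,   n = (0, +1)
    print("net diffusion rate out of control volume:", rate)
    ```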

  4. The Development and Validation of the Social Networking Experiences Questionnaire: A Measure of Adolescent Cyberbullying and Its Impact.

    Science.gov (United States)

    Dredge, Rebecca; Gleeson, John; Garcia, Xochitl de la Piedad

    2015-01-01

    The measurement of cyberbullying has been marked by several inconsistencies that lead to difficulties in cross-study comparisons of the frequency of occurrence and the impact of cyberbullying. Consequently, the first aim of this study was to develop a measure of the experience with, and the impact of, cyberbullying victimization in social networking sites among adolescents. The second aim was to investigate the psychometric properties of the purpose-built measure (the Social Networking Experiences Questionnaire [SNEQ]). Exploratory factor analysis on 253 adolescent social networking site users produced a six-factor model of impact; however, one factor was removed because of low internal consistency. Cronbach's alpha was higher than .76 for the victimization subscale and the remaining five impact subscales. Furthermore, correlation coefficients for the Victimization scale and related dimensions showed good construct validity. The utility of the SNEQ for victim support personnel, research, and cyberbullying education/prevention programs is discussed.

  5. Ceramic bar impact experiments for improved material model

    International Nuclear Information System (INIS)

    Brar, N.S.; Proud, W.G.; Rajendran, A.M.

    2004-01-01

    Ceramic bar-on-bar (uniaxial stress) experiments are performed to extend the uniaxial strain deformation states imposed in flyer plate impact experiments. A number of investigators engaged in modeling bar-on-bar experiments have had varying degrees of success in capturing the observed fracture modes in the bars and in correctly simulating the measured in-situ axial stress or free surface velocity histories. The difficulties encountered are related to uncertainties in understanding the dominant failure mechanisms as a function of the different stress states imposed in bar impacts. The free surface velocity of the far end of the target AD998 bar was measured using a VISAR in a series of bar-on-bar impact experiments at nominal impact speeds of 100 m/s, 220 m/s, and 300 m/s. The velocity history data at an impact speed of 100 m/s show an elastic material response; at the higher impact speeds of 220 m/s and 300 m/s, the velocity history data suggest an inelastic response. A high-speed (Imacon) camera was employed to examine the fracture and failure of the impactor and target bars. The high-speed photographs provide comprehensive data on the geometry of the damage and failure patterns as a function of time, against which the validity of a particular constitutive material model for AD998 alumina, used in numerical simulations of the fracture and failure of the bars on impact, can be checked.
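
    A quick one-dimensional elastic estimate supports the observation of an elastic response at the lowest speed: for identical bars, the particle velocity behind the wave is half the impact speed and the axial stress is ρ·c·up. The density below is a nominal AD998 value; the bar wave speed is an assumed round number.

    ```python
    rho = 3890.0       # AD998 alumina density, kg/m^3 (nominal)
    c_bar = 9700.0     # elastic bar wave speed sqrt(E/rho), m/s (assumed)
    v_impact = 100.0   # impact speed, m/s

    u_p = v_impact / 2.0           # particle velocity for identical bars
    sigma = rho * c_bar * u_p      # axial stress behind the elastic wave
    print(f"axial stress ~ {sigma / 1e9:.2f} GPa")  # ~1.9 GPa, below the HEL
    ```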

  6. An Overview of the Second SAGE III Ozone Loss and Validation Experiment (SOLVE-II)

    Science.gov (United States)

    Newman, P. A.

    2003-12-01

    The SOLVE-II field mission was a campaign designed to investigate polar ozone loss, polar stratospheric clouds, the processes that lead to ozone loss, and the dynamics of the polar stratosphere, and to acquire correlative data needed to validate satellite measurements of the polar stratosphere. The campaign was closely coordinated with the VINTERSOL-EUPLEX campaigns; this combined international campaign was staged over the course of the winter of 2002-2003. SOLVE-II measurements were made from the NASA DC-8 aircraft, ozonesondes and other balloon payloads, ground-based instruments, and satellites. In particular, SOLVE-II was designed to validate the Meteor-3M/Stratospheric Aerosol and Gas Experiment (SAGE) III satellite mission. We will review the overall objectives of the combined campaigns, discuss some of the broad observations of the winter of 2002-2003, and highlight the major findings of the campaign.

  7. Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences.

    Science.gov (United States)

    Harman, Elena; Azzam, Tarek

    2018-02-01

    This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants is asked to read an interview transcript, to identify whether program theory components (Activities and Outcomes) are discussed, and to highlight the most relevant passage about each component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Utilizing Chamber Data for Developing and Validating Climate Change Models

    Science.gov (United States)

    Monje, Oscar

    2012-01-01

    Controlled environment chambers (e.g., growth chambers, SPAR chambers, or open-top chambers) are useful for measuring plant ecosystem responses to climatic variables and to the CO2 concentrations that affect plant water relations. However, data from chambers have been found to overestimate the responses of carbon fluxes to CO2 enrichment. Chamber data may be confounded by numerous artifacts (e.g., sidelighting, edge effects, increased temperature and VPD), and this limits what can be measured accurately. Chambers can be used to measure canopy-level energy balance under controlled conditions, and plant transpiration responses to CO2 concentration can be elucidated. However, these measurements cannot be used directly in model development or validation. The response of stomatal conductance to CO2 will be the same as in the field, but the measured response must be recalculated in a manner that accounts for the differences in aerodynamic conductance, temperature, and VPD between the chamber and the field.
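
    One concrete reason chamber responses do not transfer directly is that transpiration is controlled by stomatal and aerodynamic conductances acting in series, and the well-stirred air of a chamber makes the aerodynamic term far larger than in the field. A minimal sketch of that series coupling, with assumed conductance values:

    ```python
    def total_conductance(gs, ga):
        """Series coupling of stomatal (gs) and aerodynamic (ga) conductance,
        mol m^-2 s^-1: the smaller term dominates the combined value."""
        return 1.0 / (1.0 / gs + 1.0 / ga)

    gs = 0.3                               # stomatal conductance (assumed)
    print(total_conductance(gs, ga=10.0))  # chamber-like: ~0.29, gs controls
    print(total_conductance(gs, ga=0.5))   # field-like: ~0.19, ga matters too
    ```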

  9. PIV-validated numerical modeling of pulsatile flows in distal coronary end-to-side anastomoses.

    Science.gov (United States)

    Xiong, F L; Chong, C K

    2007-01-01

    This study employed particle image velocimetry (PIV) to validate a numerical model in a complementary approach to quantify hemodynamic factors in distal coronary anastomoses and to gain more insights on their relationship with anastomotic geometry. Instantaneous flow fields and wall shear stresses (WSS) were obtained from PIV measurement in a modified life-size silastic anastomosis model adapted from a conventional geometry by incorporating a smooth graft-artery transition. The results were compared with those predicted by a concurrent numerical model. The numerical method was then used to calculate cycle-averaged WSS (WSS(cyc)) an