WorldWideScience

Sample records for model validation experiments

  1. Full-Scale Cookoff Model Validation Experiments

    Energy Technology Data Exchange (ETDEWEB)

    McClelland, M A; Rattanapote, M K; Heimdahl, E R; Erikson, W E; Curran, P O; Atwood, A I

    2003-11-25

    This paper presents the experimental results of the third and final phase of a cookoff model validation effort. In this phase of the work, two generic Heavy Wall Penetrators (HWP) were tested in two heating orientations. Temperature and strain gage data were collected over the entire test period. Predictions for the time and temperature of reaction were made prior to release of the live data; the predictions were comparable to the measured values and were highly dependent on the established boundary conditions. Both HWP tests failed at a weld located near the aft closure of the device. More than 90 percent of the unreacted explosive was recovered in the end-heated experiment, and less than 30 percent in the side-heated test.

  2. Experiments for foam model development and validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bourdon, Christopher Jay; Cote, Raymond O.; Moffat, Harry K.; Grillet, Anne Mary; Mahoney, James F. (Honeywell Federal Manufacturing and Technologies, Kansas City Plant, Kansas City, MO); Russick, Edward Mark; Adolf, Douglas Brian; Rao, Rekha Ranjana; Thompson, Kyle Richard; Kraynik, Andrew Michael; Castaneda, Jaime N.; Brotherton, Christopher M.; Mondy, Lisa Ann; Gorby, Allen D.

    2008-09-01

    A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

  3. Blast Load Simulator Experiments for Computational Model Validation: Report 2

    Science.gov (United States)

    2017-02-01

    O’Daniel, 2016. Blast load simulator experiments for computational model validation – Report 1. ERDC/GSL TR-16-27. Vicksburg, MS: U.S. Army Engineer Research and Development Center. [The remainder of this record is report-cover residue from Report 2: ERDC/GSL TR-16-27, Blast Load Simulator Experiments for Computational Model Validation, Report 2, Geotechnical and Structures Laboratory; approved for public release, distribution unlimited.]

  4. Model validation for karst flow using sandbox experiments

    Science.gov (United States)

    Ye, M.; Pacheco Castro, R. B.; Tao, X.; Zhao, J.

    2015-12-01

    The study of flow in karst is complex due to the high heterogeneity of the porous media. Several approaches have been proposed in the literature to overcome the natural complexity of karst, among them the single continuum, the double continuum, and the discrete network of conduits coupled with the single continuum. Several mathematical and computational models are available in the literature for each approach. In this study one computer model has been selected for each category to evaluate its usefulness for modeling flow in karst using a sandbox experiment. The models chosen are Modflow 2005, Modflow CFPV1, and Modflow CFPV2. A sandbox experiment was implemented in such a way that all the parameters required by each model can be measured, and it was repeated several times under different conditions. The model validation will be carried out by comparing the results of the model simulations with the real data. This validation will allow us to compare the accuracy of each model and its applicability in karst, and to evaluate whether the results of the complex models improve substantially on those of the simple models, especially since some models require complex parameters that are difficult to measure in the real world.

  5. Calibration of Predictor Models Using Multiple Validation Experiments

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2015-01-01

    This paper presents a framework for calibrating computational models using data from several, possibly dissimilar, validation experiments. The offset between model predictions and observations, which might be caused by measurement noise, model-form uncertainty, and numerical error, drives the process by which uncertainty in the model's parameters is characterized. The resulting description of uncertainty, along with the computational model, constitutes a predictor model. Two types of predictor models are studied: Interval Predictor Models (IPMs) and Random Predictor Models (RPMs). IPMs use sets to characterize uncertainty, whereas RPMs use random vectors. The propagation of a set through a model makes the response an interval-valued function of the state, whereas the propagation of a random vector yields a random process. Optimization-based strategies for calculating both types of predictor models are proposed. Whereas the formulations used to calculate IPMs target solutions leading to the interval-valued function of minimal spread containing all observations, those for RPMs seek to maximize the models' ability to reproduce the distribution of observations. Regarding RPMs, we choose a structure for the random vector (i.e., the assignment of probability to points in the parameter space) solely dependent on the prediction error. As such, the probabilistic description of uncertainty is not a subjective assignment of belief, nor is it expected to asymptotically converge to a fixed value; instead, it reflects the model's ability to reproduce the experimental data. This framework enables evaluating the spread and distribution of the predicted response of target applications that depend on the same parameters beyond the validation domain.
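
    As a rough illustration of the IPM formulation described in this abstract (not the authors' implementation), the sketch below computes an interval-valued linear model of minimal average spread containing all observations, posed as a linear program over a center line and a state-dependent half-width; the toy data and the linear-in-parameters form are assumptions.

```python
# Illustrative Interval Predictor Model: fit a center line c0 + c1*x with
# half-width rho(x) = r0 + r1*|x|, minimizing the average spread subject to
# every observation lying inside [center - rho, center + rho] (a linear
# program).  Toy data; the linear-in-parameters form is an assumption.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 40)
y = 1.5 + 0.8 * x + rng.normal(0.0, 0.2, x.size)   # synthetic observations
ax = np.abs(x)

# Decision variables z = [c0, c1, r0, r1]; objective = sum_i rho(x_i).
cost = np.array([0.0, 0.0, x.size, ax.sum()])
ones = np.ones_like(x)
A_ub = np.vstack([
    np.column_stack([-ones, -x, -ones, -ax]),   # y_i <= center + rho
    np.column_stack([ones, x, -ones, -ax]),     # center - rho <= y_i
])
b_ub = np.concatenate([-y, y])
res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None), (None, None), (0, None), (0, None)])
c0, c1, r0, r1 = res.x
print(f"center: {c0:.3f} + {c1:.3f}x, half-width: {r0:.3f} + {r1:.3f}|x|")
```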

  6. MODEL IMPROVEMENT AND EXPERIMENT VALIDATION OF PNEUMATIC ARTIFICIAL MUSCLES

    Institute of Scientific and Technical Information of China (English)

    Zhou Aiguo; Shi Guanglin; Zhong Tingxiu

    2004-01-01

    According to the deficiencies of the present models of pneumatic artificial muscles (PAM), a serial model is built up from the PAM's essential working principle using elastic theory. It is validated against quasi-static and dynamic experimental results obtained from two experimental systems. The experimental and simulation results illustrate that the serial model describes the force characteristics of PAM more precisely than Chou's model. A compensation term accounting for the braid's elasticity and Coulomb damping is added to the serial model based on analysis of the experimental results. The dynamic experiments show that the viscous damping of the PAM can be ignored, simplifying the model. Finally, an improved serial model of PAM is obtained.
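
    For context, the baseline referred to as Chou's model is the widely cited Chou-Hannaford geometric McKibben-muscle model, which follows from virtual work on the braid geometry. A sketch with illustrative braid dimensions is given below; the abstract's improved serial model is not reproduced here.

```python
# Chou's (Chou-Hannaford) McKibben-muscle model, the baseline the abstract's
# serial model improves on: F = p*b^2*(3*cos(theta)^2 - 1)/(4*pi*n^2), with
# cos(theta) = L/b.  Braid dimensions below are illustrative, not the paper's.
import math

def chou_force(p, length, b=0.30, n=10.0):
    """p: gauge pressure (Pa); length: actuator length (m);
    b: braid strand length (m); n: turns of a strand about the axis."""
    cos_t = length / b
    return p * b**2 * (3.0 * cos_t**2 - 1.0) / (4.0 * math.pi * n**2)

for L in (0.28, 0.26, 0.24):   # force drops as the muscle contracts
    print(f"L = {L:.2f} m -> F = {chou_force(300e3, L):.1f} N")
```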

  7. Structure Modeling and Validation applied to Source Physics Experiments (SPEs)

    Science.gov (United States)

    Larmat, C. S.; Rowe, C. A.; Patton, H. J.

    2012-12-01

    The U. S. Department of Energy's Source Physics Experiments (SPEs) comprise a series of small chemical explosions used to develop a better understanding of seismic energy generation and wave propagation for low-yield explosions. In particular, we anticipate improved understanding of the processes through which shear waves are generated by the explosion source. Three tests, with yields of 100, 1000, and 1000 kg respectively, were detonated in the same emplacement hole and recorded on the same networks of ground motion sensors in the granites of Climax Stock at the Nevada National Security Site. We present results for the analysis and modeling of seismic waveforms recorded close-in on five linear geophone lines extending radially from ground zero, with offsets from 100 to 2000 m and station spacing of 100 m. These records exhibit azimuthal variations in P-wave arrival times and in the phase velocity, spreading, and attenuation properties of high-frequency Rg waves. We construct a 1D seismic body-wave model starting from a refraction analysis of P-waves and adjust it to match time-domain and frequency-domain dispersion measurements of Rg waves between 2 and 9 Hz. The shallowest part of the structure is addressed using the arrival times recorded by near-field accelerometers residing within 200 m of the shot hole. We additionally perform a 2D modeling study with the Spectral Element Method (SEM) to investigate which structural features are most responsible for the observed variations, in particular the anomalously weak amplitude decay in some directions of this topographically complicated locality. We find that a thin, near-surface weathered layer of varying thickness and low wave speeds plays a major role in shaping the observed waveforms. We anticipate performing full 3D modeling of the seismic near-field through analysis and validation of waveforms on the five radial receiver arrays.

  8. Contribution to a dynamic wind turbine model validation from a wind farm islanding experiment

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Kaas; Pedersen, Knud Ole Helgesen; Poulsen, Niels Kjølstad;

    2003-01-01

    Measurements from an islanding experiment on the Rejsby Hede wind farm, Denmark, are used for the validation of the dynamic model of grid-connected, stall-controlled wind turbines equipped with induction generators. The simulated results are found to be in good agreement with the measurements, and possible discrepancies are explained. The work with the wind turbine model validation relates to dynamic stability investigations on the incorporation of a large amount of wind power in the Danish power grid, where the dynamic wind turbine model is applied.

  9. Numerical studies and metric development for validation of magnetohydrodynamic models on the HIT-SI experiment

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, C., E-mail: hansec@uw.edu [PSI-Center, University of Washington, Seattle, Washington 98195 (United States); Columbia University, New York, New York 10027 (United States); Victor, B.; Morgan, K.; Hossack, A.; Sutherland, D. [HIT-SI Group, University of Washington, Seattle, Washington 98195 (United States); Jarboe, T.; Nelson, B. A. [HIT-SI Group, University of Washington, Seattle, Washington 98195 (United States); PSI-Center, University of Washington, Seattle, Washington 98195 (United States); Marklin, G. [PSI-Center, University of Washington, Seattle, Washington 98195 (United States)

    2015-05-15

    We present application of three scalar metrics derived from the Biorthogonal Decomposition (BD) technique to evaluate the level of agreement between macroscopic plasma dynamics in different data sets. BD decomposes large data sets, as produced by distributed diagnostic arrays, into principal mode structures without assumptions on spatial or temporal structure. These metrics have been applied to validation of the Hall-MHD model using experimental data from the Helicity Injected Torus with Steady Inductive helicity injection experiment. Each metric provides a measure of correlation between mode structures extracted from experimental data and simulations for an array of 192 surface-mounted magnetic probes. Numerical validation studies have been performed using the NIMROD code, where the injectors are modeled as boundary conditions on the flux conserver, and the PSI-TET code, where the entire plasma volume is treated. Initial results from a comprehensive validation study of high performance operation with different injector frequencies are presented, illustrating application of the BD method. Using a simplified (constant, uniform density and temperature) Hall-MHD model, simulation results agree with experimental observation for two of the three defined metrics when the injectors are driven with a frequency of 14.5 kHz.
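
    The BD of a space-time data matrix is its singular value decomposition, so a metric of the kind described (correlation between principal mode structures from experiment and simulation) can be sketched as follows; the synthetic 192-probe data and the specific correlation definition are assumptions for illustration.

```python
# The BD of a (probes x time) data matrix is its SVD; one illustrative
# metric correlates dominant spatial modes from experiment and simulation.
import numpy as np

def bd_modes(data, n_modes=4):
    """data: (n_probes, n_times) -> leading spatial modes (columns of U)."""
    u, s, _ = np.linalg.svd(data, full_matrices=False)
    return u[:, :n_modes]

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0e-3, 400)                       # 1 ms record
probes = np.arange(192)
base = np.outer(np.sin(2 * np.pi * probes / 192),
                np.sin(2 * np.pi * 14.5e3 * t))         # 14.5 kHz drive
exp_data = base + 0.05 * rng.normal(size=base.shape)
sim_data = 0.9 * base + 0.05 * rng.normal(size=base.shape)

u_exp, u_sim = bd_modes(exp_data), bd_modes(sim_data)
corr = np.abs(np.sum(u_exp * u_sim, axis=0))  # |cos| between matched modes
print("per-mode structure correlation:", np.round(corr, 3))
```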

  10. Experiments to populate and validate a processing model for polyurethane foam. BKC 44306 PMDI-10

    Energy Technology Data Exchange (ETDEWEB)

    Mondy, Lisa Ann [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rao, Rekha Ranjana [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelden, Bion [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Soehnel, Melissa Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); O' Hern, Timothy J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grillet, Anne [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Celina, Mathias Christopher [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wyatt, Nicholas B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Russick, Edward Mark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauer, Stephen J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hileman, Michael Bryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Urquhart, Alexander [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Thompson, Kyle Richard [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, David Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-03-01

    We are developing computational models to elucidate the expansion and dynamic filling process of a polyurethane foam, PMDI. The polyurethane of interest is chemically blown, where carbon dioxide is produced via the reaction of water, the blowing agent, and isocyanate. The isocyanate also reacts with polyol in a competing reaction, which produces the polymer. Here we detail the experiments needed to populate a processing model and provide parameters for the model based on these experiments. The model entails solving the conservation equations, including the equations of motion, an energy balance, and two rate equations for the polymerization and foaming reactions, following a simplified mathematical formalism that decouples these two reactions. Parameters for the polymerization kinetics model are reported based on infrared spectrophotometry. Parameters describing the gas generating reaction are reported based on measurements of volume, temperature and pressure evolution with time. A foam rheology model is proposed and parameters determined through steady-shear and oscillatory tests. Heat of reaction and heat capacity are determined through differential scanning calorimetry. Thermal conductivity of the foam as a function of density is measured using a transient method based on the theory of the transient plane source technique. Finally, density variations of the resulting solid foam in several simple geometries are directly measured by sectioning and sampling mass, as well as through x-ray computed tomography. These density measurements will be useful for model validation once the complete model is implemented in an engineering code.

  11. Experiments to populate and validate a processing model for polyurethane foam :

    Energy Technology Data Exchange (ETDEWEB)

    Mondy, Lisa Ann; Rao, Rekha Ranjana; Shelden, Bion; Soehnel, Melissa Marie; O' Hern, Timothy J.; Grillet, Anne; Celina, Mathias Christopher; Wyatt, Nicholas B.; Russick, Edward Mark; Bauer, Stephen J.; Hileman, Michael Bryan; Urquhart, Alexander; Thompson, Kyle Richard; Smith, David Michael

    2014-03-01

    We are developing computational models to elucidate the expansion and dynamic filling process of a polyurethane foam, PMDI. The polyurethane of interest is chemically blown, where carbon dioxide is produced via the reaction of water, the blowing agent, and isocyanate. The isocyanate also reacts with polyol in a competing reaction, which produces the polymer. Here we detail the experiments needed to populate a processing model and provide parameters for the model based on these experiments. The model entails solving the conservation equations, including the equations of motion, an energy balance, and two rate equations for the polymerization and foaming reactions, following a simplified mathematical formalism that decouples these two reactions. Parameters for the polymerization kinetics model are reported based on infrared spectrophotometry. Parameters describing the gas generating reaction are reported based on measurements of volume, temperature and pressure evolution with time. A foam rheology model is proposed and parameters determined through steady-shear and oscillatory tests. Heat of reaction and heat capacity are determined through differential scanning calorimetry. Thermal conductivity of the foam as a function of density is measured using a transient method based on the theory of the transient plane source technique. Finally, density variations of the resulting solid foam in several simple geometries are directly measured by sectioning and sampling mass, as well as through x-ray computed tomography. These density measurements will be useful for model validation once the complete model is implemented in an engineering code.
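
    The two decoupled rate equations mentioned in these records (one for the gelling polymerization, one for the water-isocyanate blowing reaction) can be illustrated with a toy integration; the nth-order Arrhenius forms and all numbers below are assumptions, not the report's fitted parameters.

```python
# Toy integration of two decoupled rate equations (gelling and blowing).
# Arrhenius forms and all numbers are assumptions, not fitted values.
import numpy as np
from scipy.integrate import solve_ivp

R_GAS = 8.314   # J/(mol K)

def rates(t, y, temp=330.0):
    x_gel, x_blow = y                                  # reaction extents, 0..1
    k_gel = 1.0e6 * np.exp(-60.0e3 / (R_GAS * temp))   # assumed A, Ea
    k_blow = 5.0e5 * np.exp(-55.0e3 / (R_GAS * temp))
    return [k_gel * (1.0 - x_gel) ** 2, k_blow * (1.0 - x_blow)]

sol = solve_ivp(rates, (0.0, 600.0), [0.0, 0.0])
print("extents of gel/blow reactions at t = 600 s:", np.round(sol.y[:, -1], 3))
```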

  12. Modeling short wave radiation and ground surface temperature: a validation experiment in the Western Alps

    Science.gov (United States)

    Pogliotti, P.; Cremonese, E.; Dallamico, M.; Gruber, S.; Migliavacca, M.; Morra di Cella, U.

    2009-12-01

    Permafrost distribution in high-mountain areas is influenced by topography (micro-climate) and by the high variability of ground-cover conditions. Its monitoring is very difficult due to logistical problems such as accessibility, costs, weather conditions, and reliability of instrumentation. For these reasons, physically based modeling of surface rock/ground temperatures (GST) is fundamental for the study of mountain permafrost dynamics. With this awareness, a 1D version of the GEOtop model (www.geotop.org) is tested at several high-mountain sites, and its accuracy in reproducing GST and incoming short-wave radiation (SWin) is evaluated using independent field measurements. In order to describe the influence of topography, both flat and near-vertical sites with different aspects are considered. Since validation of SWin is difficult on steep rock faces (due to the lack of direct measurements) and validation of GST is difficult on flat sites (due to the presence of snow), the two parameters are validated as independent experiments: SWin only on flat morphologies, GST only on the steep ones. The main purpose is to investigate the effect of (i) the distance between the driving meteo station and the simulation point, (ii) cloudiness, (iii) the aspect of the simulation point, and (iv) winter/summer period. The temporal duration of the model runs varies from 3 years for the SWin experiment to 8 years for the validation of GST. The model parameterization is constant and tuned for a common massive bedrock of crystalline rock like granite. The ground temperature profile is not initialized because rock temperature is measured at only 10 cm depth. A set of 9 performance measures is used for comparing model predictions and observations (including fractional mean bias (FB), coefficient of residual mass (CRM), mean absolute error (MAE), modelling efficiency (ME), and coefficient of determination (R2)). Results are very encouraging. For both experiments the distance (km) between location of the driving meteo
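
    Several of the performance measures listed in this abstract have standard definitions in the model-evaluation literature; a sketch using those common forms (the paper's exact formulas may differ slightly) is:

```python
# Common definitions of some measures named above (the paper's exact
# formulas may differ slightly).
import numpy as np

def metrics(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    resid = obs - sim
    return {
        "FB": 2.0 * (sim.mean() - obs.mean()) / (sim.mean() + obs.mean()),
        "CRM": resid.sum() / obs.sum(),
        "MAE": np.abs(resid).mean(),
        "ME": 1.0 - (resid**2).sum() / ((obs - obs.mean())**2).sum(),
        "R2": np.corrcoef(obs, sim)[0, 1] ** 2,
    }

print(metrics(obs=[1.0, 2.0, 3.0, 4.0], sim=[1.1, 1.9, 3.2, 3.8]))
```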

  13. Optimal Design and Model Validation for Combustion Experiments in a Shock Tube

    KAUST Repository

    Long, Quan

    2014-01-06

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments being carried out at the KAUST Clean Combustion Center. The unknown parameters are the pre-exponential factors and the activation energies in the reaction rate functions, and the control parameters are the initial hydrogen concentration and the temperature. First, we build a polynomial-based surrogate model for the observable related to the reactions in the shock tube. Second, we use a novel MAP-based approach to estimate the expected information gain of the proposed experiments and select the experimental set-ups corresponding to the optimal expected information gains. Third, we use synthetic data to carry out a virtual validation of our methodology.
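
    The quantity at the heart of this framework, the expected information gain of a candidate design, can be illustrated with a toy double-loop Monte Carlo estimator (the abstract's MAP-based approach is a cheaper approximation of the same quantity); the forward model, prior, and noise level below are invented.

```python
# Toy double-loop Monte Carlo estimate of the expected information gain
# (EIG).  Forward model, prior, and noise level are invented.
import numpy as np

rng = np.random.default_rng(2)

def forward(theta, d):
    return np.exp(-theta * d)            # toy observable at design point d

def eig(d, n_outer=200, n_inner=200, sigma=0.05):
    total = 0.0
    for th in rng.uniform(0.5, 2.0, n_outer):          # prior draws
        y = forward(th, d) + rng.normal(0.0, sigma)    # synthetic datum
        log_like = -0.5 * ((y - forward(th, d)) / sigma) ** 2
        inner = np.exp(-0.5 * ((y - forward(rng.uniform(0.5, 2.0, n_inner), d))
                               / sigma) ** 2)
        total += log_like - np.log(inner.mean())       # constants cancel
    return total / n_outer

for d in (0.2, 1.0, 3.0):
    print(f"design d={d}: EIG ~ {eig(d):.2f} nats")
```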

  14. Validating a dust production model by field experiment in Mu Us Desert, China

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Field experiments on dust production have seldom been performed in Chinese deserts, which are identified as one of the main dust sources in the world, and no such experiment to validate a dust production model (DPM) had been made in China until now. Saltation flux, dust emission flux, surface features, and meteorological parameters (U*, Z0, Ri, etc.) were investigated in the Mu Us Desert of China to verify the DPM and to accumulate dust emission data during the spring of 2002. The observed saltation fluxes, from 0.07 to 8.00 g·m-1·s-1, are in good agreement with those predicted by the DPM when the constant of the saltation flux equation is tuned to about 2.61, corresponding to wind friction velocities from 0.26 to 0.35 m·s-1. During three local dust emission events, however, the observed dust fluxes, from 1 to 3 μg·m-2·s-1, were lower than the modeled ones, implying that the model needs further improvement at lower wind velocities. In comparison with data from a sandy soil with a physical crust and from a loam soil, the saltation fluxes of the loose sandy soil in the Mu Us Desert are markedly higher, suggesting that deserts and desertified sandy land are major dust sources in northern China.
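
    A common form of the saltation-flux equation, White (1979), carries the leading constant that the abstract reports tuning to about 2.61; a sketch with an illustrative threshold friction velocity follows.

```python
# White (1979) form of the saltation-flux equation; the abstract reports
# tuning its leading constant to about 2.61.  The threshold friction
# velocity below is an illustrative value, not the paper's measurement.
def saltation_flux(u_star, u_star_t=0.22, c=2.61, rho=1.2, g=9.81):
    """Streamwise saltation flux Q (kg m^-1 s^-1); zero below threshold."""
    if u_star <= u_star_t:
        return 0.0
    r = u_star_t / u_star
    return c * (rho / g) * u_star**3 * (1.0 + r) * (1.0 - r**2)

for u in (0.26, 0.30, 0.35):   # the friction-velocity range reported above
    print(f"u* = {u:.2f} m/s -> Q = {1000.0 * saltation_flux(u):.2f} g m^-1 s^-1")
```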

  15. Parallel labeling experiments validate Clostridium acetobutylicum metabolic network model for (13)C metabolic flux analysis.

    Science.gov (United States)

    Au, Jennifer; Choi, Jungik; Jones, Shawn W; Venkataramanan, Keerthi P; Antoniewicz, Maciek R

    2014-11-01

    In this work, we provide new insights into the metabolism of Clostridium acetobutylicum ATCC 824 obtained using a systematic approach for quantifying fluxes based on parallel labeling experiments and (13)C-metabolic flux analysis ((13)C-MFA). Here, cells were grown in parallel cultures with [1-(13)C]glucose and [U-(13)C]glucose as tracers and (13)C-MFA was used to quantify intracellular metabolic fluxes. Several metabolic network models were compared: an initial model based on current knowledge, and extended network models that included additional reactions that improved the fits of experimental data. While the initial network model did not produce a statistically acceptable fit of (13)C-labeling data, an extended network model with five additional reactions was able to fit all data with 292 redundant measurements. The model was subsequently trimmed to produce a minimal network model of C. acetobutylicum for (13)C-MFA, which could still reproduce all of the experimental data. The flux results provided valuable new insights into the metabolism of C. acetobutylicum. First, we found that the TCA cycle was effectively incomplete, as there was no measurable flux between α-ketoglutarate and succinyl-CoA, succinate and fumarate, and malate and oxaloacetate. Second, an active pathway was identified from pyruvate to fumarate via aspartate. Third, we found that isoleucine was produced exclusively through the citramalate synthase pathway in C. acetobutylicum and that CAC3174 was likely responsible for citramalate synthase activity. These model predictions were confirmed in several follow-up tracer experiments. The validated metabolic network model established in this study can be used in future investigations for unbiased (13)C-flux measurements in C. acetobutylicum. Copyright © 2014 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  16. Validation of scaffold design optimization in bone tissue engineering: finite element modeling versus designed experiments.

    Science.gov (United States)

    Uth, Nicholas; Mueller, Jens; Smucker, Byran; Yousefi, Azizeh-Mitra

    2017-02-21

    This study reports the development of biological/synthetic scaffolds for bone tissue engineering (TE) via 3D bioplotting. These scaffolds were composed of poly(L-lactic-co-glycolic acid) (PLGA), type I collagen, and nano-hydroxyapatite (nHA) in an attempt to mimic the extracellular matrix of bone. The solvent used for processing the scaffolds was 1,1,1,3,3,3-hexafluoro-2-propanol. The produced scaffolds were characterized by scanning electron microscopy, microcomputed tomography, thermogravimetric analysis, and unconfined compression test. This study also sought to validate the use of finite-element optimization in COMSOL Multiphysics for scaffold design. Scaffold topology was simplified to three factors: nHA content, strand diameter, and strand spacing. These factors affect the ability of the scaffold to bear mechanical loads and how porous the structure can be. Twenty-four scaffolds were constructed according to an I-optimal, split-plot designed experiment (DE) in order to generate experimental models of the factor-response relationships. Within the design region, the DE and COMSOL models agreed in their recommended optimal nHA (30%) and strand diameter (460 μm). However, the two methods disagreed by more than 30% in strand spacing (908 μm for DE; 601 μm for COMSOL). Seven scaffolds were 3D-bioplotted to validate the predictions of DE and COMSOL models (4.5-9.9 MPa measured moduli). The predictions for these scaffolds showed relative agreement for scaffold porosity (mean absolute percentage error of 4% for DE and 13% for COMSOL), but were substantially poorer for scaffold modulus (51% for DE; 21% for COMSOL), partly due to some simplifying assumptions made by the models. Expanding the design region in future experiments (e.g., higher nHA content and strand diameter), developing an efficient solvent evaporation method, and exerting a greater control over layer overlap could allow developing PLGA-nHA-collagen scaffolds to meet the mechanical requirements for

  17. Validation of Global Ozone Monitoring Experiment ozone profiles and evaluation of stratospheric transport in a global chemistry transport model

    NARCIS (Netherlands)

    Laat, A.T.J.de; Landgraf, J.; Aben, I.; Hasekamp, O.; Bregman, B.

    2007-01-01

    This paper presents a validation of Global Ozone Monitoring Experiment (GOME) ozone (O3) profiles which are used to evaluate stratospheric transport in the chemistry transport model (CTM) Tracer Model version 5 (TM5) using a linearized stratospheric O3 chemistry scheme. A comparison of GOME O3 profi

  18. Validation of spectral gas radiation models under oxyfuel conditions. Part A: Gas cell experiments

    DEFF Research Database (Denmark)

    Becher, Valentin; Clausen, Sønnik; Fateev, Alexander;

    2011-01-01

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air, so standard CFD spectral gas radiation models for air combustion are outside their range of validity. This series of three articles provides a common spectral basis for the validation...

  19. A design of experiments approach to validation sampling for logistic regression modeling with error-prone medical records.

    Science.gov (United States)

    Ouyang, Liwen; Apley, Daniel W; Mehrotra, Sanjay

    2016-04-01

    Electronic medical record (EMR) databases offer significant potential for developing clinical hypotheses and identifying disease risk associations by fitting statistical models that capture the relationship between a binary response variable and a set of predictor variables representing clinical, phenotypical, and demographic data for the patient. However, EMR response data may be error prone for a variety of reasons, and performing a manual chart review to validate data accuracy is time consuming, which limits the number of chart reviews in a large database. The authors' objective is to develop a new design-of-experiments-based systematic chart validation and review (DSCVR) approach that is more powerful than the random validation sampling used in existing approaches. The DSCVR approach judiciously and efficiently selects the cases to validate (i.e., validate whether the response values are correct for those cases) for maximum information content, based only on their predictor variable values. The final predictive model is fit using only the validation sample, ignoring the remainder of the unvalidated and unreliable error-prone data. A Fisher-information-based D-optimality criterion is used, and an algorithm for optimizing it is developed. The method is tested in a simulation comparison based on a sudden cardiac arrest case study with 23 041 patients' records. The DSCVR approach, using the Fisher-information-based D-optimality criterion, results in a fitted model with much better predictive performance, as measured by the receiver operating characteristic curve and the accuracy in predicting whether a patient will experience the event, than a model fitted using a random validation sample. The simulation comparisons demonstrate that this DSCVR approach can produce predictive models that are significantly better than those produced from random validation sampling, especially when the event rate is low.
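
    The Fisher-information D-optimality idea can be illustrated with a greedy selection loop: repeatedly pick the unvalidated case whose predictor vector most increases the log-determinant of the accumulated logistic-regression information matrix. This is a sketch under assumed data and a placeholder plug-in fit, not the authors' algorithm.

```python
# Greedy sketch of Fisher-information-based D-optimal validation sampling in
# the spirit of DSCVR.  Data and plug-in coefficients are placeholders.
import numpy as np

rng = np.random.default_rng(3)
X = np.column_stack([np.ones(500), rng.normal(size=(500, 3))])  # predictors
beta = np.array([-2.0, 1.0, 0.5, -0.5])        # assumed plug-in fit
p = 1.0 / (1.0 + np.exp(-X @ beta))
w = p * (1.0 - p)                              # logistic information weights

selected = []
chosen = np.zeros(len(X), dtype=bool)
M = 1e-6 * np.eye(X.shape[1])                  # small ridge keeps det > 0
for _ in range(20):                            # budget of 20 chart reviews
    gains = np.full(len(X), -np.inf)
    for i in np.flatnonzero(~chosen):
        gains[i] = np.linalg.slogdet(M + w[i] * np.outer(X[i], X[i]))[1]
    best = int(np.argmax(gains))
    chosen[best] = True
    selected.append(best)
    M += w[best] * np.outer(X[best], X[best])

print("first validation cases chosen:", selected[:10])
```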

  20. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret; Yu, Yi-Hsiang

    2016-08-01

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation is necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.

  1. WEC-Sim Phase 1 Validation Testing: Numerical Modeling of Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret; Yu, Yi-Hsiang

    2016-06-24

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation is necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.

  2. Characterization of materials for a reactive transport model validation experiment: Interim report on the caisson experiment. Yucca Mountain Site Characterization Project

    Energy Technology Data Exchange (ETDEWEB)

    Siegel, M.D.; Cheng, W.C. [Sandia National Labs., Albuquerque, NM (United States); Ward, D.B.; Bryan, C.R. [Univ. of New Mexico, Albuquerque, NM (United States). Dept. of Earth and Planetary Sciences

    1995-08-01

    Models used in performance assessment and site characterization activities related to nuclear waste disposal rely on simplified representations of solute/rock interactions, hydrologic flow field and the material properties of the rock layers surrounding the repository. A crucial element in the design of these models is the validity of these simplifying assumptions. An intermediate-scale experiment is being carried out at the Experimental Engineered Test Facility at Los Alamos Laboratory by the Los Alamos and Sandia National Laboratories to develop a strategy to validate key geochemical and hydrological assumptions in performance assessment models used by the Yucca Mountain Site Characterization Project.

  3. Modelling the human pharyngeal airway: validation of numerical simulations using in vitro experiments

    CERN Document Server

    Chouly, Franz; Lagrée, Pierre-Yves; Pelorson, Xavier; Payan, Yohan; 10.1007/s11517-008-0412-1

    2008-01-01

    In this study, a numerical model that predicts the flow-induced collapse within the pharyngeal airway is validated using in vitro measurements. Theoretical simplifications were made to limit the computation time. Systematic comparisons between simulations and measurements were performed on an in vitro replica, which reflects asymmetries of the geometry and of the tissue properties at the base of the tongue under pathological conditions (strong initial obstruction). Partial obstruction is both observed and predicted, and the numerical model predicts the deformation with a mean quadratic error of 4.2% on the constriction area. This demonstrates the ability of the assumptions and method to predict the fluid-structure interaction accurately and quickly.

  4. Validation of SSiB Model over Grassland with CHeRES Field Experiment Data in 2001

    Institute of Scientific and Technical Information of China (English)

    孙岚; 薛永康

    2004-01-01

    The Simplified Simple Biosphere model (SSiB) is validated in off-line simulations against field measurements from the summer of 2001 from the China Heavy Rainfall Experiment and Study (CHeRES) over a grassland site located in the lower reaches of the Yangtze River. When initialized and driven by the observed atmospheric forcing, the model reproduced the observed surface heat fluxes and surface skin temperature realistically, and it was also able to simulate the variation of soil water content well. Sensitivity experiments found that leaf reflectance was the most significant parameter for improving the estimation of surface albedo during both wet and dry periods. This study suggests that the model is capable of simulating the physical processes and of assessing the impact of biophysical parameters related to land-atmosphere interactions over the eastern Asian monsoon regions, which is crucial for mesoscale atmospheric models.

  5. Experiments to Populate and Validate a Processing Model for Polyurethane Foam: Additional Data for Structural Foams

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Rekha R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Celina, Mathias C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Giron, Nicholas Henry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Long, Kevin Nicholas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Russick, Edward M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    We are developing computational models to help understand the manufacturing processes, final properties, and aging of structural polyurethane PMDI foam. The resulting model predictions of density and cure gradients from the manufacturing process will be used as input to foam heat transfer and mechanical models. BKC 44306 PMDI-10 and BKC 44307 PMDI-18 are the most prevalent foams used in structural parts. Experiments needed to parameterize models of the reaction kinetics and the equations of motion during the foam blowing stages were described for BKC 44306 PMDI-10 in the first report of this series (Mondy et al. 2014). BKC 44307 PMDI-18 is a new foam that will be used to make relatively dense structural supports via overpacking. It uses a different catalyst than those in the BKC 44306 family of foams; hence, we expect that the reaction kinetics models must be modified. Here we detail the experiments needed to characterize the reaction kinetics of BKC 44307 PMDI-18 and suggest parameters for the model based on these experiments. In addition, the second part of this report describes data taken to provide input to the preliminary nonlinear viscoelastic structural response model developed for BKC 44306 PMDI-10 foam. We show that the standard cure schedule used by KCP does not fully cure the material, and, upon temperature elevation above 150°C, oxidation or decomposition reactions occur that alter the composition of the foam. These findings suggest that achieving a fully cured foam part with this formulation may not be possible through thermal curing. As such, viscoelastic characterization procedures developed for curing thermosets can provide only approximate material properties, since the state of the material continuously evolves during tests.

  6. Experiments to Populate and Validate a Processing Model for Polyurethane Foam: Additional Data for Structural Foams.

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Rekha R.; Celina, Mathias C.; Giron, Nicholas Henry; Long, Kevin Nicholas; Russick, Edward M.

    2015-01-01

    We are developing computational models to help understand the manufacturing processes, final properties, and aging of structural polyurethane PMDI foam. The resulting model predictions of density and cure gradients from the manufacturing process will be used as input to foam heat transfer and mechanical models. BKC 44306 PMDI-10 and BKC 44307 PMDI-18 are the most prevalent foams used in structural parts. Experiments needed to parameterize models of the reaction kinetics and the equations of motion during the foam blowing stages were described for BKC 44306 PMDI-10 in the first report of this series (Mondy et al. 2014). BKC 44307 PMDI-18 is a new foam that will be used to make relatively dense structural supports via overpacking. It uses a different catalyst than those in the BKC 44306 family of foams; hence, we expect that the reaction kinetics models must be modified. Here we detail the experiments needed to characterize the reaction kinetics of BKC 44307 PMDI-18 and suggest parameters for the model based on these experiments. In addition, the second part of this report describes data taken to provide input to the preliminary nonlinear viscoelastic structural response model developed for BKC 44306 PMDI-10 foam. We show that the standard cure schedule used by KCP does not fully cure the material, and, upon temperature elevation above 150°C, oxidation or decomposition reactions occur that alter the composition of the foam. These findings suggest that achieving a fully cured foam part with this formulation may not be possible through thermal curing. As such, viscoelastic characterization procedures developed for curing thermosets can provide only approximate material properties, since the state of the material continuously evolves during tests.

  7. Direct-contact condensers for open-cycle OTEC applications: Model validation with fresh water experiments for structured packings

    Science.gov (United States)

    Bharathan, D.; Parsons, B. K.; Althof, J. A.

    1988-10-01

    The objective of the reported work was to develop analytical methods for evaluating the design and performance of advanced high-performance heat exchangers for use in open-cycle ocean thermal energy conversion (OC-OTEC) systems. This report describes the progress made in validating a one-dimensional, steady-state analytical computer model against fresh water experiments. The condenser model represents the state of the art in direct-contact heat exchange for condensation for OC-OTEC applications and is expected to provide a basis for optimizing OC-OTEC plant configurations. Using the model, we examined two condenser geometries, a cocurrent and a countercurrent configuration. This report provides detailed validation results for important condenser parameters for cocurrent and countercurrent flows. Based on the comparisons and the uncertainty overlap between the experimental data and predictions, the model is shown to predict critical condenser performance parameters with an uncertainty acceptable for general engineering design and performance evaluations.

  8. Direct-contact condensers for open-cycle OTEC applications: Model validation with fresh water experiments for structured packings

    Energy Technology Data Exchange (ETDEWEB)

    Bharathan, D.; Parsons, B.K.; Althof, J.A.

    1988-10-01

    The objective of the reported work was to develop analytical methods for evaluating the design and performance of advanced high-performance heat exchangers for use in open-cycle ocean thermal energy conversion (OC-OTEC) systems. This report describes the progress made in validating a one-dimensional, steady-state analytical computer model against fresh water experiments. The condenser model represents the state of the art in direct-contact heat exchange for condensation for OC-OTEC applications and is expected to provide a basis for optimizing OC-OTEC plant configurations. Using the model, we examined two condenser geometries, a cocurrent and a countercurrent configuration. This report provides detailed validation results for important condenser parameters for cocurrent and countercurrent flows. Based on the comparisons and the uncertainty overlap between the experimental data and predictions, the model is shown to predict critical condenser performance parameters with an uncertainty acceptable for general engineering design and performance evaluations. 33 refs., 69 figs., 38 tabs.

  9. A Large-Scale Multibody Manipulator Soft Sensor Model and Experiment Validation

    Directory of Open Access Journals (Sweden)

    Wu Ren

    2014-01-01

    Stress signals are difficult to obtain in the health monitoring of a multibody manipulator. To solve this problem, a soft sensor method is presented in which the stress signal is treated as the dominant variable and the angle signal as the auxiliary variable. By establishing the mathematical relationship between them, a soft sensor model is proposed in which stress information can be deduced from angle information, which is easily measured experimentally for such structures. Finally, tests under ground and wall working conditions were performed on a multibody manipulator test rig. The results show that the stress calculated by the proposed method is close to the measured one; thus the stress signal is easier to obtain than with the traditional method. This demonstrates that the model is correct and the method feasible.

  10. Validation of Wall Friction Model in SPACE-3D Module with Two-Phase Cross Flow Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Chi-Jin; Yang, Jin-Hwa; Cho, Hyoung-Kyu; Park, Goon-Cher [Seoul National University, Seoul (Korea, Republic of); Euh, Dong-Jin [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In the downcomer of the Advanced Power Reactor 1400 (APR1400), which has direct vessel injection (DVI) lines as an emergency core cooling system, multi-dimensional two-phase flow may occur during a Loss-of-Coolant Accident (LOCA), and its accurate prediction is highly relevant to evaluating the integrity of the reactor core. Considering the realistic phenomena in the reactor, recent trends in safety analysis codes have been to adopt multi-dimensional modules to simulate such complex flows more accurately; however, the models implemented in them are one-dimensional empirical models, so the constitutive models need to be validated before the multi-dimensional modules are applied. Yang performed an experiment investigating the two-dimensional film flow that simulates the two-phase cross flow in the upper downcomer and obtained local liquid film velocity and thickness data, from which the friction models in multi-dimensional modules of system analysis codes can be validated. In this study, SPACE-3D was used to simulate Yang's experiment, and the wall friction model used in SPACE-3D was validated by comparing the calculated local variables with the experimental results. Compared with the experiment, SPACE-3D underestimated the liquid film velocity and overestimated the liquid film thickness. These results indicate that the Wallis correlation, used as the wall friction model in SPACE-3D, overestimates the wall friction, whereas the H.T.F.S. correlation, used as the wall friction model in MARS-multiD, underestimates it.
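
    The record names the Wallis correlation as the SPACE-3D wall friction closure but does not state its form. The sketch below evaluates one widely quoted Wallis-type annular-flow friction factor purely to illustrate how such a one-dimensional empirical closure enters a shear-stress calculation; whether SPACE-3D uses exactly this expression is an assumption.

```python
# Widely quoted Wallis (1969) annular-flow friction factor, shown only to
# illustrate a one-dimensional empirical closure; whether SPACE-3D uses
# exactly this expression is an assumption.
def wallis_friction_factor(void_fraction):
    """f = 0.005 * (1 + 75 * (1 - alpha))."""
    return 0.005 * (1.0 + 75.0 * (1.0 - void_fraction))

def wall_shear(void_fraction, rho_gas, velocity):
    """Shear stress tau = f * 0.5 * rho * u^2 for this illustrative closure."""
    return wallis_friction_factor(void_fraction) * 0.5 * rho_gas * velocity**2

for alpha in (0.90, 0.95, 0.99):
    print(f"alpha = {alpha:.2f}: f = {wallis_friction_factor(alpha):.4f}, "
          f"tau = {wall_shear(alpha, 1.2, 10.0):.2f} Pa")
```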

  11. Investigation of density-dependent gas advection of trichloroethylene: Experiment and a model validation exercise

    Science.gov (United States)

    Lenhard, R. J.; Oostrom, M.; Simmons, C. S.; White, M. D.

    1995-07-01

    An experiment was conducted to evaluate whether vapor-density effects are significant in transporting volatile organic compounds (VOC's) with high vapor pressure and molecular mass through the subsurface. Trichloroethylene (TCE) was chosen for the investigation because it is a common VOC contaminant with high vapor pressure and molecular mass. For the investigation, a 2-m-long by 1-m-high by 7.5-cm-thick flow cell was constructed with a network of sampling ports. The flow cell was packed with sand, and a water table was established near the lower boundary. Liquid TCE was placed near the upper boundary of the flow cell in a chamber from which vapors could enter and migrate through the sand. TCE concentrations in the gas phase were measured by extracting 25-μl gas samples with an air-tight syringe and analyzing them with a gas chromatograph. The evolution of the TCE gas plume in the sand was investigated by examining plots of TCE concentrations over the domain for specific times and for particular locations as a function of time. To help in this analysis, a numerical model was developed that can predict the simultaneous movements of a gas, a nonaqueous liquid and water in porous media. The model also considers interphase mass transfer by employing the phase equilibrium assumption. The model was tested with one- and two-dimensional analytical solutions of fluid flow before it was used to simulate the experiment. Comparisons between experimental data and simulation results when vapor-density effects are considered were very good. When vapor-density effects were ignored, agreement was poor. These analyses suggest that vapor-density effects should be considered and that density-driven vapor advection may be an important mechanism for moving VOC's with high vapor pressures and molecular mass through the subsurface.
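
    The density-driven advection mechanism studied here can be checked with a back-of-the-envelope ideal-gas estimate: air saturated with TCE vapor is substantially denser than clean air. The vapor pressure used below is an approximate literature value for room temperature, and the calculation is illustrative only.

```python
# Ideal-gas check of the density effect: air saturated with TCE vapour is
# markedly denser than clean air.  Vapour pressure is an approximate value.
M_AIR, M_TCE = 0.02897, 0.13139        # molar masses, kg/mol
R, T, P = 8.314, 293.15, 101325.0      # J/(mol K), K, Pa
P_TCE = 7.8e3                          # Pa, approx. TCE vapour pressure at 20 C

y_tce = P_TCE / P                      # saturation mole fraction of TCE
m_mix = y_tce * M_TCE + (1.0 - y_tce) * M_AIR
rho_air = P * M_AIR / (R * T)
rho_mix = P * m_mix / (R * T)
print(f"air: {rho_air:.3f} kg/m^3, TCE-saturated air: {rho_mix:.3f} kg/m^3 "
      f"({100.0 * (rho_mix / rho_air - 1.0):.0f}% denser)")
```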

  12. Droplet evaporation from porous surfaces; model validation from field and wind tunnel experiments for sand and concrete

    Science.gov (United States)

    Griffiths, R. F.; Roberts, I. D.

    The evaporation model of Roberts and Griffiths (1995 Atmospheric Environment 29, 1307-1317) has been subjected to an extensive validation exercise based on a major campaign of field experiments on evaporation from surfaces composed of sand and of concrete. This complements the previous validation which was limited to wind tunnel experiments on sand surfaces. Additionally, the validation using wind tunnel data has been extended to include concrete surfaces. The model describes the constant-rate and falling-rate periods that characterise evaporation from porous media. During the constant-rate period, the evaporation is solely determined by the vapour transport rate into the air. During the falling-rate period, the process in the porous medium is modelled as a receding evaporation front, the overall evaporation rate being determined by the combined effects of vapour transport through the pore network and subsequently into the air. The field trials programme was conducted at sites in the USA and the UK, and examined the evaporation of diethyl malonate droplets from sand and concrete surfaces. Vapour concentrations at several heights in the plume were measured at the centre of a 1 m radius annular source (of width 10 cm) contaminated by uniformly sized droplets (2.4 or 4.1 mm in diameter), key meteorological data being measured at the same time. The evaporation was quantified by coupling concentration and wind speed data. In all, 22 trials were performed on sand and concrete; a further 8 were performed on non-porous surfaces (aluminium foil and slate) as references. The model performance was evaluated against the experimental data in terms of two quantities, the initial evaporation rate of the embedded droplets, and the mass-fraction remaining in the substrate at intervals over the evaporation episode. Overall, the model performance was best in the case of the field experiments for concrete, and the wind tunnel experiments for sand; the performance for wind tunnel
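
    A minimal numerical illustration of the two regimes the model describes: while liquid remains at the surface, the flux is fixed by the gas-side transfer resistance (constant-rate period); once the evaporation front recedes, an in-pore diffusive resistance adds in series and the rate falls with front depth. All coefficients below are invented for illustration and are not the fitted values of Roberts and Griffiths.

```python
# Two-regime sketch: gas-side resistance alone sets the constant-rate flux;
# a depth-proportional in-pore resistance then adds in series.
def evaporation_flux(front_depth, k_g=8.0e-3, d_eff=3.0e-6,
                     c_sat=0.06, c_inf=0.0):
    """Flux (kg m^-2 s^-1); k_g: gas-side transfer coefficient (m/s),
    d_eff: effective in-pore vapour diffusivity (m^2/s), c in kg/m^3."""
    return (c_sat - c_inf) / (1.0 / k_g + front_depth / d_eff)

for depth_mm in (0.0, 0.5, 2.0, 5.0):
    j = evaporation_flux(depth_mm / 1000.0)
    print(f"front at {depth_mm:.1f} mm -> flux {1.0e6 * j:.0f} mg m^-2 s^-1")
```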

  13. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, interest in computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged, but these views have not succeeded in capturing the diversity of validation methods. The wide variety of models with regard to their purpose, character, field of application, and time dimension inherently calls for a similar diversity in validation approaches; hitherto the discussion has been somewhat narrow-minded, reducing the notion of validation to the establishment of truth. This article puts forward the diversity in applications of simulation models that demands a corresponding diversity in the notion of validation. A classification of models in terms of the mentioned elements is presented and used to shed light on possible types of validation.

  14. Sodium vapor cell laser guide star experiments for continuous wave model validation

    Science.gov (United States)

    Pedreros Bustos, Felipe; Holzlöhner, Ronald; Budker, Dmitry; Lewis, Steffan; Rochester, Simon

    2016-07-01

    Recent numerical simulations and experiments on sodium Laser Guide Stars (LGS) have shown that a continuous wave (CW) laser with circular polarization and re-pumping should maximize the fluorescent photon return flux to the wavefront sensor for adaptive optics applications. The orientation and strength of the geomagnetic field in the sodium layer also play an important role, affecting the LGS return flux. Field measurements of the LGS return flux show agreement with the CW LGS model; however, fluctuations in the sodium column abundance and geomagnetic field intensity, as well as atmospheric turbulence, introduce experimental uncertainties. We describe a laboratory experiment to measure the photon return flux from a sodium vapor cell illuminated with a 589 nm CW laser beam, designed to approximately emulate an LGS under controlled conditions. Return flux measurements are carried out while controlling polarization, power density, re-pumping, laser linewidth, and magnetic field intensity and orientation. Comparisons with the numerical CW simulation package Atomic Density Matrix are presented and discussed.

  15. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  16. Validating An Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments

    Science.gov (United States)

    Catanzarite, Joseph; Burke, Christopher J.; Li, Jie; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division

    2016-06-01

    The Kepler Mission is developing an Analytic Completeness Model (ACM) to estimate detection completeness contours as a function of exoplanet radius and period for each target star. Accurate completeness contours are necessary for robust estimation of exoplanet occurrence rates. The main components of the ACM for a target star are: detection efficiency as a function of SNR, the window function (WF) and the one-sigma depth function (OSDF) (Ref. Burke et al. 2015). The WF captures the falloff in transit detection probability at long periods that is determined by the observation window (the duration over which the target star has been observed). The OSDF is the transit depth (in parts per million) that yields SNR of unity for the full transit train. It is a function of period, and accounts for the time-varying properties of the noise and for missing or deweighted data. We are performing flux-level transit injection (FLTI) experiments on selected Kepler target stars with the goal of refining and validating the ACM. “Flux-level” injection machinery inserts exoplanet transit signatures directly into the flux time series, as opposed to “pixel-level” injection, which inserts transit signatures into the individual pixels using the pixel response function. See Jie Li's poster: ID #2493668, "Flux-level transit injection experiments with the NASA Pleiades Supercomputer" for details, including performance statistics. Since FLTI is affordable for only a small subset of the Kepler targets, the ACM is designed to apply to most Kepler target stars. We validate this model using “deep” FLTI experiments, with ~500,000 injection realizations on each of a small number of targets and “shallow” FLTI experiments with ~2000 injection realizations on each of many targets. From the results of these experiments, we identify anomalous targets, model their behavior and refine the ACM accordingly. In this presentation, we discuss progress in validating and refining the ACM, and we
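
    The composition of the ACM described in the abstract (detection efficiency times window function, with SNR set by the one-sigma depth function) can be sketched as follows; every functional form and constant here is a placeholder, not a Kepler pipeline product.

```python
# Toy composition of the ACM described above: completeness(P, Rp) =
# detection_efficiency(SNR) * window_function(P), with SNR = depth / OSDF(P).
# Every functional form and constant is a placeholder.
import numpy as np
from scipy.stats import norm

R_STAR_IN_R_EARTH = 109.1                      # assume a Sun-like target star

def osdf_ppm(period):
    """Assumed one-sigma depth function: sensitivity degrades with period."""
    return 10.0 * np.sqrt(period / 10.0)

def window(period, t_obs=1460.0):
    """Assumed window function: fades when few transits fit in t_obs days."""
    return float(np.clip(t_obs / period - 2.0, 0.0, 1.0))

def detection_efficiency(snr, threshold=7.1, width=1.0):
    """Assumed sigmoid efficiency around a 7.1-sigma detection threshold."""
    return norm.cdf((snr - threshold) / width)

def completeness(period, rp_earth):
    depth_ppm = 1.0e6 * (rp_earth / R_STAR_IN_R_EARTH) ** 2
    return detection_efficiency(depth_ppm / osdf_ppm(period)) * window(period)

for p_days, r_earth in [(10, 1.0), (100, 1.0), (100, 2.0), (400, 2.0)]:
    print(f"P={p_days:4d} d, Rp={r_earth:.1f} Re -> "
          f"completeness {completeness(p_days, r_earth):.2f}")
```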

  17. Relation of validation experiments to applications.

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, James R. (New Mexico State University, Las Cruces, NM); Hills, Richard Guy

    2009-02-01

    Computational and mathematical models are developed in engineering to represent the behavior of physical systems to various system inputs and conditions. These models are often used to predict at other conditions, rather than to just reproduce the behavior of data obtained at the experimental conditions. For example, the boundary or initial conditions, time of prediction, geometry, material properties, and other model parameters can be different at test conditions than those for an anticipated application of a model. Situations for which the conditions may differ include those for which (1) one is in the design phase and a prototype of the system has not been constructed and tested under the anticipated conditions, (2) only one version of a final system can be built and destructive testing is not feasible, or (3) the anticipated design conditions are variable and one cannot easily reproduce the range of conditions with a limited number of carefully controlled experiments. Because data from these supporting experiments have value in model validation, even if the model was tested at different conditions than an anticipated application, methodology is required to evaluate the ability of the validation experiments to resolve the critical behavior for the anticipated application. The methodology presented uses models for the validation experiments and a model for the application to address how well the validation experiments resolve the application. More specifically, the methodology investigates the tradeoff that exists between the uncertainty (variability) in the behavior of the resolved critical variables for the anticipated application and the ability of the validation experiments to resolve this behavior. The important features of this approach are demonstrated through simple linear and non-linear heat conduction examples.
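
    As a concrete (and purely illustrative) version of this tradeoff, the sketch below calibrates a thermal conductivity from hypothetical validation measurements at test conditions and propagates the calibration uncertainty to a prediction at different application conditions; all numbers are invented.

        import numpy as np

        # 1D steady conduction through a slab: q = k * (T_hot - T_cold) / L.
        # Calibrate k from noisy validation measurements at test conditions,
        # then predict heat flux at (different) application conditions and
        # propagate the calibration uncertainty.
        rng = np.random.default_rng(0)

        L_test, dT_test = 0.02, 50.0                          # m, K (test)
        q_meas = 25_000.0 + 500.0 * rng.standard_normal(10)   # W/m^2, repeats

        k_samples = q_meas * L_test / dT_test                 # inferred k
        k_mean, k_std = k_samples.mean(), k_samples.std(ddof=1)

        L_app, dT_app = 0.05, 200.0                           # application
        q_app = k_samples * dT_app / L_app                    # pushed forward

        print(f"k = {k_mean:.1f} +/- {k_std:.1f} W/(m K)")
        print(f"q_app = {q_app.mean():.0f} +/- {q_app.std(ddof=1):.0f} W/m^2")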

  18. Plans and Specifications for a Full-Scale Towing Model Validation Experiment

    Science.gov (United States)

    1989-05-01

    [Only disjoint text fragments were captured for this record, including a citation to an MIT Department of Ocean Engineering doctoral dissertation on extreme tension statistics of towing hawsers, and excerpts from the test plan for a NAVSEA two-body towing experiment covering ship's speed-log readings at ten-minute intervals and environmental (wind) data logging.]

  19. Testing and validating environmental models

    Science.gov (United States)

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series
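
    The point about explicit benchmarks can be made concrete with a skill score measured against a stated alternative, rather than a bare claim of 'acceptable' agreement. A minimal sketch with invented data follows; when the benchmark is the observed mean, the score reduces to the Nash-Sutcliffe efficiency.

        import numpy as np

        def mse(pred, obs):
            return np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)

        def skill(pred, benchmark_pred, obs):
            # 1 -> perfect; 0 -> no better than the benchmark; < 0 -> worse.
            return 1.0 - mse(pred, obs) / mse(benchmark_pred, obs)

        obs   = np.array([2.1, 2.9, 4.2, 4.8, 6.1])
        model = np.array([2.0, 3.1, 4.0, 5.0, 6.0])
        naive = np.full_like(obs, obs.mean())   # benchmark: observed mean

        print(skill(model, naive, obs))  # vs the mean, this is Nash-Sutcliffe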

  20. RELAP5 Model Description and Validation for the BR2 Loss-of-Flow Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Licht, J. R. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States); Van den Branden, G. [Argonne National Lab. (ANL), Argonne, IL (United States); Sikik, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Koonen, E. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-07-01

    This paper presents a description of the RELAP5 model, the calibration method used to obtain the minor loss coefficients from the available hydraulic data and the LOFA simulation results compared to the 1963 experimental tests for HEU fuel.

  1. Model and experiences of initiating collaboration with traditional healers in validation of ethnomedicines for HIV/AIDS in Namibia

    Directory of Open Access Journals (Sweden)

    Chinsembu Kazhila C

    2009-10-01

    Many people with Human Immunodeficiency Virus/Acquired Immunodeficiency Syndrome (HIV/AIDS) in Namibia have access to antiretroviral drugs but some still use traditional medicines to treat opportunistic infections and offset side-effects from antiretroviral medication. Namibia has a rich biodiversity of indigenous plants that could contain novel anti-HIV agents. However, such medicinal plants have not been identified and properly documented. Various ethnomedicines used to treat HIV/AIDS opportunistic infections have not been scientifically validated for safety and efficacy. These limitations are mostly attributable to the lack of collaboration between biomedical scientists and traditional healers. This paper presents a five-step contextual model for initiating collaboration with Namibian traditional healers in order that candidate plants that may contain novel anti-HIV agents are identified, and traditional medicines used to treat HIV/AIDS opportunistic infections are subjected to scientific validation. The model includes key structures and processes used to initiate collaboration with traditional healers in Namibia; namely, the National Biosciences Forum, a steering committee with the University of Namibia (UNAM) as the focal point, a study tour to Zambia and South Africa where other collaborative frameworks were examined, commemorations of the African Traditional Medicine Day (ATMD), and consultations with stakeholders in north-eastern Namibia. Experiences from these structures and processes are discussed. All traditional healers in north-eastern Namibia were willing to collaborate with UNAM in order that their traditional medicines could be subjected to scientific validation. The current study provides a framework for future collaboration with traditional healers and the selection of candidate anti-HIV medicinal plants and ethnomedicines for scientific testing in Namibia.

  2. Model and experiences of initiating collaboration with traditional healers in validation of ethnomedicines for HIV/AIDS in Namibia.

    Science.gov (United States)

    Chinsembu, Kazhila C

    2009-10-23

    Many people with Human Immunodeficiency Virus/Acquired Immunodeficiency Syndrome (HIV/AIDS) in Namibia have access to antiretroviral drugs but some still use traditional medicines to treat opportunistic infections and offset side-effects from antiretroviral medication. Namibia has a rich biodiversity of indigenous plants that could contain novel anti-HIV agents. However, such medicinal plants have not been identified and properly documented. Various ethnomedicines used to treat HIV/AIDS opportunistic infections have not been scientifically validated for safety and efficacy. These limitations are mostly attributable to the lack of collaboration between biomedical scientists and traditional healers. This paper presents a five-step contextual model for initiating collaboration with Namibian traditional healers in order that candidate plants that may contain novel anti-HIV agents are identified, and traditional medicines used to treat HIV/AIDS opportunistic infections are subjected to scientific validation. The model includes key structures and processes used to initiate collaboration with traditional healers in Namibia; namely, the National Biosciences Forum, a steering committee with the University of Namibia (UNAM) as the focal point, a study tour to Zambia and South Africa where other collaborative frameworks were examined, commemorations of the African Traditional Medicine Day (ATMD), and consultations with stakeholders in north-eastern Namibia. Experiences from these structures and processes are discussed. All traditional healers in north-eastern Namibia were willing to collaborate with UNAM in order that their traditional medicines could be subjected to scientific validation. The current study provides a framework for future collaboration with traditional healers and the selection of candidate anti-HIV medicinal plants and ethnomedicines for scientific testing in Namibia.

  3. Computational model for nanocarrier binding to endothelium validated using in vivo, in vitro, and atomic force microscopy experiments.

    Science.gov (United States)

    Liu, Jin; Weller, Gregory E R; Zern, Blaine; Ayyaswamy, Portonovo S; Eckmann, David M; Muzykantov, Vladimir R; Radhakrishnan, Ravi

    2010-09-21

    A computational methodology based on Metropolis Monte Carlo (MC) and the weighted histogram analysis method (WHAM) has been developed to calculate the absolute binding free energy between functionalized nanocarriers (NC) and endothelial cell (EC) surfaces. The calculated NC binding free energy landscapes yield binding affinities that agree quantitatively with analogous measurements of the binding of specific antibody-coated NCs (100 nm in diameter) to intercellular adhesion molecule-1 (ICAM-1)-expressing EC surfaces in in vitro cell-culture experiments. The effect of antibody surface coverage (σ(s)) of the NC on binding simulations reveals a threshold σ(s) value below which the NC binding affinities drop drastically, falling below that of a single anti-ICAM-1 molecule binding to ICAM-1. The model suggests that the dominant effect of changing σ(s) around the threshold is a change in multivalent interactions; however, the losses in translational and rotational entropy are also important. Consideration of shear flow and glycocalyx does not alter the computed threshold of antibody surface coverage. The computed trend describing the effect of σ(s) on NC binding agrees remarkably well with experimental results of in vivo targeting of anti-ICAM-1-coated NCs to pulmonary endothelium in mice. Model results are further validated through close agreement between the computed NC rupture-force distribution and values measured in atomic force microscopy (AFM) experiments. The three-way quantitative agreement with AFM, in vitro (cell-culture), and in vivo experiments establishes the mechanical, thermodynamic, and physiological consistency of our model. Hence, our computational protocol represents a quantitative and predictive approach for model-driven design and optimization of functionalized nanocarriers in targeted vascular drug delivery.
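
    For readers unfamiliar with the sampling machinery, the toy sketch below runs a plain 1D Metropolis walk over an invented potential and recovers a free-energy profile from the sampled histogram; the actual study biases many windows along the binding coordinate and reconciles them with WHAM, which this sketch does not attempt.

        import numpy as np

        rng = np.random.default_rng(1)
        kT = 1.0
        U = lambda x: 0.5 * (x - 1.0)**2 - 2.0 * np.exp(-8.0 * x**2)  # toy well

        x, samples = 0.0, []
        for step in range(200_000):
            x_new = x + 0.2 * rng.standard_normal()
            # Metropolis rule: accept with probability min(1, exp(-dU/kT)).
            if rng.random() < np.exp(-(U(x_new) - U(x)) / kT):
                x = x_new
            samples.append(x)

        hist, edges = np.histogram(samples[10_000:], bins=60, density=True)
        centers = 0.5 * (edges[:-1] + edges[1:])
        F = -kT * np.log(np.clip(hist, 1e-12, None))  # F(x) = -kT ln p(x)
        print(centers[np.argmin(F)])   # location of the free-energy minimum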

  4. Well-characterized open pool experiment data and analysis for model validation and development.

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, David W.; Brown, Alexander L.; Blanchat, Thomas K.

    2006-12-01

    Four Well-Characterized Open Pool fires were conducted by the Fire Science and Technology Department. The focus of the Well-Characterized Open Pool fire series was to provide environmental information for open pool fires on a first-principles physics basis. The experiments measured the burning rate of liquid fuel in an open pool and the resultant heat flux to a weapon-sized object and the surrounding environment, with well-characterized boundary and initial conditions. Results presented in this report include a general description of test observations (pre- and post-test), wind measurements, fire plume topology, average fuel recession and heat release rates, and incident heat flux to the pool and to the calorimeters. As expected, results of the experiments show a strong correlation between wind conditions, fuel vaporization (mass loss) rate, and incident heat flux to the fuel and ground surface and calorimeters. Numerical fire simulations using both temporally and spatially dependent wind boundary conditions were performed using the Vulcan fire code. Comparisons of data to simulation predictions showed similar trends; however, simulation-predicted incident heat fluxes were lower than measured.

  5. Flight Experiments on Swept-Wing Roughness Receptivity: Validation Data for Modeling and Computations

    Science.gov (United States)

    2010-09-22

    [Only disjoint text fragments were captured for this record, including details of an infrared camera (photodetector cooled to 70 K by an onboard Stirling motor cooler, 0.02 C sensitivity at 30 C), a Kemo VBF44 bandpass filter, a coordinate system devised for the unconventional, vertically mounted airfoil, and a 16-bit Pressure Systems pressure scanner embedded in the model to reduce pressure lag in the tubing for the Cp flights.]

  6. A multimodal detection model of dolphins to estimate abundance validated by field experiments.

    Science.gov (United States)

    Akamatsu, Tomonari; Ura, Tamaki; Sugimatsu, Harumi; Bahl, Rajendar; Behera, Sandeep; Panda, Sudarsan; Khan, Muntaz; Kar, S K; Kar, C S; Kimura, Satoko; Sasaki-Yamamoto, Yukiko

    2013-09-01

    Abundance estimation of marine mammals requires matching detections of an animal or a group of animals by two independent means. A multimodal detection model using visual and acoustic cues (surfacing and phonation) that enables abundance estimation of dolphins is proposed. The method does not require a specific time window to match the cues of both means when applying the mark-recapture method. The proposed model was evaluated using data obtained in field observations of Ganges River dolphins and Irrawaddy dolphins, as examples of dispersed and condensed distributions of animals, respectively. The acoustic detection probability was approximately 80%, 20% higher than that of visual detection for both species, regardless of the distribution of the animals in the present study sites. The abundance estimates of Ganges River dolphins and Irrawaddy dolphins agreed fairly well with the numbers reported in previous monitoring studies. The detection probability for single animals was smaller than that for larger cluster sizes, as predicted by the model and confirmed by field data. However, dense groups of Irrawaddy dolphins showed differences in the cluster sizes observed by visual and acoustic methods. The lower detection probability of single clusters of this species seemed to be caused by its clumped distribution.
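
    The proposed model avoids the fixed time-window matching of classical approaches, but the underlying estimator family is two-sample mark-recapture. A minimal sketch of the bias-corrected (Chapman) form follows, with invented counts and an assumed independence between visual and acoustic detections:

        def chapman_estimate(n_visual, n_acoustic, n_both):
            # Bias-corrected Lincoln-Petersen (Chapman) abundance estimate,
            # treating the two detection methods as independent "capture"
            # occasions; n_both is the number of matched detections.
            return (n_visual + 1) * (n_acoustic + 1) / (n_both + 1) - 1

        # Hypothetical counts: 24 visual detections, 38 acoustic, 18 matched.
        print(chapman_estimate(24, 38, 18))  # ~50 animals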

  7. CFD Recombiner Modelling and Validation on the H2-Par and Kali-H2 Experiments

    Directory of Open Access Journals (Sweden)

    Stéphane Mimouni

    2011-01-01

    A large amount of hydrogen gas is expected to be released within the dry containment of a pressurized water reactor (PWR) shortly after the hypothetical beginning of a severe accident leading to the melting of the core. Depending on local gas concentrations, the gaseous mixture of hydrogen, air and steam can reach the flammability limit, threatening the containment integrity. In order to prevent mechanical loads resulting from a possible conflagration of the gas mixture, French and German reactor containments are equipped with passive autocatalytic recombiners (PARs), which preventively oxidize hydrogen at concentrations lower than the flammability limit. The objective of the paper is to present numerical assessments of the recombiner models implemented in the CFD solvers NEPTUNE_CFD and Code_Saturne. Under the EDF/EPRI agreement, CEA has been committed to performing 42 PAR tests. The experimental program, named KALI-H2, consists of checking the performance and behaviour of the PARs. Unrealistic values for the gas temperature are calculated if the conjugate heat transfer and the wall steam condensation are not taken into account. The combined effects of these models give good agreement between computational results and experimental data.

  8. Lung Motion Model Validation Experiments, Free-Breathing Tissue Densitometry, and Ventilation Mapping using Fast Helical CT Imaging

    Science.gov (United States)

    Dou, Hsiang-Tai

    geometries, employed as ground truth data. Image similarity between the simulated and ground truth scans was evaluated. The model validation experiments were conducted in a cohort of seventeen patients to assess the model robustness and inter-patient variation. The model error, averaged over multiple tracked positions from several breathing cycles, was found to be on the order of one millimeter. In modeling the density change under free-breathing conditions, the determinant of the Jacobian matrix of the registration-derived deformation vector field yielded volume change information for the lung tissues. Correlation of the Jacobian values with the corresponding voxel Hounsfield units (HU) reveals that the density variation for the majority of lung tissues can be very well described by a mass conservation relationship. Different tissue types were identified and separately modeled. Large trials of validation experiments were performed. The averaged deviation between the modeled and the reference lung density was 30 HU, which was estimated to be the background CT noise level. In characterizing lung ventilation function, a novel method was developed to determine the extent of lung tissue volume change. Information on volume change was derived from the deformable image registration of the fast helical CT images in terms of Jacobian values with respect to a reference image. Assuming the multiple volume change measurements are independently and identically distributed, a statistical formulation was derived to model the ventilation distribution of each lung voxel, and empirical minimum and maximum probability distributions of the Jacobian values were computed. The ventilation characteristic was evaluated as the difference of the expectation values from these extremal distributions. The resulting ventilation map was compared with an independently obtained ventilation image derived directly from the lung intensities, and good correlation was found using a statistical test. In addition, dynamic
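
    A minimal sketch of the Jacobian-determinant step described above, assuming a displacement field stored as a (3, nz, ny, nx) array in voxel units; the field below is an invented uniform stretch, not registration output.

        import numpy as np

        def jacobian_determinant(disp):
            # disp: displacement vector field, shape (3, nz, ny, nx).
            # J = det(I + grad(u)); J > 1 means local expansion (inhalation),
            # J < 1 compression; specific volume change is J - 1.
            grads = np.stack([np.stack(np.gradient(disp[i]), axis=0)
                              for i in range(3)], axis=0)   # (3, 3, nz, ny, nx)
            J = grads.transpose(2, 3, 4, 0, 1) + np.eye(3)  # move axes, add I
            return np.linalg.det(J)

        disp = np.zeros((3, 8, 8, 8))
        disp[2] = 0.05 * np.arange(8)   # uniform 5% stretch along x
        print(jacobian_determinant(disp).mean())  # ~1.05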

  9. Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    The purpose of this deliverable 2.5 is to use fresh experimental data for validation and selection of a flow model to be used for control design in WP3-4. Initially the idea was to investigate the models developed in WP2. However, in the project it was agreed to also include and focus on an additive [...] model, which turns out not to be useful for prediction of the flow. Moreover, standard Box-Jenkins model structures and multiple-output autoregressive models prove to be superior, as they can give useful predictions of the flow.

  10. Estimation of Land Surface Parameters by LDAS-UT: Model Development and Validation on Tanashi Field Experiment

    Science.gov (United States)

    Lu, H.; Koike, T.; Yang, K.; Li, X.; Graf, T.; Boussetta, S.; Tsutsui, H.; Kuria, D. N.

    2007-12-01

    The estimation of soil moisture and surface energy fluxes at various temporal and spatial scales remains an outstanding problem in hydrologic and meteorological research. Remotely sensed data retrieval algorithms, land surface models and data assimilation systems are expected to provide a solution to this problem, but the parameters required by those algorithms and systems, such as soil texture, porosity, and roughness parameters, are highly variable or unavailable. In this study, a land data assimilation system (LDAS-UT) is employed to inversely estimate the optimal values of those land surface parameters from meteorological forcing data and remotely sensed data, and a field experiment is designed to provide a well-controlled data set for system validation. The Tanashi experiment has been in operation since November 2006 at the farm of the University of Tokyo. Continuous ground measurements of meteorological variables, soil moisture and temperature profiles, and vegetation status have been taken over a plot in which winter wheat was planted. At the same time, ground-based microwave radiometers (GBMR) provide accurate field measurements of the brightness temperature up-welling from the plot at frequencies of 6.925, 10.65, 18.7, 23.8, 36.5 and 89 GHz. The LDAS-UT system is then run using data obtained from this experiment to retrieve parameters for two periods. The first is December 2006 to February 2007, the germination period of the winter wheat, during which vegetation effects are small. The second period is April to May 2007, during which the winter wheat was developing rapidly. The optimized parameters were compared with the in situ observed 'real' ones. It was found that, for the first period, the retrieved parameters are close to the 'real' values, while for the second period, the gap between the retrieved parameters and the 'real' values is much larger. The difference between the

  11. Validation of CTF Droplet Entrainment and Annular/Mist Closure Models using Riso Steam/Water Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Wysocki, Aaron J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Salko, Robert K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-01

    This report summarizes the work done to validate the droplet entrainment and de-entrainment models as well as two-phase closure models in the CTF code by comparison with experimental data obtained at Riso National Laboratory. The Riso data included a series of over 250 steam/water experiments that were performed in both tube and annulus geometries over a range of pressures and outlet qualities. Experimental conditions were set so that the majority of cases were in the annular/mist flow regime. Measurements included liquid film flow rate, droplet flow rate, film thickness, and two-phase pressure drop. CTF was used to model 180 of the tubular geometry cases, matching experimental geometry, outlet pressure, and outlet flow quality to experimental values. CTF results were compared to the experimental data at the outlet of the test section in terms of vapor and entrained liquid flow fractions, pressure drop per unit length, and liquid film thickness. The entire process of generating CTF input decks, running cases, extracting data, and generating comparison plots was scripted using Python and Matplotlib for a completely automated validation process. All test cases and scripting tools have been committed to the COBRA-TF master repository and selected cases have been added to the continuous testing system to serve as regression tests. The differences between the CTF- and experimentally-calculated flow fraction values were consistent with previous calculations by Wurtz, who applied the same entrainment correlation to the same data. It has been found that CTF's entrainment/de-entrainment predictive capability in the annular/mist flow regime for this particular facility is comparable to the licensed industry code, COBRAG. While film and droplet predictions are generally good, it has been found that accuracy is diminished at lower flow qualities. This finding is consistent with the noted deficiencies in the Wurtz entrainment model employed by CTF. The CTF predicted two-phase pressure drop in
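
    The automated comparison step described above lends itself to a very small script. The sketch below shows only the final tabulate-and-plot stage with invented film flow rates; the deck generation and code execution steps are omitted.

        import numpy as np
        import matplotlib.pyplot as plt

        # Hypothetical parity comparison of measured vs computed outlet film
        # flow rates; the four values below are invented placeholders.
        exp_film = np.array([0.012, 0.020, 0.035, 0.050])  # kg/s, measured
        ctf_film = np.array([0.013, 0.019, 0.038, 0.047])  # kg/s, computed

        rel_err = (ctf_film - exp_film) / exp_film
        plt.scatter(exp_film, ctf_film)
        plt.plot(exp_film, exp_film, "k--")   # perfect-agreement line
        plt.xlabel("measured film flow (kg/s)")
        plt.ylabel("computed film flow (kg/s)")
        plt.savefig("film_flow_parity.png")
        print(f"mean |rel err| = {np.abs(rel_err).mean():.1%}")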

  12. Modelling growth performance and feeding behaviour of Atlantic salmon (Salmo salar L.) in commercial-size aquaculture net pens: Model details and validation through full-scale experiments.

    Science.gov (United States)

    Føre, Martin; Alver, Morten; Alfredsen, Jo Arve; Marafioti, Giancarlo; Senneset, Gunnar; Birkevold, Jens; Willumsen, Finn Victor; Lange, Guttorm; Espmark, Åsa; Terjesen, Bendik Fyhn

    2016-11-01

    We have developed a mathematical model which estimates the growth performance of Atlantic salmon in aquaculture production units. The model consists of sub-models estimating the behaviour and energetics of the fish, the distribution of feed pellets, and the abiotic conditions in the water column. A field experiment in which three full-scale cages stocked with 120,000 salmon each (initial mean weight 72.1 ± 2.8 g SD) were monitored over six months was used to validate the model. The model was set up to simulate fish growth for all three cages using the feeding regimes and observed environmental data as input, and simulation results were compared with the experimental data. Experimental fish achieved end weights of 878, 849 and 739 g in the three cages, respectively. However, the fish contracted Pancreas Disease (PD) midway through the experiment, a factor which is expected to impair growth and increase mortality rate. The model was found able to predict growth rates for the initial period, when the fish appeared to be healthy. Since the effects of PD on fish performance are not modelled, growth rates were overestimated during the most severe disease period. This work illustrates how models can be powerful tools for predicting the performance of salmon in commercial production, and also implies their potential for predicting differences between commercial scale and smaller experimental scales. Furthermore, such models could be tools for early detection of disease outbreaks, as seen in the deviations between model and observations caused by the PD outbreak. A model could potentially also give indications of how the growth performance of the fish will suffer during such outbreaks.

  13. The role of CFD combustion modeling in hydrogen safety management – V: Validation for slow deflagrations in homogeneous hydrogen-air experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Holler, Tadej, E-mail: tadej.holler@ijs.si [Jozef Stefan Institute (JSI), Jamova cesta 39, 1000 Ljubljana (Slovenia); Kljenak, Ivo [Jozef Stefan Institute (JSI), Jamova cesta 39, 1000 Ljubljana (Slovenia); Komen, Ed [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands)

    2016-12-15

    Highlights: • Validation of the modeling approach for hydrogen deflagration is presented. • The modeling approach is based on two combustion models implemented in ANSYS Fluent. • Experiments with various initial hydrogen concentrations were used for validation. • The effects of heat transfer mechanism selection were also investigated. • A grid sensitivity analysis was performed as well. - Abstract: The control of hydrogen in the containment is an important safety issue following rapid oxidation of the uncovered reactor core during a severe accident in a Nuclear Power Plant (NPP), because dynamic pressure loads from possible hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. In a previous set of papers, a CFD-based method to assess the consequences of fast combustion of uniform hydrogen-air mixtures was presented, followed by its validation for hydrogen-air mixtures with diluents and for non-uniform hydrogen-air mixtures. In the present paper, the extension of this model to the slow deflagration regime is presented and validated using hydrogen deflagration experiments performed in the medium-scale experimental facility THAI. The proposed method is implemented in the CFD software ANSYS Fluent using user-defined functions. The paper describes the combustion model and the main results of code validation. It addresses questions regarding turbulence model selection, the effect of heat transfer mechanisms, and grid sensitivity, and provides insights into the importance of combustion model choice for the slow deflagration regime of hydrogen combustion in medium-scale and large-scale experimental vessels mimicking the NPP containment.

  14. Elements of a pragmatic approach for dealing with bias and uncertainty in experiments through predictions: experiment design and data conditioning; "real space" model validation and conditioning; hierarchical modeling and extrapolative prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose

    2011-11-01

    This report explores some important considerations in devising a practical and consistent framework and methodology for utilizing experiments and experimental data to support modeling and prediction. A pragmatic and versatile 'Real Space' approach is outlined for confronting experimental and modeling bias and uncertainty to mitigate risk in modeling and prediction. The elements of experiment design and data analysis, data conditioning, model conditioning, model validation, hierarchical modeling, and extrapolative prediction under uncertainty are examined. An appreciation can be gained for the constraints and difficulties at play in devising a viable end-to-end methodology. Rationale is given for the various choices underlying the Real Space end-to-end approach. The approach adopts and refines some elements and constructs from the literature and adds pivotal new elements and constructs. Crucially, the approach reflects a pragmatism and versatility derived from working many industrial-scale problems involving complex physics and constitutive models, steady-state and time-varying nonlinear behavior and boundary conditions, and various types of uncertainty in experiments and models. The framework benefits from a broad exposure to integrated experimental and modeling activities in the areas of heat transfer, solid and structural mechanics, irradiated electronics, and combustion in fluids and solids.

  15. Electrolysis Performance Improvement and Validation Experiment

    Science.gov (United States)

    Schubert, Franz H.

    1992-01-01

    Viewgraphs on electrolysis performance improvement and validation experiment are presented. Topics covered include: water electrolysis: an ever increasing need/role for space missions; static feed electrolysis (SFE) technology: a concept developed for space applications; experiment objectives: why test in microgravity environment; and experiment description: approach, hardware description, test sequence and schedule.

  16. ISOTHERMAL AIR-INGRESS VALIDATION EXPERIMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim

    2013-01-01

    Idaho National Laboratory has conducted air-ingress experiments as part of a campaign to validate computational fluid dynamics (CFD) calculations for very high-temperature gas-cooled reactor (VHTR) analysis. An isothermal test loop was designed to recreate the exchange or stratified flow that occurs in the lower plenum of a VHTR after a break in the primary loop allows helium to leak out and reactor building air to enter the reactor core. The experiment was designed to measure stratified flow in the inlet pipe connecting to the lower plenum of the General Atomics gas turbine–modular helium reactor (GT-MHR). Instead of helium and air, brine and sucrose were used as the heavy fluids, and water was used as the lighter fluid to create, using scaling laws, the appropriate flow characteristics of the lower plenum immediately after depressurization. The results clearly indicate that stratified flow is established even for very small density differences. Corresponding CFD results were validated against the experimental data. A grid sensitivity study of the CFD models was also performed using Richardson extrapolation and the grid convergence index method to assess the numerical accuracy of the CFD calculations. The calculated current speed showed very good agreement with the experimental data, indicating that current CFD methods are suitable for simulating density-gradient stratified flow phenomena in an air-ingress accident.
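
    A minimal sketch of the Richardson extrapolation and grid convergence index procedure mentioned above, following Roache's three-grid formulation; the solution values and refinement ratio are invented.

        import numpy as np

        f1, f2, f3 = 1.052, 1.060, 1.092   # fine, medium, coarse solutions
        r = 2.0                            # constant grid refinement ratio
        Fs = 1.25                          # safety factor for 3-grid studies

        p = np.log((f3 - f2) / (f2 - f1)) / np.log(r)        # observed order
        f_exact = f1 + (f1 - f2) / (r**p - 1.0)              # Richardson value
        gci_fine = Fs * abs((f2 - f1) / f1) / (r**p - 1.0)   # rel. uncertainty

        print(f"observed order p = {p:.2f}")
        print(f"extrapolated value = {f_exact:.4f}")
        print(f"GCI (fine grid) = {gci_fine:.2%}")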

  17. Model validation, science and application

    NARCIS (Netherlands)

    Builtjes, P.J.H.; Flossmann, A.

    1998-01-01

    In recent years there has been a growing interest in establishing proper validation of atmospheric chemistry-transport (ATC) models. Model validation deals with the comparison of model results with experimental data, and in this way addresses both model uncertainty and the uncertainty in, and adequacy of, the experimental data.

  18. Direct reforming of biogas on Ni-based SOFC anodes: Modelling of heterogeneous reactions and validation with experiments

    Science.gov (United States)

    Santarelli, Massimo; Quesito, Francesco; Novaresio, Valerio; Guerra, Cosimo; Lanzini, Andrea; Beretta, Davide

    2013-11-01

    This work focuses on the heterogeneous reactions taking place in a tubular anode-supported solid oxide fuel cell (SOFC) when the fuel is biogas from anaerobic digestion fed directly to the fuel cell. Operational maps of the fuel cell running on direct reforming of biogas were first obtained. A mathematical model incorporating the kinetics of the reforming reactions on the Ni catalyst was then used to predict the gas composition profile along the fuel channel. The model was validated against experimental data based on polarization curves. In addition, the anode off-gas composition was collected and analyzed with a gas chromatograph. Finally, the model was used to predict and analyze the gas composition change along the anode channel and to evaluate the effectiveness of direct steam reforming when varying cell temperature, inlet fuel composition and the type of reforming process. The simulation results confirmed that thermodynamic-equilibrium conditions are not fully achieved inside the anode channel. They also show that direct biogas utilization in an anode-supported SOFC can provide good performance and ensure good conversion of the methane even when the cell temperature is far from the nominal value.

  19. The role of CFD combustion modeling in hydrogen safety management – IV: Validation based on non-homogeneous hydrogen–air experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sathiah, Pratap, E-mail: pratap.sathiah78@gmail.com [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Komen, Ed, E-mail: komen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Roekaerts, Dirk, E-mail: d.j.e.m.roekaerts@tudelft.nl [Delft University of Technology, Department of Process and Energy, Section Fluid Mechanics, Mekelweg 2, 2628 CD Delft (Netherlands)

    2016-12-15

    Highlights: • The TFC combustion model is further extended to simulate flame propagation in non-homogeneous hydrogen–air mixtures. • TFC combustion model results are in good agreement with large-scale non-homogeneous hydrogen–air experiments. • The model is further extended to account for the effect of PARs on hydrogen deflagration in non-uniform hydrogen–air–steam mixtures. - Abstract: The control of hydrogen in the containment is an important safety issue in NPPs during a loss of coolant accident, because the dynamic pressure loads from hydrogen combustion can be detrimental to the structural integrity of the reactor safety systems and the reactor containment. In Sathiah et al. (2012b), we presented a computational fluid dynamics based method to assess the consequences of the combustion of uniform hydrogen–air mixtures. In the present article, the extension of this method to, and its validation for, non-uniform hydrogen–air mixtures is described. The method is implemented in the CFD software ANSYS FLUENT using user-defined functions. The extended code is validated against non-uniform hydrogen–air experiments in the ENACCEF facility. It is concluded that the maximum pressure and the intermediate peak pressure were predicted within 12% and 18% accuracy, respectively. The eigenfrequencies of the residual pressure wave phenomena were predicted within 4%. Overall, it is concluded that the current model predicts the considered ENACCEF experiments well.

  20. Validating Animal Models

    Directory of Open Access Journals (Sweden)

    Nina Atanasova

    2015-06-01

    In this paper, I respond to the challenge raised against contemporary experimental neurobiology according to which the field is in a state of crisis because of the multiple experimental protocols employed in different laboratories, which presumably preclude the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified, and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates thriving rather than crisis of experimental neurobiology.

  1. An attempt to calibrate and validate a simple ductile failure model against axial-torsion experiments on Al 6061-T651

    Energy Technology Data Exchange (ETDEWEB)

    Reedlunn, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lu, Wei -Yang [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-01-01

    This report details a work in progress. We have attempted to calibrate and validate a von Mises plasticity model with the Johnson-Cook failure criterion (Johnson & Cook, 1985) against a set of experiments on various specimens of Al 6061-T651. As will be shown, the effort was not successful, despite considerable attention to detail. When the model was compared against axial-torsion experiments on tubes, it overpredicted failure by 3× in tension, and never predicted failure in torsion, even when the tube was twisted 4× further than in the experiment. While this result is unfortunate, it is not surprising. Ductile failure is not well understood. In future work, we will explore whether more sophisticated material models of plasticity and failure will improve the predictions. Selecting the appropriate advanced material model and interpreting the results of said model are not trivial exercises, so it is worthwhile to fully investigate the behavior of a simple plasticity model before moving on to an anisotropic yield surface or a similarly complicated model.
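
    For reference, the Johnson-Cook failure criterion evaluates an equivalent plastic strain at failure as a function of stress triaxiality, strain rate, and homologous temperature. The sketch below implements that standard expression with invented D1-D5 constants; it is not the calibration attempted in the report.

        import numpy as np

        def jc_failure_strain(triaxiality, strain_rate=1.0, ref_rate=1.0,
                              T_star=0.0, D=(0.07, 0.13, -1.5, 0.011, 0.0)):
            # eps_f = [D1 + D2 exp(D3 s*)] [1 + D4 ln(rate ratio)] [1 + D5 T*]
            # The D constants here are illustrative placeholders, not a
            # calibration for Al 6061-T651.
            D1, D2, D3, D4, D5 = D
            return ((D1 + D2 * np.exp(D3 * triaxiality))
                    * (1.0 + D4 * np.log(max(strain_rate / ref_rate, 1e-12)))
                    * (1.0 + D5 * T_star))

        for sigma_star in (0.0, 0.33, 0.66):  # shear-like to tension-like
            print(sigma_star, jc_failure_strain(sigma_star))

    In the full criterion, failure is declared when the accumulated damage sum of plastic strain increments divided by the current failure strain reaches unity.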

  2. An Attempt to Calibrate and Validate a Simple Ductile Failure Model Against Axial-Torsion Experiments on Al 6061-T651.

    Energy Technology Data Exchange (ETDEWEB)

    Reedlunn, Benjamin; Lu, Wei-Yang [Sandia National Laboratories, Livermore, CA

    2015-01-01

    This report details a work in progress. We have attempted to calibrate and validate a von Mises plasticity model with the Johnson-Cook failure criterion (Johnson & Cook, 1985) against a set of experiments on various specimens of Al 6061-T651. As will be shown, the effort was not successful, despite considerable attention to detail. When the model was compared against axial-torsion experiments on tubes, it overpredicted failure by 3× in tension, and never predicted failure in torsion, even when the tube was twisted 4× further than in the experiment. While this result is unfortunate, it is not surprising. Ductile failure is not well understood. In future work, we will explore whether more sophisticated material models of plasticity and failure will improve the predictions. Selecting the appropriate advanced material model and interpreting the results of said model are not trivial exercises, so it is worthwhile to fully investigate the behavior of a simple plasticity model before moving on to an anisotropic yield surface or a similarly complicated model.

  3. The role of CFD combustion modelling in hydrogen safety management – VI: Validation for slow deflagration in homogeneous hydrogen-air-steam experiments

    Energy Technology Data Exchange (ETDEWEB)

    Cutrono Rakhimov, A., E-mail: cutrono@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Visser, D.C., E-mail: visser@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands); Holler, T., E-mail: tadej.holler@ijs.si [Jožef Stefan Institute (JSI), Jamova cesta 39, 1000 Ljubljana (Slovenia); Komen, E.M.J., E-mail: komen@nrg.eu [Nuclear Research and Consultancy Group (NRG), Westerduinweg 3, 1755 ZG Petten (Netherlands)

    2017-01-15

    Highlights: • Deflagration of homogeneous hydrogen-air-steam mixtures is modeled in a medium-scale containment. • Adaptive mesh refinement is applied at flame front positions. • The influence of steam on combustion modeling capabilities is investigated. • The mean pressure rise is predicted with 18% under-prediction when steam is involved. • The peak pressure is evaluated with 5% accuracy when steam is involved. - Abstract: Large quantities of hydrogen can be generated during a severe accident in a water-cooled nuclear reactor. When released in the containment, the hydrogen can create a potential deflagration risk. The dynamic pressure loads resulting from hydrogen combustion can be detrimental to the structural integrity of the reactor. Therefore, accurate prediction of these pressure loads is an important safety issue. In previous papers, we validated a Computational Fluid Dynamics (CFD) based method to determine the pressure loads from a fast deflagration. The combustion model applied in the CFD method is based on the Turbulent Flame Speed Closure (TFC). In our last paper, we presented the extension of this combustion model, the Extended Turbulent Flame Speed Closure (ETFC), and its validation against hydrogen deflagration experiments in the slow deflagration regime. During a severe accident, cooling water will enter the containment as steam. Therefore, the effect of steam on hydrogen deflagration is important to capture in a CFD model. The primary objectives of the present paper are to further validate the TFC and ETFC combustion models, and to investigate their capability to predict the effect of steam. The peak pressures, the trends of the flame velocity, and the pressure rise with an increase in the initial steam dilution are captured reasonably well by both combustion models. In addition, the ETFC model appeared to be more robust to mesh resolution changes. The mean pressure rise is evaluated with 18% under-prediction and the peak pressure is evaluated with 5% accuracy.

  4. Characterization of Aluminum Honeycomb and Experimentation for Model Development and Validation, Volume I: Discovery and Characterization Experiments for High-Density Aluminum Honeycomb

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Wei-Yang [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Mechanics of Materials; Korellis, John S. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Mechanics of Materials; Lee, Kenneth L. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Mechanics of Materials; Scheffel, Simon [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Mechanics of Materials; Hinnerichs, Terry Dean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Solid Mechanics; Neilsen, Michael K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Applied Mechanics Development; Scherzinger, William Mark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Solid Mechanics

    2006-08-01

    Honeycomb is a structure that consists of two-dimensional regular arrays of open cells. High-density aluminum honeycomb has been used in weapon assemblies to mitigate shock and protect payload because of its excellent crush properties. In order to use honeycomb efficiently and to certify the payload is protected by the honeycomb under various loading conditions, a validated honeycomb crush model is required and the mechanical properties of the honeycombs need to be fully characterized. Volume I of this report documents an experimental study of the crush behavior of high-density honeycombs. Two sets of honeycombs were included in this investigation: commercial grade for initial exploratory experiments, and weapon grade, which satisfied B61 specifications. This investigation also includes developing proper experimental methods for crush characterization, conducting discovery experiments to explore crush behaviors for model improvement, and identifying experimental and material uncertainties.

  5. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  6. Quantitative model validation techniques: new insights

    CERN Document Server

    Ling, You

    2012-01-01

    This paper develops new insights into quantitative methods for the validation of computational model prediction. Four types of methods are investigated, namely classical and Bayesian hypothesis testing, a reliability-based method, and an area metric-based method. Traditional Bayesian hypothesis testing is extended based on interval hypotheses on distribution parameters and equality hypotheses on probability distributions, in order to validate models with deterministic/stochastic output for given inputs. Two types of validation experiments are considered - fully characterized (all the model/experimental inputs are measured and reported as point values) and partially characterized (some of the model/experimental inputs are not measured or are reported as intervals). Bayesian hypothesis testing can minimize the risk in model selection by properly choosing the model acceptance threshold, and its results can be used in model averaging to avoid Type I/II errors. It is shown that Bayesian interval hypothesis testing...
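
    Of the four method types above, the area metric is the easiest to state compactly: it is the area between the model's output CDF and the empirical CDF of the observations, with a smaller area indicating better agreement. A minimal sketch with invented samples:

        import numpy as np

        def area_metric(model_samples, data_samples):
            # Area between the two empirical (right-continuous) step CDFs,
            # integrated over the union of sample points.
            m = np.sort(np.asarray(model_samples, float))
            d = np.sort(np.asarray(data_samples, float))
            grid = np.union1d(m, d)
            Fm = np.searchsorted(m, grid, side="right") / m.size
            Fd = np.searchsorted(d, grid, side="right") / d.size
            return np.sum(np.abs(Fm - Fd)[:-1] * np.diff(grid))

        model = np.random.default_rng(2).normal(10.0, 1.0, 5000)
        data = np.array([9.2, 10.4, 10.9, 11.3, 9.8])  # invented observations
        print(area_metric(model, data))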

  7. Validating Dart Model

    Directory of Open Access Journals (Sweden)

    Mazur Jolanta

    2014-12-01

    The primary objective of the study was to quantitatively test the DART model, which, despite being one of the most popular representations of the co-creation concept, was so far studied almost solely with qualitative methods. To this end, the researchers developed a multiple-item measurement scale and employed it in interviewing managers. The statistical evidence for the adequacy of the model was obtained through CFA with AMOS software. The findings suggest that the DART model may not be an accurate representation of co-creation practices in companies. From the data analysis it was evident that the building blocks of DART had too much conceptual overlap to form an effective framework for quantitative analysis. It was also implied that the phenomenon of co-creation is so rich and multifaceted that it may be more adequately captured by a measurement model in which co-creation is conceived as a third-level factor with two layers of intermediate latent variables.

  8. Stratification issues in the primary system. Review of available validation experiments and State-of-the-Art in modelling capabilities (StratRev)

    Energy Technology Data Exchange (ETDEWEB)

    Westin, J.; Henriksson, M. (Vattenfall Research and Development AB (Sweden)); Paettikangas, T. (VTT (Finland)); Toppila, T.; Raemae, T. (Fortum Nuclear Services Ltd (Finland)); Kudinov, P. (KTH Nuclear Power Safety (Sweden)); Anglart, H. (KTH Nuclear Reactor Technology (Sweden))

    2009-08-15

    The objective of the present report is to review available validation experiments and the state of the art in modelling of stratification and mixing in the primary system of Light Water Reactors. A topical workshop was arranged in Älvkarleby in June 2008 within the framework of BWR-OG, and the presentations from various utilities showed that stratification issues are not unusual and can cause costly stops in production. It is desirable to take actions in order to reduce the probability of stratification occurring, and to develop well-validated and accepted tools and procedures for analyzing upcoming stratification events. A research plan covering the main questions is outlined, and a few suggestions regarding more limited research activities are given. Since many of the stratification events result in thermal loads that are localized in time and space, CFD is a suitable tool. However, the often very large and complex geometry poses a great challenge to CFD, and it is important to perform a step-by-step increase in complexity with intermediate validation against relevant experimental data. The ultimate goal is to establish Best Practice Guidelines that can be followed both by utilities and authorities in case of an event including stratification and thermal loads. An extension of the existing Best Practice Guidelines for CFD in nuclear safety applications developed by OECD/NEA is thus suggested as a relevant target for a continuation project. (au)

  9. Agglomeration of Non-metallic Inclusions at Steel/Ar Interface: In- Situ Observation Experiments and Model Validation

    Science.gov (United States)

    Mu, Wangzhong; Dogan, Neslihan; Coley, Kenneth S.

    2017-10-01

    A better understanding of the agglomeration behavior of nonmetallic inclusions in the steelmaking process is important for controlling the cleanliness of the steel. In this work, a revision of the simplified Paunov model has been made according to the original Kralchevsky-Paunov model. This model has been applied to quantitatively calculate the attractive capillary force on inclusions agglomerating at the liquid steel/gas interface. Moreover, the agglomeration behavior of Al2O3 inclusions at a low carbon steel/Ar interface has been observed in situ by high-temperature confocal laser scanning microscopy (CLSM). The velocity and acceleration of inclusions and the attractive forces between Al2O3 inclusions of various sizes were calculated from the CLSM video. The results calculated using the revised model offered a reasonable fit with the present experimental data for different inclusion sizes. Moreover, a quantitative comparison was made between calculations using the equivalent radius of a circle and those using the effective radius. It was found that the capillary force calculated using the equivalent radius offered a better fit with the present experimental data because of the inclusion characteristics. Comparing these results with other studies in the literature allowed the authors to conclude that, when applied in capillary force calculations, the equivalent radius is more suitable for inclusions of large size and irregular shape, and the effective radius is more appropriate for inclusions of small size or with a large shape factor. Using this model, the effect of inclusion size on the attractive capillary force has been investigated, demonstrating that larger inclusions are more strongly attracted.

  10. Validation of a turbulent Kelvin-Helmholtz shear layer model using a high-energy-density OMEGA laser experiment.

    Science.gov (United States)

    Hurricane, O A; Smalyuk, V A; Raman, K; Schilling, O; Hansen, J F; Langstaff, G; Martinez, D; Park, H-S; Remington, B A; Robey, H F; Greenough, J A; Wallace, R; Di Stefano, C A; Drake, R P; Marion, D; Krauland, C M; Kuranz, C C

    2012-10-12

    Following the successful demonstration of an OMEGA laser-driven platform for generating and studying nearly two-dimensional unstable plasma shear layers [Hurricane et al., Phys. Plasmas 16, 056305 (2009); Harding et al., Phys. Rev. Lett. 103, 045005 (2009)], this Letter reports on the first quantitative measurement of turbulent mixing in a high-energy-density plasma. As a blast wave moves parallel to an unperturbed interface between a low-density foam and a high-density plastic, baroclinic vorticity is deposited at the interface and a Kelvin-Helmholtz instability-driven turbulent mixing layer is created in the postshock flow due to surface roughness. The spatial scale and density profile of the turbulent layer are diagnosed using x-ray radiography with sufficiently small uncertainty that the data can be used for quantitative model validation. The smallest observed spatial scales (~0.17 μm) in the postshock plasma flow are consistent with an "inertial subrange," within which a Kolmogorov turbulent energy cascade can be active. An illustration of comparing the data set with the predictions of a two-equation turbulence model in the ARES radiation hydrodynamics code is also presented.

  11. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods are weak in completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on the Signed Directed Graph (SDG) and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to the model. Then, complete testing scenarios are produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the method is demonstrated by carrying out validation for a reactor model.

  12. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  13. Feature extraction for structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois [Los Alamos National Laboratory; Farrar, Charles [Los Alamos National Laboratory; Park, Gyuhae [Los Alamos National Laboratory; Nishio, Mayuko [UNIV OF TOKYO; Worden, Keith [UNIV OF SHEFFIELD; Takeda, Nobuo [UNIV OF TOKYO

    2010-11-08

    This study focuses on defining and comparing response features that can be used for structural dynamics model validation studies. Features extracted from dynamic responses obtained analytically or experimentally, such as basic signal statistics, frequency spectra, and estimated time-series models, can be used to compare characteristics of structural system dynamics. By comparing response features extracted from experimental data and numerical outputs, validation and uncertainty quantification of a numerical model containing uncertain parameters can be realized. In this study, the applicability of some response features to model validation is first discussed using measured data from a simple test-bed structure and the associated numerical simulations of these experiments. Issues that must be considered include sensitivity, dimensionality, type of response, and the presence or absence of measurement noise in the response. Furthermore, we illustrate a method for comparing multivariate feature vectors for statistical model validation. Results show that the outlier detection technique using the Mahalanobis distance metric can serve as an effective and quantifiable means of selecting appropriate model parameters. However, in this process, one must consider not only the sensitivity of the features being used, but also the correlation of the parameters being compared.
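
    A minimal sketch of the Mahalanobis-distance check described above, with invented three-dimensional feature vectors; the 3.0 threshold is an arbitrary placeholder rather than a calibrated cut-off.

        import numpy as np

        def mahalanobis_distance(x, X):
            # Distance of feature vector x from the population X, scaled by
            # the population covariance; large values flag outliers.
            mu = X.mean(axis=0)
            cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
            diff = x - mu
            return float(np.sqrt(diff @ cov_inv @ diff))

        rng = np.random.default_rng(3)
        exp_features = rng.normal([1.0, 5.0, 0.2], [0.1, 0.5, 0.02],
                                  size=(30, 3))          # invented experiments
        model_features = np.array([1.05, 5.4, 0.21])     # invented model output

        d = mahalanobis_distance(model_features, exp_features)
        print(f"model is {'consistent' if d < 3.0 else 'an outlier'} "
              f"(d = {d:.2f})")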

  14. Validation of KENO V.a Comparison with Critical Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, W.C.

    1999-01-01

    Section 1 of this report documents the validation of KENO V.a against 258 critical experiments. The experiments considered were primarily high- or low-enriched uranium systems. The results indicate that the KENO V.a Monte Carlo Criticality Program accurately calculates a broad range of critical experiments. A substantial number of the calculations showed a positive or negative bias in excess of 1 1/2% in k-effective (k{sub eff}). Classes of critical experiments that show a bias include 3% enriched green blocks, highly enriched uranyl fluoride slab arrays, and highly enriched uranyl nitrate arrays. If these biases are properly taken into account, the KENO V.a code can be used with confidence for the design and criticality safety analysis of uranium-containing systems. Section 2 of this report documents the results of an investigation into the cause of the bias observed in Sect. 1. The results of this study indicate that the bias seen in Sect. 1 is caused by code bias, cross-section bias, reporting bias, and modeling bias. There is evidence that many of the experiments used in this validation and in previous validations are not adequately documented. The uncertainty in the experimental parameters overshadows bias caused by the code and cross sections and prohibits code validation to better than about 1% in k{sub eff}.

  15. Prognostics of Power Electronics, Methods and Validation Experiments

    Science.gov (United States)

    Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    Failure of electronic devices is a concern for future electric aircraft, which will see an increase in the electronics used to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of the remaining life of electronic components are of key importance. DC-DC power converters are power electronics systems typically employed as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems that accelerate the aging process of test devices while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed that make use of empirical degradation models and physics-inspired degradation models, with a focus on key components such as electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms for power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.

  16. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering], E-mail: jvargas@demec.ufpr.br; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems], Emails: ordonez@caps.fsu.edu, martins@caps.fsu.edu

    2009-07-01

    In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels and for spatial temperature gradients in the flow direction. These gradients are investigated by direct infrared imaging, which shows that they are present even at low-current operation and therefore should be considered by a PEMFC model, since large coolant flow rates are precluded by the high pressure drops they induce in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones, with good qualitative and quantitative agreement. The combination of accuracy and low computational time allows the future utilization of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)

  17. Validation for a recirculation model.

    Science.gov (United States)

    LaPuma, P T

    2001-04-01

    Recent Clean Air Act regulations designed to reduce volatile organic compound (VOC) emissions have placed new restrictions on painting operations. Treating large volumes of air containing dilute quantities of VOCs can be expensive. Recirculating some fraction of the air allows an operator to comply with environmental regulations at reduced cost. However, there is a potential impact on employee safety, because indoor pollutant levels will inevitably increase when air is recirculated. A computer model, written in Microsoft Excel 97, was developed to predict compliance costs and indoor air concentration changes with respect to changes in the level of recirculation for a given facility. The model predicts indoor air concentrations based on product usage and mass balance equations. This article validates the recirculation model using data collected from a C-130 aircraft painting facility at Hill Air Force Base, Utah. Air sampling data and air control cost quotes from vendors were collected for the Hill AFB painting facility and compared to the model's predictions. The model's predictions for strontium chromate and isocyanate air concentrations were generally between the maximum and minimum air sampling points, with a tendency to predict near the maximum sampling points. The model's capital cost predictions for a thermal VOC control device ranged from a 14 percent underestimate to a 50 percent overestimate of the average cost quotes. A sensitivity analysis of the variables is also included. The model is demonstrated to be a good evaluation tool for understanding the impact of recirculation.
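
    The core mass balance can be sketched as follows. This is a hedged, minimal steady-state version assuming a single well-mixed zone; the symbols, values, and treatment of the control device are illustrative, and the article's model additionally tracks costs and product usage schedules.

        # Steady-state, well-mixed mass balance with recirculation (illustrative).
        def indoor_concentration(G, Q, R, eta):
            """
            G   : pollutant generation rate (mg/min)
            Q   : total booth supply airflow (m^3/min)
            R   : recirculated fraction of supply air (0..1)
            eta : removal efficiency of the control device on the recirculated
                  stream (0..1); 0 for a compound the device does not capture
            Steady state: G + Q*R*(1 - eta)*C = Q*C
            """
            return G / (Q * (1.0 - R * (1.0 - eta)))

        c_fresh  = indoor_concentration(G=500.0, Q=2000.0, R=0.0, eta=0.0)
        c_recirc = indoor_concentration(G=500.0, Q=2000.0, R=0.7, eta=0.0)
        print(c_fresh, c_recirc)  # concentration rises as untreated air is recirculated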

  18. Validation of Magnetospheric Magnetohydrodynamic Models

    Science.gov (United States)

    Curtis, Brian

    Magnetospheric magnetohydrodynamic (MHD) models are commonly used for both prediction and modeling of Earth's magnetosphere. To date, very little validation has been performed to determine their limits, uncertainties, and differences. In this work, we performed a comprehensive analysis by applying several validation techniques commonly used in the atmospheric sciences to MHD-based models of Earth's magnetosphere for the first time. The validation techniques of parameter variability/sensitivity analysis and comparison to other models were used on the OpenGGCM, BATS-R-US, and SWMF magnetospheric MHD models to answer several questions about how these models compare. The questions include: (1) the difference between the models' predictions prior to and following a reversal of Bz in the upstream interplanetary magnetic field (IMF) from positive to negative, (2) the influence of the preconditioning duration, and (3) the differences between models under extreme solar wind conditions. A differencing visualization tool was developed and used to address these three questions. We find: (1) For a reversal in IMF Bz from positive to negative, the OpenGGCM magnetopause is closest to Earth as it has the weakest magnetic pressure near-Earth. The differences in magnetopause positions between BATS-R-US and SWMF are explained by the influence of the ring current, which is included in SWMF. Densities are highest for SWMF and lowest for OpenGGCM. The OpenGGCM tail currents differ significantly from BATS-R-US and SWMF; (2) A longer preconditioning time allowed the magnetosphere to relax more, giving different positions for the magnetopause with all three models before the IMF Bz reversal. There were differences greater than 100% for all three models before the IMF Bz reversal. The differences in the current sheet region for the OpenGGCM were small after the IMF Bz reversal. The BATS-R-US and SWMF differences decreased after the IMF Bz reversal to near zero; (3) For extreme conditions in the solar

  19. Experiments for the Validation of Debris and Shrapnel Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Koniges, A E; Andrew, J; Eder, D; Kalantar, D; Masters, N; Fisher, A; Anderson, R; Gunney, B; Brown, B; Sain, K; Tobin, A M; Debonnel, C; Gielle, A; Combis, P; Jadaud, J P; Meyers, M; Jarmakani, H

    2007-08-29

    The debris and shrapnel generated by laser targets are important factors in the operation of a large laser facility such as NIF, LMJ, and Orion. Past experience has shown that it is possible for such target debris to render diagnostics inoperable and also to penetrate or damage optical protection (debris) shields. We are developing the tools to allow evaluation of target configurations in order to better mitigate the generation and impact of debris, including development of dedicated modeling codes. In order to validate these predictive simulations, we briefly describe a series of experiments aimed at determining the amount of debris and/or shrapnel produced in controlled situations. We use glass and aerogel to capture generated debris/shrapnel. The experimental targets include hohlraums (halfraums) and thin foils in a variety of geometries. Post-shot analysis includes scanning electron microscopy and x-ray tomography. We show the results of some of these experiments and discuss modeling efforts.

  20. Software Validation via Model Animation

    Science.gov (United States)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
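
    The comparison step of the approach lends itself to a small sketch. The two functions below are stand-ins (in the paper, PVSio evaluates the actual formal models and the implementations are compiled code), and the relative tolerance policy is illustrative.

        # Compare an implementation against a formal-model oracle on random inputs.
        import random

        def formal_model(x, y):    # stand-in for the PVSio-evaluated specification
            return (x**2 + y**2) ** 0.5

        def implementation(x, y):  # stand-in for the floating-point production code
            return (x * x + y * y) ** 0.5

        REL_TOL = 1e-9
        for _ in range(10_000):
            x, y = random.uniform(-1e3, 1e3), random.uniform(-1e3, 1e3)
            a, b = formal_model(x, y), implementation(x, y)
            assert abs(a - b) <= REL_TOL * max(1.0, abs(a)), (x, y)
        print("implementation agrees with the formal model on all sampled inputs")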

  1. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  2. Obstructive lung disease models: what is valid?

    Science.gov (United States)

    Ferdinands, Jill M; Mannino, David M

    2008-12-01

    Use of disease simulation models has led to scrutiny of model methods and demand for evidence that models credibly simulate health outcomes. We sought to describe recent obstructive lung disease simulation models and their validation. Medline and EMBASE were used to identify obstructive lung disease simulation models published from January 2000 to June 2006. Publications were reviewed to assess model attributes and four types of validation: first-order (verification/debugging), second-order (comparison with studies used in model development), third-order (comparison with studies not used in model development), and predictive validity. Six asthma and seven chronic obstructive pulmonary disease models were identified. Seven (54%) models included second-order validation, typically by comparing observed outcomes to simulations of source study cohorts. Seven (54%) models included third-order validation, in which modeled outcomes were usually compared qualitatively for agreement with studies independent of the model. Validation endpoints included disease prevalence, exacerbation, and all-cause mortality. Validation was typically described as acceptable, despite near-universal absence of criteria for judging adequacy of validation. Although over half of recent obstructive lung disease simulation models report validation, inconsistencies in validation methods and lack of detailed reporting make assessing adequacy of validation difficult. For simulation modeling to be accepted as a tool for evaluating clinical and public health programs, models must be validated to credibly simulate health outcomes of interest. Defining the required level of validation and providing guidance for quantitative assessment and reporting of validation are important future steps in promoting simulation models as practical decision tools.

  3. Model Experiments and Model Descriptions

    Science.gov (United States)

    Jackman, Charles H.; Ko, Malcolm K. W.; Weisenstein, Debra; Scott, Courtney J.; Shia, Run-Lie; Rodriguez, Jose; Sze, N. D.; Vohralik, Peter; Randeniya, Lakshman; Plumb, Ian

    1999-01-01

    The Second Workshop on Stratospheric Models and Measurements (M&M II) is the continuation of the effort previously started in the first workshop (M&M I, Prather and Remsberg [1993]) held in 1992. As originally stated, the aim of M&M is to provide a foundation for establishing the credibility of stratospheric models used in environmental assessments of the ozone response to chlorofluorocarbons, aircraft emissions, and other climate-chemistry interactions. To accomplish this, a set of measurements of the present-day atmosphere was selected. The intent was that successful simulation of the set of measurements should become the prerequisite for the acceptance of these models as having reliable predictions of future ozone behavior. This section is divided in two: model experiments and model descriptions. In the model experiment part, participants were charged to design a number of experiments that would use observations to test whether models are using the correct mechanisms to simulate the distributions of ozone and other trace gases in the atmosphere. The purpose is closely tied to the need to reduce the uncertainties in the model-predicted responses of stratospheric ozone to perturbations. The specifications for the experiments were sent out to the modeling community in June 1997. Twenty-eight modeling groups responded to the request for input. The first part of this section discusses the different modeling groups, along with the experiments performed. Part two of this section gives brief descriptions of each model as provided by the individual modeling groups.

  4. Model validation of channel zapping quality

    OpenAIRE

    Kooij, R.; Nicolai, F.; Ahmed, K.; Brunnström, K.

    2009-01-01

    In an earlier paper we showed, that perceived quality of channel zapping is related to the perceived quality of download time of web browsing, as suggested by ITU-T Rec.G.1030. We showed this by performing subjective tests resulting in an excellent fit with a 0.99 correlation. This was what we call a lean forward experiment and gave the rule of thumb result that the zapping time must be less than 0.43 sec to be good ( > 3.5 on the MOS scale). To validate the model we have done new subjective ...

  5. Intercenter validation of a knowledge based model for automated planning of volumetric modulated arc therapy for prostate cancer. The experience of the German RapidPlan Consortium.

    Science.gov (United States)

    Schubert, Carolin; Waletzko, Oliver; Weiss, Christian; Voelzke, Dirk; Toperim, Sevda; Roeser, Arnd; Puccini, Silvia; Piroth, Marc; Mehrens, Christian; Kueter, Jan-Dirk; Hierholz, Kirsten; Gerull, Karsten; Fogliata, Antonella; Block, Andreas; Cozzi, Luca

    2017-01-01

    To evaluate the performance of a model-based optimisation process for volumetric modulated arc therapy applied to prostate cancer in a multicentric cooperative group. The RapidPlan (RP) knowledge-based engine was tested for the planning of volumetric modulated arc therapy with RapidArc on prostate cancer patients. The study was conducted in the frame of the German RapidPlan Consortium (GRC). 43 patients from one institute of the GRC were used to build and train an RP model. This was then shared with all members of the GRC plus an external site from a different country to increase the heterogeneity of the patient sampling. An in silico multicentric validation of the model was performed at the planning level by comparing RP against reference plans optimised according to institutional procedures. A total of 60 patients from 7 institutes were used. On average, the automated RP-based plans were fully consistent with the manually optimised set, with a modest tendency to improvement in the medium-to-high dose region. A per-site stratification identified different patterns of performance of the model, with some organs at risk better spared by the manual approach and others by the automated one, but in all cases the RP data fulfilled the clinical acceptability requirements. Discrepancies in performance were due to different contouring protocols or to different emphasis put on the optimisation of the manual cases. The multicentric validation demonstrated that patients from all participating centres could be satisfactorily optimised with the knowledge-based model. In the presence of possibly significant differences in contouring protocols, the automated plans, though acceptable and fulfilling the benchmark goals, might benefit from further fine tuning of the constraints. The study demonstrates that, at least for prostate cancer patients, it is possible to share models among different clinical institutes in a cooperative framework.

  6. [Catalonia's primary healthcare accreditation model: a valid model].

    Science.gov (United States)

    Davins, Josep; Gens, Montserrat; Pareja, Clara; Guzmán, Ramón; Marquet, Roser; Vallès, Roser

    2014-07-01

    There are few experiences of accreditation models validated by primary care teams (EAP). The aim of this study was to detail the process of design, development, and subsequent validation of the consensus EAP accreditation model of Catalonia. An Operating Committee of the Health Department of Catalonia revised the models proposed by the European Foundation for Quality Management, the Joint Commission International and the Institut Català de la Salut, and proposed 628 essential standards to the technical group (25 experts in primary care and quality of care) in order to establish consensus standards. The consensus document was piloted in 30 EAP for the purpose of validating the contents, testing the standards and identifying evidence. Finally, we conducted a survey to assess acceptance and validation of the document. The technical group agreed on a total of 414 essential standards, of which the pilot selected 379. Mean compliance with the standards of the final document in the 30 EAP was 70.4%; the "results" standards showed the worst fulfilment percentage. The survey showed that 83% of the EAP found the model useful and 78% found the content of the accreditation manual suitable as a tool to assess the quality of the EAP and identify opportunities for improvement. On the downside, they highlighted its complexity and laboriousness. We now have a model that fits the reality of the EAP and covers all issues relevant to the functioning of an excellent EAP. The model developed in Catalonia is easy to understand.

  7. Validity - a matter of resonant experience

    DEFF Research Database (Denmark)

    Revsbæk, Line

    across the researcher's past experience from the case study and her own life. The autobiographic way of analyzing conventional interview material is exemplified with a case of a junior researcher researching the newcomer innovation of others, drawing on her own experience of being a newcomer in a work community...

  8. A methodology for global validation of microarray experiments

    Directory of Open Access Journals (Sweden)

    Sladek Robert

    2006-07-01

    Background: DNA microarrays are popular tools for measuring the gene expression of biological samples. This ever-increasing popularity ensures that a large number of microarray studies are conducted, many of them with data publicly available for mining by other investigators. Under most circumstances, validation of differential expression of genes is performed on a gene-by-gene basis. Thus, it is not possible to generalize validation results to the remaining majority of non-validated genes or to evaluate the overall quality of these studies. Results: We present an approach for the global validation of DNA microarray experiments that allows researchers to evaluate the general quality of their experiment and to extrapolate validation results of a subset of genes to the remaining non-validated genes. We illustrate why the popular strategy of selecting only the most differentially expressed genes for validation generally fails as a global validation strategy and propose random-stratified sampling as a better gene selection method. We also illustrate the shortcomings of often-used validation indices such as overlap of significant effects and the correlation coefficient, and recommend the concordance correlation coefficient (CCC) as an alternative. Conclusion: We provide recommendations that will enhance validity checks of microarray experiments while minimizing the need to run a large number of labour-intensive individual validation assays.
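
    The recommended concordance correlation coefficient is straightforward to compute. Below is a hedged sketch using Lin's population-moment form; the paired fold-change values are synthetic, not data from the study.

        import numpy as np

        def concordance_correlation(x, y):
            """Lin's concordance correlation coefficient between two measurement sets."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            sxy = np.cov(x, y, bias=True)[0, 1]
            return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

        # e.g. log fold-changes: microarray vs. a validation assay (synthetic numbers)
        microarray = [1.2, -0.8, 0.3, 2.1, -1.5, 0.0]
        validation = [1.0, -1.1, 0.4, 1.8, -1.2, 0.2]
        print(f"CCC = {concordance_correlation(microarray, validation):.3f}")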

  9. Tracing Crop Nitrogen Dynamics on the Field-Scale by Combining Multisensoral EO Data with an Integrated Process Model- A Validation Experiment for Cereals in Southern Germany

    Science.gov (United States)

    Hank, Tobias B.; Bach, Heike; Danner, Martin; Hodrius, Martina; Mauser, Wolfram

    2016-08-01

    Nitrogen, being the basic element for the construction of plant proteins and pigments, is one of the most important production factors in agricultural cultivation. High-resolution and near real-time information on the nitrogen status of the soil is therefore of great interest for economically and ecologically optimized fertilizer planning and application. Unfortunately, nitrogen storage in the soil column cannot be directly observed with Earth Observation (EO) instruments. Advanced EO-supported process modelling approaches must therefore be applied that allow tracing the spatiotemporal dynamics of nitrogen transformation, translocation and transport in the soil and in the canopy. Before these models can be applied as decision support tools for smart farming, they must be carefully parameterized and validated. This study applies an advanced land surface process model (PROMET) to selected winter cereal fields in Southern Germany and correlates the model outputs with destructively sampled nitrogen data from the growing season of 2015 (17 sampling dates, 8 sample locations). The spatial parametrization of the process model is supported by assimilating eight satellite images (five Landsat 8 OLI and three RapidEye scenes). It was found that the model is capable of realistically tracing the temporal and spatial dynamics of aboveground nitrogen uptake and allocation (R2 = 0.84, RMSE = 31.3 kg ha-1).
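
    The reported agreement statistics are simple to reproduce on any paired set of observed and modeled values. A minimal sketch follows; the nitrogen values are synthetic placeholders, not the study's samples.

        import numpy as np

        def r2_rmse(observed, modeled):
            """Coefficient of determination and root-mean-square error."""
            obs, mod = np.asarray(observed, float), np.asarray(modeled, float)
            ss_res = np.sum((obs - mod) ** 2)
            ss_tot = np.sum((obs - obs.mean()) ** 2)
            return 1 - ss_res / ss_tot, np.sqrt(np.mean((obs - mod) ** 2))

        n_sampled = [60.0, 95.0, 120.0, 150.0, 180.0, 210.0]   # kg N/ha, destructive samples
        n_modeled = [55.0, 100.0, 110.0, 160.0, 170.0, 220.0]  # kg N/ha, model output
        r2, rmse = r2_rmse(n_sampled, n_modeled)
        print(f"R2 = {r2:.2f}, RMSE = {rmse:.1f} kg/ha")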

  10. Steam gasification of wood biomass in a fluidized biocatalytic system bed gasifier: A model development and validation using experiment and Boubaker Polynomials Expansion Scheme BPES

    Directory of Open Access Journals (Sweden)

    Luigi Vecchione

    2015-07-01

    One of the most important issues in biomass biocatalytic gasification is the correct prediction of gasification products, with particular attention to the Topping Atmosphere Residues (TARs). In this work, performed within the European FP7 UNIfHY project, we develop and experimentally validate a model capable of predicting the outputs, including TARs, of a steam fluidized-bed biomass gasifier. Pine wood was chosen as the biomass feedstock; the products obtained in pyrolysis tests are the relevant model input. Both the hydrodynamic and chemical properties of the reacting system are considered: the hydrodynamic approach is based on the two-phase theory of fluidization, while the chemical model is based on kinetic equations for the heterogeneous and homogeneous reactions. The differential equations derived for the gasifier at steady state were implemented in MATLAB. The solution was then carried out using the Boubaker Polynomials Expansion Scheme, varying the steam/biomass ratio (0.5-1) and the operating temperature (750-850°C). The comparison between model and experimental results showed that the model is able to predict gas mole fractions and production rates, including those of the most representative TAR compounds.

  11. Model validation of channel zapping quality

    Science.gov (United States)

    Kooij, Robert; Nicolai, Floris; Ahmed, Kamal; Brunnström, Kjell

    2009-02-01

    In an earlier paper we showed that the perceived quality of channel zapping is related to the perceived quality of the download time of web browsing, as suggested by ITU-T Rec. G.1030. We showed this by performing subjective tests resulting in an excellent fit with a 0.99 correlation. This was what we call a lean-forward experiment, and it gave the rule-of-thumb result that the zapping time must be less than 0.43 sec to be good (> 3.5 on the MOS scale). To validate the model we have done new subjective experiments. These experiments included lean-backward zapping, i.e. sitting on a sofa with a remote control. The subjects are more forgiving in this case and the requirement could be relaxed to 0.67 sec. We also conducted subjective experiments in which the zapping times vary. We found that the MOS rating decreases if zapping delay times are varying. In our experiments we assumed uniformly distributed delays, where the variance cannot be larger than the mean delay. We found that, in order to obtain a MOS rating of at least 3.5, the maximum allowed variance, and thus also the maximum allowed mean zapping delay, is 0.46 sec.

  12. Validation of wind loading codes by experiments

    NARCIS (Netherlands)

    Geurts, C.P.W.

    1998-01-01

    Between 1994 and 1997, full scale measurements of the wind and wind induced pressures were carried out on the main building of Eindhoven University of Technology. Simultaneously, a comparative wind tunnel experiment was performed in an atmospheric boundary layer wind tunnel. In this paper, the

  14. [Validity of psychoprophylaxis in obstetrics. Authors' experience].

    Science.gov (United States)

    D'Alfonso, A; Zaurito, V; Facchini, D; Di Stefano, L; Patacchiola, F; Cappa, F

    1990-12-01

    The authors report results based on 20 years of practice of obstetric psychoprophylaxis (PPO). Data on course attendance, frequency, the primiparae/pluriparae ratio, labour, and the timing and mode of delivery are assembled. Moreover, neonatal status at birth and at the 10th day of life is investigated. The data obtained were compared with a control group consisting of women who received no treatment before delivery. The acquired experience confirms the utility of PPO in ordinary clinical practice.

  15. On validation of multibody musculoskeletal models

    DEFF Research Database (Denmark)

    Lund, Morten Enemark; de Zee, Mark; Andersen, Michael Skipper;

    2012-01-01

    This paper reviews the opportunities to validate multibody musculoskeletal models in view of the current transition of musculoskeletal modelling from a research topic to a practical simulation tool in product design, healthcare and other important applications. This transition creates a new need for model validation. Opportunities for improvement of the validation of multibody musculoskeletal models are pointed out and directions for future research in the field are proposed. It is our hope that a more structured approach to model validation can help to improve the credibility of musculoskeletal models.

  16. Optimal Data Split Methodology for Model Validation

    CERN Document Server

    Morrison, Rebecca; Terejanu, Gabriel; Miki, Kenji; Prudhomme, Serge

    2011-01-01

    The decision to incorporate cross-validation into validation processes of mathematical models raises an immediate question: how should one partition the data into calibration and validation sets? We answer this question systematically: we present an algorithm to find the optimal partition of the data subject to certain constraints. While doing this, we address two critical issues: 1) that the model be evaluated with respect to predictions of a given quantity of interest and its ability to reproduce the data, and 2) that the model be highly challenged by the validation set, assuming it is properly informed by the calibration set. This framework also relies on the interaction between the experimentalist and/or modeler, who understand the physical system and the limitations of the model; the decision-maker, who understands and can quantify the cost of model failure; and the computational scientists, who strive to determine if the model satisfies both the modeler's and decision maker's requirements. We also note...
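
    A toy version of the partitioning question can make the idea concrete. The sketch below assumes a scalar experimental condition and a stand-in "challenge" criterion (how far validation points sit from the calibration set); the paper's actual criterion is tied to a quantity of interest and further constraints.

        import numpy as np
        from itertools import combinations

        rng = np.random.default_rng(1)
        X = rng.uniform(0, 10, size=12)  # 1-D experimental conditions (synthetic)

        def challenge_score(cal_idx, val_idx):
            # Nearest-neighbour gap: larger means the validation set extrapolates more.
            cal, val = X[list(cal_idx)], X[list(val_idx)]
            return min(abs(v - c) for v in val for c in cal)

        best = max(
            ((cal, tuple(i for i in range(12) if i not in cal))
             for cal in combinations(range(12), 8)),
            key=lambda split: challenge_score(*split),
        )
        print("calibration indices:", best[0], "validation indices:", best[1])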

  17. Validation of automated payload experiment tool

    Science.gov (United States)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1995-01-01

    The System Management and Production Laboratory, Research Institute, The University of Alabama in Huntsville (UAH), was tasked by the Microgravity Experiment Projects (MEP) Office of the Payload Projects Office (PPO) at Marshall Space Flight Center (MSFC) to conduct research into current methods of written documentation control and retrieval. The goals of this research were to determine the logical interrelationships within selected NASA documentation, and to expand on a previously developed prototype system to deliver a distributable, electronic knowledge-based system. This computer application would then be used to provide a 'paperless' interface between the appropriate parties for the required NASA documentation.

  18. Experiments for the validation of debris and shrapnel calculations

    Energy Technology Data Exchange (ETDEWEB)

    Koniges, A E; Eder, D; Kalantar, D; Masters, N; Fisher, A; Anderson, R; Gunney, B; Brown, B; Sain, K [LLNL Livermore, CA (United States); Debonnel, C S; Bonneau, F; Bourgade, J-L; Combis, P; Jadaud, J-P; Maroni; Ulmer, J-L [CEA/DIF (France); Andrew, J [AWE (United Kingdom); Chevalier, J-M; Geille, A; Raffestin, D [CEA/CESTA (France)], E-mail: koniges@llnl.gov (and others)

    2008-05-15

    The debris and shrapnel generated by laser targets will play an increasingly major role in the operation of large laser facilities such as NIF, LMJ, and Orion. Past experience has shown that it is possible for such target debris/shrapnel to render diagnostics inoperable and also to penetrate or damage optical protection (debris) shields. We are developing the tools to evaluate target configurations, in order to better mitigate the generation and impact of debris/shrapnel, including development of dedicated modelling codes. In order to validate these predictive simulations, we briefly describe a series of experiments aimed at determining the amount of debris and/or shrapnel produced in controlled situations. We use glass plates and aerogel to capture generated debris/shrapnel. The experimental targets include hohlraums, halfraums, and thin foils in a variety of geometries. Post-shot analysis includes scanning electron microscopy and x-ray tomography. We show results from a few of these experiments and discuss related modelling efforts.

  19. Validation of systems biology models

    NARCIS (Netherlands)

    Hasdemir, D.

    2015-01-01

    The paradigm shift from qualitative to quantitative analysis of biological systems brought a substantial number of modeling approaches to the stage of molecular biology research. These include but certainly are not limited to nonlinear kinetic models, static network models and models obtained by the

  20. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam University; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  1. Investigating Validity Evidence for the Experiences in Close Relationships-Revised Questionnaire

    Science.gov (United States)

    Fairchild, Amanda J.; Finney, Sara J.

    2006-01-01

    The current study gathered internal structural validity and external criterion validity evidence for the Experiences in Close Relationships-Revised Questionnaire (ECR-R) scores. Specifically, confirmatory factor analysis of the data provided general support for the hypothesized two-factor model, and hypothesized relationships with external…

  2. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits giving a syntactic structure to source and target models. However, semantic requirements also have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  3. Unit testing, model validation, and biological simulation

    Science.gov (United States)

    Watts, Mark D.; Ghayoomie, S. Vahid; Larson, Stephen D.; Gerkin, Richard C.

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software that expresses scientific models. PMID:27635225
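
    A model validation test of the kind described can be phrased as an ordinary unit-test assertion. The sketch below is hypothetical: the model function, the quantity, and the tolerance are invented for illustration and are not taken from OpenWorm.

        import unittest

        def predicted_resting_potential():
            return -68.0  # mV; stand-in for a call into the simulator

        class TestMembraneModel(unittest.TestCase):
            def test_resting_potential_matches_experiment(self):
                observed_mean, observed_sd = -70.0, 3.0  # mV, hypothetical data
                # Pass if the prediction falls within two standard deviations.
                self.assertAlmostEqual(
                    predicted_resting_potential(), observed_mean, delta=2 * observed_sd
                )

        if __name__ == "__main__":
            unittest.main()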

  4. Bridging experiments, models and simulations

    DEFF Research Database (Denmark)

    Carusi, Annamaria; Burrage, Kevin; Rodríguez, Blanca

    2012-01-01

    Computational models in physiology often integrate functional and structural information from a large range of spatiotemporal scales, from the ionic to the whole-organ level. Their sophistication raises both expectations and skepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present discussion covers: 1) the treatment of biovariability; 2) testing and developing robust techniques and tools as a prerequisite to conducting physiological investigations; 3) defining and adopting standards to facilitate the interoperability of experiments, models, and simulations; and 4) understanding physiological validation as an iterative process that contributes to defining the specific aspects of cardiac electrophysiology the MSE (model-simulation-experiment) system targets, rather than being only an external test, and that is driven by advances in experimental and computational methods and the combination of both.

  5. Contact Modelling in Resistance Welding, Part II: Experimental Validation

    DEFF Research Database (Denmark)

    Song, Quanfeng; Zhang, Wenqi; Bay, Niels

    2006-01-01

    Contact algorithms in resistance welding presented in the previous paper are experimentally validated in the present paper. In order to verify the mechanical contact algorithm, two types of experiments, i.e. sandwich upsetting of circular, cylindrical specimens and compression tests of discs with a solid ring projection towards a flat ring, are carried out at room temperature. The complete algorithm, involving not only the mechanical model but also the thermal and electrical models, is validated by projection welding experiments. The experimental results are in satisfactory agreement with the model predictions.

  6. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  7. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    D J Ewins

    2000-06-01

    In this paper, a review is presented of the various methods which are available for the purpose of performing a systematic comparison and correlation between two sets of vibration data. In the present case, the application of interest is in conducting this correlation process as a prelude to model correlation or updating activity.

  8. Models for Validation of Prior Learning (VPL)

    DEFF Research Database (Denmark)

    Ehlers, Søren

    What would once have been categorized as utopian can become realpolitik. Validation of Prior Learning (VPL) was, in Europe, mainly regarded as utopian, while universities in the United States of America (USA) were developing ways to award credit to students who came with experience from working life.

  9. Validation of a Model for Ice Formation around Finned Tubes

    Directory of Open Access Journals (Sweden)

    Kamal A. R. Ismail

    2016-09-01

    Phase change materials are an attractive option for thermal storage applications, but their main drawback is a slow thermal response during charging and discharging processes due to their low thermal conductivity. The present study validates a model developed by the authors some years ago that uses radial fins to improve the thermal performance of PCM in a horizontal storage system. The model for the radial finned tube is based on pure conduction and the enthalpy approach, and was discretized by the finite difference method. Experiments were performed specifically to validate the model and its numerical predictions.

  10. Validation of the Hot Strip Mill Model

    Energy Technology Data Exchange (ETDEWEB)

    Richard Shulkosky; David Rosberg; Jerrud Chapman

    2005-03-30

    The Hot Strip Mill Model (HSMM) is off-line, PC-based software originally developed by the University of British Columbia (UBC) and the National Institute of Standards and Technology (NIST) under the AISI/DOE Advanced Process Control Program. The HSMM was developed to predict the temperatures, deformations, microstructure evolution and mechanical properties of steel strip or plate rolled in a hot mill. INTEG process group inc. undertook the current task of enhancing and validating the technology. With the support of 5 North American steel producers, INTEG process group tested and validated the model using actual operating data from the steel plants and enhanced the model to improve prediction results.

  11. Ground-water models: Validate or invalidate

    Science.gov (United States)

    Bredehoeft, J.D.; Konikow, L.F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  12. The Mistra experiment for field containment code validation: first results

    Energy Technology Data Exchange (ETDEWEB)

    Caron-Charles, M.; Blumenfeld, L. [CEA Saclay, 91 - Gif sur Yvette (France)

    2001-07-01

    The MISTRA facility is a large-scale experiment designed for the validation of multi-dimensional thermal-hydraulics codes. A short description of the facility, the set-up of the instrumentation and the test program are presented. Then the first experimental results, on helium injection into the containment, and the corresponding calculations are detailed. (author)

  13. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    Science.gov (United States)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source of human exposure to potentially carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies seeking to better understand the process and guide field investigation. While multiple analytical and numerical models have been developed to simulate the vapor intrusion process, detailed validation of these models against well-controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and with soil gas flux and indoor air concentration measurements. In this work, we present an effort to validate a three-dimensional vapor intrusion model against a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
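
    The probabilistic step can be sketched by collapsing the 3-D simulator into a scalar attenuation-factor model. The lognormal parameters and source concentration below are invented for illustration; in the study the uncertain inputs feed the full numerical model.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        c_source = 100.0  # ug/m^3, soil-gas source concentration (assumed)
        # Uncertain attenuation factor: indoor concentration / source concentration.
        alpha = rng.lognormal(mean=np.log(1e-4), sigma=0.8, size=n)
        c_indoor = c_source * alpha

        print(f"median   = {np.median(c_indoor):.2e} ug/m^3")
        print(f"95th pct = {np.percentile(c_indoor, 95):.2e} ug/m^3")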

  14. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen Metro became operational in autumn 2002. We observed that forecasts from the demand sub-models agree well with the data from the 2000 national travel survey, with the mode choice forecasts in particular being a good match with the observed modal split. The results of the 2000 car assignment model matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade

  15. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model, using an example taken from a management study.
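
    Two quantitative performance measures commonly used when validating a logistic regression model can be computed directly: discrimination (AUC) and calibration-in-the-large. The sketch below uses synthetic data, and the choice of measures is a generic one, not necessarily the paper's exact set.

        import numpy as np

        rng = np.random.default_rng(7)
        p_hat = rng.uniform(0, 1, size=500)                    # predicted probabilities
        y = (rng.uniform(0, 1, size=500) < p_hat).astype(int)  # outcomes consistent with p_hat

        def auc(y, p):
            # Probability that a random event outranks a random non-event (ties count half).
            pos, neg = p[y == 1], p[y == 0]
            greater = (pos[:, None] > neg[None, :]).mean()
            ties = (pos[:, None] == neg[None, :]).mean()
            return greater + 0.5 * ties

        print(f"AUC ~ {auc(y, p_hat):.2f}")
        print(f"calibration-in-the-large: observed {y.mean():.2f} vs predicted {p_hat.mean():.2f}")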

  16. Regimes of validity for balanced models

    Science.gov (United States)

    Gent, Peter R.; McWilliams, James C.

    1983-07-01

    Scaling analyses are presented which delineate the atmospheric and oceanic regimes of validity for the family of balanced models described in Gent and McWilliams (1983a). The analyses follow and extend the classical work of Charney (1948) and others. The analyses use three non-dimensional parameters which represent the flow scale relative to the Earth's radius, the dominance of turbulent or wave-like processes, and the dominant component of the potential vorticity. For each regime, the models that are accurate both at leading order and through at least one higher order of accuracy in the appropriate small parameter are then identified. In particular, it is found that members of the balanced family are the appropriate models of higher-order accuracy over a broad range of parameter regimes. Examples are also given of particular atmospheric and oceanic phenomena which are in the regimes of validity for the different balanced models.

  17. Validation of Hadronic Models in GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Koi, Tatsumi; Wright, Dennis H.; /SLAC; Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; /CERN; Heikkinen, Aatos; /Helsinki Inst. of Phys.; Truscott, Pete; Lei, Fan; /QinetiQ; Wellisch, Hans-Peter

    2007-09-26

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

  18. Validation of hadronic models in GEANT4

    CERN Document Server

    Koi, Tatsumi; Folger, Günter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; Heikkinen, Aatos; Truscott, Pete; Lei, Fan; Wellisch, Hans-Peter

    2007-01-01

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin-target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

  19. Validation of the ostracism experience scale for adolescents.

    Science.gov (United States)

    Gilman, Rich; Carter-Sowell, Adrienne; Dewall, C Nathan; Adams, Ryan E; Carboni, Inga

    2013-06-01

    This study validates a new self-report measure, the Ostracism Experience Scale for Adolescents (OES-A). Nineteen items were tested on a sample of 876 high school seniors to assess 2 of the most common ostracism experiences: being actively excluded from the peer group and being largely ignored by others. Exploratory and confirmatory factor analyses, bivariate correlations, and hierarchical regression provided support for the construct validity of the measure. The findings provided psychometric support for the OES-A, which could be used in research into the nature and correlates of social ostracism among older adolescents when a brief self-report measure is needed. Further, the OES-A may help determine how social ostracism subtypes differentially predict health-compromising behaviors later in development, as well as factors that protect against the most pernicious effects of ostracism.

  20. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    In this research, we developed a plan for the experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry for purifying impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort in ER process optimization is made using computer models. A number of models have been developed for this purpose. But as validation of these models is incomplete and oftentimes problematic, the simulation results from these models are inherently uncertain.

  1. Data assimilation experiments with MPIESM climate model

    Directory of Open Access Journals (Sweden)

    Belyaev Konstantin

    2016-01-01

    Further development of a data assimilation technique and its application in numerical experiments with the state-of-the-art Max Planck Institute Earth System Model are reported. In particular, the stability problem of assimilation is posed and discussed. In the experiments, sea surface height data from the Archiving, Validating and Interpolating Satellite Ocean archive have been used. All computations were carried out on the cluster system of the German Climate Computing Center. The results of numerical experiments with and without assimilation were recorded and analyzed. Special attention has been focused on the Arctic zone. It is shown that the model tendencies coincide well with independent data.
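
    The assimilation update at the heart of such experiments can be illustrated generically. The sketch below is a simple optimal-interpolation step for a scalar observed quantity; the gains, covariances and values are illustrative and unrelated to the actual MPIESM scheme.

        import numpy as np

        # x_a = x_f + K (y - H x_f): correct a model forecast toward an observation.
        x_f = np.array([0.35, 0.42])   # forecast state, e.g. SSH at two grid points (m)
        H = np.array([[1.0, 0.0]])     # observation operator: first point is observed
        y = np.array([0.30])           # satellite SSH observation (m)

        sigma_f, sigma_o = 0.05, 0.02  # forecast / observation error std devs (assumed)
        P = sigma_f**2 * np.array([[1.0, 0.6], [0.6, 1.0]])  # forecast error covariance
        R = np.array([[sigma_o**2]])                         # observation error covariance

        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # gain matrix
        x_a = x_f + (K @ (y - H @ x_f))
        print("analysis state:", x_a)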

  2. Case study for model validation : assessing a model for thermal decomposition of polyurethane foam.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Leslie, Ian H. (New Mexico State University, Las Cruces, NM); Hobbs, Michael L.; Rutherford, Brian Milne; Hills, Richard Guy (New Mexico State University, Las Cruces, NM); Pilch, Martin M.

    2004-10-01

    A case study is reported to document the details of a validation process to assess the accuracy of a mathematical model representing experiments involving the thermal decomposition of polyurethane foam. The focus of the report is to work through a validation process. The process addresses the following activities. The intended application of the mathematical model is discussed to better understand the pertinent parameter space. The parameter space of the validation experiments is mapped to the application parameter space. The mathematical models, the computer code used to solve them, and the verification of that code are presented. Experimental data from two activities are used to validate the mathematical models. The first experiment assesses the chemistry model alone, and the second experiment assesses the model of coupled chemistry, conduction, and enclosure radiation. The model results of both experimental activities are summarized, and the uncertainty of the model in representing each experimental activity is estimated. The comparison between the experimental data and the model results is quantified with various metrics. After addressing these activities, an assessment of the process for the case study is given. Weaknesses in the process are discussed and lessons learned are summarized.

  3. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin.

    Science.gov (United States)

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2015-11-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research.
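
    The factor structure and the "complete mystical experience" criterion lend themselves to a compact scoring sketch. The grouping of the 30 items into four factors below matches the factor sizes commonly reported for the MEQ30, but the exact item assignment, the 0-5 response scale, and the 60% cut-off are assumptions for illustration; consult the paper for the validated scoring rule.

        import numpy as np

        # Hypothetical item grouping: 15 mystical, 6 positive mood,
        # 6 transcendence of time/space, 3 ineffability items.
        FACTORS = {"mystical": range(0, 15), "positive_mood": range(15, 21),
                   "time_space": range(21, 27), "ineffability": range(27, 30)}

        def factor_scores(responses, max_item=5.0):
            """Per-factor mean response as a fraction of the maximum rating."""
            r = np.asarray(responses, dtype=float)
            return {f: r[list(ix)].mean() / max_item for f, ix in FACTORS.items()}

        def complete_mystical_experience(responses, cutoff=0.6):
            """True when every factor reaches the (assumed) 60% threshold."""
            return all(s >= cutoff for s in factor_scores(responses).values())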

  4. Validation experiments for LBM simulations of electron beam melting

    Science.gov (United States)

    Ammer, Regina; Rüde, Ulrich; Markl, Matthias; Jüchter, Vera; Körner, Carolin

    2014-05-01

    This paper validates three-dimensional (3D) simulation results of electron beam melting (EBM) processes by comparing experimental and numerical data. The physical setup is presented, which is discretized by a 3D thermal lattice Boltzmann method (LBM). An experimental process window is used for the validation, depending on the line energy injected into the metal powder bed and the scan velocity of the electron beam. In the process window, the EBM products are classified into the categories porous, good, and swelling, depending on the quality of the surface. The same parameter sets are used to generate a numerical process window. A comparison of numerical and experimental process windows shows good agreement. This validates the EBM model and justifies simulations for future improvements of the EBM processes. In particular, numerical simulations can be used to explore future process window scenarios and find the best parameter set for a good surface quality and dense products.
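
    The process-window logic described here is easy to make concrete: the controlling quantity is the line energy, beam power divided by scan velocity, and a build is classified by where that value falls. A small sketch follows; the numerical thresholds are invented placeholders, not the published window boundaries.

        def line_energy(beam_power_w, scan_velocity_m_s):
            """Line energy injected into the powder bed, in J/m."""
            return beam_power_w / scan_velocity_m_s

        def classify(el_j_per_m, lo=300.0, hi=900.0):
            """Map a parameter set to the surface-quality categories."""
            if el_j_per_m < lo:
                return "porous"    # too little energy: incomplete melting
            if el_j_per_m > hi:
                return "swelling"  # too much energy: degraded surface
            return "good"

        print(classify(line_energy(600.0, 1.0)))  # "good" with these placeholder limits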

  5. Disruption Tolerant Networking Flight Validation Experiment on NASA's EPOXI Mission

    Science.gov (United States)

    Wyatt, Jay; Burleigh, Scott; Jones, Ross; Torgerson, Leigh; Wissler, Steve

    2009-01-01

    In October and November of 2008, the Jet Propulsion Laboratory installed and tested essential elements of Delay/Disruption Tolerant Networking (DTN) technology on the Deep Impact spacecraft. This experiment, called Deep Impact Network Experiment (DINET), was performed in close cooperation with the EPOXI project which has responsibility for the spacecraft. During DINET some 300 images were transmitted from the JPL nodes to the spacecraft. Then they were automatically forwarded from the spacecraft back to the JPL nodes, exercising DTN's bundle origination, transmission, acquisition, dynamic route computation, congestion control, prioritization, custody transfer, and automatic retransmission procedures, both on the spacecraft and on the ground, over a period of 27 days. All transmitted bundles were successfully received, without corruption. The DINET experiment demonstrated DTN readiness for operational use in space missions. This activity was part of a larger NASA space DTN development program to mature DTN to flight readiness for a wide variety of mission types by the end of 2011. This paper describes the DTN protocols, the flight demo implementation, validation metrics which were created for the experiment, and validation results.
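
    The custody-transfer and retransmission behavior exercised by DINET can be illustrated with a toy store-and-forward loop: a sender keeps retransmitting a bundle until the next custodian acknowledges it. This is an illustrative sketch only, with an invented loss probability, not the flight implementation of the Bundle Protocol.

        import random

        def send_with_custody(bundle_id, p_loss=0.3, max_tries=10,
                              rng=random.Random(42)):
            """Retransmit until the next-hop custodian accepts the bundle."""
            for attempt in range(1, max_tries + 1):
                if rng.random() > p_loss:   # the link delivered the bundle
                    return attempt          # custody transferred; sender may discard
            raise TimeoutError(f"bundle {bundle_id}: custody never transferred")

        tries = [send_with_custody(i) for i in range(300)]  # 300 images, as in DINET
        print(sum(tries) / len(tries))  # mean transmissions/bundle, ~1/(1 - p_loss)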

  7. Validity of the Bersohn–Zewail model beyond justification

    DEFF Research Database (Denmark)

    Petersen, Jakob; Henriksen, Niels Engholm; Møller, Klaus Braagaard

    2012-01-01

    The absorption of probe pulses in ultrafast pump–probe experiments can be determined from the Bersohn–Zewail (BZ) model. The model relies on classical mechanics to describe the dynamics of the nuclei in the excited electronic state prepared by the ultrashort pump pulse. The BZ model provides excellent agreement between the classical trajectory and the average position of the excited state wave packet. By investigating the approximations connecting the nuclear dynamics described by quantum mechanics and the BZ model, we conclude that this agreement goes far beyond the validity of the individual...
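
    The agreement between the classical trajectory and the wave-packet mean can be checked in a few lines for the harmonic case, where Ehrenfest's theorem makes the two coincide exactly; the BZ claim is that the agreement persists well beyond this regime. A velocity-Verlet sketch (unit mass, arbitrary units; all parameter values invented):

        import numpy as np

        omega, x0, dt, n = 2.0, 1.0, 0.001, 5000
        x, v, traj = x0, 0.0, []
        for _ in range(n):                      # velocity-Verlet integration
            a = -omega**2 * x                   # F = -m*omega^2*x with m = 1
            x += v * dt + 0.5 * a * dt**2
            v += 0.5 * (a - omega**2 * x) * dt  # average of old and new force
            traj.append(x)

        t = np.arange(1, n + 1) * dt
        mean_x = x0 * np.cos(omega * t)         # quantum <x(t)> for this potential
        print(np.max(np.abs(np.array(traj) - mean_x)))  # tiny: the curves coincide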

  8. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
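
    The core of any hydrostatic column calculation is simple: the wellhead pressure equals the pressure at cavern depth minus the summed hydrostatic head of the stacked fluid columns. A minimal sketch, where all densities, heights, and the cavern pressure are invented example values rather than SPR data:

        G = 9.81  # gravitational acceleration, m/s^2

        def wellhead_pressure(p_cavern_pa, columns):
            """columns: list of (density kg/m^3, height m), top of well downwards."""
            head = sum(rho * G * h for rho, h in columns)
            return p_cavern_pa - head

        # Toy well: 200 m nitrogen over 400 m crude oil over 100 m brine
        p_wh = wellhead_pressure(12e6, [(90.0, 200.0), (850.0, 400.0), (1200.0, 100.0)])
        print(f"{p_wh / 1e6:.2f} MPa at the wellhead")

    In a small-leak scenario the nitrogen interface drifts, changing the column heights and hence the predicted wellhead pressure, which is the kind of signature such a model is built to discriminate.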

  9. Argonne Bubble Experiment Thermal Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Buechler, Cynthia Eileen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-03

    This report will describe the Computational Fluid Dynamics (CFD) model that was developed to calculate the temperatures and gas volume fractions in the solution vessel during the irradiation. It is based on the model used to calculate temperatures and volume fractions in an annular vessel containing an aqueous solution of uranium. The experiment was repeated at several electron beam power levels, but the CFD analysis was performed only for the 12 kW irradiation, because this experiment came the closest to reaching a steady-state condition. The aim of the study is to compare results of the calculation with experimental measurements to determine the validity of the CFD model.

  10. The Outcomes and Experiences Questionnaire: development and validation

    Directory of Open Access Journals (Sweden)

    Gibbons E

    2015-07-01

    Full Text Available Elizabeth Gibbons, Paul Hewitson, David Morley, Crispin Jenkinson, Ray Fitzpatrick Health Service Research Unit, Nuffield Department of Population Health, University of Oxford, Oxford, UK Background: This report presents evidence regarding the development and validation of a new questionnaire, the Outcomes and Experiences Questionnaire (OEQ). The rationale for the questionnaire is to bring together into one short instrument questions about two distinct domains – patients' reports of the outcomes of their care and how they experience care. Methods: The OEQ was developed from literature reviews, iterative drafting and discussion within the research group, and cognitive testing with a sample of patients who had a hospital experience. Two validation studies were carried out with an eleven-item OEQ. The goals of the studies were to examine response rates and to test specific hypotheses of how the OEQ should relate to other variables normally collected in the two studies. In the first study, the OEQ was added to the follow-up questionnaires for patients (n=490) receiving surgery for hip or knee replacement or varicose vein procedures and participating in the national Patient Reported Outcome Measures (PROMs) program, permitting the analysis of the OEQ against change scores for the measures obtained before and after surgery. In the second study, the OEQ was included in a sample of patients (n=586) who had been selected to receive the National Health Service (NHS) inpatient survey from three contrasting hospital trusts. Results: Study one provided consistent and substantial evidence of construct validity of the OEQ, particularly for those receiving hip or knee replacement. The OEQ sub-scales behaved differently and as predicted against other PROMs variables. Hypotheses of how the two sub-scales regarding outcomes and experiences would relate to the existing domains of patient experience in the inpatient survey were broadly confirmed in study two.

  11. Free Radicals and Reactive Intermediates for the SAGE III Ozone Loss and Validation Experiment (SOLVE) Mission

    Science.gov (United States)

    Anderson, James G.

    2001-01-01

    This grant provided partial support for participation in the SAGE III Ozone Loss and Validation Experiment. The NASA-sponsored SOLVE mission was conducted jointly with the European Commission-sponsored Third European Stratospheric Experiment on Ozone (THESEO 2000). Researchers examined processes that control ozone amounts at mid to high latitudes during the arctic winter and acquired correlative data needed to validate the Stratospheric Aerosol and Gas Experiment (SAGE) III satellite measurements that are used to quantitatively assess high-latitude ozone loss. The campaign began in September 1999 with intercomparison flights out of NASA Dryden Flight Research Center in Edwards, CA, and continued through March 2000, with midwinter deployments out of Kiruna, Sweden. SOLVE was co-sponsored by the Upper Atmosphere Research Program (UARP), the Atmospheric Effects of Aviation Project (AEAP), the Atmospheric Chemistry Modeling and Analysis Program (ACMAP), and the Earth Observing System (EOS) of NASA's Earth Science Enterprise (ESE) as part of the validation program for the SAGE III instrument.

  12. Model validation in soft systems practice

    Energy Technology Data Exchange (ETDEWEB)

    Checkland, P. [Univ. of Lancaster (United Kingdom)

    1995-03-01

    The concept of 'a model' usually evokes the connotation 'model of part of the real world'. That is an almost automatic response. It makes sense especially in relation to the way the concept has been developed and used in natural science. Classical operational research (OR), with its scientific aspirations, and systems engineering use the concept in the same way and in addition use models as surrogates for the real world, on which experimentation is cheap. In these fields the key feature of a model is representativeness. In soft systems methodology (SSM) models are not of part of the world; they are only relevant to debate about the real world and are used in a cyclic learning process. The paper shows how the different concepts of validation in classical OR and SSM lead to a way of sharply defining the nature of 'soft OR'. 21 refs.

  13. Trailing Edge Noise Model Validation and Application to Airfoil Optimization

    DEFF Research Database (Denmark)

    Bertagnolio, Franck; Aagaard Madsen, Helge; Bak, Christian

    2010-01-01

    The aim of this article is twofold. First, an existing trailing edge noise model is validated by comparing with airfoil surface pressure fluctuations and far field sound pressure levels measured in three different experiments. The agreement is satisfactory in one case but poor in two other cases ... across the boundary layer near the trailing edge and to a lesser extent by a smaller boundary layer displacement thickness.

  14. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available for heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data such as control rod withdrawal tests and loss-of-forced-convection tests.

  15. Information systems validation using formal models

    Directory of Open Access Journals (Sweden)

    Azadeh Sarram

    2014-03-01

    Full Text Available During the past few years, there has been growing interest in using the unified modeling language (UML) to capture functional requirements. However, the lack of a tool to check the accuracy and logic of diagrams in this language makes a formal model indispensable. In this study, a preliminary UML model of a system is converted to a colored Petri net in order to examine the precision of the model. For this purpose, first, priority and implementation tags are defined for the UML activity diagram; the diagram is then turned into a colored Petri net. Second, the proposed model expresses the translated tags in terms of net transitions, and monitors are used to control the system characteristics. Finally, an executable model of the UML activity diagram is provided so that the designer can simulate the model and use the simulation results to detect and refine problems in the model. Checking the results, we find that the proposed method enhances the authenticity and accuracy of early models and increases the degree of system validation compared with previous methods.
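
    The executable-model idea rests on the Petri-net firing rule: a transition is enabled when all its input places hold enough tokens, and firing moves tokens from inputs to outputs. The sketch below implements plain (uncolored) nets to keep it short; colored Petri nets additionally attach typed data to each token. The function names and example net are invented.

        def enabled(marking, t):
            """A transition is enabled when every input place has enough tokens."""
            return all(marking.get(p, 0) >= n for p, n in t["in"].items())

        def fire(marking, t):
            """Firing consumes input tokens and produces output tokens."""
            m = dict(marking)
            for p, n in t["in"].items():
                m[p] -= n
            for p, n in t["out"].items():
                m[p] = m.get(p, 0) + n
            return m

        t_submit = {"in": {"draft": 1}, "out": {"review": 1}}
        m0 = {"draft": 1}
        m1 = fire(m0, t_submit) if enabled(m0, t_submit) else m0
        print(m1)   # {'draft': 0, 'review': 1}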

  16. Measures of agreement between computation and experiment: validation metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin; Oberkampf, William Louis

    2005-08-01

    With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
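
    The flavor of a confidence-interval-based metric is easy to convey in code: estimate the model error as the difference between the experimental mean and the model prediction, and attach a t-distribution interval from the experimental replicates. This is a simplified single-point sketch in the spirit of the paper, not the authors' interpolation or regression formulations.

        import numpy as np
        from scipy import stats

        def ci_validation_metric(y_exp, y_model, confidence=0.90):
            """Estimated model error and its confidence interval at one setting."""
            y = np.asarray(y_exp, dtype=float)
            n = y.size
            err = y.mean() - y_model
            half = (stats.t.ppf(0.5 + confidence / 2, df=n - 1)
                    * y.std(ddof=1) / np.sqrt(n))
            return err, (err - half, err + half)

        err, (lo, hi) = ci_validation_metric([10.2, 9.8, 10.5, 10.1], y_model=9.7)
        print(f"error {err:.2f}, 90% CI [{lo:.2f}, {hi:.2f}]")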

  17. Validation of Geant4 hadronic physics models at intermediate energies

    Science.gov (United States)

    Banerjee, Sunanda; Geant4 Hadronic Group

    2010-04-01

    GEANT4 provides a number of physics models at intermediate energies (corresponding to incident momenta in the range 1-20 GeV/c). Recently, these models have been validated with existing data from a number of experiments: (1) inclusive proton and neutron production with a variety of beams (π-, π+, p) at different energies between 1 and 9 GeV/c on a number of nuclear targets (from beryllium to uranium); (2) inclusive pion/kaon/proton production from 14.6 GeV/c proton beams on nuclear targets (from beryllium to gold); (3) inclusive pion production from pion beams between 3-13 GeV/c on a number of nuclear targets (from beryllium to lead). The results of simulation/data comparison for different GEANT4 models are discussed in the context of validating the models and determining their usage in physics lists for high energy application. Due to the increasing number of validations becoming available, and the requirement that they be done at regular intervals corresponding to the GEANT4 release schedule, automated methods of validation are being developed.

  18. Bayesian structural equation modeling method for hierarchical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Jiang Xiaomo [Department of Civil and Environmental Engineering, Vanderbilt University, Box 1831-B, Nashville, TN 37235 (United States)], E-mail: xiaomo.jiang@vanderbilt.edu; Mahadevan, Sankaran [Department of Civil and Environmental Engineering, Vanderbilt University, Box 1831-B, Nashville, TN 37235 (United States)], E-mail: sankaran.mahadevan@vanderbilt.edu

    2009-04-15

    A building block approach to model validation may proceed through various levels, such as material to component to subsystem to system, comparing model predictions with experimental observations at each level. Usually, experimental data becomes scarce as one proceeds from lower to higher levels. This paper presents a structural equation modeling approach to make use of the lower-level data for higher-level model validation under uncertainty, integrating several components: lower-level data, higher-level data, computational model, and latent variables. The method proposed in this paper uses latent variables to model two sets of relationships, namely, the computational model to system-level data, and lower-level data to system-level data. A Bayesian network with Markov chain Monte Carlo simulation is applied to represent the two relationships and to estimate the influencing factors between them. Bayesian hypothesis testing is employed to quantify the confidence in the predictive model at the system level, and the role of lower-level data in the model validation assessment at the system level. The proposed methodology is implemented for hierarchical assessment of three validation problems, using discrete observations and time-series data.
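
    One ingredient, Bayesian hypothesis testing for the predictive model, can be sketched with a Savage-Dickey-style Bayes factor for "the model bias is zero" against a diffuse alternative. Gaussian errors and the prior width are illustrative assumptions here, and this scalar sketch omits the latent-variable and Markov chain Monte Carlo machinery of the paper.

        import numpy as np
        from scipy import stats

        def bayes_factor_point_null(data, model_pred, sigma, prior_sd=1.0):
            """BF01 for H0: bias = 0 vs H1: bias ~ N(0, prior_sd^2)."""
            d = np.asarray(data, dtype=float)
            se = sigma / np.sqrt(d.size)                        # std. error of the mean
            diff = d.mean() - model_pred
            post_var = 1.0 / (1.0 / prior_sd**2 + 1.0 / se**2)  # conjugate update
            post_mean = post_var * diff / se**2
            # Savage-Dickey: posterior vs prior density of the bias at zero
            return (stats.norm.pdf(0.0, post_mean, np.sqrt(post_var))
                    / stats.norm.pdf(0.0, 0.0, prior_sd))

        bf01 = bayes_factor_point_null([1.02, 0.98, 1.05], model_pred=1.0, sigma=0.05)
        print(bf01)   # > 1 favors the model-adequacy hypothesis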

  19. BUGLE-96 validation with MORSE-SGC/S using water and iron experiments from SINBAD 97

    Energy Technology Data Exchange (ETDEWEB)

    Blanchard, A.

    1999-12-03

    This document summarizes the validation of MORSE-SGC/S with the BUGLE-96 cross section library. SINBAD Benchmark Experiment 2.004, the Winfrith Water Benchmark Experiment, and SBE 6.001, the Karlsruhe Iron Sphere Benchmark Experiment, were utilized for this validation. The MORSE-SGC/S code with the BUGLE-96 cross-section library was used to model the experimental configurations as given in SINBAD 97. SINBAD is a shielding integral benchmark archive and database developed at the Oak Ridge National Laboratory (ORNL). For comparison, the experimental models were also executed with MORSE-SGC/S using the BUGLE-80 cross-section library. The BUGLE-96 cross sections will be used for shielding applications only, as recommended by ORNL.

  20. Modelling Urban Experiences

    DEFF Research Database (Denmark)

    Jantzen, Christian; Vetner, Mikael

    2008-01-01

    How can urban designers develop an emotionally satisfying environment not only for today's users but also for coming generations? Which devices can they use to elicit interesting and relevant urban experiences? This paper attempts to answer these questions by analyzing the design of Zuidas, a new...

  1. Experiences from Designing and Validating a Software Modernization Transformation

    DEFF Research Database (Denmark)

    Iosif-Lazăr, Alexandru Florin; Al-Sibahi, Ahmad Salim; Dimovski, Aleksandar

    2015-01-01

    Software modernization often involves complex code transformations that convert legacy code to new architectures or platforms, while preserving the semantics of the original programs. We present the lessons learnt from an industrial software modernization project of considerable size. This includes collecting requirements for a code-to-model transformation, designing and implementing the transformation algorithm, and then validating correctness of this transformation for the code-base at hand. Our transformation is implemented in the TXL rewriting language and assumes specifically structured C++ code as input, which it translates to a declarative configuration model. The correctness criterion for the transformation is that the produced model admits the same configurations as the input code. The transformation converts C++ functions specifying around a thousand configuration parameters. We verify...

  2. Validation of a Model for Teaching Canine Fundoscopy.

    Science.gov (United States)

    Nibblett, Belle Marie D; Pereira, Mary Mauldin; Williamson, Julie A; Sithole, Fortune

    2015-01-01

    A validated teaching model for canine fundoscopic examination was developed to improve Day One fundoscopy skills while at the same time reducing use of teaching dogs. This novel eye model was created from a hollow plastic ball with a cutout for the pupil, a suspended 20-diopter lens, and paint and paper simulation of relevant eye structures. This eye model was mounted on a wooden stand with canine head landmarks useful in performing fundoscopy. Veterinary educators performed fundoscopy using this model and completed a survey to establish face and content validity. Subsequently, veterinary students were randomly assigned to pre-laboratory training with or without the use of this teaching model. After completion of an ophthalmology laboratory on teaching dogs, student outcome was assessed by measuring students' ability to see a symbol inserted on the simulated retina in the model. Students also completed a survey regarding their experience with the model and the laboratory. Overall, veterinary educators agreed that this eye model was well constructed and useful in teaching good fundoscopic technique. Student performance of fundoscopy was not negatively impacted by the use of the model. This novel canine model shows promise as a teaching and assessment tool for fundoscopy.

  3. Simulation - modeling - experiment; Simulation - modelisation - experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    After two workshops held in 2001 on the same topics, and in order to take stock of the advances in the domain of simulation and measurements, the main goals proposed for this workshop are: the presentation of the state-of-the-art of tools, methods and experiments in the domains of interest of the Gedepeon research group, and the exchange of information about the possibilities of use of computer codes and facilities, about the understanding of physical and chemical phenomena, and about development and experiment needs. This document gathers 18 presentations (slides) among the 19 given at this workshop, dealing with: the deterministic and stochastic codes in reactor physics (Rimpault G.); MURE: an evolution code coupled with MCNP (Meplan O.); neutronic calculation of future reactors at EdF (Lecarpentier D.); progress status of the MCNP/TRIO-U neutronic/thermal-hydraulics coupling (Nuttin A.); the FLICA4/TRIPOLI4 thermal-hydraulics/neutronics coupling (Aniel S.); perturbation methods and sensitivity analysis of nuclear data in reactor physics, with application to the VENUS-2 experimental reactor (Bidaud A.); modeling for the reliability improvement of an ADS accelerator (Biarotte J.L.); residual gas compensation of the space charge of intense beams (Ben Ismail A.); experimental determination and numerical modeling of phase equilibrium diagrams of interest in nuclear applications (Gachon J.C.); modeling of irradiation effects (Barbu A.); elastic limit and irradiation damage in Fe-Cr alloys: simulation and experiment (Pontikis V.); experimental measurements of spallation residues, comparison with Monte-Carlo simulation codes (Fallot M.); the spallation target-reactor coupling (Rimpault G.); tools and data (Grouiller J.P.); models in high energy transport codes: status and perspective (Leray S.); other ways of investigation for spallation (Audoin L.); neutrons and light particles production at intermediate energies (20-200 MeV) with iron, lead and uranium targets (Le Colley F

  4. FDA 2011 process validation guidance: lifecycle compliance model.

    Science.gov (United States)

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  5. Validity of information security policy models

    Directory of Open Access Journals (Sweden)

    Joshua Onome Imoniana

    Full Text Available Validity is concerned with establishing evidence for the use of a method with a particular population. Thus, when we address the issue of application of security policy models, we are concerned with the implementation of a certain policy, taking into consideration the standards required, through attribution of scores to every item in the research instrument. In today's globalized economic scenarios, the implementation of an information security policy, in an information technology environment, is a condition sine qua non for the strategic management process of any organization. Regarding this topic, various studies present evidence that the responsibility for maintaining a policy rests primarily with the Chief Security Officer. The Chief Security Officer, in doing so, strives to enhance the updating of technologies, in order to meet all-inclusive business continuity planning policies. Therefore, for such a policy to be effective, it has to be entirely embraced by the Chief Executive Officer. This study was developed with the purpose of validating specific theoretical models, whose designs were based on a literature review, by sampling 10 of the automobile industries located in the ABC region of Metropolitan São Paulo City. This sampling was based on the representativeness of such industries, particularly with regard to each one's implementation of information technology in the region. The current study concludes by presenting evidence of the discriminant validity of four key dimensions of the security policy: Physical Security, Logical Access Security, Administrative Security, and Legal & Environmental Security. Analysis of the Cronbach's alpha structure of these security items shows not only that the capacity of those industries to implement security policies is indisputable, but also that the items involved correlate homogeneously with each other.

  6. Validation of the filament winding process model

    Science.gov (United States)

    Calius, Emilo P.; Springer, George S.; Wilson, Brian A.; Hanson, R. Scott

    1987-01-01

    Tests were performed to validate the WIND model developed previously for simulating the filament winding of composite cylinders. In these tests two cylinders, 24 in. long, 8 in. in diameter, and 0.285 in. thick, made of IM-6G fibers and HBRF-55 resin, were wound at ±45 deg on steel mandrels. The temperatures on the inner and outer surfaces and inside the composite cylinders were recorded during oven cure. The temperatures inside the cylinders were also calculated by the WIND model. The measured and calculated temperatures were then compared. In addition, the degree of cure and resin viscosity distributions inside the cylinders were calculated for the conditions which existed in the tests.

  7. Assessment model validity document FARF31

    Energy Technology Data Exchange (ETDEWEB)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria [Kemakta Konsult AB, Stockholm (Sweden)

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and to show that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately enough to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity.
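
    For orientation, a standard one-dimensional advection-dispersion equation with radioactive chain decay, of the general kind the stream-tube description above refers to, can be written as follows. This is the common textbook form, not necessarily the exact formulation used in FARF31, and the matrix-exchange term is only indicated schematically:

        \frac{\partial c_i}{\partial t} + v \frac{\partial c_i}{\partial x}
          = D_L \frac{\partial^2 c_i}{\partial x^2}
          - \lambda_i c_i + \lambda_{i-1} c_{i-1} - q_{\mathrm{md}},

    where c_i is the concentration of nuclide i in the mobile water, v the advection velocity, D_L the longitudinal dispersion coefficient, \lambda_i the decay constants (the \lambda_{i-1} term is the in-growth from the parent in the chain), and q_md the exchange with the immobile matrix porosity of the dual-porosity description.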

  8. Beyond Corroboration: Strengthening Model Validation by Looking for Unexpected Patterns.

    Directory of Open Access Journals (Sweden)

    Guillaume Chérel

    Full Text Available Models of emergent phenomena are designed to provide an explanation to global-scale phenomena from local-scale processes. Model validation is commonly done by verifying that the model is able to reproduce the patterns to be explained. We argue that robust validation must not only be based on corroboration, but also on attempting to falsify the model, i.e. making sure that the model behaves soundly for any reasonable input and parameter values. We propose an open-ended evolutionary method based on Novelty Search to look for the diverse patterns a model can produce. The Pattern Space Exploration method was tested on a model of collective motion and compared to three common a priori sampling experiment designs. The method successfully discovered all known qualitatively different kinds of collective motion, and performed much better than the a priori sampling methods. The method was then applied to a case study of city system dynamics to explore the model's predicted values of city hierarchisation and population growth. This case study showed that the method can provide insights on potential predictive scenarios as well as falsifiers of the model when the simulated dynamics are highly unrealistic.
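
    The core of the Pattern Space Exploration idea can be sketched compactly: score each new model output by its distance to the nearest behaviours already archived, and keep it only if it is sufficiently novel. The sketch below samples parameters randomly for brevity, whereas the actual method evolves candidates with Novelty Search; sample_params and run_model are user-supplied stand-ins.

        import numpy as np

        def novelty(b, archive, k=3):
            """Mean distance from behaviour descriptor b (a numpy vector)
            to its k nearest neighbours in the archive."""
            if not archive:
                return np.inf
            d = np.sort([np.linalg.norm(b - a) for a in archive])
            return d[:k].mean()

        def pattern_space_exploration(sample_params, run_model,
                                      n_iter=200, thresh=0.1):
            """Keep every parameter set whose output pattern is novel."""
            archive, kept = [], []
            for _ in range(n_iter):
                p = sample_params()
                b = run_model(p)            # behaviour/pattern descriptor
                if novelty(b, archive) > thresh:
                    archive.append(b)
                    kept.append(p)
            return kept, archive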

  9. Measurements and Status at the CERES Ocean Validation Experiment (COVE)

    Science.gov (United States)

    Fabbri, B. E.; Denn, F. M.; Schuster, G. L.; Arduini, R. F.; Madigan, J. J.; Rutan, D. A.

    2014-12-01

    The Clouds and the Earth's Radiant Energy System (CERES) is a suite of instruments flying on several earth-observing satellites that provides data products of radiant energy from the top of the atmosphere to the Earth's surface. The CERES Ocean Validation Experiment (COVE) was established in 1999 as an ocean surface validation site for CERES and other satellite instruments. COVE is located at Chesapeake Light Station, approximately 25 kilometers east of Virginia (coordinates: 36.90N, 75.71W). COVE measurements include downwelling and upwelling radiant flux at visible and infrared wavelengths, basic meteorological parameters, aerosol optical depth, black carbon, total column water vapor, cloud heights, and more. COVE is part of several networks including the Baseline Surface Radiation Network (BSRN), Aerosol Robotic Network (AERONET), Micro-Pulse Lidar Network (MPLNET) and Global Positioning System Meteorology (GPS-MET). A table will be displayed that outlines the current instrumentation and measurements being collected at COVE. Select data results will be presented, including CERES satellite derived data versus COVE surface observed measurements. Also, climatologies such as black carbon from an Aethalometer will be disclosed. In October 2012, the Department of Energy (D.O.E.) purchased Chesapeake Light with the goal of producing a base station for vertically defined wind profiles. While this project is still in the planning phase, the D.O.E. has allowed our research to continue in its current state.

  10. Model calibration and validation of an impact test simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, F. M. (François M.); Wilson, A. C. (Amanda C.); Havrilla, G. N. (George N.)

    2001-01-01

    This paper illustrates the methodology being developed at Los Alamos National Laboratory for the validation of numerical simulations for engineering structural dynamics. The application involves the transmission of a shock wave through an assembly that consists of a steel cylinder and a layer of elastomeric (hyper-foam) material. The assembly is mounted on an impact table to generate the shock wave. The input acceleration and three output accelerations are measured. The main objective of the experiment is to develop a finite element representation of the system capable of reproducing the test data with acceptable accuracy. Foam layers of various thicknesses and several drop heights are considered during impact testing. Each experiment is replicated several times to estimate the experimental variability. Instead of focusing on the calibration of input parameters for a single configuration, the numerical model is validated for its ability to predict the response of three different configurations (various combinations of foam thickness and drop height). Design of Experiments is implemented to perform parametric and statistical variance studies. Surrogate models are developed to replace the computationally expensive numerical simulation. Variables of the finite element model are separated into calibration variables and control variables. The models are calibrated to provide numerical simulations that correctly reproduce the statistical variation of the test configurations. The calibration step also provides inference for the parameters of a high strain-rate dependent material model of the hyper-foam. After calibration, the validity of the numerical simulation is assessed through its ability to predict the response of a fourth test setup.

  11. Benchmark validation by means of pulsed sphere experiment at OKTAVIAN

    Energy Technology Data Exchange (ETDEWEB)

    Ichihara, Chihiro [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Hayashi, Shu A.; Kimura, Itsuro; Yamamoto, Junji; Takahashi, Akito

    1998-03-01

    In order to perform benchmark validation of the existing evaluated nuclear data for fusion-related materials, neutron leakage spectra from spherical piles were measured with a time-of-flight technique using the intense 14 MeV neutron source OKTAVIAN, in the energy range from 0.1 to 15 MeV. The neutron energy spectra were obtained as absolute values normalized per source neutron. The measured spectra were compared with those from theoretical calculations using a Monte Carlo neutron transport code, MCNP, with several libraries processed from the evaluated nuclear data files. Comparison has been made with the spectrum shape, the C/E values of neutron numbers integrated over 4 energy regions, and the calculated spectra unfolded by the number of collisions, especially those after a single collision. The new libraries predicted the experiment fairly well for Li, Cr, Mn, Cu and Mo. For Al, Si, Zr, Nb and W, the new data files gave fair predictions; however, C/E differed by more than 20% in several regions. For LiF, CF2, Ti and Co, no calculation could predict the experiment. A detailed discussion is given for the Cr, Mn and Cu samples. The EFF-2 calculation overestimated the Cr experiment by 24% in the 1-5 MeV neutron energy region, presumably because of overestimation of the inelastic cross section and the 52Cr(n,2n) cross section, and problems in the energy and angular distributions of secondary neutrons in EFF-2. For Cu, ENDF/B-VI and EFF-2 overestimated the experiment by about 20 to 30% in the energy range between 5 and 12 MeV, presumably because of problems in the inelastic scattering cross section. (author)

  12. Validation and verification of agent models for trust: Independent compared to relative trust

    NARCIS (Netherlands)

    Hoogendoorn, M.; Jaffry, S.W.; Maanen, P.P. van

    2011-01-01

    In this paper, the results of a validation experiment for two existing computational trust models describing human trust are reported. One model uses experiences of performance in order to estimate the trust in different trustees. The second model in addition carries the notion of relative trust.
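
    The distinction between the two models can be sketched with a pair of toy update rules: an independent model nudges each trustee's trust toward the experienced performance, while a relative variant couples the trustees by renormalising afterwards. These equations are illustrative stand-ins, not the published model definitions.

        def update_independent(trust, trustee, experience, alpha=0.2):
            """Trust moves toward the experienced performance, per trustee."""
            trust[trustee] += alpha * (experience - trust[trustee])

        def update_relative(trust, trustee, experience, alpha=0.2):
            """Same update, then renormalise so trustees compete for trust."""
            update_independent(trust, trustee, experience, alpha)
            total = sum(trust.values())
            for t in trust:
                trust[t] /= total

        trust = {"A": 0.5, "B": 0.5}
        update_relative(trust, "A", experience=0.9)
        print(trust)   # A gains at B's expense: {'A': 0.537..., 'B': 0.462...}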

  13. A computational fluid dynamics model for wind simulation: model implementation and experimental validation

    Institute of Scientific and Technical Information of China (English)

    Zhuo-dong ZHANG; Ralf WIELAND; Matthias REICHE; Roger FUNK; Carsten HOFFMANN; Yong LI; Michael SOMMER

    2012-01-01

    To provide physically based wind modelling for wind erosion research at regional scale, a 3D computational fluid dynamics (CFD) wind model was developed. The model was programmed in C language based on the Navier-Stokes equations, and it is freely available as open source. Integrated with the spatial analysis and modelling tool (SAMT), the wind model has convenient input preparation and powerful output visualization. To validate the wind model, a series of experiments was conducted in a wind tunnel. A blocking inflow experiment was designed to test the performance of the model on simulation of basic fluid processes. A round obstacle experiment was designed to check whether the model could simulate the influences of the obstacle on the wind field. Results show that measured and simulated wind fields have high correlations, and the wind model can simulate both the basic processes of the wind and the influences of the obstacle on the wind field. These results show the high reliability of the wind model. A digital elevation model (DEM) of an area (3800 m long and 1700 m wide) in the Xilingele grassland in Inner Mongolia (autonomous region, China) was applied to the model, and a 3D wind field has been successfully generated. The clear implementation of the model and the adequate validation by wind tunnel experiments laid a solid foundation for the prediction and assessment of wind erosion at regional scale.

  14. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in the electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code.
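
    The automation idea, reading key experimental parameters from a standardized header and emitting a code input file, is easy to prototype. The header keys and the input-deck syntax below are invented placeholders, not the actual HED or hydro-code formats.

        def read_header(path):
            """Collect '# key = value' metadata lines preceding the data."""
            meta = {}
            with open(path) as f:
                for line in f:
                    if not line.startswith("#"):
                        break               # header ends where the data begin
                    key, _, val = line[1:].partition("=")
                    meta[key.strip()] = val.strip()
            return meta

        def write_input_deck(meta, out_path):
            """Emit a (hypothetical) hydro-code input file from the metadata."""
            with open(out_path, "w") as f:
                f.write(f"explosive  {meta['explosive']}\n")
                f.write(f"density    {meta['density_g_cc']}\n")
                f.write(f"pressure   {meta['impact_pressure_gpa']}\n")

        meta = {"explosive": "PBX-9502", "density_g_cc": "1.89",
                "impact_pressure_gpa": "12.0"}
        write_input_deck(meta, "run.inp")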

  15. Validation and Scenario Analysis of a Soil Organic Carbon Model

    Institute of Scientific and Technical Information of China (English)

    HUANG Yao; LIU Shi-liang; SHEN Qi-rong; ZONG Liang-gang; JIANG Ding-an; HUANG Hong-guang

    2002-01-01

    A model developed by the authors was validated against independent data sets. The data sets were obtained from field experiments on crop residue decomposition and a 7-year soil improvement experiment in Yixing City, Jiangsu Province. Model validation indicated that soil organic carbon dynamics can be simulated from the weather variables of temperature, sunlight and precipitation, soil clay content and bulk density, grain yield of previous crops, and the quality and quantity of the added organic matter. Model simulations in general agreed with the measurements. The comparison between computed and measured values yielded correlation coefficients (r2) of 0.9291*** (n = 48) and 0.6431** (n = 65) for the two experiments, respectively. Model predictions under three scenarios (no additional organic matter input, and annual incorporation of rice and wheat straw at rates of 6.75 t/ha and 9.0 t/ha) suggested that soil organic carbon in Wanshi Township of Yixing City would change from an initial value of 7.85 g/kg in 1983 to 6.30 g/kg, 11.42 g/kg and 13 g/kg in 2014, respectively. Consequently, the total nitrogen content of the soil was predicted to be 0.49 g/kg, 0.89 g/kg and 1.01 g/kg, respectively, under the three scenarios.

  16. Validation of HEDR models. Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid.

  17. Validation of system codes for plant application on selected experiments

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Marco K.; Risken, Tobias; Agethen, Kathrin; Bratfisch, Christoph [Bochum Univ. (Germany). Reactor Simulation and Safety Group

    2016-05-15

    For decades, the Reactor Simulation and Safety Group at Ruhr-Universitaet Bochum (RUB) has contributed to nuclear safety through computer code validation and model development for nuclear safety analysis. Severe accident analysis codes are relevant tools for the understanding and the development of accident management measures. The accidents in the plants Three Mile Island (USA) in 1979 and Fukushima Daiichi (Japan) in 2011 influenced these research activities significantly due to the observed phenomena, such as molten core concrete interaction and hydrogen combustion. This paper gives a brief outline of recent research activities at RUB in these fields, contributing to code preparation for plant applications. Simulations of the molten core concrete interaction tests CCI-2 and CCI-3 with ASTEC and of the hydrogen combustion test Ix9 with COCOSYS are presented as examples. Additionally, plant application is demonstrated with selected results of preliminary Fukushima calculations.

  18. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and, or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and, or observed hot-spots and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis as well as from analytical and numerical models by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
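
    The dimensionality reduction at the heart of the approach is straightforward to demonstrate with a Fourier decomposition: keep only the strongest modes of the measured field and compare the model against them. This numpy sketch uses an FFT with synthetic fields; the paper's Zernike-moment variant follows the same keep-few-descriptors logic.

        import numpy as np

        def descriptor_discrepancy(field_exp, field_sim, k=10):
            """Relative difference over the k strongest Fourier modes
            of the measured full-field distribution."""
            ce = np.abs(np.fft.fft2(field_exp)).ravel()
            cs = np.abs(np.fft.fft2(field_sim)).ravel()
            idx = np.argsort(ce)[-k:]       # descriptors: dominant measured modes
            return np.linalg.norm(ce[idx] - cs[idx]) / np.linalg.norm(ce[idx])

        rng = np.random.default_rng(0)
        exp = rng.normal(size=(64, 64))               # stand-in measured strain map
        sim = exp + 0.05 * rng.normal(size=(64, 64))  # stand-in model prediction
        print(descriptor_discrepancy(exp, sim))       # small value: fields agree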

  19. CFD and FEM modeling of PPOOLEX experiments

    Energy Technology Data Exchange (ETDEWEB)

    Paettikangas, T.; Niemi, J.; Timperi, A. (VTT Technical Research Centre of Finland (Finland))

    2011-01-15

    A large-break LOCA experiment performed with the PPOOLEX experimental facility is analysed with CFD calculations. Simulation of the first 100 seconds of the experiment is performed by using the Euler-Euler two-phase model of FLUENT 6.3. In wall condensation, the condensing water forms a film layer on the wall surface, which is modelled by mass transfer from the gas phase to the liquid water phase in the near-wall grid cell. The direct-contact condensation in the wetwell is modelled with simple correlations. The wall condensation and direct-contact condensation models are implemented with user-defined functions in FLUENT. Fluid-Structure Interaction (FSI) calculations of the PPOOLEX experiments and of a realistic BWR containment are also presented. Two-way coupled FSI calculations of the experiments have been numerically unstable with explicit coupling. A linear perturbation method is therefore used to prevent the numerical instability. The method is first validated against numerical data and against the PPOOLEX experiments. Preliminary FSI calculations are then performed for a realistic BWR containment by modeling a sector of the containment and one blowdown pipe. For the BWR containment, one- and two-way coupled calculations as well as calculations with LPM are carried out. (Author)

  20. Model development and validation of a solar cooling plant

    Energy Technology Data Exchange (ETDEWEB)

    Zambrano, Darine; Garcia-Gabin, Winston [Escuela de Ingenieria Electrica, Facultad de Ingenieria, Universidad de Los Andes, La Hechicera, Merida 5101 (Venezuela); Bordons, Carlos; Camacho, Eduardo F. [Departamento de Ingenieria de Sistemas y Automatica, Escuela Superior de Ingenieros, Universidad de Sevilla, Camino de Los Descubrimientos s/n, Sevilla 41092 (Spain)

    2008-03-15

    This paper describes the dynamic model of a solar cooling plant that has been built for demonstration purposes using market-available technology and has been successfully operational since 2001. The plant uses hot water coming from a field of flat solar collectors which feed a single-effect absorption chiller of 35 kW nominal cooling capacity. The work includes model development based on first principles and model validation with a set of experiments carried out on the real plant. The simulation model has been built in a modular way, and can be adapted to other solar cooling plants since the main modules (solar field, absorption machine, accumulators and auxiliary heater) can be easily replaced. This simulator is a powerful tool for solar cooling systems both during the design phase, when it can be used for component selection, and also for the development and testing of control strategies. (author)
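
    A first-principles model of the hot-water loop reduces, in its simplest lumped form, to one energy balance: stored heat changes with collector gain, thermal losses, and the chiller draw. A forward-Euler sketch follows; every parameter value is an invented placeholder, not data from the real plant.

        def simulate(T0=60.0, hours=8.0, dt_s=60.0, A=150.0, eta0=0.7,
                     UA=1500.0, m_cp=4.2e6, T_amb=30.0, Q_chiller=35e3):
            """m*cp*dT/dt = eta0*A*G - UA*(T - T_amb) - Q_chiller."""
            T, out = T0, []
            for _ in range(int(hours * 3600 / dt_s)):
                G = 900.0   # solar irradiance, W/m^2 (held constant here)
                dT = (eta0 * A * G - UA * (T - T_amb) - Q_chiller) / m_cp
                T += dT * dt_s
                out.append(T)
            return out

        temps = simulate()
        print(f"tank temperature after 8 h: {temps[-1]:.1f} C")

    Swapping in a different collector model or chiller load only changes the gain and sink terms, which mirrors the modular replaceability the paper emphasises.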

  1. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage
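
    Both validation devices named in the abstract are available off the shelf in scikit-learn, which makes a compact sketch possible. Here an L1-penalized logistic regression stands in for the paper's LASSO (NTCP outcomes being binary), with synthetic data in place of the clinical dataset.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import permutation_test_score

        # Synthetic stand-in data: dosimetric/clinical predictors X, complication y
        rng = np.random.default_rng(1)
        X = rng.normal(size=(120, 8))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=120) > 0).astype(int)

        # Cross-validated score, plus a permutation test: the real score should
        # clearly beat models refitted to outcome labels shuffled at random.
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
        score, perm_scores, p_value = permutation_test_score(
            clf, X, y, cv=5, n_permutations=100, scoring="roc_auc")
        print(f"AUC = {score:.2f}, permutation p = {p_value:.3f}")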

  2. Full-scale validation of a model of algal productivity.

    Science.gov (United States)

    Béchet, Quentin; Shilton, Andy; Guieysse, Benoit

    2014-12-02

    While modeling algal productivity outdoors is crucial to assess the economic and environmental performance of full-scale cultivation, most of the models hitherto developed for this purpose have not been validated under fully relevant conditions, especially with regard to temperature variations. The objective of this study was to independently validate a model of algal biomass productivity accounting for both light and temperature, constructed using parameters derived experimentally from short-term indoor experiments. To do this, the accuracy of a model developed for Chlorella vulgaris was assessed against data collected from photobioreactors operated outdoors (New Zealand) over different seasons, years, and operating conditions (temperature-control/no temperature-control, batch, and fed-batch regimes). The model accurately predicted experimental productivities under all conditions tested, yielding an overall accuracy of ±8.4% over 148 days of cultivation. For the purpose of assessing the feasibility of full-scale algal cultivation, the use of the productivity model was therefore shown to markedly reduce uncertainty in the cost of biofuel production while also eliminating uncertainties in water demand, a critical element of environmental impact assessments. Simulations at five climatic locations demonstrated that temperature control in outdoor photobioreactors would require tremendous amounts of energy without considerably increasing algal biomass. Prior assessments neglecting the impact of temperature variations on algal productivity in photobioreactors may therefore be erroneous.
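
    A model of the kind validated here couples a light response with a temperature factor. The sketch below multiplies a Monod-type light term by the cardinal-temperature (CTMI) function; the structure is generic and the parameter values are placeholders, not the calibrated Chlorella vulgaris constants.

        import numpy as np

        def productivity(I, T, p_max=1.0, k_i=150.0,
                         t_min=5.0, t_opt=25.0, t_max=40.0):
            """Productivity = p_max * light response * temperature factor."""
            if T <= t_min or T >= t_max:
                return 0.0
            light = I / (I + k_i)                       # Monod-type light term
            num = (T - t_max) * (T - t_min) ** 2        # CTMI temperature factor
            den = (t_opt - t_min) * ((t_opt - t_min) * (T - t_opt)
                                     - (t_opt - t_max) * (t_opt + t_min - 2.0 * T))
            return p_max * light * num / den

        # Integrate over a clear day with sinusoidal light and temperature
        hours = np.arange(24)
        irr = np.maximum(0.0, 1000.0 * np.sin((hours - 6) / 12.0 * np.pi))
        temp = 18.0 + 8.0 * np.sin((hours - 9) / 12.0 * np.pi)
        daily = sum(productivity(i, t) for i, t in zip(irr, temp))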

  3. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOG), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  4. Benchmark validation by means of pulsed sphere experiment at OKTAVIAN

    Energy Technology Data Exchange (ETDEWEB)

    Ichihara, Chihiro [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Hayashi, S.A.; Kimura, Itsuro; Yamamoto, Junji; Takahashi, Akito

    1997-03-01

    The new version of the Japanese nuclear data library, JENDL-3.2, has recently been released. The JENDL Fusion File, which adopted DDX representations for secondary neutrons, was also improved with the new evaluation method. In parallel, the FENDL nuclear data project to compile a nuclear data library for fusion-related research has been conducted partly under the auspices of the International Atomic Energy Agency (IAEA). The first version, FENDL-1, consists of JENDL-3.1, ENDF/B-VI, BROND-2 and EFF-1 and was released in 1995. Work on the second version, FENDL-2, is now ongoing. Benchmark validation of the nuclear data libraries has been performed to help select the candidates for FENDL-2. The benchmark experiments were conducted at OKTAVIAN of Osaka University. The sample spheres were constructed by filling spherical shells with sample material. The leakage neutron spectra from the sphere piles were measured with a time-of-flight method. The measured spectra were compared with theoretical calculations using MCNP 4A and the processed libraries from JENDL-3.1, JENDL-3.2, JENDL Fusion File, and FENDL-1. The JENDL Fusion File and JENDL-3.2 gave almost the same predictions for the experiment. Both predictions are satisfactory for Li, Cr, Mn, Cu, Zr, Nb and Mo, whereas for Al, LiF, CF2, Si, Ti, Co and W there is some discrepancy. However, they gave better predictions than calculations using the library from FENDL-1, except for W. (author)
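
    The spectrum comparison described above is conventionally summarized with C/E (calculation-to-experiment) ratios. A minimal sketch of that metric in Python follows; the bin values are invented placeholders, not OKTAVIAN data.

        import numpy as np

        # Hypothetical leakage-spectrum comparison: calculation (C) vs experiment (E)
        measured = np.array([3.2, 2.8, 1.9, 0.9, 0.40, 0.25])  # E: measured leakage per energy bin
        calc = np.array([3.0, 2.9, 2.0, 1.0, 0.38, 0.26])      # C: library prediction per bin

        ce = calc / measured                        # bin-wise C/E ratios
        print("C/E per bin:", np.round(ce, 2))
        print(f"spectrum-integrated C/E: {calc.sum() / measured.sum():.3f}")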

  5. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  6. Validation of Biomarker-based risk prediction models

    OpenAIRE

    Taylor, Jeremy M.G.; Ankerst, Donna P.; Andridge, Rebecca R.

    2008-01-01

    The increasing availability and use of predictive models to facilitate informed decision making highlights the need for careful assessment of the validity of these models. In particular, models involving biomarkers require careful validation for two reasons: issues with overfitting when complex models involve a large number of biomarkers, and inter-laboratory variation in assays used to measure biomarkers. In this paper we distinguish between internal and external statistical validation. Inte...

  7. Empirical data validation for model building

    Science.gov (United States)

    Kazarian, Aram

    2008-03-01

    Optical Proximity Correction (OPC) has become an integral and critical part of process development for advanced technologies with challenging k1 requirements. OPC solutions in turn require stable, predictive models to be built that can project the behavior of all structures. These structures must comprehend all geometries that can occur in the layout in order to define the optimal corrections by feature, and thus enable a manufacturing process with acceptable margin. The model is built upon two main component blocks. First is knowledge of the process conditions, which includes the optical parameters (e.g. illumination source, wavelength, lens characteristics, etc.) as well as mask definition, resist parameters and process film stack information. Second is the empirical critical dimension (CD) data collected using this process on specific test features, the results of which are used to fit and validate the model and to project resist contours for all allowable feature layouts. The quality of the model therefore is highly dependent on the integrity of the process data collected for this purpose. Since the test pattern suite generally extends to below the resolution limit that the process can support with adequate latitude, the CD measurements collected can often be quite noisy, with marginal signal-to-noise ratios. In order for the model to be reliable and a best representation of the process behavior, it is necessary to scrutinize the empirical data to ensure that it is not dominated by measurement noise or flyer/outlier points. The primary approach for generating a clean, smooth and dependable empirical data set should be a replicated measurement sampling that can help to statistically reduce measurement noise by averaging. However, it can often be impractical to collect the amount of data needed to ensure a clean data set by this method. An alternate approach is studied in this paper to further smooth the measured data by means of curve fitting to identify remaining
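
    The replicated-sampling idea described above can be sketched in a few lines of Python: average replicated CD measurements after screening flyers with a robust (median/MAD) z-score. All names and numbers below are hypothetical, not from the paper.

        import numpy as np

        def clean_cd_measurements(replicates, z_max=3.5):
            """Average replicated CD measurements after robust outlier screening.

            replicates: (n_features, n_replicates) array of raw CD values.
            Flags flyers with a median/MAD robust z-score, then averages the
            surviving replicates to statistically reduce measurement noise.
            """
            replicates = np.asarray(replicates, dtype=float)
            med = np.median(replicates, axis=1, keepdims=True)
            mad = np.median(np.abs(replicates - med), axis=1, keepdims=True)
            mad = np.where(mad == 0.0, np.finfo(float).eps, mad)  # avoid divide-by-zero
            z = 0.6745 * (replicates - med) / mad                 # robust z-score
            cleaned = np.where(np.abs(z) <= z_max, replicates, np.nan)
            return np.nanmean(cleaned, axis=1)

        # Five replicates for three test features (nm); row 0 contains one flyer.
        data = [[45.1, 44.9, 45.2, 52.0, 45.0],
                [60.3, 60.1, 60.2, 60.4, 60.0],
                [38.7, 38.9, 38.8, 38.6, 38.8]]
        print(clean_cd_measurements(data))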

  8. EPIC Calibration/Validation Experiment Field Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Steven E [National Severe Storm Laboratory/NOAA; Chilson, Phillip [University of Oklahoma; Argrow, Brian [University of Colorado

    2017-03-15

    A field exercise involving several different kinds of Unmanned Aerial Systems (UAS) and supporting instrumentation systems provided by DOE/ARM and NOAA/NSSL was conducted at the ARM SGP site in Lamont, Oklahoma on 29-30 October 2016. This campaign was part of a larger National Oceanic and Atmospheric Administration (NOAA) UAS Program Office program awarded to the National Severe Storms Laboratory (NSSL), named Environmental Profiling and Initiation of Convection (EPIC). The EPIC Field Campaign (Test and Calibration/Validation) proposed to ARM was a test or “dry-run” for a follow-up campaign to be requested for spring/summer 2017. The EPIC project addresses NOAA’s objective to “evaluate options for UAS profiling of the lower atmosphere with applications for severe weather.” The project goal is to demonstrate that fixed-wing and rotary-wing small UAS have the combined potential to provide a unique observing system capable of providing detailed profiles of temperature, moisture, and winds within the atmospheric boundary layer (ABL) to help determine the potential for severe weather development. Specific project objectives are: 1) to develop small UAS capable of acquiring needed wind and thermodynamic profiles and transects of the ABL using one fixed-wing UAS operating in tandem with two different fixed rotary-wing UAS pairs; 2) to adapt and test miniaturized, high-precision, and fast-response atmospheric sensors with high accuracy in strong winds characteristic of the pre-convective ABL in Oklahoma; 3) to conduct targeted short-duration experiments at the ARM Southern Great Plains site in northern Oklahoma concurrently with a second site to be chosen in “real-time” from the Oklahoma Mesonet in coordination with the National Weather Service (NWS) Norman Forecast Office; and 4) to gain valuable experience in pursuit of NOAA’s goals for determining the value of airborne, mobile observing systems for monitoring rapidly evolving high-impact severe weather

  9. Validation of the WATEQ4 geochemical model for uranium

    Energy Technology Data Exchange (ETDEWEB)

    Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

    1983-09-01

    As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite (UO2(OH)2·H2O), UO2(OH)2, and rutherfordine (UO2CO3) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions.

  10. ISOTHERMAL AIR INGRESS VALIDATION EXPERIMENTS AT IDAHO NATIONAL LABORATORY: DESCRIPTION AND SUMMARY OF DATA

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim

    2010-09-01

    Idaho National Laboratory performed air ingress experiments as part of validating a computational fluid dynamics (CFD) code. An isothermal stratified flow experiment was designed and set up to understand stratified flow phenomena in the very high temperature gas-cooled reactor (VHTR) and to provide experimental data for validating computer codes. The isothermal experiment focused on three flow characteristics unique to the VHTR air-ingress accident: stratified flow in the horizontal pipe, stratified flow expansion at the pipe and vessel junction, and stratified flow around supporting structures. Brine and sucrose solutions were used as the heavy fluids and water was used as the light fluid. The density ratios were varied between 0.87 and 0.98. This experiment clearly showed that a stratified flow between heavy and light fluids is generated even for very small density differences. The code was validated by conducting blind CFD simulations and comparing the results to the experimental data. A grid sensitivity study was also performed based on the Richardson extrapolation and the grid convergence index method for modeling confidence. As a result, the calculated current speed showed very good agreement with the experimental data, indicating that current CFD methods are suitable for predicting density-gradient stratified flow phenomena in the air-ingress accident.
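
    The grid sensitivity study mentioned above follows the standard Richardson extrapolation / grid convergence index (GCI) procedure: estimate the observed order p from three systematically refined grids, then report a banded uncertainty on the fine-grid solution. A minimal Python sketch follows; the solution values and refinement ratio are invented for illustration, and the safety factor 1.25 is the value commonly used for three-grid studies.

        import math

        def gci_three_grids(f_fine, f_med, f_coarse, r=2.0, fs=1.25):
            """Grid Convergence Index from three systematically refined grids.

            f_* are solution values (e.g., current speed) on the fine, medium,
            and coarse grids; r is the constant refinement ratio."""
            # Observed order of accuracy via Richardson extrapolation
            p = math.log(abs((f_coarse - f_med) / (f_med - f_fine))) / math.log(r)
            e21 = abs((f_med - f_fine) / f_fine)      # relative error, two finest grids
            gci = fs * e21 / (r**p - 1.0)             # uncertainty band on the fine grid
            f_exact = f_fine + (f_fine - f_med) / (r**p - 1.0)  # extrapolated value
            return p, gci, f_exact

        p, gci, f_rich = gci_three_grids(1.020, 1.032, 1.060)
        print(f"observed order p = {p:.2f}, GCI_fine = {100 * gci:.2f} %, "
              f"Richardson estimate = {f_rich:.4f}")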

  11. The structural validity of the Experience of Work and Life Circumstances Questionnaire (WLQ)

    Directory of Open Access Journals (Sweden)

    Pieter Schaap

    2016-04-01

    Full Text Available Orientation: Best practice frameworks suggest that an assessment practitioner’s choice of an assessment tool should be based on scientific evidence that underpins the appropriate and just use of the instrument. This is a context-specific validity study involving a classified psychological instrument against the background of South African regulatory frameworks and contemporary validity theory principles. Research purpose: The aim of the study was to explore the structural validity of the Experience of Work and Life Circumstances Questionnaire (WLQ) administered to employees in the automotive assembly plant of a South African automotive manufacturing company. Motivation for the study: Although the WLQ has been used by registered health practitioners and numerous researchers, evidence to support the structural validity is lacking. This study, therefore, addressed the need for context-specific empirical support for the validity of score inferences in respect of employees in a South African automotive manufacturing plant. Research design, approach and method: The research was conducted using a convenience sample (N = 217) taken from the automotive manufacturing company where the instrument was used. Reliability and factor analyses were carried out to explore the structural validity of the WLQ. Main findings: The reliability of the WLQ appeared to be acceptable, and the assumptions made about unidimensionality were mostly confirmed. One of the proposed higher-order structural models of the said questionnaire administered to the sample group was confirmed, whereas the other one was partially confirmed. Practical/managerial implications: The conclusion reached was that preliminary empirical grounds existed for considering the continued use of the WLQ (with some suggested refinements) by the relevant company, provided the process of accumulating a body of validity evidence continued. Contribution/value-add: This study identified some of the difficulties that

  12. Validating and Verifying Biomathematical Models of Human Fatigue

    Science.gov (United States)

    Martinez, Siera Brooke; Quintero, Luis Ortiz; Flynn-Evans, Erin

    2015-01-01

    Airline pilots experience acute and chronic sleep deprivation, sleep inertia, and circadian desynchrony due to the need to schedule flight operations around the clock. This sleep loss and circadian desynchrony give rise to cognitive impairments, reduced vigilance and inconsistent performance. Several biomathematical models, based principally on patterns observed in circadian rhythms and homeostatic drive, have been developed to predict a pilot's levels of fatigue or alertness. These models allow the Federal Aviation Administration (FAA) and commercial airlines to make decisions about pilot capabilities and flight schedules. Although these models have been validated in a laboratory setting, they have not been thoroughly tested in operational environments where uncontrolled factors, such as environmental sleep disrupters, caffeine use and napping, may impact actual pilot alertness and performance. We will compare the predictions of three prominent biomathematical fatigue models (McCauley Model, Harvard Model, and the privately-sold SAFTE-FAST Model) to actual measures of alertness and performance. We collected sleep logs, movement and light recordings, psychomotor vigilance task (PVT) data, and urinary melatonin (a marker of circadian phase) from 44 pilots in a short-haul commercial airline over one month. We will statistically compare the model predictions to lapses on the PVT and to circadian phase. We will calculate the sensitivity and specificity of each model prediction under different scheduling conditions. Our findings will aid operational decision-makers in determining the reliability of each model under real-world scheduling situations.

  13. Validation of the Breastfeeding Experience Scale in a Sample of Iranian Mothers

    Directory of Open Access Journals (Sweden)

    Forough Mortazavi

    2014-01-01

    Full Text Available Objectives. The aim of this study was to validate the breastfeeding experience scale (BES) in a sample of Iranian mothers. Methods. After translation and back translation of the BES, an expert panel evaluated the items by assessing the content validity ratio (CVR) and content validity index (CVI). 347 mothers visiting health centers completed the Farsi version of the BES in the first month postpartum. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) were performed to indicate the scale constructs. Reliability was assessed by Cronbach's alpha coefficient. Results. CVR and CVI scores for the BES were 0.96 and 0.87, respectively. Cronbach's alpha coefficient for the BES was 0.83. The results of the EFA revealed a new 5-factor model. The results of the CFA for the BES indicated a marginally acceptable fit for the proposed model and an acceptable fit for the new model (RMSEA = 0.064, SRMR = 0.064, χ2/df = 2.4, and CFI = 0.95). Mothers who were exclusively breastfeeding at the first month postpartum had a lower breastfeeding difficulties score (30.3 ± 7.6) than mothers who were on partial breastfeeding (36.7 ± 11.3) (P<0.001). Conclusions. The Farsi version of the BES is a reliable and valid instrument to assess postpartum breastfeeding difficulties in Iranian mothers.

  14. Validation of the breastfeeding experience scale in a sample of Iranian mothers.

    Science.gov (United States)

    Mortazavi, Forough; Mousavi, Seyed Abbas; Chaman, Reza; Khosravi, Ahmad

    2014-01-01

    Objectives. The aim of this study was to validate the breastfeeding experience scale (BES) in a sample of Iranian mothers. Methods. After translation and back translation of the BES, an expert panel evaluated the items by assessing the content validity ratio (CVR) and content validity index (CVI). 347 mothers visiting health centers completed the Farsi version of the BES in the first month postpartum. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) were performed to indicate the scale constructs. Reliability was assessed by Cronbach's alpha coefficient. Results. CVR and CVI scores for the BES were 0.96 and 0.87, respectively. Cronbach's alpha coefficient for the BES was 0.83. The results of the EFA revealed a new 5-factor model. The results of the CFA for the BES indicated a marginally acceptable fit for the proposed model and an acceptable fit for the new model (RMSEA = 0.064, SRMR = 0.064, χ2/df = 2.4, and CFI = 0.95). Mothers who were exclusively breastfeeding at the first month postpartum had a lower breastfeeding difficulties score (30.3 ± 7.6) than mothers who were on partial breastfeeding (36.7 ± 11.3) (P < 0.001). Conclusions. The Farsi version of the BES is a reliable and valid instrument to assess postpartum breastfeeding difficulties in Iranian mothers.
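
    For readers unfamiliar with the reliability and content-validity statistics reported above, the sketch below shows how Cronbach's alpha and Lawshe's CVR are computed in Python. The score matrix is random placeholder data, not the BES data; uncorrelated random items give alpha near zero, whereas a real scale would score far higher.

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_var / total_var)

        def content_validity_ratio(n_essential, n_panelists):
            """Lawshe's CVR for one item: (n_e - N/2) / (N/2)."""
            return (n_essential - n_panelists / 2) / (n_panelists / 2)

        # Random placeholder scores: 347 respondents x 30 items, 1-5 Likert scale.
        scores = np.random.default_rng(0).integers(1, 6, size=(347, 30))
        print(f"alpha = {cronbach_alpha(scores):.2f}")
        print(f"CVR   = {content_validity_ratio(14, 15):.2f}")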

  15. Radiative transfer model for contaminated slabs: experimental validations

    Science.gov (United States)

    Andrieu, F.; Schmidt, F.; Schmitt, B.; Douté, S.; Brissaud, O.

    2015-09-01

    This article presents a set of spectro-goniometric measurements of different water ice samples and the comparison with an approximated radiative transfer model. The experiments were done using the spectro-radiogoniometer described in Brissaud et al. (2004). The radiative transfer model assumes an isotropization of the flux after the second interface and is fully described in Andrieu et al. (2015). Two kinds of experiments were conducted. First, the specular spot was closely investigated, at high angular resolution, at the wavelength of 1.5 μm, where ice behaves as a very absorbing medium. Second, the bidirectional reflectance was sampled at various geometries, including low phase angles, on 61 wavelengths ranging from 0.8 to 2.0 μm. In order to validate the model, we made qualitative tests to demonstrate the relative isotropization of the flux. We also conducted quantitative assessments by using a Bayesian inversion method in order to estimate the parameters (e.g., sample thickness, surface roughness) from the radiative measurements only. A simple comparison between the retrieved parameters and the direct independent measurements allowed us to validate the model. We developed an innovative Bayesian inversion approach to quantitatively estimate the uncertainties in the parameters, avoiding the usual slow Monte Carlo approach. First we built lookup tables, and then we searched for the best fits and calculated a posteriori probability density functions. The results show that the model is able to reproduce the geometrical energy distribution in the specular spot, as well as the spectral behavior of water ice slabs. In addition, the different parameters of the model are compatible with independent measurements.
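
    The lookup-table Bayesian inversion described above (precompute the forward model on a parameter grid, then evaluate the posterior over the whole grid instead of running a slow Monte Carlo chain) can be sketched as follows. The forward model, grids, and noise level here are toy stand-ins for the actual radiative transfer model.

        import numpy as np

        # Toy forward model standing in for the radiative transfer model:
        # reflectance as a function of slab thickness and roughness per geometry.
        def forward_model(thickness, roughness, geom):
            return np.exp(-thickness * geom) * (1.0 - 0.3 * roughness)

        geometries = np.linspace(0.1, 1.0, 10)
        thick_grid = np.linspace(0.5, 5.0, 200)   # lookup-table axes
        rough_grid = np.linspace(0.0, 1.0, 100)

        # Build the lookup table once (the expensive step), then reuse it.
        T, R = np.meshgrid(thick_grid, rough_grid, indexing="ij")
        table = forward_model(T[..., None], R[..., None], geometries)

        # Synthetic noisy measurement of a "true" slab
        truth = forward_model(2.0, 0.4, geometries)
        sigma = 0.01
        y = truth + np.random.default_rng(1).normal(0.0, sigma, truth.shape)

        # Gaussian log-likelihood over the whole grid; flat prior -> posterior
        loglike = -0.5 * np.sum((table - y) ** 2, axis=-1) / sigma**2
        post = np.exp(loglike - loglike.max())
        post /= post.sum()

        i, j = np.unravel_index(post.argmax(), post.shape)
        print(f"MAP estimate: thickness ≈ {thick_grid[i]:.2f}, roughness ≈ {rough_grid[j]:.2f}")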

  16. Proceedings of the workshop on integral experiment covariance data for critical safety validation

    Energy Technology Data Exchange (ETDEWEB)

    Stuke, Maik (ed.)

    2016-04-15

    For some time, attempts to quantify the statistical dependencies of critical experiments and to account for them properly in validation procedures have been discussed in the literature by various groups. Besides the development of suitable methods, especially the quality and modeling issues of the freely available experimental data are the focus of current discussions, carried out for example in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD-NEA Nuclear Science Committee. The same committee also compiles and publishes the freely available experimental data in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. Most of these experiments were performed as series and might share parts of experimental setups, leading to correlated results. The quality of the determination of these correlations and the underlying covariance data depends strongly on the quality of the documentation of experiments.

  17. Monitoring Building Deformation with InSAR: Experiments and Validation.

    Science.gov (United States)

    Yang, Kui; Yan, Li; Huang, Guoman; Chen, Chu; Wu, Zhengpeng

    2016-12-20

    Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR on building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments using two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples to compare InSAR and leveling approaches for building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. The extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values in both Ordinary Least Squares (OLS) regression and measurement-of-error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between InSAR results and the leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that a millimeter level of accuracy can be achieved by means of the InSAR technique when measuring building deformation. We discuss the differences in accuracy between OLS regression and measurement-of-error analyses, and compare them with the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated.
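
    The two analyses used above, OLS regression of InSAR against leveling plus a measurement-of-error (RMSE) analysis, can be sketched in Python as follows; the paired deformation values are invented placeholders, not the Tianjin measurements.

        import numpy as np

        # Hypothetical paired deformation values (mm) at common benchmarks
        insar = np.array([-2.1, -1.4, -0.2, 0.8, 1.9, 3.2, 4.1])
        leveling = np.array([-2.3, -1.2, -0.1, 1.0, 2.1, 3.0, 4.3])

        # OLS regression: slope near 1 and intercept near 0 indicate agreement
        slope, intercept = np.polyfit(leveling, insar, 1)
        fit = slope * leveling + intercept
        r2 = 1.0 - np.sum((insar - fit) ** 2) / np.sum((insar - insar.mean()) ** 2)

        # Measurement-of-error analysis: RMSE of the raw InSAR-leveling differences
        rmse = np.sqrt(np.mean((insar - leveling) ** 2))

        print(f"slope = {slope:.3f}, intercept = {intercept:.3f} mm, "
              f"R^2 = {r2:.3f}, RMSE = {rmse:.2f} mm")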

  18. The earth radiation budget experiment: Early validation results

    Science.gov (United States)

    Smith, G. Louis; Barkstrom, Bruce R.; Harrison, Edwin F.

    The Earth Radiation Budget Experiment (ERBE) consists of radiometers on a dedicated spacecraft in a 57° inclination orbit, which has a precessional period of 2 months, and on two NOAA operational meteorological spacecraft in near polar orbits. The radiometers include scanning narrow field-of-view (FOV) and nadir-looking wide and medium FOV radiometers covering the ranges 0.2 to 5 μm and 5 to 50 μm and a solar monitoring channel. This paper describes the validation procedures and preliminary results. Each of the radiometer channels underwent extensive ground calibration, and the instrument packages include in-flight calibration facilities which, to date, show negligible changes of the instruments in orbit, except for gradual degradation of the suprasil dome of the shortwave wide FOV (about 4% per year). Measurements of the solar constant by the solar monitors, wide FOV, and medium FOV radiometers of two spacecraft agree to a fraction of a percent. Intercomparisons of the wide and medium FOV radiometers with the scanning radiometers show agreement of 1 to 4%. The multiple ERBE satellites are acquiring the first global measurements of regional scale diurnal variations in the Earth's radiation budget. These diurnal variations are verified by comparison with high temporal resolution geostationary satellite data. Other principal investigators of the ERBE Science Team are: R. Cess, SUNY Stony Brook; J. Coakley, NCAR; C. Duncan, M. King and A. Mecherikunnel, Goddard Space Flight Center, NASA; A. Gruber and A.J. Miller, NOAA; D. Hartmann, U. Washington; F.B. House, Drexel U.; F.O. Huck, Langley Research Center, NASA; G. Hunt, Imperial College, London U.; R. Kandel and A. Berroir, Laboratory of Dynamic Meteorology, Ecole Polytechnique; V. Ramanathan, U. Chicago; E. Raschke, U. of Cologne; W.L. Smith, U. of Wisconsin and T.H. Vonder Haar, Colorado State U.

  19. Validation of NEPTUNE-CFD on ULPU-V experiments

    Energy Technology Data Exchange (ETDEWEB)

    Jamet, Mathieu, E-mail: mathieu.jamet@edf.fr; Lavieville, Jerome; Atkhen, Kresna; Mechitoua, Namane

    2015-11-15

    In-vessel retention (IVR) of molten corium through external cooling of the reactor pressure vessel is one possible means of severe accident mitigation for a class of nuclear power plants. The aim is to successfully terminate the progression of a core melt within the reactor vessel. The probability of success depends on the efficacy of the cooling strategy; hence one of the key aspects of an IVR demonstration relates to the heat removal capability through the vessel wall by convection and boiling in the external water flow. This is only possible if the in-vessel thermal loading is lower than the local critical heat flux expected along the outer wall of the vessel, which is in turn highly dependent on the flow characteristics between the vessel and the insulator. The NEPTUNE-CFD multiphase flow solver is used to obtain a better understanding at local scale of the thermal hydraulics involved in this situation. The validation of the NEPTUNE-CFD code on the ULPU-V facility experiments carried out at the University of California Santa Barbara is presented as a first attempt of using CFD codes at EDF to address such an issue. Two types of computation are performed. On the one hand, a steady state algorithm is used to compute natural circulation flow rates and differential pressures and, on the other, a transient algorithm computation reveals the oscillatory nature of the pressure data recorded in the ULPU facility. Several dominant frequencies are highlighted. In both cases, the CFD simulations reproduce reasonably well the experimental data for these quantities.

  20. PETN ignition experiments and models.

    Science.gov (United States)

    Hobbs, Michael L; Wente, William B; Kaneshige, Michael J

    2010-04-29

    Ignition experiments from various sources, including our own laboratory, have been used to develop a simple ignition model for pentaerythritol tetranitrate (PETN). The experiments consist of differential thermal analysis, thermogravimetric analysis, differential scanning calorimetry, beaker tests, one-dimensional time to explosion tests, Sandia's instrumented thermal ignition tests (SITI), and thermal ignition of nonelectrical detonators. The model developed using this data consists of a one-step, first-order, pressure-independent mechanism used to predict pressure, temperature, and time to ignition for various configurations. The model was used to assess the state of the degraded PETN at the onset of ignition. We propose that cookoff violence for PETN can be correlated with the extent of reaction at the onset of ignition. This hypothesis was tested by evaluating metal deformation produced from detonators encased in copper as well as comparing postignition photos of the SITI experiments.

  1. Validation technique using mean and variance of kriging model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ho Sung; Jung, Jae Jun; Lee, Tae Hee [Hanyang Univ., Seoul (Korea, Republic of)

    2007-07-01

    To rigorously validate the accuracy of a metamodel is an important research area in metamodeling techniques. A leave-k-out cross-validation technique not only requires considerable computational cost but also cannot quantitatively measure the fidelity of the metamodel. Recently, the average validation technique has been proposed. However, the average validation criterion may stop a sampling process prematurely even while the kriging model is still inaccurate. In this research, we propose a new validation technique using the average and the variance of the response during a sequential sampling method, such as maximum entropy sampling. The proposed validation technique is more efficient and accurate than the cross-validation technique, because it explicitly integrates the kriging model to achieve an accurate average and variance, rather than relying on numerical integration. The proposed validation technique shows a trend similar to the root mean squared error, so that it can be used as a stop criterion for sequential sampling.
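
    A minimal sketch of the idea, validating sequentially with the metamodel's own predictive mean and variance rather than held-out data, is shown below using a Gaussian process (kriging) surrogate from scikit-learn. The test function, kernel, and threshold are arbitrary choices, and maximum-variance site selection is used here as a simple stand-in for maximum entropy sampling.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def f(x):                                   # stand-in for an expensive simulation
            return np.sin(3 * x).ravel()

        X_cand = np.linspace(0.0, 2.0, 200).reshape(-1, 1)   # candidate sites
        X = np.array([[0.0], [1.0], [2.0]])                  # initial design
        y = f(X)

        for it in range(40):
            gp = GaussianProcessRegressor(kernel=RBF(0.5), alpha=1e-8,
                                          optimizer=None).fit(X, y)
            mu, std = gp.predict(X_cand, return_std=True)    # kriging mean and std
            # Stop when the model's own average predictive uncertainty is small,
            # instead of holding data out as in cross-validation.
            if std.mean() < 5e-3:
                print(f"stopped after adding {it} samples; "
                      f"mean response ≈ {mu.mean():.3f}")
                break
            x_new = X_cand[np.argmax(std)]                   # maximum-variance site
            X = np.vstack([X, x_new])
            y = np.append(y, f(x_new.reshape(1, -1)))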

  2. Validity of covariance models for the analysis of geographical variation

    DEFF Research Database (Denmark)

    Guillot, Gilles; Schilling, Rene L.; Porcu, Emilio

    2014-01-01

    attention lately and show that the conditions under which they are valid mathematical models have been overlooked so far. 3. We provide rigorous results for the construction of valid covariance models in this family. 4. We also outline how to construct alternative covariance models for the analysis...

  3. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformation verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.

  4. Validation of thermal models for a prototypical MEMS thermal actuator.

    Energy Technology Data Exchange (ETDEWEB)

    Gallis, Michail A.; Torczynski, John Robert; Piekos, Edward Stanley; Serrano, Justin Raymond; Gorby, Allen D.; Phinney, Leslie Mary

    2008-09-01

    This report documents technical work performed to complete the ASC Level 2 Milestone 2841: validation of thermal models for a prototypical MEMS thermal actuator. This effort requires completion of the following task: the comparison between calculated and measured temperature profiles of a heated stationary microbeam in air. Such heated microbeams are prototypical structures in virtually all electrically driven microscale thermal actuators. This task is divided into four major subtasks. (1) Perform validation experiments on prototypical heated stationary microbeams in which material properties such as thermal conductivity and electrical resistivity are measured if not known and temperature profiles along the beams are measured as a function of electrical power and gas pressure. (2) Develop a noncontinuum gas-phase heat-transfer model for typical MEMS situations including effects such as temperature discontinuities at gas-solid interfaces across which heat is flowing, and incorporate this model into the ASC FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (3) Develop a noncontinuum solid-phase heat transfer model for typical MEMS situations including an effective thermal conductivity that depends on device geometry and grain size, and incorporate this model into the FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (4) Perform combined gas-solid heat-transfer simulations using Calore with these models for the experimentally investigated devices, and compare simulation and experimental temperature profiles to assess model accuracy. These subtasks have been completed successfully, thereby completing the milestone task. Model and experimental temperature profiles are found to be in reasonable agreement for all cases examined. Modest systematic differences appear to be related to uncertainties in the geometric dimensions of the test structures and in the thermal conductivity of the

  5. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, J.; Whitmore, J.; Kaffine, L.; Blair, N.; Dobos, A. P.

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed-tilt cases and below 8% for all one-axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement; other deviations, resulting from seasonal biases in the irradiation models and one-axis tracking issues, are discussed in detail.
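
    The annualized prediction error quoted above is presumably the relative bias of modeled annual energy against measured annual energy; a short sketch of that metric under this assumption, on synthetic placeholder data:

        import numpy as np

        rng = np.random.default_rng(2)
        measured = rng.gamma(2.0, 1.5, 8760)                    # hourly kWh, field data
        modeled = measured * 1.02 + rng.normal(0.0, 0.2, 8760)  # model prediction

        # Annualized prediction error: relative bias of total modeled energy
        error = (modeled.sum() - measured.sum()) / measured.sum()
        print(f"annualized prediction error = {100 * error:+.1f} %")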

  6. SDG-based Model Validation in Chemical Process Simulation

    Institute of Scientific and Technical Information of China (English)

    张贝克; 许欣; 马昕; 吴重光

    2013-01-01

    Signed direct graph (SDG) theory provides algorithms and methods that can be applied directly to chemical process modeling and analysis to validate simulation models, and is a basis for the development of a software environment that can automate the validation activity. This paper is concentrated on the pretreatment of the model validation. We use the validation scenarios and standard sequences generated by a well-established SDG model to validate the trends fitted from the simulation model. The results are helpful to find potential problems, assess possible bugs in the simulation model and solve the problem effectively. A case study on a simulation model of a boiler is presented to demonstrate the effectiveness of this method.

  7. TRIMS: Validating T2 Molecular Effects for Neutrino Mass Experiments

    Science.gov (United States)

    Lin, Ying-Ting; Bodine, Laura; Enomoto, Sanshiro; Kallander, Matthew; Machado, Eric; Parno, Diana; Robertson, Hamish; Trims Collaboration

    2017-01-01

    The upcoming KATRIN and Project 8 experiments will measure the model-independent effective neutrino mass through the kinematics near the endpoint of tritium beta-decay. A critical systematic, however, is the understanding of the molecular final-state distribution populated by tritium decay. In fact, the current theory incorporated in the KATRIN analysis framework predicts an observable that disagrees with an experimental result from the 1950s. The Tritium Recoil-Ion Mass Spectrometer (TRIMS) experiment will reexamine the branching ratio of molecular tritium (T2) beta decay to the bound state (3HeT+). TRIMS consists of a magnet-guided time-of-flight mass spectrometer with a detector located on each end. By measuring the kinetic energy and time-of-flight difference of the ions and beta particles reaching the detectors, we will be able to distinguish molecular ions from atomic ones and hence derive the ratio in question. We will give an update on simulation software, analysis tools, and the apparatus, including early commissioning results. U.S. Department of Energy Office of Science, Office of Nuclear Physics, Award Number DE-FG02-97ER41020.

  8. Validation of the Danish language Injustice Experience Questionnaire

    DEFF Research Database (Denmark)

    Schultz, Rikke

    2015-01-01

    /somatoform symptoms. These patients also completed questionnaires concerning sociodemographics, anxiety and depression, subjective well-being, and overall physical and mental functioning. Our results showed satisfactory interpretability and face validity, and high internal consistency (Cronbach's alpha = .90...

  9. Numerical simulation and experimental validation of aircraft ground deicing model

    Directory of Open Access Journals (Sweden)

    Bin Chen

    2016-05-01

    Full Text Available Aircraft ground deicing plays an important role in guaranteeing aircraft safety. In practice, most airports generally use as much deicing fluid as possible to remove the ice, which wastes deicing fluid and pollutes the environment. Therefore, a model of aircraft ground deicing should be built to establish the foundation for subsequent research, such as the optimization of deicing fluid consumption. In this article, the heat balance of the deicing process is depicted, and a dynamic model of the deicing process is provided based on an analysis of the deicing mechanism. In the dynamic model, the surface temperature of the deicing fluid and the ice thickness are regarded as the state parameters, while the fluid flow rate, the initial temperature, and the injection time of the deicing fluid are treated as control parameters. Ignoring the heat exchange between the deicing fluid and the environment, a simplified model is obtained. The rationality of the simplified model is verified by numerical simulation, and the impacts of the flow rate, the initial temperature and the injection time on the deicing process are investigated. To verify the model, a semi-physical experiment system was established, consisting of a constant low-temperature test chamber, an ice simulation system, a deicing fluid heating and spraying system, a simulated wing, test sensors, and a computer measurement and control system. The actual test data verify the validity of the dynamic model and the accuracy of the simulation analysis.
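
    The paper's dynamic model is not reproduced in this record, but the state/control structure it describes (fluid temperature and ice thickness as states; flow rate and injection temperature as controls) can be illustrated with a toy heat-balance ODE. All coefficients below are invented for illustration and do not come from the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy heat balance, loosely mirroring the record's state/control split:
        # states   = [deicing-fluid film temperature T (deg C), ice thickness h (mm)]
        # controls = fluid flow rate q, fluid injection temperature T_in
        def deicing_rhs(t, state, q, T_in):
            T, h = state
            melt = 0.8 * q * max(T, 0.0) * (h > 0)       # melting only while ice remains
            dT = q * (T_in - T) - 0.05 * T - 2.0 * melt  # heating vs ambient/melt losses
            return [dT, -melt]

        sol = solve_ivp(deicing_rhs, (0.0, 60.0), [0.0, 5.0],   # start cold, 5 mm of ice
                        args=(0.5, 70.0), max_step=0.1)
        cleared = sol.y[1] <= 0.0
        if cleared.any():
            print(f"ice cleared at t ≈ {sol.t[np.argmax(cleared)]:.1f} s")
        else:
            print("ice remains at t = 60 s")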

  10. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
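
    A minimal sketch of the recommended procedure, double (nested) cross-validation of a LASSO-penalized model plus a permutation test on its performance, using scikit-learn on synthetic placeholder data (not the xerostomia cohort):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import GridSearchCV, KFold, permutation_test_score

        rng = np.random.default_rng(4)
        X = rng.normal(size=(120, 40))          # hypothetical dosimetric/clinical features
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=120) > 0).astype(int)  # toxicity flag

        # Inner loop: LASSO-penalized logistic regression, penalty weight chosen by CV.
        lasso = GridSearchCV(
            LogisticRegression(penalty="l1", solver="liblinear"),
            {"C": np.logspace(-2, 1, 8)}, cv=5)

        # Outer loop + permutation test: nesting GridSearchCV inside the outer CV gives
        # double cross-validation; permuting y gives the null distribution of the AUC.
        auc, perm_aucs, p_value = permutation_test_score(
            lasso, X, y, cv=KFold(5, shuffle=True, random_state=0),
            scoring="roc_auc", n_permutations=100, n_jobs=-1)

        print(f"double-CV AUC = {auc:.2f}, permutation p = {p_value:.3f}")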

  12. Validation of ecological state space models using the Laplace approximation

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Albertsen, Christoffer Moesgaard; Berg, Casper Willestofte

    2017-01-01

    Many statistical models in ecology follow the state space paradigm. For such models, the important step of model validation rarely receives as much attention as estimation or hypothesis testing, perhaps due to lack of available algorithms and software. Model validation is often based on a naive...... for estimation in general mixed effects models. Implementing one-step predictions in the R package Template Model Builder, we demonstrate that it is possible to perform model validation with little effort, even if the ecological model is multivariate, has non-linear dynamics, and whether observations...... are continuous or discrete. With both simulated data, and a real data set related to geolocation of seals, we demonstrate both the potential and the limitations of the techniques. Our results fill a need for convenient methods for validating a state space model, or alternatively, rejecting it while indicating...
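
    One-step prediction residuals are easy to illustrate for a linear-Gaussian state space model, where the Kalman filter computes them exactly; the sketch below checks that the standardized innovations look like iid N(0,1), which is the basis of this style of validation. (The Laplace-approximation/TMB machinery in the paper generalizes this to non-linear, non-Gaussian models; everything below is a toy stand-in.)

        import numpy as np

        rng = np.random.default_rng(5)

        # Simulate a linear-Gaussian state space model: AR(1) state, noisy observations
        a, q, r, n = 0.9, 0.1, 0.5, 500
        x = np.zeros(n)
        y = np.zeros(n)
        for t in range(1, n):
            x[t] = a * x[t - 1] + rng.normal(0.0, np.sqrt(q))
            y[t] = x[t] + rng.normal(0.0, np.sqrt(r))

        # Kalman filter, collecting standardized one-step-ahead prediction residuals
        m, P = 0.0, 1.0
        resid = []
        for t in range(1, n):
            m_pred, P_pred = a * m, a * a * P + q        # predict
            S = P_pred + r                               # innovation variance
            resid.append((y[t] - m_pred) / np.sqrt(S))   # standardized innovation
            K = P_pred / S                               # update
            m = m_pred + K * (y[t] - m_pred)
            P = (1.0 - K) * P_pred

        # Under a correct model these are iid N(0,1): check mean, variance, lag-1 corr
        resid = np.array(resid)
        print(f"mean = {resid.mean():+.2f}, var = {resid.var():.2f}, "
              f"lag-1 corr = {np.corrcoef(resid[:-1], resid[1:])[0, 1]:+.2f}")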

  13. Prospects and problems for standardizing model validation in systems biology.

    Science.gov (United States)

    Gross, Fridolin; MacLeod, Miles

    2017-10-01

    There are currently no widely shared criteria by which to assess the validity of computational models in systems biology. Here we discuss the feasibility and desirability of implementing validation standards for modeling. Having such a standard would facilitate journal review, interdisciplinary collaboration, model exchange, and be especially relevant for applications close to medical practice. However, even though the production of predictively valid models is considered a central goal, in practice modeling in systems biology employs a variety of model structures and model-building practices. These serve a variety of purposes, many of which are heuristic and do not seem to require strict validation criteria and may even be restricted by them. Moreover, given the current situation in systems biology, implementing a validation standard would face serious technical obstacles mostly due to the quality of available empirical data. We advocate a cautious approach to standardization. However even though rigorous standardization seems premature at this point, raising the issue helps us develop better insights into the practices of systems biology and the technical problems modelers face validating models. Further it allows us to identify certain technical validation issues which hold regardless of modeling context and purpose. Informal guidelines could in fact play a role in the field by helping modelers handle these. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow...

  15. Prospects and problems for standardizing model validation in systems biology

    NARCIS (Netherlands)

    Gross, Fridolin; MacLeod, Miles Alexander James

    2017-01-01

    There are currently no widely shared criteria by which to assess the validity of computational models in systems biology. Here we discuss the feasibility and desirability of implementing validation standards for modeling. Having such a standard would facilitate journal review, interdisciplinary coll

  16. Validation and Adaptation of Router and Switch Models

    NARCIS (Netherlands)

    Boltjes, B.; Fernandez Diaz, I.; Kock, B.A.; Langeveld, R.J.G.M.; Schoenmaker, G.

    2003-01-01

    This paper describes validating OPNET models of key devices for the next generation IP-based tactical network of the Royal Netherlands Army (RNLA). The task of TNO-FEL is to provide insight into scalability and performance of future deployed networks. Because validated models of key Cisco equipment we

  17. Foundational Issues in Statistical Modeling: Statistical Model Specification and Validation

    Directory of Open Access Journals (Sweden)

    Aris Spanos

    2011-01-01

    Full Text Available Statistical model specification and validation raise crucial foundational problems whose pertinent resolution holds the key to learning from data by securing the reliability of frequentist inference. The paper questions the judiciousness of several current practices, including the theory-driven approach and the Akaike-type model selection procedures, arguing that they often lead to unreliable inferences. This is primarily due to the fact that goodness-of-fit/prediction measures and other substantive and pragmatic criteria are of questionable value when the estimated model is statistically misspecified. Foisting one's favorite model on the data often yields estimated models which are both statistically and substantively misspecified, but one has no way to delineate between the two sources of error and apportion blame. The paper argues that the error statistical approach can address this Duhemian ambiguity by distinguishing between statistical and substantive premises and viewing empirical modeling in a piecemeal way with a view to delineating the various issues more effectively. It is also argued that Hendry's general-to-specific procedure does a much better job in model selection than the theory-driven and Akaike-type procedures, primarily because of its error statistical underpinnings.

  18. Toward Validation of the Diagnostic-Prescriptive Model

    Science.gov (United States)

    Ysseldyke, James E.; Sabatino, David A.

    1973-01-01

    Criticized are recent research efforts to validate the diagnostic prescriptive model of remediating learning disabilities, and proposed is a 6-step psychoeducational model designed to ascertain links between behavioral differences and instructional outcomes. (DB)

  19. Generation of integral experiment covariance data and their impact on criticality safety validation

    Energy Technology Data Exchange (ETDEWEB)

    Stuke, Maik; Peters, Elisabeth; Sommer, Fabian

    2016-11-15

    The quantification of statistical dependencies in data of critical experiments and how to account for them properly in validation procedures has been discussed in the literature by various groups. However, these subjects are still an active topic in the Expert Group on Uncertainty Analysis for Criticality Safety Assessment (UACSA) of the OECD-NEA Nuclear Science Committee. The latter compiles and publishes the freely available experimental data collection, the International Handbook of Evaluated Criticality Safety Benchmark Experiments, ICSBEP. Most of the experiments were performed as series and share parts of experimental setups, consequently leading to correlation effects in the results. The correct consideration of correlated data seems to be inevitable if the experimental data in a validation procedure are limited or one cannot rely on a sufficient number of uncorrelated data sets, e.g. from different laboratories using different setups. The general determination of correlations and the underlying covariance data, as well as their consideration in a validation procedure, is the focus of the following work. We discuss and demonstrate possible effects on calculated keff values, their uncertainties, and the corresponding covariance matrices due to the interpretation of evaluated experimental data and its translation into calculation models. The work shows the effects of various modeling approaches and varying distribution functions of parameters, and compares and discusses results from the applied Monte-Carlo sampling method with available data on correlations. Our findings indicate that for a reliable determination of integral experimental covariance matrices or correlation coefficients, a detailed study of the underlying experimental data, the modeling approach and assumptions made, and the resulting sensitivity analysis seems to be inevitable. Further, a Bayesian method is discussed to include integral experimental covariance data when estimating an
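
    The Monte-Carlo sampling approach to integral experiment correlations can be illustrated with a toy example: an uncertain parameter shared across a benchmark series induces off-diagonal terms in the covariance of the calculated keff values. The sensitivities and uncertainties below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(6)
        n_samples, n_exp = 10_000, 4

        # One uncertain parameter shared by the whole series (e.g., a common fuel
        # batch) plus an independent setup-specific uncertainty per experiment.
        shared = rng.normal(0.0, 0.003, n_samples)
        indep = rng.normal(0.0, 0.002, (n_samples, n_exp))

        # Toy response: each experiment's keff perturbation is linear in the inputs
        sens = np.array([1.0, 0.9, 0.8, 0.7])     # sensitivities to the shared parameter
        keff = 1.0 + shared[:, None] * sens + indep

        corr = np.corrcoef(keff, rowvar=False)
        print(np.round(corr, 2))                  # off-diagonals ~0.6-0.7: correlated series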

  20. Modeling and Simulation Behavior Validation Methodology and Extension Model Validation for the Individual Soldier

    Science.gov (United States)

    2015-03-01

    Historical Methods The three historical methods of validation are rationalism, empiricism, and positive economics. Rationalism requires that... Empiricism requires every assumption and outcome to be empirically validated. Positive economics requires only that the model’s outcome(s) be correct... historical methods of rationalism, empiricism, and positive economics into a multistage process of validation. This validation method consists of (1

  1. Measuring the experience of hospitality : Scale development and validation

    NARCIS (Netherlands)

    Pijls-Hoekstra, Ruth; Groen, Brenda H.; Galetzka, Mirjam; Pruyn, Adriaan T.H.

    2017-01-01

    This paper identifies what customers experience as hospitality and subsequently presents a novel and compact assessment scale for measuring customers’ experience of hospitality at any kind of service organization. The Experience of Hospitality Scale (EH-Scale) takes a broader perspective compared to

  2. Validation of Inhibition Effect in the Cellulose Hydrolysis: a Dynamic Modelling Approach

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Tsai, Chien-Tai; Meyer, Anne S.;

    2011-01-01

    Enzymatic hydrolysis is one of the main steps in the processing of bioethanol from lignocellulosic raw materials. However, complete understanding of the underlying phenomena is still under development. Hence, this study has focused on validation of the inhibition effects in the cellulosic biomass...... hydrolysis employing a dynamic mathematical model. A systematic framework for parameter estimation is used for model validation, which helps overcome the problem of parameter correlation. Data sets obtained from carefully designed enzymatic cellulose and cellobiose hydrolysis experiments were used

  3. Model validation through long-term promising sustainable maize/pigeon pea residue management in Malawi

    NARCIS (Netherlands)

    Mwale, C.D.; Kabambe, V.H.; Sakale, W.D.; Giller, K.E.; Kauwa, A.A.; Ligowe, I.; Kamalongo, D.

    2013-01-01

    In the 2005/2006 season, the Model Validation Through Long-Term Promising Sustainable Maize/Pigeon Pea Residue Management experiment was in the 11th year at Chitedze and Chitala, and in the 8th year at Makoka and Zombwe. The experiment was a split-plot design with cropping system as the main plot an

  4. Modeling the Effects of Argument Length and Validity on Inductive and Deductive Reasoning

    Science.gov (United States)

    Rotello, Caren M.; Heit, Evan

    2009-01-01

    In an effort to assess models of inductive reasoning and deductive reasoning, the authors, in 3 experiments, examined the effects of argument length and logical validity on evaluation of arguments. In Experiments 1a and 1b, participants were given either induction or deduction instructions for a common set of stimuli. Two distinct effects were…

  5. Radiative transfer model for contaminated slabs: experimental validations

    CERN Document Server

    Andrieu, François; Schmitt, Bernard; Douté, Sylvain; Brissaud, Olivier

    2015-01-01

    This article presents a set of spectro-goniometric measurements of different water ice samples and the comparison with an approximated radiative transfer model. The experiments were done using the spectro-radiogoniometer described in Brissaud et al. (2004). The radiative transfer model assumes an isotropization of the flux after the second interface and is fully described in Andrieu et al. (2015). Two kinds of experiments were conducted. First, the specular spot was closely investigated, at high angular resolution, at the wavelength of 1.5 µm, where ice behaves as a very absorbing medium. Second, the bidirectional reflectance was sampled at various geometries, including low phase angles, on 61 wavelengths ranging from 0.8 µm to 2.0 µm. In order to validate the model, we made a qualitative test to demonstrate the relative isotropization of the flux. We also conducted quantitative assessments by using a Bayesian inversion method in order to estimate the parameters (e.g. sampl...

  6. Experimental validation of a numerical model for subway induced vibrations

    Science.gov (United States)

    Gupta, S.; Degrande, G.; Lombaert, G.

    2009-04-01

    This paper presents the experimental validation of a coupled periodic finite element-boundary element model for the prediction of subway induced vibrations. The model fully accounts for the dynamic interaction between the train, the track, the tunnel and the soil. The periodicity or invariance of the tunnel and the soil in the longitudinal direction is exploited using the Floquet transformation, which allows for an efficient formulation in the frequency-wavenumber domain. A general analytical formulation is used to compute the response of three-dimensional invariant or periodic media that are excited by moving loads. The numerical model is validated by means of several experiments that have been performed at a site in Regent's Park on the Bakerloo line of London Underground. Vibration measurements have been performed on the axle boxes of the train, on the rail, the tunnel invert and the tunnel wall, and in the free field, both at the surface and at a depth of 15 m. Prior to these vibration measurements, the dynamic soil characteristics and the track characteristics have been determined. The Bakerloo line tunnel of London Underground has been modelled using the coupled periodic finite element-boundary element approach and free field vibrations due to the passage of a train at different speeds have been predicted and compared to the measurements. The correspondence between the predicted and measured response in the tunnel is reasonably good, although some differences are observed in the free field. The discrepancies are explained on the basis of various uncertainties involved in the problem. The variation in the response with train speed is similar for the measurements as well as the predictions. This study demonstrates the applicability of the coupled periodic finite element-boundary element model to make realistic predictions of the vibrations from underground railways.

  7. Stable Eutectoid Transformation in Nodular Cast Iron: Modeling and Validation

    Science.gov (United States)

    Carazo, Fernando D.; Dardati, Patricia M.; Celentano, Diego J.; Godoy, Luis A.

    2016-10-01

    This paper presents a new microstructural model of the stable eutectoid transformation in a spheroidal cast iron. The model takes into account the nucleation and growth of ferrite grains and the growth of graphite spheroids. Different laws are assumed for the growth of both phases during and below the intercritical stable eutectoid. At a microstructural level, the initial conditions for the phase transformations are obtained from the microstructural simulation of solidification of the material, which considers the divorced eutectic and the subsequent growth of graphite spheroids up to the initiation of the stable eutectoid transformation. The temperature field is obtained by solving the energy equation by means of finite elements. The microstructural (phase change) and macrostructural (energy balance) models are coupled by a sequential multiscale procedure. Experimental validation of the model is achieved by comparison with measured values of the fractions and radii of 2D views of ferrite grains. Agreement with such experiments indicates that the present model is capable of predicting ferrite phase fraction and grain size with reasonable accuracy.

  9. Use of Synchronized Phasor Measurements for Model Validation in ERCOT

    Science.gov (United States)

    Nuthalapati, Sarma; Chen, Jian; Shrestha, Prakash; Huang, Shun-Hsien; Adams, John; Obadina, Diran; Mortensen, Tim; Blevins, Bill

    2013-05-01

    This paper discusses experiences in the use of synchronized phasor measurement technology in the Electric Reliability Council of Texas (ERCOT) interconnection, USA. Implementation of synchronized phasor measurement technology in the region is a collaborative effort involving ERCOT, ONCOR, AEP, SHARYLAND, EPG, CCET, and UT-Arlington. As several phasor measurement units (PMU) have been installed in the ERCOT grid in recent years, phasor data at a resolution of 30 samples per second are being used to monitor power system status and record system events. Post-event analyses using recorded phasor data have successfully verified ERCOT dynamic stability simulation studies. The real-time monitoring software RTDMS® enables ERCOT to analyze small signal stability conditions by monitoring the phase angles and oscillations. The recorded phasor data enable ERCOT to validate the existing dynamic models of conventional and/or wind generators.

  10. A Report on the Validation of Beryllium Strength Models

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Derek Elswick [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-05

    This report discusses work on validating beryllium strength models with flyer plate and Taylor rod experimental data. Strength models are calibrated with Hopkinson bar and quasi-static data. The Hopkinson bar data for beryllium provide strain rates up to about 4000 per second. A limitation of the Hopkinson bar data for beryllium is that they only provide information on strain up to about 0.15. The lack of high-strain data at high strain rates makes it difficult to distinguish between various strength model settings. The PTW model has been calibrated many different times over the last 12 years. The lack of high-strain data at high strain rates has resulted in these calibrated PTW models for beryllium exhibiting significantly different behavior when extrapolated to high strain. For beryllium, the α parameter of PTW has recently been calibrated to high precision shear modulus data. In the past the α value for beryllium was set based on expert judgment. The new α value for beryllium was used in a calibration of the beryllium PTW model by Sky Sjue. The calibration by Sjue used EOS table information to model the temperature dependence of the heat capacity, as well as the density changes of the beryllium sample during the Hopkinson bar and quasi-static experiments. In this paper, the calibrated PTW model by Sjue is compared against experimental data and other strength models. The other strength models being considered are a PTW model calibrated by Shuh-Rong Chen and a Steinberg-Guinan type model by John Pedicini. The three strength models are used in a comparison against flyer plate and Taylor rod data. The results show that the Chen PTW model provides better agreement with these data. The Chen PTW model settings have been previously adjusted to provide a better fit to flyer plate data, whereas the Sjue PTW model has not been changed based on flyer plate data. However, the Sjue model provides a reasonable fit to…

  11. A Complex-Geometry Validation Experiment for Advanced Neutron Transport Codes

    Energy Technology Data Exchange (ETDEWEB)

    David W. Nigg; Anthony W. LaPorta; Joseph W. Nielsen; James Parry; Mark D. DeHart; Samuel E. Bays; William F. Skerjanc

    2013-11-01

    The Idaho National Laboratory (INL) has initiated a focused effort to upgrade the legacy computational reactor physics software tools and protocols used for support of core fuel management and experiment management in the Advanced Test Reactor (ATR) and its companion critical facility (ATRC) at the INL. This will be accomplished through the introduction of modern high-fidelity computational software and protocols, with appropriate new Verification and Validation (V&V) protocols, over the next 12-18 months. Stochastic and deterministic transport theory based reactor physics codes and nuclear data packages that support this effort include MCNP5 [1], SCALE/KENO6 [2], HELIOS [3], SCALE/NEWT [2], and ATTILA [4]. Furthermore, a capability for sensitivity analysis and uncertainty quantification based on the TSUNAMI [5] system has also been implemented. Finally, we are also evaluating the Serpent [6] and MC21 [7] codes as additional verification tools in the near term, as well as for possible application to full three-dimensional Monte Carlo based fuel management modeling in the longer term. On the experimental side, several new benchmark-quality code validation measurements based on neutron activation spectrometry have been conducted using the ATRC. Results for the first four experiments, focused on neutron spectrum measurements within the Northwest Large In-Pile Tube (NW LIPT) and in the core fuel elements surrounding the NW LIPT and the diametrically opposite Southeast IPT, have been reported [8,9]. A fifth, very recent, experiment focused on detailed measurements of the element-to-element core power distribution is summarized here, and examples of the use of the measured data for validation of corresponding MCNP5, HELIOS, NEWT, and Serpent computational models using modern least-squares adjustment methods are provided.

  12. Validation of Numerical Shallow Water Models for Tidal Lagoons

    Energy Technology Data Exchange (ETDEWEB)

    Eliason, D.; Bourgeois, A.

    1999-11-01

    An analytical solution is presented for the case of a stratified, tidally forced lagoon. This solution, especially its energetics, is useful for the validation of numerical shallow water models under stratified, tidally forced conditions. The utility of the analytical solution for validation is demonstrated for a simple finite difference numerical model. A comparison is presented of the energetics of the numerical and analytical solutions in terms of the convergence of model results to the analytical solution with increasing spatial and temporal resolution.
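
    A convergence check of the kind described above can be sketched generically. The 1-D linear advection problem and first-order upwind scheme below stand in for the stratified lagoon solution and finite difference model, and are purely illustrative.

        # Sketch of a resolution-convergence test against an analytical solution,
        # in the spirit of validating a shallow water solver. Scheme, CFL number
        # and end time are illustrative assumptions.
        import numpy as np

        def solve_advection(nx, c=1.0, t_end=0.5):
            """First-order upwind solution of u_t + c u_x = 0 on [0,1), periodic."""
            dx = 1.0 / nx
            dt = 0.4 * dx / c                    # CFL = 0.4
            x = np.arange(nx) * dx
            u = np.sin(2 * np.pi * x)            # initial condition
            t = 0.0
            while t < t_end - 1e-12:
                dt_step = min(dt, t_end - t)
                u = u - c * dt_step / dx * (u - np.roll(u, 1))
                t += dt_step
            exact = np.sin(2 * np.pi * (x - c * t_end))
            return np.sqrt(np.mean((u - exact) ** 2))   # L2 error

        for nx in (50, 100, 200, 400):
            print(f"nx={nx:4d}  L2 error={solve_advection(nx):.4e}")
        # For this first-order scheme the error should fall roughly linearly
        # with dx as the grid is refined, confirming convergence to the
        # analytical solution.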

  13. Using virtual reality to validate system models

    Energy Technology Data Exchange (ETDEWEB)

    Winter, V.L.; Caudell, T.P.

    1999-12-09

    To date, most validation techniques are highly biased towards calculations involving symbolic representations of problems. These calculations are either formal (in the case of consistency and completeness checks) or informal (in the case of code inspections). The authors believe that an essential type of evidence of the correctness of the formalization process must be provided by (i.e., must originate from) human-based calculation. They further believe that human calculation can be significantly amplified by shifting from symbolic representations to graphical representations. This paper describes their preliminary efforts in realizing such a representational shift.

  14. Premixing of corium into water during a Fuel-Coolant Interaction. The models used in the 3 field version of the MC3D code and two examples of validation on Billeau and FARO experiments

    Energy Technology Data Exchange (ETDEWEB)

    Berthoud, G.; Crecy, F. de; Duplat, F.; Meignen, R.; Valette, M. [CEA/Grenoble, DRN/DTP, 17 Avenue des Martyrs, 38054 Grenoble Cedex 9 (France)

    1998-01-01

    This paper presents the premixing application of the multiphase 3D computer code MC3D. This application is devoted to the premixing phase of a Fuel Coolant Interaction (FCI), when large amounts of molten corium flow into water and interact with it. A description of the new features of the model is given (a more complete description of the full model is given in the annex). Calculations of the Billeau experiments (cold or hot spheres dropped into water) and of a FARO test (molten corium dropped into 5 MPa saturated water) are presented. (author)

  15. Extending Model Checking to Object Process Validation

    NARCIS (Netherlands)

    Rein, van H.

    2002-01-01

    Object-oriented techniques allow the gathering and modelling of system requirements in terms of an application area. The expression of data and process models at that level is a great asset in communication with non-technical people in that area, but it does not necessarily lead to consistent models.

  16. Modeling simulation and experimental validation for mold filling process

    Institute of Scientific and Technical Information of China (English)

    HOU Hua; JU Dong-ying; MAO Hong-kui; D. SAITO

    2006-01-01

    Based on the continuity equation and the momentum and energy conservation equations, a numerical model of turbulent filling flow was introduced, and the 3-D free surface VOF method was improved. To judge whether the numerical simulation results are reasonable, corresponding experimental validation is needed. General experimental techniques for casting fluid flow include thermocouple tracking location, hydraulic simulation, heat-resistant glass window, and X-ray observation methods. A hydraulic analogue experiment with the DPIV technique was arranged to visually validate the fluid flow program for low-pressure casting at 0.1×10^5 Pa and 0.6×10^5 Pa. By comparing the flow head, liquid surface, and flow velocity, it is found that the filling pressure strongly influences the flow state. With increasing filling pressure, the fluid flow becomes unstable, the flow head becomes higher, and the filling time is reduced. The simulated results agree approximately with the observations, which further supports the validity of the numerical program for the filling process.

  17. Validation of ice loads predicted from meteorological models

    Energy Technology Data Exchange (ETDEWEB)

    Veal, A.; Skea, A. [UK Met Office, Exeter, England (United Kingdom); Wareing, B. [Brian Wareing Tech Ltd., England (United Kingdom)

    2005-07-01

    Results of a field trial conducted on 2 Gerber PVM-100 instruments at Deadwater Fell test site in the United Kingdom were presented. The trials were conducted to assess whether the instruments were capable of measuring the liquid water content of the air, as well as to validate an ice model in terms of accretion rates on different sized conductors. Ambient air temperature, wind speed and direction were monitored at the Deadwater Fell weather station along with load cell values. Time lapse video recorders and a web camera system were used to view the performance of the conductors in varying weather conditions. All data was collected and stored at the site. It was anticipated that output from the instruments could be related to the conditions under which overhead line conductors suffer from ice loads, and help to revise weather maps which have proved to be incompatible with utility experience and the lifetimes achieved by overhead line designs. The data provided from the Deadwater work included logged data from the Gerbers, weather data and load data from a 10 mm diameter aluminium alloy conductor. When the combination of temperature, wind direction and Gerber output indicated icing conditions, they were confirmed by the conductor's load cell data. The tests confirmed the validity of the Gerber instruments to predict the occurrence of icing conditions, when combined with other meteorological data. It was concluded that the instruments may aid in optimized prediction methods for ice loads and icing events. 2 refs., 4 figs.

  18. Validation of elastic cross section models for space radiation applications

    Science.gov (United States)

    Werneth, C. M.; Xu, X.; Norman, R. B.; Ford, W. P.; Maung, K. M.

    2017-02-01

    The space radiation field is composed of energetic particles that pose both acute and long-term risks for astronauts in low earth orbit and beyond. In order to estimate radiation risk to crew members, the fluence of particles and biological response to the radiation must be known at tissue sites. Given that the spectral fluence at the boundary of the shielding material is characterized, radiation transport algorithms may be used to find the fluence of particles inside the shield and body, and the radio-biological response is estimated from experiments and models. The fidelity of the radiation spectrum inside the shield and body depends on radiation transport algorithms and the accuracy of the nuclear cross sections. In a recent study, self-consistent nuclear models based on multiple scattering theory that include the option to study relativistic kinematics were developed for the prediction of nuclear cross sections for space radiation applications. The aim of the current work is to use uncertainty quantification to ascertain the validity of the models as compared to a nuclear reaction database and to identify components of the models that can be improved in future efforts.

  19. Measuring Educators' Attitudes and Beliefs about Evaluation: Construct Validity and Reliability of the Teacher Evaluation Experience Scale

    Science.gov (United States)

    Reddy, Linda A.; Dudek, Christopher M.; Kettler, Ryan J.; Kurz, Alexander; Peters, Stephanie

    2016-01-01

    This study presents the reliability and validity of the Teacher Evaluation Experience Scale--Teacher Form (TEES-T), a multidimensional measure of educators' attitudes and beliefs about teacher evaluation. Confirmatory factor analyses of data from 583 teachers were conducted on the TEES-T hypothesized five-factor model, as well as on alternative…

  20. A practical approach to validating a PD model

    NARCIS (Netherlands)

    Medema, Lydian; Koning, Ruud H.; Lensink, Robert; Medema, M.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their credi

  2. Cross-validation criteria for SETAR model selection

    NARCIS (Netherlands)

    de Gooijer, J.G.

    2001-01-01

    Three cross-validation criteria, denoted C, C_c, and C_u, are proposed for selecting the orders of a self-exciting threshold autoregressive (SETAR) model when both the delay and the threshold value are unknown. The derivation of C is within a natural cross-validation framework. The criterion C_c is si...
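
    The flavour of such order-selection criteria can be conveyed with a simple out-of-sample sketch. The rolling-origin AR example below omits the SETAR threshold and delay search entirely; the data and test-window size are assumptions.

        # Sketch: choosing an autoregressive order by out-of-sample one-step
        # prediction error, a simplified stand-in for the C-type criteria above.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 300
        y = np.zeros(n)                        # illustrative AR(2) data
        for t in range(2, n):
            y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()

        def cv_score(y, order, n_test=100):
            """Mean squared one-step-ahead error over a rolling forecast origin."""
            errs = []
            for t in range(len(y) - n_test, len(y)):
                # Fit AR(order) by least squares on data up to time t
                Y = y[order:t]
                X = np.column_stack([y[order - k:t - k] for k in range(1, order + 1)])
                beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
                pred = y[t - order:t][::-1] @ beta
                errs.append((y[t] - pred) ** 2)
            return np.mean(errs)

        for p in (1, 2, 3, 4):
            print(f"AR({p}): CV score = {cv_score(y, p):.3f}")
        # The score should be minimized near the true order (2 here); higher
        # orders buy little and are penalized by estimation noise.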

  3. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano & Primiceri (American Economic Review, forthcoming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation in these models remains to be established.

  4. Tennessee Valley Authority validation experience with substation reliability centered maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, S.M.; Graziano, J.A. [Tennessee Valley Authority, Chattanooga, TN (United States)

    1996-08-01

    This paper discusses the approach the Tennessee Valley Authority (TVA) used in the application of the Reliability Centered Maintenance (RCM) process during the evaluation of a Delle air blast circuit breaker. It discusses the selection of a system, the establishment of system boundaries, and the problems encountered in reviewing test data and conducting personnel interviews. This paper also shows the functional failure analysis process, critical failure mode causes, and the resulting tasks. From the failure mode causes a task comparison is developed. The results from the RCM evaluation are shown as cost savings. Finally, from this evaluation a list of conclusions and recommendations concerning the RCM validation process is provided.

  5. Amendment to Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    The purpose of WP2 is to establish flow models relating the wind speeds at turbines in a farm. Until now, active control of the power reference has not been included in these models, as only data from standard operation have been available. In this report the first data series with power reference excitations from the Thanet farm are used to try to update some of the models discussed in D2.5. Because of the very limited amount of data, only simple dynamic transfer function models can be obtained. The three obtained data series are somewhat different. Only the first data set seems to have the front turbine in undisturbed flow. For this data set both the multiplicative model and, in particular, the simple first order transfer function model can predict the downwind wind speed from the upwind wind speed and loading.

  6. Gear Windage Modeling Progress - Experimental Validation Status

    Science.gov (United States)

    Kunz, Rob; Handschuh, Robert F.

    2008-01-01

    In the Subsonic Rotary Wing (SRW) Project being funded for propulsion work at NASA Glenn Research Center, performance of the propulsion system is of high importance. In current rotorcraft drive systems many gearing components operate at high rotational speed (pitch line velocity > 24000 ft/min). In our testing of high speed helical gear trains at NASA Glenn we have found that the work done on the air-oil mist within the gearbox can become a significant part of the power loss of the system. This loss mechanism is referred to as windage. The effort described in this presentation is to understand the variables that affect windage and to develop a good experimental database with which to validate the analytical project being conducted at Penn State University by Dr. Rob Kunz under a NASA SRW NRA. The presentation provides an update on the status of these efforts.

  7. A General Strategy for Physics-Based Model Validation Illustrated with Earthquake Phenomenology, Atmospheric Radiative Transfer, and Computational Fluid Dynamics

    CERN Document Server

    Sornette, Didier; Kamm, James R; Ide, Kayo

    2007-01-01

    Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. In this article, we survey the model validation literature and propose to formulate validation as an iterative construction process that mimics the process occurring implicitly in the minds of scientists. We thus offer a formal representation of the progressive build-up of trust in the model, and thereby replace incapacitating claims on the impossibility of validating a given model by an adaptive process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the n...

  8. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  9. EMMD-Prony approach for dynamic validation of simulation models

    Institute of Scientific and Technical Information of China (English)

    Ruiyang Bai

    2015-01-01

    Model validation and updating is critical to model credibility growth. In order to assess model credibility quantitatively and locate model error precisely, a new dynamic validation method based on extremum field mean mode decomposition (EMMD) and the Prony method is proposed in this paper. Firstly, complex dynamic responses from models and real systems are processed into stationary components by EMMD. These components always have definite physical meanings, which can serve as evidence for rough model error location. Secondly, the Prony method is applied to identify the features of each EMMD component. Amplitude similarity, frequency similarity, damping similarity and phase similarity are defined to describe the similarity of dynamic responses. Then quantitative validation metrics are obtained based on the improved entropy weight and energy proportion. Precise model error location is realized based on the physical meanings of these features. The application of this method in aircraft controller design provides evidence of its feasibility and usability.
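
    The Prony feature-extraction step can be sketched as follows; the test signal, model order, and sampling interval are assumptions, and the EMMD decomposition that precedes this step in the method above is not reproduced.

        # Sketch of the classical Prony method for extracting amplitude, frequency,
        # damping and phase from a sampled response.
        import numpy as np

        def prony(x, p, dt):
            n = len(x)
            # Linear prediction: x[k] = sum_j a_j * x[k-j]
            A = np.column_stack([x[p - j:n - j] for j in range(1, p + 1)])
            a, *_ = np.linalg.lstsq(A, x[p:n], rcond=None)
            # Poles are roots of z^p - a_1 z^(p-1) - ... - a_p
            z = np.roots(np.concatenate(([1.0], -a)))
            # Complex amplitudes from a Vandermonde least-squares fit
            V = np.vander(z, n, increasing=True).T        # V[k, i] = z_i**k
            h, *_ = np.linalg.lstsq(V, x.astype(complex), rcond=None)
            damping = np.log(np.abs(z)) / dt              # 1/s (negative = decaying)
            freq = np.angle(z) / (2 * np.pi * dt)         # Hz
            return np.abs(h), freq, damping, np.angle(h)

        # Two damped sinusoids as a test signal; each real cosine appears as a
        # conjugate pole pair carrying half its amplitude.
        dt = 0.01
        t = np.arange(0, 2, dt)
        x = 1.0 * np.exp(-0.5 * t) * np.cos(2 * np.pi * 3 * t) \
          + 0.4 * np.exp(-1.5 * t) * np.cos(2 * np.pi * 7 * t + 0.8)
        amp, freq, damp, phase = prony(x, p=4, dt=dt)
        for i in np.argsort(-amp):
            print(f"f={freq[i]:6.2f} Hz  damping={damp[i]:6.2f} 1/s  amp={amp[i]:.3f}")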

  10. Uncertainty Quantification and Validation for RANS Turbulence Models

    Science.gov (United States)

    Oliver, Todd; Moser, Robert

    2011-11-01

    Uncertainty quantification and validation procedures for RANS turbulence models are developed and applied. The procedures used here rely on a Bayesian view of probability. In particular, the uncertainty quantification methodology requires stochastic model development, model calibration, and model comparison, all of which are pursued using tools from Bayesian statistics. Model validation is also pursued in a probabilistic framework. The ideas and processes are demonstrated on a channel flow example. Specifically, a set of RANS models--including Baldwin-Lomax, Spalart-Allmaras, k-ε, k-ω, and v²-f--and uncertainty representations are analyzed using DNS data for fully-developed channel flow. Predictions of various quantities of interest and the validity (or invalidity) of the various models for making those predictions will be examined. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  11. Validation of a national hydrological model

    Science.gov (United States)

    McMillan, H. K.; Booker, D. J.; Cattoën, C.

    2016-10-01

    Nationwide predictions of flow time-series are valuable for development of policies relating to environmental flows, calculating reliability of supply to water users, or assessing risk of floods or droughts. This breadth of model utility is possible because various hydrological signatures can be derived from simulated flow time-series. However, producing national hydrological simulations can be challenging due to strong environmental diversity across catchments and a lack of data available to aid model parameterisation. A comprehensive and consistent suite of test procedures to quantify spatial and temporal patterns in performance across various parts of the hydrograph is described and applied to quantify the performance of an uncalibrated national rainfall-runoff model of New Zealand. Flow time-series observed at 485 gauging stations were used to calculate Nash-Sutcliffe efficiency and percent bias when simulating between-site differences in daily series, between-year differences in annual series, and between-site differences in hydrological signatures. The procedures were used to assess the benefit of applying a correction to the modelled flow duration curve based on an independent statistical analysis. They were used to aid understanding of climatological, hydrological and model-based causes of differences in predictive performance by assessing multiple hypotheses that describe where and when the model was expected to perform best. As the procedures produce quantitative measures of performance, they provide an objective basis for model assessment that could be applied when comparing observed daily flow series with competing simulated flow series from any region-wide or nationwide hydrological model. Model performance varied in space and time with better scores in larger and medium-wet catchments, and in catchments with smaller seasonal variations. Surprisingly, model performance was not sensitive to aquifer fraction or rain gauge density.
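
    The two performance scores named above reduce to simple arithmetic on paired flow series; a minimal sketch with invented daily flows follows (variable names and data are assumptions).

        # Sketch of Nash-Sutcliffe efficiency and percent bias for one station.
        import numpy as np

        def nash_sutcliffe(obs, sim):
            """NSE = 1 - sum((obs-sim)^2) / sum((obs-mean(obs))^2); 1 is perfect."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def percent_bias(obs, sim):
            """PBIAS = 100 * sum(sim - obs) / sum(obs); 0 is unbiased."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 100.0 * np.sum(sim - obs) / np.sum(obs)

        obs = [12.1, 9.4, 30.2, 55.0, 21.7, 14.3]   # observed daily flows, m3/s
        sim = [10.8, 11.0, 26.5, 60.3, 19.9, 13.1]  # simulated daily flows, m3/s
        print(f"NSE = {nash_sutcliffe(obs, sim):.3f}, "
              f"PBIAS = {percent_bias(obs, sim):+.1f}%")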

  12. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper;

    2003-01-01

    Experimental modelling is an important tool for study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific hydrodynamic interaction with structures. The examples also show that numerical model development benefits from international co-operation and sharing of high quality results...

  13. Validation Experiments supporting the CryoSat-2 mission

    Science.gov (United States)

    Cullen, R.; Davidson, M.; Wingham, D.

    2009-12-01

    The primary goals of CryoSat are to derive improved estimates of the rates of change of land ice elevation and of sea ice thickness and freeboard of the Earth's land and marine ice fields. Validating such retrievals derived from a phase coherent pulse-width limited polar observing radar altimeter such as SIRAL, the primary payload of CryoSat, is not a simple task. In order to understand all the respective error co-variances it is necessary to acquire many different types of in-situ measurements (GPR, neutron probe density profiles, drilled and electromagnetically derived sea-ice thicknesses, for example) in highly inhospitable regions of the cryosphere, at the times of the year when the relevant signals can be detected. In order to correlate retrievals from CryoSat with the in-situ data, it was decided early in the CryoSat development that an airborne radar altimeter with similar functionality to SIRAL would provide the necessary link, albeit on a smaller scale, and provide pre-launch insight into expected performance. In 2001 ESA commenced the development of its own prototype radar altimeter that mimics the functionality of SIRAL, to be operated alongside an airborne laser scanner. Similar to SIRAL, but with subtle functional differences, the airborne SAR/Interferometric Radar Altimeter System (ASIRAS) has been the centrepiece instrument of a number of large scale land and sea ice field campaigns in the Arctic during spring and autumn of 2004, 2006 and 2008. Additional smaller science/test campaigns have taken place in March 2003 (Svalbard), March 2005 (Bay of Bothnia), March 2006 (Western Greenland) and April 2007 (CryoVEx 2007 in Svalbard). It is a credit to all parties that constitute the CryoSat Validation and Retrieval Team (CVRT) for the coordination, planning and acquisition of in-situ and airborne measurements and the subsequent processing and distribution of the data for analysis. CVRT has a robust infrastructure in place for validating and providing measures of…

  14. Toward a validation process for model based safety analysis

    OpenAIRE

    Adeline, Romain; Cardoso, Janette; Darfeuil, Pierre; Humbert, Sophie; Seguin, Christel

    2010-01-01

    Today, model based processes are becoming more and more widespread for the analysis of a system. However, there is no formal testing approach to ensure that the formal model is compliant with the real system. In this paper, we choose to study the AltaRica model. We present a general process to construct and validate an AltaRica formal model. The focus is on the validation phase, i.e. verifying the compliance between the model and the real system. For this, the proposed process recommends...

  15. Experimental manipulation of working memory model parameters: an exercise in construct validity.

    Science.gov (United States)

    Brown, Gregory G; Turner, Travis H; Mano, Quintino R; Bolden, Khalima; Thomas, Michael L

    2013-09-01

    As parametric cognitive models become more commonly used to measure individual differences, the construct validity of the interpretation of individual model parameters needs to be well established. The validity of the interpretation of 2 parameters of a formal model of the Continuous Recognition Memory Test (CRMT) was investigated in 2 experiments. The 1st study found that manipulating the percentage of trials on the CRMT for which degraded pseudowords were presented altered the model's stimulus encoding parameter but not the working memory displacement parameter. The 2nd experiment showed that manipulating the number of syllables forming a pseudoword altered the model's working memory displacement parameter for each syllable added to the pseudoword. Findings from both experiments supported the construct representation of the model parameters, supporting the construct validity of the model's use to interpret CRMT performance. Combining parametric models with the manipulation of factors that theory predicts are related to model parameters provides an approach to construct validation that bridges experimental and individual difference methods of studying human cognition.

  16. CFD modelling and validation of wall condensation in the presence of non-condensable gases

    Energy Technology Data Exchange (ETDEWEB)

    Zschaeck, G., E-mail: guillermo.zschaeck@ansys.com [ANSYS Germany GmbH, Staudenfeldweg 12, Otterfing 83624 (Germany); Frank, T. [ANSYS Germany GmbH, Staudenfeldweg 12, Otterfing 83624 (Germany); Burns, A.D. [ANSYS UK Ltd, 97 Milton Park, Abingdon, Oxfordshire OX14 4RY (United Kingdom)

    2014-11-15

    Highlights:
    • A wall condensation model was implemented and validated in ANSYS CFX.
    • The condensation rate is assumed to be controlled by the concentration boundary layer.
    • Validation was done using two laboratory scale experiments.
    • CFD calculations show good agreement with experimental data.
    Abstract: The aim of this paper is to present and validate a mathematical model implemented in ANSYS CFD for the simulation of wall condensation in the presence of non-condensable substances. The model employs a mass sink at isothermal walls or conjugate heat transfer (CHT) domain interfaces where condensation takes place. The model was validated using the data reported by Ambrosini et al. (2008) and Kuhn et al. (1997).
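
    A hedged sketch of the diffusion-layer (film) argument that such concentration-boundary-layer models rest on is given below; the film thickness, property values, and mass fractions are illustrative assumptions, not the ANSYS CFX implementation.

        # Stefan-flow film model for the condensation mass flux in the presence
        # of a non-condensable gas (a sketch, not the CFX mass-sink formulation).
        import math

        def condensation_flux(y_bulk, y_wall, rho=1.0, diff=2.6e-5, delta=1.0e-3):
            """Mass flux (kg/m2/s) toward the wall:
            m'' = rho * D / delta * ln((1 - y_wall) / (1 - y_bulk)),
            with y the vapour mass fraction; y_wall would follow from the
            saturation pressure at the wall temperature. delta is an assumed
            diffusion-layer thickness."""
            return rho * diff / delta * math.log((1.0 - y_wall) / (1.0 - y_bulk))

        # Steam/air mixture: 30% vapour in the bulk, 10% at the (cold) wall
        print(f"{condensation_flux(0.30, 0.10):.4e} kg/m2/s")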

  17. Group Performance Under Experienced and Inexperienced Leaders: A Validation Experiment.

    Science.gov (United States)

    Fiedler, Fred E.; Chemers, Martin M.

    This study investigated the effect of experience and training on the performance of Belgian naval officers in an experimental leadership situation. As in a previous study conducted with Belgian naval personnel, group performance under trained and experienced officers was not significantly better than performance under untrained recruits. Moreover,…

  18. Development, Validity, and Reliability of the Campus Residential Experience Survey

    Science.gov (United States)

    Sriram, Rishi; Scales, Laine; Shushok, Frank, Jr.

    2017-01-01

    The importance of living on campus is well established, but extant research that examines administrator perceptions of what comprises the best educational experience for students living on campus is generally unavailable. This study reports the development of a psychometric instrument designed to uncover underlying paradigms and attitudes of…

  19. 2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation

    Science.gov (United States)

    Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, John C.

    2009-01-01

    A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. A collaboration in this project includes work by NASA research engineers, whereas CFD validation and flow physics experimental research are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL type aircraft is focusing on geometries that depend on advanced flow control technologies that include Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent/ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.

  20. Validation of Modeling Flow Approaching Navigation Locks

    Science.gov (United States)

    2013-08-01

    [Figure captions recovered from the report: Figure 9. Tools and instrumentation, bracket attached to rail. Figure 10. Tools and instrumentation, direction vernier. Figure 11. Plan A lock approach, upstream approach.]

  1. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
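
    Sensitivity and specificity as used above reduce to simple confusion-matrix arithmetic on predicted versus surveyed presences; a minimal sketch with invented presence/absence data follows.

        # Sketch: sensitivity and specificity of presence/absence predictions.
        import numpy as np

        def sensitivity_specificity(observed, predicted):
            """observed, predicted: arrays of 0/1 (absence/presence)."""
            observed = np.asarray(observed, bool)
            predicted = np.asarray(predicted, bool)
            tp = np.sum(observed & predicted)    # correctly predicted presences
            tn = np.sum(~observed & ~predicted)  # correctly predicted absences
            fn = np.sum(observed & ~predicted)
            fp = np.sum(~observed & predicted)
            return tp / (tp + fn), tn / (tn + fp)

        obs = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]    # t2 survey occurrences (invented)
        pred = [1, 1, 0, 0, 1, 1, 0, 0, 1, 0]   # model predictions (invented)
        sens, spec = sensitivity_specificity(obs, pred)
        print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")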

  2. Validation of Model Forecasts of the Ambient Solar Wind

    Science.gov (United States)

    Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

    2009-01-01

    Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

  3. Traffic modelling validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Tongeren, R. van; Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2007-01-01

    This paper presents a microscopic traffic model for the validation of advanced driver assistance systems. This model describes single-lane traffic and is calibrated with data from a field operational test. To illustrate the use of the model, a Monte Carlo simulation of single-lane traffic scenarios

  4. First approximations in avalanche model validations using seismic information

    Science.gov (United States)

    Roig Lafon, Pere; Suriñach, Emma; Bartelt, Perry; Pérez-Guillén, Cristina; Tapia, Mar; Sovilla, Betty

    2017-04-01

    Avalanche dynamics modelling is an essential tool for snow hazard management. Scenario-based numerical modelling provides quantitative arguments for decision-making. The software tool RAMMS (WSL Institute for Snow and Avalanche Research SLF) is one such tool, often used by government authorities and geotechnical offices. As avalanche models improve, the quality of the numerical results will depend increasingly on user experience in the specification of input (e.g. release and entrainment volumes, secondary releases, snow temperature and quality). New model developments must continue to be validated using data from real phenomena, to improve performance and reliability. The avalanche group of the University of Barcelona (RISKNAT-UB) has studied the seismic signals generated by avalanches since 1994. Presently, the group manages the seismic installation at SLF's Vallée de la Sionne experimental site (VDLS). At VDLS the recorded seismic signals can be correlated with other avalanche measurement techniques, including both advanced remote sensing methods (radars, videogrammetry) and obstacle-based sensors (pressure, capacitance, optical sender-reflector barriers). This comparison between different measurement techniques allows the group to address the question of whether seismic analysis can be used alone, on additional avalanche tracks, to gain insight and validate numerical avalanche dynamics models in different terrain conditions. In this study, we aim to add the seismic data as an external record of the phenomena, able to validate RAMMS models. Seismic sensors are considerably easier and cheaper to install than other physical measuring tools, and are able to record data from the phenomena in all atmospheric conditions (e.g. bad weather, low light or freezing, which make photography and other kinds of sensors unusable). With seismic signals, we record the temporal evolution of the inner and denser parts of the avalanche. We are able to recognize the approximate position…

  5. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by application of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD is based on the observance of occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
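
    The 90/95 criterion rests on standard binomial confidence arithmetic. A sketch of the one-sided Clopper-Pearson lower bound follows; it illustrates the statistic only and is not the DOEPOD software itself.

        # One-sided lower 95% confidence bound on POD from hit/miss data
        # (Clopper-Pearson). With 29 hits in 29 trials the bound just exceeds
        # 0.90, the classic "29 of 29" demonstration point.
        from scipy.stats import beta

        def pod_lower_bound(hits, trials, confidence=0.95):
            """Lower confidence bound on the binomial success probability."""
            if hits == 0:
                return 0.0
            return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

        for hits, trials in [(29, 29), (28, 29), (45, 46)]:
            lb = pod_lower_bound(hits, trials)
            print(f"{hits}/{trials}: POD > {lb:.4f} at 95% confidence")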

  6. Why a Trade-Off? The Relationship between the External and Internal Validity of Experiments

    Directory of Open Access Journals (Sweden)

    Maria Jimenez-Buedo

    2010-10-01

    Much of the methodological discussion around experiments in economics and other social sciences is framed in terms of the notions of internal and external validity. The standard view is that internal validity and external validity stand in a relationship best described as a trade-off. However, it is also commonly held that internal validity is a prerequisite to external validity. This article addresses the problem of the compatibility of these two ideas and analyzes critically the standard arguments about the conditions under which a trade-off between internal and external validity arises. Our argument stands against common associations of internal validity and external validity with the distinction between field and laboratory experiments and assesses critically the arguments that link the artificiality of experimental settings done in the laboratory with the purported trade-off between internal and external validity. We conclude that the idea of a trade-off or tension between internal and external validity seems, upon analysis, far less cogent than its intuitive attractiveness may lead us to think at first sight.

  7. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois;

    2010-01-01

    The European STREP project Quasimodo develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  8. Validation of limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring - is a separate validation group required?

    NARCIS (Netherlands)

    Proost, J. H.

    2007-01-01

    Objective: Limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring are usually validated in a separate group of patients, according to published guidelines. The aim of this study is to evaluate the validation of LSM by comparing independent validation with cross-validation us

  9. Experimental validation of Swy-2 clay standard's PHREEQC model

    Science.gov (United States)

    Szabó, Zsuzsanna; Hegyfalvi, Csaba; Freiler, Ágnes; Udvardi, Beatrix; Kónya, Péter; Székely, Edit; Falus, György

    2017-04-01

    One of the challenges of the present century is to limit greenhouse gas emissions for the mitigation of climate change, which is possible, for example, through a transitional technology, CCS (Carbon Capture and Storage), and, among others, through an increase of the nuclear proportion in the energy mix. Clay minerals are considered to be responsible for the low permeability and sealing capacity of caprocks sealing off stored CO2, and they are also the main constituents of bentonite in high level radioactive waste disposal facilities. The understanding of clay behaviour in these deep geological environments is possible through laboratory batch experiments on well-known standards and coupled geochemical models. Such experimentally validated models are scarce, even though they allow deriving more precise long-term predictions of mineral reactions and of rock and bentonite degradation underground, therefore ensuring the safety of the above technologies and increasing their public acceptance. This ongoing work aims to create a kinetic geochemical model of the Na-montmorillonite standard Swy-2 in the widely used PHREEQC code, supported by solution and mineral composition results from batch experiments. Several four-day experiments have been carried out at a 1:35 rock:water ratio at atmospheric conditions, and with an inert or CO2 supercritical phase at 100 bar and 80 °C, relevant for the potential Hungarian CO2 reservoir complex. Solution samples have been taken during and after the experiments and their compositions were measured by ICP-OES. The treated solid phase has been analysed by XRD and ATR-FTIR and compared to in-parallel measured references (dried Swy-2). Kinetic geochemical modelling of the experimental conditions has been performed with PHREEQC version 3, using equations and kinetic rate parameters from the USGS report of Palandri and Kharaka (2004). The visualization of the experimental and the numerous modelling results has been automated with R. Experiments and models show very fast…
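
    A hedged sketch of the general rate-law form from the Palandri and Kharaka (2004) compilation, as typically embedded in such PHREEQC kinetic models, is given below; all parameter values are placeholders rather than the actual montmorillonite entries from that report.

        # Mineral dissolution rate law: sum over acid and neutral mechanisms,
        # each Arrhenius-corrected from 25 degC, scaled by the H+ activity and
        # by the departure from equilibrium (1 - omega), omega = Q/K.
        import math

        R = 8.314462618  # J/mol/K

        def dissolution_rate(T, pH, omega,
                             k25_acid=1e-12, Ea_acid=50e3, n_acid=0.5,
                             k25_neut=1e-14, Ea_neut=60e3):
            """Rate in mol/m2/s; placeholder kinetic parameters."""
            aH = 10.0 ** (-pH)
            arr = lambda Ea: math.exp(-Ea / R * (1.0 / T - 1.0 / 298.15))
            k_acid = k25_acid * arr(Ea_acid) * aH ** n_acid
            k_neut = k25_neut * arr(Ea_neut)
            return (k_acid + k_neut) * (1.0 - omega)

        # 80 degC, pH 4 (CO2-rich brine), far from equilibrium (omega ~ 0)
        print(f"{dissolution_rate(353.15, 4.0, 0.0):.3e} mol/m2/s")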

  10. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano & Primiceri (American Economic Review, forthcoming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation in these models remains to be established.

  11. Validation of Air Traffic Controller Workload Models

    Science.gov (United States)

    1979-09-01

    SAR) tapes during the data reduction phase of the project. Kentron International Limited provided the software support for the project. This included... ETABS) or to revised traffic control procedures. The models also can be used to verify productivity benefits after new configurations have been... collected and processed manually. A preliminary comparison has been made between standard NAS Stage A and ETABS operations at Miami.

  12. A Rasch scaling validation of a 'core' near-death experience.

    Science.gov (United States)

    Lange, Rense; Greyson, Bruce; Houran, James

    2004-05-01

    For those with true near-death experiences (NDEs), Greyson's (1983, 1990) NDE Scale satisfactorily fits the Rasch rating scale model, thus yielding a unidimensional measure with interval-level scaling properties. With increasing intensity, NDEs reflect peace, joy and harmony, followed by insight and mystical or religious experiences, while the most intense NDEs involve an awareness of things occurring in a different place or time. The semantics of this variable are invariant across True-NDErs' gender, current age, age at time of NDE, and latency and intensity of the NDE, thus identifying NDEs as 'core' experiences whose meaning is unaffected by external variables, regardless of variations in NDEs' intensity. Significant qualitative and quantitative differences were observed between True-NDErs and other respondent groups, mostly revolving around the differential emphasis on paranormal/mystical/religious experiences vs. standard reactions to threat. The findings further suggest that False-Positive respondents reinterpret other profound psychological states as NDEs. Accordingly, the Rasch validation of the typology proposed by Greyson (1983) also provides new insights into previous research, including the possibility of embellishment over time (as indicated by the finding of positive, as well as negative, latency effects) and the potential roles of religious affiliation and religiosity (as indicated by the qualitative differences surrounding paranormal/mystical/religious issues).
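
    The rating scale model referred to above assigns response-category probabilities from a person measure, an item location, and a set of thresholds shared across items; a minimal sketch with assumed values follows.

        # Rasch rating scale model: P(X = k) is proportional to
        # exp(sum over j <= k of (theta - delta - tau_j)), with tau_0 = 0.
        import numpy as np

        def rating_scale_probs(theta, delta, tau):
            """tau: thresholds tau_1..tau_m; returns P(X = 0..m)."""
            steps = theta - delta - np.asarray(tau)          # step logits
            cum = np.concatenate(([0.0], np.cumsum(steps)))  # category numerators
            expcum = np.exp(cum - cum.max())                 # numerically stabilised
            return expcum / expcum.sum()

        tau = [-1.2, 0.3, 1.5]        # illustrative thresholds, 4-category item
        for theta in (-1.0, 0.0, 2.0):
            probs = rating_scale_probs(theta, delta=0.5, tau=tau)
            print(f"theta={theta:+.1f}:", np.round(probs, 3))
        # Higher person measures shift probability mass toward the higher
        # (more intense) response categories, which is what makes the summed
        # scale behave as a unidimensional interval-level measure.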

  13. Assessing decentering: validation, psychometric properties, and clinical usefulness of the Experiences Questionnaire in a Spanish sample.

    Science.gov (United States)

    Soler, Joaquim; Franquesa, Alba; Feliu-Soler, Albert; Cebolla, Ausias; García-Campayo, Javier; Tejedor, Rosa; Demarzo, Marcelo; Baños, Rosa; Pascual, Juan Carlos; Portella, Maria J

    2014-11-01

    Decentering is defined as the ability to observe one's thoughts and feelings in a detached manner. The Experiences Questionnaire (EQ) is a self-report instrument that originally assessed decentering and rumination. The purpose of this study was to evaluate the psychometric properties of the Spanish version of EQ-Decentering and to explore its clinical usefulness. The 11-item EQ-Decentering subscale was translated into Spanish and its psychometric properties were examined in a sample of 921 adult individuals, 231 with psychiatric disorders and 690 without. The subsample of nonpsychiatric participants was also split according to previous meditative experience (meditative participants, n=341; nonmeditative participants, n=349). Additionally, differences among these three subgroups were explored to determine the clinical validity of the scale. Finally, EQ-Decentering was administered twice in a group of patients with borderline personality disorder, before and after a 10-week mindfulness intervention. Confirmatory factor analysis indicated acceptable model fit, sbχ(2)=243.8836 (p<.001; convergent validity: r>.46; divergent validity: r<-.35). The scale detected changes in decentering after a 10-session intervention in mindfulness (t=-4.692, p<.00001). Differences among groups were significant (F=134.8, p<.000001), with psychiatric participants showing the lowest scores compared to nonpsychiatric meditative and nonmeditative participants. The Spanish version of the EQ-Decentering is a valid and reliable instrument to assess decentering in both clinical and nonclinical samples. In addition, the findings show that EQ-Decentering seems to be an adequate outcome instrument to detect changes after mindfulness-based interventions.

  14. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia

    2017-01-01

    The validation and verification process is a fundamental step for any software like Geant4 and GeantV, which aim to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we face the problem of comparing the results obtained using simulations with what the experiments actually observe. One way to address this is to perform a consistency test. Within the Geant group, we developed a compact C++ library which will be added to the automated validation process on the Geant Validation Portal.
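
    The record does not describe the library's interface, so the following is only a hedged sketch of the kind of consistency test mentioned: a two-sample chi-square comparison of a measured and a simulated histogram. The function name and bin counts are illustrative, and SciPy stands in for the C++ implementation.

    ```python
    import numpy as np
    from scipy.stats import chi2

    def chi_square_consistency(observed, simulated):
        """Two-sample chi-square test that two binned distributions agree.

        observed, simulated: event counts per bin (same binning; unequal totals OK).
        Returns the test statistic and its p-value.
        """
        obs = np.asarray(observed, dtype=float)
        sim = np.asarray(simulated, dtype=float)
        n_obs, n_sim = obs.sum(), sim.sum()
        mask = (obs + sim) > 0                      # skip empty bins
        # Standard two-sample statistic for histograms with different totals
        stat = np.sum(
            (np.sqrt(n_sim / n_obs) * obs[mask]
             - np.sqrt(n_obs / n_sim) * sim[mask]) ** 2 / (obs[mask] + sim[mask])
        )
        dof = int(mask.sum()) - 1                   # totals constrain one degree of freedom
        return stat, chi2.sf(stat, dof)

    stat, p = chi_square_consistency([98, 205, 310, 187], [110, 190, 300, 200])
    print(f"chi2 = {stat:.2f}, p = {p:.3f}")        # large p -> no evidence of inconsistency
    ```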

  15. Progress Towards a Microgravity CFD Validation Study Using the ISS SPHERES-SLOSH Experiment

    Science.gov (United States)

    Storey, Jedediah M.; Kirk, Daniel; Marsell, Brandon (Editor); Schallhorn, Paul (Editor)

    2017-01-01

    Understanding, predicting, and controlling fluid slosh dynamics is critical to safety and improving performance of space missions when a significant percentage of the spacecraft's mass is a liquid. Computational fluid dynamics (CFD) simulations can be used to predict the dynamics of slosh, but these programs require extensive validation. Many CFD programs have been validated by slosh experiments using various fluids in earth gravity, but prior to the ISS SPHERES-Slosh experiment [1], little experimental data for long-duration, zero-gravity slosh existed. This paper presents the current status of an ongoing CFD validation study using the ISS SPHERES-Slosh experimental data.

  16. Functional state modelling approach validation for yeast and bacteria cultivations

    Science.gov (United States)

    Roeva, Olympia; Pencheva, Tania

    2014-01-01

    In this paper, the functional state modelling approach is validated for modelling of the cultivation of two different microorganisms: yeast (Saccharomyces cerevisiae) and bacteria (Escherichia coli). Based on the available experimental data for these fed-batch cultivation processes, three different functional states are distinguished, namely the primary product synthesis state, the mixed oxidative state and the secondary product synthesis state. Parameter identification procedures for the different local models are performed using genetic algorithms. The simulation results show a high degree of adequacy of the models describing these functional states for both S. cerevisiae and E. coli cultivations. Thus, the local models are validated for the cultivation of both microorganisms. This constitutes a strong structural verification of the functional state modelling theory, not only for a set of yeast cultivations but also for a bacterial cultivation. As such, the obtained results demonstrate the efficiency and efficacy of the functional state modelling approach. PMID:26740778
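
    The paper's local models and data are not reproduced here, so as a rough sketch of this style of parameter identification the snippet below fits an illustrative Monod-type growth model to hypothetical fed-batch data with SciPy's differential evolution, an evolutionary optimizer in the same family as the genetic algorithms used in the study.

    ```python
    import numpy as np
    from scipy.integrate import odeint
    from scipy.optimize import differential_evolution

    # Illustrative Monod-type growth model (not the paper's local models):
    # dX/dt = mu_max * S / (Ks + S) * X,  dS/dt = -(1/Yxs) * dX/dt
    def model(y, t, mu_max, Ks, Yxs):
        X, S = y
        S = max(S, 0.0)                 # guard against slightly negative substrate
        mu = mu_max * S / (Ks + S)
        return [mu * X, -mu * X / Yxs]

    t_data = np.linspace(0, 10, 11)     # h, hypothetical sampling times
    X_data = np.array([0.5, 0.8, 1.3, 2.0, 3.0, 4.2, 5.4, 6.2, 6.6, 6.8, 6.9])

    def cost(params):
        # Sum of squared errors between simulated and "measured" biomass
        X_sim = odeint(model, [0.5, 10.0], t_data, args=tuple(params))[:, 0]
        return np.sum((X_sim - X_data) ** 2)

    # Evolutionary search over (mu_max, Ks, Yxs) within physically plausible bounds
    result = differential_evolution(cost, bounds=[(0.01, 2.0), (0.01, 5.0), (0.1, 1.0)], seed=1)
    print(result.x, result.fun)
    ```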

  17. Functional state modelling approach validation for yeast and bacteria cultivations.

    Science.gov (United States)

    Roeva, Olympia; Pencheva, Tania

    2014-09-03

    In this paper, the functional state modelling approach is validated for modelling of the cultivation of two different microorganisms: yeast (Saccharomyces cerevisiae) and bacteria (Escherichia coli). Based on the available experimental data for these fed-batch cultivation processes, three different functional states are distinguished, namely the primary product synthesis state, the mixed oxidative state and the secondary product synthesis state. Parameter identification procedures for the different local models are performed using genetic algorithms. The simulation results show a high degree of adequacy of the models describing these functional states for both S. cerevisiae and E. coli cultivations. Thus, the local models are validated for the cultivation of both microorganisms. This constitutes a strong structural verification of the functional state modelling theory, not only for a set of yeast cultivations but also for a bacterial cultivation. As such, the obtained results demonstrate the efficiency and efficacy of the functional state modelling approach.

  18. Parameterization and Validation of an Integrated Electro-Thermal LFP Battery Model

    Science.gov (United States)

    2012-01-01

    an equivalent circuit as seen in Fig. 1. The double RC model structure is a good choice for this battery chemistry, as shown in [25]. The two RC... the average of the charge and discharge curves taken at very low current (C/20), since the LiFePO4 cell chemistry is known to yield a hysteresis effect... condition. 4 MODEL VALIDATION AND RESULTS: The electro-thermal model is implemented in Simulink to validate its performance under the UAC experiment
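
    As a hedged illustration of how a double-RC equivalent circuit produces terminal voltage from a load current, the sketch below uses hypothetical parameter values, not the paper's identified LiFePO4 parameters.

    ```python
    import numpy as np

    def simulate_2rc(current, dt, R0, R1, C1, R2, C2, ocv):
        """Terminal voltage of a double-RC equivalent circuit.

        current: load current array (A, positive = discharge)
        ocv: open-circuit voltage array (V), e.g. from an SOC-OCV lookup
        """
        v1 = v2 = 0.0                           # RC branch voltages
        a1, a2 = np.exp(-dt / (R1 * C1)), np.exp(-dt / (R2 * C2))
        v_t = np.empty_like(current, dtype=float)
        for k, i_k in enumerate(current):
            # Exact discretization of dv/dt = -v/(RC) + i/C over one step
            v1 = a1 * v1 + R1 * (1 - a1) * i_k
            v2 = a2 * v2 + R2 * (1 - a2) * i_k
            v_t[k] = ocv[k] - R0 * i_k - v1 - v2
        return v_t

    i = np.full(600, 2.0)                       # 2 A discharge for 600 s (hypothetical)
    v = simulate_2rc(i, dt=1.0, R0=0.01, R1=0.015, C1=2000.0, R2=0.03, C2=20000.0,
                     ocv=np.full(600, 3.3))
    print(v[:3], v[-1])
    ```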

  19. Experience with Aero- and Fluid-Dynamic Testing for Engineering and CFD Validation

    Science.gov (United States)

    Ross, James C.

    2016-01-01

    Ever since computations have been used to simulate aerodynamics, the need to ensure that the computations adequately represent real life has followed. Many experiments have been performed specifically for validation, and as computational methods have improved, so have the validation experiments. Validation is also a moving target because computational methods improve, requiring validation for the new aspects of flow physics that the computations aim to capture. Concurrently, new measurement techniques are being developed that can help capture more detailed flow features; pressure-sensitive paint (PSP) and particle image velocimetry (PIV) come to mind. This paper will present various wind-tunnel tests the author has been involved with and how they were used for validation of various kinds of CFD. A particular focus is the application of advanced measurement techniques to flow fields (and geometries) that had proven difficult to predict computationally. Many of these difficult flow problems arose from engineering and development problems that needed to be solved for a particular vehicle or research program. In some cases the experiments required to solve the engineering problems were refined to provide valuable CFD validation data in addition to the primary engineering data. All of these experiments have provided physical insight and validation data for a wide range of aerodynamic and acoustic phenomena for vehicles ranging from tractor-trailers to crewed spacecraft.

  20. Validation of Orthorectified Interferometric Radar Imagery and Digital Elevation Models

    Science.gov (United States)

    Smith, Charles M.

    2004-01-01

    This work was performed under NASA's Verification and Validation (V&V) Program as an independent check of data supplied by EarthWatch, Incorporated, through the Earth Science Enterprise Scientific Data Purchase (SDP) Program. This document serves as the basis of reporting results associated with validation of orthorectified interferometric radar imagery and digital elevation models (DEM). This validation covers all datasets provided under the first campaign (Central America & Virginia Beach) plus three earlier missions (Indonesia, Red River, and Denver), for a total of 13 missions.

  1. Validation of a Model of the Domino Effect?

    CERN Document Server

    Larham, Ron

    2008-01-01

    A recent paper proposing a model of the limiting speed of the domino effect is discussed with reference to its need, and the need of models in general, for validation against experimental data. It is shown that the proposed model diverges significantly from experimentally derived speed estimates over a significant range of domino spacings, using data from the existing literature and this author's own measurements. Hence, had its use had economic importance, applying the model outside its range of validity could have led to losses of one sort or another for its users.

  2. Resampling procedures to validate dendro-auxometric regression models

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available Regression analysis is widely used in several sectors of forest research. The validation of a dendro-auxometric model is a basic step in the building of the model itself. The more a model resists attempts to demonstrate its groundlessness, the more its reliability increases. In recent decades many new theories that exploit the computational speed of modern computers have been formulated. Here we show the results obtained by the application of a bootstrap resampling procedure as a validation tool.

  3. Validation of an Efficient Outdoor Sound Propagation Model Using BEM

    DEFF Research Database (Denmark)

    Quirós-Alpera, S.; Henriquez, Vicente Cutanda; Jacobsen, Finn

    2001-01-01

    An approximate, simple and practical model for prediction of outdoor sound propagation exists based on ray theory, diffraction theory and Fresnel-zone considerations [1]. This model, which can predict sound propagation over non-flat terrain, has been validated for combinations of flat ground, hills and barriers, but it still needs to be validated for configurations that involve combinations of valleys and barriers. In order to do this, a boundary element model has been implemented in MATLAB to serve as a reliable reference.

  4. Hierarchical multi-scale approach to validation and uncertainty quantification of hyper-spectral image modeling

    Science.gov (United States)

    Engel, Dave W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David L.; Thompson, Sandra E.

    2016-05-01

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  5. Hierarchical Multi-Scale Approach To Validation and Uncertainty Quantification of Hyper-Spectral Image Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Engel, David W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David; Thompson, Sandra E.

    2016-09-17

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  6. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  7. Reverse electrodialysis: A validated process model for design and optimization

    NARCIS (Netherlands)

    Veerman, J.; Saakes, M.; Metz, S. J.; Harmsen, G. J.

    2011-01-01

    Reverse electrodialysis (RED) is a technology to generate electricity using the entropy of the mixing of sea and river water. A model is made of the RED process and validated experimentally. The model is used to design and optimize the RED process. It predicts very small differences between counter-

  8. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a network

  9. Measurements for validation of high voltage underground cable modelling

    DEFF Research Database (Denmark)

    Bak, Claus Leth; Gudmundsdottir, Unnur Stella; Wiechowski, Wojciech Tomasz

    2009-01-01

    This paper discusses studies concerning cable modelling for long high voltage AC cable lines. In investigating the possibilities of using long cables instead of overhead lines, the simulation results must be trustworthy. Therefore a model validation is of great importance. This paper describes...

  10. Composing, Analyzing and Validating Software Models

    Science.gov (United States)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system, however they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  11. Validation of a terrestrial food chain model.

    Science.gov (United States)

    Travis, C C; Blaylock, B P

    1992-01-01

    An increasingly important topic in risk assessment is the estimation of human exposure to environmental pollutants through pathways other than inhalation. The Environmental Protection Agency (EPA) has recently developed a computerized methodology (EPA, 1990) to estimate indirect exposure to toxic pollutants from Municipal Waste Combustor emissions. This methodology estimates health risks from exposure to toxic pollutants from the terrestrial food chain (TFC), soil ingestion, drinking water ingestion, fish ingestion, and dermal absorption via soil and water. Of these, one of the most difficult to estimate is exposure through the food chain. This paper estimates the accuracy of the EPA methodology for estimating food chain contamination. To our knowledge, no data exist on measured concentrations of pollutants in food grown around Municipal Waste Incinerators, and few field-scale studies have been performed on the uptake of pollutants in the food chain. Therefore, to evaluate the EPA methodology, we compare actual measurements of background contaminant levels in food with estimates made using EPA's computerized methodology. Background levels of contaminants in air, water, and soil were used as input to the EPA food chain model to predict background levels of contaminants in food. These predicted values were then compared with the measured background contaminant levels. Comparisons were performed for dioxin, pentachlorophenol, polychlorinated biphenyls, benzene, benzo(a)pyrene, mercury, and lead.

  12. Verification and Validation of Requirements on the CEV Parachute Assembly System Using Design of Experiments

    Science.gov (United States)

    Schulte, Peter Z.; Moore, James W.

    2011-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) project conducts computer simulations to verify that flight performance requirements on parachute loads and terminal rate of descent are met. Design of Experiments (DoE) provides a systematic method for variation of simulation input parameters. When implemented and interpreted correctly, a DoE study of parachute simulation tools indicates values and combinations of parameters that may cause requirement limits to be violated. This paper describes one implementation of DoE that is currently being developed by CPAS, explains how DoE results can be interpreted, and presents the results of several preliminary studies. The potential uses of DoE to validate parachute simulation models and verify requirements are also explored.
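
    As a hedged sketch of the DoE idea (not the actual CPAS tooling or parameters), a full-factorial design can be enumerated and each case screened against requirement limits; every name and number below is hypothetical and the simulator is a stub.

    ```python
    import itertools

    # Hypothetical dispersion parameters and levels (not the actual CPAS inputs)
    factors = {
        "drag_coefficient": [0.8, 1.0, 1.2],
        "canopy_fill_time_s": [4.0, 6.0],
        "suspension_line_stiffness": [0.9, 1.1],
    }

    def run_simulation(case):
        """Stand-in for the parachute simulation; returns (peak_load_kN, descent_rate_mps)."""
        load = 100.0 * case["drag_coefficient"] / case["canopy_fill_time_s"] \
               * case["suspension_line_stiffness"]
        rate = 8.5 / case["drag_coefficient"]
        return load, rate

    LOAD_LIMIT_KN, RATE_LIMIT_MPS = 22.0, 10.2      # illustrative requirement limits

    # Full-factorial enumeration: every combination of factor levels is one run
    for values in itertools.product(*factors.values()):
        case = dict(zip(factors, values))
        load, rate = run_simulation(case)
        violated = load > LOAD_LIMIT_KN or rate > RATE_LIMIT_MPS
        print(case, f"load={load:.1f} kN rate={rate:.2f} m/s",
              "VIOLATION" if violated else "ok")
    ```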

  13. Validation of RNAi Silencing Efficiency Using Gene Array Data shows 18.5% Failure Rate across 429 Independent Experiments

    Directory of Open Access Journals (Sweden)

    Gyöngyi Munkácsy

    2016-01-01

    Full Text Available No independent cross-validation of success rate for studies utilizing small interfering RNA (siRNA) for gene silencing has been completed before. To assess the influence of experimental parameters like cell line, transfection technique, validation method, and type of control, these must be examined across a large set of studies. We utilized gene chip data published for siRNA experiments to assess success rate and to compare methods used in these experiments. We searched NCBI GEO for samples with whole transcriptome analysis before and after gene silencing and evaluated the efficiency for the target and off-target genes using the array-based expression data. The Wilcoxon signed-rank test was used to assess silencing efficacy, and Kruskal-Wallis tests and Spearman rank correlation were used to evaluate study parameters. Altogether 1,643 samples representing 429 experiments published in 207 studies were evaluated. The fold change (FC) of down-regulation of the target gene was above 0.7 in 18.5% and above 0.5 in 38.7% of experiments. Silencing efficiency was lowest in MCF7 and highest in SW480 cells (FC = 0.59 and FC = 0.30, respectively, P = 9.3E-06). Studies utilizing Western blot for validation performed better than those with quantitative polymerase chain reaction (qPCR) or microarray (FC = 0.43, FC = 0.47, and FC = 0.55, respectively, P = 2.8E-04). There was no correlation between type of control, transfection method, publication year, and silencing efficiency. Although gene silencing is a robust feature successfully cross-validated in the majority of experiments, efficiency remained insufficient in a significant proportion of studies. Selection of the cell line model and validation method had the highest influence on silencing proficiency.
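
    The evaluation logic can be sketched as follows, with hypothetical paired expression values standing in for the array data; the per-sample fold change and the Wilcoxon signed-rank test mirror the quantities reported above.

    ```python
    import numpy as np
    from scipy.stats import wilcoxon

    # Hypothetical paired expression values for the target gene (linear scale),
    # one pair per array sample: before vs after siRNA transfection
    before = np.array([820.0, 910.0, 760.0, 880.0, 1010.0, 950.0])
    after  = np.array([310.0, 420.0, 300.0, 390.0,  520.0, 460.0])

    fc = after / before                     # per-sample fold change of the target
    print("mean FC:", fc.mean())            # FC > 0.7 would count as a failed knockdown

    # Wilcoxon signed-rank test on the paired samples, as used to assess efficacy
    stat, p = wilcoxon(before, after)
    print(f"W = {stat}, p = {p:.4f}")
    ```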

  14. On the development and validation of QSAR models.

    Science.gov (United States)

    Gramatica, Paola

    2013-01-01

    The fundamental and more critical steps that are necessary for the development and validation of QSAR models are presented in this chapter as best practices in the field. These procedures are discussed in the context of predictive QSAR modelling that is focused on achieving models of the highest statistical quality and with external predictive power. The most important and most used statistical parameters needed to verify the real performance of QSAR models (of both linear regression and classification) are presented. Special emphasis is placed on the validation of models, both internally and externally, as well as on the need to define model applicability domains, which should be done when models are employed for the prediction of new external compounds.
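
    As one hedged example of the external-validation statistics such chapters discuss, the sketch below computes a predictive squared correlation for an external set (a Q2_ext-type parameter, here referenced to the training-set mean) together with RMSEP; the activity values are hypothetical.

    ```python
    import numpy as np

    def external_validation(y_train, y_ext, y_ext_pred):
        """Q2_ext (predictive squared correlation) and RMSEP for an external set."""
        y_train = np.asarray(y_train, float)
        y_ext = np.asarray(y_ext, float)
        y_ext_pred = np.asarray(y_ext_pred, float)
        press = np.sum((y_ext - y_ext_pred) ** 2)        # prediction error sum of squares
        ss = np.sum((y_ext - y_train.mean()) ** 2)       # vs. the training-set mean
        q2_ext = 1.0 - press / ss
        rmsep = np.sqrt(press / y_ext.size)
        return q2_ext, rmsep

    # Hypothetical activities: training set, external set, and model predictions
    q2, rmsep = external_validation(
        y_train=[5.1, 6.0, 4.8, 7.2, 6.5, 5.9],
        y_ext=[5.5, 6.8, 4.9],
        y_ext_pred=[5.2, 6.5, 5.3],
    )
    print(f"Q2_ext = {q2:.2f}, RMSEP = {rmsep:.2f}")
    ```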

  15. Predictive Model for Particle Residence Time Distributions in Riser Reactors. Part 1: Model Development and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Foust, Thomas D.; Ziegler, Jack L.; Pannala, Sreekanth; Ciesielski, Peter; Nimlos, Mark R.; Robichaud, David J.

    2017-01-16

    In this computational study, we model the mixing of biomass pyrolysis vapor with solid catalyst in circulating riser reactors, with a focus on the determination of solid catalyst residence time distributions (RTDs). A comprehensive set of 2D and 3D simulations was conducted for a pilot-scale riser using the Eulerian-Eulerian two-fluid modeling framework with and without sub-grid-scale models for the gas-solids interaction. A validation test case was also simulated and compared to experiments, showing agreement in the pressure gradient and in the RTD mean and spread. It was found that accurate RTD prediction requires the Johnson and Jackson partial-slip solids boundary condition for all models, and that a sub-grid model is useful so that ultra-high-resolution grids, which are very computationally intensive, are not required. We discovered a 2/3 scaling relation for the RTD mean and spread when comparing resolved 2D simulations to validated unresolved 3D sub-grid-scale model simulations.
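
    A minimal sketch of how an RTD mean and spread are computed from an exit tracer curve, the two quantities compared above; the gamma-shaped response is hypothetical.

    ```python
    import numpy as np

    def rtd_moments(t, c):
        """Mean residence time and spread from an exit concentration curve c(t)."""
        t, c = np.asarray(t, float), np.asarray(c, float)
        e = c / np.trapz(c, t)                      # normalize to the RTD, E(t)
        t_mean = np.trapz(t * e, t)                 # first moment
        var = np.trapz((t - t_mean) ** 2 * e, t)    # second central moment
        return t_mean, np.sqrt(var)

    t = np.linspace(0, 30, 301)                     # s, hypothetical tracer response
    c = t * np.exp(-t / 3.0)                        # gamma-shaped pulse response
    mean, spread = rtd_moments(t, c)
    print(f"mean = {mean:.2f} s, spread = {spread:.2f} s")
    ```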

  16. Docking validation resources: protein family and ligand flexibility experiments.

    Science.gov (United States)

    Mukherjee, Sudipto; Balius, Trent E; Rizzo, Robert C

    2010-11-22

    A database consisting of 780 ligand-receptor complexes, termed SB2010, has been derived from the Protein Data Bank to evaluate the accuracy of docking protocols for regenerating bound ligand conformations. The goal is to provide easily accessible community resources for development of improved procedures to aid virtual screening for ligands with a wide range of flexibilities. Three core experiments using the program DOCK, which employ rigid (RGD), fixed anchor (FAD), and flexible (FLX) protocols, were used to gauge performance by several different metrics: (1) global results, (2) ligand flexibility, (3) protein family, and (4) cross-docking. Global spectrum plots of successes and failures vs rmsd reveal well-defined inflection regions, which suggest the commonly used 2 Å criterion is a reasonable choice for defining success. Across all 780 systems, success tracks with the relative difficulty of the calculations: RGD (82.3%) > FAD (78.1%) > FLX (63.8%). In general, failures due to scoring strongly outweigh those due to sampling. Subsets of SB2010 grouped by ligand flexibility (7-or-less, 8-to-15, and 15-plus rotatable bonds) reveal that success degrades linearly for FAD and FLX protocols, in contrast to RGD, which remains constant. Despite the challenges associated with FLX anchor orientation and on-the-fly flexible growth, success rates for the 7-or-less (74.5%) and, in particular, the 8-to-15 (55.2%) subsets are encouraging. Poorer results for the very flexible 15-plus set (39.3%) indicate substantial room for improvement. Family-based success appears largely independent of ligand flexibility, suggesting a strong dependence on the binding site environment. For example, zinc-containing proteins are generally problematic, despite moderately flexible ligands. Finally, representative cross-docking examples, for carbonic anhydrase, thermolysin, and neuraminidase families, show the utility of family-based analysis for rapid identification of particularly good or bad
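
    The success metric can be sketched as a plain heavy-atom RMSD checked against the 2 Å criterion; the coordinates below are hypothetical and matched atom ordering is assumed (no symmetry correction).

    ```python
    import numpy as np

    def pose_rmsd(xyz_a, xyz_b):
        """Heavy-atom RMSD (Angstroms) between two poses with matched atom order."""
        d = np.asarray(xyz_a) - np.asarray(xyz_b)
        return np.sqrt((d ** 2).sum(axis=1).mean())

    # Hypothetical 4-atom crystal pose vs docked pose (coordinates in Angstroms)
    crystal = [[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [2.2, 1.2, 0.0], [3.0, 1.2, 1.1]]
    docked  = [[0.3, 0.1, 0.0], [1.7, 0.2, 0.1], [2.5, 1.5, 0.2], [3.4, 1.6, 1.4]]

    rmsd = pose_rmsd(crystal, docked)
    print(f"rmsd = {rmsd:.2f} A ->", "success" if rmsd <= 2.0 else "failure")
    ```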

  17. A prediction model for ocular damage - Experimental validation.

    Science.gov (United States)

    Heussner, Nico; Vagos, Márcia; Spitzer, Martin S; Stork, Wilhelm

    2015-08-01

    With the increasing number of laser applications in medicine and technology, accidental as well as intentional exposure of the human eye to laser sources has become a major concern. Therefore, a prediction model for ocular damage (PMOD) is presented within this work and validated for long-term exposure. This model is a combination of a raytracing model with a thermodynamic model of the human eye and an application which determines the thermal damage by the implementation of the Arrhenius integral. The model is based on our earlier work and is validated here against temperature measurements taken with porcine eye samples. For this validation, three different powers were used: 50 mW, 100 mW and 200 mW, with a spot size of 1.9 mm. The measurements were taken with two different sensing systems, an infrared camera and a fibre-optic probe placed within the tissue. The temperatures were measured for up to 60 s and then compared against simulations. The measured temperatures were found to be in good agreement with the values predicted by the PMOD model. To the best of our knowledge, this is the first model which is validated for both short-term and long-term irradiations in terms of temperature, and it thus demonstrates that temperatures can be accurately predicted within the thermal damage regime. Copyright © 2015 Elsevier Ltd. All rights reserved.
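
    A minimal sketch of the Arrhenius damage integral on which such models rest, with rate parameters of the order commonly quoted for retinal thermal damage and a hypothetical temperature history; this is not the PMOD implementation.

    ```python
    import numpy as np

    R_GAS = 8.314          # J/(mol K)
    A_FREQ = 3.1e99        # 1/s, frequency factor (order of literature retinal values)
    E_A = 6.28e5           # J/mol, activation energy (order of literature retinal values)

    def arrhenius_damage(t, temperature_k):
        """Damage integral Omega = int A*exp(-Ea/(R*T(t))) dt; Omega >= 1 -> damage."""
        rates = A_FREQ * np.exp(-E_A / (R_GAS * np.asarray(temperature_k)))
        return np.trapz(rates, t)

    t = np.linspace(0.0, 60.0, 6001)                   # 60 s exposure
    temp = 310.15 + 12.0 * (1.0 - np.exp(-t / 5.0))    # hypothetical heating to ~49 C
    print("Omega =", arrhenius_damage(t, temp))        # < 1 here: below damage threshold
    ```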

  18. Experiments beyond the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Perl, M.L.

    1984-09-01

    This paper is based upon lectures in which I have described and explored the ways in which experimenters can try to find answers, or at least clues toward answers, to some of the fundamental questions of elementary particle physics. All of these experimental techniques and directions have been discussed fully in other papers, for example: searches for heavy charged leptons, tests of quantum chromodynamics, searches for Higgs particles, searches for particles predicted by supersymmetric theories, searches for particles predicted by technicolor theories, searches for proton decay, searches for neutrino oscillations, monopole searches, studies of low transfer momentum hadron physics at very high energies, and elementary particle studies using cosmic rays. Each of these subjects requires several lectures by itself to do justice to the large amount of experimental work and theoretical thought which has been devoted to these subjects. My approach in these tutorial lectures is to describe general ways to experiment beyond the standard model. I will use some of the topics listed to illustrate these general ways. Also, in these lectures I present some dreams and challenges about new techniques in experimental particle physics and accelerator technology; I call these Experimental Needs. 92 references.

  19. The hypothetical world of CoMFA and model validation

    Energy Technology Data Exchange (ETDEWEB)

    Oprea, T.I. [Los Alamos National Lab., NM (United States)]

    1996-12-31

    CoMFA is a technique used to establish the three-dimensional similarity of molecular structures, in relation to a target property. Because the risk of chance correlation is high, validation is required for all CoMFA models. The following validation steps should be performed: the choice of alignment rules (superimposition and conformer criteria) has to use experimental data when available, or different (alternate) hypotheses; statistical methods (e.g., cross-validation with randomized groups) have to emphasize simplicity, robustness, predictivity and explanatory power. When several CoMFA-QSAR models on similar targets and/or structures are available, qualitative lateral validation can be applied. This meta-analysis for CoMFA models offers a broader perspective on the similarities and differences between compared biological targets, with potential applications in rational drug design (e.g., selectivity, efficacy) and environmental toxicology. Examples that focus on validation of CoMFA models include the following steroid-binding proteins: aromatase, the estrogen and the androgen receptors, a monoclonal antibody against progesterone and two steroid binding globulins.

  1. Validation Experiments for Spent-Fuel Dry-Cask In-Basket Convection

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Barton [Utah State Univ., Logan, UT (United States)]

    2016-08-16

    This work consisted of the following major efforts: (1) literature survey on validation of external natural convection; (2) design the experiment; (3) build the experiment; (4) run the experiment; (5) collect results; (6) disseminate results; and (7) perform a CFD validation study using the results. We note that while all tasks are complete, some deviations from the original plan were made. Specifically, geometrical changes in the parameter space were skipped in favor of flow condition changes, which were found to be much more practical to implement. Changing the geometry required new as-built measurements, which proved extremely costly and impractical given the time and funds available.

  2. Validation Experiments for Spent-Fuel Dry-Cask In-Basket Convection

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Barton L. [Utah State Univ., Logan, UT (United States). Dept. of Mechanical and Aerospace Engineering

    2016-08-16

    This work consisted of the following major efforts: (1) literature survey on validation of external natural convection; (2) design the experiment; (3) build the experiment; (4) run the experiment; (5) collect results; (6) disseminate results; and (7) perform a CFD validation study using the results. We note that while all tasks are complete, some deviations from the original plan were made. Specifically, geometrical changes in the parameter space were skipped in favor of flow condition changes, which were found to be much more practical to implement. Changing the geometry required new as-built measurements, which proved extremely costly and impractical given the time and funds available.

  3. The Validation of Climate Models: The Development of Essential Practice

    Science.gov (United States)

    Rood, R. B.

    2011-12-01

    It is possible from both a scientific and philosophical perspective to state that climate models cannot be validated. However, with the realization that the scientific investigation of climate change is as much a subject of politics as of science, maintaining this formal notion of "validation" has significant consequences. For example, it relegates the bulk of work of many climate scientists to an exercise of model evaluation that can be construed as ill-posed. Even within the science community this motivates criticism of climate modeling as an exercise of weak scientific practice. Stepping outside of the science community, statements that validation is impossible are used in political arguments to discredit the scientific investigation of climate, to maintain doubt about projections of climate change, and hence, to prohibit the development of public policy to regulate the emissions of greenhouse gases. With the acceptance of the impossibility of validation, scientists often state that the credibility of models can be established through an evaluation process. A robust evaluation process leads to the quantitative description of the modeling system against a standard set of measures. If this process is standardized as institutional practice, then this provides a measure of model performance from one modeling release to the next. It is argued, here, that such a robust and standardized evaluation of climate models can be structured and quantified as "validation." Arguments about the nuanced meaning of validation and evaluation are a subject about which the climate modeling community needs to develop a standard. It does injustice to a body of science-based knowledge to maintain that validation is "impossible." Rather than following such a premise, which immediately devalues the knowledge base, it is more useful to develop a systematic, standardized approach to robust, appropriate validation. This stands to represent the complexity of the Earth's climate and its

  4. ALTWAVE: Toolbox for use of satellite L2P altimeter data for wave model validation

    Science.gov (United States)

    Appendini, Christian M.; Camacho-Magaña, Víctor; Breña-Naranjo, José Agustín

    2016-03-01

    Characterizing the world's ocean physical processes, such as wave height, wind speed and sea surface elevation, is a major need for coastal and marine infrastructure planning and design, tourism activities, wave power and storm surge risk assessment, among others. Over the last decades, satellite remote sensing tools have provided quasi-global measurements of ocean altimetry by merging data from different satellite missions. Although altimeter data are widely used for model validation, practical tools for such validation remain scarce. Our purpose is to fill this gap by introducing ALTWAVE, a user-oriented MATLAB toolbox for oceanographers and coastal engineers, developed to validate wave model results against satellite-derived altimetry using visual features and statistical estimates. Our toolbox uses altimetry information from the GlobWave initiative and provides a sample application validating a one-year wave hindcast for the Gulf of Mexico. ALTWAVE thus offers an effective toolbox for validating wave model results with altimeter data, as well as guidance for non-experienced satellite data users. This article is intended for wave modelers with no experience using altimeter data to validate their results.
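
    The statistical estimates used in such model-altimeter comparisons can be sketched as follows; this is not the ALTWAVE API, and the collocated significant wave heights are hypothetical.

    ```python
    import numpy as np

    def validation_stats(model_hs, alt_hs):
        """Bias, RMSE, scatter index and correlation for collocated wave heights."""
        m, a = np.asarray(model_hs, float), np.asarray(alt_hs, float)
        bias = np.mean(m - a)
        rmse = np.sqrt(np.mean((m - a) ** 2))
        si = rmse / np.mean(a)              # scatter index, normalized by observations
        r = np.corrcoef(m, a)[0, 1]
        return bias, rmse, si, r

    # Hypothetical collocated significant wave heights (m)
    model = [1.2, 2.1, 0.9, 3.4, 2.8, 1.7]
    altim = [1.0, 2.3, 1.1, 3.1, 2.6, 1.9]
    print("bias=%.2f rmse=%.2f si=%.2f r=%.2f" % validation_stats(model, altim))
    ```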

  5. Validation of ASTECV2.1 based on the QUENCH-08 experiment

    Energy Technology Data Exchange (ETDEWEB)

    Gómez-García-Toraño, Ignacio, E-mail: ignacio.torano@kit.edu [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology (INR), Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Sánchez-Espinoza, Víctor-Hugo; Stieglitz, Robert [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology (INR), Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Stuckert, Juri [Karlsruhe Institute of Technology, Institute for Applied Materials-Applied Materials Physics (IAM-AWP), Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany); Laborde, Laurent; Belon, Sébastien [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), Nuclear Safety Division/Safety Research/Severe Accident Department, Saint Paul Lez Durance 13115 (France)

    2017-04-01

    Highlights: • ASTECV2.1 can reproduce QUENCH-08 experimental trends, e.g. hydrogen generation. • Radial temperature gradient and heat transfer through the argon gap are underestimated. • Mesh sizes below 55 mm are needed to capture the strong axial temperature gradient. • Minor variations of the external electrical resistance strongly affect bundle heat-up. • Modelling of a bypass and inclusion of currents partially overcome discrepancies. - Abstract: The Fukushima accidents have shown that further improvements of Severe Accident Management Guidelines (SAMGs) are still necessary. Hence, the enhancement of severe accident codes and their validation based on integral experiments is pursued worldwide. In particular, the capabilities of the European integral severe accident code ASTECV2.1 are being extended within the CESAM project through the improvement of physical models, code numerics and an extensive code validation. Among the different strategies encompassed in the plant SAMGs, one of the most important ones to prevent core damage is the injection of water into the overheated core (reflooding). However, under certain conditions, reflooding may trigger a sharp hydrogen generation that may jeopardize the containment. Within this work, ASTECV2.1 models describing the early in-vessel phase of the severe accident and its termination by core reflooding are validated against data from the QUENCH test facility. The QUENCH-08 test, involving the injection of 15 g/s (about 0.6 g/s per rod) of saturated steam at a bundle temperature of 2073 K, has been selected for this comparison. Results show that ASTECV2.1 is able to reproduce the experimental temperatures and oxide thicknesses at representative bundle locations. The predicted total hydrogen generation (76 g) is similar to the experimental value (84 g). In addition, the choices of an axial mesh size below 55 mm and of an external electrical resistance of 7 mΩ per rod have been justified with parametric analyses. Finally, new

  6. The turbulent viscosity models and their experimental validation; Les modeles de viscosite turbulente et leur validation experimentale

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This workshop on turbulent viscosity models and on their experimental validation was organized by the 'convection' section of the French Society of Thermal Engineers. Of the 9 papers presented during this workshop, 8 deal with the modeling of turbulent flows inside combustion chambers, turbomachinery, or other energy-related applications, and have been selected for ETDE. (J.S.)

  7. Nonequilibrium stage modelling of dividing wall columns and experimental validation

    Science.gov (United States)

    Hiller, Christoph; Buck, Christina; Ehlers, Christoph; Fieg, Georg

    2010-11-01

    Dealing with complex process units like dividing wall columns shifts the focus to the determination of suitable modelling approaches. For this purpose a nonequilibrium stage model is developed. Successful validation is achieved by an experimental investigation of fatty alcohol mixtures under vacuum conditions at pilot scale. The aim is the recovery of high-purity products. The proposed model predicts the product qualities and temperature profiles very well.

  8. Human surrogate models of neuropathic pain: validity and limitations.

    Science.gov (United States)

    Binder, Andreas

    2016-02-01

    Human surrogate models of neuropathic pain in healthy subjects are used to study symptoms, signs, and the hypothesized underlying mechanisms. Although different models are available and different spontaneous and evoked symptoms and signs are inducible, two key questions need to be answered: are human surrogate models conceptually valid, i.e., do they share the sensory phenotype of neuropathic pain states, and are they sufficiently reliable to allow consistent translational research?

  9. Development and validation of a cisplatin dose-ototoxicity model.

    Science.gov (United States)

    Dille, Marilyn F; Wilmington, Debra; McMillan, Garnett P; Helt, Wendy; Fausti, Stephen A; Konrad-Martin, Dawn

    2012-01-01

    Cisplatin is effective in the treatment of several cancers but is a known ototoxin resulting in shifts to hearing sensitivity in up to 50-60% of patients. Cisplatin-induced hearing shifts tend to occur first within an octave of a patient's high-frequency hearing limit, termed the sensitive range for ototoxicity (SRO), and progress to lower frequencies. While it is currently not possible to know which patients will experience ototoxicity without testing their hearing directly, monitoring the SRO provides an early indication of damage. A tool to help forecast susceptibility to ototoxin-induced changes in the SRO in advance of each chemotherapy treatment visit may prove useful for ototoxicity monitoring efforts, patient counseling, and therapeutic planning. This project was designed to (1) establish pretreatment risk curves that quantify the probability that a new patient will suffer hearing loss within the SRO during treatment with cisplatin and (2) evaluate the accuracy of these predictions in an independent sample of Veterans receiving cisplatin for the treatment of cancer. Two study samples were used. The Developmental sample contained 23 subjects while the Validation sample consisted of 12 subjects. Risk curve predictions for SRO threshold shifts following cisplatin exposure were developed using a Developmental sample comprising data from a total of 155 treatment visits obtained in 45 ears of 23 Veterans. Pure-tone thresholds were obtained within each subject's SRO at each treatment visit and compared with baseline measures. The risk of incurring an SRO shift was statistically modeled as a function of factors related to chemotherapy treatment (cisplatin dose, radiation treatment, doublet medication) and patient status (age, pre-exposure hearing, cancer location and stage). The model was reduced so that only statistically significant variables were included. Receiver-operating characteristic (ROC) curve analyses were then used to determine the accuracy of the
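
    As a hedged sketch of the ROC analysis step, predicted risks can be scored against observed outcomes with scikit-learn; the risks and outcomes below are hypothetical, not the study's data.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    # Hypothetical predicted risks of an SRO shift and observed outcomes (1 = shift)
    risk  = np.array([0.15, 0.42, 0.78, 0.33, 0.91, 0.08, 0.66, 0.72])
    shift = np.array([0,    0,    1,    0,    1,    0,    1,    0])

    auc = roc_auc_score(shift, risk)                  # discrimination accuracy
    fpr, tpr, thresholds = roc_curve(shift, risk)
    print(f"AUC = {auc:.2f}")
    for f, t, th in zip(fpr, tpr, thresholds):
        print(f"threshold {th:.2f}: sensitivity {t:.2f}, 1-specificity {f:.2f}")
    ```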

  10. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate them is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling indicates the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and applied to a wider range of applications than traditional simulation. But a key challenge of ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  11. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate them is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling indicates the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and applied to a wider range of applications than traditional simulation. But a key challenge of ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  12. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    Full Text Available An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research presented in this paper is modelling legislation by capturing domain knowledge of legislation and specifying it in a generic way by using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation enable us to understand the system better, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus pragmatic features and attributes can be determined that could be relevant for evaluation purposes of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of some feature for a specific model.

  13. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    Science.gov (United States)

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting.

  14. Validation of spectral gas radiation models under oxyfuel conditions

    Energy Technology Data Exchange (ETDEWEB)

    Becher, Johann Valentin

    2013-05-15

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard computational-fluid-dynamics (CFD) spectral gas radiation models for air combustion are therefore out of their validity range in oxyfuel combustion. This thesis provides a common spectral basis for the validation of new spectral models. A literature review about fundamental gas radiation theory, spectral modeling and experimental methods provides the reader with a basic understanding of the topic. In the first results section, this thesis validates detailed spectral models with high resolution spectral measurements in a gas cell with the aim of recommending one model as the best benchmark model. In the second results section, spectral measurements from a turbulent natural gas flame - as an example for a technical combustion process - are compared to simulated spectra based on measured gas atmospheres. The third results section compares simplified spectral models to the benchmark model recommended in the first results section and gives a ranking of the proposed models based on their accuracy. A concluding section gives recommendations for the selection and further development of simplified spectral radiation models. Gas cell transmissivity spectra in the spectral range of 2.4-5.4 µm of water vapor and carbon dioxide in the temperature range from 727 °C to 1500 °C and at different concentrations were compared in the first results section at a nominal resolution of 32 cm⁻¹ to line-by-line models from different databases, two statistical-narrow-band models and the exponential-wide-band model. The two statistical-narrow-band models EM2C and RADCAL showed good agreement with a maximal band transmissivity deviation of 3%. The exponential-wide-band model showed a deviation of 6%. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was therefore recommended as a reference model for the

  15. Experimental Validation of a Mathematical Model for Seabed Liquefaction Under Waves

    DEFF Research Database (Denmark)

    Sumer, B. Mutlu; Kirca, Özgür; Fredsøe, Jørgen

    2012-01-01

    This paper summarizes the results of an experimental study directed towards the validation of a mathematical model for the buildup of pore water pressure and resulting liquefaction of marine soils under progressive waves. Experiments were conducted under controlled conditions with silt (d50 = 0.070 mm) in a wave flume with a soil pit. Waves with wave heights in the range of 7.7-18 cm, a 55-cm water depth and a 1.6-s wave period enabled us to study pore water pressure buildup in both the liquefaction and no-liquefaction regimes. The experimental data were used to validate the model. A numerical example

  16. Experimental validation of a mathematical model for seabed liquefaction in waves

    DEFF Research Database (Denmark)

    Sumer, B. Mutlu; Kirca, Özgür; Fredsøe, Jørgen

    2011-01-01

    This paper summarizes the results of an experimental study directed towards the validation of a mathematical model for the buildup of pore water pressure and resulting liquefaction of marine soils under progressive waves. Experiments were conducted under controlled conditions with silt (d50 = 0.070 mm) in a wave flume with a soil pit. Waves with wave heights in the range 7.7-18 cm, a water depth of 55 cm and a wave period of 1.6 s enabled us to study pore water pressure buildup in both the liquefaction and no-liquefaction regimes. The experimental data were used to validate the model. A numerical

  17. Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites

    Science.gov (United States)

    Turner, Travis L.

    2001-01-01

    This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.

  18. System Modeling, Validation, and Design of Shape Controllers for NSTX

    Science.gov (United States)

    Walker, M. L.; Humphreys, D. A.; Eidietis, N. W.; Leuer, J. A.; Welander, A. S.; Kolemen, E.

    2011-10-01

    Modeling of the linearized control response of plasma shape and position has become fairly routine in the last several years. However, such response models rely on the input of accurate values of model parameters such as conductor and diagnostic sensor geometry and conductor resistivity or resistance. Confidence in use of such a model therefore requires that some effort be spent in validating that the model has been correctly constructed. We describe the process of constructing and validating a response model for NSTX plasma shape and position control, and subsequent use of that model for the development of shape and position controllers. The model development, validation, and control design processes are all integrated within a Matlab-based toolset known as TokSys. The control design method described emphasizes use of so-called decoupling control, in which combinations of coil current modifications are designed to modify only one control parameter at a time, without perturbing any other control parameter values. Work supported by US DOE under DE-FG02-99ER54522 and DE-AC02-09CH11466.
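
    The decoupling idea can be sketched with the pseudo-inverse of a linearized response matrix: each computed coil-current combination moves one shape parameter while leaving the others unchanged. The matrix below is hypothetical, not the NSTX response model.

    ```python
    import numpy as np

    # Hypothetical linearized response matrix M: each column is the change in the
    # shape parameters (e.g., gap sizes) per ampere of one poloidal-field coil
    M = np.array([[0.8, 0.2, 0.1],
                  [0.1, 0.9, 0.3],
                  [0.0, 0.2, 0.7]])      # (3 shape parameters) x (3 coils)

    # Decoupling: coil-current combination that moves only parameter k
    M_pinv = np.linalg.pinv(M)
    for k in range(3):
        e_k = np.eye(3)[k]               # unit change in parameter k only
        dI = M_pinv @ e_k                # required coil-current combination
        print(f"parameter {k}: dI = {np.round(dI, 3)}, response = {np.round(M @ dI, 3)}")
    ```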

  19. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    Science.gov (United States)

    Apostolakis, J.; Asai, M.; Bagulya, A.; Brown, J. M. C.; Burkhardt, H.; Chikuma, N.; Cortes-Giraldo, M. A.; Elles, S.; Grichine, V.; Guatelli, S.; Incerti, S.; Ivanchenko, V. N.; Jacquemier, J.; Kadri, O.; Maire, M.; Pandola, L.; Sawkey, D.; Toshito, T.; Urban, L.; Yamashita, T.

    2015-12-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  1. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    CERN Document Server

    Apostolakis, J; Bagulya, A; Brown, J M C; Burkhardt, H; Chikuma, N; Cortes-Giraldo, M A; Elles, S; Grichine, V; Guatelli, S; Incerti, S; Ivanchenko, V N; Jacquemier, J; Kadri, O; Maire, M; Pandola, L; Sawkey, D; Toshito, T; Urban, L; Yamashita, T

    2015-01-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  2. Cross-validation model assessment for modular networks

    CERN Document Server

    Kawamoto, Tatsuro

    2016-01-01

    Model assessment of the stochastic block model is a crucial step in identification of modular structures in networks. Although this has typically been done according to the principle that a parsimonious model with a large marginal likelihood or a short description length should be selected, another principle is that a model with a small prediction error should be selected. We show that the leave-one-out cross-validation estimate of the prediction error can be efficiently obtained using belief propagation for sparse networks. Furthermore, the relations among the objectives for model assessment enable us to determine the exact cause of overfitting.
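
    For illustration only (this sketch is not from the record): given a fitted two-group stochastic block model, a leave-one-out-style predictive score over dyads can be computed from the estimated block connection probabilities. Here a crude spectral bisection of the networkx karate-club graph stands in for the paper's belief-propagation fit, and per-dyad refitting is skipped on the sparse-network argument.

    import numpy as np
    import networkx as nx

    # Crude two-block fit: spectral bisection via the Fiedler vector
    # (a stand-in for the belief-propagation inference in the paper).
    G = nx.karate_club_graph()
    A = nx.to_numpy_array(G)
    n = A.shape[0]
    Lap = np.diag(A.sum(1)) - A
    vals, vecs = np.linalg.eigh(Lap)
    g = (vecs[:, 1] > 0).astype(int)

    # Block connection probabilities estimated from the adjacency matrix.
    p = np.zeros((2, 2))
    for r in range(2):
        for s in range(2):
            mask = np.outer(g == r, g == s)
            np.fill_diagonal(mask, False)
            p[r, s] = A[mask].mean()

    # Leave-one-out log predictive score over all dyads; refitting per
    # held-out dyad is skipped, since one dyad barely moves the estimates
    # in a sparse network.
    score = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            prob = p[g[i], g[j]]
            score += np.log(prob if A[i, j] else 1.0 - prob)
    print("mean held-out log-likelihood per dyad:", score / (n * (n - 1) / 2))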

  3. Model Validation for Shipboard Power Cables Using Scattering Parameters

    Institute of Scientific and Technical Information of China (English)

    Lukas Graber; Diomar Infante; Michael Steurer; William W. Brey

    2011-01-01

    Careful analysis of transients in shipboard power systems is important to achieve long lifetimes of the components in future all-electric ships. In order to accomplish results with high accuracy, it is recommended to validate cable models, as they have significant influence on the amplitude and frequency spectrum of voltage transients. The authors propose comparison of model and measurement using scattering parameters, which can be easily obtained from measurement and simulation and deliver broadband information about the accuracy of the model. The measurement can be performed using a vector network analyzer. The process to extract scattering parameters from simulation models is explained in detail. Three different simulation models of a 5 kV XLPE power cable have been validated. The chosen approach delivers an efficient tool to quickly estimate the quality of a model.
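
    As a rough sketch of the comparison step (the RLGC line constants below are invented placeholders, not the 5 kV XLPE cable data): the ABCD matrix of a uniform lossy line converts directly to the S-parameters a vector network analyzer measures.

    import numpy as np

    # Hypothetical per-metre line constants for a 10 m cable section.
    f = np.logspace(4, 8, 400)            # 10 kHz .. 100 MHz
    w = 2 * np.pi * f
    R, L, Gc, C, length = 0.1, 250e-9, 1e-9, 100e-12, 10.0

    gamma = np.sqrt((R + 1j * w * L) * (Gc + 1j * w * C))   # propagation constant
    z0 = np.sqrt((R + 1j * w * L) / (Gc + 1j * w * C))      # characteristic impedance

    # ABCD (chain) matrix of the uniform line.
    A = np.cosh(gamma * length)
    B = z0 * np.sinh(gamma * length)
    Cm = np.sinh(gamma * length) / z0
    D = A

    # Conversion to S-parameters in a 50-ohm reference system; S21 uses
    # AD - BC = 1 for a reciprocal network.
    zref = 50.0
    den = A + B / zref + Cm * zref + D
    s11 = (A + B / zref - Cm * zref - D) / den
    s21 = 2.0 / den

    # A measured S-parameter trace from the VNA would be overlaid here.
    print("|S21| at 1 MHz:", abs(s21[np.argmin(abs(f - 1e6))]))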

  4. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include (a) a relatively simple model, (b) the PD6605 recommended model and (c) a more accurate model of somewhat greater complexity. Highlights: • The paper discusses the validation of creep rupture models derived from statistical analysis. • It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. • The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. • The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).
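
    A minimal numeric sketch of that kind of parametric check, with an invented data set standing in for the 12% Cr steel data: a candidate model is compared to the data both in log rupture life and in the gradient of log life with stress, where lack of fit tends to be more visible.

    import numpy as np

    # Synthetic "rupture data": log life falls with stress, plus scatter.
    rng = np.random.default_rng(2)
    stress = np.linspace(80.0, 240.0, 25)                  # MPa
    log_t_data = 12.0 - 0.035 * stress + rng.normal(0, 0.05, 25)

    # Candidate model: a quadratic in stress fitted to the data.
    coef = np.polyfit(stress, log_t_data, 2)
    model = np.polyval(coef, stress)

    # Compare both the values and the rupture gradient d(log t)/d(stress).
    resid = log_t_data - model
    grad_data = np.gradient(log_t_data, stress)
    grad_model = np.polyval(np.polyder(coef), stress)
    print(f"max |residual| in log t: {np.abs(resid).max():.3f}")
    print(f"max gradient mismatch: {np.abs(grad_data - grad_model).max():.4f}")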

  5. Modeling the Experience of Emotion

    OpenAIRE

    Broekens, Joost

    2009-01-01

    Affective computing has proven to be a viable field of research comprised of a large number of multidisciplinary researchers resulting in work that is widely published. The majority of this work consists of computational models of emotion recognition, computational modeling of causal factors of emotion and emotion expression through rendered and robotic faces. A smaller part is concerned with modeling the effects of emotion, formal modeling of cognitive appraisal theory and models of emergent...

  6. Integral Reactor Containment Condensation Model and Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Qiao [Oregon State Univ., Corvallis, OR (United States); Corradini, Michael [Univ. of Wisconsin, Madison, WI (United States)

    2016-05-02

    This NEUP funded project, NEUP 12-3630, is for experimental, numerical and analytical studies on high-pressure steam condensation phenomena in a steel containment vessel connected to a water cooling tank, carried out at Oregon State University (OrSU) and the University of Wisconsin at Madison (UW-Madison). Over the three-year investigation, following the original proposal, the planned tasks were completed: (1) Performed a scaling study for the full pressure test facility applicable to the reference design for the condensation heat transfer process during design basis accidents (DBAs), modified the existing test facility to route the steady-state secondary steam flow into the high pressure containment for controllable condensation tests, and extended the operations at negative gage pressure conditions (OrSU). (2) Conducted a series of DBA and quasi-steady experiments using the full pressure test facility to provide a reliable high pressure condensation database (OrSU). (3) Analyzed experimental data and evaluated the condensation model for the experimental conditions, and predicted the prototypic containment performance under accident conditions (UW-Madison). A film flow model was developed for the scaling analysis, and the results suggest that the 1/3 scaled test facility covers a large portion of the laminar film flow regime, leading to a lower average heat transfer coefficient compared with the prototypic value. Although this is conservative in reactor safety analysis, the significant reduction of the heat transfer coefficient (50%) could underestimate the prototypic condensation heat transfer rate, resulting in inaccurate prediction of the decay heat removal capability. Further investigation is thus needed to quantify the scaling distortion for safety analysis code validation. Experimental investigations were performed in the existing MASLWR test facility at OrSU with minor modifications. A total of 13 containment condensation tests were conducted over a range of containment pressures.

  7. Validation Study of Unnotched Charpy and Taylor-Anvil Impact Experiments using Kayenta

    Energy Technology Data Exchange (ETDEWEB)

    Kamojjala, Krishna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Lacy, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chu, Henry S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Brannon, Rebecca [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

    Validation of a single computational model with multiple available strain-to-failure fracture theories is presented through experimental tests and numerical simulations of the standardized unnotched Charpy and Taylor-anvil impact tests, both run using the same material model (Kayenta). Unnotched Charpy tests are performed on rolled homogeneous armor steel. The fracture patterns using Kayenta’s various failure options that include aleatory uncertainty and scale effects are compared against the experiments. Other quantities of interest include the average value of the absorbed energy and bend angle of the specimen. Taylor-anvil impact tests are performed on Ti6Al4V titanium alloy. The impact speeds of the specimen are 321 m/s and 393 m/s. The goal of the numerical work is to reproduce the damage patterns observed in the laboratory. For the numerical study, the Johnson-Cook failure model is used as the ductile fracture criterion, and aleatory uncertainty is applied to rate-dependence parameters to explore its effect on the fracture patterns.

  8. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  9. Technical Note: Calibration and validation of geophysical observation models

    NARCIS (Netherlands)

    Salama, M.S.; van der Velde, R.; van der Woerd, H.J.; Kromkamp, J.C.; Philippart, C.J.M.; Joseph, A.T.; O'Neill, P.E.; Lang, R.H.; Gish, T.; Werdell, P.J.; Su, Z.

    2012-01-01

    We present a method to calibrate and validate observational models that interrelate remotely sensed energy fluxes to geophysical variables of land and water surfaces. Coincident sets of remote sensing observations of visible and microwave radiations and geophysical data are assembled and subdivided into calibration and validation subsets...

  10. Empirical validation data sets for double skin facade models

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During recent years application of double skin facades (DSF) has greatly increased. However, successful application depends heavily on reliable and validated models for simulation of the DSF performance, and this in turn requires access to high quality experimental data. Three sets of accurate empirical data...

  11. Validation of Models : Statistical Techniques and Data Availability

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1999-01-01

    This paper shows which statistical techniques can be used to validate simulation models, depending on which real-life data are available. Concerning this availability, three situations are distinguished: (i) no data, (ii) only output data, and (iii) both input and output data. In case (i) - no real data...
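
    As a hedged illustration of one technique from this catalogue (all numbers invented): in situation (iii) the simulation can be driven by the real inputs, and the paired differences between real and simulated outputs tested for zero mean.

    import numpy as np
    from scipy import stats

    # Stand-in data: observed outputs and trace-driven simulated outputs.
    rng = np.random.default_rng(1)
    real = rng.normal(10.0, 1.0, 25)
    sim = real + rng.normal(0.2, 0.5, 25)

    # Paired t-test on the differences; a small p-value would reject the
    # hypothesis that the model is unbiased.
    d = sim - real
    t, pval = stats.ttest_1samp(d, 0.0)
    print(f"mean difference {d.mean():.3f}, t = {t:.2f}, p = {pval:.3f}")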

  12. Model validation studies of solar systems, Phase III. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lantz, L.J.; Winn, C.B.

    1978-12-01

    Results obtained from a validation study of the TRNSYS, SIMSHAC, and SOLCOST solar system simulation and design are presented. Also included are comparisons between the FCHART and SOLCOST solar system design programs and some changes that were made to the SOLCOST program. Finally, results obtained from the analysis of several solar radiation models are presented. Separate abstracts were prepared for ten papers.

  13. Improving Perovskite Solar Cells: Insights From a Validated Device Model

    NARCIS (Netherlands)

    Sherkar, Tejas S.; Momblona, Cristina; Gil-Escrig, Lidon; Bolink, Henk J.; Koster, L. Jan Anton

    2017-01-01

    To improve the efficiency of existing perovskite solar cells (PSCs), a detailed understanding of the underlying device physics during their operation is essential. Here, a device model has been developed and validated that describes the operation of PSCs and quantitatively explains the role of contacts...

  14. ID Model Construction and Validation: A Multiple Intelligences Case

    Science.gov (United States)

    Tracey, Monica W.; Richey, Rita C.

    2007-01-01

    This is a report of a developmental research study that aimed to construct and validate an instructional design (ID) model that incorporates the theory and practice of multiple intelligences (MI). The study consisted of three phases. In phase one, the theoretical foundations of multiple intelligences and ID were examined to guide the development…

  15. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence in the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach.

  16. Bibliometric Modeling Processes and the Empirical Validity of Lotka's Law.

    Science.gov (United States)

    Nicholls, Paul Travis

    1989-01-01

    Examines the elements involved in fitting a bibliometric model to empirical data, proposes a consistent methodology for applying Lotka's law, and presents the results of an empirical test of the methodology. The results are discussed in terms of the validity of Lotka's law and the suitability of the proposed methodology. (49 references) (CLB)

  18. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Calderer, Antoni [Univ. of Minnesota, Minneapolis, MN (United States); Yang, Xiaolei [Stony Brook Univ., NY (United States); Angelidis, Dionysios [Univ. of Minnesota, Minneapolis, MN (United States); Feist, Chris [Univ. of Minnesota, Minneapolis, MN (United States); Guala, Michele [Univ. of Minnesota, Minneapolis, MN (United States); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guo, Xin [Univ. of Minnesota, Minneapolis, MN (United States); Boomsma, Aaron [Univ. of Minnesota, Minneapolis, MN (United States); Shen, Lian [Univ. of Minnesota, Minneapolis, MN (United States); Sotiropoulos, Fotis [Stony Brook Univ., NY (United States)

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  19. Validating firn compaction model with remote sensing data

    DEFF Research Database (Denmark)

    Simonsen, S. B.; Stenseng, Lars; Sørensen, Louise Sandberg

    A comprehensive understanding of firn processes is of utmost importance, when estimating present and future changes of the Greenland Ice Sheet. Especially, when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction...... models have been shown to be a key component. Now, remote sensing data can also be used to validate the firn models. Radar penetrating the upper part of the firn column in the interior part of Greenland shows a clear layering. The observed layers from the radar data can be used as an in-situ validation...... correction relative to the changes in the elevation of the surface observed with remote sensing altimetry? What model time resolution is necessary to resolve the observed layering? What model refinements are necessary to give better estimates of the surface mass balance of the Greenland ice sheet from...

  20. Validation of a Model for Ice Formation around Finned Tubes

    OpenAIRE

    Kamal A. R. Ismai; Fatima A. M. Lino

    2016-01-01

    Although phase change materials are an attractive option for thermal storage applications, their main drawback is a slow thermal response during charging and discharging, due to their low thermal conductivity. The present study validates a model developed by the authors some years ago for radial fins as a method to improve the thermal performance of PCM in a horizontal storage system. The developed model for the radial finned tube is based on pure conduction and the enthalpy approach, and was discretized...

  1. Toward metrics and model validation in web-site QEM

    OpenAIRE

    Olsina Santos, Luis Antonio; Pons, Claudia; Rossi, Gustavo Héctor

    2000-01-01

    In this work, a conceptual framework and the associated strategies for metrics and model validation are analyzed regarding website measurement and evaluation. In particular, we have conducted three case studies in different Web domains in order to evaluate and compare the quality of sites. To that end, the quantitative, model-based methodology called Web-site QEM (Quality Evaluation Methodology) was utilized. In the assessment process of sites, definition of attributes and measurements...

  2. Validating firn compaction model with remote sensing data

    OpenAIRE

    2011-01-01

    A comprehensive understanding of firn processes is of utmost importance, when estimating present and future changes of the Greenland Ice Sheet. Especially, when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction models have been shown to be a key component. Now, remote sensing data can also be used to validate the firn models. Radar penetrating the upper part of the firn column in the interior part of Greenland ...

  3. Dynamically Scaled Model Experiment of a Mooring Cable

    Directory of Open Access Journals (Sweden)

    Lars Bergdahl

    2016-01-01

    The dynamic response of mooring cables for marine structures is scale-dependent, and perfect dynamic similitude between full-scale prototypes and small-scale physical model tests is difficult to achieve. The best possible scaling is here sought by means of a specific set of dimensionless parameters, and the model accuracy is also evaluated by two alternative sets of dimensionless parameters. A special feature of the presented experiment is that a chain was scaled to have the correct propagation celerity for longitudinal elastic waves, thus providing perfect geometrical and dynamic scaling in vacuum, which is unique. The scaling error due to the incorrect Reynolds number seemed to be of minor importance. The 33 m experimental chain could then be considered a scaled 76 mm stud chain with a length of 1240 m, i.e., at the length scale of 1:37.6. Due to the correct elastic scale, the physical model was able to reproduce the effect of snatch loads giving rise to tensional shock waves propagating along the cable. The results from the experiment were used to validate the newly developed cable-dynamics code, MooDy, which utilises a discontinuous Galerkin FEM formulation. The validation of MooDy proved to be successful for the presented experiments. The experimental data is made available here for validation of other numerical codes by publishing digitised time series of two of the experiments.
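
    For orientation, a small sketch of the standard Froude-type similitude relations implied by the 1:37.6 scale; the relations are textbook results, not reproduced from the paper.

    # Froude scaling: with length scale lam, velocities and times scale by
    # sqrt(lam) and forces by lam**3 (same fluid density in model and
    # prototype). Elastic-wave celerity also scales by sqrt(lam), which is
    # why the chain's longitudinal stiffness had to be scaled deliberately.
    lam = 1.0 / 37.6

    prototype = {"length_m": 1240.0, "bar_diameter_m": 0.076}
    model = {k: v * lam for k, v in prototype.items()}

    print(f"model length: {model['length_m']:.1f} m")               # ~33 m
    print(f"model bar diameter: {model['bar_diameter_m'] * 1e3:.1f} mm")
    print(f"time scale: {lam ** 0.5:.3f}, force scale: {lam ** 3:.2e}")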

  4. Validation of models with constant bias: an applied approach

    Directory of Open Access Journals (Sweden)

    Salvador Medina-Peralta

    2014-06-01

    Objective. This paper presents extensions to the statistical validation method based on the procedure of Freese for the case where a model shows constant bias (CB) in its predictions, and illustrates the method with data from a new mechanistic model that predicts weight gain in cattle. Materials and methods. The extensions were the hypothesis tests and maximum anticipated error for the alternative approach, and the confidence interval for a quantile of the distribution of errors. Results. The model evaluated showed CB; once the CB is removed, and with a confidence level of 95%, the magnitude of the error does not exceed 0.575 kg. Therefore, the validated model can be used to predict the daily weight gain of cattle, although it will require an adjustment in its structure based on the presence of CB to increase the accuracy of its forecasts. Conclusions. The confidence interval for the 1-α quantile of the distribution of errors after correcting the constant bias makes it possible to determine the upper limit for the magnitude of the prediction error and to use it to evaluate the evolution of the model in forecasting the system. The confidence interval approach to validating a model is more informative than the hypothesis tests for the same purpose.
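
    A minimal sketch of a Freese-type, bias-corrected error bound of the kind described, under the usual normality assumption and with invented numbers rather than the cattle data.

    import numpy as np
    from scipy import stats

    # Stand-in data: observed and predicted daily weight gains, kg.
    rng = np.random.default_rng(7)
    observed = rng.normal(1.0, 0.12, 30)
    predicted = observed + 0.08 + rng.normal(0, 0.05, 30)

    e = predicted - observed
    bias = e.mean()                       # constant bias estimate
    d = e - bias                          # bias-corrected errors
    n, alpha, tau = len(d), 0.05, stats.norm.ppf(0.975)

    # Maximum anticipated error: sigma is bounded from above using the
    # chi-square distribution of sum(d^2)/sigma^2, then scaled to a 95%
    # error magnitude.
    sigma_up = np.sqrt(np.sum(d**2) / stats.chi2.ppf(alpha, n - 1))
    print(f"bias = {bias:.3f} kg, |error| <= {tau * sigma_up:.3f} kg (95%)")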

  5. Experimentally validated finite element model of electrocaloric multilayer ceramic structures

    Energy Technology Data Exchange (ETDEWEB)

    Smith, N. A. S., E-mail: nadia.smith@npl.co.uk; Correia, T. M., E-mail: tatiana.correia@npl.co.uk [National Physical Laboratory, Hampton Road, TW11 0LW Middlesex (United Kingdom); Rokosz, M. K., E-mail: maciej.rokosz@npl.co.uk [National Physical Laboratory, Hampton Road, TW11 0LW Middlesex (United Kingdom); Department of Materials, Imperial College London, London SW7 2AZ (United Kingdom)

    2014-07-28

    A novel finite element model to simulate the electrocaloric response of a multilayer ceramic capacitor (MLCC) under real environment and operational conditions has been developed. The two-dimensional transient conductive heat transfer model presented includes the electrocaloric effect as a source term, as well as accounting for radiative and convective effects. The model has been validated with experimental data obtained from the direct imaging of MLCC transient temperature variation under application of an electric field. The good agreement between simulated and experimental data, suggests that the novel experimental direct measurement methodology and the finite element model could be used to support the design of optimised electrocaloric units and operating conditions.
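
    As a rough, hypothetical analogue of the heat-transfer core: the record's model is two-dimensional and includes radiation and convection, whereas this sketch is a bare 1-D transient conduction solver with the electrocaloric effect entering as a brief volumetric source; all material values are generic ceramic placeholders.

    import numpy as np

    k, rho, cp = 3.0, 7500.0, 430.0            # W/m/K, kg/m3, J/kg/K
    L, n = 1.0e-3, 51                          # 1 mm slab, grid points
    dx = L / (n - 1)
    alpha = k / (rho * cp)
    dt = 0.4 * dx**2 / alpha                   # stable explicit step

    T = np.full(n, 25.0)                       # deg C, ambient start
    q_ec = 2.0e8                               # W/m3 while the field is on
    t, t_pulse = 0.0, 0.05                     # 50 ms field pulse

    while t < 0.5:
        src = q_ec if t < t_pulse else 0.0
        # Explicit finite-difference update of the interior nodes.
        T[1:-1] += dt * (alpha * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
                         + src / (rho * cp))
        T[0] = T[-1] = 25.0                    # isothermal boundaries
        t += dt
    print(f"peak temperature after pulse: {T.max():.2f} C")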

  6. Propeller aircraft interior noise model utilization study and validation

    Science.gov (United States)

    Pope, L. D.

    1984-01-01

    Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

  7. Dynamic validation of the Planck/LFI thermal model

    CERN Document Server

    Tomasi, M; Gregorio, A; Colombo, F; Lapolla, M; Terenzi, L; Morgante, G; Bersanelli, M; Butler, R C; Galeotta, S; Mandolesi, N; Maris, M; Mennella, A; Valenziano, L; Zacchei, A; 10.1088/1748-0221/5/01/T01002

    2010-01-01

    The Low Frequency Instrument (LFI) is an array of cryogenically cooled radiometers on board the Planck satellite, designed to measure the temperature and polarization anisotropies of the cosmic microwave background (CMB) at 30, 44 and 70 GHz. The thermal requirements of the LFI, and in particular the stringent limits to acceptable thermal fluctuations in the 20 K focal plane, are a critical element to achieve the instrument's scientific performance. Thermal tests were carried out as part of the on-ground calibration campaign at various stages of instrument integration. In this paper we describe the results and analysis of the tests on the LFI flight model (FM) performed at Thales Laboratories in Milan (Italy) during 2006, with the purpose of experimentally sampling the thermal transfer functions and consequently validating the numerical thermal model describing the dynamic response of the LFI focal plane. This model has been used extensively to assess the ability of LFI to achieve its scientific goals: its validation...

  8. Validation of a finite element model of the human metacarpal.

    Science.gov (United States)

    Barker, D S; Netherway, D J; Krishnan, J; Hearn, T C

    2005-03-01

    Implant loosening and mechanical failure of components are frequently reported following metacarpophalangeal (MCP) joint replacement. Studies of the mechanical environment of the MCP implant-bone construct are rare. The objective of this study was to evaluate the predictive ability of a finite element model of the intact second human metacarpal to provide a validated baseline for further mechanical studies. A right index human metacarpal was subjected to torsion and combined axial/bending loading using strain gauge (SG) and 3D finite element (FE) analysis. Four different representations of bone material properties were considered. Regression analyses were performed comparing maximum and minimum principal surface strains taken from the SG and FE models. Regression slopes close to unity and high correlation coefficients were found when the diaphyseal cortical shell was modelled as anisotropic and cancellous bone properties were derived from quantitative computed tomography. The inclusion of anisotropy for cortical bone was strongly influential in producing high model validity whereas variation in methods of assigning stiffness to cancellous bone had only a minor influence. The validated FE model provides a tool for future investigations of current and novel MCP joint prostheses.

  9. Process Control of Pink Guava Puree Pasteurization Process: Simulation and Validation by Experiment

    Directory of Open Access Journals (Sweden)

    W. M. F. Wan Mokhtar

    2012-01-01

    Recently, process control has been applied extensively in many food processes, including pasteurization. The purpose is to control and maintain the product temperature at a desired value. In order to control the process properly, a model of the process needs to be obtained. This research aims to obtain the empirical model and to determine the best control strategy for the pasteurization process of pink guava puree. A PID controller tuned by different tuning methods was simulated using Simulink and the closed-loop responses were observed. Simulation results revealed that the PID controller tuned by minimization of the integral absolute error (IAE) was satisfactorily adaptable to this process, in terms of faster settling time, less overshoot, and IAE and ISE values smaller than 1. An experiment was then performed using this method in order to validate the simulation results. In general, good agreement was achieved between experimental data and the dynamic simulation result in control of the pasteurization temperature, with R2 = 0.83. In conclusion, the results obtained can serve as a recommendation for a suitable control strategy for the pasteurization process of pink guava puree in industry.
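
    An illustrative closed-loop sketch, with an invented first-order process rather than the identified puree model: the loop is scored by IAE and ISE, the criteria used in the tuning comparison.

    import numpy as np

    # Hypothetical process and controller constants, not the guava model.
    K, tau = 1.0, 30.0                       # process gain, time constant (s)
    Kp, Ki, Kd = 4.0, 0.2, 2.0               # PID gains (e.g. from IAE tuning)
    dt, t_end = 0.1, 300.0

    y, integ, e_prev = 0.0, 0.0, 1.0
    iae = ise = 0.0
    for _ in np.arange(0.0, t_end, dt):
        e = 1.0 - y                          # unit setpoint step
        integ += e * dt
        deriv = (e - e_prev) / dt
        u = Kp * e + Ki * integ + Kd * deriv
        y += dt * (-y + K * u) / tau         # first-order process, Euler step
        e_prev = e
        iae += abs(e) * dt
        ise += e * e * dt
    print(f"IAE = {iae:.2f}, ISE = {ise:.2f}")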

  10. Towards Generic Models of Player Experience

    DEFF Research Database (Denmark)

    Shaker, Noor; Shaker, Mohammad; Abou-Zleikha, Mohamed

    2015-01-01

    Context personalisation is a flourishing area of research with many applications. Context personalisation systems usually employ a user model to predict the appeal of the context to a particular user given a history of interactions. Most of the models used are context-dependent and their applicability is usually limited to the system and the data used for model construction. Establishing models of user experience that are highly scalable while maintaining the performance constitutes an important research direction. In this paper, we propose generic models of user experience in the computer games domain. We employ two datasets collected from players' interactions with two games from different genres where accurate models of player experience were previously built. We take the approach one step further by investigating the modelling mechanism's ability to generalise over the two datasets. We further examine whether generic features of player behaviour can be defined and used to boost the modelling performance. The accuracies obtained in both experiments indicate a promise for the proposed approach and suggest that game-independent player experience models can be built.

  11. Predicting the ungauged basin: model validation and realism assessment

    Science.gov (United States)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2016-04-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) [1] led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this study [2] we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. We do not present a generic approach that can be transferred to other ungauged catchments, but we aim to show how clever model design and alternative data acquisition can result in a valuable hydrological model for ungauged catchments. [1] Sivapalan, M., Takeuchi, K., Franks, S., Gupta, V., Karambiri, H., Lakshmi, V., et al. (2003). IAHS decade on predictions in ungauged basins (PUB), 2003-2012: shaping an exciting future for the hydrological sciences. Hydrol. Sci. J. 48, 857-880. doi: 10.1623/hysj.48.6.857.51421 [2] van Emmerik, T., Mulder, G., Eilander, D., Piet, M. and Savenije, H. (2015). Predicting the ungauged basin: model validation and realism assessment

  12. A verification and validation process for model-driven engineering

    Science.gov (United States)

    Delmas, R.; Pires, A. F.; Polacsek, T.

    2013-12-01

    Model Driven Engineering practitioners already benefit from many well established verification tools, for Object Constraint Language (OCL), for instance. Recently, constraint satisfaction techniques have been brought to Model-Driven Engineering (MDE) and have shown promising results on model verification tasks. With all these tools, it becomes possible to provide users with formal support from early model design phases to model instantiation phases. In this paper, a selection of such tools and methods is presented, and an attempt is made to define a verification and validation process for model design and instance creation centered on UML (Unified Modeling Language) class diagrams and declarative constraints, and involving the selected tools. The suggested process is illustrated with a simple example.

  13. Directed Design of Experiments for Validating Probability of Detection Capability of a Testing System

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2012-01-01

    A method of validating a probability of detection (POD) testing system using directed design of experiments (DOE) includes recording an input data set of observed hit and miss or analog data for sample components as a function of size of a flaw in the components. The method also includes processing the input data set to generate an output data set having an optimal class width, assigning a case number to the output data set, and generating validation instructions based on the assigned case number. An apparatus includes a host machine for receiving the input data set from the testing system and an algorithm for executing DOE to validate the test system. The algorithm applies DOE to the input data set to determine a data set having an optimal class width, assigns a case number to that data set, and generates validation instructions based on the case number.
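
    For illustration only (the patent's class-width and case-number logic is not reproduced): the core POD step of fitting a hit/miss curve against flaw size and solving for a90 might look as follows, with invented data.

    import numpy as np
    from scipy.optimize import brentq
    from scipy.special import expit
    import statsmodels.api as sm

    # Invented hit/miss observations versus flaw size.
    size = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5, 3.0, 3.5,
                     0.6, 0.9, 1.1, 1.4, 1.6, 1.9, 2.2, 2.7, 3.2, 4.0])
    hit = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1,
                    0, 0, 1, 0, 1, 1, 1, 1, 1, 1])

    # Logistic POD curve in log flaw size.
    X = sm.add_constant(np.log(size))
    fit = sm.GLM(hit, X, family=sm.families.Binomial()).fit()
    b0, b1 = fit.params

    def pod(a):
        return expit(b0 + b1 * np.log(a))

    # Solve for the size detected with 90% probability.
    a90 = brentq(lambda a: pod(a) - 0.90, 0.1, 10.0)
    print(f"a90 = {a90:.2f} (flaw size detected with 90% probability)")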

  14. Finite Element Model of the Knee for Investigation of Injury Mechanisms: Development and Validation

    Science.gov (United States)

    Kiapour, Ali; Kiapour, Ata M.; Kaul, Vikas; Quatman, Carmen E.; Wordeman, Samuel C.; Hewett, Timothy E.; Demetropoulos, Constantine K.; Goel, Vijay K.

    2014-01-01

    Multiple computational models have been developed to study knee biomechanics. However, the majority of these models are mainly validated against a limited range of loading conditions and/or do not include sufficient details of the critical anatomical structures within the joint. Due to the multifactorial dynamic nature of knee injuries, anatomic finite element (FE) models validated against multiple factors under a broad range of loading conditions are necessary. This study presents a validated FE model of the lower extremity with an anatomically accurate representation of the knee joint. The model was validated against tibiofemoral kinematics, ligament strain/force, and articular cartilage pressure data measured directly from static, quasi-static, and dynamic cadaveric experiments. Strong correlations were observed between model predictions and experimental data (r > 0.8 and p < 0.05), demonstrating the model's ability to capture the kinematics of the knee joint as well as the complex, nonuniform stress and strain fields that occur in biological soft tissue. Such a model will facilitate the in-depth understanding of a multitude of potential knee injury mechanisms with special emphasis on ACL injury. PMID:24763546

  15. Improvement and Validation of Weld Residual Stress Modelling Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Zang, Weilin; Gunnars, Jens (Inspecta Technology AB, Stockholm (Sweden)); Dong, Pingsha; Hong, Jeong K. (Center for Welded Structures Research, Battelle, Columbus, OH (United States))

    2009-06-15

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure, and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available, then a mixed hardening model should be used.

  16. Validation of Advanced EM Models for UXO Discrimination

    CERN Document Server

    Weichman, Peter B

    2012-01-01

    The work reported here details basic validation of our advanced physics-based EMI forward and inverse models against data collected by the NRL TEMTADS system. The data were collected under laboratory-type conditions using both artificial spheroidal targets and real UXO. The artificial target models are essentially exact, and enable detailed comparison of theory and data in support of measurement platform characterization and target identification. Real UXO targets cannot be treated exactly, but it is demonstrated that quantitative comparisons of the data with the spheroid models nevertheless aid in extracting key target discrimination information, such as target geometry and hollow target shell thickness.

  18. Design of experiment and data analysis by JMP (SAS institute) in analytical method validation.

    Science.gov (United States)

    Ye, C; Liu, J; Ren, F; Okafo, N

    2000-08-15

    Validation of an analytical method through a series of experiments demonstrates that the method is suitable for its intended purpose. Because multiple parameters must be examined and a large number of experiments are involved in validation, it is important to design the experiments scientifically so that appropriate validation parameters can be examined simultaneously to provide a sound, overall knowledge of the capabilities of the analytical method. A statistical method through design of experiment (DOE) was applied to the validation of an HPLC analytical method for the quantitation of a small molecule in drug product, in an intermediate precision and robustness study. The data were analyzed in JMP (SAS Institute) software using the analysis of variance method. Confidence intervals for outcomes and control limits for individual parameters were determined. It was demonstrated that the experimental design and statistical analysis used in this study provided an efficient and systematic approach to evaluating intermediate precision and robustness for an HPLC analytical method for small molecule quantitation.
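
    A minimal sketch in the spirit of the described workflow, using Python instead of JMP and invented assay values: a small analyst-by-day factorial is assessed by analysis of variance for intermediate precision.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # Invented recovery values for a 2-analyst x 3-day design, 3 replicates.
    rng = np.random.default_rng(3)
    rows = [(a, d, 100 + 0.3 * a - 0.2 * d + rng.normal(0, 0.5))
            for a in (0, 1) for d in (0, 1, 2) for _ in range(3)]
    df = pd.DataFrame(rows, columns=["analyst", "day", "recovery"])

    # Two-way ANOVA with interaction; non-significant factor effects
    # support adequate intermediate precision.
    model = ols("recovery ~ C(analyst) + C(day) + C(analyst):C(day)", df).fit()
    print(sm.stats.anova_lm(model, typ=2))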

  19. Service validity and service reliability of search, experience and credence services: A scenario study

    NARCIS (Netherlands)

    Galetzka, Mirjam; Verhoeven, Joost W.M.; Pruyn, Ad Th.H.

    2006-01-01

    The purpose of this research is to add to our understanding of the antecedents of customer satisfaction by examining the effects of service reliability (Is the service "correctly" produced?) and service validity (Is the "correct" service produced?) of search, experience and credence services.

  20. (In)validation in the Minority: The Experiences of Latino Students Enrolled in an HBCU

    Science.gov (United States)

    Allen, Taryn Ozuna

    2016-01-01

    This qualitative, phenomenological study examined the academic and interpersonal validation experiences of four female and four male Latino students who were enrolled in their second- to fifth-year at an HBCU in Texas. Using interviews, campus observations, a questionnaire, and analytic memos, this study sought to understand the role of in- and…

  1. Validation of Earth Radiation Budget Experiment scanning radiometer data inversion procedures

    Science.gov (United States)

    Manalo, Natividad D.; Smith, G. L.; Green, Richard N.; Avis, Lee M.; Suttles, John T.

    1990-01-01

    Validation techniques were implemented in the inversion of scanner radiometer data to assess the accuracy of the top of atmosphere radiant fluxes. An evaluation of SW radiant flux standard deviations for the same scene type shows that they contribute about 6.0 W/sq m for viewing zenith angles less than 55 deg and can reach values of up to 17.6 W/sq m for larger zenith angles in the backward scanning position. Three-channel intercomparison results, presented as color graphic displays and histograms, effectively validate the radiance measurements and the spectral factors. Along-track data were used to validate limb-darkening models and showed good agreement with current ERBE models. These validation techniques were found to be very effective in assessing the quality of the radiant fluxes generated by the ERBE inversion algorithm.

  2. Validation of VHTRC calculation benchmark of critical experiment using the MCB code

    Directory of Open Access Journals (Sweden)

    Stanisz Przemysław

    2016-01-01

    The calculation benchmark problem Very High Temperature Reactor Critical (VHTRC), a pin-in-block type core critical assembly, has been investigated with the Monte Carlo Burnup (MCB) code in order to validate the latest version of the Nuclear Data Library based on the ENDF format. The benchmark was executed on the basis of the VHTRC benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying the discrepancies in keff values between various libraries and experimental values, which makes it possible to improve the accuracy of the neutron transport calculations and may help in designing high performance commercial VHTRs. Almost all safety parameters depend on the accuracy of neutron transport calculation results that, in turn, depend on the accuracy of nuclear data libraries. Thus, evaluation of the libraries' applicability to VHTR modelling is one of the important subjects. We compared the numerical experiment results with experimental measurements using two versions of the available nuclear data (ENDF/B-VII.1 and JEFF-3.2) prepared for the required temperatures. Calculations were performed with the MCB code, which allows a very precise representation of the complex VHTR geometry, including the double heterogeneity of a fuel element. In this paper, together with the impact of nuclear data, we also discuss the impact of different lattice modelling inside the fuel pins. The discrepancies in keff were successfully observed and show good agreement with each other and with the experimental data within the 1 σ range of the experimental uncertainty. Because some propagated discrepancies were observed, we proposed appropriate corrections in experimental constants which can improve the reactivity coefficient dependency. The obtained results confirm the accuracy of the new Nuclear Data Libraries.
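
    As a simple illustration of the closing comparison (all numbers are invented placeholders, not the VHTRC results): calculated-to-experimental differences in keff can be checked against the 1 σ experimental uncertainty.

    import numpy as np

    # Hypothetical experimental value with its 1-sigma uncertainty, and
    # calculated keff per nuclear data library.
    k_exp, sigma_exp = 1.00000, 0.00140
    k_calc = {"ENDF/B-VII.1": 0.99912, "JEFF-3.2": 1.00087}

    for lib, k in k_calc.items():
        diff_pcm = (k - k_exp) * 1e5
        within = abs(k - k_exp) <= sigma_exp
        print(f"{lib}: C/E = {k / k_exp:.5f}, {diff_pcm:+.0f} pcm, "
              f"within 1 sigma: {within}")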

  3. Numerical experiments modelling turbulent flows

    Directory of Open Access Journals (Sweden)

    Trefilík Jiří

    2014-03-01

    The work investigates the possibilities of modelling transonic flows, mainly in external aerodynamics. New results are presented and compared with reference data and previously achieved results. For the turbulent flow simulations, two modifications of the basic k-ω model are employed: SST and TNT. The numerical solution was achieved by using the MacCormack scheme on structured non-orthogonal grids. Artificial dissipation was added to improve numerical stability.

  4. Experimental validation of a solar-chimney power plant model

    Science.gov (United States)

    Fathi, Nima; Wayne, Patrick; Trueba Monje, Ignacio; Vorobieff, Peter

    2016-11-01

    In a solar chimney power plant system (SCPPS), the energy of buoyant hot air is converted to electrical energy. SCPPS includes a collector at ground level covered with a transparent roof. Solar radiation heats the air inside and the ground underneath. There is a tall chimney at the center of the collector, and a turbine located at the base of the chimney. The lack of detailed experimental data for validation is one of the important issues in modeling this type of power plant. We present a small-scale experimental prototype developed to perform validation analysis for modeling and simulation of SCPPS. Detailed velocity measurements are acquired using particle image velocimetry (PIV) at a prescribed Reynolds number. Convection is driven by a temperature-controlled hot plate at the bottom of the prototype. Velocity field data are used to perform validation analysis and measure any mismatch between the experimental results and the CFD data. CFD code verification is also performed, to assess the uncertainty of the numerical model with respect to our grid and the applied mathematical model. The dimensionless output power of the prototype is calculated and compared with a recent analytical solution and the experimental results.
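
    A hypothetical sketch of the mismatch measure such a validation might use: the relative L2 difference between a PIV velocity field and the CFD prediction interpolated to the same grid, with synthetic fields standing in for the data.

    import numpy as np

    # Synthetic stand-ins for the measured and simulated velocity fields.
    rng = np.random.default_rng(5)
    u_piv = rng.random((40, 40))                     # measured field, m/s
    u_cfd = u_piv + rng.normal(0, 0.03, (40, 40))    # simulated field, m/s

    # Relative L2 mismatch between simulation and measurement.
    rel_l2 = np.linalg.norm(u_cfd - u_piv) / np.linalg.norm(u_piv)
    print(f"relative L2 mismatch: {rel_l2:.3%}")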

  5. Development and validation of a building design waste reduction model.

    Science.gov (United States)

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process to achieve waste reduction at source, which enables an informed prediction of their wastage reduction levels. However the lack of quantitative methods linking design strategies to waste reduction hinders designing out waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprises three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied leading to several alternative low waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings.

  6. PBMR radionuclide source term analysis validation based on AVR operating experience

    Energy Technology Data Exchange (ETDEWEB)

    Stoker, C.C. [PBMR, Lake Buena Vista Building, 1267 Gordon Hood Avenue, Centurion 0046 (South Africa); Olivier, L.D. [Independent Nuclear Consultants, Grahamstown (South Africa); Stassen, E.; Reitsma, F. [PBMR, Lake Buena Vista Building, 1267 Gordon Hood Avenue, Centurion 0046 (South Africa); Merwe, J.J. van der, E-mail: hanno.vdmerwe@pbmr.co.z [PBMR, Lake Buena Vista Building, 1267 Gordon Hood Avenue, Centurion 0046 (South Africa)

    2010-10-15

    The determination of radionuclide source terms is vital for any reactor design and licensing safety evaluation. This paper provides an overview of the PBMR analysis tools with specific focus on the modelling of mobile and deposited radionuclide source terms within the pressure boundary of a typical pebble-bed high temperature reactor (HTR). The main focus is on the Dust and Activity Migration and Distribution (DAMD) software code system that models the activation, migration and time-dependent distribution of dust and atomic particles in an HTR such as the AVR and PBMR. Since DAMD provides a time-dependent systems integrated model of HTR designs, most of the obvious physical phenomena relevant to source terms are at play. These include the neutron flux, activation cross-sections, radioactive decay, dust production rates, dust impurity levels, dust filter capabilities, dust particle size distributions, thermal-hydraulic parameters influencing the migration and distribution of particles throughout the main power system and subsystems, and helium coolant leakage and make-up rates. At this stage the DAMD calibration and validation is mainly based on the operational data, experiments and measurements made during the 21-year operating life of the AVR. The comparisons of the DAMD results with various AVR measurements provide confidence in the use of DAMD for the PBMR design and safety evaluations. In addition, sensitivity analyses are performed with DAMD to determine the bounding system parameters that drive the migration and distribution of radionuclides. The use of DAMD to evaluate design configurations, e.g. the effect of the introduction and placement of filters on the radionuclide distribution, is also shown. In conclusion, the importance of a systems modelling approach for radionuclide transport and distribution within the pressure boundary of a typical HTR system is demonstrated. Since the DAMD code system is calibrated and validated against the AVR measurements, it can be used with confidence for PBMR design and safety evaluations.

  8. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens;

    2016-01-01

    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main result presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types of experiments are performed: a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology. The modelling methodology shows promising results for usage with systems composed of reconfigurable underwater modules. The purpose of the model is to enable design of control strategies for cooperative reconfigurable underwater systems.

  9. [Development and validation of a finite element model of human knee joint for dynamic analysis].

    Science.gov (United States)

    Li, Haiyan; Gu, Yulong; Ruan, Shijie; Cui, Shihai

    2012-02-01

    Based on the biomechanical response of the human knee joint to frontal impact in occupant accidents, a finite element (FE) model of the human knee joint was developed using computer simulation techniques. The model comprises the main anatomical structures of the knee, including the femoral condyles, tibial condyles, fibular head, patella, cartilage, menisci and primary ligaments. The model was validated by comparing FE results with experiments on the knee joint under axial loading conditions. Furthermore, this study provides data on the mechanics of human knee joint injury, and is helpful for the design and optimization of vehicle protective devices.

  10. Finite Element Model and Validation of Nasal Tip Deformation.

    Science.gov (United States)

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian J F

    2017-03-01

    Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach to understanding the mechanics and nuances of nasal tip support and, eventually, nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded with an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and under steady-state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded states. The average Hausdorff distance between actual measurements and FE simulations across the nose was 0.39 ± 1.04 mm and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near complete agreement in the immediate vicinity of the nasal tip, with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow.
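
    For illustration, the point-cloud comparison described in this record can be scored with a symmetric mean surface distance together with the maximum (Hausdorff) distance. The following minimal Python sketch is an assumption about how such a metric can be computed, not the authors' actual pipeline; the function name and stand-in data are hypothetical:

      import numpy as np
      from scipy.spatial import cKDTree

      def surface_distances(a, b):
          """Symmetric nearest-neighbour distances between two (N, 3)
          point clouds, e.g. a photogrammetry scan vs. FE mesh nodes.
          Returns (mean distance, max/Hausdorff distance)."""
          d_ab, _ = cKDTree(b).query(a)   # each point of a -> nearest in b
          d_ba, _ = cKDTree(a).query(b)   # each point of b -> nearest in a
          d = np.concatenate([d_ab, d_ba])
          return d.mean(), d.max()

      # Hypothetical usage with stand-in data (units: mm):
      scan = np.random.rand(1000, 3) * 50.0
      fem = scan + 0.4 * np.random.randn(1000, 3)
      mean_d, haus_d = surface_distances(scan, fem)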

  11. Reactivity worth measurements on fast burst reactor Caliban - description and interpretation of integral experiments for the validation of nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Richard, B. [Commissariat a l'Energie Atomique et aux Energies Alternatives CEA, DAM, VALDUC, F-21120 Is-sur-Tille (France)]

    2012-07-01

    Reactivity perturbation experiments using various materials are being performed on the HEU fast core CALIBAN, an experimental device operated by the CEA VALDUC Criticality and Neutron Transport Research Laboratory. These experiments provide valuable information that contributes to the validation of nuclear data for the materials used in such measurements. This paper presents the results obtained in a first series of measurements performed with Au-197 samples. Experiments which have been conducted in order to improve the characterization of the core are also described and discussed. The experimental results have been compared to numerical calculations using both deterministic and Monte Carlo neutron transport codes with a simplified model of the reactor. This early work led to a methodology which will be applied to future experiments concerning other materials of interest. (authors)

  12. Validation of the Osteopenia Sheep Model for Orthopaedic Biomaterial Research

    DEFF Research Database (Denmark)

    Ding, Ming; Danielsen, C.C.; Cheng, L.

    2009-01-01

    Introduction: Currently, the majority of orthopaedic prosthesis and biomaterial research has been based on investigations in normal animals. In most clinical situations, most … resemble osteoporosis in humans. This study aimed to validate a glucocorticoid-induced osteopenia sheep model for orthopaedic implant and biomaterial research. We hypothesized that a 7-month GC treatment together with a restricted diet, but without OVX, would induce osteopenia. Materials and Methods: Eighteen…

  13. Seine estuary modelling and AirSWOT measurements validation

    Science.gov (United States)

    Chevalier, Laetitia; Lyard, Florent; Laignel, Benoit

    2013-04-01

    In the context of global climate change, knowing water fluxes and storage, from the global scale to the local scale, is a crucial issue. The future satellite SWOT (Surface Water and Ocean Topography) mission, dedicated to surface water observation, is proposed to meet this challenge. SWOT's main payload will be a Ka-band Radar Interferometer (KaRIn). To validate this new kind of measurement, preparatory airborne campaigns (called AirSWOT) are currently being designed. AirSWOT will carry an interferometer similar to KaRIn: KaSPAR, the Ka-band SWOT Phenomenology Airborne Radar. Some campaigns are planned in France in 2014. During these campaigns, the plane will fly over the Seine River basin, especially to observe its estuary, the upstream river main channel (to quantify river-aquifer exchange) and some wetlands. The objective of the present work is to validate the ability of AirSWOT and SWOT using a hydrodynamic model of the Seine estuary. In this context, field measurements will be collected by different teams such as the GIP (Public Interest Group) Seine Aval, the GPMR (Rouen Seaport), SHOM (Hydrographic and Oceanographic Service of the Navy), IFREMER (French Research Institute for Sea Exploitation), Mercator-Ocean, LEGOS (Laboratory of Space Study in Geophysics and Oceanography), ADES (groundwater data access portal), etc. These datasets will be used first to validate AirSWOT measurements locally, and then to improve hydrodynamic simulations (using tidal boundary conditions, river and groundwater inflows, etc.) for 2D validation of the AirSWOT data. This modelling will also be used to estimate the benefit of the future SWOT mission for mid-latitude river hydrology. For this modelling, the TUGOm barotropic model (Toulouse Unstructured Grid Ocean model 2D) is used. Preliminary simulations have been performed by first modelling and then combining two different regions: the Seine River and its estuarine area, and the English Channel. These two simulations are currently being

  14. A community diagnostic tool for chemistry climate model validation

    Directory of Open Access Journals (Sweden)

    A. Gettelman

    2012-09-01

    Full Text Available This technical note presents an overview of the Chemistry-Climate Model Validation Diagnostic (CCMVal-Diag) tool for model evaluation. The CCMVal-Diag tool is a flexible and extensible open source package that facilitates the complex evaluation of global models. Models can be compared to other models, ensemble members (simulations with the same model), and/or many types of observations. The initial construction and application is to coupled chemistry-climate models (CCMs) participating in CCMVal, but the evaluation of climate models that submitted output to the Coupled Model Intercomparison Project (CMIP) is also possible. The package has been used to assist with analysis of simulations for the 2010 WMO/UNEP Scientific Ozone Assessment and the SPARC Report on the Evaluation of CCMs. The CCMVal-Diag tool is described and examples of how it functions are presented, along with links to detailed descriptions, instructions and source code. The CCMVal-Diag tool supports model development as well as quantifies model changes, both for different versions of individual models and for different generations of community-wide collections of models used in international assessments. The code allows further extensions by different users for different applications and types, e.g. to other components of the Earth system. User modifications are encouraged and easy to perform with minimum coding.

  15. Validation of a heat conduction model for finite domain, non-uniformly heated, laminate bodies

    Science.gov (United States)

    Desgrosseilliers, Louis; Kabbara, Moe; Groulx, Dominic; White, Mary Anne

    2016-07-01

    Infrared thermographic validation is shown for a closed-form analytical heat conduction model for non-uniformly heated, laminate bodies with an insulated domain boundary. Experiments were conducted by applying power to rectangular electric heaters and cooled by natural convection in air, but also apply to constant-temperature heat sources and forced convection. The model accurately represents two-dimensional laminate heat conduction behaviour giving rise to heat spreading using one-dimensional equations for the temperature distributions and heat transfer rates under steady-state and pseudo-steady-state conditions. Validation of the model with an insulated boundary (complementing previous studies with an infinite boundary) provides useful predictions of heat spreading performance and simplified temperature uniformity calculations (useful in log-mean temperature difference style heat exchanger calculations) for real laminate systems such as found in electronics heat sinks, multi-ply stovetop cookware and interface materials for supercooled salt hydrates. Computational determinations of implicit insulated boundary condition locations in measured data, required to assess model equation validation, were also demonstrated. Excellent goodness of fit was observed (both root-mean-square error and R² values), in all cases except when the uncertainty of low temperatures measured via infrared thermography hindered the statistical significance of the model fit. The experimental validation in all other cases supports use of the model equations in design calculations and heat exchange simulations.
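
    The two fit statistics quoted above are straightforward to reproduce. A minimal Python sketch, generic rather than tied to the thermography data, of root-mean-square error and R² between measured and modelled temperature profiles:

      import numpy as np

      def fit_metrics(t_measured, t_model):
          """RMSE and coefficient of determination R^2 between a measured
          and a modelled temperature profile (1-D arrays, same length)."""
          t_measured = np.asarray(t_measured, dtype=float)
          t_model = np.asarray(t_model, dtype=float)
          resid = t_measured - t_model
          rmse = np.sqrt(np.mean(resid ** 2))
          ss_res = np.sum(resid ** 2)
          ss_tot = np.sum((t_measured - t_measured.mean()) ** 2)
          return rmse, 1.0 - ss_res / ss_tot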

  16. Validation and Scaling of Soil Moisture in a Semi-Arid Environment: SMAP Validation Experiment 2015 (SMAPVEX15)

    Science.gov (United States)

    Colliander, Andreas; Cosh, Michael H.; Misra, Sidharth; Jackson, Thomas J.; Crow, Wade T.; Chan, Steven; Bindlish, Rajat; Chae, Chun; Holifield Collins, Chandra; Yueh, Simon H.

    2017-01-01

    The NASA SMAP (Soil Moisture Active Passive) mission conducted the SMAP Validation Experiment 2015 (SMAPVEX15) in order to support the calibration and validation activities of SMAP soil moisture data products. The main goals of the experiment were to address issues regarding the spatial disaggregation methodologies for improvement of soil moisture products and validation of the in situ measurement upscaling techniques. To support these objectives, high-resolution soil moisture maps were acquired with the airborne PALS (Passive Active L-band Sensor) instrument over an area in southeast Arizona that includes the Walnut Gulch Experimental Watershed (WGEW), and intensive ground sampling was carried out to augment the permanent in situ instrumentation. The objective of the paper was to establish the correspondence and relationship between the highly heterogeneous spatial distribution of soil moisture on the ground and the coarse resolution radiometer-based soil moisture retrievals of SMAP. The high-resolution mapping conducted with PALS provided the required connection between the in situ measurements and SMAP retrievals. The in situ measurements were used to validate the PALS soil moisture acquired at 1-km resolution. Based on the information from a dense network of rain gauges in the study area, the in situ soil moisture measurements did not capture all the precipitation events accurately. That is, the PALS and SMAP soil moisture estimates responded to precipitation events detected by rain gauges, which were in some cases not detected by the in situ soil moisture sensors. It was also concluded that the spatial distribution of soil moisture resulting from the relatively small spatial extents of the typical convective storms in this region was not completely captured with the in situ stations. After removing those cases (approximately 10% of the observations) the following metrics were obtained: RMSD (root mean square difference) of 0.016 m3/m3 and correlation of 0.83.

  17. Certified reduced basis model validation: A frequentistic uncertainty framework

    OpenAIRE

    Huynh, Dinh Bao Phuong; Knezevic, David; Patera, Anthony T.

    2011-01-01

    We introduce a frequentistic validation framework for assessment — acceptance or rejection — of the consistency of a proposed parametrized partial differential equation model with respect to (noisy) experimental data from a physical system. Our method builds upon the Hotelling T² statistical hypothesis test for bias first introduced by Balci and Sargent in 1984 and subsequently extended by McFarland and Mahadevan (2008). Our approach introduces two new elements: a spectral repre...

  18. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is however a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application-dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point is to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closeness of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
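
    Two of the polygon-level rules mentioned above, closure of the bounding linear ring and planarity within a tolerance, can be sketched in a few lines of Python. This is a generic illustration, not the CityDoctor implementation, and the tolerance values are placeholders:

      import numpy as np

      def ring_is_closed(ring, tol=1e-6):
          """A valid linear ring starts and ends at the same vertex."""
          ring = np.asarray(ring, dtype=float)
          return np.linalg.norm(ring[0] - ring[-1]) <= tol

      def polygon_is_planar(ring, tol=1e-2):
          """All vertices must lie within `tol` of the least-squares
          plane; the plane normal is the singular vector belonging to
          the smallest singular value of the centered vertex matrix."""
          pts = np.asarray(ring, dtype=float)
          centered = pts - pts.mean(axis=0)
          _, _, vt = np.linalg.svd(centered, full_matrices=False)
          normal = vt[-1]
          return np.max(np.abs(centered @ normal)) <= tol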

  19. Firn Model Intercomparison Experiment (FirnMICE)

    DEFF Research Database (Denmark)

    Lundin, Jessica M.D.; Stevens, C. Max; Arthern, Robert

    2017-01-01

    Evolution of cold dry snow and firn plays important roles in glaciology; however, the physical formulation of a densification law is still an active research topic. We forced eight firn-densification models and one seasonal-snow model in six different experiments by imposing step changes in temperature and accumulation rate. … The Firn Model Intercomparison Experiment (FirnMICE) can provide a benchmark of results for future models, provide a basis to quantify model uncertainties and guide future directions of firn-densification modeling.

  20. A validated approach for modeling collapse of steel structures

    Science.gov (United States)

    Saykin, Vitaliy Victorovich

    A civil engineering structure is faced with many hazardous conditions such as blasts, earthquakes, hurricanes, tornadoes, floods, and fires during its lifetime. Even though structures are designed for credible events that can happen during a lifetime of the structure, extreme events do happen and cause catastrophic failures. Understanding the causes and effects of structural collapse is now at the core of critical areas of national need. One factor that makes studying structural collapse difficult is the lack of full-scale structural collapse experimental test results against which researchers could validate their proposed collapse modeling approaches. The goal of this work is the creation of an element deletion strategy based on fracture models for use in validated prediction of collapse of steel structures. The current work reviews the state of the art of finite element deletion strategies for use in collapse modeling of structures. It is shown that current approaches to element deletion in collapse modeling do not take into account stress triaxiality in vulnerable areas of the structure, which is important for proper fracture and element deletion modeling. The report then reviews triaxiality and its role in fracture prediction. It is shown that fracture in ductile materials is a function of triaxiality. It is also shown that, depending on the triaxiality range, different fracture mechanisms are active and should be accounted for. An approach using semi-empirical fracture models as a function of triaxiality is employed. The models to determine fracture initiation, softening and subsequent finite element deletion are outlined. This procedure allows for stress-displacement softening at an integration point of a finite element in order to subsequently remove the element. This approach avoids abrupt changes in the stress that would create dynamic instabilities, thus making the results more reliable and accurate. The calibration and validation of these models are
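
    As a concrete illustration of the quantities involved: stress triaxiality is the ratio of mean (hydrostatic) stress to von Mises equivalent stress, and a triaxiality-dependent fracture strain then gates element deletion. The Python sketch below uses a Johnson-Cook-style exponential locus with illustrative constants as a stand-in; it is not the calibration developed in this dissertation:

      import numpy as np

      def triaxiality(stress):
          """eta = sigma_mean / sigma_vm for a 3x3 Cauchy stress tensor."""
          s_mean = np.trace(stress) / 3.0
          dev = stress - s_mean * np.eye(3)
          s_vm = np.sqrt(1.5 * np.sum(dev * dev))
          return 0.0 if s_vm == 0.0 else s_mean / s_vm

      def delete_element(eps_plastic, eta, d1=0.05, d2=3.44, d3=-2.12):
          """Flag an element for deletion once its accumulated plastic
          strain exceeds the fracture strain of an exponential
          (Johnson-Cook-style) locus; d1..d3 are illustrative only."""
          eps_fracture = d1 + d2 * np.exp(d3 * eta)
          return eps_plastic >= eps_fracture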

  1. The Development and Validation of the Game User Experience Satisfaction Scale (GUESS).

    Science.gov (United States)

    Phan, Mikki H; Keebler, Joseph R; Chaparro, Barbara S

    2016-12-01

    The aim of this study was to develop and psychometrically validate a new instrument that comprehensively measures video game satisfaction based on key factors. Playtesting is often conducted in the video game industry to help game developers build better games by providing insight into the players' attitudes and preferences. However, quality feedback is difficult to obtain from playtesting sessions without a quality gaming assessment tool. There is a need for a psychometrically validated and comprehensive gaming scale that is appropriate for playtesting and game evaluation purposes. The process of developing and validating this new scale followed current best practices of scale development and validation. As a result, a mixed-method design that consisted of item pool generation, expert review, questionnaire pilot study, exploratory factor analysis (N = 629), and confirmatory factor analysis (N = 729) was implemented. A new instrument measuring video game satisfaction, called the Game User Experience Satisfaction Scale (GUESS), with nine subscales emerged. The GUESS was demonstrated to have content validity, internal consistency, and convergent and discriminant validity. The GUESS was developed and validated based on the assessments of over 450 unique video game titles across many popular genres. Thus, it can be applied across many types of video games in the industry both as a way to assess what aspects of a game contribute to user satisfaction and as a tool to aid in debriefing users on their gaming experience. The GUESS can be administered to evaluate user satisfaction of different types of video games by a variety of users. © 2016, Human Factors and Ergonomics Society.

  2. Validation of a Hertzian contact model with nonlinear damping

    Science.gov (United States)

    Sierakowski, Adam

    2015-11-01

    Due to limited spatial resolution, most disperse particle simulation methods rely on simplified models for incorporating short-range particle interactions. In this presentation, we introduce a contact model that combines the Hertz elastic restoring force with a nonlinear damping force, requiring only material properties and no tunable parameters. We have implemented the model in a resolved-particle flow solver that implements the Physalis method, which accurately captures hydrodynamic interactions by analytically enforcing the no-slip condition on the particle surface. We summarize the results of a few numerical studies that suggest the validity of the contact model over a range of particle interaction intensities (i.e., collision Stokes numbers) when compared with experimental data. This work was supported by the National Science Foundation under Grant Number CBET1335965 and the Johns Hopkins University Modeling Complex Systems IGERT program.
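
    The record does not give the exact force law, but contact models of this family typically combine the Hertz restoring force F = k δ^{3/2}, with k = (4/3) E* sqrt(R_eff), and a damping term proportional to the elastic force so that the total force vanishes smoothly at zero overlap. A Hunt-Crossley-style Python sketch, offered as an assumed stand-in rather than the presented model:

      import numpy as np

      def contact_force(delta, delta_dot, e_star, r_eff, e_rest, v_impact):
          """Hertz spring + nonlinear (Hunt-Crossley-type) damping.
          delta: overlap (m), delta_dot: overlap rate (m/s),
          e_star: effective modulus (Pa), r_eff: effective radius (m),
          e_rest: restitution coefficient, v_impact: impact speed (m/s)."""
          if delta <= 0.0:
              return 0.0                                  # no contact
          k = (4.0 / 3.0) * e_star * np.sqrt(r_eff)       # Hertz stiffness
          damping = 1.5 * (1.0 - e_rest) * delta_dot / v_impact
          return k * delta ** 1.5 * (1.0 + damping)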

  3. Statistical validation of high-dimensional models of growing networks

    CERN Document Server

    Medo, Matus

    2013-01-01

    The abundance of models of complex networks and the current insufficient validation standards make it difficult to judge which models are strongly supported by data and which are not. We focus here on likelihood maximization methods for models of growing networks with many parameters and compare their performance on artificial and real datasets. While high dimensionality of the parameter space harms the performance of direct likelihood maximization on artificial data, this can be improved by introducing a suitable penalization term. Likelihood maximization on real data shows that the presented approach is able to discriminate among available network models. To make large-scale datasets accessible to this kind of analysis, we propose a subset sampling technique and show that it yields substantial model evidence in a fraction of time necessary for the analysis of the complete data.
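
    For a growing-network model, direct likelihood maximization works by scoring each observed attachment event under the model's attachment kernel. A minimal Python sketch for a one-parameter kernel P(i) proportional to k_i^alpha, an illustrative model rather than the paper's full high-dimensional setting:

      import numpy as np

      def log_likelihood(alpha, degree_snapshots, chosen_nodes):
          """Log-likelihood of observed attachment choices under
          P(node i) ~ k_i**alpha. degree_snapshots[t] holds the degree
          vector just before event t; chosen_nodes[t] is the index of
          the node that actually received the new link (degrees >= 1)."""
          ll = 0.0
          for degrees, chosen in zip(degree_snapshots, chosen_nodes):
              w = np.asarray(degrees, dtype=float) ** alpha
              ll += np.log(w[chosen]) - np.log(w.sum())
          return ll

      # Grid-search maximization over alpha (a penalization term for
      # high-dimensional models would be subtracted from ll here):
      # alphas = np.linspace(0.0, 2.0, 201)
      # alpha_hat = max(alphas, key=lambda a: log_likelihood(a, snaps, picks))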

  4. Filament winding cylinders. II - Validation of the process model

    Science.gov (United States)

    Calius, Emilio P.; Lee, Soo-Yong; Springer, George S.

    1990-01-01

    Analytical and experimental studies were performed to validate the model developed by Lee and Springer for simulating the manufacturing process of filament wound composite cylinders. First, results calculated by the Lee-Springer model were compared to results of the Calius-Springer thin cylinder model. Second, temperatures and strains calculated by the Lee-Springer model were compared to data. The data used in these comparisons were generated during the course of this investigation with cylinders made of Hercules IM-6G/HBRF-55 and Fiberite T-300/976 graphite-epoxy tows. Good agreement was found between the calculated and measured stresses and strains, indicating that the model is a useful representation of the winding and curing processes.

  5. Model selection, identification and validation in anaerobic digestion: a review.

    Science.gov (United States)

    Donoso-Bravo, Andres; Mailier, Johan; Martin, Cristina; Rodríguez, Jorge; Aceves-Lara, César Arturo; Vande Wouwer, Alain

    2011-11-01

    Anaerobic digestion enables waste (water) treatment and energy production in the form of biogas. The successful implementation of this process has led to an increasing interest worldwide. However, anaerobic digestion is a complex biological process, where hundreds of microbial populations are involved, and whose start-up and operation are delicate issues. In order to better understand the process dynamics and to optimize the operating conditions, the availability of dynamic models is of paramount importance. Such models have to be inferred from prior knowledge and experimental data collected from real plants. Modeling and parameter identification are vast subjects, offering a realm of approaches and methods, which can be difficult to fully understand by scientists and engineers dedicated to the plant operation and improvements. This review article discusses existing modeling frameworks and methodologies for parameter estimation and model validation in the field of anaerobic digestion processes. The point of view is pragmatic, intentionally focusing on simple but efficient methods.

  6. Validity of the Janssen Model for Layered Structures in a Silo

    Directory of Open Access Journals (Sweden)

    Abdul Qadir

    2011-07-01

    Full Text Available Granular materials are found everywhere, yet they are poorly understood at the microscopic level. The main hindrance is how to connect the microscopic properties with the macroscopic behavior and propose a rigorous unified theory. One method is to test the existing theoretical models in various configurations. In this connection we have performed experiments on different configurations of granules in a silo to determine the validity of the Janssen model under such arrangements. Two- and four-layered structures of different bead diameters were prepared, and the effective mass at the bottom of the container was measured in each case. The investigation of layered structures reveals that such configurations also follow the Janssen model well. An interesting percolation phenomenon was observed when smaller beads were stacked on larger ones, although the model remained valid. Furthermore, it is demonstrated that the Janssen law holds for larger bead diameters.
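
    For reference, the Janssen model being tested predicts that the vertical pressure at depth z in a cylindrical silo saturates, so the apparent mass at the bottom saturates with the fill mass. In common notation (assumed here: ρ bulk density, R silo radius, μ wall friction coefficient, K ratio of horizontal to vertical stress):

        p(z) = \frac{\rho g R}{2 \mu K} \left( 1 - e^{-2 \mu K z / R} \right),
        \qquad
        m_{\mathrm{app}} = m_{\mathrm{sat}} \left( 1 - e^{-m_{\mathrm{fill}} / m_{\mathrm{sat}}} \right)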

  7. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Full Text Available Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  8. Are screening instruments valid for psychotic-like experiences? A validation study of screening questions for psychotic-like experiences using in-depth clinical interview.

    LENUS (Irish Health Repository)

    Kelleher, Ian

    2012-02-01

    Individuals who report psychotic-like experiences are at increased risk of future clinical psychotic disorder. They constitute a unique "high-risk" group for studying the developmental trajectory to schizophrenia and related illnesses. Previous research has used screening instruments to identify this high-risk group, but the validity of these instruments has not yet been established. We administered a screening questionnaire with 7 items designed to assess psychotic-like experiences to 334 adolescents aged 11-13 years. Detailed clinical interviews were subsequently carried out with a sample of these adolescents. We calculated sensitivity and specificity and positive predictive value (PPV) and negative predictive value (NPV) for each screening question for the specific symptom it enquired about and also in relation to any psychotic-like experience. The predictive power varied substantially between items, with the question on auditory hallucinations ("Have you ever heard voices or sounds that no one else can hear?") providing the best predictive power. For interview-verified auditory hallucinations specifically, this question had a PPV of 71.4% and an NPV of 90.4%. When assessed for its predictive power for any psychotic-like experience (including, but not limited to, auditory hallucinations), it provided a PPV of 100% and an NPV of 88.4%. Two further questions-relating to visual hallucinations and paranoid thoughts-also demonstrated good predictive power for psychotic-like experiences. Our results suggest that it may be possible to screen the general adolescent population for psychotic-like experiences with a high degree of accuracy using a short self-report questionnaire.
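
    The screening metrics reported here follow directly from the 2x2 table of screen result against interview-verified experience. A small generic Python helper (the study's actual cell counts are not reproduced in the record):

      def screening_metrics(tp, fp, fn, tn):
          """Sensitivity, specificity, PPV and NPV from a 2x2 table:
          tp/fp = screen-positive with/without the verified symptom,
          fn/tn = screen-negative with/without the verified symptom."""
          sensitivity = tp / (tp + fn)
          specificity = tn / (tn + fp)
          ppv = tp / (tp + fp)   # P(symptom | positive screen)
          npv = tn / (tn + fn)   # P(no symptom | negative screen)
          return sensitivity, specificity, ppv, npv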

  10. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models

    Science.gov (United States)

    van der Wijk, Lars; Proost, Johannes H.; Sinha, Bhanu; Touw, Daan J.

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to avoid
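
    MDPE and MDAPE, the two comparison statistics used above, are medians of the percentage prediction errors. A minimal Python sketch following the usual definition (assumed here, since the record does not spell it out):

      import numpy as np

      def mdpe_mdape(observed, predicted):
          """Median Prediction Error (bias) and Median Absolute
          Prediction Error (precision), both in percent, for observed
          vs. model-predicted serum concentrations (observed nonzero)."""
          obs = np.asarray(observed, dtype=float)
          pred = np.asarray(predicted, dtype=float)
          pe = 100.0 * (pred - obs) / obs
          return np.median(pe), np.median(np.abs(pe))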

  11. Experimental validation of flexible robot arm modeling and control

    Science.gov (United States)

    Ulsoy, A. Galip

    1989-01-01

    Flexibility is important for high speed, high precision operation of lightweight manipulators. Accurate dynamic modeling of flexible robot arms is needed. Previous work has mostly been based on linear elasticity with prescribed rigid body motions (i.e., no effect of flexible motion on rigid body motion). Little or no experimental validation of dynamic models for flexible arms is available. Experimental results are also limited for flexible arm control. Researchers include the effects of prismatic as well as revolute joints. They investigate the effect of full coupling between the rigid and flexible motions, and of axial shortening, and consider the control of flexible arms using only additional sensors.

  12. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    The actual power requirement of an active loudspeaker during playback of music has not received much attention in the literature. This is probably because no single and simple solution exists and because a complete system knowledge, from input voltage to output sound pressure level, is required. There are however many advantages that could be harvested from such knowledge, like size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated, and the model is expanded to include the closed and vented type enclosures...

  13. Validation of a Business Model for Cultural Heritage Institutions

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2015-01-01

    Full Text Available The paper proposes a business model for the efficiency optimization of the interaction between all actors involved in the cultural heritage sector, such as galleries, libraries, archives and museums (GLAM). The validation of the business model is the subject of analyses and implementations in real environments by different cultural institutions. The implementation of virtual exhibitions on mobile devices is described and analyzed as a key factor for increasing the visibility of cultural heritage. New perspectives on the development of virtual exhibitions for mobile devices are considered. A study on the number of visitors of cultural institutions is carried out and ways to increase the number of visitors are described.

  14. Characterization of a CLYC detector and validation of the Monte Carlo Simulation by measurement experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Suk; Ye, Sung Joon [Seoul National University, Seoul (Korea, Republic of)]; Smith, Martin B.; Koslowsky, Martin R. [Bubble Technology Industries Inc., Chalk River (Canada)]; Kwak, Sung Woo [Korea Institute of Nuclear Nonproliferation And Control (KINAC), Daejeon (Korea, Republic of)]; Kim, Gee Hyun [Sejong University, Seoul (Korea, Republic of)]

    2017-03-15

    Simultaneous detection of neutrons and gamma rays has become much more practicable by taking advantage of good gamma-ray discrimination properties using the pulse shape discrimination (PSD) technique. Recently, we introduced a commercial CLYC system in Korea, and performed initial characterization and simulation studies of the CLYC detector system to provide references for the future implementation of the dual-mode scintillator system in various studies and applications. We evaluated a CLYC detector with 95% 6Li enrichment using various gamma-ray sources and a 252Cf neutron source, with validation of our Monte Carlo simulation results via measurement experiments. Absolute full-energy peak efficiency values were calculated for the gamma-ray sources and the neutron source using MCNP6 and compared with measurement experiments using the calibration sources. In addition, behavioral characteristics of neutrons were validated by comparing simulations and experiments on neutron moderation with various polyethylene (PE) moderator thicknesses. Both results showed good agreement in the overall characteristics of the gamma and neutron detection efficiencies, with a consistent ~20% discrepancy. Furthermore, moderation of neutrons emitted from 252Cf showed similarities between the simulation and the experiment, in terms of their relative ratios depending on the thickness of the PE moderator. A CLYC detector system was characterized for its energy resolution and detection efficiency, and the Monte Carlo simulation of the detector system was validated experimentally. Validation of the simulation results in the overall trend of the CLYC detector behavior will provide the fundamental basis and validity for follow-up Monte Carlo simulation studies for the development of our dual-particle imager using a rotational modulation collimator.

  15. Validating a spatially distributed hydrological model with soil morphology data

    Directory of Open Access Journals (Sweden)

    T. Doppler

    2013-10-01

    Full Text Available Spatially distributed hydrological models are popular tools in hydrology and they are claimed to be useful to support management decisions. Despite the high spatial resolution of the computed variables, calibration and validation are often carried out only on discharge time series at specific locations due to the lack of spatially distributed reference data. Because of this restriction, the predictive power of these models, with regard to predicted spatial patterns, can usually not be judged. An example of spatial predictions in hydrology is the prediction of saturated areas in agricultural catchments. These areas can be important source areas for the transport of agrochemicals to the stream. We set up a spatially distributed model to predict saturated areas in a 1.2 km2 catchment in Switzerland with moderate topography. Around 40% of the catchment area is artificially drained. We measured weather data, discharge and groundwater levels in 11 piezometers for 1.5 yr. To broaden the spatially distributed data sets that can be used for model calibration and validation, we translated soil morphological data available from soil maps into an estimate of the duration of soil saturation in the soil horizons, using redox-morphology signs for these estimates. This resulted in a data set with high spatial coverage against which the model predictions were validated. In general, these saturation estimates corresponded well to the measured groundwater levels. We worked with a model that would be applicable for management decisions because of its fast calculation speed and rather low data requirements. We simultaneously calibrated the model to the groundwater levels in the piezometers and discharge. The model was able to reproduce the general hydrological behavior of the catchment in terms of discharge and absolute groundwater levels. However, the accuracy of the groundwater level predictions was not high enough to be used for the prediction of saturated areas

  16. Analyzing the Validity of Relationship Banking through Agent-based Modeling

    Science.gov (United States)

    Nishikido, Yukihito; Takahashi, Hiroshi

    This article analyzes the validity of relationship banking through agent-based modeling. In the analysis, we especially focus on the relationship between economic conditions and both lenders' and borrowers' behaviors. As a result of intensive experiments, we made the following interesting findings: (1) relationship banking contributes to reducing bad loans; (2) relationship banking is more effective in enhancing market growth than transaction banking when borrowers' sales scale is large; (3) keener competition among lenders may bring inefficiency to the market.

  17. Organic acid modeling and model validation: Workshop summary. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.J.; Eilers, J.M.

    1992-08-14

    A workshop was held in Corvallis, Oregon on April 9-10, 1992 at the offices of E&S Environmental Chemistry, Inc. The purpose of this workshop was to initiate research efforts on the project entitled "Incorporation of an organic acid representation into MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources." The workshop was attended by a team of internationally-recognized experts in the fields of surface water acid-base chemistry, organic acids, and watershed modeling. The rationale for the proposed research is based on the recent comparison between MAGIC model hindcasts and paleolimnological inferences of historical acidification for a set of 33 statistically-selected Adirondack lakes. Agreement between diatom-inferred and MAGIC-hindcast lakewater chemistry in the earlier research had been less than satisfactory. Based on preliminary analyses, it was concluded that incorporation of a reasonable organic acid representation into the version of MAGIC used for hindcasting was the logical next step toward improving model agreement.

  19. Assessment model validity document - HYDRASTAR. A stochastic continuum program for groundwater flow

    Energy Technology Data Exchange (ETDEWEB)

    Gylling, B. [Kemakta Konsult AB, Stockholm (Sweden)]; Eriksson, Lars [Equa Simulation AB, Sundbyberg (Sweden)]

    2001-12-01

    The present document addresses validation of the stochastic continuum model HYDRASTAR, designed for Monte Carlo simulations of groundwater flow in fractured rocks. Here, validation is defined as a process to demonstrate that a model concept is fit for its purpose. Preferably, validation is carried out by comparison of model predictions with independent field observations and experimental measurements. In addition, other sources can also be used to confirm that the model concept gives acceptable results. One method is to compare results with those achieved using other model concepts for the same set of input data. Another method is to compare model results with analytical solutions. The model concept HYDRASTAR has been used in several studies, including performance assessments of hypothetical repositories for spent nuclear fuel. In the performance assessments, the main tasks for HYDRASTAR have been to calculate groundwater travel time distributions, repository flux distributions, path lines and their exit locations. The results have then been used by other model concepts to calculate the near field release and far field transport. The aim and framework for the validation process include describing the applicability of the model concept for its purpose in order to build confidence in the concept. Preferably, this is done by comparison of simulation results with the corresponding field experiments or field measurements. Here, two comparisons with experimental results are reported; in both cases the agreement was reasonably fair. In the broader and more general context of the validation process, HYDRASTAR results have been compared with other models and analytical solutions. Commonly, the approximation calculations agree well with the medians of model ensemble results. Additional indications that HYDRASTAR is suitable for its purpose were obtained from the comparisons with results from other model concepts. Several verification studies have been made for

  20. Packed bed heat storage: Continuum mechanics model and validation

    Science.gov (United States)

    Knödler, Philipp; Dreißigacker, Volker; Zunft, Stefan

    2016-05-01

    Thermal energy storage (TES) systems are key elements for various types of new power plant concepts. As possible cost-effective storage inventory option, packed beds of miscellaneous material come into consideration. However, high technical risks arise due to thermal expansion and shrinking of the packed bed's particles during cyclic thermal operation, possibly leading to material failure. Therefore, suitable tools for designing the heat storage system are mandatory. While particle discrete models offer detailed simulation results, the computing time for large scale applications is inefficient. In contrast, continuous models offer time-efficient simulation results but are in need of effective packed bed parameters. This work focuses on providing insight into some basic methods and tools on how to obtain such parameters and on how they are implemented into a continuum model. In this context, a particle discrete model as well as a test rig for carrying out uniaxial compression tests (UCT) is introduced. Performing of experimental validation tests indicate good agreement with simulated UCT results. In this process, effective parameters required for a continuous packed bed model were identified and used for continuum simulation. This approach is validated by comparing the simulated results with experimental data from another test rig. The presented method significantly simplifies subsequent design studies.

  1. Validation of data-driven computational models of social perception of faces.

    Science.gov (United States)

    Todorov, Alexander; Dotsch, Ron; Porter, Jenny M; Oosterhof, Nikolaas N; Falvello, Virginia B

    2013-08-01

    People rapidly form impressions from facial appearance, and these impressions affect social decisions. We argue that data-driven, computational models are the best available tools for identifying the source of such impressions. Here we validate seven computational models of social judgments of faces: attractiveness, competence, dominance, extroversion, likability, threat, and trustworthiness. The models manipulate both face shape and reflectance (i.e., cues such as pigmentation and skin smoothness). We show that human judgments track the models' predictions (Experiment 1) and that the models differentiate between different judgments, though this differentiation is constrained by the similarity of the models (Experiment 2). We also make the validated stimuli available for academic research: seven databases containing 25 identities manipulated in the respective model to take on seven different dimension values, ranging from -3 SD to +3 SD (175 stimuli in each database). Finally, we show how the computational models can be used to control for shared variance of the models. For example, even for highly correlated dimensions (e.g., dominance and threat), we can identify cues specific to each dimension and, consequently, generate faces that vary only on these cues.

  2. Calibration and validation of DRAINMOD to model bioretention hydrology

    Science.gov (United States)

    Brown, R. A.; Skaggs, R. W.; Hunt, W. F.

    2013-04-01

    Previous field studies have shown that the hydrologic performance of bioretention cells varies greatly because of factors such as underlying soil type, physiographic region, drainage configuration, surface storage volume, drainage area to bioretention surface area ratio, and media depth. To more accurately describe bioretention hydrologic response, a long-term hydrologic model that generates a water balance is needed. Some current bioretention models lack the ability to perform long-term simulations and others have never been calibrated from field monitored bioretention cells with underdrains. All peer-reviewed models lack the ability to simultaneously perform both of the following functions: (1) model an internal water storage (IWS) zone drainage configuration and (2) account for soil-water content using the soil-water characteristic curve. DRAINMOD, a widely-accepted agricultural drainage model, was used to simulate the hydrologic response of runoff entering a bioretention cell. The concepts of water movement in bioretention cells are very similar to those of agricultural fields with drainage pipes, so many bioretention design specifications corresponded directly to DRAINMOD inputs. Detailed hydrologic measurements were collected from two bioretention field sites in Nashville and Rocky Mount, North Carolina, to calibrate and test the model. Each field site had two sets of bioretention cells with varying media depths, media types, drainage configurations, underlying soil types, and surface storage volumes. After 12 months, one of these characteristics was altered - surface storage volume at Nashville and IWS zone depth at Rocky Mount. At Nashville, during the second year (post-repair period), the Nash-Sutcliffe coefficients for drainage and exfiltration/evapotranspiration (ET) both exceeded 0.8 during the calibration and validation periods. During the first year (pre-repair period), the Nash-Sutcliffe coefficients for drainage, overflow, and exfiltration
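
    The Nash-Sutcliffe coefficient used to judge the calibration above compares the residual variance of the simulation to the variance of the observations. A generic Python sketch:

      import numpy as np

      def nash_sutcliffe(observed, simulated):
          """Nash-Sutcliffe efficiency: 1 for a perfect fit; the > 0.8
          values reported here indicate close agreement."""
          obs = np.asarray(observed, dtype=float)
          sim = np.asarray(simulated, dtype=float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)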

  3. Neural network models of learning and categorization in multigame experiments

    Directory of Open Access Journals (Sweden)

    Davide Marchiori

    2011-12-01

    Full Text Available Previous research has shown that regret-driven neural networks predict behavior in repeated completely mixed games remarkably well, substantially equating the performance of the most accurate established models of learning. This result prompts the question of what is the added value of modeling learning through neural networks. We submit that this modeling approach allows for models that are able to distinguish among and respond differently to different payoff structures. Moreover, the process of categorization of a game is implicitly carried out by these models, thus without the need of any external explicit theory of similarity between games. To validate our claims, we designed and ran two multigame experiments in which subjects faced, in random sequence, different instances of two completely mixed 2x2 games. Then, we tested on our experimental data two regret-driven neural network models, and compared their performance with that of other established models of learning and Nash equilibrium.

  4. Second-order model selection in mixture experiments

    Energy Technology Data Exchange (ETDEWEB)

    Redgate, P.E.; Piepel, G.F.; Hrma, P.R.

    1992-07-01

    Full second-order models for q-component mixture experiments contain q(q+1)/2 terms, which increases rapidly as q increases. Fitting full second-order models for larger q may involve problems with ill-conditioning and overfitting. These problems can be remedied by transforming the mixture components and/or fitting reduced forms of the full second-order mixture model. Various component transformation and model reduction approaches are discussed. Data from a 10-component nuclear waste glass study are used to illustrate ill-conditioning and overfitting problems that can be encountered when fitting a full second-order mixture model. Component transformation, model term selection, and model evaluation/validation techniques are discussed and illustrated for the waste glass example.
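
    The term count comes from the Scheffé quadratic mixture model, which has no intercept or pure quadratic terms because the components sum to one:

        \hat{y} = \sum_{i=1}^{q} \beta_i x_i + \sum_{i<j} \beta_{ij} x_i x_j,
        \qquad \sum_{i=1}^{q} x_i = 1

    This gives q + q(q-1)/2 = q(q+1)/2 coefficients; for the q = 10 waste glass study that is 55 terms, hence the ill-conditioning and overfitting concerns.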

  5. Multicomponent aerosol dynamics model UHMA: model development and validation

    Directory of Open Access Journals (Sweden)

    H. Korhonen

    2004-01-01

    Full Text Available A size-segregated aerosol dynamics model UHMA (University of Helsinki Multicomponent Aerosol model) was developed for studies of multicomponent tropospheric aerosol particles. The model includes major aerosol microphysical processes in the atmosphere with a focus on new particle formation and growth; thus it incorporates particle coagulation and multicomponent condensation, applying a revised treatment of condensation flux onto free molecular regime particles and the activation of nanosized clusters by organic vapours (Nano-Köhler theory), as well as recent parameterizations for binary H2SO4-H2O and ternary H2SO4-NH3-H2O homogeneous nucleation and dry deposition. The representation of particle size distribution can be chosen from three sectional methods: the hybrid method, the moving center method, and the retracking method in which moving sections are retracked to a fixed grid after a certain time interval. All these methods can treat particle emissions and atmospheric transport consistently, and are therefore suitable for use in large scale atmospheric models. In a test simulation against an accurate high resolution solution, all the methods showed reasonable treatment of new particle formation with 20 size sections although the hybrid and the retracking methods suffered from artificial widening of the distribution. The moving center approach, on the other hand, showed extra dents in the particle size distribution and failed to predict the onset of detectable particle formation. In a separate test simulation of an observed nucleation event, the model captured the key qualitative behaviour of the system well. Furthermore, its prediction of the organic volume fraction in newly formed particles, suggesting values as high as 0.5 for 3–4 nm particles and approximately 0.8 for 10 nm particles, agrees with recent indirect composition measurements.

  7. Multicomponent aerosol dynamics model UHMA: model development and validation

    Science.gov (United States)

    Korhonen, H.; Lehtinen, K. E. J.; Kulmala, M.

    2004-05-01

    A size-segregated aerosol dynamics model UHMA (University of Helsinki Multicomponent Aerosol model) was developed for studies of multicomponent tropospheric aerosol particles. The model includes major aerosol microphysical processes in the atmosphere with a focus on new particle formation and growth; thus it incorporates particle coagulation and multicomponent condensation, applying a revised treatment of condensation flux onto free molecular regime particles and the activation of nanosized clusters by organic vapours (Nano-Köhler theory), as well as recent parameterizations for binary H2SO4-H2O and ternary H2SO4-NH3-H2O homogeneous nucleation and dry deposition. The representation of particle size distribution can be chosen from three sectional methods: the hybrid method, the moving center method, and the retracking method in which moving sections are retracked to a fixed grid after a certain time interval. All these methods can treat particle emissions and atmospheric transport consistently, and are therefore suitable for use in large scale atmospheric models. In a test simulation against an accurate high resolution solution, all the methods showed reasonable treatment of new particle formation with 20 size sections although the hybrid and the retracking methods suffered from artificial widening of the distribution. The moving center approach, on the other hand, showed extra dents in the particle size distribution and failed to predict the onset of detectable particle formation. In a separate test simulation of an observed nucleation event, the model captured the key qualitative behaviour of the system well. Furthermore, its prediction of the organic volume fraction in newly formed particles, suggesting values as high as 0.5 for 3-4 nm particles and approximately 0.8 for 10 nm particles, agrees with recent indirect composition measurements.
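
    The retracking scheme described above is easy to illustrate. Below is a minimal Python sketch, not UHMA code: moving sections grow by condensation and are periodically re-binned onto a fixed diameter grid. The growth rate, grid and retracking interval are assumed for illustration, and the nearest-bin re-binning conserves particle number only (a real scheme also splits mass between adjacent bins).

      import numpy as np

      # Fixed diameter grid with 20 sections, as in the test simulation above
      edges = np.logspace(np.log10(1e-9), np.log10(1e-6), 21)   # section edges [m]
      centers = np.sqrt(edges[:-1] * edges[1:])                  # geometric midpoints

      def retrack(diam, number):
          # Re-bin moving sections onto the fixed grid (number-conserving)
          n_new = np.zeros(centers.size)
          for i, n in zip(np.digitize(diam, edges) - 1, number):
              if 0 <= i < n_new.size:
                  n_new[i] += n
          return n_new

      number = np.zeros(centers.size)
      number[0] = 1.0e9                      # freshly nucleated particles [m^-3]
      diam = centers.copy()
      for step in range(360):
          diam = diam + 2.8e-13 * 10.0       # ~1 nm/h growth, dt = 10 s (assumed)
          if step % 10 == 9:                 # retrack every 10 steps
              number, diam = retrack(diam, number), centers.copy()

      print(f"total number conserved: {number.sum():.3g} m^-3")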

  8. Development, Validation and Parametric study of a 3-Year-Old Child Head Finite Element Model

    Science.gov (United States)

    Cui, Shihai; Chen, Yue; Li, Haiyan; Ruan, ShiJie

    2015-12-01

    Traumatic brain injury caused by falls and traffic accidents is an important cause of death and disability in children. Recently, finite element (FE) head models have been developed to investigate brain injury mechanisms and biomechanical responses. Based on CT data of a healthy 3-year-old child head, an FE head model with detailed anatomical structure was developed. Deep brain structures such as the white matter, gray matter, cerebral ventricles and hippocampus were created in this FE model for the first time. The FE model was validated by reconstructing child and adult cadaver experiments and comparing the simulation results with the cadaver test data. In addition, the effects of skull stiffness on the dynamic responses of the child head were further investigated. All the simulation results confirmed the good biofidelity of the FE model.

  9. Solar Module Modeling, Simulation And Validation Under Matlab / Simulink

    Directory of Open Access Journals (Sweden)

    M.Diaw

    2016-09-01

    Solar modules are systems which convert sunlight into electricity using the physics of semiconductors. Mathematical modeling of these systems uses weather data such as irradiance and temperature as inputs and provides the current, voltage or power as outputs, which allows plotting the characteristic curve giving the current I as a function of the voltage V for photovoltaic cells. In this work, we have developed a one-diode model of a photovoltaic module in the Matlab/Simulink environment. From this model, we have plotted the I-V and P-V characteristic curves of the solar cell for different values of temperature and irradiance. The validation has been done by comparing the power measured on a 20 W HORONYA solar panel with that obtained from the model.
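
    For readers who want to reproduce such characteristic curves, a minimal single-diode sketch is given below (in Python rather than the record's Matlab/Simulink). Series and shunt resistances are neglected, and all parameter values are assumptions chosen so the illustrative module lands near 20 W; they are not HORONYA datasheet values.

      import numpy as np

      q, k = 1.602e-19, 1.381e-23  # elementary charge [C], Boltzmann constant [J/K]

      def iv_curve(G, T_c, Iph_stc=1.3, I0=1e-7, n=1.3, Ns=36):
          # Ideal single-diode module: I = Iph - I0*(exp(V/Vt) - 1)
          Vt = n * k * T_c / q * Ns          # thermal voltage of Ns cells in series
          Iph = Iph_stc * G / 1000.0         # photocurrent scales with irradiance
          V = np.linspace(0.0, 22.0, 1000)
          I = Iph - I0 * (np.exp(V / Vt) - 1.0)
          keep = I >= 0.0
          return V[keep], I[keep]

      V, I = iv_curve(G=1000.0, T_c=298.15)  # STC-like conditions
      P = V * I
      print(f"Voc ~ {V[-1]:.1f} V, Isc ~ {I[0]:.2f} A, Pmax ~ {P.max():.1f} W")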

  10. Dynamic Modeling of Wind Turbine Gearboxes and Experimental Validation

    DEFF Research Database (Denmark)

    Pedersen, Rune

    … is presented. The model takes into account the effects of load and applied grinding corrections. The results are verified by comparing to simulated and experimental results reported in the existing literature. Using gear data loosely based on a 1 MW wind turbine gearbox, the gear mesh stiffness is expanded … analysis in relation to gear dynamics. A multibody model of two complete 2.3 MW wind turbine gearboxes mounted back-to-back in a test rig is built. The mean values of the proposed gear mesh stiffnesses are included. The model is validated by comparing with calculated and measured eigenfrequencies and mode shapes. The measured eigenfrequencies have been identified in accelerometer signals obtained during run-up tests. Since the calculated eigenfrequencies do not match the measured eigenfrequencies with sufficient accuracy, a model updating technique is applied to ensure a better match by adjusting …

  11. Two-phase CFD PTS validation in an extended range of thermohydraulics conditions covered by the COSI experiment

    Energy Technology Data Exchange (ETDEWEB)

    Coste, P., E-mail: pierre.coste@cea.fr [CEA, DEN/DANS/DM2S/SMTF/LMSF, 17 rue des Martyrs, 38054 Grenoble (France); Ortolan, A. [CEA, DEN/DANS/DM2S/SMTF/LMSF, 17 rue des Martyrs, 38054 Grenoble (France); ENSICA Engineering School, Toulouse (France)

    2014-11-15

    Highlights: • Models for large interfaces in two-phase CFD were developed for PTS. • The COSI experiment is used for NEPTUNE_CFD integral validation. • COSI is a PWR cold leg scaled 1/100 for volume. • Fifty runs are calculated, covering a large range of flow configurations. • The CFD predicting capability is analysed using global and local measurements. - Abstract: In the context of the Pressurized Water Reactors (PWR) life duration safety studies, some models were developed to address the Pressurized Thermal Shock (PTS) from the two-phase CFD angle, dealing with interfaces much larger than the cell size and with direct contact condensation. Such models were implemented in NEPTUNE_CFD, a 3D transient Eulerian two-fluid model. The COSI experiment is used for its integral validation. It represents a cold leg scaled 1/100 for volume and power from a 900 MW PWR under a large range of LOCA PTS conditions. In this study, the CFD is evaluated in the whole range of parameters and flow configurations covered by the experiment. In a first step, a single choice of mesh and CFD model parameters is fixed and justified. In a second step, fifty runs are calculated. The CFD predicting capability is analysed, comparing the liquid temperature and the total condensation rate with the experiment, discussing their dependency on the inlet cold liquid rate, on the liquid level in the cold leg and on the difference between co-current and counter-current runs. It is shown that NEPTUNE_CFD 1.0.8 calculates with a fair agreement a large range of flow configurations related to ECCS injection and steam condensation.

  12. Deviatoric constitutive model: domain of strain rate validity

    Energy Technology Data Exchange (ETDEWEB)

    Zocher, Marvin A [Los Alamos National Laboratory

    2009-01-01

    A case is made for using an enhanced methodology in determining the parameters that appear in a deviatoric constitutive model. Predictability rests on our ability to solve a properly posed initial boundary value problem (IBVP), which incorporates an accurate reflection of material constitutive behavior. That reflection is provided through the constitutive model. Moreover, the constitutive model is required for mathematical closure of the IBVP. Common practice in the shock physics community is to divide the Cauchy tensor into spherical and deviatoric parts, and to develop separate models for spherical and deviatoric constitutive response. Our focus shall be on the Cauchy deviator and deviatoric constitutive behavior. Discussions related to the spherical part of the Cauchy tensor are reserved for another time. A number of deviatoric constitutive models have been developed for utilization in the solution of IBVPs that are of interest to those working in the field of shock physics. All of these models are phenomenological and contain a number of parameters that must be determined in light of experimental data. The methodology employed in determining these parameters dictates the loading regime over which the model can be expected to be accurate. The focus of this paper is the methodology employed in determining model parameters and the consequences of that methodology as it relates to the domain of strain rate validity. We shall begin by describing the methodology that is typically employed. We shall discuss limitations imposed upon predictive capability by the typically employed methodology. We shall propose a modification to the typically employed methodology that significantly extends the domain of strain rate validity.

  13. Modeling, Robust Control, and Experimental Validation of a Supercavitating Vehicle

    Science.gov (United States)

    Escobar Sanabria, David

    This dissertation considers the mathematical modeling, control under uncertainty, and experimental validation of an underwater supercavitating vehicle. By traveling inside a gas cavity, a supercavitating vehicle reduces hydrodynamic drag, increases speed, and minimizes power consumption. The attainable speed and power efficiency make these vehicles attractive for undersea exploration, high-speed transportation, and defense. However, the benefits of traveling inside a cavity come with difficulties in controlling the vehicle dynamics. The main challenge is the nonlinear force that arises when the back-end of the vehicle pierces the cavity. This force, referred to as planing, leads to oscillatory motion and instability. Control technologies that are robust to planing and suited for practical implementation need to be developed. To enable these technologies, a low-order vehicle model that accounts for inaccuracy in the characterization of planing is required. Additionally, an experimental method to evaluate possible pitfalls in the models and controllers is necessary before undersea testing. The major contribution of this dissertation is a unified framework for mathematical modeling, robust control synthesis, and experimental validation of a supercavitating vehicle. First, we introduce affordable experimental methods for mathematical modeling and controller testing under planing and realistic flow conditions. Then, using experimental observations and physical principles, we create a low-order nonlinear model of the longitudinal vehicle motion. This model quantifies the planing uncertainty and is suitable for robust controller synthesis. Next, based on the vehicle model, we develop automated tools for synthesizing controllers that deliver a certificate of performance in the face of nonlinear and uncertain planing forces. We demonstrate theoretically and experimentally that the proposed controllers ensure higher performance when the uncertain planing dynamics are

  14. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    Science.gov (United States)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-08-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied to the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency of 4.6%. The system had a vacuum glass enclosure, a flat panel (absorber), a thermoelectric generator and water circulation for the cold side. The theoretical and numerical approach of the current study validated the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software for Fluent and Thermal-Electric Systems. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. This analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, using two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.
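
    Efficiencies like the 4.6% and 7% quoted above can be sanity-checked with the textbook ideal-TEG expression. The sketch below assumes constant material properties; the ZT value and junction temperatures are illustrative, not those of the prototype.

      import math

      def teg_efficiency(T_h, T_c, ZT):
          # Ideal thermoelectric generator efficiency with optimal load matching
          carnot = 1.0 - T_c / T_h
          m = math.sqrt(1.0 + ZT)
          return carnot * (m - 1.0) / (m + T_c / T_h)

      print(f"eta = {teg_efficiency(T_h=500.0, T_c=300.0, ZT=1.0):.1%}")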

  15. Model Validation for Propulsion - On the TFNS and LES Subgrid Models for a Bluff Body Stabilized Flame

    Science.gov (United States)

    Wey, Thomas

    2017-01-01

    This paper summarizes the reacting-flow results of simulating a bluff-body stabilized flame experiment in the Volvo validation rig using a releasable edition of the National Combustion Code (NCC). The turbulence models selected to investigate the configuration are the subgrid-scale kinetic energy coupled large eddy simulation (K-LES) and the time-filtered Navier-Stokes (TFNS) simulation. The turbulence-chemistry interaction used is linear eddy mixing (LEM).

  16. A new validation-assessment tool for health-economic decision models

    NARCIS (Netherlands)

    Mauskopf, J.; Vemer, P.; Voorn, van G.A.K.; Corro Ramos, I.

    2014-01-01

    A validation-assessment tool is being developed for decision makers to transparently and consistently evaluate the validation status of different health-economic decision models. It is designed as a list of validation techniques covering all relevant aspects of model validation to be filled in by

  17. Quantitative Validation of a Human Body Finite Element Model Using Rigid Body Impacts.

    Science.gov (United States)

    Vavalle, Nicholas A; Davis, Matthew L; Stitzel, Joel D; Gayzik, F Scott

    2015-09-01

    Validation is a critical step in finite element model (FEM) development. This study focuses on the validation of the Global Human Body Models Consortium full-body average male occupant FEM in five localized loading regimes: a chest impact, a shoulder impact, a thoracoabdominal impact, an abdominal impact, and a pelvic impact. Force and deflection outputs from the model were compared to experimental traces and corridors scaled to the 50th percentile male. Predicted fractures and injury severity measures were compared to evaluate the model's injury prediction capabilities. The methods of ISO/TS 18571 were used to quantitatively assess the fit of model outputs to experimental force and deflection traces. The model produced peak chest, shoulder, thoracoabdominal, abdominal, and pelvis forces of 4.8, 3.3, 4.5, 5.1, and 13.0 kN compared to 4.3, 3.2, 4.0, 4.0, and 10.3 kN in the experiments, respectively. The model predicted rib and pelvic fractures related to Abbreviated Injury Scale scores within the ranges found experimentally in all cases except the abdominal impact. ISO/TS 18571 scores for the impacts studied had a mean of 0.73 with a range of 0.57-0.83. Well-validated FEMs are important tools used by engineers in advancing occupant safety.
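
    A simplified sketch of such trace scoring is given below. It is emphatically not the ISO/TS 18571 algorithm, only an illustration of rating a simulated force trace against a measured one with rmsd, peak-magnitude and shape terms; the traces are synthetic, with peaks echoing the chest values quoted above.

      import numpy as np

      def simple_scores(model, exper):
          rmsd = float(np.sqrt(np.mean((model - exper) ** 2)))
          magnitude = 1.0 - abs(model.max() - exper.max()) / exper.max()
          shape = float(np.corrcoef(model, exper)[0, 1])
          return rmsd, magnitude, shape

      t = np.linspace(0.0, 0.05, 500)                       # 50 ms impact window
      exper = 4.3e3 * np.exp(-((t - 0.020) / 0.008) ** 2)   # measured-like force [N]
      model = 4.8e3 * np.exp(-((t - 0.021) / 0.009) ** 2)   # simulated force [N]
      rmsd, mag, shape = simple_scores(model, exper)
      print(f"rmsd = {rmsd:.0f} N, magnitude = {mag:.2f}, shape = {shape:.3f}")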

  18. Validation of a numerical FSI simulation of an aortic BMHV by in vitro PIV experiments.

    Science.gov (United States)

    Annerel, S; Claessens, T; Degroote, J; Segers, P; Vierendeels, J

    2014-08-01

    In this paper, a validation of a recently developed fluid-structure interaction (FSI) coupling algorithm to simulate numerically the dynamics of an aortic bileaflet mechanical heart valve (BMHV) is performed. This validation is done by comparing the numerical simulation results with in vitro experiments. For the in vitro experiments, the leaflet kinematics and flow fields are obtained via the particle image velocimetry (PIV) technique. Subsequently, the same case is numerically simulated by the coupling algorithm and the resulting leaflet kinematics and flow fields are obtained. Finally, the results are compared, revealing great similarity in leaflet motion and flow fields between the numerical simulation and the experimental test. Therefore, it is concluded that the developed algorithm is able to capture very accurately all the major leaflet kinematics and dynamics and can be used to study and optimize the design of BMHVs.

  19. Experiment of Laser Pointing Stability on Different Surfaces to validate Micrometric Positioning Sensor

    CERN Document Server

    Mainaud Durand, Helene; Piedigrossi, Didier; Sandomierski, Jacek; Sosin, Mateusz; Geiger, Alain; Guillaume, Sebastien

    2014-01-01

    CLIC requires 10 μm precision and accuracy over 200m for the pre-alignment of beam related components. A solution based on laser beam as straight line reference is being studied at CERN. It involves camera/shutter assemblies as micrometric positioning sensors. To validate the sensors, it is necessary to determine an appropriate material for the shutter in terms of laser pointing stability. Experiments are carried out with paper, metal and ceramic surfaces. This paper presents the standard deviations of the laser spot coordinates obtained on the different surfaces, as well as the measurement error. Our experiments validate the choice of paper and ceramic for the shutter of the micrometric positioning sensor. It also provides an estimate of the achievable precision and accuracy of the determination of the laser spot centre with respect to the shutter coordinate system defined by reference targets.

  20. Validation of spatial variability in downscaling results from the VALUE perfect predictor experiment

    Science.gov (United States)

    Widmann, Martin; Bedia, Joaquin; Gutiérrez, Jose Manuel; Maraun, Douglas; Huth, Radan; Fischer, Andreas; Keller, Denise; Hertig, Elke; Vrac, Mathieu; Wibig, Joanna; Pagé, Christian; Cardoso, Rita M.; Soares, Pedro MM; Bosshard, Thomas; Casado, Maria Jesus; Ramos, Petra

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research. Within VALUE a systematic validation framework has been developed to enable the assessment and comparison of both dynamical and statistical downscaling methods. In the first validation experiment the downscaling methods are validated in a setup with perfect predictors taken from the ERA-Interim reanalysis for the period 1997-2008. This allows investigating the isolated skill of downscaling methods without further error contributions from the large-scale predictors. One aspect of the validation is the representation of spatial variability. As part of the VALUE validation we have compared various properties of the spatial variability of downscaled daily temperature and precipitation with the corresponding properties in observations. We have used two validation datasets: a European-wide set of 86 stations, and a higher-density network of 50 stations in Germany. Here we present results based on three approaches, namely the analysis of (i) correlation matrices, (ii) pairwise joint threshold exceedances, and (iii) regions of similar variability. We summarise the information contained in correlation matrices by calculating the dependence of the correlations on distance and deriving decorrelation lengths, as well as by determining the independent degrees of freedom. Probabilities for joint threshold exceedances and (where appropriate) non-exceedances are calculated for various user-relevant thresholds related for instance to extreme precipitation or frost and heat days. The dependence of these probabilities on distance is again characterised by calculating typical length scales that separate dependent from independent exceedances. Regionalisation is based on rotated Principal Component Analysis. The results indicate which downscaling methods are preferable if the dependency of variability at different locations is relevant for the user.
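
    The decorrelation-length diagnostic mentioned above can be illustrated compactly. The sketch below generates a synthetic station transect with an exponential spatial covariance and recovers the length scale from the pairwise correlation matrix; the network size, domain and true length are assumed for illustration.

      import numpy as np

      rng = np.random.default_rng(1)
      n_sta, n_days, L_true = 40, 3000, 250.0       # length scale in km (assumed)
      x = np.sort(rng.uniform(0.0, 1000.0, n_sta))  # station positions on a transect

      dist = np.abs(x[:, None] - x[None, :])
      cov = np.exp(-dist / L_true)                  # exponential spatial covariance
      fields = rng.multivariate_normal(np.zeros(n_sta), cov, size=n_days).T

      corr = np.corrcoef(fields)
      iu = np.triu_indices(n_sta, k=1)
      pos = corr[iu] > 0.05                         # keep pairs with usable correlation
      slope = np.polyfit(dist[iu][pos], np.log(corr[iu][pos]), 1)[0]
      print(f"fitted decorrelation length ~ {-1.0 / slope:.0f} km (true {L_true:.0f} km)")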

  1. Methodology and issues of integral experiments selection for nuclear data validation

    Science.gov (United States)

    Ivanova, Tatiana; Ivanov, Evgeny; Hill, Ian

    2017-09-01

    Nuclear data validation involves a large suite of Integral Experiments (IEs) for criticality, reactor physics and dosimetry applications [1]. Often benchmarks are taken from international handbooks [2, 3]. Depending on the application, IEs have different degrees of usefulness in validation, and usually the use of a single benchmark is not advised; indeed, it may lead to erroneous interpretation and results [1]. This work aims at quantifying the importance of benchmarks used in application-dependent cross section validation. The approach is based on the well-known Generalized Linear Least Squares Method (GLLSM), extended to establish biases and uncertainties for given cross sections (within a given energy interval). The statistical treatment results in a vector of weighting factors for the integral benchmarks. These factors characterize the value added by a benchmark for nuclear data validation for the given application. The methodology is illustrated by one example, selecting benchmarks for 239Pu cross section validation. The studies were performed in the framework of Subgroup 39 (Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files) established at the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD).
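
    The generic GLLSM update behind such benchmark weighting can be written in a few lines. The sketch below shows the textbook form with a prior vector x (covariance C), benchmark responses y (covariance V) and sensitivity matrix S; it is not the Subgroup 39 implementation, and all numbers are invented for illustration.

      import numpy as np

      def glls_update(x, C, S, y, V):
          # Posterior mean and covariance of the adjusted parameters
          K = C @ S.T @ np.linalg.inv(S @ C @ S.T + V)   # gain matrix
          return x + K @ (y - S @ x), C - K @ S @ C

      x = np.array([1.00, 2.00])                   # prior (normalized) cross sections
      C = np.diag([0.05**2, 0.08**2])              # prior covariance
      S = np.array([[0.6, 0.4], [0.2, 0.8]])       # benchmark sensitivities
      y = np.array([1.42, 1.77])                   # measured benchmark responses
      V = np.diag([0.02**2, 0.03**2])              # benchmark covariance
      x_post, C_post = glls_update(x, C, S, y, V)
      print(x_post, np.sqrt(np.diag(C_post)))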

  2. A Component-Based Modeling and Validation Method for PLC Systems

    Directory of Open Access Journals (Sweden)

    Rui Wang

    2014-05-01

    Programmable logic controllers (PLCs) are complex embedded systems that are widely used in industry. This paper presents a component-based modeling and validation method for PLC systems using the behavior-interaction-priority (BIP) framework. We designed a general system architecture and a component library for a type of device control system. The control software and the hardware of the environment were all modeled as BIP components. System requirements were formalized as monitors. Simulation was carried out to validate the system model. A realistic industrial example, a gates control system, was employed to illustrate our strategies. We found a couple of design errors during the simulation, which helped us to improve the dependability of the original systems. The experimental results demonstrate the effectiveness of our approach.

  3. Firn Model Intercomparison Experiment (FirnMICE)

    DEFF Research Database (Denmark)

    Lundin, Jessica M.D.; Stevens, C. Max; Arthern, Robert

    2017-01-01

    Evolution of cold dry snow and firn plays important roles in glaciology; however, the physical formulation of a densification law is still an active research topic. We forced eight firn-densification models and one seasonal-snow model in six different experiments by imposing step changes in temperature …

  4. Rate-based modelling and validation of a pilot absorber using MDEA enhanced with carbonic anhydrase (CA)

    DEFF Research Database (Denmark)

    Gaspar, Jozsef; Gladis, Arne; Woodley, John

    2017-01-01

    … solvent-regeneration energy demand. The focus of this work is to develop a rate-based model for CO2 absorption using MDEA enhanced with CA and to validate it against pilot-scale absorption experiments. In this work, we compare model predictions to measured temperature and CO2 concentration profiles …

  5. Validation of numerical models for flow simulation in labyrinth seals

    Science.gov (United States)

    Frączek, D.; Wróblewski, W.

    2016-10-01

    CFD results were compared with the results of experiments for the flow through the labyrinth seal. RANS turbulence models (k-epsilon, k-omega, SST and SST-SAS) were selected for the study. Steady and transient results were analyzed. ANSYS CFX was used for numerical computation. The analysis included flow through sealing section with the honeycomb land. Leakage flows and velocity profiles in the seal were compared. In addition to the comparison of computational models, the divergence of modeling and experimental results has been determined. Tips for modeling these problems were formulated.

  6. Validation of symptom validity tests using a "child-model" of adult cognitive impairments

    NARCIS (Netherlands)

    Rienstra, A.; Spaan, P.E.J.; Schmand, B.

    2010-01-01

    Validation studies of symptom validity tests (SVTs) in children are uncommon. However, since children’s cognitive abilities are not yet fully developed, their performance may provide additional support for the validity of these measures in adult populations. Four SVTs, the Test of Memory Malingering

  7. Validation experiment of a numerically processed millimeter-wave interferometer in a laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Kogi, Y., E-mail: kogi@fit.ac.jp; Higashi, T.; Matsukawa, S. [Department of Information Electronics, Fukuoka Institute of Technology, Fukuoka 811-0295 (Japan); Mase, A. [Art, Science and Technology Center for Cooperative Research, Kyushu University, Kasuga, Fukuoka 816-0811 (Japan); Kohagura, J.; Yoshikawa, M. [Plasma Research Center, University of Tsukuba, Tsukuba, Ibaraki 305-8577 (Japan); Nagayama, Y.; Kawahata, K. [National Institute for Fusion Science, Toki, Gifu 509-5202 (Japan); Kuwahara, D. [Tokyo University of Agriculture and Technology, Koganei, Tokyo 184-8588 (Japan)

    2014-11-15

    We propose a new interferometer system for density profile measurements. This system produces multiple measurement chords by a leaky-wave antenna driven by multiple frequency inputs. The proposed system was validated in laboratory evaluation experiments. We confirmed that the interferometer generates a clear image of a Teflon plate as well as the phase shift corresponding to the plate thickness. In another experiment, we confirmed that quasi-optical mirrors can produce multiple measurement chords; however, the finite spot size of the probe beam degrades the sharpness of the resulting image.

  8. Validation experiment of a numerically processed millimeter-wave interferometer in a laboratory.

    Science.gov (United States)

    Kogi, Y; Higashi, T; Matsukawa, S; Mase, A; Kohagura, J; Nagayama, Y; Kawahata, K; Kuwahara, D; Yoshikawa, M

    2014-11-01

    We propose a new interferometer system for density profile measurements. This system produces multiple measurement chords by a leaky-wave antenna driven by multiple frequency inputs. The proposed system was validated in laboratory evaluation experiments. We confirmed that the interferometer generates a clear image of a Teflon plate as well as the phase shift corresponding to the plate thickness. In another experiment, we confirmed that quasi-optical mirrors can produce multiple measurement chords; however, the finite spot size of the probe beam degrades the sharpness of the resulting image.

  9. MOLECULAR VALIDATED MODEL FOR ADSORPTION OF PROTONATED DYE ON LDH

    Directory of Open Access Journals (Sweden)

    B. M. Braga

    Hydrotalcite-like compounds are anionic clays of scientific and technological interest for their use as ion exchange materials, catalysts and modified electrodes. Surface phenomena are important for all these applications. Although conventional analytical methods have enabled progress in understanding the behavior of anionic clays in solution, an evaluation at the atomic scale of the dynamics of their ionic interactions has never been performed. Molecular simulation has become an extremely useful tool to provide this perspective. Our purpose is to validate a simplified model for the adsorption of 5-benzoyl-4-hydroxy-2-methoxy-benzenesulfonic acid (MBSA), a prototype molecule of anionic dyes, onto a hydrotalcite surface. Monte Carlo simulations were performed in the canonical ensemble with MBSA ions and a pore model of hydrotalcite, using the UFF and ClayFF force fields. The proposed molecular model has allowed us to reproduce experimental data from atomic force microscopy. Influences of protonation during the adsorption process are also presented.

  10. Experimental Validation of a Dynamic Model for Lightweight Robots

    Directory of Open Access Journals (Sweden)

    Alessandro Gasparetto

    2013-03-01

    Nowadays, one of the main topics in robotics research is dynamic performance improvement by means of a lightening of the overall system structure. The effective motion and control of these lightweight robotic systems requires suitable motion planning and control processes. In order to do so, model-based approaches can be adopted by exploiting accurate dynamic models that take into account the inertial and elastic terms that are usually neglected in a heavy rigid link configuration. In this paper, an effective method for modelling spatial lightweight industrial robots based on an Equivalent Rigid Link System approach is considered from an experimental validation perspective. A dynamic simulator implementing the formulation is used and an experimental test bench is set up. Experimental tests are carried out with a benchmark L-shape mechanism.

  11. Numerical modelling of transdermal delivery from matrix systems: parametric study and experimental validation with silicone matrices.

    Science.gov (United States)

    Snorradóttir, Bergthóra S; Jónsdóttir, Fjóla; Sigurdsson, Sven Th; Másson, Már

    2014-08-01

    A model is presented for transdermal drug delivery from single-layered silicone matrix systems. The work is based on our previous results that, in particular, extend the well-known Higuchi model. Recently, we have introduced a numerical transient model describing matrix systems where the drug dissolution can be non-instantaneous. Furthermore, our model can describe complex interactions within a multi-layered matrix and the matrix to skin boundary. The power of the modelling approach presented here is further illustrated by allowing the possibility of a donor solution. The model is validated by a comparison with experimental data, as well as validating the parameter values against each other, using various configurations with donor solution, silicone matrix and skin. Our results show that the model is a good approximation to real multi-layered delivery systems. The model offers the ability of comparing drug release for ibuprofen and diclofenac, which cannot be analysed by the Higuchi model because the dissolution in the latter case turns out to be limited. The experiments and numerical model outlined in this study could also be adjusted to more general formulations, which enhances the utility of the numerical model as a design tool for the development of drug-loaded matrices for trans-membrane and transdermal delivery.
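
    For context, the classical Higuchi limit that the numerical model generalizes is a one-line formula; the sketch below evaluates it with assumed diffusivity, loading and solubility values, not the silicone-matrix data of the study.

      import numpy as np

      def higuchi_Q(t, D, C0, Cs):
          # Cumulative drug released per unit area for a matrix with C0 >> Cs
          return np.sqrt(D * Cs * (2.0 * C0 - Cs) * t)

      t = np.linspace(0.0, 24 * 3600.0, 100)        # 24 h in seconds
      Q = higuchi_Q(t, D=1e-11, C0=50.0, Cs=2.0)    # D [m^2/s], C0, Cs [kg/m^3]
      print(f"released after 24 h: {Q[-1] * 1e3:.1f} g/m^2")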

  12. Panamanian women׳s experience of vaginal examination in labour: A questionnaire validation.

    Science.gov (United States)

    Bonilla-Escobar, Francisco J; Ortega-Lenis, Delia; Rojas-Mirquez, Johanna C; Ortega-Loubon, Christian

    2016-05-01

    To validate a tool that allows healthcare providers to obtain accurate information regarding Panamanian women's thoughts and feelings about vaginal examination during labour that can be used in other Latin-American countries. Validation study based on a database from a cross-sectional study carried out in two tertiary care hospitals in Panama City, Panama. Women in the immediate postpartum period who had spontaneous labour onset and uncomplicated deliveries were included in the study from April to August 2008. Researchers used a survey designed by Lewin et al. that included 20 questions related to a patient's experience during a vaginal examination. Five constructs (factors) related to a patient's experience of vaginal examination during labour were identified: Approval (Cronbach's alpha 0.72), Perception (0.67), Rejection (0.40), Consent (0.51), and Stress (0.20). The validity of the scale and its constructs, used to obtain information related to vaginal examination during labour, including patients' experiences with examination and healthcare staff performance, was demonstrated. Utilisation of the scale will allow institutions to identify items that need improvement and address these areas in order to promote the best care for patients in labour.
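
    The internal-consistency figures quoted for each construct are Cronbach's alpha values, which are straightforward to compute. A minimal sketch with synthetic responses (the item count and sample size are assumed, not the study's data):

      import numpy as np

      def cronbach_alpha(items):
          # items: (n_respondents, k_items) array of scored answers
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_var / total_var)

      rng = np.random.default_rng(2)
      latent = rng.normal(size=(200, 1))                # shared construct
      items = latent + 0.8 * rng.normal(size=(200, 4))  # four noisy items
      print(f"alpha = {cronbach_alpha(items):.2f}")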

  13. Criteria of validity for animal models of psychiatric disorders: focus on anxiety disorders and depression

    Directory of Open Access Journals (Sweden)

    Belzung Catherine

    2011-11-01

    Animal models of psychiatric disorders are usually discussed with regard to three criteria first elaborated by Willner: face, predictive and construct validity. Here, we draw the history of these concepts and then try to redraw and refine these criteria, using the framework of the diathesis model of depression that has been proposed by several authors. We thus propose a set of five major criteria (with sub-categories for some of them): homological validity (including species validity and strain validity), pathogenic validity (including ontopathogenic validity and triggering validity), mechanistic validity, face validity (including ethological and biomarker validity) and predictive validity (including induction and remission validity). Homological validity requires that an adequate species and strain be chosen: considering species validity, primates will be considered to have a higher score than drosophila, and considering strains, a high stress reactivity in a strain scores higher than a low stress reactivity in another strain. Pathogenic validity corresponds to the fact that, in order to shape pathological characteristics, the organism has been manipulated both during the developmental period (for example, maternal separation: ontopathogenic validity) and during adulthood (for example, stress: triggering validity). Mechanistic validity corresponds to the fact that the cognitive (for example, cognitive bias) or biological mechanisms (such as dysfunction of the hormonal stress axis regulation) underlying the disorder are identical in both humans and animals. Face validity corresponds to the observable behavioral (ethological validity) or biological (biomarker validity) outcomes: for example anhedonic behavior (ethological validity) or elevated corticosterone (biomarker validity). Finally, predictive validity corresponds to the identity of the relationship between the triggering factor and the outcome (induction validity) and between the effects of

  14. Dynamic validation of the Planck-LFI thermal model

    Energy Technology Data Exchange (ETDEWEB)

    Tomasi, M; Bersanelli, M; Mennella, A [Universita degli Studi di Milano, Via Celoria 16, 20133 Milano (Italy); Cappellini, B [INAF IASF Milano, Via Bassini, 15, 20133, Milano (Italy); Gregorio, A [University of Trieste, Department of Physics, via Valerio 2, 34127 Trieste (Italy); Colombo, F; Lapolla, M [Thales Alenia Space Italia S.p.A., IUEL - Scientific Instruments, S.S. Padana Superiore 290, 20090 Vimodrone (Mi) (Italy); Terenzi, L; Morgante, G; Butler, R C; Mandolesi, N; Valenziano, L [INAF IASF Bologna, via Gobetti 101, 40129 Bologna (Italy); Galeotta, S; Maris, M; Zacchei, A [LFI-DPC INAF-OATs, via Tiepolo 11, 34131 Trieste (Italy)

    2010-01-15

    The Low Frequency Instrument (LFI) is an array of cryogenically cooled radiometers on board the Planck satellite, designed to measure the temperature and polarization anisotropies of the cosmic microwave background (CMB) at 30, 44 and 70 GHz. The thermal requirements of the LFI, and in particular the stringent limits to acceptable thermal fluctuations in the 20 K focal plane, are a critical element to achieve the instrument scientific performance. Thermal tests were carried out as part of the on-ground calibration campaign at various stages of instrument integration. In this paper we describe the results and analysis of the tests on the LFI flight model (FM) performed at Thales Laboratories in Milan (Italy) during 2006, with the purpose of experimentally sampling the thermal transfer functions and consequently validating the numerical thermal model describing the dynamic response of the LFI focal plane. This model has been used extensively to assess the ability of LFI to achieve its scientific goals: its validation is therefore extremely important in the context of the Planck mission. Our analysis shows that the measured thermal properties of the instrument show a thermal damping level better than predicted, therefore further reducing the expected systematic effect induced in the LFI maps. We then propose an explanation of the increased damping in terms of non-ideal thermal contacts.

  15. The Atmospheric Radionuclide Transport Model (ARTM) - Validation of a long-term atmospheric dispersion model

    Science.gov (United States)

    Hettrich, Sebastian; Wildermuth, Hans; Strobl, Christopher; Wenig, Mark

    2016-04-01

    In the last couple of years, the Atmospheric Radionuclide Transport Model (ARTM) has been developed by the German Federal Office for Radiation Protection (BfS) and the Society for Plant and Reactor Security (GRS). ARTM is an atmospheric dispersion model for continuous long-term releases of radionuclides into the atmosphere, based on the Lagrangian particle model. This model, developed in the first place as a more realistic replacement for the outdated Gaussian plume models, is currently being optimised for further scientific purposes to study atmospheric dispersion in short-range scenarios. It includes a diagnostic wind field model, allows for the application of building structures and multiple sources (including linear, 2- and 3-dimensional source geometries), and considers orography and surface roughness. As an output it calculates the activity concentration and dry and wet deposition, and can also model the radioactive decay of Rn-222. As such, ARTM needs to undergo an intensive validation process. While for short-term and short-range models, which were mainly developed for examining nuclear accidents or explosions, a few measurement data sets are available for validation, data sets for validating long-term models are very sparse and the existing ones mostly prove to be not applicable for validation. Here we present a strategy for the validation of long-term Lagrangian particle models based on the work with ARTM. The first part of our validation study is a comprehensive analysis of the model sensitivities to different parameters (e.g. simulation grid resolution, starting random number, number of simulation particles). This study provides a good estimation of the uncertainties of the simulation results and consequently can be used to generate model outputs comparable to the available measurement data at various distances from the emission source. This comparison between measurement data from selected scenarios and simulation results

  16. A validation study of a stochastic model of human interaction

    Science.gov (United States)

    Burchfield, Mitchel Talmadge

    The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, $N\int_{-\infty}^{\infty}\varphi(\chi,\tau)\,\Psi(\tau)\,d\tau$, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research, based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree

  17. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01


  18. Initial Studies of Validation of MHD Models for MST Reversed Field Pinch Plasmas

    Science.gov (United States)

    Jacobson, C. M.; Almagri, A. F.; Craig, D.; McCollam, K. J.; Reusch, J. A.; Sauppe, J. P.; Sovinec, C. R.; Triana, J. C.

    2015-11-01

    Quantitative validation of visco-resistive MHD models for RFP plasmas takes advantage of MST's advanced diagnostics. These plasmas are largely governed by MHD relaxation activity, so that a broad range of validation metrics can be evaluated. Previous nonlinear simulations using the visco-resistive MHD code DEBS at Lundquist number S = 4×10^6 produced equilibrium relaxation cycles in qualitative agreement with experiment, but magnetic fluctuation amplitudes were at least twice as large as in experiment. The extended-MHD code NIMROD previously suggested that a two-fluid model may be necessary to produce fluctuation amplitudes in agreement with experiment. For best comparisons with DEBS and to keep computational expense tractable, NIMROD is run in single-fluid mode at low S. These simulations are complemented by DEBS at higher S in cylindrical geometry, which will be used to examine the fluctuation amplitude as a function of S. Experimental measurements are used with results from these simulations to evaluate validation metrics. Convergence tests of previous high-S DEBS simulations are also discussed, along with benchmarking of DEBS and NIMROD with the SPECYL and PIXIE3D codes. Work supported by U.S. DOE and NSF.

  19. Bond Graph Modeling and Validation of an Energy Regenerative System for Emulsion Pump Tests

    Directory of Open Access Journals (Sweden)

    Yilei Li

    2014-01-01

    The test system for emulsion pumps faces serious challenges due to its huge energy consumption and waste. To address this energy issue, a novel energy regenerative system (ERS) for emulsion pump tests is briefly introduced first. Modeling such an ERS, which spans multiple energy domains, needs a unified and systematic approach, and bond graph modeling is well suited for this task. The bond graph model of this ERS is developed by first considering the separate components before assembling them together, and so is the state-space equation. Both numerical simulation and experiments are carried out to validate the bond graph model of this ERS. Moreover, the simulation and experiment results show that this ERS not only satisfies the test requirements, but could also save at least 25% of energy consumption compared to the original test system, demonstrating that it is a promising method of energy regeneration for emulsion pump tests.

  20. Hadronic Shower Models in GEANT4: Validation Strategy and Results.

    Institute of Scientific and Technical Information of China (English)

    Johannes Peter Wellisch

    2001-01-01

    Optimal exploitation of hadronic final states played a key role in the successes of all recent hadron collider experiments in HEP, and the ability to use hadronic final states will continue to be one of the decisive issues during the analysis phase of the LHC experiments. Monte Carlo implementations of hadronic shower models provided with GEANT4 facilitate the use of hadronic final states, and have been developed for many years. We will give an overview of the physics underlying hadronic shower simulation, discussing the three basic types of modelling: data driven, parametrisation driven, and theory driven modelling, and their respective implementation status in GEANT4. We will confront the different types of modelling with a validation suite for hadronic generators based on cross-section measurements from thin target experiments, and expose the strengths and weaknesses of the individual approaches.

  1. Structural Identification and Validation in Stochastic Differential Equation based Models

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Carstensen, Jacob; Madsen, Henrik

    2011-01-01

    Stochastic differential equations (SDEs) for ecosystem modelling have attracted increasing attention during recent years. The modelling has mostly been through simulation based experiments. Estimation of parameters in SDEs is, however, possible by combining Kalman filter and likelihood techniques … as a function of the state variables and global radiation. Further improvements of both the drift and the diffusion term are achieved by comparing simulated densities and data.

  2. LANL*V2.0: global modeling and validation

    Directory of Open Access Journals (Sweden)

    S. Zaharia

    2011-08-01

    We describe in this paper the new version of LANL*, an artificial neural network (ANN) for calculating the magnetic drift invariant L*. This quantity is used for modeling radiation belt dynamics and for space weather applications. We have implemented the following enhancements in the new version: (1) we have removed the limitation to geosynchronous orbit and the model can now be used for a much larger region; (2) the new version is based on the improved magnetic field model by Tsyganenko and Sitnov (2005) (TS05) instead of the older model by Tsyganenko et al. (2003). We have validated the model and compared our results to L* calculations with the TS05 model based on ephemerides for CRRES, Polar, GPS, a LANL geosynchronous satellite, and a virtual RBSP type orbit. We find that the neural network performs very well for all these orbits, with a typically small error ΔL*. The LANL* V2.0 artificial neural network is orders of magnitude faster than traditional numerical field line integration techniques with the TS05 model. It has applications to real-time radiation belt forecasting, analysis of data sets involving decades of satellite observations, and other problems in space weather.
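
    The core idea, emulating an expensive field-line computation with a small neural network, is easy to prototype. In the sketch below the input names and the closed-form target are invented stand-ins, not the TS05-based quantities used by LANL*.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(3)
      # Hypothetical inputs: a Dst-like index, a solar-wind proxy, radial distance
      X = rng.uniform([-100.0, 0.0, 3.0], [20.0, 10.0, 7.0], size=(5000, 3))
      y = X[:, 2] * (1.0 + 0.002 * X[:, 0] - 0.01 * X[:, 1])  # invented L*-like target

      net = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=3000, random_state=0)
      net.fit(X[:4000], y[:4000])
      err = np.abs(net.predict(X[4000:]) - y[4000:])
      print(f"median |dL*| on held-out samples: {np.median(err):.3f}")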

  3. Evaluation and cross-validation of Environmental Models

    Science.gov (United States)

    Lemaire, Joseph

    Before scientific models (statistical or empirical models based on experimental measurements; physical or mathematical models) can be proposed and selected as ISO Environmental Standards, a Commission of professional experts appointed by an established International Union or Association (e.g. IAGA for Geomagnetism and Aeronomy, …) should have been able to study, document, evaluate and validate the best alternative models available at a given epoch. Examples will be given indicating that different values of the Earth radius have been employed in different data processing laboratories, institutes or agencies to process, analyse or retrieve series of experimental observations. Furthermore, invariant magnetic coordinates like B and L, commonly used in the study of Earth's radiation belt fluxes and for their mapping, differ from one space mission data center to another, from team to team, and from country to country. Worse, users of empirical models generally fail to use the original magnetic model which had been employed to compile B and L, and thus to build these environmental models. These are just some flagrant examples of inconsistencies and misuses identified so far; there are probably more of them to be uncovered by careful, independent examination and benchmarking. Consider the meter prototype, the standard unit of length adopted on 20 May 1875 during the Diplomatic Conference of the Meter and deposited at the BIPM (Bureau International des Poids et Mesures). By the same token, to coordinate and safeguard progress in the field of Space Weather, similar initiatives need to be undertaken to prevent wild, uncontrolled dissemination of pseudo Environmental Models and Standards. Indeed, unless validation tests have been performed, there is no guarantee, a priori, that all models on the market place have been built consistently with the same units system, and that they are based on identical definitions for the coordinates systems, etc… Therefore

  4. Development and validation of a two-dimensional fast-response flood estimation model

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory]; McPherson, Timothy N [Los Alamos National Laboratory]; Burian, Steven J [University of Utah]

    2009-01-01

    A finite difference formulation of the shallow water equations using an upwind differencing method was developed maintaining computational efficiency and accuracy such that it can be used as a fast-response flood estimation tool. The model was validated using both laboratory controlled experiments and an actual dam breach. Through the laboratory experiments, the model was shown to give good estimations of depth and velocity when compared to the measured data, as well as when compared to a more complex two-dimensional model. Additionally, the model was compared to high water mark data obtained from the failure of the Taum Sauk dam. The simulated inundation extent agreed well with the observed extent, with the most notable differences resulting from the inability to model sediment transport. The results of these validation studies show that a relatively simple numerical scheme used to solve the complete shallow water equations can be used to accurately estimate flood inundation. Future work will focus on further reducing the computation time needed to provide flood inundation estimates for fast-response analyses. This will be accomplished through the efficient use of multi-core, multi-processor computers coupled with an efficient domain-tracking algorithm, as well as an understanding of the impacts of grid resolution on model results.
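
    A toy relative of such a scheme fits in a few lines. The sketch below advances the 1-D shallow water equations with a Rusanov (upwind-biased) flux on a dam-break profile; it stands in for, and is not, the authors' upwind 2-D code, and the grid, time step and depths are all assumed.

      import numpy as np

      g, dx, dt, nx = 9.81, 10.0, 0.5, 200             # grid and step sizes (assumed)
      h = np.where(np.arange(nx) < nx // 2, 2.0, 1.0)  # dam-break initial depth [m]
      q = np.zeros(nx)                                 # momentum h*u [m^2/s]

      for _ in range(200):                             # 100 s of simulated time
          u = q / h
          Fh, Fq = q, q * u + 0.5 * g * h * h          # physical fluxes
          a = np.max(np.abs(u) + np.sqrt(g * h))       # fastest wave speed
          # Upwind-biased (Rusanov) fluxes at the interface between cells j and j+1
          fh = 0.5 * (Fh + np.roll(Fh, -1)) - 0.5 * a * (np.roll(h, -1) - h)
          fq = 0.5 * (Fq + np.roll(Fq, -1)) - 0.5 * a * (np.roll(q, -1) - q)
          h = h - dt / dx * (fh - np.roll(fh, 1))
          q = q - dt / dx * (fq - np.roll(fq, 1))

      print(f"depth range after 100 s: {h.min():.2f}-{h.max():.2f} m")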

  5. Validating Mechanistic Sorption Model Parameters and Processes for Reactive Transport in Alluvium

    Energy Technology Data Exchange (ETDEWEB)

    Zavarin, M; Roberts, S K; Rose, T P; Phinney, D L

    2002-05-02

    The laboratory batch and flow-through experiments presented in this report provide a basis for validating the mechanistic surface complexation and ion exchange model we use in our hydrologic source term (HST) simulations. Batch sorption experiments were used to examine the effect of solution composition on sorption. Flow-through experiments provided for an analysis of the transport behavior of sorbing elements and tracers which includes dispersion and fluid accessibility effects. Analysis of downstream flow-through column fluids allowed for evaluation of weakly-sorbing element transport. Secondary Ion Mass Spectrometry (SIMS) analysis of the core after completion of the flow-through experiments permitted the evaluation of transport of strongly sorbing elements. A comparison between these data and model predictions provides additional constraints to our model and improves our confidence in near-field HST model parameters. In general, cesium, strontium, samarium, europium, neptunium, and uranium behavior could be accurately predicted using our mechanistic approach but only after some adjustment was made to the model parameters. The required adjustments included a reduction in strontium affinity for smectite, an increase in cesium affinity for smectite and illite, a reduction in iron oxide and calcite reactive surface area, and a change in clinoptilolite reaction constants to reflect a more recently published set of data. In general, these adjustments are justifiable because they fall within a range consistent with our understanding of the parameter uncertainties. These modeling results suggest that the uncertainty in the sorption model parameters must be accounted for to validate the mechanistic approach. The uncertainties in predicting the sorptive behavior of U-1a and UE-5n alluvium also suggest that these uncertainties must be propagated to nearfield HST and large-scale corrective action unit (CAU) models.

  6. Challenges in validating model results for first year ice

    Science.gov (United States)

    Melsom, Arne; Eastwood, Steinar; Xie, Jiping; Aaboe, Signe; Bertino, Laurent

    2017-04-01

    In order to assess the quality of model results for the distribution of first year ice, a comparison with a product based on observations from satellite-borne instruments has been performed. Such a comparison is not straightforward due to the contrasting algorithms that are used in the model product and the remote sensing product. The implementation of the validation is discussed in light of the differences between this set of products, and validation results are presented. The model product is the daily updated 10-day forecast from the Arctic Monitoring and Forecasting Centre in CMEMS. The forecasts are produced with the assimilative ocean prediction system TOPAZ. Presently, observations of sea ice concentration and sea ice drift are introduced in the assimilation step, but data for sea ice thickness and ice age (or roughness) are not included. The model computes the age of the ice by recording and updating the time passed after ice formation as sea ice grows and deteriorates as it is advected inside the model domain. Ice that is younger than 365 days is classified as first year ice. The fraction of first-year ice is recorded as a tracer in each grid cell. The Ocean and Sea Ice Thematic Assembly Centre in CMEMS redistributes a daily product from the EUMETSAT OSI SAF of gridded sea ice conditions which includes "ice type", a representation of the separation between regions covered by first year ice and those covered by multi-year ice. The ice type is parameterized based on data for the gradient ratio GR(19,37) from SSMIS observations, and from the ASCAT backscatter parameter. This product also includes information on ambiguity in the processing of the remote sensing data, and the product's confidence level, which has a strong seasonal dependency.

  7. Atmospheric Dispersion Model Validation in Low Wind Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Sawyer, Patrick

    2007-11-01

    Atmospheric plume dispersion models are used for a variety of purposes including emergency planning and response to hazardous material releases, determining force protection actions in the event of a Weapons of Mass Destruction (WMD) attack and for locating sources of pollution. This study provides a review of previous studies that examine the accuracy of atmospheric plume dispersion models for chemical releases. It considers the principles used to derive air dispersion plume models and looks at three specific models currently in use: Aerial Location of Hazardous Atmospheres (ALOHA), Emergency Prediction Information Code (EPIcode) and Second Order Closure Integrated Puff (SCIPUFF). Results from this study indicate over-prediction bias by the EPIcode and SCIPUFF models and under-prediction bias by the ALOHA model. The experiment parameters were for near field dispersion (less than 100 meters) in low wind speed conditions (less than 2 meters per second).
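
    Over- and under-prediction of this kind is commonly summarised with metrics such as the fractional bias and the factor-of-two fraction. A small sketch with invented concentration values (sign convention FB = 2(Co - Cp)/(Co + Cp), so over-prediction gives negative FB):

      import numpy as np

      def fractional_bias(pred, obs):
          # FB = 2(Co - Cp)/(Co + Cp): negative values indicate over-prediction
          return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

      def fac2(pred, obs):
          # Fraction of predictions within a factor of two of the observations
          r = pred / obs
          return np.mean((r >= 0.5) & (r <= 2.0))

      obs = np.array([12.0, 30.0, 8.0, 55.0, 20.0])   # observed concentrations
      pred = np.array([20.0, 42.0, 6.0, 90.0, 35.0])  # an over-predicting model
      print(f"FB = {fractional_bias(pred, obs):+.2f}, FAC2 = {fac2(pred, obs):.0%}")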

  9. Validation of the thermophysiological model by Fiala for prediction of local skin temperatures

    Science.gov (United States)

    Martínez, Natividad; Psikuta, Agnes; Kuklane, Kalev; Quesada, José Ignacio Priego; de Anda, Rosa María Cibrián Ortiz; Soriano, Pedro Pérez; Palmer, Rosario Salvador; Corberán, José Miguel; Rossi, René Michel; Annaheim, Simon

    2016-12-01

    The most complete and realistic physiological data are derived from direct measurements during human experiments; however, these present some limitations, such as ethical concerns and time and cost burdens. Thermophysiological models are able to predict human thermal response in a wide range of environmental conditions, but their use is limited due to lack of validation. The aim of this work was to validate the thermophysiological model by Fiala for prediction of local skin temperatures against a dedicated database containing 43 different human experiments representing a wide range of conditions. The validation was conducted based on root-mean-square deviation (rmsd) and bias. The thermophysiological model by Fiala showed good precision when predicting core and mean skin temperature (rmsd 0.26 and 0.92 °C, respectively) and also local skin temperatures for most body sites (average rmsd for local skin temperatures 1.32 °C). However, an increased deviation of the predictions was observed for the forehead skin temperature (rmsd of 1.63 °C) and for the thigh during exercising exposures (rmsd of 1.41 °C). Possible reasons for the observed deviations are lack of information on measurement circumstances (hair, head coverage interference) or an overestimation of the sweat evaporative cooling capacity for the head and thigh, respectively. This work has highlighted the importance of collecting details about the clothing worn and about how and where the sensors were attached to the skin for achieving more precise results in the simulations.
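
    The two validation statistics used above are straightforward to compute; a minimal sketch with hypothetical temperature values:

      import numpy as np

      def rmsd(predicted, observed):
          # Root-mean-square deviation between paired predictions and measurements.
          d = np.asarray(predicted, float) - np.asarray(observed, float)
          return float(np.sqrt(np.mean(d ** 2)))

      def bias(predicted, observed):
          # Mean signed deviation; positive means the model over-predicts.
          return float(np.mean(np.asarray(predicted, float) - np.asarray(observed, float)))

      # Hypothetical local skin temperatures (deg C), model vs. measurement:
      t_model = [33.1, 34.0, 32.5, 33.8]
      t_measured = [32.6, 33.5, 33.0, 34.1]
      print(rmsd(t_model, t_measured), bias(t_model, t_measured))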

  10. External Validity and Model Validity: A Conceptual Approach for Systematic Review Methodology

    Directory of Open Access Journals (Sweden)

    Raheleh Khorsan

    2014-01-01

    Full Text Available Background. Evidence rankings do not consider equally internal validity (IV), external validity (EV), and model validity (MV) for clinical studies, including complementary and alternative medicine/integrative health care (CAM/IHC) research. This paper describes this model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature that has been published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to Jan 2013. Tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool for evaluating the EV/MV quality of research that is more sensitive to CAM/IHC research. Conclusion. Improved reporting on EV can help produce and provide information that will guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of their research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about "real-world" consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on equal footing, will better guide clinical decision making.

  11. Development and validation of a 10-year-old child ligamentous cervical spine finite element model.

    Science.gov (United States)

    Dong, Liqiang; Li, Guangyao; Mao, Haojie; Marek, Stanley; Yang, King H

    2013-12-01

    Although a number of finite element (FE) adult cervical spine models have been developed to understand the injury mechanisms of the neck in automotive-related crash scenarios, there have been fewer efforts to develop a child neck model. In this study, a 10-year-old ligamentous cervical spine FE model was developed for application in the improvement of pediatric safety related to motor vehicle crashes. The model geometry was obtained from medical scans and meshed using a multi-block approach. Appropriate properties based on a review of the literature, in conjunction with scaling, were assigned to different parts of the model. Child tensile force-deformation data in three segments, Occipital-C2 (C0-C2), C4-C5 and C6-C7, were used to validate the cervical spine model and predict failure forces and displacements. Design of computer experiments was performed to determine the failure properties for intervertebral discs and ligaments needed to set up the FE model. The model-predicted ultimate displacements and forces were within the experimental range. The cervical spine FE model was validated in flexion and extension against the child experimental data in three segments, C0-C2, C4-C5 and C6-C7. Other model predictions were found to be consistent with the experimental responses scaled from adult data. The whole cervical spine model was also validated in tension, flexion and extension against the child experimental data. This study provided methods for developing a child ligamentous cervical spine FE model and for predicting soft tissue failures in tension.

  12. Low Altitude Validation of Geomagnetic Cutoff Models Using SAMPEX Data

    Science.gov (United States)

    Young, S. L.; Kress, B. T.

    2011-12-01

    Single event upsets (SEUs) caused by MeV protons are a concern for satellite operators, so AFRL is working to create a tool that can specify and/or forecast SEU probabilities. An important component of the tool's SEU probability calculation will be the local energetic ion spectrum. The portion of that spectrum due to the trapped energetic ion population is relatively stable and predictable; however, it is more difficult to account for the transient solar energetic particles (SEPs). These particles, which can be ejected from the solar atmosphere during a solar flare or filament eruption or can be energized by coronal mass ejection (CME) driven shocks, can penetrate the Earth's magnetosphere into regions not normally populated by energetic protons. The magnetosphere provides energy-dependent shielding that also depends on its magnetic configuration. During magnetic storms that configuration is modified, and the SEP cutoff latitude for a given particle energy can be suppressed by up to ~15 degrees equatorward, exposing normally shielded regions. As a first step to creating the satellite SEU prediction tool, we are comparing the Smart et al. (Advances in Space Research, 2006) and CISM-Dartmouth (Kress et al., Space Weather, 2010) geomagnetic cutoff tools. While they have provided some of their own validations in the noted papers, our validation will be done consistently between models, allowing us to better compare them.

  13. A geomagnetically induced current warning system: model development and validation

    Science.gov (United States)

    McKay, A.; Clarke, E.; Reay, S.; Thomson, A.

    Geomagnetically Induced Currents (GIC), which can flow in technological systems at the Earth's surface, are a consequence of magnetic storms and Space Weather. A well-documented practical problem for the power transmission industry is that GIC can affect the lifetime and performance of transformers within the power grid. Operational mitigation is widely considered to be one of the best strategies to manage the Space Weather and GIC risk. Therefore, in the UK a magnetic storm warning and GIC monitoring and analysis programme has been under development by the British Geological Survey and Scottish Power plc (the power grid operator for Central Scotland) since 1999. Under the auspices of the European Space Agency's service development activities, BGS is developing the capability to meet two key user needs that have been identified. These needs are, firstly, the development of a near real-time solar wind shock/geomagnetic storm warning, based on L1 solar wind data and, secondly, the development of an integrated surface geo-electric field and power grid network model that should allow prediction of GIC throughout the power grid in near real time. While the final goal is a `seamless package', the components of the package utilise diverse scientific techniques. We review progress to date with particular regard to the validation of the individual components of the package. The Scottish power grid response to the October 2003 magnetic storms is also discussed, and model and validation data are presented.
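
    Downstream of the geo-electric field model, the GIC at a given network node is commonly approximated in the literature as a linear combination of the horizontal field components. A minimal sketch under that assumption follows; the coefficients a and b are site-specific and not given in this abstract, so the function is illustrative rather than the BGS implementation.

      import numpy as np

      def gic_at_node(ex, ey, a, b):
          """GIC time series (A) at one transformer node from the horizontal
          geoelectric field components ex, ey (V/km); a and b are site-specific
          coefficients that fold in the network topology and resistances."""
          return a * np.asarray(ex, float) + b * np.asarray(ey, float)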

  14. Experimental investigations and validation of two dimensional model for multistream plate fin heat exchangers

    Science.gov (United States)

    Goyal, Mukesh; Chakravarty, Anindya; Atrey, M. D.

    2017-03-01

    Experimental investigations are carried out using a specially developed three-layer plate fin heat exchanger (PFHE), with helium as the working fluid cooled to cryogenic temperatures using liquid nitrogen (LN2) as a coolant. These results are used for validation of an already proposed and reported numerical model based on finite volume analysis for multistream (MS) PFHEs for cryogenic applications (Goyal et al., 2014). The results from the experiments are presented, and reasonable agreement is observed with the already reported numerical model.

  15. Modeling Root Growth, Crop Growth and N Uptake of Winter Wheat Based on SWMS_2D: Model and Validation

    Directory of Open Access Journals (Sweden)

    Dejun Yang

    Full Text Available Simulations of root growth, crop growth, and N uptake in agro-hydrological models are of significant concern to researchers. SWMS_2D is one of the most widely used physically based hydrological models. This model solves the equations that govern soil-water movement by the finite element method, and has a publicly accessible source code. Incorporating key agricultural components into the SWMS_2D model is of practical importance, especially for modeling some critical cereal crops such as winter wheat. We added root growth, crop growth, and N uptake modules into SWMS_2D. The root growth model had two sub-models, one for root penetration and the other for root length distribution. The crop growth model used was adapted from EU-ROTATE_N, linked to the N uptake model. Soil-water limitation, nitrogen limitation, and temperature effects were all considered in dry-weight modeling. Field experiments for winter wheat in Bouwing, the Netherlands, in 1983-1984 were selected for validation. Good agreement was achieved between simulations and measurements, including soil water content at different depths, normalized root length distribution, dry weight and nitrogen uptake. This indicates that the proposed new modules used in the SWMS_2D model are robust and reliable. In the future, more rigorous validation should be carried out, ideally under 2D situations, and attention should be paid to improving some modules, including the module simulating soil N mineralization.
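
    For reference, SWMS_2D-type models solve a form of the Richards equation for variably saturated soil-water flow. Written with a sink term S(h), which is the natural coupling point for a root water/N uptake module, a common form is:

      \frac{\partial \theta(h)}{\partial t}
        = \nabla \cdot \bigl[ K(h)\, \nabla h \bigr]
          + \frac{\partial K(h)}{\partial z} - S(h)

    where \theta is the volumetric water content, h the pressure head, K(h) the unsaturated hydraulic conductivity, and z the vertical coordinate. The paper's exact formulation and boundary conditions are not reproduced here.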

  16. Physiologically Based Modelling of Dioxins. I. Validation of a rodent toxicokinetic model

    NARCIS (Netherlands)

    Zeilmaker MJ; Slob W

    1993-01-01

    In this report a rodent Physiologically Based PharmacoKinetic (PBPK) model for 2,3,7,8-tetrachlorodibenzodioxin is described. Validation studies, in which model simulations of TCDD disposition were compared with in vivo TCDD disposition in rodents exposed to TCDD, showed that the model adequately predicts TCDD disposition in rodents.

  17. Validation of full cavitation model in cryogenic fluids

    Institute of Scientific and Technical Information of China (English)

    CAO XiaoLi; ZHANG XiaoBin; QIU LiMin; GAN ZhiHua

    2009-01-01

    Numerical simulation of cavitation in cryogenic fluids is important in improving the stable operation of the propulsion system in liquid-fuel rockets. It also represents a broader class of problems where the fluid is operating close to its critical point and the thermal effects of cavitation are pronounced. The present article focuses on simulating cryogenic cavitation by implementing the "full cavitation model", coupled with the energy equation, in conjunction with iterative updates of the real fluid properties at local temperatures. Steady-state computations are then conducted on a hydrofoil and an ogive in liquid nitrogen and liquid hydrogen, respectively, based on which we explore the mechanism of cavitation with thermal effects. Comprehensive comparisons between the simulation results and experimental data, as well as previous computations by other researchers, validate the full cavitation model in cryogenic fluids. The sensitivity of cavity length to cavitation number is also examined.

  18. Modelling and validation of multiple reflections for enhanced laser welding

    Science.gov (United States)

    Milewski, J.; Sklar, E.

    1996-05-01

    The effects of multiple internal reflections within a laser weld joint as functions of joint geometry and processing conditions have been characterized. A computer-based ray tracing model is used to predict the reflective propagation of laser beam energy focused into the narrow gap of a metal joint, and thereby the location of melting and coalescence that forms a weld. Quantitative comparisons are made between simulation cases. Experimental results are provided for qualitative model validation. This method is proposed as a way to enhance process efficiency and to design laser welds that display deep penetration and high depth-to-width aspect ratios without high-powered systems or keyhole-mode melting.
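
    Two building blocks of such a ray-tracing model can be sketched as follows, assuming ideal specular reflection and a constant surface reflectivity; both are idealizations of the paper's model, whose details are not given in the abstract.

      import numpy as np

      def reflect(d, n):
          """Specular reflection of a unit ray direction d off a surface with
          unit normal n: d' = d - 2 (d . n) n."""
          d = np.asarray(d, float)
          n = np.asarray(n, float)
          return d - 2.0 * np.dot(d, n) * n

      def absorbed_per_bounce(e0, reflectivity, bounces):
          """Energy deposited at each successive wall reflection, assuming a
          constant reflectivity (real absorptivity varies with angle,
          temperature and surface state)."""
          e, deposited = e0, []
          for _ in range(bounces):
              deposited.append((1.0 - reflectivity) * e)
              e *= reflectivity
          return deposited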

  19. Assessing uncertainty in pollutant wash-off modelling via model validation.

    Science.gov (United States)

    Haddad, Khaled; Egodawatta, Prasanna; Rahman, Ataur; Goonetilleke, Ashantha

    2014-11-01

    Stormwater pollution is linked to stream ecosystem degradation. In predicting stormwater pollution, various types of modelling techniques are adopted. The accuracy of predictions provided by these models depends on the data quality, appropriate estimation of model parameters, and the validation undertaken. It is well understood that available water quality datasets in urban areas span only relatively short time scales, unlike water quantity data, which limits the applicability of the developed models in engineering and ecological assessment of urban waterways. This paper presents the application of leave-one-out (LOO) and Monte Carlo cross validation (MCCV) procedures in a Monte Carlo framework for the validation and estimation of uncertainty associated with pollutant wash-off when models are developed using a limited dataset. It was found that the application of MCCV is likely to result in a more realistic measure of model coefficients than LOO. Most importantly, MCCV and LOO were found to be effective in model validation when dealing with a small sample size, which otherwise hinders detailed model validation and can undermine the effectiveness of stormwater quality management strategies.
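
    The contrast between the two procedures can be sketched with a deliberately simple one-parameter wash-off model (y = k x); the dataset below is synthetic and purely illustrative, not the paper's data or its wash-off formulation.

      import numpy as np

      rng = np.random.default_rng(1)

      def fit_k(x, y):
          # Least-squares slope of a one-parameter model y = k * x.
          return float(np.dot(x, y) / np.dot(x, x))

      def loo_errors(x, y):
          # Leave-one-out: one prediction error per observation.
          errs = []
          for i in range(len(x)):
              m = np.arange(len(x)) != i
              k = fit_k(x[m], y[m])
              errs.append(y[i] - k * x[i])
          return np.array(errs)

      def mccv_errors(x, y, n_splits=200, test_frac=0.3):
          # Monte Carlo CV: repeated random train/test splits give a
          # distribution of errors rather than a single pass.
          errs, n = [], len(x)
          for _ in range(n_splits):
              test = rng.choice(n, size=max(1, int(test_frac * n)), replace=False)
              train = np.setdiff1d(np.arange(n), test)
              k = fit_k(x[train], y[train])
              errs.extend(y[test] - k * x[test])
          return np.array(errs)

      # Hypothetical small wash-off dataset (rainfall depth vs. washed-off load):
      x = rng.uniform(5, 50, size=12)
      y = 0.8 * x + rng.normal(0, 3, size=12)
      print(np.sqrt(np.mean(loo_errors(x, y) ** 2)),
            np.sqrt(np.mean(mccv_errors(x, y) ** 2)))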

  20. Satellite information of sea ice for model validation

    Science.gov (United States)

    Saheed, P. P.; Mitra, Ashis K.; Momin, Imranali M.; Mahapatra, Debasis K.; Rajagopal, E. N.

    2016-05-01

    The emergence of extensively large computational facilities has enabled the scientific world to use earth system models for understanding the prevailing dynamics of the earth's atmosphere, ocean and cryosphere and their interrelations. Sea ice in the Arctic and the Antarctic has been identified as one of the main proxies for studying climate change. The rapid sea-ice melting in the Arctic and the disappearance of multi-year sea ice have become a matter of concern. Earth system models couple the ocean, atmosphere and sea ice in order to bring out the possible interconnections between these three very important components and their role in the changing climate. The Indian monsoon is seen to be subjected to nonlinear changes in recent years, and the rapid ice melt in the Arctic is apparently linked to changes in the weather and climate of the Indian subcontinent. Recent findings reveal a relation between extreme events in the Indian subcontinent and Arctic sea-ice melt episodes. Coupled models are being used to study these relations in depth. However, the models have to be validated extensively using measured parameters. Satellite measurements of sea ice date back to 1979, and many data sets have been available since then. In this study, an evaluation of the existing data sets is conducted. There are some uncertainties in these data sets, which could be associated with the absence of a single sensor over a long period of time and also with the absence of accurate in-situ measurements with which to validate the satellite measurements.

  1. Development and validation of a liquid composite molding model

    Science.gov (United States)

    Bayldon, John Michael

    2007-12-01

    In composite manufacturing, Vacuum Assisted Resin Transfer Molding (VARTM) is becoming increasingly important as a cost effective manufacturing method for structural composites. In this process the dry preform (reinforcement) is placed on a rigid tool and covered by a flexible film to form an airtight vacuum bag. Liquid resin is drawn under vacuum through the preform inside the vacuum bag. Modeling of this process relies on a good understanding of closely coupled phenomena. The resin flow depends on the preform permeability, which in turn depends on the local fluid pressure and the preform compaction behavior. VARTM models for predicting the flow rate in this process do exist; however, they are not able to properly predict the flow for all classes of reinforcement material. In this thesis, the continuity equation used in VARTM models is reexamined and a modified form proposed. In addition, the compaction behavior of the preform in both saturated and dry states is studied in detail and new models are proposed for the compaction behavior. To assess the validity of the proposed models, the shadow moire method was adapted and used to perform full field measurement of the preform thickness during infusion, in addition to the usual measurements of flow front position. A new method was developed and evaluated for the analysis of the moire data related to the VARTM process; however, the method has wider applicability to other full field thickness measurements. The use of this measurement method demonstrated that although the new compaction models work well in the characterization tests, they do not properly describe all the preform features required for modeling the process. In particular, the effect of varying saturation on the preform's behavior requires additional study. The flow models developed did, however, improve the prediction of the flow rate for the more compliant preform material tested, and the experimental techniques have shown where additional test methods are needed.

  2. Dynamic modeling and experimental validation for direct contact membrane distillation (DCMD) process

    KAUST Repository

    Eleiwi, Fadi

    2016-02-01

    This work proposes a mathematical dynamic model for the direct contact membrane distillation (DCMD) process. The model is based on a 2D Advection–Diffusion Equation (ADE), which describes the heat and mass transfer mechanisms that take place inside the DCMD module. The model studies the behavior of the process in the time-varying and steady-state phases, contributing to understanding the process performance, especially when it is driven by an intermittent energy supply, such as solar energy. The model is experimentally validated in the steady-state phase, where the permeate flux is measured for different feed inlet temperatures and the maximum absolute error recorded is 2.78 °C. Moreover, experimental validation includes the time-variation phase, where the feed inlet temperature ranges from 30 °C to 75 °C with a 0.1 °C increment every 2 min. The validation shows a relative error of less than 5%, which indicates a strong correlation between the model predictions and the experiments.
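
    For reference, a generic 2D advection-diffusion equation for a transported temperature field T has the form below; the paper's exact source terms and membrane boundary coupling are not reproduced here.

      \frac{\partial T}{\partial t}
        + u\,\frac{\partial T}{\partial x}
        + v\,\frac{\partial T}{\partial y}
        = \alpha \left( \frac{\partial^2 T}{\partial x^2}
                      + \frac{\partial^2 T}{\partial y^2} \right)

    where (u, v) is the velocity field and \alpha the thermal diffusivity.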

  3. Hierarchical calibration and validation of computational fluid dynamics models for solid sorbent-based carbon capture

    Energy Technology Data Exchange (ETDEWEB)

    Lai, Canhai; Xu, Zhijie; Pan, Wenxiao; Sun, Xin; Storlie, Curtis; Marcy, Peter; Dietiker, Jean-François; Li, Tingwen; Spenik, James

    2016-01-01

    To quantify the predictive confidence of a solid sorbent-based carbon capture design, a hierarchical validation methodology, consisting of basic unit problems with increasing physical complexity coupled with filtered model-based geometric upscaling, has been developed and implemented. This paper describes the computational fluid dynamics (CFD) multi-phase reactive flow simulations and the associated data flows among different unit problems performed within this hierarchical validation approach. The bench-top experiments used in this calibration and validation effort were carefully designed to follow the desired simple-to-complex unit problem hierarchy, with corresponding data acquisition to support model parameter calibration at each unit problem level. A Bayesian calibration procedure is employed, and the posterior model parameter distributions obtained at one unit-problem level are used as prior distributions for the same parameters in the next-tier simulations. Overall, the results have demonstrated that the multiphase reactive flow models within MFIX can be used to capture the bed pressure, temperature, CO2 capture capacity, and kinetics with quantitative accuracy. The CFD modeling methodology and associated uncertainty quantification techniques presented herein offer a solid framework for estimating the predictive confidence in the virtual scale-up of a larger carbon capture device.
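
    The posterior-to-prior chaining described above can be sketched in a few lines for a single scalar parameter on a grid. The observations, predictors and noise levels below are hypothetical stand-ins for the actual unit-problem CFD runs.

      import numpy as np

      theta = np.linspace(0.0, 2.0, 401)   # candidate values of one model parameter
      dtheta = theta[1] - theta[0]
      prior = np.ones_like(theta)          # flat prior at the first unit problem
      prior /= prior.sum() * dtheta

      def calibrate(prior, observed, predict, sigma):
          """One calibration tier: Gaussian likelihood of the measurement given
          the model prediction, times the prior, renormalized on the grid."""
          like = np.exp(-0.5 * ((observed - predict(theta)) / sigma) ** 2)
          post = prior * like
          return post / (post.sum() * dtheta)

      # Hypothetical tiers; each `predict` stands in for a CFD run of one unit problem.
      post1 = calibrate(prior, observed=1.05, predict=lambda t: t, sigma=0.20)
      post2 = calibrate(post1, observed=2.30, predict=lambda t: 2.0 * t, sigma=0.15)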

  4. Cultural consensus modeling to measure transactional sex in Swaziland: Scale building and validation.

    Science.gov (United States)

    Fielding-Miller, Rebecca; Dunkle, Kristin L; Cooper, Hannah L F; Windle, Michael; Hadley, Craig

    2016-01-01

    Transactional sex is associated with increased risk of HIV and gender based violence in southern Africa and around the world. However the typical quantitative operationalization, "the exchange of gifts or money for sex," can be at odds with a wide array of relationship types and motivations described in qualitative explorations. To build on the strengths of both qualitative and quantitative research streams, we used cultural consensus models to identify distinct models of transactional sex in Swaziland. The process allowed us to build and validate emic scales of transactional sex, while identifying key informants for qualitative interviews within each model to contextualize women's experiences and risk perceptions. We used logistic and multinomial logistic regression models to measure associations with condom use and social status outcomes. Fieldwork was conducted between November 2013 and December 2014 in the Hhohho and Manzini regions. We identified three distinct models of transactional sex in Swaziland based on 124 Swazi women's emic valuation of what they hoped to receive in exchange for sex with their partners. In a clinic-based survey (n = 406), consensus model scales were more sensitive to condom use than the etic definition. Model consonance had distinct effects on social status for the three different models. Transactional sex is better measured as an emic spectrum of expectations within a relationship, rather than an etic binary relationship type. Cultural consensus models allowed us to blend qualitative and quantitative approaches to create an emicly valid quantitative scale grounded in qualitative context.

  5. Validation of a numerical model of acoustic ceiling combined with TABS

    DEFF Research Database (Denmark)

    Rage, Nils; Kazanci, Ongun Berk; Olesen, Bjarne W.

    2016-01-01

    Thermally-Active Building Systems (TABS) have proven to be an energy-efficient and economical cooling and heating solution for commercial buildings. However, acoustic comfort is often jeopardized in such buildings, due to the thermal requirements of the system. More knowledge is required to understand to which extent a layer of hanging sound absorbers will impede the heating and cooling performance of the system, and how this translates into the thermal comfort of the occupants. In order to address these issues, this study focuses on validation of a new TRNSYS component (Type Ecophon Acoustic Elements) developed to simulate partially covered suspended ceilings such as hanging sound absorbers. The tool is validated by numerically modelling a set of similar experiments carried out in full-scale by a previous study. For this, a total of 12 scenarios from two case studies have been modelled.

  6. Modeling for Optimal Control : A Validated Diesel-Electric Powertrain Model

    OpenAIRE

    Sivertsson, Martin; Eriksson, Lars

    2014-01-01

    An optimal control ready model of a diesel-electric powertrain is developed, validated and provided to the research community. The aim of the model is to facilitate studies of the transient control of diesel-electric powertrains and also to provide a model for developers of optimization tools. The resulting model is a four-state, three-control mean value engine model that captures the significant nonlinearity of the diesel engine, while still being continuously differentiable.

  7. Nonlinear dispersion effects in elastic plates: numerical modelling and validation

    Science.gov (United States)

    Kijanka, Piotr; Radecki, Rafal; Packo, Pawel; Staszewski, Wieslaw J.; Uhl, Tadeusz; Leamy, Michael J.

    2017-04-01

    Nonlinear features of elastic wave propagation have attracted significant attention recently. The particular interest herein relates to complex wave-structure interactions, which provide potential new opportunities for feature discovery and identification in a variety of applications. Due to the significant complexity associated with wave propagation in nonlinear media, numerical modeling and simulations are employed to facilitate the design and development of new measurement, monitoring and characterization systems. However, since very high spatio-temporal accuracy of numerical models is required, it is critical to evaluate their spectral properties and tune discretization parameters for a compromise between accuracy and calculation time. Moreover, nonlinearities in structures give rise to various effects that are not present in linear systems, e.g. wave-wave interactions, higher harmonics generation, synchronism and, as recently reported, shifts to dispersion characteristics. This paper discusses a local computational model based on a new HYBRID approach for wave propagation in nonlinear media. The proposed approach combines advantages of the Local Interaction Simulation Approach (LISA) and Cellular Automata for Elastodynamics (CAFE). The methods are investigated in the context of their accuracy for predicting nonlinear wavefields, in particular shifts to dispersion characteristics for finite amplitude waves and secondary wavefields. The results are validated against Finite Element (FE) calculations for guided waves in a copper plate. Critical modes, i.e. modes determining the accuracy of a model at a given excitation frequency, are identified and guidelines for numerical model parameters are proposed.

  8. Lightweight ZERODUR: Validation of Mirror Performance and Mirror Modeling Predictions

    Science.gov (United States)

    Hull, Tony; Stahl, H. Philip; Westerhoff, Thomas; Valente, Martin; Brooks, Thomas; Eng, Ron

    2017-01-01

    Upcoming spaceborne missions, both moderate and large in scale, require extreme dimensional stability while relying both upon established lightweight mirror materials and upon accurate modeling methods to predict performance under varying boundary conditions. We describe tests, recently performed at NASA's XRCF chambers and laboratories in Huntsville, Alabama, during which a 1.2 m diameter, f/1.29, 88% lightweighted SCHOTT ZERODUR(TradeMark) mirror was tested for thermal stability under static loads in steps down to 230 K. Test results are compared to model predictions based upon recently published data on ZERODUR(TradeMark). In addition to monitoring the mirror surface for thermal perturbations in XRCF thermal vacuum tests, static load gravity deformations have been measured and compared to model predictions, and the modal response (dynamic disturbance) was measured and compared to the model. We discuss the fabrication approach and optomechanical design of the ZERODUR(TradeMark) mirror substrate by SCHOTT, its optical preparation for test by Arizona Optical Systems (AOS), and summarize the outcome of NASA's XRCF tests and model validations.

  9. Drilling forces model for lunar regolith exploration and experimental validation

    Science.gov (United States)

    Zhang, Tao; Ding, Xilun

    2017-02-01

    China's Chang'e lunar exploration project aims to sample and return lunar regolith samples from a minimum penetration depth of 2 m in 2017. Unlike such tasks on Earth, automated drilling and sampling missions on the Moon are more complicated. Therefore, a delicately designed drill tool is required to minimize operational cost and enhance reliability. Penetration force and rotational torque are two critical parameters in designing the drill tool. In this paper, a novel numerical model for predicting penetration force and rotational torque in the drilling of lunar regolith is proposed. The model is based on quasi-static Mohr-Coulomb soil mechanics and explicitly describes the interaction between the drill tool and the lunar regolith. Geometric features of the drill tool, mechanical properties of the lunar regolith, and drilling parameters are taken into consideration in the model. A drilling test bed was then developed, and experimental penetration forces and rotational torques were obtained in penetrating a lunar regolith simulant with different drilling parameters. Finally, theoretical and experimental results were compared to validate the proposed model. Experimental results indicated that the numerical model had good accuracy and was effective in predicting the penetration force and rotational torque in drilling the lunar regolith simulant.
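
    At the core of a quasi-static Mohr-Coulomb soil model is the shear failure criterion below; the paper's model integrates such relations over the drill geometry, which is not reproduced in the abstract.

      \tau = c + \sigma \tan\varphi

    where \tau is the shear strength, c the cohesion, \sigma the normal stress on the failure plane, and \varphi the internal friction angle of the regolith.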

  10. Validation of DWPF Melter Off-Gas Combustion Model

    Energy Technology Data Exchange (ETDEWEB)

    Choi, A.S.

    2000-08-23

    The empirical melter off-gas combustion model currently used in the DWPF safety basis calculations is valid at melter vapor space temperatures above 570 degrees C, as measured in the thermowell. This lower temperature bound coincides with that of the off-gas data used as the basis of the model. In this study, the applicability of the empirical model in a wider temperature range was assessed using the off-gas data collected during two small-scale research melter runs. The first data set came from the Small Cylindrical Melter-2 run in 1985 with the sludge feed coupled with the precipitate hydrolysis product. The second data set came from the 774-A melter run in 1996 with the sludge-only feed prepared with the modified acid addition strategy during the feed pretreatment step. The results of the assessment showed that the data from these two melter runs agreed well with the existing model, and further provided the basis for extending the lower temperature bound of the model to the measured melter vapor space temperature of 445 degrees C.

  11. Validation Analysis of the Shoal Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    A. Hassan; J. Chapman

    2008-11-01

    Environmental restoration at the Shoal underground nuclear test is following a process prescribed by a Federal Facility Agreement and Consent Order (FFACO) between the U.S. Department of Energy, the U.S. Department of Defense, and the State of Nevada. Characterization of the site included two stages of well drilling and testing in 1996 and 1999, and development and revision of numerical models of groundwater flow and radionuclide transport. Agreement on a contaminant boundary for the site and a corrective action plan was reached in 2006. Later that same year, three wells were installed for the purposes of model validation and site monitoring. The FFACO prescribes a five-year proof-of-concept period for demonstrating that the site groundwater model is capable of producing meaningful results with an acceptable level of uncertainty. The corrective action plan specifies a rigorous seven step validation process. The accepted groundwater model is evaluated using that process in light of the newly acquired data. The conceptual model of ground water flow for the Project Shoal Area considers groundwater flow through the fractured granite aquifer comprising the Sand Springs Range. Water enters the system by the infiltration of precipitation directly on the surface of the mountain range. Groundwater leaves the granite aquifer by flowing into alluvial deposits in the adjacent basins of Fourmile Flat and Fairview Valley. A groundwater divide is interpreted as coinciding with the western portion of the Sand Springs Range, west of the underground nuclear test, preventing flow from the test into Fourmile Flat. A very low conductivity shear zone east of the nuclear test roughly parallels the divide. The presence of these lateral boundaries, coupled with a regional discharge area to the northeast, is interpreted in the model as causing groundwater from the site to flow in a northeastward direction into Fairview Valley. Steady-state flow conditions are assumed given the absence of

  12. Project Report of Virtual Experiments in Marine Bioacoustics: Model Validation

    Science.gov (United States)

    2010-08-01

  13. GEOCHEMICAL RECOGNITION OF SPILLED SEDIMENTS USED IN NUMERICAL MODEL VALIDATION

    Institute of Scientific and Technical Information of China (English)

    Jens R.VALEUR; Steen LOMHOLT; Christian KNUDSEN

    2004-01-01

    A fixed link (tunnel and bridge, in total 16 km) was constructed between Sweden and Denmark during 1995-2000. As part of the work, approximately 16 million tonnes of seabed materials (limestone and clay till) were dredged, and about 0.6 million tonnes of these were spilled in the water. Modelling of the spreading and sedimentation of the spilled sediments took place as part of the environmental monitoring of the construction activities. In order to verify the results of the numerical modelling of sediment spreading and sedimentation, a new method with the purpose of distinguishing between the spilled sediments and the naturally occurring sediments was developed. Because the spilled sediments tend to accumulate at the seabed in areas with natural sediments of the same size, it is difficult to separate these based purely on physical properties. The new method is based on the geochemical differences between the natural sediment in the area and the spill. The basic properties used are the higher content of calcium carbonate material in the spill as compared to the natural sediments, and the higher Ca/Sr ratio in the spill compared to the shell fragments dominating the natural calcium carbonate deposition in the area. The reason for these differences is that carbonate derived from recent shell debris can be discriminated from Danien limestone, which is the material in which the majority of the dredging took place, on the basis of the Ca/Sr ratio being 488 in Danien limestone and 237 in shell debris. The geochemical recognition of the origin of the sediments proved useful in separating the spilled from the naturally occurring sediments. Without this separation, validation of the modelling of the accumulation of spilled sediments would not have been possible. The method has general validity and can be used in many situations where the origin of a given sediment is sought.
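
    The unmixing arithmetic implied by these endmember ratios can be sketched as follows. Mixing is linear in Sr/Ca (mass balance on Sr per unit of carbonate Ca) rather than in Ca/Sr, so the ratios are inverted before interpolating; treating a sample as a binary limestone/shell mixture is an idealization, and the measured ratio used below is hypothetical.

      def spill_fraction(ca_sr_measured, ca_sr_spill=488.0, ca_sr_shell=237.0):
          """Fraction of carbonate Ca attributable to the limestone spill,
          from a two-endmember mixture with the Ca/Sr ratios quoted above."""
          inv_m = 1.0 / ca_sr_measured
          inv_spill = 1.0 / ca_sr_spill
          inv_shell = 1.0 / ca_sr_shell
          f = (inv_shell - inv_m) / (inv_shell - inv_spill)
          return min(max(f, 0.0), 1.0)   # clamp to the physical range

      print(spill_fraction(350.0))  # hypothetical measured Ca/Sr -> about 0.63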

  14. The Development and Validation of the Social Networking Experiences Questionnaire: A Measure of Adolescent Cyberbullying and Its Impact.

    Science.gov (United States)

    Dredge, Rebecca; Gleeson, John; Garcia, Xochitl de la Piedad

    2015-01-01

    The measurement of cyberbullying has been marked by several inconsistencies that lead to difficulties in cross-study comparisons of the frequency of occurrence and the impact of cyberbullying. Consequently, the first aim of this study was to develop a measure of the experience with and impact of cyberbullying victimization in social networking sites in adolescents. The second aim was to investigate the psychometric properties of the purpose-built measure (Social Networking Experiences Questionnaire [SNEQ]). Exploratory factor analysis on 253 adolescent social networking site users produced a six-factor model of impact. However, one factor was removed because of low internal consistency. Cronbach's alpha was higher than .76 for the victimization and remaining five impact subscales. Furthermore, correlation coefficients for the Victimization scale and related dimensions showed good construct validity. The utility of the SNEQ for victim support personnel, research, and cyberbullying education/prevention programs is discussed.
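
    Cronbach's alpha, the internal-consistency criterion used here, is a standard quantity; a minimal sketch for a respondents-by-items score matrix:

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_respondents, n_items) score matrix:
          alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1.0) * (1.0 - item_var / total_var)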

  15. Development and validation of a realistic head model for EEG

    Science.gov (United States)

    Bangera, Nitin Bhalchandra

    The utility of extracranial electrical or magnetic field recordings (EEG or MEG) is greatly enhanced if the generators of the bioelectromagnetic fields can be determined accurately from the measured fields. This procedure, known as the 'inverse method,' depends critically on calculations of the projection from generators in the brain to the EEG and MEG sensors. Improving and validating this calculation, known as the 'forward solution,' is the focus of this dissertation. The improvements involve more accurate modeling of the structures of the brain and thus understanding how current flows within the brain as a result of the addition of structures in the forward model. Validation compares calculations using different forward models to the experimental results obtained by stimulating with implanted dipole electrodes. Human head tissue displays inhomogeneity in electrical conductivity and also anisotropy, notably in the skull and brain white matter. In this dissertation, a realistic head model has been implemented using the finite element method to calculate the effects of inhomogeneity and anisotropy in the human brain. Accurate segmentation of the brain tissue type is implemented using a semi-automatic method to segment multimodal imaging data from multi-spectral MRI scans (different flip angles) in conjunction with the regular T1-weighted scans and computed x-ray tomography images. The electrical conductivity in the anisotropic white matter tissue is quantified from diffusion tensor MRI. The finite element model is constructed using AMIRA, a commercial segmentation and visualization tool, and solved using ABAQUS, a commercial finite element solver. The model is validated using experimental data collected from intracranial stimulation in medically intractable epileptic patients. Depth electrodes are implanted in medically intractable epileptic patients in order to direct surgical therapy when the foci cannot be localized with the scalp EEG. These patients

  16. Validation of an instrument to measure patients' experiences of medicine use: the Living with Medicines Questionnaire.

    Science.gov (United States)

    Krska, Janet; Katusiime, Barbra; Corlett, Sarah A

    2017-01-01

    Medicine-related burden is an increasingly recognized concept, stemming from the rising tide of polypharmacy, which may impact on patient behaviors, including nonadherence. No instruments currently exist which specifically measure medicine-related burden. The Living with Medicines Questionnaire (LMQ) was developed for this purpose. This study validated the LMQ in a sample of adults using regular prescription medicines in the UK. Questionnaires were distributed in community pharmacies and public places in southeast England or online through UK health websites and social media. A total of 1,177 were returned: 507 (43.1%) from pharmacy distribution and 670 (56.9%) online. Construct validity was assessed by principal components analysis and item reduction undertaken on the original 60-item pool. Known-groups analysis assessed differences in mean total scores between participants using different numbers of medicines and between those who did or did not require assistance with medicine use. Internal consistency was assessed by Cronbach's alpha. Free-text comments were analyzed thematically to substantiate underlying dimensions. A 42-item, eight-factor structure comprising intercorrelated dimensions (patient-doctor relationships and communication about medicines, patient-pharmacist communication about medicines, interferences with daily life, practical difficulties, effectiveness, acceptance of medicine use, autonomy/control over medicines and concerns about medicine use) was derived, which explained 57.4% of the total variation. Six of the eight subscales had acceptable internal consistency (α>0.7). More positive experiences were observed among patients using eight or fewer medicines compared to nine or more, and those independent with managing/using their medicines versus those requiring assistance. Free-text comments, provided by almost a third of the respondents, supported the domains identified. The resultant LMQ-2 is a valid and reliable multidimensional measure of medicine-related burden.

  17. Estimating the predictive validity of diabetic animal models in rosiglitazone studies.

    Science.gov (United States)

    Varga, O E; Zsíros, N; Olsson, I A S

    2015-06-01

    For therapeutic studies, the predictive validity of animal models - arguably the most important feature of animal models in terms of human relevance - can be calculated retrospectively by obtaining data on treatment efficacy from human and animal trials. Using rosiglitazone as a case study, we aim to determine the predictive validity of animal models of diabetes by analysing which models perform most similarly to humans during rosiglitazone treatment in terms of changes in standard diabetes diagnosis parameters (glycosylated haemoglobin [HbA1c] and fasting glucose levels). A further objective of this paper was to explore the impact of four covariates on the predictive capacity: (i) diabetes induction method; (ii) drug administration route; (iii) sex of animals and (iv) diet during the experiments. Despite the variable consistency of animal species-based models with the human reference for glucose and HbA1c treatment effects, our results show that glucose and HbA1c treatment effects in rats agreed better with the expected values based on human data than those in other species. The induction method was also found to be a substantial factor affecting animal model performance. The study concluded that regular reassessment of animal models can help to identify the human relevance of each model and to adapt research designs to actual research goals.

  18. Validation of Eulerian modeling of gas-solid fluidized beds using nonlinear analysis

    Science.gov (United States)

    Norouzi, Y.; Norouzi, H. R.; Zarghami, R.

    2015-03-01

    In the present work, a new advanced method has been used for validation of the two-fluid method (TFM) for modeling the hydrodynamics of gas-solid fluidized beds. While many investigations have addressed validation of CFD codes, less effort has been made to validate the nonlinear dynamics of fluidized beds with nonlinear methods. In this work, the advanced nonlinear methods of recurrence plots (RP) and recurrence quantification analysis (RQA) have been used to validate the hydrodynamic modeling of gas-solid fluidized beds. Pressure fluctuations inside the bed were selected for comparing the nonlinear dynamics of the fluidized bed in experiments and simulations. Pressure fluctuations were measured in a rectangular fluidized bed containing Geldart group B particles at a sampling frequency of 400 Hz. Simulations were performed under the same operating conditions, and pressure fluctuations were recorded at the same sampling frequency. The superficial air velocity range was 0.25-0.73 m/s (bubbling regime) and two aspect ratios, 1 and 1.5, were used. The results of experiments and simulations were analyzed by RP and RQA. The values of laminarity, determinism, and recurrence rate for the experiments and the simulations are close to each other. Both the experimental and simulation results showed high values of determinism and laminarity, indicating predictable and periodic behavior of both systems. High values of determinism and laminarity are among the most important characteristics of the bubbling regime, in which bubbles are periodically produced and move through the bed. Over the entire gas velocity range, the values of determinism and laminarity did not change significantly, which shows that no regime change occurred in the bed.
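
    A minimal sketch of two of the RQA quantities named above (recurrence rate and determinism), assuming a delay embedding with hypothetical parameters. Production RQA implementations typically also exclude the line of identity, apply a Theiler window, and compute laminarity from vertical line structures; none of that is shown here.

      import numpy as np

      def recurrence_matrix(x, dim=3, tau=2, eps=0.3):
          """Binary recurrence matrix of a time series after delay embedding."""
          x = np.asarray(x, float)
          n = len(x) - (dim - 1) * tau
          emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
          d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
          return (d < eps).astype(int)

      def rqa_measures(r, lmin=2):
          """Recurrence rate, and determinism = fraction of recurrence points
          lying on diagonal lines of length >= lmin (main diagonal included,
          a simplification)."""
          n = r.shape[0]
          rr = r.sum() / n ** 2
          in_lines = 0
          for k in range(-(n - 1), n):        # scan every diagonal
              run = 0
              for v in list(np.diagonal(r, offset=k)) + [0]:  # 0 flushes last run
                  if v:
                      run += 1
                  else:
                      if run >= lmin:
                          in_lines += run
                      run = 0
          det = in_lines / r.sum() if r.sum() else 0.0
          return rr, det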

  19. Modelling and validation of spectral reflectance for the colon

    Science.gov (United States)

    Hidovic-Rowe, Dzena; Claridge, Ela

    2005-03-01

    The spectral reflectance of the colon is known to be affected by malignant and pre-malignant changes in the tissue. As part of long-term research on the derivation of diagnostically important parameters characterizing colon histology, we have investigated the effects of the normal histological variability on the remitted spectra. This paper presents a detailed optical model of the normal colon comprising mucosa, submucosa and the smooth muscle layer. Each layer is characterized by five variable histological parameters: the volume fraction of blood, the haemoglobin saturation, the size of the scattering particles, including collagen, the volume fraction of the scattering particles and the layer thickness, and three optical parameters: the anisotropy factor, the refractive index of the medium and the refractive index of the scattering particles. The paper specifies the parameter ranges corresponding to normal colon tissue, including some previously unpublished ones. Diffuse reflectance spectra were modelled using the Monte Carlo method. Validation of the model-generated spectra against measured spectra demonstrated that good correspondence was achieved between the two. The analysis of the effect of the individual histological parameters on the behaviour of the spectra has shown that the spectral variability originates mainly from changes in the mucosa. However, the submucosa and the muscle layer must be included in the model as they have a significant constant effect on the spectral reflectance above 600 nm. The nature of variations in the spectra also suggests that it may be possible to carry out model inversion and to recover parameters characterizing the colon from multi-spectral images. A preliminary study, in which the mucosal blood and collagen parameters were modified to reflect histopathological changes associated with colon cancer, has shown that the spectra predicted by our model resemble measured spectral reflectance of adenocarcinomas. This suggests that

  20. PIV validation of blood-heart valve leaflet interaction modelling.

    Science.gov (United States)

    Kaminsky, R; Dumont, K; Weber, H; Schroll, M; Verdonck, P

    2007-07-01

    The aim of this study was to validate the 2D computational fluid dynamics (CFD) results of a moving heart valve based on a fluid-structure interaction (FSI) algorithm with experimental measurements. Firstly, a pulsatile laminar flow through a monoleaflet valve model with a stiff leaflet was visualized by means of Particle Image Velocimetry (PIV). The inflow data sets were applied to a CFD simulation including blood-leaflet interaction. The measurement section with a fixed leaflet was enclosed into a standard mock loop in series with a Harvard Apparatus Pulsatile Blood Pump, a compliance chamber and a reservoir. Standard 2D PIV measurements were made at a frequency of 60 bpm. Average velocity magnitude results of 36 phase-locked measurements were evaluated at every 10 degrees of the pump cycle. For the CFD flow simulation, a commercially available package from Fluent Inc. was used in combination with in-house developed FSI code based on the Arbitrary Lagrangian-Eulerian (ALE) method. Then the CFD code was applied to the leaflet to quantify the shear stress on it. Generally, the CFD results are in agreement with the PIV evaluated data in major flow regions, thereby validating the FSI simulation of a monoleaflet valve with a flexible leaflet. The applicability of the new CFD code for quantifying the shear stress on a flexible leaflet is thus demonstrated.

  1. ADOPT: A Historically Validated Light Duty Vehicle Consumer Choice Model

    Energy Technology Data Exchange (ETDEWEB)

    Brooker, A.; Gonder, J.; Lopp, S.; Ward, J.

    2015-05-04

    The Automotive Deployment Option Projection Tool (ADOPT) is a light-duty vehicle consumer choice and stock model supported by the U.S. Department of Energy's Vehicle Technologies Office. It estimates technology improvement impacts on U.S. light-duty vehicle sales, petroleum use, and greenhouse gas emissions. ADOPT uses techniques from the multinomial logit method and the mixed logit method to estimate sales. Specifically, it estimates sales based on the weighted value of key attributes including vehicle price, fuel cost, acceleration, range and usable volume. The average importance of several attributes changes nonlinearly across their ranges and with income. For several attributes, a distribution of importance around the average value is used to represent consumer heterogeneity. The majority of existing vehicle makes, models, and trims are included to fully represent the market, and the Corporate Average Fuel Economy regulations are enforced. The sales feed into the ADOPT stock model, which captures key aspects for summing petroleum use and greenhouse gas emissions, including the change in vehicle miles traveled by vehicle age, the creation of new model options based on the success of existing vehicles, new vehicle option introduction rate limits, and survival rates by vehicle age. ADOPT has been extensively validated with historical sales data; it matches in key dimensions including sales by fuel economy, acceleration, price, vehicle size class, and powertrain across multiple years. A graphical user interface provides easy and efficient use, managing the inputs, simulation, and results.
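
    The core sales step can be illustrated with a minimal multinomial-logit sketch; the attribute values and weights below are hypothetical, not ADOPT's calibrated ones, and ADOPT's nonlinear, income-dependent importance curves are not reproduced.

      import numpy as np

      def choice_shares(attributes, weights):
          """Multinomial-logit market shares: utility is a weighted sum of
          vehicle attributes; share_i = exp(V_i) / sum_j exp(V_j)."""
          v = np.asarray(attributes, float) @ np.asarray(weights, float)
          v -= v.max()                      # shift for numerical stability
          e = np.exp(v)
          return e / e.sum()

      # Hypothetical vehicles x [negative price ($k), negative fuel cost, acceleration]:
      x = np.array([[-25.0, -1.2, 7.0],
                    [-32.0, -0.6, 6.0]])
      w = np.array([0.05, 0.8, 0.1])
      print(choice_shares(x, w))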

  2. Development and validation of a railgun hydrogen pellet injector model

    Energy Technology Data Exchange (ETDEWEB)

    King, T.L. [Univ. of Houston, TX (United States). Dept. of Electrical and Computer Engineering; Zhang, J.; Kim, K. [Univ. of Illinois, Urbana, IL (United States). Dept. of Electrical and Computer Engineering

    1995-12-31

    A railgun hydrogen pellet injector model is presented and its predictions are compared with the experimental data. High-speed hydrogenic ice injection is the dominant refueling method for magnetically confined plasmas used in controlled thermonuclear fusion research. As experimental devices approach the scale of power-producing fusion reactors, the fueling requirements become increasingly more difficult to meet since, due to the large size and the high electron densities and temperatures of the plasma, hypervelocity pellets of a substantial size will need to be injected into the plasma continuously and at high repetition rates. Advanced technologies, such as the railgun pellet injector, are being developed to address this demand. Despite the apparent potential of electromagnetic launchers to produce hypervelocity projectiles, physical effects that were neither anticipated nor well understood have made it difficult to realize this potential. Therefore, it is essential to understand not only the theory behind railgun operation, but the primary loss mechanisms, as well. Analytic tools have been used by many researchers to design and optimize railguns and analyze their performance. This has led to a greater understanding of railgun behavior and opened the door for further improvement. A railgun hydrogen pellet injector model has been developed. The model is based upon a pellet equation of motion that accounts for the dominant loss mechanisms, inertial and viscous drag. The model has been validated using railgun pellet injectors developed by the Fusion Technology Research Laboratory at the University of Illinois at Urbana-Champaign.

  3. Using of Structural Equation Modeling Techniques in Cognitive Levels Validation

    Directory of Open Access Journals (Sweden)

    Natalija Curkovic

    2012-10-01

    Full Text Available When constructing knowledge tests, cognitive level is usually one of the dimensions comprising the test specifications, with each item assigned to measure a particular level. Recently used taxonomies of the cognitive levels most often represent some modification of the original Bloom's taxonomy. There are many concerns in the current literature about the existence of predefined cognitive levels. The aim of this article is to investigate whether structural equation modeling techniques can confirm the existence of different cognitive levels. For the purpose of the research, a Croatian final high-school Mathematics exam was used (N = 9626). Confirmatory factor analysis and structural regression modeling were used to test three different models. Structural equation modeling techniques did not support the existence of different cognitive levels in this case. There is more than one possible explanation for that finding. Some other techniques that take into account nonlinear behaviour of the items, as well as qualitative techniques, might be more useful for the purpose of cognitive level validation. Furthermore, it seems that cognitive levels were not efficient descriptors of the items, and so improvements are needed in describing the cognitive skills measured by the items.

  4. Solar models, neutrino experiments, and helioseismology

    Science.gov (United States)

    Bahcall, John N.; Ulrich, Roger K.

    1988-01-01

    The event rates and their recognized uncertainties are calculated for 11 solar neutrino experiments using accurate solar models. These models are also used to evaluate the frequency spectrum of the p and g oscillation modes of the Sun. It is shown that the discrepancy between the predicted and observed event rates in the Cl-37 and Kamiokande II experiments cannot be explained by a 'likely' fluctuation in input parameters with the best estimates and uncertainties given in the present study. It is suggested that, whatever the correct solution to the solar neutrino problem, it is unlikely to be a 'trivial' error.

  5. Development, validation and application of numerical space environment models

    Science.gov (United States)

    Honkonen, Ilja

    2013-10-01

    Currently the majority of space-based assets are located inside the Earth's magnetosphere where they must endure the effects of the near-Earth space environment, i.e. space weather, which is driven by the supersonic flow of plasma from the Sun. Space weather refers to the day-to-day changes in the temperature, magnetic field and other parameters of the near-Earth space, similarly to ordinary weather which refers to changes in the atmosphere above ground level. Space weather can also cause adverse effects on the ground, for example, by inducing large direct currents in power transmission systems. The performance of computers has been growing exponentially for many decades and as a result the importance of numerical modeling in science has also increased rapidly. Numerical modeling is especially important in space plasma physics because there are no in-situ observations of space plasmas outside of the heliosphere and it is not feasible to study all aspects of space plasmas in a terrestrial laboratory. With the increasing number of computational cores in supercomputers, the parallel performance of numerical models on distributed memory hardware is also becoming crucial. This thesis consists of an introduction, four peer reviewed articles and describes the process of developing numerical space environment/weather models and the use of such models to study the near-Earth space. A complete model development chain is presented starting from initial planning and design to distributed memory parallelization and optimization, and finally testing, verification and validation of numerical models. A grid library that provides good parallel scalability on distributed memory hardware and several novel features, the distributed cartesian cell-refinable grid (DCCRG), is designed and developed. DCCRG is presently used in two numerical space weather models being developed at the Finnish Meteorological Institute. The first global magnetospheric test particle simulation based on the

  6. Experiences using IAEA Code of practice for radiation sterilization of tissue allografts: Validation and routine control

    Energy Technology Data Exchange (ETDEWEB)

    Hilmy, N. [Batan Research Tissue Bank (BRTB), Centre for Research and Development of Isotopes and Radiation Technology, P.O. Box 7002, JKSKL, Jakarta 12070 (Indonesia)], E-mail: nazly@batan.go.id; Febrida, A.; Basril, A. [Batan Research Tissue Bank (BRTB), Centre for Research and Development of Isotopes and Radiation Technology, P.O. Box 7002, JKSKL, Jakarta 12070 (Indonesia)

    2007-11-15

    The main problem in applying International Standard (ISO) 11137 to the validation of the radiation sterilization dose (RSD) for tissue allografts is the limited, low number of uniform samples per production batch, i.e., products obtained from one donor. An allograft is a graft transplanted between two different individuals of the same species. The minimum number of uniform samples needed for a verification dose (VD) experiment at the selected sterility assurance level (SAL) per production batch according to the IAEA Code is 20, i.e., 10 for bioburden determination and the remaining 10 for the sterility test. Three methods of the IAEA Code have been used for validation of the RSD, i.e., method A1, which is a modification of method 1 of ISO 11137:1995, method B (ISO 13409:1996), and method C (AAMI TIR 27:2001). This paper describes VD experiments using uniform products obtained from one cadaver donor, i.e., cancellous bones and demineralized bone powders, and amnion grafts from one living donor. Results of the verification dose experiments show that the RSD is 15.4 kGy for cancellous and demineralized bone grafts and 19.2 kGy for amnion grafts according to method A1, and 25 kGy according to methods B and C.
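
    The dose-setting logic can be illustrated in very simplified form by first-order inactivation kinetics. The actual IAEA/ISO methods use tabulated microbial resistance distributions rather than a single D10 value, so the function and the numbers below are illustrative assumptions only.

```python
import math

def sterilization_dose(bioburden_cfu, d10_kgy, sal=1e-6):
    """Dose (kGy) needed to reduce an average bioburden to a target SAL,
    assuming simple single-D10, first-order inactivation (a toy model,
    not IAEA Code method A1/B/C)."""
    return d10_kgy * (math.log10(bioburden_cfu) - math.log10(sal))

# e.g. 100 CFU per graft and an assumed D10 of 2 kGy:
print(sterilization_dose(100, 2.0))   # 2 * (2 - (-6)) = 16 kGy
```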

  7. Experiments of Laser Pointing Stability in Air and in Vacuum to Validate Micrometric Positioning Sensor

    CERN Document Server

    Stern, G; Piedigrossi, D; Sandomierski, J; Sosin, M; Geiger, A; Guillaume, S

    2014-01-01

    Aligning accelerator components over 200 m with 10 μm accuracy is a challenging task within the Compact Linear Collider (CLIC) study. A solution based on a laser beam in vacuum as a straight-line reference is proposed. The positions of the accelerator’s components are measured with respect to the laser beam by sensors made of camera/shutter assemblies. To validate these sensors, laser pointing stability has to be studied over 200 m. We perform experiments in air and in vacuum in order to determine how laser pointing stability varies with the distance of propagation and with the environment. The experiments show that the standard deviations of the laser spot coordinates increase with the distance of propagation. They also show that the standard deviations are much smaller in vacuum (8 μm at 35 m) than in air (2000 μm at 200 m). Our experiment validates the concept of a laser beam in vacuum with a camera/shutter assembly for micrometric positioning over 35 m. It also gives an estimation of the achievable precision.
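
    The quoted standard deviations are spreads of the laser spot position over repeated camera frames. A minimal sketch of that post-processing, assuming the frames are available as a NumPy array, is:

```python
import numpy as np

def spot_centroids(frames):
    """Intensity-weighted centroid of the laser spot in each frame.
    `frames` is an (n_frames, ny, nx) array of pixel intensities."""
    n, ny, nx = frames.shape
    ys, xs = np.mgrid[0:ny, 0:nx]
    w = frames.reshape(n, -1).astype(float)
    w /= w.sum(axis=1, keepdims=True)
    return w @ xs.ravel(), w @ ys.ravel()   # (cx, cy) in pixels

# Pointing stability = std of the centroid over time; multiply by the
# pixel pitch (mm/pixel) to compare with the 8 um and 2000 um figures.
# cx, cy = spot_centroids(frames); print(cx.std(ddof=1), cy.std(ddof=1))
```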

  8. Validation of an instrument to measure patients’ experiences of medicine use: the Living with Medicines Questionnaire

    Directory of Open Access Journals (Sweden)

    Krska J

    2017-03-01

    Janet Krska, Barbra Katusiime, Sarah A Corlett Medway School of Pharmacy, The Universities of Kent and Greenwich, Chatham Maritime, UK Background: Medicine-related burden is an increasingly recognized concept, stemming from the rising tide of polypharmacy, which may impact on patient behaviors, including nonadherence. No instruments currently exist which specifically measure medicine-related burden. The Living with Medicines Questionnaire (LMQ) was developed for this purpose. Objective: This study validated the LMQ in a sample of adults using regular prescription medicines in the UK. Methods: Questionnaires were distributed in community pharmacies and public places in southeast England or online through UK health websites and social media. A total of 1,177 were returned: 507 (43.1%) from pharmacy distribution and 670 (56.9%) online. Construct validity was assessed by principal components analysis, and item reduction was undertaken on the original 60-item pool. Known-groups analysis assessed differences in mean total scores between participants using different numbers of medicines and between those who did or did not require assistance with medicine use. Internal consistency was assessed by Cronbach’s alpha. Free-text comments were analyzed thematically to substantiate underlying dimensions. Results: A 42-item, eight-factor structure comprising intercorrelated dimensions (patient–doctor relationships and communication about medicines, patient–pharmacist communication about medicines, interferences with daily life, practical difficulties, effectiveness, acceptance of medicine use, autonomy/control over medicines and concerns about medicine use) was derived, which explained 57.4% of the total variation. Six of the eight subscales had acceptable internal consistency (α > 0.7). More positive experiences were observed among patients using eight or fewer medicines compared to nine or more, and those independent with managing/using their medicines versus
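
    Internal consistency of the kind reported above is usually computed as Cronbach's alpha. A minimal sketch, assuming the subscale responses sit in a respondents-by-items matrix:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# subscale = responses[:, subscale_columns]   # hypothetical names
# print(cronbach_alpha(subscale))             # acceptable if > 0.7
```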

  9. Hadronic Shower Validation Experience for the ATLAS End-Cap Calorimeter

    Science.gov (United States)

    Kiryunin, A. E.; Salihagić, D.

    2007-03-01

    Validation of GEANT4 hadronic physics models is carried out by comparing experimental data from beam tests of modules of the ATLAS end-cap calorimeters with GEANT4-based simulations. Two physics lists (LHEP and QGSP) for the simulation of hadronic showers are evaluated. Calorimeter performance parameters, such as the energy resolution and response for charged pions, as well as shower shapes, are studied. A comparison with GEANT3 predictions is made as well.

  10. Soil process modelling in CZO research: gains in data harmonisation and model validation

    Science.gov (United States)

    van Gaans, Pauline; Andrianaki, Maria; Kobierska, Florian; Kram, Pavel; Lamacova, Anna; Lair, Georg; Nikolaidis, Nikos; Duffy, Chris; Regelink, Inge; van Leeuwen, Jeroen P.; de Ruiter, Peter

    2014-05-01

    Various soil process models were applied to four European Critical Zone observatories (CZOs), the core research sites of the FP7 project SoilTrEC: the Damma glacier forefield (CH), a set of three forested catchments on geochemically contrasting bedrocks in the Slavkov Forest (CZ), a chronosequence of soils in the former floodplain of the Danube at Fuchsenbigl/Marchfeld (AT), and the Koiliaris catchments in the north-western part of Crete (GR). The aim of the modelling exercises was to apply and test soil process models with data from the CZOs for calibration/validation, to identify potential limits to the application scope of the models, to interpret soil state and soil functions at key stages of the soil life cycle, represented by the four SoilTrEC CZOs, and to contribute towards harmonisation of data and data acquisition. The models identified as specifically relevant were: the Penn State Integrated Hydrologic Model (PIHM), a fully coupled, multiprocess, multi-scale hydrologic model, to get a better understanding of water flow and pathways; the Soil and Water Assessment Tool (SWAT), a deterministic, continuous-time (daily time step) basin-scale model, to evaluate the impact of soil management practices; the Rothamsted Carbon model (Roth-C), to simulate organic carbon turnover, and the Carbon, Aggregation, and Structure Turnover (CAST) model, to include the role of soil aggregates in carbon dynamics; the Ligand Charge Distribution (LCD) model, to understand the interaction between organic matter and oxide surfaces in soil aggregate formation; and the Terrestrial Ecology Model (TEM), to obtain insight into the link between foodweb structure and carbon and nutrient turnover. With some exceptions all models were applied to all four CZOs. The need for specific model input contributed largely to data harmonisation. The comparisons between the CZOs turned out to be of great value for understanding the strength and limitations of the models, as well as the differences in soil conditions

  11. Numerical investigation and experimental validation of physically based advanced GTN model for DP steels

    Energy Technology Data Exchange (ETDEWEB)

    Fansi, Joseph, E-mail: jfansi@doct.ulg.ac.be [University of Liège, Departement ArGEnCo, Division MS2F, Chemin des Chevreuils 1, Liège 4000 (Belgium); Arts et Métiers ParisTech, LEM3, UMR CNRS 7239, 4 rue A. Fresnel, 57078 Metz cedex 03 (France); ArcelorMittal R and D Global Maizières S.A., voie Romaine, Maizières-Lès-Metz 57238 (France); Balan, Tudor [Arts et Métiers ParisTech, LEM3, UMR CNRS 7239, 4 rue A. Fresnel, 57078 Metz cedex 03 (France); Lemoine, Xavier [Arts et Métiers ParisTech, LEM3, UMR CNRS 7239, 4 rue A. Fresnel, 57078 Metz cedex 03 (France); ArcelorMittal R and D Global Maizières S.A., voie Romaine, Maizières-Lès-Metz 57238 (France); Maire, Eric; Landron, Caroline [INSA de Lyon, MATEIS CNRS UMR5510, 7 Avenue Jean Capelle, Villeurbanne 69621 (France); Bouaziz, Olivier [ArcelorMittal R and D Global Maizières S.A., voie Romaine, Maizières-Lès-Metz 57238 (France); Ecole des Mines de Paris, Centre des Matériaux, CNRS UMR 7633, BP 87, Evry Cedex 91003 (France); Ben Bettaieb, Mohamed [Ensicaen, 6 Boulevard du Maréchal Juin, 14050 CAEN Cedex 4 (France); Marie Habraken, Anne [University of Liège, Departement ArGEnCo, Division MS2F, Chemin des Chevreuils 1, Liège 4000 (Belgium)

    2013-05-01

    This numerical investigation of an advanced Gurson–Tvergaard–Needleman (GTN) model is an extension of the original work of Ben Bettaieb et al. (2011) [18]. The model has been implemented as a user-defined material model subroutine (VUMAT) in the Abaqus/Explicit FE code. The current damage model extends the previous version by integrating the three damage mechanisms: nucleation, growth and coalescence of voids. Physically based void nucleation and growth laws are considered, including an effect of the kinematic hardening. These new contributions are based on, and validated against, experimental results provided by high-resolution X-ray absorption tomography measurements. The current damage model is applied to predict the damage evolution and the stress state in a tensile notched specimen experiment.
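
    For orientation, the yield surface that GTN-type models build on can be written down directly. The sketch below evaluates the classical GTN yield function with commonly used Tvergaard parameters; it is the generic textbook form, not the extended model of this paper.

```python
import numpy as np

def gtn_yield(sig_eq, sig_h, sig_y, f_star, q1=1.5, q2=1.0, q3=2.25):
    """Classical GTN yield function (values <= 0 lie inside the surface).
    sig_eq: von Mises stress, sig_h: hydrostatic stress,
    sig_y: matrix flow stress, f_star: effective void volume fraction."""
    return ((sig_eq / sig_y) ** 2
            + 2.0 * q1 * f_star * np.cosh(1.5 * q2 * sig_h / sig_y)
            - 1.0 - q3 * f_star ** 2)

# A growing void fraction shrinks the yield surface:
for f in (0.0, 0.01, 0.05):
    print(f, gtn_yield(sig_eq=300.0, sig_h=100.0, sig_y=350.0, f_star=f))
```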

  12. Modeling and Experimental Validation of a Transient Direct Expansion Heat Pump

    Directory of Open Access Journals (Sweden)

    Clément Rousseau

    2017-06-01

    Geothermal heat pump technology is currently one of the most interesting technologies used to heat buildings. There are two designs used in the industry: the geothermal heat pump using a secondary ground loop and the Direct Expansion (DX) ground source heat pump. The latter is less used, possibly because less research has been carried out on the design of this kind of heat pump. In this paper, a transient model of a DX ground heat pump, built in Comsol Multiphysics, is presented in heating mode with R22, and a comparison with experimental results from a 24-hour test is presented. It is shown that the model was adequately validated by the experiment, with a maximum difference of only 15%. Following this validation, a parametric analysis was carried out on the geometry of the borehole. This study concluded that, to obtain the best heat extraction from the ground, the shank spacing between the pipes needs to be large, without increasing the borehole diameter. Keywords: Direct Expansion geothermal heat pump, Modeling, R22

  13. Validation of the Critical Heat Flux model of COBRA-TF against Post-Dryout experiments performed by the Royal Institute of Technology (KTH)

    Energy Technology Data Exchange (ETDEWEB)

    Abarca, A.; Miro, R.; Barrachina, T.; Verdu, G.

    2014-07-01

    This work presents a validation of the results obtained with the different existing correlations in the CTF code for predicting the value and location of the CHF, applying them to the Post-Dryout experiments conducted by the Royal Institute of Technology (KTH) in Stockholm, Sweden. (Author)

  14. Validation of a Parcel-Based Reduced-Complexity Model for River Delta Formation (Invited)

    Science.gov (United States)

    Liang, M.; Geleynse, N.; Passalacqua, P.; Edmonds, D. A.; Kim, W.; Voller, V. R.; Paola, C.

    2013-12-01

    Reduced-Complexity Models (RCMs) take an intuitive yet quantitative approach to represent processes with the goal of getting maximum return in emergent system-scale behavior with minimum investment in computational complexity. This approach is in contrast to reductionist models that aim at rigorously solving the governing equations of fluid flow and sediment transport. RCMs have had encouraging successes in modeling a variety of geomorphic systems, such as braided rivers, alluvial fans, and river deltas. Despite the fact that these models are not intended to resolve detailed flow structures, questions remain on how to interpret and validate the output of RCMs beyond qualitative behavior-based descriptions. Here we present a validation of the newly developed RCM for river delta formation with channel dynamics (Liang, 2013). The model uses a parcel-based 'weighted-random-walk' method that resolves the formation of river deltas at the scale of channel dynamics (e.g., avulsions and bifurcations). The main focus of this validation work is the flow routing model component. A set of synthetic test cases were designed to compare hydrodynamic results from the RCM and Delft3D, including flow in a straight channel, around a bump, and flow partitioning at a single bifurcation. Output results, such as water surface slope and flow field, are also compared to field observations collected at Wax Lake Delta. Additionally, we investigate channel avulsion cycles and flow path selection in an alluvial fan with differential styles of subsidence and compare model results to laboratory experiments, as a preliminary effort in pairing up numerical and experimental models to understand channel organization at process scale. Strengths and weaknesses of the RCM are discussed and potential candidates for model application identified.
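
    The 'weighted-random-walk' parcel routing mentioned above can be illustrated with a toy routine: at each cell a parcel picks one of its neighbours with probability proportional to a weight field. The four-cell neighbourhood, the dict-based grid and the weight construction are simplifying assumptions, not the scheme of Liang (2013).

```python
import numpy as np

rng = np.random.default_rng(0)

def route_parcel(start, weights, max_steps=100):
    """Walk one water/sediment parcel across a grid. `weights` maps
    (i, j) -> routing weight (e.g. built from water-surface slope);
    cells absent from the dict are outside the domain."""
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    i, j = start
    path = [(i, j)]
    for _ in range(max_steps):
        w = np.array([weights.get((i + di, j + dj), 0.0)
                      for di, dj in offsets])
        if w.sum() == 0.0:
            break                                  # parcel left the domain
        di, dj = offsets[rng.choice(4, p=w / w.sum())]
        i, j = i + di, j + dj
        path.append((i, j))
    return path

# Averaging many such paths yields the emergent flow field that is
# compared against Delft3D and field data in the study above.
```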

  15. SSC 40 mm short model construction experience

    Energy Technology Data Exchange (ETDEWEB)

    Bossert, R.C.; Brandt, J.S.; Carson, J.A.; Dickey, C.E.; Gonczy, I.; Koska, W.A.; Strait, J.B.

    1990-04-01

    Several short model SSC magnets have been built and tested at Fermilab. They establish a preliminary step toward the construction of SSC long models. Many aspects of magnet design and construction are involved. Experience includes coil winding, curing and measuring, coil end part design and fabrication, ground insulation, instrumentation, collaring and yoke assembly. Fabrication techniques are explained. Design of tooling and magnet components not previously incorporated into SSC magnets are described. 14 refs., 18 figs., 2 tabs.

  16. PIV-validated numerical modeling of pulsatile flows in distal coronary end-to-side anastomoses.

    Science.gov (United States)

    Xiong, F L; Chong, C K

    2007-01-01

    This study employed particle image velocimetry (PIV) to validate a numerical model in a complementary approach to quantify hemodynamic factors in distal coronary anastomoses and to gain more insight into their relationship with anastomotic geometry. Instantaneous flow fields and wall shear stresses (WSS) were obtained from PIV measurement in a modified life-size silastic anastomosis model adapted from a conventional geometry by incorporating a smooth graft-artery transition. The results were compared with those predicted by a concurrent numerical model. The numerical method was then used to calculate cycle-averaged WSS (WSS(cyc)) and spatial wall shear stress gradient (SWSSG), two critical hemodynamic factors in the pathogenesis of intimal thickening (IT), to compare the conventional and modified geometries. Excellent qualitative agreement and satisfactory quantitative agreement, with averaged normalized error in WSS between 0.8% and 8.9%, were achieved between the PIV experiment and the numerical model. Compared to the conventional geometry, the modified geometry produces a more uniform WSS(cyc) distribution, eliminating both high and low WSS(cyc) around the toe, critical in avoiding IT. Peak SWSSG on the artery floor of the modified model is less than one-half that in the conventional case, and high SWSSG at the toe is eliminated. The validated numerical model is useful for modeling unsteady coronary anastomotic flows and elucidating the significance of geometry-regulated hemodynamics. The results suggest the clinical relevance of constructing a smooth graft-artery transition in distal coronary anastomoses to improve their hemodynamic performance.
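
    Near-wall PIV data are typically converted to WSS by fitting the tangential velocity profile close to the wall. A minimal sketch under stated assumptions (one-sided linear fit, Newtonian viscosity for a blood analogue):

```python
import numpy as np

def wall_shear_stress(u_tangential, y_dist, mu=3.5e-3):
    """WSS (Pa) from tangential velocities (m/s) sampled at small
    wall-normal distances y_dist (m); mu is dynamic viscosity (Pa.s)."""
    dudy = np.polyfit(y_dist, u_tangential, 1)[0]   # du/dy at the wall
    return mu * dudy

y = np.array([1e-4, 2e-4, 3e-4])      # m
u = np.array([0.02, 0.041, 0.059])    # m/s
print(wall_shear_stress(u, y))        # ~0.7 Pa
```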

  17. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the 'philosophy' behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical
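
    Point (2) of the approach, calibrating parameters and quantifying uncertainty in the estimates, can be sketched generically. The decomposition model below is a placeholder, not the foam chemistry model of the report; the residual bootstrap is one common way to attach uncertainty to fitted parameters.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

def model(t, k, n):
    """Placeholder decomposition model: remaining solid fraction."""
    return np.exp(-k * t ** n)

def calibrate_with_uncertainty(t, y, n_boot=500):
    """Fit parameters to calibration data, then bootstrap the residuals
    to estimate the spread of the parameter estimates."""
    p_hat, _ = curve_fit(model, t, y, p0=(0.1, 1.0))
    resid = y - model(t, *p_hat)
    boots = [curve_fit(model, t,
                       model(t, *p_hat) + rng.choice(resid, size=resid.size),
                       p0=p_hat)[0]
             for _ in range(n_boot)]
    return p_hat, np.std(boots, axis=0)   # estimates and their spread
```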

  18. Validation of the dermal exposure model in ECETOC TRA.

    Science.gov (United States)

    Marquart, Hans; Franken, Remy; Goede, Henk; Fransman, Wouter; Schinkel, Jody

    2017-08-01

    The ECETOC TRA model (presently version 3.1) is often used to estimate worker inhalation and dermal exposure in regulatory risk assessment. The dermal model in ECETOC TRA has not yet been validated by comparison with independent measured exposure levels. This was the goal of the present study. Measured exposure levels and relevant contextual information were gathered via literature search, websites of relevant occupational health institutes and direct requests for data to industry. Exposure data were clustered in so-called exposure cases, which are sets of data from one data source that are expected to have the same values for input parameters in the ECETOC TRA dermal exposure model. For each exposure case, the 75th percentile of measured values was calculated, because the model intends to estimate these values. The input values for the parameters in ECETOC TRA were assigned by an expert elicitation and consensus building process, based on descriptions of relevant contextual information. From more than 35 data sources, 106 useful exposure cases were derived, that were used for direct comparison with the model estimates. The exposure cases covered a large part of the ECETOC TRA dermal exposure model. The model explained 37% of the variance in the 75th percentiles of measured values. In around 80% of the exposure cases, the model estimate was higher than the 75th percentile of measured values. In the remaining exposure cases, the model estimate may not be sufficiently conservative.The model was shown to have a clear bias towards (severe) overestimation of dermal exposure at low measured exposure values, while all cases of apparent underestimation by the ECETOC TRA dermal exposure model occurred at high measured exposure values. This can be partly explained by a built-in bias in the effect of concentration of substance in product used, duration of exposure and the use of protective gloves in the model. The effect of protective gloves was calculated to be on average a
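
    The core of the comparison is simple: per exposure case, take the 75th percentile of the measurements and check whether the model estimate exceeds it. A minimal pandas sketch, with hypothetical column names:

```python
import pandas as pd

def compare_to_model(df: pd.DataFrame):
    """df: one row per measurement with columns 'case' (exposure case id),
    'measured' (exposure level) and 'tra_estimate' (model output,
    constant within a case). Column names are assumptions."""
    p75 = df.groupby("case")["measured"].quantile(0.75)
    est = df.groupby("case")["tra_estimate"].first()
    frac_conservative = (est >= p75).mean()   # ~0.8 in the study above
    return p75, frac_conservative
```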

  19. Analysis of Fresh Fuel Critical Experiments Appropriate for Burnup Credit Validation

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, M.D.

    1995-01-01

    The ANSI/ANS-8.1 standard requires that calculational methods used in determining criticality safety limits for applications outside reactors be validated by comparison with appropriate critical experiments. This report provides a detailed description of 34 fresh fuel critical experiments and their analyses using the SCALE-4.2 code system and the 27-group ENDF/B-IV cross-section library. The 34 critical experiments were selected based on geometry, material, and neutron interaction characteristics that are applicable to a transportation cask loaded with pressurized-water-reactor spent fuel. These 34 experiments are a representative subset of a much larger database of low-enriched uranium and mixed-oxide critical experiments. A statistical approach is described and used to obtain an estimate of the bias and uncertainty in the calculational methods and to predict a confidence limit for a calculated neutron multiplication factor. The SCALE-4.2 results for a superset of approximately 100 criticals are included in uncertainty analyses, but descriptions of the individual criticals are not included.

  20. Analysis of fresh fuel critical experiments appropriate for burnup credit validation

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, M.D.; Bowman, S.M.

    1995-10-01

    The ANSI/ANS-8.1 standard requires that calculational methods used in determining criticality safety limits for applications outside reactors be validated by comparison with appropriate critical experiments. This report provides a detailed description of 34 fresh fuel critical experiments and their analyses using the SCALE-4.2 code system and the 27-group ENDF/B-IV cross-section library. The 34 critical experiments were selected based on geometry, material, and neutron interaction characteristics that are applicable to a transportation cask loaded with pressurized-water-reactor spent fuel. These 34 experiments are a representative subset of a much larger database of low-enriched uranium and mixed-oxide critical experiments. A statistical approach is described and used to obtain an estimate of the bias and uncertainty in the calculational methods and to predict a confidence limit for a calculated neutron multiplication factor. The SCALE-4.2 results for a superset of approximately 100 criticals are included in uncertainty analyses, but descriptions of the individual criticals are not included.
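
    The bias-and-uncertainty idea can be sketched in toy form: compare calculated and experimental k-eff over the benchmark set, take the mean residual as bias, and penalise the result by the spread. Real validation uses one-sided tolerance-interval statistics, so the function below is only a simplified illustration.

```python
import numpy as np

def upper_safety_limit(keff_calc, keff_exp, admin_margin=0.05):
    """Toy upper safety limit from benchmark criticals (illustrative;
    not the tolerance-interval method used in the report)."""
    resid = np.asarray(keff_calc) - np.asarray(keff_exp)
    bias = resid.mean()
    spread = 2.0 * resid.std(ddof=1)       # rough ~95% coverage
    return 1.0 + min(bias, 0.0) - spread - admin_margin
```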

  1. Neuroinflammatory targets and treatments for epilepsy validated in experimental models.

    Science.gov (United States)

    Aronica, Eleonora; Bauer, Sebastian; Bozzi, Yuri; Caleo, Matteo; Dingledine, Raymond; Gorter, Jan A; Henshall, David C; Kaufer, Daniela; Koh, Sookyong; Löscher, Wolfgang; Louboutin, Jean-Pierre; Mishto, Michele; Norwood, Braxton A; Palma, Eleonora; Poulter, Michael O; Terrone, Gaetano; Vezzani, Annamaria; Kaminski, Rafal M

    2017-07-01

    A large body of evidence that has accumulated over the past decade strongly supports the role of inflammation in the pathophysiology of human epilepsy. Specific inflammatory molecules and pathways have been identified that influence various pathologic outcomes in different experimental models of epilepsy. Most importantly, the same inflammatory pathways have also been found in surgically resected brain tissue from patients with treatment-resistant epilepsy. New antiseizure therapies may be derived from these novel potential targets. An essential and crucial question is whether targeting these molecules and pathways may result in anti-ictogenesis, antiepileptogenesis, and/or disease-modification effects. Therefore, preclinical testing in models mimicking relevant aspects of epileptogenesis is needed to guide integrated experimental and clinical trial designs. We discuss the most recent preclinical proof-of-concept studies validating a number of therapeutic approaches against inflammatory mechanisms in animal models that could represent novel avenues for drug development in epilepsy. Finally, we suggest future directions to accelerate preclinical to clinical translation of these recent discoveries.

  2. A predictive bone drilling force model for haptic rendering with experimental validation using fresh cadaveric bone.

    Science.gov (United States)

    Lin, Yanping; Chen, Huajiang; Yu, Dedong; Zhang, Ying; Yuan, Wen

    2017-01-01

    Bone drilling simulators with virtual and haptic feedback provide a safe, cost-effective and repeatable alternative to traditional surgical training methods. To develop such a simulator, accurate haptic rendering based on a force model is required to feed back bone drilling forces in response to user input. Current predictive bone drilling force models, based on bovine bones and various drilling conditions and parameters, are not representative of the bone drilling process in bone surgery. The objective of this study was to provide a bone drilling force model for haptic rendering based on calibration and validation experiments in fresh cadaveric bones with different bone densities. Using a drill bit geometry (2 mm diameter), feed rates (20-60 mm/min) and spindle speeds (4000-6000 rpm) commonly used in orthognathic surgeries, the bone drilling forces of specimens from two groups were measured and the calibration coefficients of the specific normal and frictional pressures were determined. The comparison of the predicted forces and the measured forces from validation experiments with a large range of feed rates and spindle speeds demonstrates that the proposed model can predict the trends and average forces well. The presented bone drilling force model can be used for haptic rendering in surgical simulators.
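
    A mechanistic drilling-force model of the kind calibrated here scales a specific pressure by the chip load. The sketch below is a generic single-coefficient toy with assumed parameter values, not the calibrated model of the paper.

```python
def thrust_force(feed_mm_min, rpm, diameter_mm, k_n, mu_f):
    """Toy thrust force: specific normal pressure k_n (N/mm^2) times the
    chip-load area, plus a frictional share mu_f (both assumed values)."""
    feed_per_rev = feed_mm_min / rpm              # mm/rev
    chip_area = 0.5 * diameter_mm * feed_per_rev  # per cutting lip, mm^2
    return (1.0 + mu_f) * k_n * chip_area

# e.g. 40 mm/min feed, 5000 rpm, 2 mm drill, assumed k_n = 600 N/mm^2:
print(thrust_force(40, 5000, 2.0, k_n=600.0, mu_f=0.3))   # ~6 N
```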

  3. CFD modelling of hydrogen stratification in enclosures: Model validation and application to PAR performance

    Energy Technology Data Exchange (ETDEWEB)

    Hoyes, J.R., E-mail: james.hoyes@hsl.gsi.gov.uk; Ivings, M.J.

    2016-12-15

    Highlights:
    • The ability of CFD to predict hydrogen stratification phenomena is investigated.
    • Contrary to expectation, simulations on tetrahedral meshes under-predict mixing.
    • Simulations on structured meshes give good agreement with experimental data.
    • CFD model used to investigate the effects of stratification on PAR performance.
    • Results show stratification can have a significant effect on PAR performance.

    Abstract: Computational Fluid Dynamics (CFD) models are maturing into useful tools for supporting safety analyses. This paper investigates the capabilities of CFD models for predicting hydrogen stratification in a containment vessel using data from the NEA/OECD SETH2 MISTRA experiments. Further simulations are then carried out to illustrate the qualitative effects of hydrogen stratification on the performance of Passive Autocatalytic Recombiner (PAR) units. The MISTRA experiments have well-defined initial and boundary conditions, which makes them well suited for use in a validation study. Results are presented for the sensitivity to mesh resolution and mesh type. Whilst the predictions are shown to be largely insensitive to the mesh resolution, they are surprisingly sensitive to the mesh type. In particular, tetrahedral meshes are found to induce small unphysical convection currents that result in molecular diffusion and turbulent mixing being under-predicted. This behaviour is not unique to the CFD model used here (ANSYS CFX) and furthermore, it may affect simulations run on other non-aligned meshes (meshes that are not aligned perpendicular to gravity), including non-aligned structured meshes. Following existing best practice guidelines can help to identify potential unphysical predictions, but as an additional precaution consideration should be given to using gravity-aligned meshes for modelling stratified flows. CFD simulations of hydrogen recombination in the Becker Technologies THAI facility are presented with high and low PAR positions

  4. Refining Grasp Affordance Models by Experience

    DEFF Research Database (Denmark)

    Detry, Renaud; Kraft, Dirk; Buch, Anders Glent;

    2010-01-01

    We present a method for learning object grasp affordance models in 3D from experience, and demonstrate its applicability through extensive testing and evaluation on a realistic and largely autonomous platform. Grasp affordance refers here to relative object-gripper configurations that yield stabl...

  5. Bicycle Rider Control: Observations, Modeling & Experiments

    NARCIS (Netherlands)

    Kooijman, J.D.G.

    2012-01-01

    Bicycle designers traditionally develop bicycles based on experience and trial and error. Adopting modern engineering tools to model bicycle and rider dynamics and control is another method for developing bicycles. This method has the potential to evaluate the complete design space, and thereby dev

  6. Finds in Testing Experiments for Model Evaluation

    Institute of Scientific and Technical Information of China (English)

    WU Ji; JIA Xiaoxia; LIU Chang; YANG Haiyan; LIU Chao

    2005-01-01

    To evaluate the fault location and the failure prediction models, simulation-based and code-based experiments were conducted to collect the required failure data. The PIE model was applied to simulate failures in the simulation-based experiment. Based on syntax and semantic level fault injections, a hybrid fault injection model is presented. To analyze the injected faults, the difficulty to inject (DTI) and difficulty to detect (DTD) are introduced and are measured from the programs used in the code-based experiment. Three interesting results were obtained from the experiments: 1) Failures simulated by the PIE model without consideration of the program and testing features are unreliably predicted; 2) There is no obvious correlation between the DTI and DTD parameters; 3) The DTD for syntax level faults changes in a different pattern to that for semantic level faults when the DTI increases. The results show that the parameters have a strong effect on the failures simulated, and the measurement of DTD is not strict.

  7. Development and validation of the Measure of Indigenous Racism Experiences (MIRE)

    Directory of Open Access Journals (Sweden)

    Paradies Yin C

    2008-04-01

    Abstract Background In recent decades there has been increasing evidence of a relationship between self-reported racism and health. Although a plethora of instruments to measure racism have been developed, very few have been described conceptually or psychometrically. Furthermore, this research field has been limited by a dearth of instruments that examine reactions/responses to racism and by a restricted focus on African American populations. Methods In response to these limitations, the 31-item Measure of Indigenous Racism Experiences (MIRE) was developed to assess self-reported racism for Indigenous Australians. This paper describes the development of the MIRE together with an opportunistic examination of its content, construct and convergent validity in a population health study involving 312 Indigenous Australians. Results Focus group research supported the content validity of the MIRE, and inter-item/scale correlations suggested good construct validity. A good fit with a priori conceptual dimensions was demonstrated in factor analysis, and convergence with a separate item on discrimination was satisfactory. Conclusion The MIRE has considerable utility as an instrument that can assess multiple facets of racism together with responses/reactions to racism among indigenous populations and, potentially, among other ethnic/racial groups.

  8. Development and validation of mathematical modelling for pressurised combustion

    Energy Technology Data Exchange (ETDEWEB)

    Richter, S.; Knaus, H.; Risio, B.; Schnell, U.; Hein, K.R.G. [University of Stuttgart, Stuttgart (Germany). Inst. fuer Verfahrenstechnik und Dampfkesselwesen

    1998-12-31

    The advanced 3D coal combustion code AIOLOS for quasi-stationary turbulent reacting flows is based on a conservative finite-volume procedure. Equations for the conservation of mass, momentum and scalar quantities are solved. In order to deal with pressurized combustion chambers, which are usually of cylindrical shape, a first task in the frame of the project consisted in the extension of the code towards cylindrical co-ordinates, since the basic version of AIOLOS was only suitable for cartesian grids. Furthermore, the domain decomposition method was extended to the new co-ordinate system. Its advantage lies in the possibility of introducing refined sub-grids, providing a better resolution of regions where high gradients occur (e.g. high velocity and temperature gradients near the burners). The accuracy of the code was proven by means of a small-scale test case. The results obtained with AIOLOS were compared with the predictions of the commercial CFD code FLOW3D and validated against the velocity and temperature distributions measured at the test facility. The work during the second period focused mainly on the extension of the reaction model, as well as on the modelling of the optical properties of the flue gas. A modified submodel for char burnout was developed, considering the influence of pressure on diffusion mechanisms and on the chemical reaction at the char particle. The goal during the third project period was to improve the numerical description of turbulence effects and of the radiative heat transfer, in order to obtain an adequate modelling of the complex processes in pressurized coal combustion furnaces. Therefore, a differential Reynolds stress turbulence model (RSM) and a Discrete-Ordinates radiation model were implemented, respectively. 13 refs., 13 figs., 1 tab.
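
    The cylindrical-coordinate finite-volume discretisation that motivated the code extension can be shown on the simplest possible case, axisymmetric diffusion on a uniform radial grid. This is a toy, not the AIOLOS solver.

```python
import numpy as np

def radial_diffusion_step(T, r, alpha, dt):
    """One explicit finite-volume step of dT/dt = (1/r) d/dr (r*alpha*dT/dr)
    on a uniform radial grid of cell centres r; zero-flux boundaries."""
    dr = r[1] - r[0]
    flux = np.zeros(len(T) + 1)                   # r*alpha*dT/dr at faces
    flux[1:-1] = (r[:-1] + dr / 2) * alpha * np.diff(T) / dr
    return T + dt * np.diff(flux) / (r * dr)

r = (np.arange(10) + 0.5) * 0.01                  # cell centres (m)
T = np.where(r < 0.05, 400.0, 300.0)              # hot core, cooler rim (K)
T = radial_diffusion_step(T, r, alpha=1e-5, dt=0.1)
```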

  9. STORMVEX: The Storm Peak La